In today’s world (AI model training, security, etc.), I wonder whether site owners are well equipped when robots.txt is their only tool for deciding who’s in and who’s out, that is, for controlling free and massive consumption of their site’s content.
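For context, here is a minimal sketch of what such a robots.txt might look like. The crawler names shown (GPTBot, CCBot) are real examples of AI-related user agents, but the key caveat stands: robots.txt is purely advisory, and nothing in the protocol forces a crawler to honor it.

```
# Disallow known AI training crawlers (advisory only; compliance is voluntary)
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# Allow everyone else
User-agent: *
Allow: /
```

Well-behaved crawlers fetch this file and respect it, but a scraper that ignores it faces no technical barrier, which is exactly why robots.txt alone feels like thin protection.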