Google Exploring Complementary Robots.txt Protocols for AI

Hey guys! Google just dropped some big news last night. They’re working on a new protocol to go hand-in-hand with the good old robots.txt. Why? It’s all because of the new generative AI technologies that Google and other companies are putting out these days.

This comes right after OpenAI made waves by getting access to paywalled content for its ChatGPT service. But honestly, I’m not surprised that Google and others are exploring complements to robots.txt, considering all the wild generative AI stuff happening on the web.

Now, don’t get too excited just yet. This announcement doesn’t mean anything is changing immediately. All Google is saying is that they’re gonna have some discussions with the “community” in the “coming months” to come up with fresh ideas for a new solution.

In Google’s own words, “Today, we’re kickstarting a public discussion and inviting members of the web and AI communities to chime in on complementary protocols. We want a diverse range of voices, from web publishers to civil society, academia, and everything in between, to join the conversation. We’ll be getting these interested folks together over the next few months.”

On top of that, Google believes it’s high time for the web and AI communities to explore other machine-readable ways to give web publishers more choice and control over how their content is used in all the new AI and research use cases popping up.
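For context, robots.txt is already the machine-readable control publishers have today: a plain text file at the site root that tells crawlers what they may fetch. A minimal sketch looks like the lines below (the “ExampleAIBot” user-agent token is just a placeholder, not a real crawler name); whatever complementary protocol comes out of these discussions would presumably give publishers similar but more granular signals for AI training and research uses.

    # Hypothetical AI crawler token (placeholder, not an actual bot)
    User-agent: ExampleAIBot
    Disallow: /premium/
    Allow: /public/

    # Rules for all other crawlers
    User-agent: *
    Disallow: /private/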

So, there you have it, folks. Google’s looking to shake things up a bit and get some input from all the different peeps in the web and AI worlds. It’ll be interesting to see what they come up with. Stay tuned!
