Google has announced a new option for website owners to opt out of having their content used to train the company's generative AI models, including Bard. This option will be available in the coming weeks and will be implemented through a robots.txt directive.
To opt out, website owners will need to add the following line to their robots.txt file:
User-agent: Google-Extended
Disallow: /
Adding this rule blocks the Google-Extended token site-wide. Google-Extended is the product token Google checks when deciding whether crawled content may be used to train its generative AI models; it is not a separate crawler, so disallowing it does not change how Googlebot crawls or indexes the site.
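To sanity-check a robots.txt file before deploying it, Python's standard-library `urllib.robotparser` can evaluate the rules against a given user agent. This is a minimal sketch using a hypothetical robots.txt and example.com URLs, not Google's own tooling:

```python
from urllib import robotparser

# Hypothetical robots.txt contents including the Google-Extended opt-out.
robots_txt = """\
User-agent: Google-Extended
Disallow: /

User-agent: *
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Google-Extended is blocked site-wide...
print(rp.can_fetch("Google-Extended", "https://example.com/article"))  # False
# ...while ordinary crawlers such as Googlebot are unaffected.
print(rp.can_fetch("Googlebot", "https://example.com/article"))  # True
```

Because the rules are matched per user-agent token, the opt-out can sit alongside existing directives without affecting regular search crawling.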
Google says it is committed to developing AI responsibly and understands that website owners want more choice and control over how their content is used. The company adds that it will continue working with website owners to make sure they have the information needed to make informed decisions.
This new opt-out option is a welcome step for website owners who are concerned about how their content is being used to train AI models. It is also a sign that Google is listening to feedback from the community and is committed to developing AI in a responsible and ethical way.
It is important to note that this opt-out applies only to Google's generative AI models. Other companies' crawlers have their own user-agent tokens (OpenAI's GPTBot, for example) and must be disallowed separately if you want to block them as well.
Will you be opting out?