What does "Block Known AI Bots" actually do?

Maybe the title of this post isn’t completely accurate, since I know what the option purports to do. But how does it actually accomplish this? What does it change in the HTML, robots.txt, etc.? I’d like to understand the effects of enabling or disabling this option, in the context of how it affects my website’s visibility elsewhere.

Also, a question about the generated robots.txt. I notice that it isn’t exported with the site every time; it seems to regenerate only when it’s missing. And despite my choices of search engine control on individual pages, it only ever seems to generate generic rules of:

User-agent: *
Disallow:

Sitemap: url of site

It’s not critical - I can manually create a robots.txt file for the site. But I’m wondering if I’m missing something somewhere about any available options to customize it within Sitely.

Hi @syzygyrt,

The syntax of the generated robots.txt is correct.

The “Block known AI Bots” option adds a list of bot identifiers to the robots.txt file. By and large, those bots respect it.
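For context, a robots.txt that blocks AI crawlers typically looks something like the sketch below. The exact bot list Sitely writes isn’t documented in this thread; these user agents (GPTBot, ClaudeBot, CCBot, Google-Extended) are common real-world examples, not necessarily the ones Sitely uses:

```
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: *
Disallow:
```

Each named bot gets `Disallow: /` (block everything), while the final `User-agent: *` group with an empty `Disallow:` leaves the site open to all other crawlers.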

That’s what I thought it would do, but it doesn’t seem to be doing that. Whether I have “Block Known AI Bots” selected or not, the robots.txt file doesn’t change - it just has the default

User-agent: *
Disallow:

in it.

In the testing I’ve done with “Block Known AI Bots” toggled on and off, I delete the robots.txt file to ensure it is re-generated. But it always ends up with the default rules that allow everything to everyone.
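One way to check what a given robots.txt actually permits, independent of what any app generated, is Python’s standard `urllib.robotparser`. This is just a sketch for testing; the rules string below is an illustrative example of a blocking robots.txt, not what Sitely produces:

```python
# Verify which user agents a robots.txt blocks, using Python's stdlib parser.
from urllib.robotparser import RobotFileParser

# Example rules: block GPTBot, allow everyone else (illustrative only).
rules = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# An empty "Disallow:" means "allow everything" for that group.
print(rp.can_fetch("GPTBot", "https://example.com/"))     # → False (blocked)
print(rp.can_fetch("Googlebot", "https://example.com/"))  # → True (allowed)
```

Pointing `RobotFileParser.set_url()` at the live site’s `/robots.txt` and calling `read()` would run the same check against the file actually being served.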

I’m running Sitely 6.1 (36100)

Dan