What does "Block Known AI Bots" actually do?

Maybe the subject line of this post isn’t entirely accurate, since I know what the option purports to do. But how does it actually work? What does it change — in the HTML, the robots.txt, or elsewhere — to block these bots? I’d like to understand the effects of enabling or disabling this option, particularly how it affects my web site’s visibility elsewhere.
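For context, my guess is that it adds per-crawler rules to robots.txt, something along these lines (the user agent names below are common AI crawlers; I don’t know whether this is actually what Sitely emits):

```
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /
```

But it could equally be done with robots meta tags in the HTML head, which is why I’m asking.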

Also, a question about the generated robots.txt. I notice that it isn’t exported with the site every time; it seems to regenerate only when it’s missing. And despite my per-page search engine settings, it only ever seems to generate the generic rules:

User-agent: *
Disallow:
Sitemap: url of site

It’s not critical; I can manually create a robots.txt file for the site. But I’m wondering whether I’m missing an option somewhere to customize it within Sitely.
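If I do create it by hand, I’d expect something like the following, with per-page exclusions reflecting my settings (the page path here is just a made-up example, and the sitemap URL is a placeholder):

```
User-agent: *
Disallow: /private-page.html
Sitemap: https://example.com/sitemap.xml
```

If Sitely can generate rules like this from the per-page search engine controls, that would save the manual step.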