Generate a properly formatted robots.txt file for your website. Control how Googlebot, Bingbot, AI crawlers, and other bots access your pages.
Add your website URL and choose whether to allow all crawlers, block all crawlers, or set custom rules per bot.
Add specific rules for Googlebot, Bingbot, AI crawlers like GPTBot and ChatGPT-User, and other search engine bots.
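For illustration, a generated file with per-bot rules might look like the sketch below. The paths and bot choices are placeholders, not recommendations; adjust them to your own site:

```txt
# Block OpenAI's crawlers entirely
User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

# Let Googlebot and Bingbot crawl everything except a private area
User-agent: Googlebot
Disallow: /admin/

User-agent: Bingbot
Disallow: /admin/

# Default rule for all other bots
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```

Rules are grouped by `User-agent`; a bot uses the most specific group that matches its name and falls back to `*` otherwise.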
Copy the generated code or download the robots.txt file, then upload it to the root directory of your website so it is reachable at yourdomain.com/robots.txt.
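Before uploading, you can sanity-check your rules with Python's standard-library `urllib.robotparser`. The sketch below assumes a hypothetical ruleset that blocks GPTBot site-wide and keeps `/admin/` off-limits to everyone else:

```python
import urllib.robotparser

# Hypothetical ruleset: block GPTBot entirely, hide /admin/ from all other bots.
rules = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /admin/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# GPTBot matches its own group and is blocked everywhere.
print(rp.can_fetch("GPTBot", "https://example.com/page"))       # False
# Googlebot falls back to the * group: pages are allowed...
print(rp.can_fetch("Googlebot", "https://example.com/page"))    # True
# ...but /admin/ is not.
print(rp.can_fetch("Googlebot", "https://example.com/admin/"))  # False
```

The same check works against your live site by calling `rp.set_url("https://yourdomain.com/robots.txt")` followed by `rp.read()` instead of `parse()`.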
Robots.txt is just one piece of the puzzle. Get a free technical SEO audit and ensure search engines can properly crawl and index your site.