Free Robots.txt Generator
Create a Robots.txt File for Your Website
Control how search engine bots crawl your site. Add crawl rules for any bot, block sensitive paths, and include your sitemap URL — all in one click.
- Add rules for all bots (*) or specific crawlers (Googlebot, Bingbot…)
- Set Allow and Disallow paths per user-agent
- Add Sitemap URL, Crawl-delay and custom directives
Place the generated robots.txt at the root of your domain:
https://example.com/robots.txt
Preview: a minimal robots.txt file.
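For reference, a minimal robots.txt that allows all crawling and points to a sitemap might look like this (the domain and sitemap path are placeholders):

```
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```

An empty Disallow line means nothing is blocked; omit the Sitemap line if you don't have one yet.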
Configure your robots.txt
Apply a common robots.txt preset as your starting point.
🗺 Sitemap & Global Settings
Sitemap URL: recommended; helps search engines find all your pages.
Crawl-delay: optional; not supported by all bots.
Host: used by Yandex to specify the preferred domain.
Additional Sitemap URLs
Add extra sitemap URLs (news, video, image sitemaps).
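Once your rules are generated, you can sanity-check them with Python's standard-library `urllib.robotparser` before deploying. This sketch parses a hypothetical generated file from a string (the paths are illustrative, and the Allow rule is listed first because Python's parser applies the first matching rule):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical generated robots.txt (paths are illustrative).
rules = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Check whether a bot matching "*" may fetch each path.
print(parser.can_fetch("*", "https://example.com/admin/"))        # False
print(parser.can_fetch("*", "https://example.com/admin/public/")) # True
print(parser.can_fetch("*", "https://example.com/blog/post-1"))   # True
```

In production you would call `set_url("https://example.com/robots.txt")` followed by `read()` instead of `parse()`, so the live file is checked. Note that major search engines use longest-match semantics rather than first-match, so keep more specific Allow rules above broader Disallow rules to get consistent behavior across parsers.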