Robots.txt Generator
Tell search engines what to crawl and what to ignore. Generate valid robots.txt instructions in seconds.
Instructions
Robots.txt Output
User-agent: *
Allow: /
Place this file in your website's root directory (e.g., yoursite.com/robots.txt) to guide search engine crawlers.
Guide Search Engine Crawlers
A robots.txt file is a simple text file that resides in your website's root. It tells automated bots which paths they are allowed to crawl. Properly configuring this file helps manage your crawl budget and keeps crawlers away from private or duplicate pages. (Note that robots.txt controls crawling, not indexing: a blocked page can still appear in search results if other sites link to it, so use a noindex meta tag when a page must stay out of results entirely.)
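As a quick sketch, a minimal robots.txt that lets every crawler visit the whole site except one directory might look like this (the `/private/` path is a placeholder):

```txt
# Applies to all crawlers
User-agent: *
# Keep crawlers out of this (hypothetical) directory
Disallow: /private/
# Everything else may be crawled
Allow: /
```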
Bot Control
Specify different rules for Googlebot, Bingbot, and others, or apply a global rule (*).
Protect Admin
Easily generate `Disallow: /admin` or rules for any other paths you want to keep away from crawlers.
Sitemap Link
Automatically append your sitemap URL so bots can discover your new content faster.
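Putting the three features above together, a generated file might look like the sketch below (the paths and the sitemap URL are placeholders, not values the generator requires):

```txt
# Per-bot rule: Googlebot may crawl everything
User-agent: Googlebot
Allow: /

# Global rule: all other bots stay out of the admin area
User-agent: *
Disallow: /admin/

# Sitemap link so bots discover new content faster
Sitemap: https://yoursite.com/sitemap.xml
```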
More Developer & SEO Tools
JSON Formatter
Validate and prettify JSON data.
JWT Decoder
Decode and inspect JWT tokens.
URL Encoder/Decoder
Encode or decode URL strings.
Base64 Encoder
Convert text or files to Base64.
Hash Generator
Generate MD5, SHA-1, SHA-256 hashes.
UUID Generator
Generate random UUIDs (v4).
Regex Tester
Test regular expressions online.
UTM Builder
Create tracked campaign URLs.
SEO Title Checker
Preview titles in SERPs.
Meta Description
Check description length and pixels.
Open Graph Preview
Test social media link previews.
Robots.txt Gen
Create robots.txt files easily.
Sitemap Generator
Build XML sitemaps for SEO.
Canonical Checker
Verify canonical tags.