Analyze robots.txt rules and instantly test whether a specific URL is allowed or blocked for search engine crawling. This browser-based tool works entirely on the content you paste, never fetching external data, so your configuration stays private.

How to Use

Step 1

Paste Content

Copy your robots.txt source and paste it into the editor below.

Step 2

Target URL

Enter a specific page path (e.g. /admin) or a full URL to test.

Step 3

Verify

Check if the crawler is Allowed or Blocked by your directives.
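For example, with the following directives pasted into the editor, testing /admin/settings reports Blocked, while /admin/help/faq reports Allowed, because the longer Allow rule overrides the shorter Disallow:

    User-agent: *
    Disallow: /admin
    Allow: /admin/help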

Directives Editor

Why this tool?

The Robots.txt Analyzer & URL Tester helps website owners, SEO professionals, and developers understand how search engines interact with their site. By pasting your robots.txt file directly into the tool, you can safely analyze crawling rules and test individual URLs to see whether they are allowed or blocked. All processing happens locally in your browser, with no data stored or transmitted.

How to use

To analyze your configuration, paste the contents of your robots.txt file into the Directives Editor. Once pasted, enter the URL or path you wish to test in the input field below. The tool instantly evaluates whether that path is crawlable according to the standard robots.txt protocol (RFC 9309), applying its "longest match wins" rule: the most specific (longest) matching Allow or Disallow directive decides the outcome, and Allow takes precedence on a tie. Green indicators signify a crawlable path, while red indicators show that the path is blocked.
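The evaluation can be pictured with a short TypeScript sketch. This is an illustration of the RFC 9309 matching rule, not the tool's actual source, and it assumes the rules for the relevant user-agent group have already been parsed; wildcard handling (* and $) is omitted for brevity.

    // Minimal sketch of RFC 9309 "longest match wins" evaluation.
    interface Rule {
      allow: boolean; // true for Allow, false for Disallow
      path: string;   // path prefix from the directive, e.g. "/admin"
    }

    function isAllowed(urlPath: string, rules: Rule[]): boolean {
      let best: Rule | null = null;

      for (const rule of rules) {
        // An empty value (e.g. "Disallow:") matches nothing; skip it.
        if (rule.path === "") continue;

        if (urlPath.startsWith(rule.path)) {
          if (best === null || rule.path.length > best.path.length) {
            // The longest (most specific) matching rule wins.
            best = rule;
          } else if (rule.path.length === best.path.length && rule.allow) {
            // On a tie, Allow takes precedence per RFC 9309.
            best = rule;
          }
        }
      }

      // No matching rule means the path is crawlable by default.
      return best === null ? true : best.allow;
    }

    // Example: mirrors the directives shown above.
    const rules: Rule[] = [
      { allow: false, path: "/admin" },
      { allow: true,  path: "/admin/help" },
    ];
    console.log(isAllowed("/admin/settings", rules)); // false (Blocked)
    console.log(isAllowed("/admin/help/faq", rules)); // true (Allowed)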