Why this tool?
The Robots.txt Analyzer & URL Tester helps website owners, SEO professionals, and developers understand how search engines interact with their site. By pasting your robots.txt file directly into the tool, you can safely analyze crawling rules and test individual URLs to see whether they are allowed or blocked. All processing happens locally in your browser, with no data stored or transmitted.
How to use
To analyze your configuration, paste the contents of your robots.txt file into the Directives Editor, then enter the URL or path you wish to test in the input field below it. The tool instantly evaluates whether that path is allowed, applying the Robots Exclusion Protocol (RFC 9309) and its "longest match wins" rule: the most specific (longest) matching directive takes precedence, and when an Allow and a Disallow rule tie, the Allow rule wins. A green indicator signifies a crawlable path, while a red indicator flags a path blocked by a matching Disallow directive.
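For illustration, here is a minimal TypeScript sketch of the longest-match evaluation, not the tool's actual implementation. The Rule type and isAllowed helper are hypothetical names, and rule paths are treated as plain prefixes (real robots.txt matching also supports the * wildcard and $ end anchor):

```typescript
// A minimal sketch of RFC 9309 "longest match wins" evaluation.
// Assumption: rule paths are plain prefixes; '*' and '$' handling omitted.

interface Rule {
  type: "allow" | "disallow";
  path: string; // e.g. "/private/" or "/"
}

function isAllowed(rules: Rule[], urlPath: string): boolean {
  let bestLength = -1;
  let bestType: "allow" | "disallow" = "allow"; // no match => crawlable

  for (const rule of rules) {
    if (!urlPath.startsWith(rule.path)) continue;
    const len = rule.path.length;
    // Longer (more specific) matches take precedence; on a tie,
    // RFC 9309 says the "allow" rule should be used.
    if (len > bestLength || (len === bestLength && rule.type === "allow")) {
      bestLength = len;
      bestType = rule.type;
    }
  }
  return bestType === "allow";
}

// Example: /private/ is disallowed, but the longer Allow rule
// re-opens /private/public, so only that subtree stays crawlable.
const rules: Rule[] = [
  { type: "disallow", path: "/private/" },
  { type: "allow", path: "/private/public" },
];
console.log(isAllowed(rules, "/private/notes"));  // false (blocked)
console.log(isAllowed(rules, "/private/public")); // true  (allowed)
```

This is why rule order in a robots.txt file does not matter under RFC 9309: precedence is decided by match length, not by position.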