Robots.txt API
Parse and analyze website robots.txt files with our Robots.txt API.
Robots.txt Checker
Check if a URL is disallowed in a website's robots.txt file.
Robots.txt files are cached for 60 minutes; use the flush setting to force a refresh.
Enter a URL and select tokens to check whether they're disallowed in the robots.txt file.
Try the example URL https://www.google.com/search with the token: googlebot
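The same kind of check can be sketched locally with Python's standard-library `urllib.robotparser`. This is not the API itself, just a minimal illustration of the underlying logic; the inline robots.txt rules below are hypothetical (a real check would fetch the site's /robots.txt, e.g. https://www.google.com/robots.txt):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only.
# A real checker would download the live file instead.
robots_txt = """\
User-agent: *
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Is the googlebot token allowed to fetch these URLs?
print(parser.can_fetch("googlebot", "https://www.google.com/search"))  # False: /search is disallowed
print(parser.can_fetch("googlebot", "https://www.google.com/maps"))    # True: no rule matches /maps
```

Note that `can_fetch` answers the inverse question (allowed rather than disallowed), so a "disallowed" result from a checker corresponds to `can_fetch` returning `False`.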