
Can I add Oh Dear to my robots.txt?

Oh Dear crawls your website to report broken links and mixed content. You can configure which pages we can and cannot crawl by adding us to your robots.txt file.

Default robots.txt

Most robots.txt files look similar to this.

User-agent: *
Disallow:

This essentially instructs any crawler (that adheres to the robots.txt spec) to crawl every page.
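For illustration, here is a minimal Python sketch using the standard library's urllib.robotparser that parses the rules above and confirms that any crawler, including ours, may fetch any page. The example.com URL is only a placeholder.

from urllib.robotparser import RobotFileParser

# Parse the permissive default rules shown above.
rules = RobotFileParser()
rules.parse([
    "User-agent: *",
    "Disallow:",
])

# An empty Disallow directive puts nothing off limits, so this prints True.
print(rules.can_fetch("OhDear", "https://example.com/any/page"))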

To limit what we crawl, you can add a section for our User-Agent.

User-agent: OhDear
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/

This instructs the Oh Dear crawler to ignore the /cgi-bin/, /tmp/, and /junk/ directories while still crawling the rest of your site.
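If you want to double-check your rules before deploying them, here is a small sketch, again using Python's urllib.robotparser, that parses the block above and verifies which paths the Oh Dear crawler may fetch. The example.com domain and the sample paths are placeholders.

from urllib.robotparser import RobotFileParser

# Parse the Oh Dear-specific rules shown above.
rules = RobotFileParser()
rules.parse([
    "User-agent: OhDear",
    "Disallow: /cgi-bin/",
    "Disallow: /tmp/",
    "Disallow: /junk/",
])

# Pages outside the disallowed directories are still crawled.
print(rules.can_fetch("OhDear", "https://example.com/blog/post"))  # True

# Anything under /tmp/ is skipped.
print(rules.can_fetch("OhDear", "https://example.com/tmp/file"))   # False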
