The issue I ran into was that I wanted Ahrefs to crawl my site (outdoorgearreview.com) with its Site Audit tool, but the crawl failed with this error:
The HTTP server returned error 403: “Forbidden”. This request was likely filtered by server configuration.
There are a few ways around this:
- Deactivate the CleanTalk plugin while the crawl is running
- Add an allow rule for the Ahrefs bot to your robots.txt file
- Or whitelist the Ahrefs bot in CleanTalk… see the steps below
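The robots.txt snippet for the second option isn’t shown above; a typical rule that explicitly allows the Ahrefs crawler (its user agent is AhrefsBot) looks like this:

```
User-agent: AhrefsBot
Allow: /
```

Keep in mind that robots.txt only tells well-behaved crawlers what they may fetch; it won’t lift a server-side 403 on its own, which is why the CleanTalk filter is the more reliable fix.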
Log in to CleanTalk.org and click “Personal Lists”.
Scroll down and click “Add New Filter”.
- Select “IP Network” – note: do not select “IP Address”
- Enter the IP network range for the Ahrefs crawler (Ahrefs publishes its current ranges on its robot page)
- Add a note so you can tell later what you blacklisted or whitelisted, and why
- Select the site you want this filter to apply to, or select all sites
- Choose whether to blacklist or whitelist the range
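The “IP Network” field takes a CIDR range rather than a single address, which is why the note in the first step matters: one network entry covers the crawler’s whole address block, while an “IP Address” entry matches exactly one host. A quick sketch with Python’s standard ipaddress module (the range shown is the reserved documentation block 203.0.113.0/24, used here as a hypothetical example, not an actual Ahrefs range):

```python
import ipaddress

# Hypothetical example range -- check Ahrefs' own robot page for the
# ranges they actually publish before whitelisting anything.
network = ipaddress.ip_network("203.0.113.0/24")

# An "IP Address" entry would match only one host; the network entry
# matches every address in the /24.
print(network.num_addresses)                            # 256
print(ipaddress.ip_address("203.0.113.55") in network)  # True
print(ipaddress.ip_address("198.51.100.1") in network)  # False
```

So if the crawler rotates through addresses inside the published block, a single network rule keeps all of them whitelisted.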
That’s it. I hope this helps. Comment below if you have questions.