The 403 Forbidden status code occurs when a web server denies access to our crawler for security reasons.

If you see this issue after you start crawling, try selecting "Include Custom HTTP Header" in *Advanced Settings*. With this option enabled, every request we send includes a header carrying a unique value assigned to your team, which makes it easy to whitelist our traffic.
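If you manage the site yourself, one way to use that header is to allow matching requests through before any bot-blocking rules run. Below is a minimal sketch assuming a Python/Flask site; the header name `X-Team-Crawler-Token` and the token value are hypothetical placeholders, so substitute whatever your team is actually assigned.

```python
# Minimal sketch: let requests carrying the crawler's custom header through.
# The header name and token value below are hypothetical placeholders.
from flask import Flask, request

app = Flask(__name__)

EXPECTED_HEADER = "X-Team-Crawler-Token"    # hypothetical header name
EXPECTED_VALUE = "your-teams-unique-value"  # hypothetical unique value

@app.before_request
def allow_whitelisted_crawler():
    # If the request carries the expected header value, skip any
    # bot-blocking or rate-limiting logic you would otherwise apply.
    if request.headers.get(EXPECTED_HEADER) == EXPECTED_VALUE:
        return None  # continue straight to the normal view
    # ...your existing security / bot-detection checks go here...

@app.route("/")
def index():
    return "ok"
```

If your site sits behind a CDN or firewall instead, the same idea applies: add a rule that exempts requests containing that header and value.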
If you do not own this site, let us know via chat support and we will look into a solution. Keep in mind that some sites have strict security measures in place to block all automated crawlers. Except for Google, of course.
