What do I need to do about Google Coverage Issue: Blocked by Robots.txt?

Robots.txt is a file that tells search engine crawlers, including Google, which pages on your site they should not crawl. We use it to block certain pages from being crawled for security purposes.

Last edited: July 8th, 2021

If you have received a Google Coverage Issue notification saying that pages on your site are "Indexed, though blocked by robots.txt", do not worry: these blocks are completely intentional.

We block certain pages from being crawled by Google for security purposes, usually pages where sensitive data is transmitted (e.g. the Your Account page and the checkout page).
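As a rough illustration, the rules that block these kinds of pages look something like the following. (The paths shown here are placeholders for this example; your site's actual file will list its own paths.)

    User-agent: *
    Disallow: /account
    Disallow: /checkout

The User-agent line says which crawlers the rules apply to (* means all of them), and each Disallow line names a path that crawlers are asked not to visit.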

You can check your robots.txt file to see exactly which pages are blocked on your site by going to yourdomain.com/robots.txt in your browser.
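If you prefer to check programmatically, here is a minimal sketch using Python's standard-library robots.txt parser. It assumes yourdomain.com is replaced with your actual domain, and the paths it tests are illustrative examples only:

    # Minimal sketch: check which example paths Googlebot is blocked from,
    # according to your live robots.txt file. Replace yourdomain.com with
    # your actual domain; the paths below are illustrative placeholders.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser("https://yourdomain.com/robots.txt")
    parser.read()  # downloads and parses the robots.txt file

    for path in ["/account", "/checkout", "/"]:
        allowed = parser.can_fetch("Googlebot", f"https://yourdomain.com{path}")
        print(path, "allowed" if allowed else "blocked")

Paths that print "blocked" are the ones your robots.txt file asks Google not to crawl.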