How to Fix the “Indexed, but Blocked by Robots.txt” Error in WordPress

Last updated by Azib Yaqoob

The “Indexed, but Blocked by Robots.txt” error in Google Search Console occurs when a URL on your website has been added to Google’s index, but access to it is restricted by the robots.txt file. This situation can confuse search engines and negatively impact your SEO performance, as it sends mixed signals about your site’s content.

In this comprehensive guide, I will explore the causes of the error, provide step-by-step instructions to troubleshoot and fix it, and share best practices for optimizing your robots.txt file.

What Does “Indexed, but Blocked by Robots.txt” Mean?

When you see this error in Google Search Console, it means:

  • Google has added the URL to its index, usually because other pages link to it.
  • The robots.txt file disallows crawling of that URL, so Googlebot cannot read the page’s content.
  • The result is often a listing with little or no description in search results, and the mixed signals can hurt the page’s ranking.

Common Causes

  1. Misconfigured Robots.txt File: Specific rules in the robots.txt file prevent search engines from crawling parts of your site.
  2. Intentional Blocking: Pages or resources intentionally restricted (e.g., admin pages) may have been indexed accidentally.
  3. Dynamic URL Indexing: Search engines sometimes index URLs linked to restricted pages (e.g., parameters or duplicates).
  4. Legacy Rules: Outdated rules from older site setups might still be blocking valid URLs.

Step-by-Step Guide to Fix the Error

Step 1: Verify the Error in Google Search Console

  1. Log in to Google Search Console.
  2. Navigate to the Pages report (formerly Coverage) and open the “Indexed, though blocked by robots.txt” status.
  3. Review the list of affected URLs:
    • Identify which URLs are incorrectly blocked.
    • Note any patterns (e.g., blocked sections or file types).

Step 2: Locate and Access Your Robots.txt File

The robots.txt file is typically located at the root of your website. For example:

https://yourwebsite.com/robots.txt

How to Access the File

  • WordPress Plugin:
    • If you use SEO plugins like Yoast SEO or Rank Math, you can edit the file directly from your WordPress dashboard.
      • Go to SEO Plugin > Tools > File Editor.
  • FTP or File Manager:
    • Log in to your server using FTP or a hosting control panel (e.g., cPanel).
    • Navigate to the root directory (usually public_html) to locate the robots.txt file.

Step 3: Analyze the Robots.txt Rules

Examine the rules in the file to identify which lines might be causing the issue.

Example of a Misconfigured Robots.txt File:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /category/

Here’s what each rule does:

  • User-agent: *: Applies rules to all bots.
  • Disallow: /wp-admin/: Blocks crawling of the admin area (this is normal).
  • Disallow: /wp-includes/: Prevents crawling of WordPress core files (also normal).
  • Disallow: /category/: Blocks all category archive pages, which might unintentionally block valuable content.

Step 4: Test and Debug the Robots.txt File

Use Google’s robots.txt tools to debug your file:

  1. Open the robots.txt report in Google Search Console (under Settings), or the legacy Robots Testing Tool if you still have access to it.
  2. Enter the URL of the affected page and test whether it’s blocked.
  3. If the tool reports a block, locate the corresponding rule in your robots.txt file.
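You can also sanity-check rules locally before publishing them. The sketch below uses Python’s standard urllib.robotparser to evaluate the misconfigured rules from Step 3 against a couple of example URLs (the domain is a placeholder; note that this parser follows the original robots.txt convention and does not understand Google-style wildcards like `*`):

```python
from urllib import robotparser

# The rule set from Step 3, loaded from a string instead of a live site.
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /category/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch() answers: may this user agent crawl this URL?
print(parser.can_fetch("Googlebot", "https://yourwebsite.com/category/news/"))    # False: blocked
print(parser.can_fetch("Googlebot", "https://yourwebsite.com/blog/hello-world/"))  # True: allowed
```

This is handy for quickly checking how an edit affects a list of URLs before you upload the new file.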

Step 5: Modify the Robots.txt File

To fix the error, remove or update the problematic rules. Below are common adjustments:

1. Allow Access to Blocked Sections

If legitimate content is blocked:

  • Remove the Disallow rule for that path, or override it with an explicit Allow:
    User-agent: *
    Allow: /category/

2. Use Wildcards and Specific Rules

To unblock specific pages but keep others restricted, refine your rules:

User-agent: *
Disallow: /category/*
Allow: /category/specific-page/

3. Remove Overly Restrictive Rules

If the entire directory doesn’t need blocking:

# Remove this:
Disallow: /blog/

# Replace with specific exclusions:
Disallow: /blog/private/

Step 6: Clear Caches and Resubmit Robots.txt

After updating your robots.txt file:

  1. Clear your website and server caches to ensure the new file is live.
  2. Verify the updated file by visiting https://yourwebsite.com/robots.txt.
  3. Ask Google to re-fetch the updated file in Google Search Console:
    • Open Settings > robots.txt report.
    • Request a recrawl of the file.

Step 7: Use Noindex Tags for Specific URLs

If you want a page kept out of search results, use a noindex meta tag instead of blocking it in robots.txt. Googlebot must be able to crawl the page to see the tag, so the URL must not be disallowed:

  1. Edit the page in WordPress.
  2. Add the following meta tag to the page header (most SEO plugins expose a per-page “No Index” setting that does this for you):
    <meta name="robots" content="noindex, follow">
  3. Save your changes and remove any robots.txt rule that blocks the URL; otherwise Googlebot will never see the tag.
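As a quick sanity check, you can confirm the tag actually made it into the page’s HTML. This sketch uses Python’s standard html.parser to pull the robots meta tag out of a page’s markup (the HTML string here is a stand-in for markup you would fetch from the affected page):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.robots_content = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr_map = dict(attrs)
            if attr_map.get("name", "").lower() == "robots":
                self.robots_content = attr_map.get("content")

# Stand-in for HTML fetched from the page you edited.
html = '<html><head><meta name="robots" content="noindex, follow"></head><body></body></html>'

finder = RobotsMetaFinder()
finder.feed(html)
print(finder.robots_content)  # noindex, follow
```

If this prints None for your live page, the plugin or theme is not emitting the tag and the setting needs another look.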

Step 8: Monitor the Changes

After implementing fixes:

  1. Return to Google Search Console.
  2. Validate the fix for affected URLs under Coverage > Indexed, but Blocked by Robots.txt.
  3. Monitor the URLs for updated status reports.

Best Practices for Managing Robots.txt in WordPress

  1. Keep it Simple: Avoid overly complex rules that may unintentionally block valuable content.
  2. Block Irrelevant Sections:
    • Common exclusions:
      Disallow: /wp-admin/
      Disallow: /wp-includes/
  3. Declare Your Sitemap:
    • Point search engines to your sitemap with a Sitemap directive rather than an Allow rule:
      Sitemap: https://yourwebsite.com/sitemap.xml
  4. Test Regularly: Use the Robots Testing Tool periodically to ensure rules are correctly applied.
  5. Combine Robots.txt with Noindex: Use noindex tags for finer control over indexed content.
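Putting these practices together, a simple baseline robots.txt for a typical WordPress site might look like the following. This is a sketch, not a one-size-fits-all file: the sitemap URL is a placeholder for your site’s actual sitemap address, and the admin-ajax.php exception is a common addition (not from the steps above) because some themes and plugins call it from the front end.

```text
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-includes/

Sitemap: https://yourwebsite.com/sitemap.xml
```

Test any variation of this file against your important URLs before deploying it.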

Conclusion

The “Indexed, but Blocked by Robots.txt” error can hinder your SEO efforts, but it’s manageable with proper troubleshooting. By carefully analyzing and updating your robots.txt file, testing your changes, and monitoring Google Search Console, you can resolve the error and ensure your site performs optimally in search results.

Regular maintenance and adherence to best practices will prevent similar issues from recurring, keeping your WordPress site in good health and search-friendly.

Let me know in the comments if you face any issues.

Azib Yaqoob

Azib is dedicated to making website management easy for beginners. On his blog, he shares straightforward tips and tricks to help you master WordPress. From learning the basics to boosting your site's visibility with SEO, Azib's got you covered. Ready to take your website to the next level? Contact Azib today and start optimizing your WordPress journey!
