How to fix: Issues with blocked internal resources in robots.txt

Issue: Blocking internal resources like CSS, JavaScript, or images in your robots.txt file prevents search engines from fully rendering your webpages. This can negatively affect indexing and rankings.

Fix: Update your robots.txt file to allow access to essential resources.

How to Fix for Beginners

  1. Identify Blocked Resources: Use a tool such as Google Search Console’s URL Inspection tool to see which resources are blocked.
    • Example: If CSS or JavaScript files like /assets/styles.css or /scripts/main.js are blocked, the URL Inspection tool lists them as page resources that could not be loaded.
  2. Check Your robots.txt File: Locate the Disallow directives in your robots.txt file.
    • Example: Disallow: /assets/ blocks all files in the /assets/ folder, including critical CSS or JS.
  3. Allow Essential Resources: Remove or update the Disallow directive to ensure search engines can access key resources.
    • Example: Change:

      Disallow: /assets/

      To:

      Allow: /assets/styles.css
      Allow: /scripts/main.js

      (A complete example file combining Disallow and Allow rules appears after this list.)
  4. Test the Fix: Use the robots.txt report in Google Search Console (the successor to the retired robots.txt Tester) to confirm that the updated file no longer blocks critical resources. You can also check individual paths programmatically, as in the Python sketch after this list.
  5. Validate Page Rendering: After unblocking, re-inspect an affected page in Google Search Console (URL Inspection → Test live URL) to confirm it renders properly.
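
For reference, here is a minimal sketch of the combined file, using the example paths from this guide (/assets/ and /scripts/ are placeholders; substitute your site’s real folders):

    User-agent: *
    # Keep the folders blocked for crawling...
    Disallow: /assets/
    Disallow: /scripts/
    # ...but explicitly re-allow the files pages need to render. Google applies
    # the most specific (longest) matching rule, so these Allow lines override
    # the broader Disallow lines above.
    Allow: /assets/styles.css
    Allow: /scripts/main.js

Because Google follows the most specific matching rule, an Allow directive for a file wins over a Disallow for its parent folder, letting you keep non-essential files blocked while unblocking only the CSS and JavaScript that rendering depends on.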
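
To double-check paths yourself, the following Python sketch uses the standard library’s urllib.robotparser to test whether Googlebot may fetch specific resources. The domain and paths are hypothetical placeholders, and Python’s parser follows the original robots.txt draft, so edge cases can differ from Google’s longest-match behavior; treat the Search Console report as authoritative:

    # Check whether Googlebot may fetch specific resources under your live robots.txt.
    # Standard library only; SITE and RESOURCES are placeholders for this example.
    from urllib.robotparser import RobotFileParser

    SITE = "https://www.example.com"
    RESOURCES = ["/assets/styles.css", "/scripts/main.js"]

    parser = RobotFileParser()
    parser.set_url(f"{SITE}/robots.txt")
    parser.read()  # downloads and parses the live robots.txt file

    for path in RESOURCES:
        allowed = parser.can_fetch("Googlebot", f"{SITE}{path}")
        print(f"{path}: {'allowed' if allowed else 'BLOCKED'}")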

Tip: Allowing access to essential resources lets search engines fully render, understand, and index your pages, which supports better rankings.