How to fix: Issues with blocked external resources in robots.txt

Issue: External resources (e.g., CSS, JavaScript, or images) hosted on another website and blocked by that site's robots.txt file can prevent search engines from rendering your pages correctly, potentially harming your rankings.

Fix: If the blocked resources are critical for your site, contact the external website owner to request access. If they’re not essential, you can ignore the issue.
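You can verify the block yourself by reading the external site's robots.txt rules. A minimal sketch using Python's standard `urllib.robotparser` is shown below; the host `externalsite.com`, the paths, and the robots.txt content are hypothetical stand-ins (in practice you would fetch the live file from `https://externalsite.com/robots.txt`):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content from the external site; in practice,
# fetch https://externalsite.com/robots.txt instead of hard-coding it.
ROBOTS_TXT = """\
User-agent: *
Disallow: /assets/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

def is_blocked(url: str, agent: str = "Googlebot") -> bool:
    """Return True if robots.txt disallows `agent` from fetching `url`."""
    return not parser.can_fetch(agent, url)

print(is_blocked("https://externalsite.com/assets/resource.css"))  # True
print(is_blocked("https://externalsite.com/font.css"))             # False
```

Google renders pages as Googlebot, so checking against that user agent tells you whether the resource is reachable during rendering.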

How to Fix for Beginners

  1. Identify Blocked Resources: Use Google Search Console’s URL Inspection tool to find which external resources are blocked.
    • Example: A stylesheet that loads fonts, hosted at https://externalsite.com/font.css, is blocked by that site's robots.txt.
  2. Evaluate Importance: Determine if the resource affects how your site appears or functions.
    • Critical: CSS files used for layout or fonts impacting design.
    • Non-Critical: Images or scripts that don’t affect core functionality.
  3. Contact the Resource Owner: Reach out to the owner of the external site to request they unblock the resource in their robots.txt file.
    • Example: “Your file at https://externalsite.com/resource.css is blocked and affects my website’s rendering. Could you allow access?”
  4. Replace the Resource (If Necessary): If the resource is critical but cannot be unblocked, consider hosting a copy on your own server, provided the external site's terms of use allow it.
  5. Ignore Non-Critical Resources: For resources that don’t impact your site’s functionality, no action is needed.
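Step 4 above (self-hosting a critical resource) can be sketched in two parts: download the file once, then point your pages at the local copy. The URLs and paths below are hypothetical examples, and you should confirm that the external site's terms permit redistribution before doing this:

```python
import urllib.request
from pathlib import Path

# Hypothetical external resource and local destination.
RESOURCE_URL = "https://externalsite.com/resource.css"
LOCAL_PATH = Path("static/vendor/resource.css")

def mirror_resource(url: str, dest: Path) -> None:
    """Download the external file once and save it under your own domain."""
    dest.parent.mkdir(parents=True, exist_ok=True)
    with urllib.request.urlopen(url) as resp:
        dest.write_bytes(resp.read())

def rewrite_reference(html: str, old_url: str, new_path: str) -> str:
    """Point a page at the local copy instead of the blocked original."""
    return html.replace(old_url, new_path)

page = '<link rel="stylesheet" href="https://externalsite.com/resource.css">'
print(rewrite_reference(page, RESOURCE_URL, "/static/vendor/resource.css"))
# → <link rel="stylesheet" href="/static/vendor/resource.css">
```

Because the copy now lives on your own domain, your robots.txt (not the external site's) controls whether search engines can fetch it.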

Tip: Ensuring access to critical external resources helps maintain proper page rendering and SEO performance.