Issue: External resources (e.g., CSS, JavaScript, or images) hosted on another website and blocked by their robots.txt
file can prevent search engines from rendering your pages correctly, potentially harming your rankings.
Fix: If the blocked resources are critical for your site, contact the external website owner to request access. If they’re not essential, you can ignore the issue.
How to Fix for Beginners
- Identify Blocked Resources: Use Google Search Console’s URL Inspection tool to find which external resources are blocked (a script to verify this yourself follows this list).
  - Example: A font file hosted at https://externalsite.com/font.css is blocked by their robots.txt.
- Evaluate Importance: Determine if the resource affects how your site appears or functions.
  - Critical: CSS files used for layout or fonts impacting design.
  - Non-Critical: Images or scripts that don’t affect core functionality.
- Contact the Resource Owner: Reach out to the owner of the external site and ask them to unblock the resource in their robots.txt file (a sample rule follows this list).
  - Example: “Your file at https://externalsite.com/resource.css is blocked and affects my website’s rendering. Could you allow access?”
- Replace the Resource (If Necessary): If the resource is critical but cannot be unblocked, consider hosting a copy on your own server, provided the external site’s terms allow it (a download sketch follows this list).
- Ignore Non-Critical Resources: For resources that don’t impact your site’s functionality, no action is needed.
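To double-check what the URL Inspection tool reports, you can test a resource against the external site’s robots.txt yourself. Below is a minimal sketch using Python’s standard urllib.robotparser module; the URLs are placeholders from the examples above:

```python
import urllib.robotparser

# Hypothetical external resource flagged in Google Search Console.
resource_url = "https://externalsite.com/font.css"

# Parse the external site's robots.txt and ask whether
# Googlebot is allowed to fetch the resource.
parser = urllib.robotparser.RobotFileParser()
parser.set_url("https://externalsite.com/robots.txt")
parser.read()

if parser.can_fetch("Googlebot", resource_url):
    print(f"{resource_url} is crawlable by Googlebot.")
else:
    print(f"{resource_url} is blocked by robots.txt.")
```

If can_fetch returns False for Googlebot, the resource is indeed blocked and may affect how your pages render in search.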
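If the owner agrees to help, the change on their side is usually a short Allow rule. Here is a minimal sketch of what they might add to their robots.txt, assuming the blocked file sits at /font.css (the path is a placeholder):

```
User-agent: Googlebot
Allow: /font.css
```

For Google’s crawler, the more specific matching rule wins, so a targeted Allow like this can coexist with broader Disallow rules elsewhere in the file.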
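If the file cannot be unblocked and its license permits copying, you can mirror it on your own server instead. A minimal sketch, again with placeholder URLs and paths:

```python
import os
import urllib.request

# Hypothetical: mirror the external stylesheet locally so the page
# no longer depends on a blocked third-party resource.
# Check the external site's terms of use before copying files.
os.makedirs("assets", exist_ok=True)
urllib.request.urlretrieve(
    "https://externalsite.com/resource.css",
    "assets/resource.css",
)
# Then point your templates at /assets/resource.css instead of
# the external URL.
```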
Tip: Ensuring access to critical external resources helps maintain proper page rendering and SEO performance.
More articles relating to the robots.txt file:
- How to fix: Sitemap.xml not indicated in robots.txt
- How to fix: Issues with blocked internal resources in robots.txt
- How to fix: Format errors in Robots.txt file
- How to fix: Robots.txt not found
- How to fix: Pages blocked by X-Robots-Tag: noindex HTTP header