How to fix: Resources formatted as page link

Issue: Using <a href> tags to link directly to resources (such as images) rather than to pages can confuse search engines, which may then treat those resource URLs as if they were webpages. This can signal poor site structure.

Fix: Replace these <a href> links with the appropriate tags. For instance, embed images with <img> tags, and reserve <a href> for links that point to actual pages.

Recommendations

  1. Use <img> Tags for Images: Ensure images are embedded with <img src="image-url.jpg" alt="description"> rather than <a href="image-url.jpg">.
  2. Check Media and Downloads: If you link to downloadable files (such as PDFs), make the anchor text say so (e.g., “Download the PDF guide”) so users and crawlers know the link leads to a file rather than a page.
  3. Review for Consistency: Regularly audit your site to make sure all <a href> links point to actual pages, not standalone resources.
  4. Add Alt Text for Accessibility: Always include descriptive alt attributes for images to improve SEO and accessibility.

Tip: Structuring links correctly helps search engines understand your site and improves user experience.
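
If you want to check for this issue in bulk rather than page by page, here is a minimal Python sketch. It assumes the third-party requests and beautifulsoup4 packages are installed; the URL and the file-extension list are placeholders to adjust for your site. It simply flags <a href> links whose targets look like resource files.

  # Minimal sketch: flag <a href> links that point to resource files instead of pages.
  # Assumes the third-party "requests" and "beautifulsoup4" packages; the URL is a placeholder.
  import requests
  from bs4 import BeautifulSoup

  RESOURCE_EXTENSIONS = (".jpg", ".jpeg", ".png", ".gif", ".webp", ".svg", ".css", ".js", ".pdf")

  def find_resource_links(page_url):
      html = requests.get(page_url, timeout=10).text
      soup = BeautifulSoup(html, "html.parser")
      flagged = []
      for a in soup.find_all("a", href=True):
          # Strip query strings and fragments before checking the extension.
          href = a["href"].split("?")[0].split("#")[0].lower()
          if href.endswith(RESOURCE_EXTENSIONS):
              flagged.append(a["href"])
      return flagged

  for link in find_resource_links("https://example.com/"):
      print("Resource linked as a page:", link)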

How to fix: Links with no anchor text

Issue: Links whose anchor text is missing, a raw URL, or just symbols provide no context about the target page. This hurts user experience and search engine indexing, and misses an opportunity to optimize the linked page’s performance in search results.

Fix: Add clear, descriptive anchor text that explains what the linked page is about.

How to Fix for Beginners

  1. Identify Links with No Anchor Text: Look for links on your site that display as raw URLs or use only symbols.
    • Example: <a href="https://example.com"></a> or <a href="https://example.com">!!</a>.
  2. Write Descriptive Anchor Text: Replace empty or raw anchors with meaningful text that describes the target page.
    • Example: Change https://example.com to <a href="https://example.com">Learn More About SEO Basics</a>.
  3. Use Relevant Keywords: Include keywords that describe the content of the linked page, but keep it natural and concise.
    • Example: Use “SEO Starter Guide” instead of “Click here.”
  4. Avoid Overly Generic Text: Avoid anchors like “here” or “this page” and aim for specificity.
    • Example: Replace “Click here” with “Explore our SEO tools.”
  5. Audit Links Regularly: Use SEO tools to scan your site for links with missing or unhelpful anchor text and update them.

Tip: Descriptive anchor text helps search engines and users understand the linked content, improving usability and SEO.
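
To find these links in bulk, the sketch below is a minimal Python example, assuming the third-party requests and beautifulsoup4 packages and a placeholder URL. It flags anchors whose visible text is empty, a bare URL, or symbols only.

  # Minimal sketch: list links whose anchor text is empty, a bare URL, or only symbols.
  # Assumes the third-party "requests" and "beautifulsoup4" packages; the URL is a placeholder.
  import re
  import requests
  from bs4 import BeautifulSoup

  def find_weak_anchors(page_url):
      soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
      weak = []
      for a in soup.find_all("a", href=True):
          text = a.get_text(strip=True)
          looks_like_url = text.lower().startswith(("http://", "https://", "www."))
          only_symbols = bool(text) and not re.search(r"[A-Za-z0-9]", text)
          if not text or looks_like_url or only_symbols:
              weak.append((a["href"], text or "<empty>"))
      return weak

  for href, text in find_weak_anchors("https://example.com/"):
      print(f"{href} -> anchor text: {text}")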

How to fix: URLs with a permanent redirect

Issue: Permanent redirects (301/308) are useful but can waste crawl budget if overused. Too many redirects can also confuse users and slow down their browsing experience.

Fix: Review your permanent redirects and replace them with direct target URLs wherever possible.

How to Fix for Beginners

  1. Identify Redirects: Use an SEO tool to find URLs that redirect users to another page using 301 or 308 redirects.
    • Example: https://example.com/old-page permanently redirects to https://example.com/new-page.
  2. Update Internal Links: Replace links to the redirected URL with links to the final destination page.
    • Example: Change links pointing to https://example.com/old-page to point directly to https://example.com/new-page.
  3. Limit Chains: Avoid redirect chains where a URL redirects multiple times before reaching the final page.
    • Example: Instead of Page A -> Page B -> Page C, make it Page A -> Page C.
  4. Check for Unnecessary Redirects: Remove redirects that no longer serve a purpose or are outdated.
    • Example: If https://example.com/temp-redirect was a temporary fix, delete it and use the final page link.
  5. Monitor Crawl Efficiency: Use tools like Google Search Console to track your crawl budget and ensure search engines can efficiently access your site.

Tip: Minimizing unnecessary redirects improves crawl efficiency, speeds up navigation, and enhances user experience.
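
To spot internal links that still point at permanently redirected URLs, a small Python sketch like the one below (using the third-party requests package; the URLs are placeholders) follows each link and reports the final destination you should link to directly.

  # Minimal sketch: check a list of URLs for permanent redirects (301/308)
  # and report the final destination. "requests" is a third-party package;
  # the URLs are placeholders.
  import requests

  urls_to_check = [
      "https://example.com/old-page",
      "https://example.com/new-page",
  ]

  for url in urls_to_check:
      resp = requests.get(url, allow_redirects=True, timeout=10)
      permanent_hops = [r for r in resp.history if r.status_code in (301, 308)]
      if permanent_hops:
          print(f"{url} redirects to {resp.url} in {len(resp.history)} hop(s); link to the final URL instead.")

Once this report comes back empty, internal links reach their targets in a single request, which saves crawl budget and keeps navigation fast.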

How to fix: Issues with broken external JavaScript and CSS files

Issue: Broken external JavaScript or CSS files hosted on another site can prevent your webpages from rendering properly. This impacts user experience and may harm your search rankings.

Fix: Contact the owner of the external site to fix the broken file or replace it with a functional alternative.

How to Fix for Beginners

  1. Identify Broken Files: Use browser developer tools or an SEO audit to find which external JavaScript or CSS files are broken.
    • Example: https://example.com/styles.css or https://example.com/script.js is not loading.
  2. Test File Accessibility: Paste the file URL directly into a browser to see if it loads. If it doesn’t, it’s broken.
  3. Contact the File Owner: Reach out to the external website hosting the file and request a fix.
    • Example: “The JavaScript file at https://example.com/script.js is not working. Could you check and resolve the issue?”
  4. Find a Replacement: If the file is critical and can’t be fixed, search for an alternative or consider hosting a working version on your own server (if allowed).
    • Example: Download the necessary script or CSS file and host it at https://yourdomain.com/scripts.js.
  5. Test Your Site: After resolving the issue, test your site to ensure all scripts and styles work as intended.

Tip: Regularly monitor external resources to avoid disruptions and maintain your website’s functionality and SEO performance.
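
As a quick check before contacting anyone, a small Python sketch such as the one below (assuming the third-party requests and beautifulsoup4 packages; the page URL is a placeholder) lists external scripts and stylesheets on a page that fail to load.

  # Minimal sketch: find external <script src> and stylesheet <link href> URLs
  # that fail to load. Assumes the third-party "requests" and "beautifulsoup4"
  # packages; the page URL is a placeholder.
  import requests
  from bs4 import BeautifulSoup
  from urllib.parse import urljoin, urlparse

  def check_external_assets(page_url):
      soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
      page_host = urlparse(page_url).netloc
      assets = []
      for tag in soup.find_all("script", src=True):
          assets.append(urljoin(page_url, tag["src"]))
      for tag in soup.find_all("link", rel="stylesheet", href=True):
          assets.append(urljoin(page_url, tag["href"]))
      for asset in assets:
          if urlparse(asset).netloc and urlparse(asset).netloc != page_host:
              try:
                  status = requests.head(asset, allow_redirects=True, timeout=10).status_code
              except requests.RequestException:
                  status = "unreachable"
              if status != 200:
                  print(f"Broken external resource: {asset} ({status})")

  check_external_assets("https://example.com/")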

How to fix: Issues with blocked external resources in robots.txt

Issue: External resources (e.g., CSS, JavaScript, or images) hosted on another website and blocked by their robots.txt file can prevent search engines from rendering your pages correctly, potentially harming your rankings.

Fix: If the blocked resources are critical for your site, contact the external website owner to request access. If they’re not essential, you can ignore the issue.

How to Fix for Beginners

  1. Identify Blocked Resources: Use Google Search Console’s URL Inspection tool to find which external resources are blocked.
    • Example: A font file hosted at https://externalsite.com/font.css is blocked by their robots.txt.
  2. Evaluate Importance: Determine if the resource affects how your site appears or functions.
    • Critical: CSS files used for layout or fonts impacting design.
    • Non-Critical: Images or scripts that don’t affect core functionality.
  3. Contact the Resource Owner: Reach out to the owner of the external site to request they unblock the resource in their robots.txt file.
    • Example: “Your file at https://externalsite.com/resource.css is blocked and affects my website’s rendering. Could you allow access?”
  4. Replace the Resource (If Necessary): If the resource is critical but cannot be unblocked, consider hosting it locally on your server if allowed by the external site’s terms.
  5. Ignore Non-Critical Resources: For resources that don’t impact your site’s functionality, no action is needed.

Tip: Ensuring access to critical external resources helps maintain proper page rendering and SEO performance.
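
To test this yourself, the minimal Python sketch below (using the standard-library robots.txt parser together with the third-party requests and beautifulsoup4 packages; URLs are placeholders) checks each external resource on a page against the hosting site’s robots.txt.

  # Minimal sketch: check whether externally hosted resources on a page are
  # blocked by the hosting site's robots.txt. Assumes the third-party "requests"
  # and "beautifulsoup4" packages; URLs and the user agent are placeholders.
  import requests
  from bs4 import BeautifulSoup
  from urllib.parse import urljoin, urlparse
  from urllib.robotparser import RobotFileParser

  def blocked_external_resources(page_url, user_agent="Googlebot"):
      soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
      page_host = urlparse(page_url).netloc
      resources = set()
      for tag, attr in (("script", "src"), ("img", "src"), ("link", "href")):
          for el in soup.find_all(tag):
              if el.get(attr):
                  resources.add(urljoin(page_url, el[attr]))
      parsers = {}  # one robots.txt parser per external host
      for res in resources:
          host = urlparse(res).netloc
          if not host or host == page_host:
              continue  # only externally hosted resources matter here
          if host not in parsers:
              rp = RobotFileParser(f"https://{host}/robots.txt")
              rp.read()
              parsers[host] = rp
          if not parsers[host].can_fetch(user_agent, res):
              print("Blocked by external robots.txt:", res)

  blocked_external_resources("https://example.com/")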

How to fix: Pages blocked by X-Robots-Tag: noindex HTTP header

Issue: Pages served with the X-Robots-Tag: noindex HTTP header are excluded from search engine indexes, so they won’t appear in search results even though they can still be crawled.

Fix: Check your HTTP headers to ensure that valuable content isn’t accidentally blocked by the X-Robots-Tag: noindex.

How to Fix for Beginners

  1. Identify Affected Pages: Use SEO tools or browser developer tools to find pages with the X-Robots-Tag: noindex header.
    • Example: Your blog page is marked with X-Robots-Tag: noindex, but it should be indexed.
  2. Review Intent: Confirm if blocking the page was intentional (e.g., admin or test pages) or a mistake.
    • Example: You might want admin-dashboard.html blocked but not your main blog page.
  3. Update the HTTP Header: Remove or modify the X-Robots-Tag: noindex directive for important pages.
    • Example: Remove X-Robots-Tag: noindex from the blog page’s server configuration or CMS settings.
  4. Check Non-HTML Files: Ensure non-HTML resources, like PDFs, that need indexing are not unintentionally blocked.
  5. Test Crawling: Use Google Search Console’s URL Inspection tool to confirm that the page can now be indexed.

Tip: Properly configuring X-Robots-Tag ensures that search engines can index valuable content while ignoring irrelevant or sensitive pages.
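
To verify the headers yourself, the short Python sketch below (using the third-party requests package; the URLs are placeholders) reports any page served with an X-Robots-Tag header that contains noindex.

  # Minimal sketch: report pages whose X-Robots-Tag HTTP header contains "noindex".
  # "requests" is a third-party package; the URLs are placeholders.
  import requests

  pages = [
      "https://example.com/blog/",
      "https://example.com/admin-dashboard.html",
  ]

  for url in pages:
      headers = requests.get(url, timeout=10).headers
      tag = headers.get("X-Robots-Tag", "")
      if "noindex" in tag.lower():
          print(f"{url} is served with X-Robots-Tag: {tag}")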

How to fix: Orphaned pages (in sitemap)

Issue: Orphaned pages in your sitemap.xml are not linked internally, making them harder for users to find and potentially wasting your crawl budget. Search engines may still crawl these pages, even if they’re outdated or unimportant.

Fix: Review orphaned pages in your sitemap.xml and either link them internally, remove them, or decide if they can remain as-is.

How to Fix for Beginners

  1. Identify Orphaned Pages: Use tools like Google Search Console or SEO audit software to locate orphaned pages in your sitemap.
    • Example: If https://example.com/page3 is an orphaned page, ensure it’s useful and still relevant.
  2. Link Internally: For pages with valuable content, add links to them from other relevant pages on your site.
    • Example: Add a link to https://example.com/page3 in a related blog post or navigation menu.
  3. Remove Outdated Pages: If the page is no longer useful, delete it and update your sitemap to reflect the change.
    • Example: Remove https://example.com/old-page from your sitemap and delete the file.
  4. Leave Special-Purpose Pages: If the page serves a specific purpose (like a landing page for ads), you can leave it unlinked but ensure it’s still needed.

Tip: Linking valuable pages internally improves SEO and user navigation while optimizing your crawl budget.
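
One way to surface these pages is to compare the URLs in your sitemap with the URLs your crawl actually found links to. The minimal Python sketch below assumes you have exported the internally linked URLs to a file called linked_urls.txt (one URL per line); the sitemap URL and file name are placeholders.

  # Minimal sketch: list sitemap URLs that never appear in your internal-link export.
  # "requests" is a third-party package; the sitemap URL and file name are placeholders.
  import requests
  import xml.etree.ElementTree as ET

  SITEMAP_URL = "https://example.com/sitemap.xml"
  NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

  sitemap_xml = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
  sitemap_urls = {loc.text.strip() for loc in sitemap_xml.iter(NS + "loc")}

  with open("linked_urls.txt") as f:
      linked_urls = {line.strip() for line in f if line.strip()}

  for url in sorted(sitemap_urls - linked_urls):
      print("In sitemap but not linked internally:", url)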

How to fix: Orphaned pages (from Google Analytics)

Issue: Orphaned pages are pages on your site that aren’t linked to any other internal pages. These pages miss out on SEO benefits, like link juice, and may confuse users if they no longer serve a purpose.

Fix: Identify orphaned pages through Google Analytics and either remove them, link to them, or leave them if they serve a specific purpose.

How to Fix for Beginners

  1. Identify Orphaned Pages: Use tools like Google Analytics to find pages that have traffic but were not crawled by your site audit tool.
    • Example: A blog post at https://example.com/hidden-blog is getting traffic but isn’t linked anywhere on your site.
  2. Review Each Page:
    • If outdated or irrelevant, delete the page.
    • If valuable, ensure it’s linked to from other relevant pages on your site.
    • Example: Link to https://example.com/hidden-blog from your homepage or a related blog category.
  3. Link Strategically: Add links to orphaned pages using meaningful anchor text to ensure they are easy for users and search engines to find.
    • Example: In a related article, add: “Read more about this in our hidden blog.”
  4. Consider Special-Purpose Pages: For pages that serve a niche purpose (like a campaign-specific landing page), ensure they’re still relevant and functioning as intended. No linking might be acceptable for such cases.

Tip: Regularly review and link to orphaned pages to maximize their SEO potential and improve user navigation.
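
One way to cross-reference the two data sources is shown in the Python sketch below. It assumes a Google Analytics page report exported as a CSV with a “Page path” column and a list of internally linked URLs in linked_urls.txt; the file names, column name, and domain are placeholders to adjust to your own exports.

  # Minimal sketch: pages that receive traffic (per an analytics export) but have
  # no internal links pointing to them. File names, the CSV column name, and the
  # domain are placeholders.
  import csv
  from urllib.parse import urljoin

  SITE_ROOT = "https://example.com/"

  with open("analytics_pages.csv", newline="") as f:
      analytics_urls = {urljoin(SITE_ROOT, row["Page path"].strip()) for row in csv.DictReader(f)}

  with open("linked_urls.txt") as f:
      linked_urls = {line.strip() for line in f if line.strip()}

  for url in sorted(analytics_urls - linked_urls):
      print("Gets traffic but has no internal links:", url)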

How to fix: Pages that were blocked from crawling

Issue: Pages that search engines can’t crawl (blocked by robots.txt) or index (blocked by noindex directives) won’t appear in search results, limiting your visibility.

Fix: Review your robots.txt and noindex tags to ensure that no valuable content is accidentally blocked from crawling.

Recommendations

  1. Check robots.txt: Verify that only pages you intentionally want hidden (e.g., admin pages) are blocked, and valuable content pages are accessible.
  2. Review noindex Tags: Make sure pages you want to rank don’t have noindex tags in their HTML.
  3. Run a Site Audit: Use Google Search Console or an SEO tool to check for blocked pages and confirm that key pages are open to search engines.
  4. Monitor Regularly: After updates, ensure that newly added or modified pages are not mistakenly blocked.

Tip: Allowing important pages to be crawled ensures they can appear in search results, helping with overall site visibility.
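
For a quick self-check, the minimal Python sketch below (assuming the third-party requests and beautifulsoup4 packages; the URLs are placeholders) tests a list of pages you want indexed against your robots.txt rules and looks for meta robots noindex tags.

  # Minimal sketch: check pages you want indexed for robots.txt blocks and
  # meta robots noindex tags. Assumes the third-party "requests" and
  # "beautifulsoup4" packages; the URLs are placeholders.
  import requests
  from bs4 import BeautifulSoup
  from urllib.robotparser import RobotFileParser

  important_pages = [
      "https://example.com/",
      "https://example.com/services/",
  ]

  rp = RobotFileParser("https://example.com/robots.txt")
  rp.read()

  for url in important_pages:
      if not rp.can_fetch("Googlebot", url):
          print(f"{url} is disallowed by robots.txt")
          continue
      soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
      meta = soup.find("meta", attrs={"name": "robots"})
      if meta and "noindex" in (meta.get("content") or "").lower():
          print(f"{url} carries a meta robots noindex tag")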

How to fix: Pages with hreflang language mismatch issues

Issue: If the language specified in your hreflang tag doesn’t match the actual language of your page, search engines may misunderstand your content’s language, causing display issues in search results.

Fix: Review and correct hreflang attributes on affected pages to ensure they accurately reflect the page’s language.

Recommendations

  1. Match Language Codes: Verify that each page’s hreflang language code matches the language used in the content (e.g., en for English).
  2. Double-Check for Multilingual Pages: If your page contains multiple languages, check that each hreflang tag accurately specifies the content language for each variant.
  3. Use Correct Region Codes: If applicable, add regional codes to indicate country-specific content (e.g., en-US for English in the U.S.).
  4. Run Regular Audits: Periodically audit your hreflang tags to catch mismatches early, especially after adding or updating localized content.

Tip: Accurate hreflang tags help search engines deliver the right language version to the right audience.
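
As a rough automated check, the Python sketch below (assuming the third-party requests and beautifulsoup4 packages; the URL is a placeholder) compares a page’s <html lang> attribute with the language code in its self-referencing hreflang annotation and flags mismatches. It only catches pages that declare a lang attribute, so a full audit still needs a human review of the actual content language.

  # Minimal sketch: flag pages whose self-referencing hreflang language code does
  # not match the <html lang> attribute. Assumes the third-party "requests" and
  # "beautifulsoup4" packages; the URL is a placeholder, and pages without an
  # <html lang> attribute are skipped.
  import requests
  from bs4 import BeautifulSoup

  def check_hreflang(page_url):
      soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
      html_tag = soup.find("html")
      page_lang = (html_tag.get("lang") or "").split("-")[0].lower() if html_tag else ""
      if not page_lang:
          return
      for link in soup.find_all("link", rel="alternate", hreflang=True):
          if link.get("href") == page_url:
              hreflang_lang = link["hreflang"].split("-")[0].lower()
              if hreflang_lang != page_lang:
                  print(f"{page_url}: hreflang is '{link['hreflang']}' but the page declares lang='{html_tag.get('lang')}'")

  check_hreflang("https://example.com/en/")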
