How to fix: Pages blocked by X-Robots-Tag: noindex HTTP header

When Semrush flags pages blocked by the X-Robots-Tag: noindex, it means they’re being excluded from search results—sometimes unintentionally. This setting can limit your visibility if misused, so it’s worth checking closely.
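A quick way to confirm the problem is to inspect the response headers for the affected pages (e.g. with `curl -I`). Purely as an illustration of the check, here's a minimal Python sketch that flags a noindex directive in a headers dict; the URLs and header values are placeholders:

```python
# Minimal sketch: detect a noindex directive in HTTP response headers.
# In practice you'd fetch the headers first (e.g. requests.head(url) or
# curl -I); here we test a plain dict so the logic is easy to follow.

def is_blocked_by_x_robots(headers: dict) -> bool:
    """Return True if the X-Robots-Tag header contains a noindex directive."""
    value = headers.get("X-Robots-Tag", "").lower()
    directives = [d.strip() for d in value.split(",")]
    # "none" is shorthand for "noindex, nofollow"
    return "noindex" in directives or "none" in directives

print(is_blocked_by_x_robots({"X-Robots-Tag": "noindex, nofollow"}))  # True
print(is_blocked_by_x_robots({"Content-Type": "text/html"}))          # False
```

Note this sketch treats header names as case-sensitive and ignores user-agent-scoped directives (e.g. `googlebot: noindex`), which a real audit script would also need to handle.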

How to fix: Orphaned pages (from Google Analytics)

Orphaned pages—those with no internal links—are regularly flagged in Semrush Site Audits, especially when spotted via Google Analytics. These pages are hard for users and search engines to find, reducing their visibility and value.

How to fix: Links to external pages or resources returned a 403 HTTP status code

Links returning a 403 status code are frequently reported in Semrush Site Audits. These errors indicate permission problems or blocked access to external resources, which can disrupt user flow and SEO signals.

How to fix: ‘Too long link URLs’ issue in Semrush Site Audit

Getting the ‘Too long link URLs’ error in Semrush? Here I break down what long URLs are, why they’re bad for SEO, and how to fix them using tools like Google Search Console. Perfect for beginners.

How to fix: Issues with unminified JavaScript and CSS files

Semrush often flags unminified JS and CSS files as performance bottlenecks. These files slow down your site and hurt Core Web Vitals. Minifying them improves load speed and overall site health.
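Real projects should use a build tool or dedicated minifier (e.g. terser for JS, cssnano for CSS). Purely to illustrate what minification does, here's a toy Python sketch that strips CSS comments and redundant whitespace:

```python
import re

def minify_css(css: str) -> str:
    """Toy CSS minifier: strips comments and collapses whitespace.
    Real minifiers (cssnano, csso) do far more and handle edge cases
    like strings and url() values that this sketch ignores."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # remove /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse whitespace runs
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)     # drop spaces around punctuation
    return css.strip()

css = """
/* main heading */
h1 {
    color: #333;
    margin: 0;
}
"""
print(minify_css(css))  # h1{color:#333;margin:0;}
```

The payoff is fewer bytes over the wire with identical rendering, which is exactly what the audit is asking for.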

How to fix: HTTP URLs in sitemap.xml for HTTPS site

Seeing HTTP URLs in your sitemap.xml is a common issue flagged by Semrush, especially on HTTPS-enabled sites. This mismatch can confuse crawlers and needs to be fixed for better indexing.
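The durable fix is to correct the sitemap generator (usually a CMS or plugin setting) so the problem doesn't return on the next rebuild. For a one-off cleanup of an existing file, a sketch like this rewrites `http://` locations to `https://` (the example markup is a placeholder):

```python
import re

def fix_sitemap_urls(xml: str) -> str:
    """One-off cleanup: rewrite http:// to https:// inside <loc> entries only,
    leaving the rest of the sitemap XML untouched."""
    return re.sub(r"(<loc>\s*)http://", r"\1https://", xml)

sitemap = "<urlset><url><loc>http://example.com/page</loc></url></urlset>"
print(fix_sitemap_urls(sitemap))
# <urlset><url><loc>https://example.com/page</loc></url></urlset>
```

Restricting the substitution to `<loc>` tags avoids accidentally rewriting other attributes or text in the file.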

How to fix: Pages with low text-HTML ratio

Learn how to streamline HTML, enhance page performance, and deliver a better user experience.

Web & SEO Projects

Curious how design and SEO come together to drive real results? JL Faverio’s Web & SEO Projects page showcases a powerful portfolio of client work that blends eye-catching design with high-performing optimization strategies.

How to fix: Sitemap.xml not indicated in robots.txt

If Semrush flagged the “Sitemap.xml not indicated in robots.txt” issue, don’t worry: it’s a quick fix. Adding your sitemap to robots.txt helps search engines find and crawl your site more efficiently, improving your SEO. Follow this simple guide to resolve the issue step by step.
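The fix itself is a single line in robots.txt, e.g. `Sitemap: https://example.com/sitemap.xml` (with your own domain). As a sketch of how you might automate the check, this appends the directive to a robots.txt body only if it's missing:

```python
def ensure_sitemap_directive(robots_txt: str, sitemap_url: str) -> str:
    """Append a Sitemap: line to robots.txt content if none is present.
    The Sitemap directive is standalone and can appear anywhere in the file."""
    has_sitemap = any(
        line.strip().lower().startswith("sitemap:")
        for line in robots_txt.splitlines()
    )
    if has_sitemap:
        return robots_txt
    return robots_txt.rstrip("\n") + f"\nSitemap: {sitemap_url}\n"

robots = "User-agent: *\nDisallow: /admin/\n"
print(ensure_sitemap_directive(robots, "https://example.com/sitemap.xml"))
```

The function is idempotent, so running it twice won't duplicate the line. Use an absolute URL for the sitemap; relative paths aren't part of the directive's accepted form.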

How ChatGPT’s Integration with Bing is Shaking Up SEO

ChatGPT is reshaping how Bing handles search, with impacts on SEO strategy. Discover how to stay ahead with conversational keywords and intent-based content.