
How to fix: Sitemap.xml not indicated in robots.txt

Adding your sitemap to your robots.txt file helps search engines find and crawl your site efficiently.

Here’s a simple, step-by-step guide to resolve this issue and give your site’s SEO a nice boost.

 

Screenshot: Semrush site audit issue "Sitemap.xml not indicated in robots.txt"

1. Understand the Problem

When Semrush flags the “Sitemap.xml not indicated in robots.txt” issue, it’s telling you that search engines may not be able to locate your sitemap.

Without this indication, search engines might take longer to discover new or updated pages on your site, which can hurt your SEO.

 

Screenshot: robots.txt file missing a link to the sitemap

2. Locate Your Sitemap URL

Most often, your sitemap will be at https://yourdomain.com/sitemap.xml, but WordPress sites running an SEO plugin like Yoast or Rank Math usually serve a sitemap index at https://yourdomain.com/sitemap_index.xml instead.

To make sure, check your SEO plugin's settings or look for it manually by navigating to your site's root domain with /sitemap.xml at the end.
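Not sure which one you have? A tiny Python script (standard library only) can probe the usual spots. Here, yourdomain.com is a placeholder, and /wp-sitemap.xml is the default for recent WordPress versions without an SEO plugin:

import urllib.request
from urllib.error import HTTPError, URLError

# Common sitemap locations; replace the domain with your own.
BASE = "https://yourdomain.com"
CANDIDATES = ["/sitemap.xml", "/sitemap_index.xml", "/wp-sitemap.xml"]

for path in CANDIDATES:
    url = BASE + path
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            print(f"{url} -> HTTP {response.status}")
    except HTTPError as e:
        print(f"{url} -> HTTP {e.code}")
    except URLError as e:
        print(f"{url} -> unreachable ({e.reason})")

Whichever URL returns HTTP 200 (and serves XML) is the one to use in the next steps.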

 

Screenshot: file manager with no robots.txt file

3. Access Your robots.txt File

Your robots.txt file is typically located at https://yourdomain.com/robots.txt.

You can open it by typing this URL into your browser.

If you can’t find it there, use an FTP client or your hosting file manager to locate it in your site’s root directory.

Note: WordPress users with SEO plugins can usually access and edit robots.txt directly in the plugin settings.

4. Back Up Your robots.txt File

Before making any changes, save a copy of your existing robots.txt file. This way, if anything goes wrong, you can revert to your original file easily.

 

Screenshot: robots.txt file with a link to the sitemap

5. Add the Sitemap to robots.txt

Open your robots.txt file for editing and add this line at the end:

Sitemap: https://yourdomain.com/sitemap.xml

Replace https://yourdomain.com/sitemap.xml with the actual URL of your sitemap if it’s different. This directive tells search engines exactly where to find your sitemap.

Don't forget to use /sitemap_index.xml (rather than /sitemap.xml) on WordPress sites running an SEO plugin.
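For reference, here's a minimal sketch of what the finished file might look like; the Disallow and Allow rules are just illustrative, so keep whatever rules your file already has:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://yourdomain.com/sitemap.xml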

 

Screenshot: Google Search Console sitemap tool

6. Save and Test Your Changes

Once you've added the sitemap line, save the changes and head over to the Sitemaps section of your Google Search Console account.

Submit your sitemap URL there, then use Search Console's robots.txt report to confirm the updated file is readable and error-free.
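Prefer to check it yourself first? This minimal Python sketch (yourdomain.com is a placeholder) confirms that robots.txt declares a sitemap and that the declared sitemap actually loads:

import urllib.request

BASE = "https://yourdomain.com"

# Fetch robots.txt and pull out any Sitemap directives.
with urllib.request.urlopen(BASE + "/robots.txt", timeout=10) as response:
    robots = response.read().decode("utf-8", errors="replace")

sitemaps = [line.split(":", 1)[1].strip()
            for line in robots.splitlines()
            if line.lower().startswith("sitemap:")]

if not sitemaps:
    print("No Sitemap directive found in robots.txt")

# Check that each declared sitemap actually resolves.
for url in sitemaps:
    with urllib.request.urlopen(url, timeout=10) as response:
        print(f"{url} -> HTTP {response.status}")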

 

7. Re-run the Semrush Site Audit

Head back to Semrush and re-run the site audit to ensure the error is resolved. If you’ve correctly indicated the sitemap in your robots.txt, the issue should now be gone!

Bonus Tip: Make sure your sitemap is regularly updated with new pages. Most SEO plugins do this automatically, but it’s good to check once in a while to ensure everything on your site is crawlable.

 

Frequently Asked Questions

What's the difference between robots.txt and an XML sitemap?

  • Robots.txt: Tells search engines which pages or files to avoid crawling on your site (like admin pages or private content).
  • XML Sitemap: Lists all the important pages you want search engines to find and crawl, helping them understand your site's structure and prioritize content.

Think of robots.txt as a "do not enter" sign, while the XML sitemap is more like a "here's the map" guide.

What should I exclude from my XML sitemap?

  • Admin pages (like /wp-admin/)
  • Login pages (e.g., /login or /my-account)
  • Duplicate content (like category and tag archives, if not needed)
  • Thin or low-value pages (e.g., privacy policy, terms, or paginated content)

Keep it focused on high-value pages to make crawling efficient.

How do I add an XML sitemap?

  • Use a Plugin: For WordPress, plugins like Yoast SEO or Rank Math automatically generate a sitemap for you. Just install, activate, and configure.
  • Submit to Search Engines: Go to Google Search Console and Bing Webmaster Tools, find the "Sitemaps" section, and enter your sitemap URL (usually yourdomain.com/sitemap.xml).
  • Verify: Check back in the tools to ensure it's indexed correctly.

How do I fix an XML sitemap?

  • Check for Errors: Use Google Search Console to find errors or warnings in the "Sitemaps" section.
  • Regenerate the Sitemap: If using a plugin, clear the cache and regenerate the sitemap (common in Yoast or Rank Math settings).
  • Review URLs: Make sure only essential URLs are included (no 404s, redirects, or blocked pages).
  • Validate the Sitemap: Run it through an online sitemap validator to ensure proper formatting (see the short sketch after this list for a DIY check).
  • Resubmit: After fixing, resubmit the sitemap in Google Search Console.

Following these steps should clear up common sitemap issues.
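If you'd rather not rely on an online validator, here's a rough Python sketch of that DIY check: it fetches the sitemap (the URL is a placeholder), confirms it's well-formed XML, and counts its entries, assuming the standard sitemaps.org format:

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://yourdomain.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL, timeout=10) as response:
    data = response.read()

try:
    root = ET.fromstring(data)
except ET.ParseError as e:
    raise SystemExit(f"Sitemap is not well-formed XML: {e}")

# A sitemap index lists child sitemaps; a regular sitemap lists page URLs.
locs = [loc.text for loc in root.findall(".//sm:loc", NS)]
print(f"Root element: {root.tag}")
print(f"Found {len(locs)} <loc> entries")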

 

More articles relating to Sitemaps:


How to fix: Missing the viewport width value

Issue: Your page is missing the necessary viewport settings, which can make it look bad or unusable on mobile devices.

Fix: Add the width and initial-scale values to your viewport meta tag. Ask a developer if you need help.

Tip: Proper viewport settings improve your site’s mobile experience.
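If you want to apply the fix yourself, the widely used default looks like this; it goes inside your page's <head>:

<meta name="viewport" content="width=device-width, initial-scale=1">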

How to fix: Malformed links

Issue: Malformed links have errors (like typos or wrong characters) that prevent them from being crawled or used properly.

Fix: Correct any link errors, ensuring URLs follow the standard format and are free of extra characters or mistakes.

Tip: Clean links help search engines and users navigate your site smoothly.

How to fix: Invalid structured data items

Issue: Structured data errors make it hard for search engines to understand your content, reducing your chances of getting rich snippets and better rankings.

Fix: Use the Rich Results Test tool to identify and fix errors in your structured data, ensuring it meets Google’s guidelines.

Tip: Valid structured data can improve your visibility in search results.
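As a point of reference, a minimal JSON-LD block for an article might look like the sketch below; every value is a placeholder, and the exact required properties depend on the rich result type you're targeting:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example headline",
  "datePublished": "2024-01-01",
  "author": { "@type": "Person", "name": "Author Name" }
}
</script>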

How to fix: Pages with slow load speed

Issue: Slow page load speeds hurt your rankings and user experience. Faster pages perform better in search results and can boost conversions.

Fix: Optimize your HTML code to make it leaner. If your server is slow, consider upgrading to better hosting.

Tip: Quick-loading pages improve SEO and keep visitors engaged.

How to fix: Sitemap.xml files are too large

Issue: Your sitemap.xml is too big (over 50 MB or 50,000 URLs), which may prevent search engines from crawling your site properly.

Fix: Split your sitemap into smaller files and use a sitemap index file to list them. Update your robots.txt with the new sitemap locations.

Tip: Smaller sitemaps ensure efficient crawling and indexing by search engines.
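For reference, a sitemap index is itself a small XML file that just points at the child sitemaps; here's a minimal sketch in the standard sitemaps.org format (URLs are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://yourdomain.com/sitemap-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://yourdomain.com/sitemap-2.xml</loc>
  </sitemap>
</sitemapindex>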

 


How to fix: Subdomains don’t support secure encryption algorithms

Issue: Outdated encryption on your subdomains poses security risks and can lead to browser warnings, scaring off visitors and lowering traffic.

Fix: Ask your website admin to update to modern encryption algorithms.

Tip: Up-to-date security builds trust and improves site safety.

How to fix: Issues with broken internal JavaScript and CSS files

Issue: Broken JavaScript or CSS files can prevent search engines from correctly rendering your pages, harming your rankings. They may also cause errors that disrupt user experience on your site.

Fix: Identify and repair all broken JavaScript and CSS files. Regularly monitor these files to catch any new issues.

Recommendations:

  1. Use Browser Dev Tools: Open your site in a browser and use developer tools (e.g., Chrome DevTools) to spot errors in the Console tab for broken scripts or styles.
  2. Check File Paths: Ensure that file paths are correct and that files are located in their expected directories, especially after site updates or migrations.
  3. Monitor with Site Audits: Use tools like Google Search Console or third-party SEO tools to scan for broken resources regularly, or run a quick DIY check (see the sketch after this list).
  4. Minify and Consolidate: Minifying or combining CSS and JavaScript files can cut load time and lower the chance of a single broken file affecting your site.
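Here's a rough sketch of that DIY check in Python (standard library only; the page URL is a placeholder). It parses a page for linked scripts and stylesheets, then reports the HTTP status of each:

import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

PAGE = "https://yourdomain.com/"  # placeholder page to check

class AssetCollector(HTMLParser):
    # Collects the URLs of linked stylesheets and external scripts.
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and attrs.get("src"):
            self.assets.append(attrs["src"])
        if tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.assets.append(attrs["href"])

with urllib.request.urlopen(PAGE, timeout=10) as response:
    parser = AssetCollector()
    parser.feed(response.read().decode("utf-8", errors="replace"))

# Request each asset and report anything that doesn't come back 200.
for asset in parser.assets:
    url = urljoin(PAGE, asset)
    try:
        with urllib.request.urlopen(url, timeout=10) as r:
            status = f"HTTP {r.status}"
    except Exception as e:
        status = f"error ({e})"
    print(f"{url} -> {status}")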

 


 

More articles relating to Broken elements:

How to fix: Pages with a meta refresh tag

Issue: Meta refresh tags are outdated and can cause slow redirects, hurting SEO and user experience.

Fix: Replace meta refresh tags with 301 redirects for a faster, more reliable redirection method.

Tip: 301 redirects improve site efficiency and search engine optimization.
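How you implement the 301 depends on your server. On Apache, for example, a one-line rule in .htaccess does it (both paths are placeholders):

Redirect 301 /old-page/ https://yourdomain.com/new-page/

WordPress users can achieve the same thing with a redirect plugin instead of editing server files.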

How to fix: Pages with multiple canonical URLs

Issue: Having multiple canonical URLs on one page confuses search engines, making them ignore or choose the wrong one, hurting your SEO.

Fix: Keep only one canonical URL on each page to clearly indicate the preferred version.

Tip: One clear canonical tag helps search engines understand your content better.
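For reference, the canonical tag lives in the page's <head> and should appear exactly once, pointing at the preferred URL (a placeholder here):

<link rel="canonical" href="https://yourdomain.com/preferred-page/">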