An XML sitemap is a dedicated URL on your site where all of your posts and pages are listed. Search engines such as Google and Bing read this structured format to understand your website and crawl your content. You can use this tool to generate your Blogger sitemap easily.
There are various other websites that generate this type of sitemap, but keep in mind that most of them only handle your posts, not your pages. I recommend using the tool above to create a Blogger-compatible sitemap that covers both your posts and your pages.
Hence, it's necessary to use the right sitemap for your website.
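For context, a sitemap is just an XML file that lists your URLs in a standard format. A minimal example is sketched below; the domain, URLs, and dates are placeholders, so your generated file will differ:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per post or page -->
  <url>
    <loc>https://yourblog.blogspot.com/2024/01/example-post.html</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://yourblog.blogspot.com/p/about.html</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```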
Why is this sitemap considered the best?
- Both your posts and your pages get indexed, because search engines can easily read the sitemap's structured data.
- You can test your rich structured data with Google's testing tools, such as the Rich Results Test.
- The sitemap lists each of your articles in an optimized format, which helps search engines index them faster.
Key Features of the XML Sitemap
- It helps fix common indexing issues.
- It covers your Blogger pages as well as your posts.
- It is easy to set up, with no coding required.
- It works with all major search engines.
Configuring XML Sitemap in Blogger Dashboard
- Navigate to Blogger Control Panel → Settings → Crawlers and Indexing → Custom robots.txt.
- Copy the XML sitemap code generated by the tool above.
- Paste it into the Custom robots.txt field (a sample robots.txt is shown after these steps).
- Press Save Changes.
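If you want to sanity-check the generated code, a typical custom robots.txt for a Blogger blog looks like the sketch below; yourblog.blogspot.com is a placeholder for your own domain, and the sitemap lines assume Blogger's default sitemap locations:

```
User-agent: *
# Keep Blogger's internal search and label result pages out of crawling
Disallow: /search
Allow: /

# Post sitemap and page sitemap
Sitemap: https://yourblog.blogspot.com/sitemap.xml
Sitemap: https://yourblog.blogspot.com/sitemap-pages.xml
```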
Submitting XML Sitemap to Google Search Console
- Access your Google Search Console account dashboard.
- Click on Sitemaps in the navigation panel.
- Copy the generated sitemap URLs (both the post and page sitemaps); the default paths are listed after these steps.
- Enter each URL individually and press Submit.
- Review the reference images below for step-by-step guidance.
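Assuming Blogger's default sitemap locations, these are the two sitemap paths to submit for a standard Blogger blog (you can also submit the full URLs):

```
sitemap.xml         # all posts
sitemap-pages.xml   # static pages
```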
Configuring Custom Robots Header Tags in Blogger
- Access Blogger Dashboard → Settings → Crawlers and Indexing.
- Turn on the Custom robots header tags feature.
- Apply crawling settings as demonstrated in the reference images below:
(i) Homepage, Posts, and Pages:
- Follow the configuration shown in the reference image.
- Set these to index and follow so that your homepage, articles, and static pages receive proper search engine indexing.
(ii) Archive and Label Pages:
- Apply the configuration displayed in the reference image.
- Set Noindex and Noodp, since archive and label pages only duplicate your post content and should stay out of the index (see the summary after this list).
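For quick reference, the header tag settings described above can be summarized like this:

```
Homepage, posts, and pages : index, follow
Archive and label pages    : noindex, noodp
```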
✅ Your custom robots.txt and robots header tags are now properly configured for better search engine optimization.
Conclusion: This sitemap helps your website in several ways: it makes your content easier for search engines to crawl, improves its chances of being indexed, and helps you get past the common indexing issues that keep pages out of search results. In short, it gives search engines a clear signal about which URLs on your site to crawl.