Apr 27, 2026 · 19 min read

XML Sitemap Generator: Create & Submit Sitemaps to Google

Generate a proper XML sitemap in minutes. Learn what to include, how to submit to Google Search Console, and fix common sitemap errors. Free tool included.

Tool Point Team

Editorial Team at Tool Point


An XML sitemap is like a roadmap that tells search engines which pages on your site matter most.

Without one, Google can still find your pages by following links--but it might miss deep content, new pages, or important sections buried in your site structure. With a well-structured sitemap, you're giving crawlers a clear list of URLs you want discovered and indexed.

Creating an XML sitemap doesn't have to be complicated. You just need to know what to include (and what to leave out), how to format it correctly, and how to submit it to Google Search Console.

In this guide, you'll learn exactly what XML sitemaps do, which URLs belong in them, and how to use ToolPoint's free XML Sitemap Generator to create a clean, search-engine-friendly sitemap in minutes.

Critical notes:

  • Submitting a sitemap is a hint, not a guarantee that Google will crawl or index every URL.
  • Google ignores <priority> and <changefreq> tags. Use <lastmod> only when it's accurate.

What is an XML sitemap (plain English)?

An XML sitemap is a structured list of important URLs on your website that you want search engines to discover and potentially index. It's written in XML (Extensible Markup Language), which search engines can easily read and parse.

Think of it as a map you give to Google, Bing, and other search engines saying: "Here are my most important pages--please crawl these."
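For reference, here is what a minimal, valid sitemap looks like (example.com and the date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2026-04-27</lastmod>
  </url>
</urlset>
```

Each page you want crawled gets its own <url> entry inside the <urlset> element; <loc> is the only required child tag.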

When it helps most

XML sitemaps are especially valuable for:

New sites: When you have few external links pointing to your pages, a sitemap helps search engines find content faster

Large sites: With thousands of pages, some deep content might not get crawled regularly without a sitemap

Deep pages: Content buried 5+ clicks from your homepage benefits from being explicitly listed

Frequently updated content: News sites, blogs, or stores with daily updates can signal freshness with lastmod dates

Poorly linked sites: If your internal linking structure is weak, a sitemap helps compensate

When it matters less

Small sites (under 100 pages) with good internal linking and regular external links may not see dramatic benefits from sitemaps--but they're still worth adding as a best practice.

What an XML sitemap does NOT do

It's important to understand the limitations of sitemaps. They're helpful discovery tools, but they're not magic ranking boosters.

It doesn't guarantee rankings

A sitemap has zero direct impact on where your pages rank. Rankings depend on content quality, backlinks, user experience, technical SEO, and hundreds of other factors--not whether you have a sitemap.

It doesn't guarantee indexing

Submitting a URL in your sitemap is a hint, not a command. Google may choose not to index a page for many reasons:

  • The page is low quality or thin content
  • The page duplicates content elsewhere
  • The page violates Google's guidelines
  • Google's crawl budget prioritizes other pages
  • The page has a noindex tag

It can't "force" Google to include low-quality pages

If you add 10,000 low-value parameter URLs to your sitemap hoping Google will index them all, you'll be disappointed. Google evaluates each URL independently and only indexes pages it deems valuable to searchers.

Use sitemaps to surface your best content, not to inflate your indexed page count with thin pages.

Sitemap rules that matter (Google + protocol basics)

To create a valid, effective sitemap, follow these technical requirements:

Use absolute, fully-qualified URLs

Every URL in your sitemap must include the full protocol and domain:

Correct: https://toolpoint.site/tools/seo/xml-sitemap-generator

Wrong: /tools/seo/xml-sitemap-generator

Include only canonical URLs you want in search results

Don't include redirect chains, duplicate variations, or URLs with tracking parameters. Use ToolPoint's Canonical URL Generator to identify your canonical versions, then list only those.

Must be UTF-8 encoded

Your sitemap file should use UTF-8 character encoding to support international characters properly.

Keep within limits

Each sitemap file can contain a maximum of:

  • 50,000 URLs, or
  • 50MB uncompressed (whichever limit is reached first)

If you exceed these limits, split into multiple sitemaps and use a sitemap index file.

Split into multiple sitemaps when needed

For large sites, organize sitemaps by content type:

  • sitemap-blog.xml
  • sitemap-tools.xml
  • sitemap-categories.xml

Then reference all of them in a master sitemap-index.xml file.

Use a sitemap index file for large sites

A sitemap index file lists all your individual sitemaps:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-blog.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-tools.xml</loc>
  </sitemap>
</sitemapindex>

Place sitemap at site root when possible

Store your sitemap at https://yoursite.com/sitemap.xml so it covers your entire domain. Sitemaps in subdirectories (like /blog/sitemap.xml) can only reference URLs within that subdirectory.

Also reference your sitemap in your robots.txt file using ToolPoint's Robots.txt Generator.

XML sitemap elements

XML sitemaps contain several elements, but not all of them matter. Here's what Google actually uses:

Table 1: XML Sitemap Elements (What Google Uses vs Ignores)

| Element | What it means | Use it? | Notes |
|---|---|---|---|
| <loc> | The URL of the page | Required | Must be absolute (full URL with https://) |
| <lastmod> | Last modification date | Optional (use only if accurate) | Format: YYYY-MM-DD or full ISO 8601. Only update when content actually changes, not template/footer edits |
| <changefreq> | How often the page changes | Google ignores this | Values like "daily" or "weekly" are not used by Google for crawling decisions |
| <priority> | Relative priority (0.0-1.0) | Google ignores this | Doesn't influence crawling or ranking. Google determines priority independently |

Bottom line: Use <loc> (required) and <lastmod> if you track real content updates accurately. Skip <changefreq> and <priority>--they don't help and just bloat your file.
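For example, both accepted <lastmod> formats look like this in practice (URLs and dates are illustrative):

```xml
<url>
  <loc>https://example.com/blog/post</loc>
  <lastmod>2026-04-27</lastmod> <!-- date only: YYYY-MM-DD -->
</url>
<url>
  <loc>https://example.com/tools/calculator</loc>
  <lastmod>2026-04-27T10:00:00+00:00</lastmod> <!-- full ISO 8601 with time and timezone -->
</url>
```

If you can't reliably produce either, omit the tag rather than emit a stale or auto-bumped date.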

What to include (and what to exclude)

Knowing what belongs in your sitemap--and what doesn't--is critical. A clean sitemap contains only high-value, canonical URLs you want indexed.

Table 2: Include vs Exclude URLs (with Examples)

| Include | Why | Exclude | Why |
|---|---|---|---|
| Homepage | Your main entry point | 404 pages | Broken URLs waste crawl budget |
| Category hub pages | Important navigation layers | Redirected URLs (3xx) | Google should crawl the final destination, not the redirect |
| Tool pages | High-value utility pages users search for | Duplicate URL variants | Only include the canonical version (e.g., one of: /page, /page/, /page?ref=123) |
| High-value blog posts | Content you want to rank | Parameter/tracking URLs | URLs with ?utm_source=, ?sort=, etc. are usually duplicates |
| Product pages | Core offering pages | noindex pages | If you've noindexed a page, don't include it in the sitemap |
| Landing pages | Conversion-focused pages | Thin/low-value pages | Tag clouds, single-sentence pages, auto-generated pages with no unique content |
| Author/about pages (if they rank) | Helpful for brand queries | Internal search results | /search?q=keyword creates infinite crawl paths |
| Resource/download pages | If they're valuable standalone content | Paginated duplicates | If using rel=next/prev or canonical consolidation, only include the main paginated URL |

General rule: If you wouldn't want the URL to rank in search results, don't put it in your sitemap.
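Applied to a sitemap file, that rule looks like this (URLs are illustrative):

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Include: the canonical version only -->
  <url>
    <loc>https://example.com/page</loc>
  </url>
  <!-- Exclude (do not list): duplicate variants of the same page -->
  <!-- https://example.com/page/                        trailing-slash duplicate  -->
  <!-- https://example.com/page?utm_source=newsletter   tracking-parameter URL    -->
</urlset>
```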

How to use ToolPoint's XML Sitemap Generator (step-by-step)

ToolPoint's XML Sitemap Generator creates properly formatted XML sitemaps in seconds. Here's how to use it:

Step 1: Open the XML Sitemap Generator

Go to https://toolpoint.site/tools/seo/xml-sitemap-generator

Step 2: Enter your website domain

Add your full domain with protocol (e.g., https://yoursite.com)

Step 3: Add important URL paths

List the URLs you want included. You can add:

  • Relative paths (e.g., /about, /tools/calculator)
  • Full URLs (the tool will validate format)

Focus on high-value pages only--don't add every single URL if you have thousands.

Step 4: Add last modified date only for real content updates

If you track when content actually changes (not template updates), add lastmod dates in YYYY-MM-DD format. If unsure, leave it blank--Google can detect freshness other ways.

Step 5: (Optional) Set changefreq/priority for your own organization

Note: Google ignores these, but you can use them for internal tracking or documentation purposes.

Step 6: Generate sitemap XML

Click generate. The tool creates a clean, valid XML file following the sitemap protocol.

Step 7: Download and name it sitemap.xml

Save the file as sitemap.xml (or a descriptive name like sitemap-blog.xml if you're creating multiple).

Step 8: Upload to your site root

Upload the file to https://yoursite.com/sitemap.xml via FTP, file manager, or your CMS.

Step 9: Reference it in robots.txt (optional but helpful)

Add this line to your robots.txt file using ToolPoint's Robots.txt Generator:

Sitemap: https://yoursite.com/sitemap.xml

Step 10: Submit in Google Search Console and monitor status

Log into Search Console, go to Sitemaps, submit your sitemap URL, and watch for errors or warnings.

Pro tips for clean, effective sitemaps

  1. Keep the sitemap "clean": Only include URLs you actively want indexed. Quality over quantity.
  2. Use canonicals to consolidate duplicates: Before creating your sitemap, use Canonical URL Generator to set canonical tags on duplicate pages. Then list only canonical URLs in your sitemap.
  3. Use accurate lastmod dates: Don't auto-update lastmod for tiny template or footer changes. Reserve it for real content updates--Google trusts accurate signals more.
  4. If you publish often, update your sitemap when you publish: For blogs or news sites, regenerate your sitemap after publishing new content so search engines discover it faster.
  5. Keep URL formatting consistent: Decide on https vs http, www vs non-www, trailing slash vs no trailing slash--then stick to it. Mixed formats create duplicate signals.
  6. If you change URLs, keep redirects and update sitemap to the final URL: Don't list redirect chains. Update sitemap to point to the final destination after implementing 301 redirects.
  7. Don't block sitemap or key pages in robots.txt: Verify your sitemap file and all listed URLs are accessible. Use Robots.txt Generator to check blocking rules.
  8. Compress large sitemaps (gz) if needed: For sitemaps approaching 50MB, compress them to .xml.gz. Search engines support gzipped sitemaps.
  9. Split by content type: Organize sitemaps logically: sitemap-tools.xml, sitemap-blog.xml, sitemap-categories.xml. Makes troubleshooting easier.
  10. Use a sitemap index for big sites: If you have multiple sitemaps, create a master sitemap-index.xml that lists all of them.
  11. Watch for "submitted URL seems to be a Soft 404": This means Google thinks the page is thin or error-like. Improve content quality or remove from sitemap.
  12. If pages are slow, fix speed: Slow pages get crawled less frequently. Run a Page Speed Test and optimize performance before expecting regular crawling.

Small site vs large site strategy

Your sitemap approach should scale with your site size:

Table 4: Small Site vs Large Site Sitemap Strategy

| Site size | Sitemap approach | File naming | Best practice |
|---|---|---|---|
| Small sites (<500 pages) | Single sitemap with all important URLs | sitemap.xml | Include homepage, key pages, blog posts. Update monthly or after major changes. |
| Growing sites (500-5,000 pages) | Separate sitemaps by content type | sitemap-blog.xml, sitemap-tools.xml, sitemap-index.xml | Split blog from static pages. Update blog sitemap weekly or after publishing. |
| Large sites (5,000+ pages) | Sitemap index + multiple categorized sitemaps | sitemap-index.xml listing 10+ individual sitemaps | Split by category, date, or content type. Consider dated sitemaps (e.g., sitemap-2025-01.xml) for archives. |
| E-commerce sites | Product sitemaps + category sitemaps + blog sitemaps | sitemap-products.xml, sitemap-categories.xml, sitemap-blog.xml | Update product sitemap when inventory changes. Use accurate lastmod for price/availability updates. |

Pro tip: For large sites, prioritize your most important pages in the first sitemap. Don't hide your best content in sitemap-archive-2018.xml.

Submit + verify: the safe checklist

Before and after submitting your sitemap, run through this checklist to catch problems early:

Sitemap.xml returns 200 status

  • Open https://yoursite.com/sitemap.xml in a browser
  • Should load as XML (not a 404 or redirect)

Contains absolute canonical URLs

No 404s/redirects/noindex URLs inside

  • Test a sample of URLs manually
  • Remove any that return errors or redirect

Submitted in Search Console

  • Go to Google Search Console > Sitemaps
  • Enter your sitemap URL and click Submit

Errors monitored and fixed

  • Check back weekly for the first month
  • Address any errors or warnings immediately

Refreshed after new content

  • Update sitemap when you publish important pages
  • Resubmit in Search Console if you make major changes (optional--Google recrawls automatically)

Fixing sitemap errors in Search Console

Google Search Console shows sitemap status and errors. Here's how to interpret and fix common problems:

Table 3: Sitemap Errors in Search Console (and Fixes)

| Error / warning | What it usually means | Fix |
|---|---|---|
| Couldn't fetch | Sitemap URL returns an error or times out | Verify sitemap loads at the submitted URL; check server response time; ensure it's not blocked by robots.txt |
| Invalid date in lastmod | Date format is wrong | Use YYYY-MM-DD or full ISO 8601 format (e.g., 2025-01-13T10:00:00+00:00) |
| Submitted URL not found (404) | URL in sitemap doesn't exist | Remove 404 URLs from sitemap; implement 301 redirects if URLs moved |
| Redirect error | URL redirects to another location | Update sitemap to list the final destination URL, not the redirect |
| Blocked by robots.txt | URL is disallowed in robots.txt | Remove the blocking rule from robots.txt using Robots.txt Generator, or remove the URL from the sitemap |
| Soft 404 | Google thinks the page is thin/error-like | Improve content quality, add unique value, or remove from sitemap |
| Duplicate without canonical | Multiple URLs with same content, no canonical set | Add canonical tags using Canonical URL Generator; list only canonical URLs in sitemap |
| URL not allowed (wrong host/protocol) | URL uses a different domain or protocol than the submitted sitemap | Ensure all URLs match your main domain (e.g., all https://yoursite.com, not http:// or www2.yoursite.com) |
| Too large (size/URL count) | Sitemap exceeds 50MB or 50,000 URLs | Split into multiple sitemaps; create a sitemap index file |
| Server error (5xx) | Server returned 500, 502, 503, etc. | Check server logs; fix server configuration; ensure sitemap loads reliably |

Action plan: Check Search Console weekly for the first month after submitting a new sitemap. Fix errors promptly--they often point to deeper site issues that can hold back crawling and indexing.

Workflow A: Launch a new site the SEO-safe way

Goal: Set up a new website with proper crawling and indexing infrastructure from day one.

Checklist:

  1. Finalize your site structure and main pages
  2. Set canonical URLs for all pages using Canonical URL Generator
  3. Open XML Sitemap Generator
  4. Add your homepage, key category pages, and initial content
  5. Generate sitemap.xml
  6. Upload to your site root (https://yoursite.com/sitemap.xml)
  7. Open Robots.txt Generator
  8. Create robots.txt with Sitemap: directive pointing to your sitemap
  9. Upload robots.txt to site root
  10. Verify both files load in browser (200 status)
  11. Set up Google Search Console and verify site ownership
  12. Submit sitemap in Search Console > Sitemaps
  13. Generate meta tags with Meta Tag Generator
  14. Create social sharing tags with OG Meta Generator
  15. Monitor indexing status weekly

Tools used: XML Sitemap Generator, Canonical URL Generator, Robots.txt Generator, Meta Tag Generator, OG Meta Generator

Workflow B: Publish blogs faster (discovery workflow)

Goal: Ensure new blog posts get discovered and indexed quickly.

Checklist:

  1. Write and publish your blog post
  2. Optimize title tag using Google SERP Simulator
  3. Generate meta tags with Meta Tag Generator
  4. Add post URL to your sitemap using XML Sitemap Generator
  5. Set accurate lastmod date (publication date)
  6. Regenerate sitemap.xml
  7. Upload updated sitemap to your site
  8. (Optional) Request URL inspection + indexing in Search Console for immediate discovery
  9. Share on social media (Google may discover via social signals)
  10. Check Search Console coverage report after 3-7 days

Tools used: XML Sitemap Generator, Google SERP Simulator, Meta Tag Generator

Workflow C: Clean up duplicates and crawl waste

Goal: Reduce crawl waste from duplicate URLs and improve site efficiency.

Checklist:

  1. Audit your site for duplicate URL patterns (check Search Console > Coverage)
  2. Identify canonical versions using Canonical URL Generator
  3. Add canonical tags to all duplicate pages
  4. Open Robots.txt Generator
  5. Block low-value parameter URLs (e.g., ?sort=, ?ref=) in robots.txt
  6. Save and upload updated robots.txt
  7. Open XML Sitemap Generator
  8. Remove all duplicate URLs--keep only canonical versions
  9. Regenerate sitemap with clean URL list
  10. Upload updated sitemap.xml
  11. Run Page Speed Test on important pages
  12. Optimize slow pages to improve crawl efficiency
  13. Monitor Search Console crawl stats for improvements
  14. Check "Duplicate without canonical" errors drop to zero

Tools used: XML Sitemap Generator, Canonical URL Generator, Robots.txt Generator, Page Speed Test

FAQ

Do XML sitemaps help SEO?

Indirectly, yes. Sitemaps don't improve rankings directly, but they help search engines discover your content faster--especially new pages, deep content, or sites with weak internal linking. Faster discovery can lead to faster indexing, which means your content becomes eligible to rank sooner. However, the quality of your content determines whether it actually ranks, not whether it's in a sitemap.

Does submitting a sitemap guarantee indexing?

No. Submitting a sitemap is a hint to search engines, not a command. Google may choose not to index URLs for many reasons: low quality, duplicate content, thin content, crawl budget limitations, or policy violations. A sitemap helps with discovery--indexing depends on whether Google finds the page valuable.

When should I use <lastmod>?

Use <lastmod> only when you have accurate data about when content actually changed. Don't auto-update it for minor template tweaks, footer changes, or sidebar updates--only for real content updates. Format should be YYYY-MM-DD or full ISO 8601 (e.g., 2025-01-13). If you can't track this accurately, leave <lastmod> out entirely.

Do <changefreq> and <priority> matter?

No. Google officially ignores both <changefreq> and <priority>. These tags were part of the original sitemap protocol, but Google's algorithms determine crawl frequency and priority independently based on signals like content freshness, user engagement, and site authority. You can skip these tags entirely to keep your sitemap cleaner.

How many URLs can one sitemap contain?

Each sitemap file can contain a maximum of 50,000 URLs or 50MB uncompressed (whichever limit is reached first). If you exceed these limits, split your content into multiple sitemaps and use a sitemap index file to list them all.

Should I include paginated pages in my sitemap?

It depends on your pagination strategy:

  • If using rel="next" and rel="prev" tags, you can include all paginated pages (note that Google no longer uses these tags as an indexing signal, so each page is evaluated on its own)
  • If using canonical tags to consolidate pagination (e.g., all point to page 1), only include the canonical page
  • For "Load More" infinite scroll, include the main URL only

Most sites benefit from including key paginated pages (like category page 1, 2, 3) but not deep pagination (page 47).
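For a category with shallow pagination, the entries might look like this (the URL pattern is illustrative--adjust to your site's structure):

```xml
<url><loc>https://example.com/category/</loc></url>
<url><loc>https://example.com/category/page/2/</loc></url>
<url><loc>https://example.com/category/page/3/</loc></url>
<!-- deep pagination like /category/page/47/ is usually left out -->
```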

Where should I upload my sitemap?

Upload to your site's root directory so it's accessible at https://yoursite.com/sitemap.xml. This allows the sitemap to reference any URL on your domain. Sitemaps in subdirectories (like /blog/sitemap.xml) can only reference URLs within that subdirectory. Also add the sitemap URL to your robots.txt file for easy discovery.

How often should I update my sitemap?

Active sites (blogs, news): Update after publishing new content or making significant changes. Daily or weekly updates are common.

Static sites: Update when you add new pages, remove old pages, or restructure navigation. Monthly or quarterly updates are usually sufficient.

E-commerce sites: Update when you add/remove products or change significant inventory. Some sites auto-generate sitemaps nightly.

There's no penalty for updating frequently--just ensure the updates are meaningful.

Can I use multiple sitemaps?

Yes, and it's often recommended for large sites. You can organize sitemaps by:

  • Content type (blog, products, categories)
  • Date (useful for archives)
  • Language/region (for international sites)

Create a sitemap index file (sitemap-index.xml) that lists all your individual sitemaps, then submit only the index file to Search Console.

Why isn't Google indexing all the URLs in my sitemap?

Common reasons:

Low quality content: Google doesn't see value in indexing the page

Duplicate content: Page is too similar to existing indexed pages

Crawl budget: Google is prioritizing other URLs on your site

Technical issues: Page has noindex tags, robots.txt blocks, or returns errors

Manual actions: Site has a penalty or quality issue

Check Search Console's Coverage report for specific reasons. Improve content quality, fix technical issues, and be patient--indexing can take weeks for new sites.

Conclusion

XML sitemaps are one of the simplest yet most effective ways to help search engines discover your content. They won't boost your rankings directly, but they ensure your important pages get crawled and considered for indexing--especially on new sites, large sites, or sites with complex structures.

The key is keeping your sitemap clean: only canonical URLs you want indexed, accurate lastmod dates (or none at all), and no 404s, redirects, or noindex pages wasting space.

Use ToolPoint's XML Sitemap Generator to create a properly formatted sitemap in minutes, then submit it to Google Search Console and monitor for errors. Pair it with a clean robots.txt file from our Robots.txt Generator, and you've covered the fundamentals of helping search engines understand your site.

Ready to create your XML sitemap?

Your search-engine-friendly sitemap is just a few clicks away.


The Tool Point team publishes practical, no-fluff tutorials that help you get more done with free online tools. We focus on clarity, speed, and useful takeaways you can apply right away.
