XML Sitemap Generator
Generate an XML sitemap to help search engines discover and index your website's pages.
XML Sitemap Generator (Create sitemap.xml)
Create XML sitemaps for your website to help search engines discover and crawl your pages efficiently. This free online sitemap generator lets you manually add URLs with optional metadata like last modified date, change frequency, and priority to create a valid sitemap.xml file.
Build your sitemap by entering your domain and adding URL paths with their attributes, then download or copy the properly formatted XML for submission to search engines.
What This XML Sitemap Generator Does
An XML sitemap is a structured file that lists important URLs on your website along with optional metadata about each page. Search engines like Google and Bing use sitemaps to discover pages more efficiently, understand your site structure, and learn when content was last updated.
This generator creates a sitemap.xml file by combining your website domain with individual URL paths you specify. You can add optional attributes for each URL including last modified date (when the page was last significantly updated), change frequency (how often the page typically changes), and priority (the relative importance of this URL compared to others on your site).
The tool outputs properly formatted XML following the Sitemaps protocol standard defined at sitemaps.org, ensuring compatibility with all major search engines. The generated sitemap includes absolute URLs (full URLs with protocol and domain) as recommended by Google, proper UTF-8 encoding, and correct entity escaping for special characters.
Unlike automated sitemap generators that crawl your entire site, this manual approach gives you precise control over which URLs appear in your sitemap and their associated metadata. This is useful for small to medium sites, new websites still building content, or situations where you need specific control over sitemap contents.
How to Use the XML Sitemap Generator
Creating your sitemap takes just minutes with this straightforward manual process.
Enter your website domain in the domain field. Include the protocol (https:// or http://) and specify whether you use www or not. For example: https://www.example.com or https://example.com. This becomes the base for all URLs in your sitemap.
Add URL paths for pages you want in your sitemap. Enter the path portion of each URL (the part after your domain). For example, if your full URL is https://example.com/products/shoes, enter /products/shoes as the path. Add one path per row in the table interface.
Set optional metadata for each URL if desired:
- Last Modified: The date this page was last significantly updated. Use YYYY-MM-DD format (e.g., 2025-01-27) or include time in W3C Datetime format for better precision.
- Change Frequency: How often this page typically changes (always, hourly, daily, weekly, monthly, yearly, never).
- Priority: Relative importance from 0.0 to 1.0, where 1.0 is highest priority. Note that Google ignores this field.
Generate your sitemap by clicking the generate button. The tool creates properly formatted XML with absolute URLs constructed from your domain plus each path.
Download or copy the sitemap.xml file and upload it to your website's root directory (https://example.com/sitemap.xml), or submit it through Google Search Console, which works regardless of where the file is hosted. The sitemap is now ready for search engine submission.
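The workflow above can be sketched in a few lines of Python. This is an illustrative approximation of what the generator does, not its actual internals; the function name and entry format are ours:

```python
# Sketch: combine a domain with URL paths and optional metadata,
# then emit Sitemaps-protocol XML with proper entity escaping.
from xml.sax.saxutils import escape

def build_sitemap(domain, entries):
    """entries: dicts with 'path' plus optional 'lastmod', 'changefreq', 'priority'."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for entry in entries:
        lines.append("  <url>")
        # Absolute URL: domain + path, with XML special characters escaped
        lines.append(f"    <loc>{escape(domain.rstrip('/') + entry['path'])}</loc>")
        for tag in ("lastmod", "changefreq", "priority"):
            if tag in entry:
                lines.append(f"    <{tag}>{escape(str(entry[tag]))}</{tag}>")
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

sitemap_xml = build_sitemap("https://example.com", [
    {"path": "/", "lastmod": "2025-01-27", "priority": "1.0"},
    {"path": "/products/shoes", "changefreq": "weekly"},
])
print(sitemap_xml)
```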
XML Sitemap Format Basics
Understanding sitemap XML structure helps you verify correctness and troubleshoot issues.
Required XML elements that every sitemap must include:
The <urlset> tag opens and closes the entire sitemap, declaring the XML namespace:
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<!-- URLs go here -->
</urlset>

Each URL entry uses a <url> wrapper containing at minimum the <loc> tag:

<url>
<loc>https://www.example.com/page</loc>
</url>

The <loc> (location) tag specifies the absolute URL of the page. This is the only truly required element within each URL entry. URLs must be properly encoded and escaped - special characters like ampersands (&) must be written as &amp;, less-than signs (<) as &lt;, and so on.
Optional XML elements provide additional metadata:
<lastmod> indicates when the page was last modified using W3C Datetime format (YYYY-MM-DD or full ISO 8601 with time):
<lastmod>2025-01-27</lastmod>
<!-- or with time -->
<lastmod>2025-01-27T14:30:00+00:00</lastmod>

<changefreq> suggests how frequently the page changes. Valid values are: always, hourly, daily, weekly, monthly, yearly, never. This is a hint about typical update patterns, not a crawl directive:

<changefreq>weekly</changefreq>

<priority> indicates the relative importance of this URL compared to other URLs on your site, ranging from 0.0 (lowest) to 1.0 (highest). The default is 0.5:

<priority>0.8</priority>

Encoding requirements: XML sitemaps must use UTF-8 encoding. Special characters must be entity-escaped: & becomes &amp;, < becomes &lt;, > becomes &gt;, " becomes &quot;, and ' becomes &apos;. Failing to escape these characters causes XML parsing errors that prevent search engines from reading your sitemap.
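In Python, the standard library already implements this escaping. A minimal sketch (the wrapper function name is ours); note that `xml.sax.saxutils.escape` handles &, <, and > by default, while quote characters need the extra entities mapping:

```python
# Escape the five XML special characters before embedding a URL in <loc>.
from xml.sax.saxutils import escape

def escape_for_sitemap(url):
    # escape() replaces & < > by default; add the two quote entities
    return escape(url, {'"': "&quot;", "'": "&apos;"})

escaped = escape_for_sitemap("https://example.com/search?q=a&b=<c>")
print(escaped)  # https://example.com/search?q=a&amp;b=&lt;c&gt;
```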
Complete example of a minimal sitemap:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>https://www.example.com/</loc>
<lastmod>2025-01-27</lastmod>
<changefreq>weekly</changefreq>
<priority>1.0</priority>
</url>
<url>
<loc>https://www.example.com/about</loc>
<lastmod>2025-01-15</lastmod>
<changefreq>monthly</changefreq>
<priority>0.8</priority>
</url>
</urlset>

Best Practices for XML Sitemaps
Following sitemap best practices ensures search engines process your sitemap effectively and index the right pages.
Include only canonical URLs you want in search results. Your sitemap should list the preferred version of each page - the canonical URL you want appearing in search results. Avoid including duplicate URLs, parameter variations, sorted/filtered versions, or any non-canonical URLs. If a page has a canonical tag pointing to another URL, don't list the non-canonical version in your sitemap. This prevents confusion and focuses crawling on your preferred URLs.
List only indexable pages. Don't include pages with noindex meta tags, pages blocked by robots.txt, pages behind login walls, or pages you don't want in search results. Sitemaps tell search engines "these are important pages worth crawling and indexing," so only list pages that meet that criteria.
Use accurate lastmod dates reflecting real updates. The lastmod date should change only when page content meaningfully changes, not every time the sitemap is generated. If you set lastmod to "today" for all pages every time you generate a sitemap, search engines learn to ignore these dates as unreliable. Accurate lastmod helps search engines prioritize crawling pages that actually changed.
Be honest about changefreq and priority despite Google ignoring them. While Google has stated it ignores the priority and changefreq fields, other search engines like Bing may still reference them. Set realistic values: don't mark everything as priority 1.0 or changefreq "always" as this defeats the purpose of signaling relative importance. Use priority to differentiate your most important pages from less critical ones, and set changefreq to realistic update patterns.
Use absolute URLs with the full protocol and domain. Each <loc> tag should contain the complete URL including https:// and the full domain. Relative URLs like /page are not allowed in sitemaps - they must be absolute like https://www.example.com/page. This ensures search engines interpret URLs correctly regardless of where the sitemap is hosted.
Place sitemap.xml at your site root when possible. The traditional location is https://example.com/sitemap.xml at the root directory. While sitemaps can be placed anywhere if submitted via Search Console, the default discovery method follows a "descendants rule" - a sitemap in a subdirectory can only list URLs within that directory and its subdirectories. A sitemap at the root can list any URL on the domain.
Keep URLs consistent with your preferred protocol and subdomain. If your canonical URLs use https:// and www, all sitemap URLs should use https:// and www. Mixing http and https or www and non-www in your sitemap creates confusion and signals conflicting preferences to search engines. Match your sitemap URLs to your canonical URL structure exactly.
Update your sitemap when content changes significantly. When you publish new pages, remove old pages, or make major content updates, regenerate and resubmit your sitemap. While search engines periodically check sitemaps automatically, resubmitting through Search Console after significant changes ensures faster discovery.
Sitemap Limits and Large Sites
XML sitemaps have technical constraints that require special handling for large websites.
The 50,000 URL limit per sitemap file is defined by the Sitemaps protocol. A single sitemap.xml file cannot contain more than 50,000 URL entries. Additionally, uncompressed sitemaps cannot exceed 50MB in file size. These limits ensure sitemaps remain manageable and parse efficiently.
Gzip compression is allowed and recommended for large sitemaps. Compressing your sitemap.xml file as sitemap.xml.gz can dramatically reduce file size while staying within the 50MB limit. Search engines automatically detect and decompress gzipped sitemaps. The 50,000 URL limit still applies regardless of compression.
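A quick sketch of gzip compression with Python's standard library, assuming you already have the sitemap XML as a string; search engines detect and decompress the resulting .gz file automatically:

```python
# Compress a sitemap string to the bytes you would write to sitemap.xml.gz.
import gzip

sitemap_xml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
               '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
               '</urlset>')
compressed = gzip.compress(sitemap_xml.encode("utf-8"))
# To write the file for upload:
# with open("sitemap.xml.gz", "wb") as f:
#     f.write(compressed)
print(len(sitemap_xml.encode("utf-8")), "->", len(compressed), "bytes")
```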
Multiple sitemaps solve the limitation for websites with more than 50,000 pages. Split your URLs across multiple sitemap files: sitemap1.xml, sitemap2.xml, sitemap3.xml, etc. Each file must individually respect the 50,000 URL and 50MB limits.
Sitemap index files organize multiple sitemaps by listing all your individual sitemap files in a single master file. A sitemap index follows a similar XML structure but uses different tags:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<sitemap>
<loc>https://www.example.com/sitemap1.xml</loc>
<lastmod>2025-01-27</lastmod>
</sitemap>
<sitemap>
<loc>https://www.example.com/sitemap2.xml</loc>
<lastmod>2025-01-27</lastmod>
</sitemap>
<sitemap>
<loc>https://www.example.com/sitemap3.xml</loc>
<lastmod>2025-01-27</lastmod>
</sitemap>
</sitemapindex>

Submit the sitemap index file to Search Console instead of individual sitemaps - the index file references all component sitemaps automatically. The sitemap index file itself can contain up to 50,000 sitemap references.
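Generating an index like the one above is mechanical; here is a minimal sketch (the function name is ours, not part of any library):

```python
# Build a sitemap index referencing multiple child sitemap files.
from xml.sax.saxutils import escape

def build_sitemap_index(sitemap_urls, lastmod=None):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in sitemap_urls:
        lines.append("  <sitemap>")
        lines.append(f"    <loc>{escape(url)}</loc>")
        if lastmod:
            lines.append(f"    <lastmod>{lastmod}</lastmod>")
        lines.append("  </sitemap>")
    lines.append("</sitemapindex>")
    return "\n".join(lines)

index_xml = build_sitemap_index(
    [f"https://www.example.com/sitemap{i}.xml" for i in (1, 2, 3)],
    lastmod="2025-01-27",
)
print(index_xml)
```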
Organize sitemaps logically by content type or section. Many large sites create separate sitemaps for different content types: sitemap-posts.xml for blog posts, sitemap-products.xml for products, sitemap-pages.xml for static pages. This organization makes maintenance easier and helps you track which sections change most frequently.
Consider update frequency when splitting sitemaps. Separate frequently-updated content (blog posts, news) from static content (about pages, policies). This allows search engines to crawl dynamic sitemaps more often while checking static sitemaps less frequently, optimizing crawl budget allocation.
How to Submit Your Sitemap
Submitting sitemaps to search engines ensures they discover and process your URLs effectively.
Google Search Console submission is the primary method for Google:
- Log in to Google Search Console for your website
- Navigate to "Sitemaps" in the left sidebar under "Indexing"
- Enter your sitemap URL in the "Add a new sitemap" field (e.g., sitemap.xml if at root, or a full URL if elsewhere)
- Click "Submit"
- Monitor the status table showing submitted URLs, discovered URLs, and any errors
Google Search Console shows sitemap processing status, validation errors, and how many URLs were discovered. Check back periodically to ensure your sitemap remains processed successfully and address any errors that appear.
Bing Webmaster Tools submission follows a similar process:
- Log in to Bing Webmaster Tools for your website
- Navigate to "Sitemaps" under the "Configure My Site" section
- Enter your sitemap URL and click "Submit"
- Monitor sitemap status and any errors Bing reports
While Bing's market share is smaller than Google's, submitting your sitemap helps coverage across all major search engines and is worth the minimal effort.
robots.txt Sitemap directive provides automatic discovery:
Add a Sitemap line to your robots.txt file pointing to your sitemap location:
User-agent: *
Allow: /
Sitemap: https://www.example.com/sitemap.xml

Place the Sitemap directive anywhere in robots.txt (typically at the end). You can include multiple Sitemap lines if you have multiple sitemap files. Search engines check robots.txt regularly and automatically discover sitemaps referenced there.
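You can see this discovery mechanism in miniature with Python's standard library robots.txt parser, whose `site_maps()` method (available since Python 3.8) returns the Sitemap directives a crawler would find:

```python
# Parse a robots.txt body and extract its Sitemap directives.
from urllib.robotparser import RobotFileParser

robots_txt = """User-agent: *
Allow: /
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-posts.xml
"""
rp = RobotFileParser()
rp.parse(robots_txt.splitlines())
print(rp.site_maps())
```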
Sitemap ping URLs (less common but available) allow you to notify search engines of sitemap updates by accessing special URLs. Google previously supported www.google.com/ping?sitemap=URL but now recommends Search Console submission instead. Focus on Search Console and robots.txt methods for reliable submission.
Resubmit after significant changes to your website. When you add substantial new content, restructure your site, or make major updates, resubmit your sitemap through Search Console to prioritize crawling of changed URLs. The lastmod dates help search engines identify what changed since their last crawl.
Troubleshooting XML Sitemap Issues
Common sitemap problems have identifiable causes and clear solutions.
"Submitted but not processed" in Search Console
Causes: Invalid XML syntax, improper entity escaping, incorrect encoding declaration, or structural errors in the sitemap file.
Fixes: Validate your XML using online validators or tools like xmllint. Check that special characters (&, <, >, ", ') are properly entity-escaped as &amp;, &lt;, &gt;, &quot;, and &apos;. Verify the XML declaration specifies UTF-8 encoding: <?xml version="1.0" encoding="UTF-8"?>. Ensure all opening tags have matching closing tags and the namespace is declared correctly in the urlset tag.
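A quick well-formedness check is also possible with Python's standard library parser; this catches the same syntax errors as `xmllint --noout sitemap.xml` (it does not validate against the sitemap schema, only XML syntax). The helper name is ours:

```python
# Check whether an XML string parses at all; unescaped ampersands fail.
import xml.etree.ElementTree as ET

def is_well_formed(xml_text):
    try:
        ET.fromstring(xml_text)
        return True
    except ET.ParseError:
        return False

good = ('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        '<url><loc>https://example.com/?a=1&amp;b=2</loc></url></urlset>')
bad = "<urlset><url><loc>https://example.com/?a=1&b=2</loc></url></urlset>"  # raw &
print(is_well_formed(good), is_well_formed(bad))
```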
"Too many URLs" or "File too large" errors
Causes: Exceeding the 50,000 URL limit or 50MB uncompressed file size limit.
Fixes: Split your sitemap into multiple files, each containing fewer than 50,000 URLs and under 50MB. Create a sitemap index file that references all individual sitemaps. Consider compressing sitemaps with gzip to reduce file size. Organize sitemaps by content type or section for easier management.
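The splitting step is simple list chunking; a sketch under the 50,000-URL limit (the function name is ours):

```python
# Split a URL list into chunks that each fit in one sitemap file.
def chunk_urls(urls, max_per_sitemap=50_000):
    return [urls[i:i + max_per_sitemap]
            for i in range(0, len(urls), max_per_sitemap)]

urls = [f"https://example.com/page/{n}" for n in range(120_000)]
chunks = chunk_urls(urls)
print(len(chunks), [len(c) for c in chunks])  # 3 [50000, 50000, 20000]
```

Each chunk would then become sitemap1.xml, sitemap2.xml, and so on, referenced from a sitemap index file.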
Wrong URLs appearing (http vs https, www vs non-www variants)
Causes: Listing non-canonical URLs in the sitemap, inconsistent URL formats, or including both variants when only one should be canonical.
Fixes: Ensure every URL in your sitemap exactly matches the canonical URL you want in search results. If your site uses https://www.example.com as the canonical version, every sitemap URL should use that exact format. Review your canonical tag implementation and verify sitemap URLs match. Don't include both http and https versions or www and non-www variants of the same page.
"Google ignores my priority and changefreq values"
Causes: This is expected behavior - Google has publicly stated it ignores these fields.
Fixes: This isn't actually a problem. Google uses its own algorithms to determine crawl priorities and doesn't rely on the priority or changefreq hints in sitemaps. Other search engines may still reference these fields, so set them accurately for those engines, but don't expect them to influence Google's crawling behavior. Focus on providing accurate lastmod dates instead.
"Last modified dates seem ignored"
Causes: Inconsistent or inaccurate lastmod dates that train search engines to distrust these values, or lastmod dates set to future dates.
Fixes: Only update lastmod when content genuinely changes. Don't set lastmod to "today" for every page every time you generate a sitemap. Use accurate historical dates reflecting real updates. Never set lastmod to future dates - this signals unreliability. If lastmod has been inaccurate historically, it may take time for search engines to trust these signals again after you fix them.
Sitemap not being discovered automatically
Causes: Sitemap not in the expected location, no Sitemap directive in robots.txt, or discovery delays.
Fixes: Place sitemap at the root (https://example.com/sitemap.xml) for automatic discovery. Add a Sitemap line to robots.txt pointing to your sitemap location. Alternatively, submit directly through Google Search Console and Bing Webmaster Tools which works regardless of sitemap location. Don't rely solely on automatic discovery - proactive submission via Search Console is more reliable.
URLs from sitemap not being indexed
Causes: Sitemap submission doesn't guarantee indexing. Pages may have noindex tags, be blocked by robots.txt, have quality issues, duplicate other content, or simply not be prioritized for crawling yet.
Fixes: Verify pages are actually indexable (no noindex tags, not blocked by robots.txt, accessible to crawlers). Check Google Search Console URL Inspection for specific pages to see why they're not indexed. Ensure pages have sufficient quality and unique content. Review coverage reports in Search Console for detailed explanations. Remember that sitemap submission requests crawling, not guaranteed indexing.
Frequently Asked Questions
What is an XML sitemap and why do I need one?
An XML sitemap is a file listing important URLs on your website to help search engines discover and crawl pages more efficiently. You need one because it helps search engines find pages that might not be easily discoverable through normal crawling (new pages, deep pages, pages with few internal links), understand your site structure, and learn when content was updated. While not mandatory, sitemaps improve crawl efficiency and are considered a best practice for SEO.
How do I create a sitemap.xml file manually?
Use this generator by entering your domain, adding URL paths for pages you want included, optionally setting lastmod/changefreq/priority metadata, then generating and downloading the XML file. The tool creates properly formatted XML with absolute URLs, correct encoding, and entity escaping. Upload the resulting sitemap.xml to your website's root directory or submit it to Google Search Console.
How many URLs can a single sitemap have?
A single sitemap file can contain a maximum of 50,000 URLs and cannot exceed 50MB uncompressed. If your site has more than 50,000 pages or your sitemap exceeds 50MB, split it into multiple sitemap files and use a sitemap index file to organize them. You can also compress sitemaps with gzip to reduce file size while staying within limits.
What is a sitemap index file and when should I use it?
A sitemap index file is a master file that lists multiple individual sitemap files, using similar XML structure but with <sitemapindex> and <sitemap> tags instead of <urlset> and <url> tags. Use a sitemap index when your site exceeds 50,000 URLs and requires multiple sitemap files, or when you want to organize sitemaps by content type (posts, products, pages). Submit the index file to Search Console instead of individual sitemaps.
Does Google use the priority and changefreq fields?
No, Google has publicly stated it ignores both the <priority> and <changefreq> fields in sitemaps. Google uses its own algorithms to determine crawl priorities and update frequencies rather than relying on webmaster hints. However, other search engines like Bing may still reference these fields, so including accurate values is worthwhile for comprehensive search engine coverage.
What should I put in the lastmod (last modified) tag?
Put the actual date when the page content was last significantly updated, using YYYY-MM-DD format or W3C Datetime format with time (YYYY-MM-DDTHH:MM:SS+00:00). Only update lastmod when meaningful content changes - not every time you generate the sitemap. Inaccurate lastmod dates train search engines to ignore them, while accurate dates help search engines prioritize crawling recently updated pages.
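Both accepted formats are easy to produce with Python's standard library, shown here as a sketch with an assumed modification timestamp:

```python
# Format a modification timestamp as a W3C Datetime lastmod value.
from datetime import datetime, timezone

modified = datetime(2025, 1, 27, 14, 30, tzinfo=timezone.utc)  # assumed timestamp
date_only = modified.strftime("%Y-%m-%d")  # date-only form
with_time = modified.isoformat()           # full form with time and offset
print(date_only, with_time)
```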
Where should I upload sitemap.xml on my website?
The recommended location is your website's root directory: https://example.com/sitemap.xml. This allows the sitemap to include any URL on your domain. Sitemaps in subdirectories can only list URLs in that directory and below (the "descendants rule") unless submitted via Search Console. You can place sitemaps anywhere and submit them directly to Search Console, which bypasses the descendants rule.
How do I submit a sitemap to Google Search Console?
Log in to Google Search Console, select your property, navigate to "Sitemaps" under "Indexing" in the left sidebar, enter your sitemap URL in the "Add a new sitemap" field (just the filename if at root, or full URL otherwise), and click "Submit." Monitor the status table for processing confirmation and any errors that need fixing.
How do I submit a sitemap to Bing Webmaster Tools?
Log in to Bing Webmaster Tools, select your website, navigate to "Sitemaps" under "Configure My Site," enter your sitemap URL, and click "Submit." Check sitemap status and address any errors Bing reports. While Bing has smaller market share than Google, submission takes minimal effort and improves overall search coverage.
Why is my sitemap failing validation with invalid XML or escaping errors?
XML sitemaps require strict formatting: special characters like &, <, >, ", ' must be entity-escaped as &amp;, &lt;, &gt;, &quot;, &apos;. URLs with query parameters containing ampersands are common culprits - ?id=1&name=test must become ?id=1&amp;name=test. Use XML validators to identify specific errors. Ensure UTF-8 encoding is declared and all tags are properly closed.
Should I include non-canonical or parameter URLs in my sitemap?
No, only include canonical URLs - the preferred versions you want appearing in search results. Don't include parameter variations, duplicate URLs, sorted/filtered versions, or any URL with a canonical tag pointing to a different URL. Including non-canonical URLs confuses search engines and wastes crawl budget. Your sitemap should list only indexable canonical URLs.
How often should I update my sitemap?
Update your sitemap when you publish new pages, remove old pages, or make significant content changes. For sites with frequent updates (daily blog posts), consider regenerating and resubmitting your sitemap regularly. For static sites with infrequent changes, update only when actual changes occur. The key is keeping lastmod dates accurate - frequent regeneration with unchanged lastmod dates is fine, but changing lastmod without real updates trains search engines to ignore those signals.
Do XML sitemaps improve SEO rankings directly?
No, sitemaps don't directly boost rankings. They help search engines discover and crawl pages more efficiently, which can lead to faster indexing and better coverage in search results. However, sitemaps don't improve individual page rankings - quality content, technical optimization, and backlinks drive rankings. Think of sitemaps as facilitating discovery and crawling, not influencing ranking algorithms.