XML sitemap generator — turn a URL list into sitemaps.org XML for Search Console

This free XML sitemap generator builds a standards-compliant sitemaps.org urlset from plain text: paste absolute https URLs or, with a site origin, root-relative paths like /pricing. You can optionally set the same lastmod date, changefreq, and priority for every entry, then use the copy icon to grab ready-to-host sitemap XML. The upload icon loads a local .txt URL list—nothing is sent to a server. Pair the output with our robots.txt generator for a Sitemap: line, and browse SEO tools for schema, redirects, and metadata checks.

Import URL list

Valid lines: absolute http(s)://… or paths starting with / when a site origin is set. Duplicates are removed.

What is an XML sitemap and why do SEO teams use it?

An XML sitemap is a machine-readable inventory of important pages on your site. Search systems like Google Search Console use it to discover URLs, especially on large sites, new domains, or deep sections with few internal links. The file follows the public sitemaps.org protocol: a root urlset with child url elements, each containing a required loc (canonical URL) and optional lastmod, changefreq, and priority hints. Submitting a sitemap does not guarantee indexing, but it aligns your technical SEO signals with what you want crawlers to prioritize.
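As a sketch of that structure, a few lines of Python can assemble a minimal urlset. This is an illustrative helper, not this tool's actual code; the function name and signature are invented:

```python
from xml.sax.saxutils import escape

def build_urlset(urls, lastmod=None):
    """Build a minimal sitemaps.org urlset. `lastmod` (YYYY-MM-DD) is optional."""
    entries = []
    for url in urls:
        fields = [f"    <loc>{escape(url)}</loc>"]  # loc is the only required child
        if lastmod:
            fields.append(f"    <lastmod>{lastmod}</lastmod>")
        entries.append("  <url>\n" + "\n".join(fields) + "\n  </url>")
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>"
    )

xml = build_urlset(
    ["https://www.example.com/", "https://www.example.com/pricing"],
    lastmod="2024-05-01",
)
```

Note the xmlns attribute on urlset: Search Console validates against that namespace, so it must be present even in a hand-rolled file.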

How to use this XML sitemap generator (step by step)

  1. Decide whether you will paste full URLs only or also path-only lines. If you use paths, set the site origin field to your canonical scheme and host (for example https://www.example.com).
  2. Enter URLs in the text area—one per line—or use Upload .txt to load a file from your computer. Invalid lines are listed so you can fix typos or protocol mistakes.
  3. Toggle lastmod when you want a single date on every URL (useful after a bulk update). Choose changefreq and priority only if you have a consistent policy; omitting them is valid and common.
  4. Click Copy XML and save the result as sitemap.xml (or another name) on your HTTPS host. In Google Search Console, open Sitemaps and submit the public URL of the file. Add a robots.txt Sitemap: directive if your workflow relies on discovery.
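The line-parsing rules in steps 1 and 2 can be sketched like this. This is illustrative Python under the validation rules described above, not the tool's implementation:

```python
from urllib.parse import urljoin, urlparse

def parse_lines(text, origin=None):
    """Split pasted text into (valid_urls, invalid_lines), resolving /paths
    against `origin` when one is given. Duplicates are dropped, order kept."""
    valid, invalid, seen = [], [], set()
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue  # skip blank lines
        if line.startswith("/") and origin:
            url = urljoin(origin, line)  # root-relative path -> absolute URL
        elif urlparse(line).scheme in ("http", "https"):
            url = line  # already an absolute http(s) URL
        else:
            invalid.append(line)  # report so the user can fix typos
            continue
        if url not in seen:
            seen.add(url)
            valid.append(url)
    return valid, invalid

urls, bad = parse_lines(
    "https://example.com/a\n/pricing\nftp://x\n/pricing",
    origin="https://example.com",
)
```

Here the duplicate `/pricing` collapses to one entry and the `ftp://` line is flagged as invalid rather than silently dropped.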

Keywords and topics this sitemap tool supports

Content and growth teams often search for an XML sitemap generator, sitemap.xml generator, Google sitemap format, Search Console sitemap submit, urlset lastmod changefreq, and SEO sitemap for new site. This page explains the fields, limits, and how to combine a sitemap with robots.txt and on-page structured data. For rich results markup, use the schema markup generator; for social previews, see the Open Graph tag generator.

Limits, best practices, and migration audits

Each sitemap may contain up to 50,000 URLs and must stay under roughly 50 MB uncompressed; bigger sites use a sitemap index and multiple segment files. Prefer HTTPS locations, avoid session parameters in loc, and keep lastmod truthful when you use it. When you change URL structures, combine this workflow with the redirect type checker so old URLs resolve cleanly. For snippet tuning before launch, the meta title and description checker helps keep titles and descriptions within common display limits.
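Splitting at the 50,000-URL limit and listing the segments in a sitemapindex can be sketched as follows. File names and the helper functions are hypothetical; only the 50,000-per-file limit and the sitemapindex element come from the protocol:

```python
def plan_segments(urls, max_per_file=50_000):
    """Split a URL list into sitemap segments named sitemap-1.xml, sitemap-2.xml, ..."""
    chunks = [urls[i:i + max_per_file] for i in range(0, len(urls), max_per_file)]
    return {f"sitemap-{n}.xml": chunk for n, chunk in enumerate(chunks, start=1)}

def build_index(origin, filenames):
    """Build a sitemaps.org sitemapindex that lists each segment file."""
    entries = "\n".join(
        f"  <sitemap><loc>{origin}/{name}</loc></sitemap>" for name in filenames
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</sitemapindex>"
    )

plan = plan_segments([f"https://example.com/p/{i}" for i in range(120_000)])
index = build_index("https://example.com", plan)
```

You would then submit only the index URL to Search Console; Google fetches the listed segments from there.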

Related SEO and site utilities

Explore the full SEO tools section on the homepage, or open a focused utility below.

Frequently asked questions

What is an XML sitemap?
An XML sitemap is a file that lists important URLs on your site so search engines can discover and crawl them efficiently. It uses the sitemaps.org format (urlset with loc entries). It does not guarantee indexing, but it signals canonical URLs and optional metadata like last modified date.
How do I submit a sitemap to Google Search Console?
Host the XML file on your site (for example at https://yoursite.com/sitemap.xml), verify the property in Google Search Console, then open Sitemaps and enter the path (/sitemap.xml). Google will fetch it periodically. You can also reference the sitemap URL in robots.txt with a Sitemap: line using a robots.txt generator.
What is the maximum number of URLs in one sitemap file?
The sitemaps protocol allows up to 50,000 URLs per file and the uncompressed file must stay under 50 MB. Larger sites split into multiple sitemap files and use a sitemap index file that lists them. This tool focuses on a single urlset for typical small and medium lists.
Should I include lastmod, changefreq, and priority?
loc is required; lastmod, changefreq, and priority are optional. Google has indicated it uses lastmod when it is accurate and consistent. Use real update dates when you can. changefreq and priority are hints—many sites omit them or set them conservatively to avoid misleading crawlers.
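For reference, a single url entry with every optional field set looks like this (URL and values are illustrative):

```xml
<url>
  <loc>https://example.com/blog/post</loc>
  <lastmod>2024-05-01</lastmod>
  <changefreq>monthly</changefreq>
  <priority>0.5</priority>
</url>
```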
Can I use relative URLs in this generator?
Enter a site origin (scheme plus host, such as https://example.com) and then list paths starting with /. The tool resolves each path to an absolute URL. You can also paste only full http or https URLs and leave the origin empty.
Does this tool upload my URLs to your servers?
No. Parsing and XML generation run entirely in your browser. Upload only reads a local text file you choose to fill the URL list—nothing is sent to us. Avoid pasting private or authenticated URLs in shared environments.
How does this relate to robots.txt?
robots.txt tells crawlers what they may fetch; a sitemap lists URLs you care about for discovery. Best practice is to allow crawling of important pages in robots.txt and to point to your sitemap with a Sitemap: directive. Use our robots.txt generator to build that file and keep the rules aligned.
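As a sketch, a minimal robots.txt that allows crawling and advertises the sitemap might look like this (the host is a placeholder):

```text
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

The Sitemap: line takes an absolute URL and may appear anywhere in the file, outside any User-agent group.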
What about hreflang and multilingual sites?
A standard XML sitemap simply lists one loc per localized URL; hreflang relationships can be declared in page markup or, alternatively, via xhtml:link annotations inside the sitemap itself. Many teams maintain a separate sitemap entry per language-region URL and use the hreflang tag generator to keep the language-region pairs consistent.