robots.txt & sitemap.xml Generator
Generate basic `robots.txt` and `sitemap.xml` files to guide search engines and improve your website's SEO. This tool automatically includes all pages registered in the application.
A `robots.txt` file tells search engine crawlers which pages or files they can or can't request from your site. Its main purpose is to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a page out of Google, use a `noindex` directive or password-protect the page.
Our generator creates a simple file that allows all crawlers access to every part of your site and includes a link to your sitemap; this is the correct setting for most public websites.
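As a rough sketch, such an allow-all `robots.txt` looks like the following (the domain in the `Sitemap` line is a placeholder for your own):

```
User-agent: *
Allow: /

Sitemap: https://yourdomain.com/sitemap.xml
```

`User-agent: *` applies the rules to every crawler, `Allow: /` grants access to all paths, and the `Sitemap` line must point to the sitemap's absolute URL.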
A sitemap is a file where you provide information about the pages, videos, and other files on your site, and the relationships between them. Search engines like Google read this file to crawl your site more intelligently. A sitemap tells the crawler which files you think are important on your site and provides valuable information about them: for example, when a page was last updated, how often it changes, and whether alternate language versions exist.
This tool automatically generates a sitemap including all the static and tool pages found in this application, giving you a great starting point for your own site.
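For reference, a minimal sitemap in the standard XML format might look like the sketch below; the URLs and dates are placeholders, and of the per-URL fields only `<loc>` is required:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/</loc>
    <lastmod>2024-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://yourdomain.com/tools/example-tool</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```

The optional `lastmod`, `changefreq`, and `priority` fields correspond to the "when last updated", "how often changed", and relative-importance hints described above.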
After generating the content, save each file under its respective name (`robots.txt` and `sitemap.xml`) and place both in the root directory of your website, so that they are accessible at `https://yourdomain.com/robots.txt` and `https://yourdomain.com/sitemap.xml`.
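If you deploy often, you can script this last step. Below is a minimal Python sketch, assuming you paste the generated content into the two string variables; `site_root` is a hypothetical path that you should change to your web server's document root:

```python
from pathlib import Path

# Assumption: replace with your web server's document root.
site_root = Path("/var/www/yourdomain.com")

# Paste the content generated by this tool into these strings.
robots_txt = """User-agent: *
Allow: /

Sitemap: https://yourdomain.com/sitemap.xml
"""

sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yourdomain.com/</loc></url>
</urlset>
"""

# Both files must live at the site root so crawlers find them
# at /robots.txt and /sitemap.xml.
(site_root / "robots.txt").write_text(robots_txt, encoding="utf-8")
(site_root / "sitemap.xml").write_text(sitemap_xml, encoding="utf-8")
```

Running the script overwrites any existing `robots.txt` and `sitemap.xml` at that path, which is usually what you want when regenerating.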