SEO - What is an XML Sitemap?

What is an XML Sitemap?
An XML sitemap is a simple XML file that contains information about one or more URLs on your website. The information stored there helps search engines crawl your site more effectively. At a minimum, it is just a list of your site's URLs, but to get more out of it you will want to include additional information as well.
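As a rough illustration, a minimal sitemap following the sitemaps.org protocol might look like this (the example.com domain is a placeholder, not from the original article):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page; only <loc> is required -->
  <url>
    <loc>http://www.example.com/</loc>
  </url>
</urlset>
```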

Dynamic Google XML Sitemap generator

SiteMap generator is a dynamic .NET-based sitemap generator. It crawls your website to generate an XML sitemap, and it offers a range of configurable variables for defining the sitemap's content. The Sitemaps protocol allows a webmaster to inform search engines about URLs on a website that are available for crawling. A Sitemap is an XML file that lists the URLs for a site. It lets webmasters include extra information about each URL: when it was last updated, how frequently it changes, and how important it is relative to other URLs on the site. This allows search engines to crawl the site more intelligently. Sitemaps are a URL inclusion protocol and complement robots.txt, a URL exclusion protocol.
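The optional per-URL details mentioned above map onto tags in the sitemaps.org schema. A sketch of a single `<url>` entry (the URL and values here are invented for illustration):

```xml
<url>
  <loc>http://www.example.com/about.html</loc>
  <!-- when the page was last updated -->
  <lastmod>2007-04-11</lastmod>
  <!-- how frequently the page changes -->
  <changefreq>monthly</changefreq>
  <!-- importance relative to other URLs on the site, 0.0 to 1.0 -->
  <priority>0.8</priority>
</url>
```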

Google, Microsoft and Yahoo united to support Sitemaps, a standardized method of submitting web pages to the search engines through feeds. The three are now joined by Ask.com in supporting the system, along with an extension of it called autodiscovery. This is where the major search engines will automatically locate your sitemaps file if its location is listed in a robots.txt file. Announcements are up from Google and Ask, as well as Yahoo and Microsoft.

Information on how to create sitemaps files can be found at the site. Aside from the sitemaps XML format, you can also provide RSS 2.0 or Atom 0.3 or 1.0 feeds. That’s handy for those with blogs that already generate these feeds.

Sitemaps XML files too complicated? Don’t run a blog? Note that the site has recently expanded its information on how you can submit a simple list of URLs in a text file.
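Per the sitemaps.org guidelines, such a text file is simply one full URL per line, UTF-8 encoded, with nothing else in the file (the URLs below are placeholders):

```text
http://www.example.com/
http://www.example.com/about.html
http://www.example.com/contact.html
```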

In the past, if you created a sitemaps file, you then had to manually tell the search engines where to find it. With today’s announcement, search engines will check your robots.txt file for a link to a sitemaps file, then get the file from that location. This is a big plus because all the major search engines regularly check robots.txt files as part of their ordinary crawling.
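Per the sitemaps.org autodiscovery convention, the robots.txt reference is a single directive of this general form:

```text
Sitemap: LOCATION-OF-SITEMAPS-FILE
```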


Replace LOCATION-OF-SITEMAPS-FILE with the actual location. For example, if you ran a site and had a sitemaps file called allmypages.xml at its top level, the reference would look like this:
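A sketch of the full robots.txt entry, assuming a hypothetical site at www.example.com (the domain is an illustration, not from the original text):

```text
Sitemap: http://www.example.com/allmypages.xml
```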


Have more than one sitemaps file? Ideally, you’d create a special "sitemaps index" file that links to all of them, then put a link to the sitemaps index file in your robots.txt file. If that sounds like too much work, you can have more than one sitemaps URL listed in the robots.txt file.
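A sitemap index file, per the sitemaps.org protocol, is itself a small XML file that points at the individual sitemaps. A sketch (the domain and file names are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <sitemap> entry per individual sitemaps file -->
  <sitemap>
    <loc>http://www.example.com/sitemap-pages.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-blog.xml</loc>
  </sitemap>
</sitemapindex>
```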

Aside from autodiscovery, you can also ping Google and Yahoo with the location of your file. The site has more instructions on this in general. For specifics:
Google: See here. Note that this pinging is different from the pinging Google also supports for blog search.
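Pinging generally means issuing a plain HTTP GET with your sitemap's URL as a parameter. A hedged sketch of what such a request to Google looked like at the time (the sitemap address must be URL-encoded, and the example sitemap URL is invented):

```text
http://www.google.com/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fallmypages.xml
```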

Yahoo: See here and here. Unlike Google, the same pinging system is used for both web and blog search, to my understanding.

Both Google and Yahoo also allow you to manually submit sitemaps files. In both cases, doing this via their Google Webmaster Central or Yahoo Site Explorer systems gives you access to specialized monitoring and reporting tools, plus information on how they crawl your site.
Source: Search Engine Land