Need someone who knows how to build proper Google XML sitemaps (following the Sitemap protocol).
I have a client that has two sites with 50 HTML and CF (ColdFusion) pages, and they need Google XML sitemaps created.
**About Google Sitemaps** Search engines such as Google discover information about your site by employing software known as "spiders" to crawl the web. Once the spiders find a site, they follow links within the site to gather information about all the pages. The spiders periodically revisit sites to find new or changed content.
Google Sitemaps is an experiment in web crawling. By using Sitemaps to inform and direct our crawlers, we hope to expand our coverage of the web and speed up the discovery and addition of pages to our index.
If your site has dynamic content or pages that aren't easily discovered by following links, you can use a Sitemap file to provide information about the pages on your site. This helps the spiders know what URLs are available on your site and about how often they change.
A Sitemap provides an additional view into your site (just as your home page and HTML site map do). This program does not replace our normal methods of crawling the web. Google still searches and indexes your sites the same way it has done in the past whether or not you use this program. A Sitemap simply gives Google additional information that we may not otherwise discover. Sites are never penalized for using this service. This is a beta program, so we cannot make any predictions or guarantees about when or if your URLs will be crawled or added to our index. Over time, we expect both coverage and time-to-index to improve as we refine our processes and better understand webmasters' needs.
Google Sitemap Protocol:
The Sitemap Protocol allows you to inform search engines about URLs on your websites that are available for crawling. In its simplest form, a Sitemap that uses the Google Sitemap Protocol is an XML file that lists URLs for a site. The protocol was written to be highly scalable so it can accommodate sites of any size. It also enables webmasters to include additional information about each URL (when it was last updated; how often it changes; how important it is in relation to other URLs in the site) so that search engines can more intelligently crawl the site.
Sitemaps are particularly beneficial when users cannot reach all areas of a website through a browsable interface, i.e., when certain pages or regions of a site cannot be reached by following links. For example, any site where certain pages are only accessible via a search form would benefit from creating a Sitemap and submitting it to search engines.
This document describes the formats for Sitemap files and also explains where you should post your Sitemap files so that search engines can retrieve them.
Please note that the Sitemap Protocol supplements, but does not replace, the crawl-based mechanisms that search engines already use to discover URLs. By submitting a Sitemap (or Sitemaps) to a search engine, you will help that engine's crawlers to do a better job of crawling your site.
Using this protocol **does not** guarantee that your webpages will be included in search indexes. In addition, using this protocol will not influence the way your pages are ranked by a search engine.
Sitemap 0.84 is offered under the terms of the Attribution-ShareAlike Creative Commons License.
**XML Sitemap Format**
The Sitemap Protocol format consists of XML tags. All data values in a Sitemap must be entity-escaped. The file itself must be UTF-8 encoded.
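As a quick illustration of entity-escaping (a sketch; the URL is a made-up example), an ampersand in a query string must not appear raw inside a data value:

```python
from xml.sax.saxutils import escape  # escapes &, <, > for XML data values

# Hypothetical CF page URL with a query string; "&" is illegal raw in XML.
raw = "http://www.example.com/page.cfm?id=1&cat=2"
print(escape(raw))  # http://www.example.com/page.cfm?id=1&amp;cat=2
```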
A sample Sitemap that contains just one URL and uses all optional tags (<lastmod>, <changefreq>, and <priority>) is shown below.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="[url removed, login to view]">
   <url>
      <loc>[url removed, login to view]</loc>
      <lastmod>2005-01-01</lastmod>
      <changefreq>monthly</changefreq>
      <priority>0.8</priority>
   </url>
</urlset>
The Sitemap must:
* Begin with an opening <urlset> tag and end with a closing </urlset> tag.
* Include a <url> entry for each URL, as a parent XML tag.
* Include a <loc> child entry for each <url> parent tag.
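The rules above can be sketched as a small generator. This is only an illustration, not the deliverable: the base URL, page list, and output filename are placeholder assumptions, and I believe the 0.84 namespace URL shown is the one the protocol specifies.

```python
# Sketch of a sitemap generator for a site whose page paths are known.
from xml.sax.saxutils import escape

def build_sitemap(base_url, paths):
    """Return a Sitemap 0.84 document with one <url> entry per path."""
    parts = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">']
    for path in paths:
        # All data values must be entity-escaped per the protocol.
        parts.append('   <url>')
        parts.append('      <loc>%s</loc>' % escape(base_url + path))
        parts.append('   </url>')
    parts.append('</urlset>')
    return '\n'.join(parts)

# Placeholder site and pages; a real run would enumerate the HTML/CF pages.
xml = build_sitemap('http://www.example.com',
                    ['/index.html', '/contact.cfm', '/search.cfm?cat=1&id=2'])
open('sitemap.xml', 'w', encoding='utf-8').write(xml)  # file must be UTF-8
```

The same loop would extend naturally to emit the optional <lastmod>, <changefreq>, and <priority> children per URL.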
What I need is for the coder to create Google sitemaps for the two sites I have. The sites are currently in HTML and CF.
IE, Firefox, Windows, and Mac.