Google has long depended on external links from pages it already knows about in order to find new websites.
For webmasters and website owners, Google Sitemaps is the most important development to hit the Internet since RSS or Blog and Ping. RSS and Blog and Ping let webmasters alert the search engines to new additions to their web pages, even though that was never the primary purpose of those systems.
If you’ve ever waited weeks or months to get your web pages found and indexed, you’ll know how excited we webmasters get when someone discovers a way to get pages found more quickly. Well, that new way has just arrived in Google Sitemaps, and it’s a whole lot simpler than setting up an RSS feed or Blog and Ping. If you haven’t heard of Blog and Ping, it’s a technique for prompting the search engines to crawl your new website content within a matter of hours.
If you’re a webmaster or website owner, Google Sitemaps is something you can’t afford to ignore, even if you’re also using RSS and/or Blog and Ping. The reason you should start using Google Sitemaps is that it’s designed solely to alert and direct Google’s search engine crawlers to your web pages. RSS and Blog and Ping can alert search engines indirectly, but that’s not their primary purpose.
These indirect methods work for now, but like most things they are being abused, and the search engines will find ways to combat the abuse, as they have with every form of abuse that has gone before. Abusing the search engines is a short-term strategy, not a long-term one, and certain forms of abuse will get you banned from a search engine’s index.
You may also be thinking: don’t we already have web page meta tags that tell a search engine when to revisit a page? That’s true, but the search engine spider still has to find the new page before it can read the meta tag. Besides, meta tags are out of favour with many search engines, especially Google, because of abuse.
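For reference, here is the sort of revisit meta tag referred to above; it sits in a page’s head section. This is a minimal illustrative snippet, and as noted, most search engines, Google included, pay it little or no attention these days:

    <head>
      <!-- Asks spiders to re-crawl the page every 7 days; widely published, largely ignored -->
      <meta name="revisit-after" content="7 days">
    </head>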
If talk of search engine spiders leaves you confused, they’re nothing more than software programs that scour the Internet, visiting websites and looking for changes and new pages.
How often the search engine spider, also known as a robot, visits your website depends on how often your site content is updated or on how often you alert it to a change. Otherwise, a search engine like Google may visit a website only once a month.
As the Internet gets bigger every second of every day, the problem for search engines and webmasters grows with it. For the search engines, it’s taking their spiders longer to crawl the web for new sites and for updates to existing ones. For the webmaster, it’s taking longer and becoming more difficult to get web pages found and indexed by the search engines. And if your web pages aren’t found and indexed, they’ll never appear in a search and you’ll get no search engine visitors to those pages.
The answer to this problem, at least for Google, is Google Sitemaps. Whilst still in a beta phase as Google refines the process, it’s fully expected that this system, or one very like it, is here to stay. Google Sitemaps is clearly a win-win situation.
Google wins because it no longer wastes huge resources crawling websites that have not changed. Webmasters win because, through Google Sitemaps, they can tell Google exactly what content has changed or been added to a website and direct Google’s crawlers to the exact pages. For any webmaster who uses it, Google Sitemaps has the potential to speed up the discovery and addition of pages to Google’s index.
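To make that concrete, here is a minimal example of the kind of sitemap file Google Sitemaps accepts. The URL and dates are invented placeholders, and the namespace shown is the one Google published during the beta, so check Google’s documentation for the current details:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
      <!-- One <url> entry per page you want Google to know about -->
      <url>
        <loc>http://www.example.com/new-page.html</loc>
        <!-- lastmod and changefreq are what let Google skip pages that haven't changed -->
        <lastmod>2005-06-20</lastmod>
        <changefreq>weekly</changefreq>
        <!-- priority ranks this page against your own other pages, from 0.0 to 1.0 -->
        <priority>0.8</priority>
      </url>
    </urlset>

You upload the file to your web server and tell Google where it lives; from then on, the crawler can head straight for the pages listed as new or changed.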
Conventional sitemaps have been used by webmasters for quite some time to make their websites easier for the search engine spiders to crawl. This type of sitemap is a directory of all the pages on the website that the webmaster wants the search engines or visitors to find.
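By way of contrast, a conventional sitemap is just an ordinary web page of links, along these lines; the page names below are invented for illustration:

    <!-- sitemap.html: a plain, human-readable directory of the site's pages -->
    <h1>Site Map</h1>
    <ul>
      <li><a href="/index.html">Home</a></li>
      <li><a href="/products.html">Products</a></li>
      <li><a href="/about.html">About Us</a></li>
      <li><a href="/contact.html">Contact</a></li>
    </ul>

A spider that lands on this page can follow every link and so reach the whole site. A Google Sitemap does the same job, but in a machine-readable format delivered straight to Google.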