In this second article of my SEO 101 series, I will explain how to create an XML sitemap for search engines and how to add your site to search engines via their Webmaster Tools. Finally, I will cover a few basic Webmaster Tools concepts and features that will help you maintain your site's listings in search engines.
An XML sitemap makes it much easier for search engines (such as Google, Bing, Yahoo!, and Ask.com) to crawl and index your website or blog. The sitemap gives search engines a complete, structured view of your site, indicating when each page or post was last updated, how often it changes, and what priority you have assigned to it.
Create your sitemap following the XML sitemap protocol documented at sitemaps.org; this is the format used by all the major search engines. Google also explains sitemaps and their creation in great detail.
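To give you an idea of what the protocol looks like, here is a minimal sitemap with a single URL entry (the address, date, and values below are placeholders you would replace with your own):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2011-06-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>

Only the loc element is required; lastmod, changefreq, and priority are the optional hints mentioned above.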
Many Content Management Systems (CMS) have a sitemap plugin that you can install and use with minimal effort. These plugins also automatically update the last-modified date of each page, giving search engines the best possible chance of indexing your website.
Once your sitemap is finished, you will want to submit it to the search engines. The best way to do this is to create Webmaster Tools accounts with at least Bing and Google.
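Google also documents a simple HTTP "ping" URL you can request to tell it a sitemap has been updated; it takes roughly this form (the sitemap address is a placeholder):

    http://www.google.com/webmasters/tools/ping?sitemap=http://www.example.com/sitemap.xml

Submitting through the Webmaster Tools interface is still preferable, though, because it also gives you reporting on how your sitemap was processed.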
Please note that submitting an XML sitemap to any of these search engines does not guarantee that your pages will be included in its index.
You will have to sign up with each of the search engines and verify that you own your site, which can be done by uploading a file to your server or by adding a meta tag to your home page. Directions are provided when you sign up.
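As an illustration, the verification meta tags look something like this; the content value is a unique token each service generates for your account, so the ones below are made-up placeholders:

    <meta name="google-site-verification" content="your-unique-token" />
    <meta name="msvalidate.01" content="your-unique-token" />

The first is Google's tag and the second is Bing's; the tag goes in the head section of your home page.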
The main benefit of using the Webmaster Tools is to analyze, monitor, and improve your site's Search Engine Optimization (SEO). Among other things, you can submit and monitor sitemaps, review crawl errors, see which search queries bring visitors to your site, and check how many of your pages are indexed.
There is a ton of documentation online about the specific uses of the Webmaster Tools, so instead of going into great detail here, I suggest starting with the official help pages that Google and Bing provide for their respective tools.
The robots.txt file gives search engine crawlers, and other web robots, more information about your site. It can tell search engines which directories to skip so that their crawling is more efficient, and, as shown below, it can also point them to your sitemap. Google Webmaster Tools has a Generate robots.txt tool, or if you want to build your own robots.txt file, you can learn more at robotstxt.org.
If you are using a CMS, you may already have a robots.txt file. Either create a robots.txt file or open the one that came with your CMS, and add the sitemap auto-discovery directive as one line of text right at the bottom of the file (the address below is a placeholder):

    Sitemap: http://www.example.com/sitemap.xml
The sitemap location must be the fully qualified URL of your own sitemap, not a relative path.
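Putting it all together, a simple robots.txt file might look something like this (the disallowed directory and sitemap URL are illustrative placeholders):

    User-agent: *
    Disallow: /admin/
    Sitemap: http://www.example.com/sitemap.xml

Here the wildcard User-agent rule applies to all crawlers, the Disallow line tells them to skip the /admin/ directory, and the Sitemap line is the auto-discovery directive described above.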