In my opinion, the most important factors in SEO are content, backlinks, and site structure. Building a new website with a well-thought-out structure helps both users and search engines navigate your site smoothly and crawl one page after another. But no matter how clear the structure may be, I would always recommend creating a sitemap.xml. This file lists the URLs of your pages in a format that bots can process reliably, and it provides additional information about each page, e.g. how often it is updated and how important it is compared to the other pages of your site. The sitemap.xml follows the XML schema of the sitemap protocol, which all major search engines have agreed on.

Depending on the system you use, there are different ways to create such a sitemap.xml file:

If you build the website yourself, you can create and update the sitemap.xml manually using a web development tool or a simple text editor. Going this way, keep in mind that you have to reflect every change (e.g. creating new pages, deleting old pages) in the file yourself. And always stick to the XML schema of the sitemap protocol; otherwise, bots will not be able to read your sitemap.xml. Alternatively, you can implement a service that regenerates the sitemap.xml at a time interval you define.

Using a CMS like WordPress, it's much easier: there are many helpful plugins for this job, and you will find a sitemap XML generator within a few minutes. Most plugins offer various customization options, e.g. defining the interval at which a new sitemap.xml is generated or excluding certain page types.

Via Google Webmaster Tools, you can inform Google that you provide a sitemap.xml for your site. In my next blog post, I will tell you why it is highly important to have a robots.txt file and how to connect it with your sitemap.xml.
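To make the format described above concrete, here is a minimal sitemap.xml that follows the sitemap protocol. The URLs, dates, and frequencies are placeholders for illustration; replace them with the pages of your own site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the only required child element of url -->
    <loc>https://www.example.com/</loc>
    <lastmod>2015-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/blog/</loc>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only the `<loc>` element is mandatory; `<lastmod>`, `<changefreq>`, and `<priority>` are optional hints for the crawler.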
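If you go the manual route but want to avoid editing the file by hand, a small script can regenerate the sitemap.xml from a list of pages, e.g. run on a schedule. Here is a minimal sketch in Python using only the standard library; the page list, change frequencies, and priorities are assumptions for illustration:

```python
import xml.etree.ElementTree as ET
from datetime import date

# Namespace defined by the sitemap protocol (sitemaps.org).
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Build a sitemap.xml string from (url, changefreq, priority) tuples."""
    # Register the namespace as default so elements are emitted without a prefix.
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for url, changefreq, priority in pages:
        entry = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(entry, f"{{{SITEMAP_NS}}}loc").text = url
        # Use the generation date as lastmod; a real site would track this per page.
        ET.SubElement(entry, f"{{{SITEMAP_NS}}}lastmod").text = date.today().isoformat()
        ET.SubElement(entry, f"{{{SITEMAP_NS}}}changefreq").text = changefreq
        ET.SubElement(entry, f"{{{SITEMAP_NS}}}priority").text = priority
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

# Hypothetical page list; in practice you would read it from your site's data.
pages = [
    ("https://www.example.com/", "weekly", "1.0"),
    ("https://www.example.com/blog/", "daily", "0.8"),
]
print(build_sitemap(pages))
```

Scheduling the script with cron (or any task runner) at the interval you define gives you the self-updating service mentioned above.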