In this blog I will give you some tips on how to improve the searchability of your website. I will update this blog regularly with new tips. If you have other tips, please feel free to add a comment to this blog.
1. Crawlability & indexation
1.1 Sitemap XMLs
By providing an XML sitemap you inform search engines which (new) pages a site contains, as well as which pages have been modified. This is especially important for pages that would otherwise not be found, or not be recrawled after changes have been made.
Example:
- Create a sitemap.xml file
- Make the sitemap.xml discoverable by adding it to robots.txt file
<?xml version="1.0" encoding="UTF-8"?> <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"> <url> <loc>http://example.com/voorbeeld.html</loc> <lastmod>2014-22-13T07:14Z</lastmod> </url> <!-- etc. --> </urlset> |
More information about XML sitemaps can be found on sitemaps.org.
1.2 Providing a robots.txt file
Create a robots.txt file and make it accessible at http://www.yoursite.com/robots.txt. The contents of the file should instruct search engines to allow indexing of the website. When there is a sitemap, reference it in the robots.txt file as well.
Example
User-agent: *
Allow: /
Sitemap: http://www.yoursite.com/sitemap.xml
1.3 URL structure
URL structure is an important factor for users and search engines alike. By selectively adding keywords to URLs they become more recognizable, which leads to better results than when keywords are only written on the page itself. Take note that a short URL commonly tends to perform better than a long URL, because shorter URLs are easier for humans to use. A fact search engines recognize and reward.
URLs should also be translated so that they target the language of the user. For example, German searchers are more likely to follow a URL when it is in their native language instead of an English translation.
- Translate URLs for each country
- Make sure to use keywords in the URL
- Lowercase URLs (see duplicate content issues 2.2)
- Shorten long URLs
When URLs are changed, make sure you 301 redirect the old URL to the new URL, otherwise the change will result in lowered rankings.
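As a minimal sketch, on an Apache server with mod_alias enabled (an assumption; other web servers offer an equivalent), a renamed page could be redirected like this, where /old-page.html and /new-page.html are hypothetical paths:
# Permanently redirect the old URL to the new one
Redirect 301 /old-page.html http://www.yoursite.com/new-page.html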
2. Duplicate content issues
2.1 Using the canonical tag
Make sure your website uses the <link rel="canonical"> tag. This tag can be used to solve many duplicate page issues. By using it, search engines treat a set of near-duplicate URLs as one page, as opposed to treating them all as separate pages with nearly identical content that end up competing with each other to rank.
For example, at YourSite the same webpage can be visited using the following different URLs:
- http://www.yoursite.com/nl/Latest-News/
- http://www.yoursite.com/nl/latest-news/
- http://www.yoursite.com/nl/Latest-News
All of these URLs serve exactly the same page, yet search engines see them as different pages because the URL changes. By providing a canonical tag that points to one unique address, search engines come to understand which page to return in the search results.
- Implement the canonical tag on each webpage within the <head> section of the HTML. The value of the href attribute should point to the URL that the search engine should index.
Example
<link rel="canonical" href="http://www.yoursite.com/nl/latest-news/">
Additional information about the canonical tag can be found in Google's documentation on canonical URLs.
2.2 Multiple (sub) domains
Many organisations use multiple registered domains. Having the same content available on multiple domains results in duplicate content, which leads to those domains competing against each other in the search results. Luckily, the solution for this is quite simple.
By 301-redirecting all domains to the main domain, only one source will be left.
- Make sure that the content of www.yoursite.com is not also accessible on yoursite.biz; redirect yoursite.biz to www.yoursite.com instead.
- Change 302 redirects to 301 redirects to make sure search engines understand the URL has permanently moved (see the sketch after this list).
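A minimal sketch of such a consolidation, assuming an Apache server with mod_rewrite enabled, where yoursite.biz is the secondary domain from the bullets above:
RewriteEngine On
# 301 redirect every request for yoursite.biz to the main domain
RewriteCond %{HTTP_HOST} ^(www\.)?yoursite\.biz$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]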
2.3 Redirecting HTTP to HTTPS
All webpages are currently accessible using both the HTTP and the HTTPS protocol. This is suboptimal, as it results in a duplicate of every page the site offers. Therefore it's important that the website is only accessible via one protocol. Preferably HTTPS, as Google recently added it as a ranking signal because it prefers a secure web over a non-secure one.
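A minimal sketch of such a redirect, again assuming Apache with mod_rewrite:
RewriteEngine On
# 301 redirect all HTTP traffic to the HTTPS version of the same URL
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]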
2.4 URLs without country structure
The webpages in the website have a country specification in the URL, for example http://www.yoursite.com/nl/News/. This page shows the visitor the Dutch version of the content. When changing the country specification to "en", the English content is shown.
Yet when there is no language specification, the content shown is in the language of the visitor's browser. For example http://www.yoursite.com/News/. This causes indexation problems: search engines do not send a language preference to the server and therefore only get to see the English content, which they also get presented when crawling a country-specific URL like /en/. De facto generating duplicate content.
Example
- http://www.yoursite.com/News/ should 301 redirect to http://www.yoursite.com/en/News/ (see the sketch below)
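A minimal sketch of this redirect, assuming Apache with mod_rewrite and assuming the site only uses the language codes nl and en:
RewriteEngine On
# Send URLs without a language code to the English version
RewriteCond %{REQUEST_URI} !^/(nl|en)/
RewriteRule ^(.*)$ http://www.yoursite.com/en/$1 [R=301,L]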
2.5 Rewrite URLs without trailing slashes
In many instances it is possible to request a URL both with and without a trailing slash. By picking one variant and 301 redirecting the other, duplicate content is reduced and the crawl efficiency of the website improves, as in the sketch below.
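A minimal sketch, once more assuming Apache with mod_rewrite, that redirects URLs without a trailing slash to the variant with one:
RewriteEngine On
# Skip requests for existing files (images, PDFs), then append the missing trailing slash
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !/$
RewriteRule ^(.*)$ http://www.yoursite.com/$1/ [R=301,L]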
This page is regularly updated with new tips and tricks for SEO. Feel free to put more tips in the comments below.
I will check them and add them to this article.