First, let’s understand how search engines view your website. Quite simply, they send robots that “crawl” the web and sites like yours. When your pages are crawled, they are “indexed” in a very large database and graded on over 200 factors. An algorithm then gives each page a score and ranks it against all the other pages based on the user’s query and the keywords searched.
On-Site SEO comprises all the SEO work that is done within your website. This is something you have complete control over, without relying on external websites or directories linking to your website. It covers keywords, content structure, site structure, domain and more.
Many SEO guides will tell you to “build for users, not for search engines”, which is half true. Of course we want to build for our users, because that is primarily what the website is for, and Google tries its best to rank websites by how useful they are to users. However, if you ignore the search engine aspect and build solely for users, the search engine robots may not recognise how valuable your website is to its users, and it may not attain the rank it deserves. Consult a professional SEO for an on-page audit; our partners at PDX SEO can help.
The Perfect on-site SEO will have the following:
- Great content that provides value to users – this in turn makes the site easy to market, attracts backlinks from other sites, and reduces the bounce rate (visitors arriving and then quickly leaving your site).
- Great user experience – the website design should be easy and intuitive for a user to browse. The site should load quickly, work across browsers and be visually attractive.
- Crawlable/accessible by bots – use rel=canonical to focus on one main address, keep the URL structure consistent, use robots.txt and a sitemap, use 301s for permanent redirects and a 503 status code for temporarily down pages.
- Keyword targeted – the target keyword for the page should appear in the page name, title, H1 tag, content, URL, image filenames and alt attributes, internal and external links, meta description and meta keywords.
Domain Name and Hosting
Although the domain name does not have as strong an influence over rankings as it did in the past, it is still a factor. You may not have control over this, as it may be a brand name such as Graftene, but if you do, it’s important to get it right.
Exact Match Domain
Try to get the exact match domain, for example www.southamptonplumbers.co.uk if you are a plumber in Southampton. However, the bonus of an exact match domain comes at a price: make sure your website is not in violation of Google’s EMD Update (http://searchengineland.com/google-emd-update-research-and-thoughts-137340). In short, if you adopt an exact match domain, make sure you have a high-quality website with high-quality backlinks, and don’t spam your pages with keywords – that would be over-optimisation. Google will think you are trying to manipulate them, and they don’t like that!
Keywords in Domain
Get the target keyword within your domain; it helps with CTR (click-through rate) and is still a ranking factor.
Country TLD Extension
If you are targeting local customers and want to attract local business, go with the country domain, such as www.yourwebsite.co.uk for ranking in google.co.uk. If you want to target globally, or want to be more flexible and generic, you can use .com, which can rank in google.co.uk or google.fr. The opposite is not true, however: www.yourwebsite.co.uk would have difficulty ranking in google.fr.
You should also try to host your website in the same country you are targeting; if you are targeting google.co.uk customers, pick a host based in the UK.
There is a debate on the importance of the country TLD (top-level domain), as many downplay its significance. In fact, in both Bing Webmaster Tools (http://www.bing.com/toolbox/webmaster) and Google Webmaster Tools (https://www.google.com/webmasters/tools/) you can geo-target the location you want, so even if you choose a .com TLD you can tell the search engines to target google.co.uk. I believe, however, that the country TLD helps and is a factor, however small. You can read more on this topic here: http://www.seochat.com/c/a/google-optimization-help/geo-targeting-techniques-in-google-for-seo/
Domain Age
Domain age is also a factor: older domains with no bad history are considered more trustworthy. However, you have no control over this, other than perhaps not buying a blacklisted domain.
Keyword Research
Keyword research could be a guide in itself. However, for the purposes of this guide, use Google Keyword Planner (https://adwords.google.co.uk/KeywordPlanner) to get a basic idea of how keywords perform. You want to balance the difficulty of the competition for a keyword against its search volume, so an ideal keyword is one that doesn’t have much competition yet gets searched often each month. Google Keyword Planner is fairly straightforward to use: set your business, target keyword and target location, and Google will generate a list of keyword ideas for you. These keywords also come with useful stats, such as how often each one is searched per month and how difficult the competition is.
Moz’s keyword difficulty tool (https://moz.com/researchtools/keyword-difficulty) is also useful for keyword research, but unfortunately it’s not free. Of course, keyword research is far more complex than this, but that should help you get started.
Keyword Targeting
Each page should target 2–3 keywords that you are looking to rank for, and they should be relevant to the page content. These keywords will of course come from your keyword research. We no longer spam keywords all over the page as the old SEO practice did; however, it is still important to get them onto the page to let the search engines know what it is primarily about. Here are the basics of good keyword targeting practice:
Page Title
Page titles are not only crucial for improving your site’s CTR (click-through rate), as the title is what users see in the search engine results; this element also has a strong influence over your ranking, since it determines the page’s relevancy to the user’s search query.
- Use target keyword preferably at start of title
- Limit to 70 characters
- Use “|” symbol to divide phrases
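Putting those points together, the title element for a hypothetical Southampton plumbing firm (the business name and phrases are illustrative) might look like:

```html
<head>
  <!-- Target keyword first, phrases divided by "|", kept under 70 characters -->
  <title>Southampton Plumbers | Emergency Repairs | Acme Plumbing</title>
</head>
```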
Meta Description
The meta description is the snippet users will see under your page in the search engine results. It doesn’t directly affect SEO, but it can have a big impact on your click-through rate.
- Use targeted keywords
- Accurately describe the page
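As a sketch, a meta description for the same hypothetical plumbing page could be (the wording is illustrative; aim for roughly 155 characters so it isn’t truncated in the results):

```html
<meta name="description" content="Southampton plumbers offering emergency repairs, boiler servicing and bathroom installation. Call Acme Plumbing for a free quote.">
```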
Headings
Headings let the user know what the content is about. It’s good practice to split the content into sections with heading tags (H1, H2, H3), and be careful not to over-optimise.
- Include Keyword in H1 and ideally H2.
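A simple heading hierarchy with the keyword in the H1 and an H2 might look like this (the page topic and section names are illustrative):

```html
<h1>Web Design in Bournemouth</h1>

<h2>Responsive Web Design</h2>
<p>Content about responsive design services.</p>

<h2>Recent Web Design Projects</h2>
<p>Content about recent work.</p>
```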
Content
Write the content for your users: it should be relevant and read naturally, not be stuffed with keywords for the purpose of manipulating search engines. Get a good balance here – you want users to navigate your site easily and intuitively, but you also want to get the keywords in. This should happen naturally; for example, if you do web design in Bournemouth, it should be easy to mention “Web Design in Bournemouth” in your content without it looking forced.
- Include the keywords at least once in the copy.
- It would be preferable to separate the content with header tags (H2, H3).
- Good content is easier to market and share to acquire backlinks.
Images
If you see an image of a dog, you know it’s a dog, but search engine crawlers don’t. You can use the alt and title attributes on the image to let users and crawlers know.
<img src="filename.gif" alt="alt-text here" title="title here">
- Use a title attribute on images.
- Use an alt attribute on images, and ideally have the keyword in at least one alt attribute.
- Optimise the images, reduce file size to increase site load speed.
URL Structure
- Use a dash (-) to separate words in the URL.
- Have the keyword within the URL if it’s relevant.
- Write URLs as static, readable text; exclude URL parameters.
- Keep the URL structure consistent. For example, if you want to keep the trailing slash, i.e. graftene.com/, then keep it consistent throughout the site: graftene.com/About-us/.
Avoid duplicate content: some CMSs may generate different versions of the same page, which could be classified as duplicate content. You can use the Siteliner tool (http://www.siteliner.com/) to identify duplicate content.
Feel free to link out to another website from within your content if you feel it will help the user. This can also increase your site’s trust rating if you are linking out to a high-authority website.
These are the important SEO factors you can optimise across your site to improve your SERP ranking. In short, make sure your website is user friendly, with a good navigation system, good indexable content, fast loading, a sitemap, crawlability by robots and a consistent URL structure.
Site Speed
Websites that load faster are better for user experience and conversion rates, and speed is also an important ranking factor. Make sure your website is up to speed by optimising the code, images and media. You can use tools like Google PageSpeed Insights (https://developers.google.com/speed/pagespeed/insights/) to help you with this process.
Internal Links and Navigation
Set up the links and navigation within your website to be intuitive and easy to follow. A user should not have to go through four pages to reach the one they want; navigating a website shouldn’t be like trying to solve hard riddles. It is also important not to link to broken pages. Internal linking helps with the site’s trust flow too.
Schema / Rich Snippet Markup
This may not directly contribute to SEO, but it can help by increasing conversion rates. Read more at https://developers.google.com/structured-data/ and create schema at http://schema-creator.org/
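As a sketch, schema.org markup for a hypothetical local business could be embedded in the page as JSON-LD (all names and details here are illustrative):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Plumbing",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Southampton",
    "addressCountry": "GB"
  },
  "url": "http://www.example.com/"
}
</script>
```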
User Engagement
User engagement is measured by many factors; Google uses it as a measure of user experience, and it’s a contributing factor towards the website’s SERP (Search Engine Results Page) ranking. Ensure good user engagement with the basics: good site speed, intuitive navigation and useful content. Some measuring sticks for quality user engagement are:
- Bounce rate – the bounce rate reflects users visiting your website and quickly leaving. There could be a number of reasons for users leaving quickly: the site is too slow, not informative, or has poor UX. Whatever the reason, Google sees a high bounce rate as a sign of poor user experience, which results in a lower SERP ranking. A low bounce rate, on the other hand, is a positive.
- Direct Traffic – It’s a positive indicator if a site receives more direct traffic.
- High click-through rate – a better click-through rate on the SERP is a positive indicator.
- Returning visitors – A good amount of returning visitors is a positive indicator.
Domain Trust
A website with good domain trust ranks higher in the SERPs and can get away with more without being penalised. Here is what you can do to increase domain trust:
- Link out to High Authority Websites – Linking out won’t hurt your SEO, it can help in fact if the website you are linking to is relevant.
- Keep bounce rate low – Improve relevancy of content, meta description and UX design of site to lower bounce rate.
- Keep your domain WHOIS public – public WHOIS records are more trusted than private ones.
- Get some trusted backlinks – links from .edu, .gov or .ac.uk domains help.
Mobile Responsiveness
Make sure your website is mobile responsive. You can use Google’s Mobile-Friendly Test to check.
Canonicalisation
Canonicalisation refers to pages that can be loaded from multiple URLs (web addresses). For example, the web addresses below could all serve an identical page:
http://example.com
http://www.example.com
http://www.example.com/index.html
For SEO purposes we don’t want this to occur, because the search engines may treat these pages separately and therefore split the overall popularity of the page; instead we want to combine them into one. Ten backlinks going to http://example.com and ten to http://www.example.com could then be counted as 20 backlinks to the canonical URL. There are a few ways to fix this issue, but the two most common are a 301 redirect to the preferred URL and the rel="canonical" link element:
<link href="http://www.example.com/canonical-page/" rel="canonical" />
More on canonicalisation: https://support.google.com/webmasters/answer/139066?hl=en
A common mistake when redirecting between http://www.example.com and http://www.example.com/index.html is creating an infinite loop. An explanation of the fix is here: https://moz.com/blog/apache-redirect-an-index-file-to-your-domain-without-looping
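As a sketch of the redirect fix, an Apache .htaccess file (assuming mod_rewrite is available; the domain is illustrative) could 301 both the bare domain and /index.html to the canonical www homepage without looping:

```apacheconf
RewriteEngine On

# 301 the bare domain to the www version
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# 301 /index.html to the root; matching THE_REQUEST (the original
# request line) rather than the rewritten URL avoids an infinite loop
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html
RewriteRule ^index\.html$ http://www.example.com/ [R=301,L]
```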
Sitemap
Most sitemaps come in XML (Extensible Markup Language) format. You will usually see the file as sitemap.xml in the main website directory. It tells search engines your website’s structure and the pages on it, which helps them index the right pages.
Visit Sitemaps.org for more info or build them at XML-Sitemaps.com.
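A minimal sitemap.xml has this shape (the URLs and dates are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2016-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about-us/</loc>
    <priority>0.8</priority>
  </url>
</urlset>
```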
Robots.txt
The robots.txt file instructs search engine crawlers how to crawl the website. You can, for example, disallow robots from crawling a certain page if you do not want the search engines to index it. Create a robots.txt file with the generator at http://tools.seobook.com/robots-txt/generator/, paste the output into Notepad, name the file robots.txt and save it in the main directory of your website.
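A simple robots.txt might look like this (the disallowed path is illustrative):

```
User-agent: *
Disallow: /admin/

Sitemap: http://www.example.com/sitemap.xml
```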
Webmaster Tools (Search Console)
Webmaster Tools is essential for managing your website, from the point of view of developers, webmasters and SEO marketers alike. It gives you invaluable data and control over your website and how Google treats it.
More on Webmaster Tools here: https://blog.kissmetrics.com/beginners-guide-to-google-webmaster-tools/ There is also a Bing Webmaster Tools equivalent, and the features are very similar.
Google Analytics
Google Analytics is free web analytics software from Google. It is essential for online marketing and SEO purposes. Google Analytics lets you track your website’s traffic and answer important questions such as:
- How many people visit my website?
- Where are they from?
- How did they find my site?
- What pages are the most popular?
- What is the conversion rate?
More on Google Analytics: https://moz.com/blog/absolute-beginners-guide-to-google-analytics
Next: 2. Complete SEO Guide – Off-site SEO