How a site audit can improve your SEO
You can check your site for issues like the ones below, both to give your users a better experience and to improve your SEO.
Matching your website to a set of web standards is a key contributor to your SEO performance, so I thought I would share some basic checks you can do to improve your SEO. There are websites that can help you audit your site, but they may cost money or only analyse one page. A great tool to use is 'Screaming Frog SEO Spider Tool', which is free for smaller sites.
Page structure
Making sure that your page is well structured is often a quick win.
Nest your headings correctly
Your headings should be structured correctly: there should be only one main page title (H1). Underneath this sit your H2s; if an H2 has any subheadings, those are H3s, and the same goes for H4s under H3s. Below is an example of correctly nested headings:
<h1>Main title</h1>
<p>A bit about the page</p>
<h2>A subject</h2>
<h3>Part of the subject</h3>
<p>More about part of the subject which is related to a subject</p>
<h3>Another part of the subject</h3>
<p>More about another part</p>
<h2>A different subject entirely</h2>
Check your metadata
Make sure your metadata meets the following requirements:
Page title: under 60 characters
Meta description: between 150 and 160 characters
Meta keywords (optional): a short, comma-separated list; most modern search engines ignore these
Social tags (optional): these aren't used for ranking, but they can really help your site stand out when a page is linked to on Facebook, Twitter or a lot of other services (see the example below).
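As a rough sketch, here's what that metadata could look like in a page's <head> (all of the values and the image URL are placeholders, not from a real page):
<head>
<title>My page title</title>
<meta name="description" content="A description of the page, ideally between 150 and 160 characters, that tells both people and search engines what they will find here.">
<!-- Social tags: not used for ranking, but shown when the page is shared -->
<meta property="og:title" content="My page title">
<meta property="og:description" content="A description of the page.">
<meta property="og:image" content="https://example.com/images/preview.jpg">
<meta name="twitter:card" content="summary_large_image">
</head>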
Use of bold/strong tags
Using bold/strong tags can help highlight keywords within the text. However, these can be overused; a general rule is that no more than 24 words or phrases should be made bold.
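For example, highlighting a single keyword phrase (an illustrative sentence of my own):
<p>We deliver <strong>freshly baked pies</strong> across the UK.</p>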
Make sure images have alt tags
This is best practice anyway: all images should have alt tags or a caption so that partially sighted users can understand what the picture shows. Alt tags are also useful for SEO, as they can contain keywords that help search engines understand what the page is about. An alt tag should use 50-55 characters (up to 16 words).
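For example (an illustrative image path, not a real one):
<img src="/images/apple-pie.jpg" alt="A golden apple pie cooling on a kitchen windowsill">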
Check for broken links
Look at all your links and check whether any are not working. A broken link makes for a jarring user experience when a user clicks it. Search engines also crawl the links on your site, including any internal links, and attribute link juice accordingly. Chrome extensions such as 'Broken Link Checker' and 'Check my links' can help you with this.
Your content
Content length
There should be more than a few words on a page: enough content for a person or a search engine to understand the purpose of the page and gain something from it. For instance, I try to reach a minimum of 1000 words per blog post, as this helps readers understand the subject and (with my blog at least) gives them tips on what to look at going forward.
Keywords
Your content should also include your keywords within the text itself, as search engines will try to understand the subject of the page. For instance, if you have pie in your page title and description but don't mention it in your text, search engines will disregard this subject.
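Continuing the pie example, the keyword should appear in the title, the description and the body text itself (a minimal sketch with made-up content):
<title>The Best Homemade Pies</title>
<meta name="description" content="How we bake our award-winning homemade pies.">
<h1>The Best Homemade Pies</h1>
<p>Every pie we sell is baked fresh on the morning of delivery.</p>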
Mobile optimisation
Make sure your page works on mobile, as search engines generally rank pages which work on mobile higher (especially in mobile search results). This may be hard to do, but test your pages on your phone, see if the content works, and check whether any changes are needed to make it a better experience.
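If your pages don't already adapt to small screens, the usual first step is the responsive viewport meta tag, paired with CSS that scales to the screen width:
<meta name="viewport" content="width=device-width, initial-scale=1">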
Your page speed
Your page speed can be a big contributor towards your search ranking. I have gone into more detail on how you can improve your page speed in another blog post. If you're using a tool such as PageSpeed Insights, then I also have a blog post detailing what your score means.
Make sure you have a sitemap and a robots.txt
Sitemap
A sitemap tells a search engine where all your pages are, how often they're updated, when each was last updated and each page's priority. Screaming Frog can generate one for you, and various websites such as 'XML-Sitemaps' can too. There is also a gulp plugin which can generate a sitemap for you. Here is an example of a one-page sitemap:
<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:image="http://www.google.com/schemas/sitemap-image/1.1" xmlns:xhtml="http://www.w3.org/1999/xhtml">
<url>
<loc>https://aaron-russell.co.uk/</loc>
<lastmod>2019-01-23</lastmod>
<changefreq>daily</changefreq>
<priority>1.0</priority>
<image:image>
<image:loc>https://aaron-russell.co.uk/images/me.jpg</image:loc>
<image:title>A Picture of Aaron Russell</image:title>
</image:image>
</url>
</urlset>
As you can see in the above example, you can also include images, which lets them appear in image search too.
Robots.txt
This file should be under the root of your site (https://example.com/robots.txt). It lets search engines know whether they should crawl your site or not. Here is my robots.txt as an example:
User-agent: *
Disallow:
Sitemap: https://aaron-russell.co.uk/sitemap.xml
The above example allows all crawlers to crawl the entire site. If I wanted to disallow a certain page or set of pages, I could add them to Disallow. You can set rules for certain search engines too. So if you didn't want Google to crawl the section /dont-crawl/, you could do the following:
User-agent: Googlebot
Disallow: /dont-crawl/
User-agent: *
Allow: /
Sitemap: https://aaron-russell.co.uk/sitemap.xml
Finally, in the above example, you can see I've also added my Sitemap location. This just makes it easier for crawlers to find the sitemap.
Is your site available over HTTPS?
As you can see from this article, Google takes into account whether the webpage is secure. There are plenty of ways of becoming a secure site, both paid and free. If you use shared hosting, your hosting provider will often let you buy a certificate and have it installed. Some free alternatives are Let's Encrypt (this may require some knowledge of the command line) and Cloudflare (this may require some knowledge of DNS).
The reason Google likes sites that are accessible over HTTPS is that a user is more secure over HTTPS than over HTTP, so Google is happy to rank these results higher.
Don't want to do it yourself?
I'll do it for you just like I am for Zebra Hardware! Send me an email to discuss pricing or any queries! My email address is: aaron@russell-tech.co.uk