If you’re unsure about why you’re not getting the rankings that you want, an SEO audit is the best way to find out.
When an agency takes on a new client, it’s the first thing that they should do. It’s an important part of the process and allows the agency to figure out what you’ve done in the past, where there are gaps and what should be done going forward.
In this article, we’ll be looking at how to complete an SEO audit and why it’s something that all businesses should do.
The usefulness of an audit is dependent on the questions that you ask. Therefore, it’s important to have a good understanding of SEO before you conduct an audit.
Some audits can take only an hour, while an audit for a larger company could take days or even weeks. It all depends on the number of pages, the links and the complexity of the website itself.
What is an SEO Audit?
An SEO audit is a complete look at the content, the backlink profile and the technical SEO of a website.
The goal of any audit should be to see what you’re doing well and to spot opportunities for you to improve.
By the end of an audit, you should have a clear list of where you need to take action and what plans you should make for future improvements.
As well as this list of action points you should be able to see what you’ve done well in the past and learn from the successes you’ve had.
When Should You Do An SEO Audit?
SEO audits aren’t something that you need to be doing every single week. Instead, they should be done periodically when you’ve been working actively to improve your rankings.
We will complete an audit when we first sign a client and also periodically throughout the year to assess the impact of our work.
An audit is the best way for you to evaluate performance. Rankings are the ultimate goal, but we all know that rankings can take time and an audit can give you information about the quality of your work before the rankings reflect the changes.
What Are Your Strategic Objectives?
SEO should be just one of many different marketing methods you’re using to grow your business.
While it’s often good to keep SEO-related goals, most of your goals should be tracked with traditional business KPIs.
At the end of the day, it’s nice to say that you built 100 new backlinks, but if that doesn’t translate to money in your pocket at the end of the year, it’s useless.
It’s important that your goals are as useful as possible to you as you work. That’s why we recommend using S.M.A.R.T. goals: Specific, Measurable, Attainable, Relevant and Timely.
This type of goal gives you the easiest feedback and prevents you from cheating your way to completing a goal.
Here are some examples of goals that you might have on your next SEO campaign:
My website will see an increase in organic search traffic of 100% within the next 24 months.
My website will acquire 30 referring domains within the next 4 months.
Again, these are great metrics to track the progress of your SEO campaign, but they don’t tell us a lot about how they have impacted the business. To do this, you’ll need to use similar goals for your business:
Monthly revenue from online leads will increase by 20% within the next 12 months.
Cash flow is king, and if your revenue isn’t growing, your SEO isn’t working for you.
When you’re analyzing your keywords, you want to figure out if you’re targeting the right keywords and if not, what you should be targeting instead.
We would all love to rank for terms that have 100,000 searches per month, but sometimes that’s not realistic.
A good audit should be able to tell you if you’re targeting keywords out of your league or if you’re being overly conservative. Most of the time, businesses will try to rank for keywords that their authority doesn’t deserve.
These huge keywords are normally reserved for the biggest brands in each industry.
Instead, you want to find keywords that you have a strong chance of ranking for. You want to take the guesswork out of your SEO.
It’s a good idea to regularly do keyword analysis, at least twice per year, to figure out if you’re on the right path or if you need to make some adjustments.
You’ll often find that the vast majority of your traffic comes from just a few keywords.
In this case, you want to isolate the keywords that you do well on and focus on maintaining those.
If you’re unsure about what keywords you should be targeting, then you can go back to using S.M.A.R.T goals.
You want to have a very set list of keywords that you want to rank for.
It’s not particularly helpful if you change your mind every week about what you want to rank for.
Instead, choose a small list of 50 keywords that you’re going to focus on in the short-term. Once you’ve ranked for those keywords, you can expand into other niches, but having a specific list will prevent you from becoming distracted by other shiny keywords.
Fortunately, tracking your keywords is very easy.
You can either choose to use an external tool like Pro Rank Tracker which will keep all of your keywords in one place and generate reports, or you can use Google Search Console.
Either way, you should be looking at your keyword progress on a weekly basis.
Any more frequently than that is a waste of time.
Although your keyword rankings aren’t the most important part of your business, they give you a good indication of the progress you’re making, and the time it might take to rank.
There’s no use in wasting your time trying to rank for keywords that are beyond what your website is capable of.
If you’re competing against much bigger competitors, then you’re going to struggle because:
Your competitors’ websites are older and more trusted by Google, giving them an edge in the SERPs.
Their websites have a better backlink profile, and so they can outrank you, even with fewer links to that particular page.
They might have much larger budgets than you which enable them to rank for keywords that would take you years to rank for by yourself.
When you’re looking at the kinds of keywords that you’re capable of ranking for you need to check your ego at the door.
It might be the case that in the short-term you need to set your focus on long tail keywords that have much fewer searches per month.
Fortunately, ranking #1 for a keyword that has 200 searches per month is far more valuable than ranking #100 for a keyword that has 20,000 searches per month.
Focus on the traffic that you can realistically obtain rather than on the traffic you could get if you ranked #1.
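To make that trade-off concrete, here’s a back-of-the-envelope Python sketch. The click-through rates below are illustrative assumptions, not measured data; real CTR varies by query and SERP layout.

```python
# Rough expected-traffic comparison for two ranking scenarios.
# The CTR figures are illustrative assumptions, not measured data.
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10}  # beyond page one, CTR is near zero

def expected_monthly_visits(search_volume, position):
    """Estimate monthly visits from one keyword at a given rank."""
    return search_volume * CTR_BY_POSITION.get(position, 0.0)

# Ranking #1 for a small keyword beats ranking nowhere for a big one:
small = expected_monthly_visits(200, 1)       # roughly 60 visits
big = expected_monthly_visits(20_000, 100)    # effectively 0 visits
```

Under these assumptions, the 200-searches-per-month keyword wins by a mile, which is the whole point of choosing realistic targets.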
Obviously, you want to rank for keywords that will bring targeted traffic to your website. You can rank for massive keywords, but if the visitors aren’t interested in your business, it won’t translate into revenue.
You should be realistic about how long you’re willing to pursue a keyword.
You might only have a limited budget that can only stretch for one year, meaning that you need to see a return from your efforts before the budget disappears.
When you’re picking keywords, you need to factor in the time it will take to rank for them and the return that you can expect when you do rank. This will help to give you a better idea of the value of the keyword to you, preventing you from wasting your time and resources.
Once you’ve found some keywords that you think you’d like to target you need to look at the competition.
Any SEO audit should analyze the competition because:
- It will allow you to see if a keyword is too competitive
- It shows you what content type and length is performing well in the SERPs
- You can look at your competitors’ link profiles to see if you can copy any of their links
This is going to give you more information, allowing you to validate the keywords you’ve selected. This is a critical stage. Without this competitor research, you’ll waste time trying to rank for a keyword that was never possible.
Whenever you look at a keyword and its competition you should be asking yourself if it’s too competitive. If it’s not, is it too easy?
You shouldn’t disregard easy keywords, but you also want to be ambitious. Log those easy keywords into a spreadsheet and leave it for when you’re running out of keywords to target.
You want to find the Goldilocks keywords: not too hard, not too easy, just right.
These are the keywords that you won’t instantly rank for, but you should be #1 for within a month or two. They are going to have decent search volume and should be relatively valuable.
The first thing you should do is take a look at the strength of the pages and the domains in the top five spots.
You can do this in Ahrefs Content Explorer.
Ideally, you want keywords that have lower UR and DR. UR is a good metric for figuring out how powerful a particular page is and how difficult it will be to outrank.
Although this isn’t the most in-depth research, this is a quick way to rule out overly competitive keywords so that you can spend more time on those keywords that have potential.
When you’re looking at the competition, you should be looking at the strength of the other websites in comparison to yours.
A brand new website might find UR 20 very difficult to outrank, but Gizmodo or Vogue laugh in the face of UR 20 because they have thousands of referring domains and years of trust.
Once you’ve found a keyword that you think has potential, you should look into the individual backlinks to each of the top five pages for that keyword.
You’re looking at the UR and DR of each of the backlinks to gauge the strength of the links they’ve acquired and how hard they would be to beat.
DR is a metric that represents the strength of a website’s backlink profile. So, a link from a site with a higher DR will have more impact.
When you’re looking at the backlink profile, you want to compare your own to theirs. But just as importantly, you need to ask yourself if you could replicate the links that they have to their page.
Tip for finding “Easy Wins.”
If you’re struggling to find keywords that you can rank for, then you’ll love this tip.
Firstly, find a website in your niche that you admire, that ranks for similar keywords and hopefully doesn’t have the absolute best content in the world.
Take their URL and put it into Ahrefs. Now, if you go to the organic keywords report, you can get a full list of all the keywords that their entire website ranks for.
Ahrefs has a great metric called KD, keyword difficulty, which represents the difficulty of a keyword based on how many backlinks the top pages have.
Filter by keywords that have KD 0-1 and then also filter by positions 1-2. This will show you keywords that the website ranks first or second for and that have little competition.
Finally, take a look at the page itself. If you can produce better, more useful and more detailed content around that keyword, you’ve found a great keyword to target.
During an audit, you should perform a technical analysis on your website. The purpose of this is to find any issues with your website that could either harm your SEO or hurt your user’s experience when they use your site.
The speed at which your page loads is one of the most important factors because if it loads slowly, users might click back before your website even appears.
People have come to expect websites to load instantaneously.
If your page takes longer than 1 second to load, you need to improve it.
It’s not good enough in this day and age for your website to take 3 seconds to load. You might not realize it, but a significant share of users are still on slow connections. So, if your site takes 2-3 seconds on fast broadband, it might take 10-15 seconds for them.
To make your site load faster there are a few things to look at:
- Make sure that you’re using browser caching to reduce demand
- Optimize the size of all your images; there’s no need for them to be more than 150 KB each
- Upgrade your hosting account to a VPS or dedicated server
- Minify your JS, CSS and HTML files
- Use lazy load plugins so that only the content above the fold loads first
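Some of these checks can be scripted. This hypothetical Python sketch inspects a response’s headers and payload size; the function name and thresholds are our own assumptions, loosely following the limits suggested in this section.

```python
def audit_page_weight(headers, body, max_page_bytes=2_000_000):
    """Flag common speed problems from a response's headers and body size.

    `headers` is a dict of HTTP response headers (e.g. from urllib);
    the ~2 MB default threshold is a rough rule of thumb, not a standard.
    """
    issues = []
    encoding = headers.get("Content-Encoding", "")
    if "gzip" not in encoding and "br" not in encoding:
        issues.append("response is not compressed (enable gzip or brotli)")
    if "Cache-Control" not in headers and "Expires" not in headers:
        issues.append("no browser caching headers set")
    if len(body) > max_page_bytes:
        issues.append(f"page is {len(body)} bytes; aim for under {max_page_bytes}")
    return issues
```

In practice you’d feed it the headers and body from a normal HTTP fetch of each key page and review whatever it flags.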
With a little bit of effort, you can probably cut your load speed in half. We know that Google takes page speed into account when deciding where you rank, so don’t give up valuable revenue by letting your pages load slowly.
A lot of users are on mobile devices, and pages that are over 2 MB in size will be painfully slow to load.
Not only should your pages be small and fast enough to load on mobile, but they should also be optimized to look good on mobile.
Google factors in mobile-optimized pages and with more users than ever using search on their mobile, this isn’t something that you can ignore.
If you’re unsure if your website is responsive to different device sizes or is optimized for mobile, check out Google’s mobile analysis tool.
If you have two different pages on a single domain that are both optimized for the same keyword, then you might have problems with Google struggling to decide which to show.
This can be a problem: if Google believes that you’re trying to cheat the algorithm, it could decide to rank neither of your pages.
In general, it’s best to avoid having multiple pages which target similar or the same keywords.
Keyword cannibalization is particularly common with local businesses that don’t have very many pages and are optimized around a few keywords.
You might have a gardener in Las Vegas.
On their homepage, they would have the title tag “Las Vegas Gardener | Gardening Co.”
However, on their subpage, they might have decided that they wanted to rank for “Las Vegas Gardener” because it was such a valuable term.
So, the subpage has the title tag “Best Las Vegas Gardener | Gardening Co.”
As you can see, they are essentially the same tags, and Google might believe that you’re trying to cheat its algorithm.
Of course, there’s nothing wrong with this company writing about gardening in Las Vegas in a whole bunch of different blog posts. The problem is when you appear to be optimizing for the same keyword in your title and H1 tags.
To avoid this, you should try and vary your keyword usage in important tags.
If you want to detect if your website has any problems with keyword cannibalization, it’s quite simple. Download a program called Screaming Frog SEO Spider. This tool scans your website and gives you a report of title tags, header tags, meta text, etc.
Input your domain and start the scan. Click the “Page Titles” tab, and you’ll be able to see all of your different title tags.
By inputting one of your main keywords into the search bar, you can filter for title tags with that keyword.
This will allow you to quickly glance through and see if any of the pages are optimized for keywords that might be too similar.
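If you’d rather script that glance-through, here’s a minimal Python sketch. It assumes you’ve exported URL and title pairs from a crawl (such as a Screaming Frog export); the function name is our own.

```python
def find_cannibalization(pages, keyword):
    """Return URLs whose title tags target the same keyword.

    `pages` is a list of (url, title) pairs, e.g. exported from a site
    crawl. Two or more hits suggest possible keyword cannibalization;
    a single hit is fine, so we return an empty list for it.
    """
    hits = [url for url, title in pages if keyword.lower() in title.lower()]
    return hits if len(hits) > 1 else []
```

Run it once per main keyword and review any pages it groups together.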
Redirects by themselves aren’t a problem, but redirecting incorrectly can cause you to lose hard-earned link juice.
There are four redirects that you want to avoid:
- 302 redirects
- Chains of redirects
- Non-preferred versions of a domain failing to 301 to your preferred version
- Non-secured versions of a domain failing to 301 to your secured version
You’ve probably heard of a 301 redirect, but you might not have heard of a 302 redirect. A 302 is very similar to a 301 redirect, but instead of being permanent it’s only temporary.
With it being temporary, it also doesn’t pass any of the link authority from the previous page. To fix this, you want to change any 302 redirects (that aren’t temporary) to 301 redirects.
To find out if you have any 302 redirects, you can scan your URL in Screaming Frog SEO Spider, go to the “Response Codes” tab and filter by “Redirection (3xx)”. Here you’ll find your 301 and 302 redirects.
A redirect chain is a problem because with each ‘step’ in the chain you lose part of the link juice. So, in the following example, you can see that Page A redirects to Page B and Page B redirects to Page C.
However, it would be more efficient if both Page A and Page B redirected directly to Page C.
If you want to find any redirect chains, go back to Screaming Frog SEO Spider, open “Configuration” and click “Spider.” Click the “Advanced” tab, check “Always Follow Redirects” and then click “OK.”
This time, when you enter your target URL, you can go to “reports” on the menu bar at the top and click on “redirect chains.” This will show you any chains on your domain that can be improved.
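The same chain detection can be sketched in Python. This assumes a `fetch` helper of your own that returns each URL’s status code and Location header without automatically following redirects; it is an illustration, not a crawler.

```python
def walk_redirects(fetch, url, max_hops=10):
    """Follow redirects hop by hop, flagging 302s and chains.

    `fetch` is any callable that returns (status_code, location_header)
    for a URL - in practice a wrapper around an HTTP HEAD request made
    with automatic redirect-following disabled.
    """
    hops, issues = [url], []
    status, location = fetch(url)
    while status in (301, 302, 307, 308) and location:
        if status == 302:
            issues.append(f"302 (temporary) redirect at {url} - use a 301 if permanent")
        url = location
        hops.append(url)
        if len(hops) > max_hops:
            issues.append("possible redirect loop")
            break
        status, location = fetch(url)
    if len(hops) > 2:
        issues.append("redirect chain: " + " -> ".join(hops))
    return hops, issues
```

Any chain it reports is a candidate for pointing the first URL straight at the final destination.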
When you created your website, you would have decided whether you wanted www.domain.com or just domain.com.
Either way is completely fine, and it won’t have any impact on your SEO. However, what will have an impact is failing to redirect your non-preferred version to your preferred.
Let’s look at an example so you can wrap your head around this.
If you prefer to have “www.” then your website might be “www.domain.com.” This is your preferred domain.
Similarly, your non-preferred domain would be “domain.com.” Of course, you want to 301 redirect the non-preferred domain to your preferred version. Otherwise, you’ll have duplicate websites, and links to the non-preferred version won’t redirect and pass their authority.
Finding out if you’ve done this only takes a few seconds. Use this tool to input your domain, and it should tell you that “everything seems to be fine.”
This is the same problem as the non-preferred version. You might have failed to create the 301 redirect and are therefore losing link authority.
If you’ve made the transition to SSL, you might have struggled with implementing your certificate and failed to redirect your HTTP website to https.
Testing for this is equally easy:
Go to your website: https://www.yourwebsite.com/
Then, remove the “s”, like this: http://www.yourwebsite.com/
If it doesn’t redirect you straight back to the “https” version of the website, then you’ve failed to set up the redirect properly.
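Both canonicalization checks (www vs. non-www, and http vs. https) can be sketched with one small Python helper. As before, `fetch` is an assumed wrapper that returns a URL’s status code and Location header without following redirects.

```python
def canonical_redirect_ok(fetch, start_url, preferred_url):
    """Check that a non-preferred URL 301s straight to the preferred one.

    `fetch` is a callable returning (status_code, location_header) for a
    URL, with automatic redirect-following disabled. Works for both the
    www/non-www check and the http/https check.
    """
    status, location = fetch(start_url)
    return status == 301 and location == preferred_url

# Example checks, assuming www + https is your preferred version:
# canonical_redirect_ok(fetch, "http://example.com/", "https://www.example.com/")
# canonical_redirect_ok(fetch, "https://example.com/", "https://www.example.com/")
```

A 302 here fails the check on purpose, since a temporary redirect is the wrong tool for a permanent canonical version.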
Indexing problems can be a real headache; after all, you can’t receive any traffic from Google if your pages aren’t in its index.
You should always be striving for a 100% indexation rate. It’s likely that once you grow past a few hundred pages a couple won’t be indexed, but anything less than a 98% indexation rate and you might have a problem.
If this is the case, the best place to start your hunt for the problem is your robots.txt file.
This file is there to tell any robots that come to your website how to behave. It could tell them to go away, or it might tell them not to store data.
Sometimes website owners might accidentally block Google or other search engines when they copy the code from somewhere.
What you want to look for in your robots.txt file is the word “disallow.” Any disallow command will tell robots not to crawl a particular page. However, if you wrote “Disallow: /” you would be preventing search engine bots from crawling your entire site.
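You can test a robots.txt file against specific URLs with Python’s standard library `urllib.robotparser`. The rules below reproduce the accidental full block described above; the `/wp-admin/` path is just a placeholder example.

```python
import urllib.robotparser

# The mistake described above: "Disallow: /" blocks the entire site.
bad = urllib.robotparser.RobotFileParser()
bad.parse(["User-agent: *", "Disallow: /"])

# A typical safe file: only a private area is blocked.
good = urllib.robotparser.RobotFileParser()
good.parse(["User-agent: *", "Disallow: /wp-admin/"])

bad.can_fetch("Googlebot", "https://example.com/services/")   # False - blocked
good.can_fetch("Googlebot", "https://example.com/services/")  # True - crawlable
good.can_fetch("Googlebot", "https://example.com/wp-admin/")  # False - blocked
```

Running your real robots.txt through a check like this takes seconds and catches the most damaging mistake an SEO audit can find.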
If your website isn’t being regularly crawled automatically by Google or other search engines, then you should make sure that you’re submitting sitemaps through the Google search console.
You can go ahead and create one at https://www.xml-sitemaps.com/.
A sitemap is essentially just a list of pages on your domain, and it helps to explain to the search crawlers how your pages interlink and gives them a checklist to make sure they don’t miss any.
Without a sitemap, you might find that occasionally a page will go un-indexed.
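If you’d rather generate the sitemap yourself than use an online tool, a minimal one is just a short XML document. This Python sketch builds one from a list of URLs using the standard library; `example.com` is a placeholder.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap string from a list of absolute URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap(["https://example.com/", "https://example.com/about/"])
```

Save the output as sitemap.xml at your site root and submit it in Google Search Console.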
Search in Google
The quickest way to discover if your website is likely being penalized or you’ve accidentally blocked the crawlers is to search your website.
However, there’s a special query that you can use for this:

site:yourwebsite.com

Obviously, replace the filler domain with your own. The first result to come up should be your own homepage.
When Google Panda was released, one of the main issues it targeted was websites using duplicate content.
Duplicate content is simply pages that are so similar that Google believes they are mostly clones. This is an issue because it suggests to Google that you are trying to cheat their system because the duplicate page provides no extra value to users.
E-commerce websites often fall into these penalties accidentally because they will copy product descriptions from page to page and they may even have two pages advertising the same product.
There are two types of duplicate content problems: in the metadata and on the page itself.
Websites with tens of thousands of pages often get lazy and decide to use a single meta description for all of their pages. Obviously, this often lands them with a duplicate content penalty.
If you do have unique pages, then you should be using unique metadata for them.
However, if your pages are identical then consolidate them into a single page. Users don’t benefit from you having different pages for the same product in different colors or sizes.
Writing unique metadata for each page on your website can be a huge time drain, but it’s necessary.
To check if you have any duplicate metadata you can use Screaming Frog SEO Spider. If you enter your URL and start your scan, you’ll be able to click on the “meta description” tab and filter to only show duplicates.
An alternative method is through the Google Search Console. However, if you’re having problems with indexing, then Google might not have crawled your pages, meaning that it won’t show you all the duplicate descriptions.
Regardless, to do this, you can go to “Search Appearance” and then “HTML improvements.” On this page, you’ll find a list of areas you can improve on your website, including duplicate meta descriptions.
Duplicate metadata is common among e-commerce websites, but duplicate content happens in all different industries.
To find out if you’ve got any duplicate pages, you can use a tool called Siteliner.
This tool will scan all of the pages on your website and tell you if any of them are duplicates.
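For a quick local spot-check of two pages you suspect, Python’s standard library can score how similar their text is. The 0.85 review threshold mentioned here is an assumption to tune for your site, not a Google figure.

```python
from difflib import SequenceMatcher

def duplicate_ratio(text_a, text_b):
    """Return a 0-1 similarity score between two pages' text content."""
    return SequenceMatcher(None, text_a, text_b).ratio()

# As a rough rule of thumb (our assumption), pages scoring above ~0.85
# are worth reviewing as possible duplicates to rewrite or consolidate.
```

Compare the extracted body text of pages, not the full HTML, or shared templates will inflate every score.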
Not all 404 errors are bad. In fact, you might purposefully put a 404 page in place to help Google.
Think of it this way: you have a page about a topic, but then you decide to delete that page without a 404.
In the short-term, Google would continue to rank that page and users would land on a page that wouldn’t load, giving them an error.
An intentional 404 page tells Google to remove the page from their index.
That’s not a problem for most pages.
The main problem is when you have backlinks pointing to a dead 404 page. Of course, those links still help to improve your overall domain authority, but they aren’t being used to their full potential.
To fix this issue, you want to 301 redirect that dead page to the most relevant page on your website, or to your homepage.
You can find 404 errors easily in your Google Search Console. Go to the “Crawl” section on the sidebar, and you’ll find a page called “crawl errors.” If you then click the “not found” tab on the graph, you’ll be able to find a list of all the 404 errors that Google has found.
Most web designers don’t understand SEO, and most SEOs don’t know how to design a beautiful website.
This does cause a problem.
Lots of websites aren’t designed to be effective for SEO. The architecture of the website is fighting against your rankings.
Of course, you should be putting the user first and making sure that the site offers the best possible user experience. However, it shouldn’t be at the expense of the robot’s experience. Both should work harmoniously together.
Take a look at your site architecture:
- Is the navigation easy to use or is it confusing?
- Do your internal links use the correct types of anchor text?
- Can you make it easier for humans and robots to move around your pages?
When you’re auditing the technical SEO of a domain, you need to pay attention to the structure of the pages and the structure of the URLs.
However, this doesn’t always mean that we recommend you make changes.
Often there are ways that you could use a more efficient URL structure, but that would involve making 301 redirects.
As good as 301 redirects are, it’s not always clear whether they transfer 100% of the trust and authority from the old URL. This means that by changing URL structure, you’re potentially risking losing your current rankings. Therefore, it’s normally best to stick with what you already have.
However, if you’re starting a new website, then there are better structures to use.
In the large majority of cases, there is no need to have multiple subfolders within the URL.
One example could be: “coolwebsite.com/cool-books/books-for-men/2017/07/sapiens.”
Instead, you could have: “coolwebsite.com/books-for-men/sapiens.”
It’s more efficient, shorter and easier for humans to remember.
Another point to consider is over-optimized URLs. You might have heard SEOs talk about the benefits of using your keyword in your URL, and this is true. However, this doesn’t mean that you should stuff the keywords in.
Using extra subfolders just to include the keyword again is a horrible idea and could end up with you being penalized for over-optimizing your page.
The great thing about internal links is that you can use exact or partial match anchor text on basically all of your internal links.
There doesn’t appear to be any penalty for doing this, and it’s probably the most natural way for you to link on your website.
If your page is about “iPhone headphones,” then the anchor text should be “iPhone headphones.” You wouldn’t use branded anchor text because people are already on your website, naked URLs look odd, and generic anchor text would seem out of place.
Unfortunately, detecting ineffective uses of internal links is very time-consuming.
You’ll need to go page by page and find the anchor text manually.
This is extremely time-consuming, but it can make a big difference in your rankings, especially if you’re using internal links from a strong page.
To avoid this in the future, you should make sure that everybody on your team knows that they should be trying to use exact or partial match anchor text.
Page Level Analysis
Once you’ve analyzed your keywords and your technical SEO, you want to do a page level analysis. The goal is to make sure that each of your pages is optimized as effectively as possible for the keywords it’s targeting.
After all the hard work that you put into creating the content and building links, it would be a shame if it didn’t rank because you hadn’t optimized the page correctly.
Similarly, it would be a waste to take the time to optimize a page that has garbage content. Everything needs to be strong; there can’t be any weak links in your SEO.
To drive the best rankings, you want content that appeals to both humans and robots. That means that it has to be SEO optimized, but it must also read well and convince humans to share it and link to it.
If you’re outsourcing work to a freelancer, or even if you’ve written it yourself, you want to run your pages through Copyscape.
This isn’t just to see if your freelancer has copied from others, it’s also to see if others have copied from you.
If they have, you can file a DMCA report with Google, and they will remove the thief’s page from their database. It’s important to get their pages removed otherwise Google might accidentally think that you were the one who stole the content.
Throughout your content, you want to make sure that you’ve optimized to rank for the keywords that the content is related to. Obviously, you do need to be careful to avoid over-optimizing; otherwise, you could attract a Google penalty.
- Title: Your main keyword should appear once in the title
- Meta: It’s okay to include your main target keyword in your meta description
- First paragraph: Using your main keyword in your first paragraph will help with relevancy
- URL: Your target keyword should have a partial match in your URL
- First ALT tag: It’s smart to include your main keyword in your first image’s ALT tag
- Last paragraph: Similarly to your first paragraph, this helps to solidify relevancy on your page
- Internal links: Make sure that you’re using exact or partial match anchor text
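A few of these checklist items can be scripted with the standard library. This simplified sketch pulls the title and first ALT text out of raw HTML; the checks are intentionally naive (exact substring matching, hyphenated keyword in the URL), so treat its output as a prompt for manual review, not a verdict.

```python
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    """Collect the <title> text and the first image's ALT text."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.first_alt = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "img" and self.first_alt is None:
            self.first_alt = dict(attrs).get("alt", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def check_on_page(html, url, keyword):
    """Run a few of the checklist items above; returns the failed checks."""
    page = PageAudit()
    page.feed(html)
    kw = keyword.lower()
    failed = []
    if kw not in page.title.lower():
        failed.append("keyword missing from title")
    if kw.replace(" ", "-") not in url.lower():
        failed.append("no partial match in URL")
    if page.first_alt is None or kw not in page.first_alt.lower():
        failed.append("keyword missing from first ALT tag")
    return failed
```

Run it over each landing page with its target keyword and fix whatever it flags.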
A quality content analysis will tell you whether or not your current content strategy is working for your business.
By the end of the analysis, you should know exactly what you’ve been doing right and where there are opportunities for you to improve.
You’ll be looking at every piece of content on your website, from keyword optimized landing pages and blog content, through to sales pages and even your about page.
It’s very tough to make up for poor content. You might be able to overcome garbage content with hundreds of links to a single page, but using great quality content would have made ranking 10x easier.
Ask a friend
It’s always tough to be critical of our own work. The only way you’re going to get an impartial view of your content is to ask a friend who’s involved in the industry to be honest about it.
It might mean that you have to keep asking them to be honest, but eventually, they’ll tell you exactly what they think and where you can improve.
It’s important to get other people’s views because other people are the ones who will hopefully become your customers.
It doesn’t matter if you like it; it’s more important that the rest of the world likes it.
Most businesses fail to implement an effective content strategy if they even attempt to have one at all. Here’s what to ask yourself and others when you’re analyzing your content:
Is it unique?
In some cases, copied content can play a role, but most of the time you need to be creating unique content. If you’re creating the same content as a competitor, there is no reason for somebody to link to your content instead of the other person’s.
Go out of your way to be creative and come up with original ideas.
There’s only so far you can get with boring content that a hundred other websites have created in the past. If you want to join the elites at the top of the industry, you need to break ground and create something original.
Is it valuable?
Every piece of content that you create should have a purpose. Depending on the theme of your website that piece of content might be informative, funny, entertaining or instructional.
Whatever the value you intend to deliver, you need to step back and ask yourself if you truly deliver that value.
If not, you need to make changes so that your readers leave your website feeling like they used their time wisely.
Again, it’s irrelevant if you or your co-workers enjoy the content and find it valuable. It needs to be valuable to the real people out there who are searching for answers to their questions.
Is it better?
Creating content that is below what your competitors have created is a complete waste of time. This might work for a few months, but eventually, you’re going to need to step your game up.
Every piece of content that you publish should be superior to everything else on the subject.
This is the only way to attract the huge and valuable links that are going to make sure that your pages rank instead of your competitors.
Is it engaging?
Some writers make the mistake of thinking that they are writing for an audience. You don’t write for an audience; you write for an individual.
Each person that reads your content should feel that you are talking directly to them.
It should feel personal, it should feel like you care and the reader should be engaged with the content.
Is it deep enough?
We made sure to say “deep” rather than “long.” It’s true that longer content does rank better, but you shouldn’t aim to write the longest articles possible.
Your aim should be to write the most in-depth articles you can that answer the questions that your readers are searching for. Every reader should come away from your content having an answer to all of their questions.
Are there problems?
Firstly, make sure that there are no grammar or spelling mistakes on your page. Google may already check content for grammar and spelling errors, and even if it doesn’t, it likely will in the future.
Secondly, you should check for broken links in your content. Not only does it look unprofessional, but it also leaks ‘link juice’ from your pages for no reason.
Is it intrusive?
In recent years Google has punished websites for having intrusive elements on their site that prevent the user from having an enjoyable user experience.
One example of this would be pop-up ads. The worse the intrusion, the greater the negative impact on your rankings.
Everything should be in moderation. There’s nothing wrong with using ads on your page, but make sure that they don’t make it difficult for the user to read the content that they came to your website for.
Are you moderating comments?
If you’re choosing to use blog comments on your website, then you need to make sure that you’re moderating them.
Spammers will always try to sneak links into blog comments. There’s nothing wrong with allowing people to use links in their comments, but those links shouldn’t point to spammy websites.
By allowing spammy links on your pages, you are associating your website with those pages.
User Experience (UX) Analysis
Part of your content analysis should have covered the user’s experience, but it’s important to do a complete UX analysis. This will allow you to find out how well your readers are interacting with your website, showing you improvements that you can make.
Every website should have Google Analytics installed. It will give you a much better understanding of how your users react to your website and how they use your pages.
Bounce rate
The first stat that you’ll want to look at is bounce rate: the percentage of visitors who leave your site after viewing only the page they landed on.
Bounce rate is going to be dependent on the type of content that you create and the audience that you are targeting. Entertaining content often has an extremely high bounce rate, whereas informational or commercial content might be considerably lower.
In general, you should be aiming to keep your bounce rate below 80%.
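The calculation itself is simple. A minimal sketch, using hypothetical session data (in practice Google Analytics reports this for you):

```python
# A "bounce" is a session where the visitor viewed only the page
# they landed on. The session data below is hypothetical.
sessions = [1, 3, 1, 2, 1, 5, 1, 1, 2, 1]  # pages viewed per session

bounces = sum(1 for pages in sessions if pages == 1)
bounce_rate = 100 * bounces / len(sessions)

print(f"Bounce rate: {bounce_rate:.0f}%")  # prints "Bounce rate: 60%"
print("Action needed" if bounce_rate > 80 else "Within the 80% guideline")
```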
Time on site
Time on site refers to the amount of time that users spend on your site before they close the window, bounce back to the results pages, or move on to another website.
Time on site is a good indicator of how engaging your content is and how much your readers enjoy your site.
However, if someone was asking “when was Angelina Jolie born” they might only need to be on your site for 10 seconds to see the birthdate and then leave.
The reason why time on site is important is that the longer someone is on your website, the greater the chance that they will become a customer.
Most websites should aim for a time on site of longer than one minute.
If yours is lower than this, look at expanding your content. If there isn’t much content for your users to consume, they can’t stay on your website for long.
Focus on creating the highest-quality content that you can, and use internal links to guide visitors to other pieces of content on your site.
If you’re giving your readers more value than your competitors, they will stay on your website for longer, and this will improve your rankings.
Just like in real life, it often takes some time to build up trust and to form a relationship with people before they are comfortable becoming a customer.
If someone is on your website for five minutes instead of 20 seconds, they’ll be much more comfortable with the idea of giving you their hard-earned cash.
As we’ve mentioned before, cash flow is king, and the only way to generate cash flow is to make sales.
You should ensure that you’ve set up goals in Google Analytics to track when readers convert or fulfill a goal. This will give you more useful data to inform your decisions about which marketing channels to prioritize.
Goal completions are the most important metric that Google Analytics can show you. It doesn’t matter if a user is on your website for 10 seconds or 10 hours; the most important thing is that they convert.
Each of your other metrics should be improved so that you can make more money.
You only make money if you convert leads and that means that your page needs to be compelling.
If you are the best SEO in the world, you might be able to attract 20 million visitors per month, but if your copywriting is atrocious, you’ll only convert a tiny fraction of them.
Your users are the most important thing; they need to feel compelled to purchase. If your UX is turning them off, you have a problem.
If your goal completions are low then you should look at:
- How easy is it for visitors to get in touch with your sales team?
- Is it clear what your business offers?
- Is there pricing information?
- Is there any text or video that convinces them to purchase?
- Are there testimonials from previous customers?
- Does the website even load correctly on their device?
Information is power.
Exit rate
You might find that a large percentage of your visitors are leaving from a single page. In this case, you’ll know that you need to fix one page rather than make changes to your entire website.
It’s not uncommon for a single page to have a huge exit rate, dragging the entire website’s exit rate up significantly.
However, you need to look at pages on a case by case basis. Once a page has served its purpose there is no reason why the user wouldn’t leave.
When you’re looking at exits in Google Analytics, ignore the volume of exits, focus on the “% Exit.” Pages with the most visitors will obviously have the higher number of exits, making this data useless.
Whereas the percentage of visitors who exit will give you a better metric to compare with other pages on the same domain.
Usually, you’ll find that most pages have an exit percentage around 60%. Although it’s best to take an average of all of your pages and then look for pages that are significantly greater than the average.
Most of the time users leave your website because they didn’t find the content valuable or the UX was poor. To lower your exit rate on those pages, you need to put yourself in the user’s shoes and ask yourself:
- Is this content valuable?
- Does this content give me all the information that I need?
- Is this content easy for me to read?
- Is the page visually appealing?
- Did it load fast enough?
- Are there intrusive elements like popups or ads?
- Is any part of the page broken?
As earlier in our audit, it might be best to ask a friend or colleague to look at the page critically and try to spot things that would deter users.
Returning visitors
The percentage of users that return to your website will give you a good indication of how engaging your content is.
However, this isn’t always the case. In some niches, you might desire to have a 0% return rate. For example; if someone has a problem with their boiler, you want to solve that problem. You don’t want them to have to return for more advice.
Of course, if you deal with a much wider industry like home repair in general, you want a higher return rate because it suggests that users value your content over your competitors.
As with any metric, it’s not good enough to just have a nice number; you need to understand what that means about your website.
Branded searches
If people are searching for the name of your brand or website, you’ll know that you’ve established brand recognition and trust with your users.
There are some sites that people exclusively use for particular types of content because they trust the website and know that they are going to get high-quality content.
For example, you probably use the same site for your news each morning. You don’t swap to another website on a whim because you’ve already developed trust with your current news network.
To see the number of people searching for your brand, you can go into your Google Search Console and look at “search analytics” under “search traffic.” This will show you the top queries that people used to find your website.
Link Analysis
Google has admitted that links are one of the two most important ranking factors in their algorithm. Failing to maximize the effect of your existing link profile can cost you valuable ranking positions, and therefore cash at the end of the year.
When you’re conducting a link analysis, you’ll want to use a few paid tools. Google Search Console is great for seeing your links, but it’s not as useable and detailed as a tool like Ahrefs.
Even today, many SEOs aren’t giving relevancy the credit that it deserves.
The relevancy of your links makes a huge impact on how much weight they carry.
Your goal should be to make sure that the large majority of your backlinks are relevant to the pages that they are directed to. It doesn’t need to be 100%, that’s not realistic, but you should aim for above 90%.
If you want to look at the relevancy of your link profile quickly, you can use Majestic to look at the Topical Trust Flow Topics.
This tool will show you the topic that is associated with the backlinks in your profile. Scan through the links and mark those that are irrelevant. There is no need to remove them, but it’s good to keep track of them so you can remove them quickly if you are penalized or lose rankings.
The best metric for tracking the power of an individual link is Ahrefs’ Domain Rating (DR). DR represents the strength of a domain and therefore the strength of a link from that domain.
DR is arguably more accurate than Moz’s DA or PA because Ahrefs has a much larger and fresher link database.
To look at the authority of the links that you’ve acquired, go to Ahrefs, enter your domain in Site Explorer, and look at your referring domains.
Part of your audit should be looking at how natural your backlink profile is. You’re aiming for a natural variety of link types (contextual, comments, resource pages, forums, etc.) and a natural do-follow/no-follow ratio.
If you look at a large site, you’ll often find that they don’t have more than 60% do-follow links.
If all of your links are pointing to your homepage, you might have a problem.
Most websites will have a huge percentage of links to their homepage, but they’ll also have deeper pages that have hundreds of referring domains.
A natural split would be 50% – 70% of links pointing to a homepage, with 30% – 50% referring to deep pages.
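Both ratios are quick to check once you have a link export. A minimal sketch against the rough benchmarks above, using a hypothetical link list (a real one would come from an Ahrefs or Search Console export):

```python
# Sanity-check a backlink profile: no more than ~60% do-follow links,
# and 50-70% of links pointing to the homepage.
# The link list below is hypothetical.
links = [
    {"target": "/",           "dofollow": True},
    {"target": "/",           "dofollow": False},
    {"target": "/blog/guide", "dofollow": True},
    {"target": "/",           "dofollow": True},
    {"target": "/services",   "dofollow": False},
]

total = len(links)
dofollow_pct = 100 * sum(l["dofollow"] for l in links) / total
homepage_pct = 100 * sum(l["target"] == "/" for l in links) / total

print(f"Do-follow: {dofollow_pct:.0f}%  Homepage: {homepage_pct:.0f}%")
if dofollow_pct > 60:
    print("Do-follow share looks high for a natural profile")
if not 50 <= homepage_pct <= 70:
    print("Homepage/deep-page split is outside the 50-70% range")
```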
Google Penguin altered the algorithm so that any website that abuses exact- or partial-match anchor text is punished.
It’s very important that you keep track of your anchor text usage so that you can keep your website as far away from a penalty as possible.
Here’s a rough guideline for anchor text percentages:
- Branded anchor text: 50%
- Naked (URL) anchor text: 20%
- Generic anchor text: 15%
- Page title anchor text: 10%
- Partial match anchor text: 4%
- Exact match anchor text: 1%
Of course, this will vary depending on the topic of your website and isn’t an issue for new websites with small link profiles.
However, as you grow you want to avoid optimizing your anchor texts too much for keywords.
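One way to keep track is to tally your anchors against the guideline above. A minimal sketch with hypothetical counts (real ones would come from your backlink tool’s anchors report); the 2x-over-guideline flag is an arbitrary threshold, not an official rule:

```python
# Compare an anchor-text breakdown against the rough guideline above.
# The counts below are hypothetical.
guideline = {"branded": 50, "naked": 20, "generic": 15,
             "title": 10, "partial": 4, "exact": 1}
counts = {"branded": 120, "naked": 30, "generic": 25,
          "title": 15, "partial": 6, "exact": 5}

total = sum(counts.values())
for anchor, target in guideline.items():
    actual = 100 * counts[anchor] / total
    flag = "  <-- over-optimized?" if actual > 2 * target else ""
    print(f"{anchor:>8}: {actual:4.1f}% (guide {target}%){flag}")
```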
Total referring domains
Total referring domains is the metric which has one of the closest correlations to high rankings.
If your website doesn’t have many referring domains, then you’ll struggle to compete for any keywords that don’t have remarkably little competition.
The best way to figure out how well you’re doing is to compare your total referring domains to competitors in your industry.
Of course, the strength of those referring domains will also factor in, but quantity is also very important.
Citation Analysis
A citation is simply an online reference to your business, letting any interested customers know where you’re located, your phone number and your website URL.
There are thousands of different citation sites, and you should make sure that you’re included in those that are relevant to your business. These could include industry-specific directories, professional directories, and local directories.
When you’re conducting a citation analysis, you’re aiming to make sure that you have NAP-W consistency.
NAP-W stands for name, address, phone, and website. Google looks at all the different citations that you have and thinks it’s odd if you have different NAP-W information.
To track this information and make sure that it is consistent across different websites you should use one of these tools:
- Moz Local
- Bright Local
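The consistency check itself is mechanical. A minimal sketch comparing hypothetical citation records field by field (the business details and directory names below are invented for illustration):

```python
# Check NAP-W (name, address, phone, website) consistency across citations.
# All records below are hypothetical.
citations = {
    "yelp.com":     ("Acme Boilers", "1 High St", "555-0101", "acme.example"),
    "yell.com":     ("Acme Boilers", "1 High St", "555-0101", "acme.example"),
    "localdir.com": ("Acme Boilers Ltd", "1 High Street", "555-0101", "acme.example"),
}

canonical = citations["yelp.com"]  # treat one record as the reference
fields = ("name", "address", "phone", "website")

for site, napw in citations.items():
    mismatches = [f for f, a, b in zip(fields, canonical, napw) if a != b]
    if mismatches:
        print(f"{site}: inconsistent {', '.join(mismatches)}")
```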
If you don’t have any citations, then that could be one reason why weak rankings plague your local business. Go out and find five to ten business directories that are relevant to your website. Make sure that you submit the same NAP-W information.
Conducting an SEO audit doesn’t need to take weeks, but it does need to be thorough.
SEO is one of those marketing channels where it’s not good enough to be 95% of the way there; you need to aim for perfection.
Small issues can plague your site and cause you to lose thousands of dollars in lost customers.
Take however long you need to complete a quality audit and to ensure that you’re getting the best rankings that you can.