How to Do an SEO Audit in 8 Steps: The Ultimate 2022 Checklist
As an SEO specialist, your priority is to put your website (or your client's) at the top of organic search results for your target keywords. But to achieve this, you first need to know how your site is currently performing.
This is where the SEO site audit process comes in.
It lets you break your website down into its core SEO components and identify the ones that need fixing. Done correctly, the audit should move your website up in the SERPs (search engine results pages).
The biggest hurdle is organizing your audit process so that you analyze the most important ranking factors first and generate SEO results faster.
This post is an SEO audit checklist that breaks down four critical SEO areas in detail. We will then look at how to audit your website in these areas and fix the problems you find there.
Technical SEO:
Simply put, technical SEO is all about making your website easier for search bots to access.
Technical SEO has many factors that you can work on. Here are the most important things you should look for when performing a technical SEO audit:
1. Crawlability and indexability:
The goal of technical SEO is to optimize your site for crawling and indexing. However, just because search bots can crawl your website doesn't mean they will index it in search results. Nor do you want them to index every page they find.
For example, you may have pages on your site, such as the Terms and Conditions or Privacy Policy, that you do not want Google or other search engines to display in search results.
Besides adding no value for search engine users, these pages also waste your crawl budget: the number of pages Google can crawl on your site in a given period. If the budget runs out while your site still has pages left to crawl, the search engine only returns to them once the budget is replenished.
In that case, search bots may not reach all of your pages, especially the most important ones, and new pages wait longer to be crawled and indexed.
To avoid this, search engines need to crawl your most important pages first. Here are some steps you can take as part of your audit process to improve your website's technical SEO performance:
Create an XML sitemap:
A sitemap is an Extensible Markup Language (XML) file listing all the important pages of your site. With a sitemap, search bots don't need to hunt all over your site for pages to crawl and index; a quick look at the sitemap gives them the full list of pages to process and index in the SERPs.
Most content management systems generate a sitemap that you can submit to Google Search Console or Bing Webmaster Tools. Once it is submitted, search bots will do their job by crawling your site for pages to index.
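If you ever need to build one by hand, here is a minimal sitemap.xml sketch following the sitemaps.org protocol; the URLs and dates below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per important page; yourdomain.com is a placeholder -->
  <url>
    <loc>https://yourdomain.com/</loc>
    <lastmod>2022-06-01</lastmod>
  </url>
  <url>
    <loc>https://yourdomain.com/blog/seo-audit-checklist/</loc>
    <lastmod>2022-05-15</lastmod>
  </url>
</urlset>
```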
Identify and eliminate duplicate content:
Duplicate content consists of pages on your site that share the same content but have separate URLs. It bloats the search engine's index of your site and drains your crawl budget, preventing your site from being crawled properly.
Here are the causes of duplicate content creation:
- Pages do not have canonical URLs. This results in the creation of URL variants. They confuse search spiders as to which versions of the same URL and content to crawl and index.
- For WordPress site owners: creating many tags and categories generates archive pages that duplicate your content.
- The site offers WWW and non-WWW versions to search robots.
- The site offers HTTP and HTTPS versions to search robots.
To find out if your site contains duplicate content, you can use Google's search operators. If your site offers non-WWW URLs, Google the term below:
site:yourdomain.com inurl:www
The search operator will determine if Google has indexed your site's WWW URLs. If the search results return a lot of pages, you have a duplicate content problem.
Each duplicate content issue calls for a different solution. For WWW URLs, redirect them to their non-WWW versions by configuring your .htaccess file. Insert these lines into the file:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.yourdomain\.com [NC]
RewriteRule ^(.*)$ http://yourdomain.com/$1 [L,R=301]
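For the HTTP vs. HTTPS duplicates mentioned above, the fix is similar. Here is a sketch of the standard .htaccess rule, assuming your server already has an SSL certificate installed:

```apache
# Send every HTTP request to its HTTPS equivalent
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```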
Create and configure your robots.txt:
This file on your site tells search bots which files and folders to crawl and/or index. Robots.txt helps manage your crawl budget by preventing bots from crawling unnecessary pages and directories on your site. As a result, bots can prioritize reading and indexing the most important pages on your site.
For example, you can configure your robots.txt file to prevent search bots from crawling your site's tags and categories. If your site doesn't already have a robots.txt file, create one and include the following:
User-Agent: *
Disallow: /tag/
Disallow: /category/
You can also "prohibit" bots from crawling your site's pages using bots.txt. You can also use a WordPress plugin that allows you to determine which pages search robots can not only crawl but also index. This gives you more control over the pages you want to appear on search engines.
2. Increase site speed:
The faster your site loads, the better the experience you offer users.
This principle is why site speed has been a ranking factor since 2018.
More recently, Google rolled out the Page Experience Update that prioritizes websites that create an optimal browsing experience for users. The search engine determines this using the Core Web Vitals score of your site's pages.
Core Web Vitals takes into account the following metrics beyond raw site speed:
First Input Delay (FID) – measures how long the browser takes to respond to a visitor's first interaction once the page loads.
Cumulative Layout Shift (CLS) – measures unexpected shifts of page elements (e.g. fonts, images, videos) during loading.
Largest Contentful Paint (LCP) – measures how long the largest content element on the page takes to render.
You can calculate the Core Web Vitals score for each URL on your site using Google PageSpeed Insights.
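You can also pull these scores programmatically. Below is a minimal Python sketch (not an official example) that queries the PageSpeed Insights v5 API for a page's real-user Core Web Vitals; yourdomain.com is a placeholder, and an API key is optional for occasional use:

```python
# Minimal sketch: query the PageSpeed Insights v5 API for a page's
# real-user Core Web Vitals. "yourdomain.com" is a placeholder.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def core_web_vitals(url, strategy="mobile"):
    resp = requests.get(API, params={"url": url, "strategy": strategy}, timeout=60)
    resp.raise_for_status()
    # Field data (from real Chrome users) can be absent for low-traffic pages.
    metrics = resp.json().get("loadingExperience", {}).get("metrics", {})
    return {name: m.get("percentile") for name, m in metrics.items()}

print(core_web_vitals("https://yourdomain.com/"))
# e.g. {'CUMULATIVE_LAYOUT_SHIFT_SCORE': 5, 'FIRST_INPUT_DELAY_MS': 3,
#       'LARGEST_CONTENTFUL_PAINT_MS': 2400, ...}
```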
3. Organize the structure of the site:
Site structure refers to the organization of your site's pages into silos and groups.
You can improve the structure of your site by grouping pages that share the same topic together to form a silo. Under the silos, you can create subsilos and extend the topic even further if necessary.
The goal is to make it easier to navigate the pages of your site and allow users to find the information they are looking for. Because pages on similar topics are grouped together, users are only one or two clicks away from the page they need to visit.
At the same time, you allow crawlers to better understand the relationships between the pages on your site. This can help increase your site's relevance to a topic, which influences how your pages rank in the SERPs.
The approach to site structure depends on the type of site you manage. For blogs and publishing sites, the process of structuring the site is simpler. Simply categorize the articles and link the most relevant pages together.
Things get trickier if you run an ecommerce site with lots of products and categories. Grouping related products and topics shouldn't be a problem; it's keeping the site architecture flat that should concern you.
Ideally, every page on the site should be four clicks or fewer from the home page (the sketch below shows one way to check). Besides making it easier for search bots to crawl your site, this distributes link authority more evenly across your pages.
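Here is a rough Python sketch, assuming the `requests` and `beautifulsoup4` packages are installed, that crawls out from the homepage and flags pages deeper than four clicks. It is illustrative rather than a production crawler, and yourdomain.com is a placeholder:

```python
# Rough sketch: measure click depth from the homepage via breadth-first crawl.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def click_depths(home, max_pages=200):
    domain = urlparse(home).netloc
    depths = {home: 0}
    queue = deque([home])
    while queue and len(depths) < max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            # Stay on the same domain and visit each page once.
            if urlparse(link).netloc == domain and link not in depths:
                depths[link] = depths[url] + 1
                queue.append(link)
    return depths

for page, depth in sorted(click_depths("https://yourdomain.com/").items(),
                          key=lambda kv: kv[1]):
    if depth > 4:  # flag pages deeper than the four-click guideline above
        print(depth, page)
```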
4. Pay attention to other advanced technical SEO factors:
Below are advanced technical SEO factors that are not as crucial as those mentioned above. Nevertheless, they can help move the needle and improve your site's SEO performance over time.
Structured data (schema markup) – provides search engines with information about the content of your pages. Search engines then display this information in search results to help increase click-through rate (CTR). Use Google's Structured Data Markup Helper to create schema markup for recipes, products, events, and more.
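As an illustration, here is a minimal JSON-LD sketch of Product markup; every value below is a placeholder, and a real page would fill in genuine product details:

```html
<!-- Minimal Product schema sketch; all values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "description": "A placeholder product used to illustrate schema markup.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```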
Mobile SEO – Google rolled out the mobile-first indexing update in 2018, so it makes sense to build your website with mobile in mind. Use the Google Mobile-Friendly test to see if your site's pages are optimized for mobile viewing.
Server log file analysis – examining the file that records every HTTP request made to your web server.
This analysis gives you a clearer picture of your technical SEO efforts, including crawl budget use, accessibility issues, page crawl frequency, and more. Web servers such as Apache (Linux) and IIS (Windows) write these log files automatically; you can then inspect them by hand or with a log analyzer.
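To make this concrete, here is a minimal Python sketch that counts Googlebot hits per URL in an Apache combined-format access log. The log path is a placeholder, and matching on the user-agent string is a simplification; a thorough audit would also verify Googlebot via reverse DNS:

```python
# Count Googlebot hits per URL in an Apache combined-format access log.
import re
from collections import Counter

LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[\d.]+" .* "(?P<agent>[^"]*)"$')

hits = Counter()
with open("/var/log/apache2/access.log") as log:  # placeholder path
    for line in log:
        m = LINE.search(line)
        if m and "Googlebot" in m.group("agent"):
            hits[m.group("path")] += 1

for path, count in hits.most_common(20):
    print(count, path)
```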
On-Page SEO:
Once you've mastered technical SEO, it's time to move on to on-page SEO. Here, the goal is to optimize each page for its target keyword by including that keyword in the page elements that carry the most weight as ranking factors.
Below are the elements where it is essential to include your keyword:
5. Include the keyword in the page title:
The page title tells users and search engines what the page is about. It is included in the title tag and is the first thing users see in search results about a page.
Therefore, it's not just about including the page keyword here. You should also write a title that users will want to click on when they see your page in the SERPs.
Another factor to consider for your page title is its length. If the title is too long, Google will truncate it and users won't be able to see it in full, which could lead to a drop in click-through rate.
To make sure the title is short enough to appear in full in Google's SERPs, use Mangools' Google SERP simulator.
It tells you if the titles and meta descriptions of your pages are too long so you can shorten them as you see fit.
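If you'd rather script the check, here is a small Python sketch (again assuming `requests` and `beautifulsoup4`). The 60- and 155-character limits are rough approximations of Google's pixel-based cutoffs, which is why a SERP simulator remains the more precise tool:

```python
# Flag overlong titles and meta descriptions on a page.
import requests
from bs4 import BeautifulSoup

def check_lengths(url, title_max=60, desc_max=155):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    tag = soup.find("meta", attrs={"name": "description"})
    desc = tag["content"].strip() if tag and tag.get("content") else ""
    if len(title) > title_max:
        print(f"Title is {len(title)} chars (aim for <= {title_max}): {title}")
    if len(desc) > desc_max:
        print(f"Description is {len(desc)} chars (aim for <= {desc_max})")

check_lengths("https://yourdomain.com/")  # placeholder URL
```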
6. Include the keyword in the headings:
You should also mention the keyword in the page's headings, especially the H1 tag, which is an important ranking factor.
But, unlike the title tag, the H1 tag only appears on the page after clicking on it in search results.
It is possible to have the same H1 and title tags, but it is better to create different versions for each of them. The purpose of a title tag is to encourage people to click on the page in the SERPs, while the H1 tag describes what users can expect when reading the content.
7. Include the keyword in the URL:
The URL completes the "three kings of on-page SEO", the page title and H1 tag being the first two.
By mentioning the keyword in the URL, you help search bots understand what the page is about. Pages that include their target keyword in all three of these elements tend to rank higher in organic search.
8. Create internal links:
Internal links are links from pages on your site that point to other pages on your site. These links help you build the structure of your site, especially your content silos.
In addition to the XML sitemap, internal links help search bots know if you have new pages to crawl and index on your site. So, if you have a recently published blog post on your site, linking to it from other related pages would help the article appear faster in Google search.
Therefore, when checking internal links, it is important to link to pages that share the same topic. Avoid linking unrelated pages, as this would make no sense to readers and search bots.
While auditing internal links, you should also spot pages that receive none. Known as orphan pages, they rank poorly or not at all on Google. From there, find related pages that can link to these orphan pages.
Finally, internal links can break when you update the URLs of old articles. Identify internal links pointing to URLs that no longer exist and update them with the correct ones.
Link Whisper is a WordPress plugin that allows you to find broken links and orphaned pages on your site that you can fix to improve your on-page SEO.
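Outside WordPress, you can run a similar check yourself. Here is a minimal Python sketch (once more assuming `requests` and `beautifulsoup4`) that scans one page for internal links returning error statuses; yourdomain.com is a placeholder:

```python
# Report internal links on a page that return 4xx/5xx status codes.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def broken_internal_links(page):
    domain = urlparse(page).netloc
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    broken = []
    for a in soup.find_all("a", href=True):
        link = urljoin(page, a["href"]).split("#")[0]
        if urlparse(link).netloc != domain:
            continue  # only audit internal links here
        try:
            # Some servers reject HEAD; fall back to GET if needed.
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None
        if status is None or status >= 400:
            broken.append((link, status))
    return broken

print(broken_internal_links("https://yourdomain.com/blog/"))
```

Pair this with the click-depth crawler above to sweep an entire site rather than a single page.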