Do a manual review of the website

It is a great practice to start your audit by manually reviewing the website and simply taking notes of what draws your attention. This is an awesome pre-audit task that will help you take a different look at the data provided by the tools. Your initial analysis at this stage will let you easily connect the dots later once you have gathered the data from SEO tools.

Check the website using Semrush or other SEO tool

The next step after manually reviewing the website and before getting into the details is to analyze the website using one of the most popular SEO tools, such as Semrush or Ahrefs. Run the website analysis and pay attention to things such as domain metrics, backlinks, overall visibility, organic traffic, organic keywords, and traffic trends to get a general “feel” for the website. This is an essential step!

Check the backlink profile

Even though this is a technical SEO audit (not a link audit), you should still do at least a general backlink profile analysis. If the website is involved in low-quality link building, then you – as an auditor – must know about it because this can have a negative influence on the overall performance and visibility of the website in search. You can do a general link audit using Semrush which will group links and indicate potentially low-quality links. Remember that this is only a tool and your human judgment is essential here.

Check the CMS of the website

The chances are the website you are auditing uses a CMS (Content Management System). And there is a good chance it is WordPress (which currently powers almost 40% of websites). 

However, you – as an auditor – must know what CMS the website uses so that you can provide relevant recommendations on how to fix, for example, some of the CMS-specific issues. 

Check the hosting provider of the website

It’s also worth knowing the hosting provider of the website. With this knowledge, you can give detailed and relevant recommendations on things, such as how to adjust specific server settings or how to add an SSL certificate. 

Assuming that you already have some experience, you will instantly know if a given hosting provider is a good fit for this website. 

Check if the website is on a shared hosting plan

The opinions on whether shared hosting can or cannot impact a website’s ranking differ a lot. Either way, it’s always good to know if the website is using shared hosting and what other websites are on the same IP. To learn that, you need to perform a reverse IP domain check.

You can do a reverse domain IP check using You Get Signal or any other similar tool.  
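
If you prefer a quick command-line check, you can first resolve the domain to its IP address with a few lines of Python and then paste that IP into the reverse IP tool of your choice. This is just a minimal sketch using the standard library; example.com is a placeholder domain:

```python
import socket

# Placeholder domain used for illustration
domain = "example.com"

# Resolve the domain to its IP address. This is the IP you would paste
# into a reverse IP lookup tool (such as You Get Signal) to see which
# other websites are hosted on the same server.
ip_address = socket.gethostbyname(domain)
print(f"{domain} resolves to {ip_address}")
```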

Check the domain history

You cannot get the whole picture of the website without knowing at least a bit about its history. Make sure to check the age of the domain, its registrant history, and who its current registrant is.

A simple tool like WhoISrequest will let you check the domain history. 

Check the Wayback Machine

In addition to the domain history, it’s also great to know how the website looked in the past and what type of content it had. This knowledge will let you determine if there was a redesign or some other important change on the website. 

You can use the Wayback Machine to see how the website used to look in the past.  

Check if the site has undergone a major change or redesign recently

It is good to know if there have been some major changes on the website, especially if you are auditing it to find the reasons why it lost traffic. To check if there were some redesign changes, you can also use the Wayback Machine.

Now let’s dive into the data from Google tools

Google tools provide tons of useful data about technical SEO aspects of a website. Any good and comprehensive technical SEO audit should start by looking at the data from those tools! 

Google Search Console (GSC) is the bread and butter of any technical SEO analysis. If you have access to the GSC data of the website, then start by taking a brief look at the elements discussed below. 

If you love Google Search Console as much as I do, then don’t miss my guide on how to audit a site with Google Search Console (only).

Check if the website has a GSC account set up.

It is hard to believe, but I still come across websites that have never even set up a GSC account! If the website you are analyzing is one of these rare cases, then you can skip all of the points from this section. 

Instead… set up a Google Search Console account for the website. If you don’t have the power to do that, make it a priority that the client does that. 

Check the Performance report

Open the Performance report to get an overview of the website’s performance in search. Check metrics, such as total clicks, total impressions, average position, and average CTR. 

Set the date range to the last 12 months. Take notice of any traffic trends.  

Check the Coverage report

The Coverage report will show you:

  • what web pages are indexed and can appear in Google (Valid and Valid with warnings),
  • and what web pages are excluded and why (Error and Excluded). 

From a technical SEO standpoint, this is a very important report.  

Check XML sitemaps

Now go to Index > Sitemaps to check if an XML sitemap or a sitemap index has been submitted to Google. If no sitemaps are submitted, then you can move to the next point. You will take a deeper look at this element later on. 

If under Status you see Success, then a sitemap has been processed successfully. You can also click on a specific sitemap to see its details.  

If you are not very sitemap-savvy, make sure to read the sitemap guide from Google.  

If you are not sure if the website audited has an XML sitemap, learn how to find the sitemap of a website.

Check Removals

You also want to know if someone intentionally or unintentionally requested the removal of the content of the website. Simply navigate to Index > Removals and see what’s there. 

Check Enhancements

The next step is to check if there are any enhancements to be made. Simply navigate to Enhancements and take a look at each element listed there. 

These are important SEO elements, such as Core Web Vitals, Mobile Usability, Breadcrumbs, FAQ, and more (depending on the type of structured data used on the website).  

I would take a detailed look at Core Web Vitals and Mobile Usability. These are all very important reports which you should understand well.

Check if the website has a manual action

Even though manual penalties are less common now, they still happen! You need to know if there is or was a manual penalty for the website. Simply go to Security & Manual Actions > Manual actions. Ideally, you should see no issues. 

If the website has a manual action, it should become a priority to clean it and submit a reconsideration request. Google explains manual actions in a very straightforward and comprehensive way in their article. 

Check if the website has security issues

Manual actions are bad and so are security issues. To check if the website has this problem, go to Security & Manual Actions > Security issues. In an ideal SEO world, you will not see anything there. 

Check Links

Last but not least come links. This is not a link audit but you should have a general overview of the backlinks. Simply go to Links and check what’s there. You may also compare these data with the data from other tools like Ahrefs or Semrush. 

Your task now is to navigate to the Links report. You will see tables with data on external and internal links.

Here is what to do:

  • Check what the Top linked pages tab says and if there is one or a few specific URLs that have some really huge numbers of links in comparison to other web pages of the website.
  • Overlay the above information with what you see in the Top linking sites tab. Do the majority of links come from one or a few sites only?
  • Analyze the Top linking text tab to make sure that exact-match keyword anchor texts are not overused. If they are, this should be a red flag for you. Click MORE to learn more details.

Check Crawl Stats report

The Google crawl stats report lets you take a deeper look into how Google is crawling the website.

Navigate to Settings and then under Crawling click on OPEN REPORT next to Crawl stats.

The Crawl Stats report will let you take a quick look at the total number of crawl requests, total download size, and average response time. In addition, in the Hosts section, you will see information about the health of your hosts (whether there have been issues with the robots.txt fetch, DNS resolution, or server connectivity).

Make sure to check this section to quickly spot any crawling issues on the website.

Check the disavow file

When auditing the website, it is also essential that you check if the disavow file was submitted and (possibly) if it indeed has the links it should have.

Checking whether the disavow file has been submitted is essential especially if you are auditing a website that has lost or has been losing its organic visibility.

You cannot access the disavow tool from Google Search Console because it is an advanced tool that can do a lot of harm if used incorrectly. You can only access it via its direct link (search for it on Google).

Your task here is to check if the disavow file has been submitted and if possible take a look at its content to make sure it has been used correctly and in line with its purpose.

Check the primary crawler

All websites will be switched to mobile-first indexing in March 2021. Before that happens, always check the primary crawler of the website.

To check the primary crawler of the website, log in to Google Search Console and navigate to Coverage. You will see the information about the primary crawler at the top of the page.

The majority of websites have already been switched to mobile-first indexing. What if the website audited hasn’t been moved yet and its primary crawler is still Desktop?

  • It probably is a bit obsolete and has serious loading and display issues on mobile devices.
  • It may not be mobile-friendly. If that’s the case, then making it mobile-friendly should be the top priority.
  • It still has the m. version for mobile devices.
  • It may be a new website whose domain had been registered before. This is the only OK scenario for the website still using the desktop crawler (this is the case with my website).

Next comes Google Analytics. This is a very general overview of the most important data from Google Analytics. Check my Google Analytics 4 basic SEO guide.

Check if the site has a Google Analytics account

If your answer is NO, then you can skip the rest of the questions from this section. Instead, your task is to set up a GA account for the website. 

Check if there are any visible trends in the data from the last 12-18 months

To check that, go to Audience > Overview and then set the date range to at least the last 12 months.

Check how the site acquires traffic

To check that, go to Acquisition > Overview. Organic search should bring the most traffic. But this is, of course, also case-specific.

If you are new to GA, check how to find organic traffic in Google Analytics.

Check if traffic trends look similar in both Bing and Google

Comparing the organic traffic trajectory in Google with the traffic in Bing may be the key to understanding the causes of drops in traffic. To check and compare the organic traffic from Google and Bing, navigate to Acquisition > All Traffic > Source/Medium. Compare google / organic with bing / organic.

Here are the 2 scenarios:

  • If the traffic trajectory is similar both in Google and in Bing, then it may mean that there are some technical issues with the website like some URLs not resolving.
  • If the traffic drop is visible only in Google, then the website may be suffering from a Google penalty.

Check the bounce rate of the website

To check that, go to Behavior > Overview and you will see Bounce Rate there. 

Check the average time spent on the pages of the site

To check that, go to Behavior > Overview and you will see Avg. Time on Page. 

Check where the majority of the audience comes from

To check that, go to Audience > Geo > Location

Check what language the majority of the audience uses

To check that, go to Audience > Geo > Language

Check the most often visited web pages of the website

To check the most visited web pages, go to Behavior > Site Content > All Pages

Check what types of devices the majority of users of the site use

Is it mobile or desktop? To check that, go to Audience > Mobile > Overview

In most cases, websites get the majority of traffic from Google and a small percentage from Bing. But there are other search engines, such as Yandex or Baidu, and some websites indeed get a lot of traffic from them.

These cases are quite rare but they happen! That’s why we need to check if a website uses or should use the webmaster tools of other search engines. 

Check if the website has a Bing Webmaster Tools account set up. Set it up if needed.

It’s good to verify the website with Bing because there are a few very nice tools within Bing Webmaster Tools, such as Site Scan, Robots.txt Tester, and Site Explorer.

I strongly recommend taking a look at those tools! 

Check if the website has and should have a Yandex Webmaster account. Create one if needed.

It is for you to decide if the website needs a Yandex Webmaster account. Set up a Yandex Webmaster account if necessary. And check my list of Yandex search operators if you use this search engine.

Check if the website has and should have a Baidu Webmaster Tools account. Create one if needed.

This is mainly for Chinese websites. Set up Baidu Webmaster Tools if applicable. 

Visibility in popular SEO tools

In addition to checking what Google tools have to say about the website, it’s crucial to analyze the visibility of the website using an SEO tool like Semrush or Ahrefs.

Check the visibility of the site in Semrush

Simply check the domain overview to get a general idea of how the website is performing.

The things to look at include Authority Score, Organic Search Traffic, Traffic Trend, Keywords Trend, SERP Features, Top Organic Keywords, and Organic Position Distribution.

This will give you a pretty good idea of how things are.

Check the visibility of the site in Ahrefs

Simply type the domain in Site Explorer in Ahrefs and hit enter.

The things to look at include UR, DR, Organic keywords, Organic traffic, and Organic positions. You may also want to check Top pages.

Any in-depth technical SEO audit would be incomplete without briefly analyzing the backlink profile of the website.

Check the backlink profile of the website

The quickest and easiest way to analyze the backlink profile of the website is to use the Semrush Backlink Analytics and Backlink Audit tools.

Mobile-Friendly Test

The next vital step in your technical SEO analysis is to run the Mobile-Friendly Test. This is a quick way to check how Google sees and renders a web page, if there are any loading issues, or if it is mobile-friendly. 

Once you run the test, you will be able to see the rendered page and HTML code. 

Check if the website is mobile-friendly

Run the Mobile-Friendly Test to make sure the website is indeed mobile-friendly… in the eyes of Google. 

Check if there are any loading issues.

Next, click VIEW DETAILS under Page loading issues to check the details of page loading issues (if there are any). 

 

Check the rendered screenshot and its HTML code

Compare what you see in the rendered screenshot with what you see in the browser. Check the HTML code of the rendered screenshot and make sure that the most important links (like navigation links) and content are indeed there.  

There is also a nice JavaScript rendering check tool that will check any URL for differences between the original source and the rendered HTML.

Google PageSpeed Insights

Google PageSpeed Insights is an awesome tool that both examines the speed of the website and gives actionable tips on how to improve its speed and performance. 

Note that Google PageSpeed Insights provides information on a per-page basis. This is not the score for the entire website but for the specific URL (web page) you test. 

If the website uses WordPress, then the WP Rocket plugin will probably solve many of the performance and speed problems. Check my review of WP Rocket to learn more.

Analyze the website with Google PageSpeed Insights

Check the scores for both the mobile and desktop versions of the website. If the score is below 80/100, you should take a closer look at the issues indicated by the tool. Anything below 50 (RED) requires your immediate attention, examination, and action. 

Check if the website passes Core Web Vitals

Core Web Vitals has become an official ranking factor.

With that in mind, you should pay special attention to Core Web Vitals and make it a priority that the website passes this assessment. This is a great investment for the future! 

If the website does not pass Core Web Vitals, fixing this in the near future should be a priority.

⚡ Check my in-depth guide to Core Web Vitals. And my guides to Largest Contentful Paint, First Input Delay, and Cumulative Layout Shift.

Check if the tool indicates that images are not optimized

Uncompressed and unoptimized images that slow down the website are usually the easiest and quickest issues to fix. 

If there are such images on the web page, the tool will indicate them along with potential savings. If possible, make it a priority to optimize all the images the tool indicates. 

Check if the tool indicates that JS, HTML, and CSS code is not optimized

These optimizations are also usually quite easy to implement, especially if the website is using WordPress. 

You will find my recommendations on WordPress plugins at the end of this guide. 

Check if there are any other important opportunities or diagnostics indicated by the tool

The suggestions given under Opportunities and Diagnostics do not affect the performance score of the website. However, they can help the website load faster. Sometimes a lot!

In most cases, the tool gives you all the information and suggestions you need to make a given web page faster and improve its performance score. 

Check if the website is compliant with the Google quality guidelines

Make sure the website is free of any glaring errors, such as keyword stuffing, doorway pages, sneaky redirects, or other obvious violations of the Google quality guidelines.

Many of those are obsolete black hat techniques but there are still websites that use them. 

By now you should have a pretty clear picture of the website, so you can move on to performing its in-depth technical SEO analysis. 

❗ To perform most of the below tasks you will need a site crawler. I mainly use Screaming Frog and Semrush (I love their crawler and SEO tools) but you can complete these tasks with any other decent crawling tools as well. Most of the screenshots come from these two tools.

Indexing, Crawling & Rendering

Let’s now get into details of how Google is indexing, crawling and rendering your website.

⚡ Make sure to check my guide to the crawl budget optimization.

Check how many web pages of the website are indexed

Use the site: command to check the approximate number of the web pages indexed. In most cases, the homepage should be the first result of the site: search command. 

Here is how I check my domain: site:seosly.com

The site: command gives you the rough number of web pages in the Google index. And don’t forget that this works with Bing as well.

⚡ And I have the whole guide about Google search operators and Bing search operators if you want to learn more.

Check if the number of the web pages indexed corresponds to the number of valid web pages in Google Search Console

The number of the web pages indexed shown by the site: command is approximate. But it should still be similar to the number of the canonical (indexable) web pages of the website. 

In Google Search Console, go to Index > Coverage > Valid to check the exact number of the web pages indexed. 

Any discrepancy may need your special attention! 

Check if any weird or irrelevant web pages are indexed

The site: command is also very useful for checking if any weird or irrelevant web pages got indexed.  Simply take a look at 2-5 pages of search results returned with the site: command. You may get really surprised! You may also put your domain in quotes and do an exact match search like “seosly.com”.

If you are not very robots.txt-savvy, start by reading the Google introduction to robots.txt.

Check if the website has a robots.txt file

Simply add /robots.txt to the website address to check if it has a robots.txt file and to see its content.  

For my website that would be https://seosly.com/robots.txt. If there is no robots.txt file, make sure the robots.txt URL returns a proper status code (200, 403, 404, or 410); if it returns a server error (5xx), Google may stop crawling the website. 

⚡ In my other article you will learn how to access and modify robots.txt in WordPress.

Check if the robots.txt file blocks website resources that should be indexed

Any errors in the content of the robots.txt file may result in valuable website resources not being discovered by search engine robots. Keep in mind that blocking a resource in the robots.txt file prevents it from being crawled, not from being indexed. If there is a link to the blocked resource anywhere on the internet, then this resource may still get indexed and appear in search results. 
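
If you want to test specific URLs against the live robots.txt file without a crawler, Python’s standard library includes a robots.txt parser. Here is a minimal sketch (the example.com URLs are placeholders); remember that a BLOCKED result only means the URL cannot be crawled, not that it cannot be indexed:

```python
from urllib.robotparser import RobotFileParser

# Placeholder site and URLs used for illustration
robots_url = "https://example.com/robots.txt"
urls_to_check = [
    "https://example.com/",
    "https://example.com/wp-content/uploads/logo.png",
    "https://example.com/category/?orderby=price",
]

parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()  # fetches and parses the live robots.txt file

# can_fetch() tells you whether a given user agent is allowed to crawl a URL.
for url in urls_to_check:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED':7} {url}")
```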

Check if the robots.txt really blocks website resources that should be blocked

Really huge websites (with millions of pages) should make good use of the robots.txt file so that it blocks the resources that should not be crawled, like URLs with parameters, thin tag pages, etc. It’s up to you – the SEO specialist – to decide which web pages (if any) should be blocked from crawling.

Check if the robots.txt file is valid

And you really want to be sure there are no typos or misspellings in the robots.txt file. You can use these tools to check if the robots.txt file is valid: 

  • The Google Search Console robots.txt tester
  • New enhanced Bing robots.txt tester 

Check if there are any other errors in the robots.txt file

Robots.txt may be completely valid and free of any syntax errors but there still may be other errors, such as the wrong name of the directory to be blocked.  Double-check that robots.txt really does what it is supposed to do and gives the Googlebot correct directives. 

Check if the robots.txt file indicates an XML sitemap address

The standard location of the sitemap (/sitemap.xml) does not need to be indicated in robots.txt. If the sitemap is not at a standard location or there are multiple XML sitemaps (or a sitemap index), then the sitemap URL should be indicated in the robots.txt file. 

In the case of my website, it looks like this:
Sitemap: https://seosly.com/sitemap_index.xml

You can host the XML sitemap on an entirely different domain. You just need to make sure to include it in the robots.txt. 

Check if the robots meta tag blocks website resources which should be indexed

Both the intended and unintended “noindex” value of the robots meta tag will block a page from being indexed. Crawl the website to check the indexability status of its web pages in bulk.  

You can also use the SEO Indexability Check Chrome extension to check indexability on a per-page basis. Note that if there are two different robots meta tags, Google will choose the more restrictive one. This is a rare case but it happens!

Check if there are indexable website resources which should be blocked by the robots meta tag

The opposite can also be harmful… especially if we are talking about thousands of low-quality indexable web pages. Your task here is to analyze all the indexable web pages and assess if they are valuable enough to be indexed. Use your common SEO sense!  

Check if the X-Robots-Tag blocks website resources which should not be blocked

You can also block a web page from being indexed with the help of the X-Robots-Tag. Your detective work is by no means over. Your task now is to detect and analyze the resources blocked by the X-Robots-Tag.
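
A quick way to see both signals for a single URL is to fetch it and print the X-Robots-Tag response header next to any robots meta tags found in the HTML. This is only a per-page sketch (it assumes the requests and beautifulsoup4 packages are installed, and the URL is a placeholder); it also makes the next check – looking for contradictory directives – easier:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URL used for illustration
url = "https://example.com/some-page/"

response = requests.get(url, timeout=10)

# 1. The X-Robots-Tag directive is delivered as an HTTP response header.
print("X-Robots-Tag header:", response.headers.get("X-Robots-Tag", "not set"))

# 2. The robots meta tag sits in the HTML <head>.
soup = BeautifulSoup(response.text, "html.parser")
meta_tags = soup.find_all("meta", attrs={"name": "robots"})
if not meta_tags:
    print("robots meta tag: not set")
for tag in meta_tags:
    print("robots meta tag:", tag.get("content"))
```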

Check if the directives given in the robots.txt, robots meta tag, and X-Robots-Tag do not contradict one another

Yes, you can instruct robots in multiple ways, such as with the use of the X-Robots-Tag, the robots.txt file, and the robots meta tag. Your task is to check if the directives are not contradicting one another, or bots will get confused and not behave as you want them to. 

Check if the web pages like a privacy policy or terms of service are indexable

The old school of SEO would advise against indexing privacy policy, terms of service, and similar web pages. These pages, after all, rarely have unique content. This seems to no longer be true. You should indeed index those pages because they help build up your E-A-T. You will learn more about E-A-T at the end of this guide.

Check if the web pages of the website render correctly

Rendering is seeing your web page through the eyes of a bot. It is really good to know how it sees the website! The Mobile-Friendly Test will render the web page and show the HTML code of the rendered page. 

I use either Sitebulb or Screaming Frog to render all the web pages of the website in bulk. 

In Sitebulb, once you start a new project, simply choose Chrome Crawler and Sitebulb will do all the dirty work for you together with analyzing the differences between the source and rendered code.

If you want Screaming Frog to render JavaScript, go to Configuration > Spider > Rendering. Choose JavaScript and Googlebot Mobile

Check if the website has an XML sitemap

Let’s now check if the website has an XML sitemap. A lack of an XML sitemap may prevent search engine robots from discovering all of the web pages of the website (especially the web pages that are deep in the structure). 

So where do you look for an XML sitemap?

  • Check the default location which is /sitemap.xml or sometimes /sitemap_index.xml.
  • Check the robots.txt file. 
  • Check Google Search Console (Index > Sitemaps). 
  • If possible and applicable, log in to the CMS of the website and look for sitemaps settings there. 

If the website does not have a sitemap, you may skip this section. Instead, create or suggest creating an XML sitemap for the website. 

⚡ In my guide of how to find the sitemap of a website, I’m showing you 7 different ways to detect a sitemap.

Check if the XML sitemap contains all the URLs it should contain

By definition, an XML sitemap should contain all of the indexable & canonical web pages of the website. In practice, however, that is not always the case.

The quickest way to check if an XML sitemap contains the canonical URLs of your website is to crawl it. I strongly recommend using Sitebulb for that.

Check if the sitemap contains incorrect entries

Make sure the sitemap is free from incorrect entries, such as redirected URLs, 4xx web pages, password-protected URLs, etc. Again, you can quickly check that by crawling the XML sitemaps of a website. 

Sitebulb will crawl the sitemap and show you any issues with it automatically.

Here is how you can analyze the content of sitemaps in Screaming Frog:

  • In Spider Configuration and under XML sitemaps, I check Crawl Linked XML Sitemaps and paste the URLs of the sitemaps of the website. 
  • Once the crawl of the website is done, I run Crawl Analysis.
  • Next, I analyze the contents of Sitemaps in the Overview tab. This tells me pretty much everything I need to know. 

Check if the sitemap uses the deprecated <priority> and <changefreq> parameters

Simply open the XML sitemap to check if there are <priority> and <changefreq> parameters. These parameters are currently ignored by Google, so there is really no need to have them in the sitemap. I usually recommend removing them entirely. 

Check if the sitemap uses the <lastmod> parameter. If the parameter is used, check if it is used correctly.

Google is able to determine the actual last modification date of the website.  If the <lastmod> parameter is misused, Google will simply ignore it. 

If, for example, the <lastmod> parameter indicates the same date across all the sitemap entries, then you can be pretty sure it’s not used correctly. 

In such a case, I usually recommend removing it entirely. 
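
As a quick sanity check, you can also parse the sitemap yourself and look at how the <lastmod> values are distributed (and whether the deprecated <priority> and <changefreq> tags from the previous point are present). This is a minimal standard-library sketch; the sitemap URL is a placeholder, and a sitemap index would need an extra loop over its child sitemaps:

```python
import xml.etree.ElementTree as ET
from collections import Counter
from urllib.request import urlopen

# Placeholder sitemap URL used for illustration
sitemap_url = "https://example.com/sitemap.xml"
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.parse(urlopen(sitemap_url)).getroot()

url_count = len(root.findall(".//sm:url", ns))
lastmod_values = [el.text for el in root.findall(".//sm:lastmod", ns)]
deprecated = {tag: len(root.findall(f".//sm:{tag}", ns)) for tag in ("priority", "changefreq")}

print(f"URLs in sitemap: {url_count}")
print(f"URLs with <lastmod>: {len(lastmod_values)}")
print(f"Deprecated tags found: {deprecated}")

# If every entry shares the same <lastmod> date, the value is almost certainly
# generated automatically and does not reflect real modification dates.
if lastmod_values and len(Counter(lastmod_values)) == 1:
    print(f"Suspicious: all <lastmod> values are identical ({lastmod_values[0]})")
```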

Check if the website has an image sitemap

If a website has a lot of valuable images and relies on Image Search, then it should definitely have an image sitemap. The lack thereof may result in search engine robots being unable to discover and index all of the images. Images can be added as a separate sitemap or together with the regular XML sitemap. 

Check if the website has a video sitemap

This applies only if a website hosts videos on its own server. A video sitemap is not intended for indicating YouTube videos.  

Skip this section if the website is not multilingual.

Practically any website crawler will provide the information needed to assess if there are issues with the hreflang implementation.

Check if language versions are clearly divided

Multilingual websites should have a clear language division. A lack thereof may result in incorrect indexation of particular language versions. For example, some web pages may be indexed only in one language while others only in another. 

To avoid this issue, a website should:

  • (ideally) put different language versions in different directories, 
  • have a language switch visible on every web page,  
  • and use hreflang tags (see the point below). 

Check if hreflang tags are used on the website

Multilingual websites should use hreflang tags because: 

  • Hreflang tags let search engine robots discover alternate language versions of web pages. 
  • Hreflang tags help eliminate content duplication caused by the availability of web pages in the same language but meant for different regions. 

To analyze the hreflang tags on your site with Sitebulb (and get very clear suggestions and possible fixes), make sure to check International in project settings.

Screaming Frog will let you analyze hreflang tags too. 

Check if hreflang tags are used correctly

Hreflang tags will not work unless they point to the correct (corresponding) web pages. Your task is to check if hreflang tags really link to the alternative language versions of the web pages. 

Check if the return links are missing

Without return links, Google will ignore the hreflang tags on the web page. This is a rare case where mutual linking is desirable! Screaming Frog or any other similar tool will let you check if return links are missing. 
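
If you only need to spot-check a handful of pages, you can also verify return links with a short script instead of a full crawl. A minimal sketch assuming the requests and beautifulsoup4 packages are installed; the URL is a placeholder and the comparison is a simple exact match (trailing slashes and the like may need normalizing on a real site):

```python
import requests
from bs4 import BeautifulSoup

def hreflang_map(url):
    """Return {hreflang value: href} for all alternate link elements on a page."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    result = {}
    for link in soup.find_all("link", href=True):
        rel = link.get("rel") or []
        if "alternate" in rel and link.get("hreflang"):
            result[link["hreflang"]] = link["href"]
    return result

# Placeholder URL used for illustration
page = "https://example.com/en/product/"
alternates = hreflang_map(page)

# For every alternate language version, check that it links back to the original page.
for lang, alt_url in alternates.items():
    if alt_url == page:
        continue
    has_return_link = page in hreflang_map(alt_url).values()
    print(f"{lang}: {alt_url} -> {'OK' if has_return_link else 'MISSING RETURN LINK'}")
```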

Check if the x-default hreflang attribute is used

Each web page using hreflang tags should also point to the default language version. To learn more about hreflang tags and how they work, read this Google article about managing multilingual and multiregional sites.  

Internal Linking

Internal linking can literally either make or break a site. This is a technical SEO aspect you should put a lot of focus on. A decent crawler like Sitebulb will analyze the internal linking structure of your site and notify you of any issues or areas of improvement.

Check if there is an excessive number of links on the homepage

If there are hundreds or even thousands of links on the homepage (including multiple links to the same URL), then something is not quite right. There is no specific number to aim for, but anything above one hundred should be a red flag and a reason to investigate further. 

Sitebulb does a brilliant job of analyzing your internal links and letting you understand how internal linking is implemented on a site. Just open the audit and go to Link Explorer.

Here is how to check the number of links on the homepage in Screaming Frog: 

  • Navigate to Internal > HTML.
  • Click on the (canonical) URL of the homepage. 
  • Navigate to the Outlinks tab.  
  • Click Export to export all the links. 
  • Analyze!  
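
If you just want a quick number before firing up a crawler, a few lines of Python can count the links on the homepage and show which URLs are linked multiple times. This is a rough sketch (requests and beautifulsoup4 assumed installed, example.com is a placeholder); note that it only sees links present in the raw HTML, not those injected by JavaScript:

```python
from collections import Counter

import requests
from bs4 import BeautifulSoup

# Placeholder homepage URL used for illustration
homepage = "https://example.com/"

soup = BeautifulSoup(requests.get(homepage, timeout=10).text, "html.parser")
hrefs = [a["href"] for a in soup.find_all("a", href=True)]

print(f"Total links on the homepage: {len(hrefs)}")
print(f"Unique link targets: {len(set(hrefs))}")

# URLs that are linked more than once (multiple links to the same target).
for href, count in Counter(hrefs).most_common(10):
    if count > 1:
        print(f"{count:3} x {href}")
```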

Check if every web page of the website has a link to the homepage

This issue rarely happens, but it does happen. Each web page should link to the homepage. The most common placement for this link is the logo (a graphic link) and/or the “Home” link in the main navigation.

Check if there are multiple links on the webpages of the website

Multiple links to the same URL are less of an issue now. Even John Mueller in one of the recent Google SEO office hours said that the first-link-counts rule is obsolete and quite irrelevant. Your task as an SEO is to take a look at multiple links on the website and assess if these links are problematic or not.  

Again, you need to use your experience and common sense. If there are tens or hundreds of multiple links on the webpages, then you may want to analyze it a bit more deeply. You can use Screaming Frog or any other website crawler to check that.  

Check if lower-level web pages link to other thematically-related pages

Product pages and blog articles are very helpful when it comes to using the potential of internal linking. Linking between thematically-related web pages strengthens both the linked and the linking page. For example, a product page should have links to other products. 

And it is also a great opportunity to use keyword-rich anchor texts, which helps search engine algorithms understand the topic of those pages.

Check if the website makes use of contextual linking

Contextual linking is a bit similar and is also very valuable. Contextual links also help search engines associate the linked web pages with specific keywords used in the anchor text. Using keyword-rich anchor texts in internal links is totally OK!

The best place for contextual links is blog articles that should link to product or offer pages. If the website has articles or tutorials, contextual linking should be put into action.  

Check if there are links pointing to non-existent or blocked resources

Links to blocked or non-existent resources may lead to a bad user experience which can negatively influence SEO. Your task is to remove the links pointing to non-existent or password-protected resources. You can also replace such links with working links returning status 200 (OK). 

Sitebulb will let you check both internal and external links very easily.

Check if there are low-value text links with inappropriate anchor text

Low-value text links make it harder or even impossible for search engine algorithms to associate the linked page with appropriate keywords related to its content. Links with low-value anchor text (like “Read more” or “Click here”) are a missed opportunity to give search engines information about the topic of the linked page. 

Check if there are low-value graphic links with the inappropriate or missing ALT attribute

The ALT attribute in image links is like the anchor text in text links. Use it! Make sure that graphic links are high-value links! 

Website Structure

Any crawling tool will help you analyze the structure of the website. However, I love how Sitebulb does that.

And here is how Screaming Frog does that.

But don’t blindly believe what the tool is saying. Make good use of your SEO experience and common sense here. The best website structure is the one that avoids any extremes. 

Check if the website structure is too flat

One extreme is when the homepage contains all the links to all the web pages. 

Check if the website structure is too deep

Another extreme is when the structure is too deep with more than 4-5 levels and lots of orphan web pages. 

Breadcrumbs

Breadcrumbs help users and search engine robots navigate through the website and better understand its structure. Breadcrumb navigation (or a breadcrumb trail) creates a return path to the superordinate web pages of the currently browsed page (including the homepage).

Check if the website has breadcrumbs

It is generally a good practice to use breadcrumbs on both smaller and bigger websites. On huge websites, it is even a must! 

Check if breadcrumbs are implemented correctly

Two things to note here:

  • Breadcrumb navigation should be implemented with the use of structured data. 
  • Breadcrumbs should not omit any web pages in the path and the last item (the page itself) should not be a link. 

If the site you are auditing has breadcrumbs, you can quickly check if they are added with the use of Schema.org by simply going to Structured Data > Search Features in Sitebulb. You should see Breadcrumb there.

Check if breadcrumbs are used consistently across the entire website

The website should use breadcrumbs consistently on each page. You can learn more about breadcrumb trails directly from Google. 

Navigation

The main navigation informs both users and search engine robots about the most important web pages of the website. 

Sitebulb has an interesting feature that lets you analyze internal links in terms of their placement. For example, you can analyze the links added in the main or footer menu.

Check if the main navigation of the website contains links to the most important web pages

Navigation should have links to main category pages, hub pages, or important info pages (contact or about pages). 

Check if the navigation of the website is implemented based on text links

I know this seems pretty basic and obvious but it’s still worth checking. 

Check if list tags ( < ul > and < li > ) are used to build navigation elements

Make sure that navigation is built with the use of list tags. 

Check if navigation links are visible to search engine robots

Since navigation links are the most important, you need to make sure that robots can really see those links. This is especially important for JavaScript-heavy websites. You can check that by simply comparing the source and rendered HTML code of the website.

Check if navigation works correctly on a mobile device

In addition to being accessible to search engine robots, navigation links also need to work as expected from the user’s side.  Simply open the website on a mobile phone and check how navigation works. Does it drop down where it should? Does it open a new web page as expected?

External Links

To check external links in bulk (in Screaming Frog), go to Overview > SEO Elements > External

Check if external links that are not true recommendations have a rel=”nofollow” or rel=”sponsored” attribute

Any outbound link that is not a true recommendation of the website audited should have a “nofollow” or “sponsored” attribute. And, conversely, there should also be true quality non-sponsored dofollow links to other thematically related web pages. There needs to be some balance!

Check if the links added by users have a rel=”ugc” attribute

If there is user-generated content on the website, then the website should make use of a rel=”ugc” attribute. This is especially important for links in comments and in the forum section. Make sure to check that! 

Check if there are site-wide dofollow links

In most cases, site-wide dofollow links should also have a rel=”nofollow” attribute. It’s almost 2021 but there are still websites that use site-wide links to boost SEO!

Check if there are external dofollow links to valuable resources

The website should have external dofollow links to high-quality resources. It is the way of the web to link out to the web pages that the website author considers valuable. Did you notice that I link to many external resources in my guide on how to do an SEO audit? I do that because I know these are valuable resources that may further help you.

URL Addresses

To analyze URL addresses in bulk (in Screaming Frog), go to Overview > SEO Elements > URL.

Check if URLs contain parameters (i.e. session or user identifiers) that do not influence the content displayed

URL addresses should not contain parameters that have no influence on the content displayed (e.g. session or user identifiers). If there are such addresses, then they should have a canonical link pointing to the URL version without parameters. 

Check if URLs contain keywords

In one of the recent Google SEO office hours, John Mueller said that keywords in URL play a minimal role. However, when it comes to users, that’s slightly different. Users like clear URLs and Google likes what users like!

URLs should contain appropriate keywords describing the topic of the web page instead of unfriendly characters like “/?p=123”. 

For example, an article talking about Google search operators should have these keywords in the URL like here: https://seosly.com/google-search-operators/

Check if URLs contain words in a different language than the language of the website

And, of course, URLs should contain words in the language of the web page. URLs in a different language might confuse both users and search engine robots. 

Check if dash characters are used to divide words in URLs

You should use dashes to separate words in URLs. Again, it’s both user and bot friendly. 

Check if URLs contain unnecessary words

Ideally, URLs should not contain unnecessary words that would make them super long.

Redirects

Any crawling tool will let you check the redirects implemented on the website. In Screaming Frog, go to Overview > SEO Elements > Response Codes to examine all the redirects. 

Check if there are multiple redirects (redirect chains)

Ideally, one URL should be redirected only once. Note that Googlebot may stop crawling after more than 2-3 redirects. 
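
You can also trace the chain for a single URL directly from Python, since requests keeps every intermediate hop in response.history. A minimal sketch; the URL is a placeholder:

```python
import requests

# Placeholder URL used for illustration (e.g. an old, redirected address)
url = "http://example.com/old-page"

response = requests.get(url, allow_redirects=True, timeout=10)

# response.history holds every intermediate response in the redirect chain.
print(f"Number of hops: {len(response.history)}")
for hop in response.history:
    print(f"{hop.status_code}  {hop.url}")
print(f"Final: {response.status_code}  {response.url}")

if len(response.history) > 1:
    print("Redirect chain detected – ideally there should be a single redirect.")
```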

Check if there are redirects with incorrect statuses

In most cases, you should use 301 (permanent) redirects. 302 (temporary) redirects are to indicate a temporary change. People often confuse these two and use 302 redirects for permanent site changes (like a redirect from non-HTTPS to HTTPS version). Make sure the website uses redirects in line with their purpose. 

Check if there are meta refresh redirects

Unlike 301 or 302 redirects, which are server-side redirects, meta refresh redirects are client-side. A meta refresh redirect instructs the browser to go to a different page after a specific time period. Google may treat meta refresh redirects as sneaky redirects.

Meta refresh redirects should be replaced with regular HTTP redirects. You can learn more about sneaky redirects and 301 redirects straight from Google.

Status Code

An HTTP status code is the server’s response to the browser’s request.  Status codes indicate if the HTTP requests were successful (e.g. 2xx), or if there were some errors (e.g. 4xx), redirections (3xx), or other problems with the server (5xx).   

Check if there are web pages returning 5xx errors

A huge number of web pages returning status code 5xx may indicate that there are some problems with the server. The server may be overloaded or may need some additional configuration. 

Check if there are web pages returning 4xx errors

We have already touched upon this a bit. Lots of web pages returning status 404 (not found) or 410 (content removed) on the website may lead to a bad user experience. This applies to both internal and external links within a website.  

If there are backlinks pointing to those 4xx web pages, then Google will not count those links. The internal links to these 404 web pages should either be removed or replaced with working links. If there are external links pointing to those 404 URLs, then I recommend 301-redirecting these web pages to working URLs.

Check if an error page returns a 404 status code

A website should be able to handle error pages correctly. A non-existent page should return a 404 status code (not found) instead of 200 (OK). An error page returning the status code 200 may get indexed and become a soft 404. 

Google is getting better and better at handling soft 404 pages but you still should ensure that the website handles error pages in an optimal way.  

You can use the Link Redirect Trace Chrome extension to quickly check the status code of any web page. 
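
You can also test this from the command line by requesting a URL that almost certainly does not exist and looking at the status code. A minimal sketch assuming the requests package is installed; the domain is a placeholder:

```python
import uuid
import requests

# Placeholder domain used for illustration
domain = "https://example.com"

# Request a URL that almost certainly does not exist on the website.
fake_url = f"{domain}/{uuid.uuid4().hex}/"
response = requests.get(fake_url, timeout=10)

if response.status_code == 200:
    print(f"Potential soft 404: {fake_url} returns 200 (OK)")
elif response.status_code in (404, 410):
    print(f"OK: the non-existent URL returns {response.status_code}")
else:
    print(f"Unexpected status code: {response.status_code}")
```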

Check if the website has an error page

A blank error page that screams “ERROR” in red is not a good user experience.  A website should have an error page that clearly says that this is an error page and a user landed on it because the URL typed does not exist or cannot be found. 

Check if the website has a dedicated error page

In an ideal SEO world, an error page should also have links to the most important web pages of the website. Its layout and design should also be like the rest of the website. 

A dedicated error page is there mainly for users to make their page experience better. 

Duplication

Yup, it is true that Google is getting better and better at handling duplication. But you, as an effective SEO, can make it a lot easier for Google with just a few technical fixes. 

Check if there is duplicate content caused by the incorrect technical implementation of the sorting of content

The incorrect implementation of the sorting of the content may result in a lack of control over which URLs get indexed. A quick fix to this is usually to add a canonical link element that points to the URL without the sorting parameters. 

Note that this may vary from website to website. There may be situations where you want to index URLs with specific sorting or filtering parameters.  

Check if the website is available both at the HTTPS and non-HTTPS URL versions

We are coming back to redirections! I see this so often that I need to repeat myself. If the website has an SSL certificate, then all the non-HTTPS versions should permanently redirect (301) to the HTTPS versions. And there should be one redirect only. 

Check if the website is available both at the WWW and non-WWW URL versions

And the same applies to the WWW and non-WWW versions of the website. The website should not resolve at both the WWW and non-WWW URL version. 

One version needs to be chosen as canonical and the other one needs to redirect (301) to the canonical version. Here again the Link Redirect Trace Chrome extension will come in very handy. 
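
Here is a quick way to test all the protocol and host variants at once. A minimal sketch assuming the requests package is installed; the canonical version and the variants are placeholders you would adjust for the website audited:

```python
import requests

# Placeholder canonical version and its variants, used for illustration
canonical = "https://www.example.com/"
variants = [
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
]

for variant in variants:
    r = requests.get(variant, allow_redirects=True, timeout=10)
    first_hop = r.history[0].status_code if r.history else None
    # One permanent (301) redirect straight to the canonical version is the ideal case.
    ok = r.url == canonical and first_hop == 301 and len(r.history) == 1
    print(f"{variant} -> {r.url} (first hop: {first_hop}, hops: {len(r.history)}) {'OK' if ok else 'CHECK'}")
```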

Check if the web pages of the website are available at URLs in which letter case is insignificant

In an ideal SEO world, the letter case in URLs should not be significant. But if it is significant, then you may:

  • add a canonical link pointing to the lowercase URL,
  • or use the so-called URL rewriter (like URL Rewrite, URL Rewriter, or ISAPI Rewrite 2) that will rewrite URL addresses so that there are only lowercase characters.

Check if pagination is handled correctly

Make sure that the paginated web pages can be crawled by search engine robots. If you feel like having a long read, check this guide to pagination in SEO. 

Check if there are duplicate or near-duplicate pages

It’s for you to judge if they really exist. Screaming Frog and Semrush may help you find such pages. 

In Sitebulb, go to Duplicate Content to check if there are duplicate content issues on the site.

In Screaming Frog, go to Crawl Overview > Content and check what pages are listed under Exact Duplicates, Near Duplicates, and Low Content Pages.  

Check if the website is available at a different URL

It’s better to be safe than sorry. So make sure there is no indexable copy or test version of the website somewhere on the internet. Use Copyscape to detect duplicate content. 

Check if the content of the website is unique on the Internet

You can check if the website is unique by simply copying a unique block of text from the website and pasting it in quotes into Google. The only result returned should be the web page from which you copied the text.

Canonicalization

Let’s now dive deep into canonicalization. Sitebulb or Screaming Frog or any other website crawler will let you check if there are canonicalization issues on the website.

When in Sitebulb, go to Indexability > URLs and then check what’s under Canonical URL.

When in Screaming Frog, go to Overview > Canonicals to check how and if canonicals are used on the website. 

 

Check if canonical link elements are used on the website

The rule of SEO thumb I recommend is that all the web pages of the website should have a canonical link element. Check if this is the case for the website you are auditing. 
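
For a handful of URLs you can also pull the canonical link element directly. A minimal sketch assuming the requests and beautifulsoup4 packages are installed; the URLs are placeholders:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URLs used for illustration
urls = [
    "https://example.com/",
    "https://example.com/blog/some-article/",
]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    canonicals = [
        link.get("href")
        for link in soup.find_all("link", href=True)
        if "canonical" in (link.get("rel") or [])
    ]
    if not canonicals:
        print(f"{url}: no canonical link element")
    else:
        # More than one canonical link element on a page is itself a problem worth flagging.
        print(f"{url}: {canonicals}")
```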

Check if canonical link elements are used correctly

Having canonical link elements is one thing, but having them implemented correctly is another! Your task is to check canonicalized URLs and make sure that, for example, a group of unique web pages is not canonicalized to one general URL.

I once even saw all the web pages being canonicalized to the homepage.  

Check if for the most important web pages Google has chosen the same canonical address as indicated by the webmaster.

You cannot force Google to choose the canonical URL you set for a given web page.  A canonical link is not a directive but only a hint for Google. 

But it’s great to know if Google took that hint into account. The URL Inspection tool will help you check that.

Title Tags

Page titles are more related to the content side of SEO but let’s briefly take a technical look at them. 

The <title> element is what users see directly in search results and in the tab name in modern web browsers. 

Page titles provide both search engine robots and users with a lot of valuable information about the web page. I recommend using a crawler like Sitebulb or Screaming Frog to analyze the website’s meta tags in bulk. 

When in Sitebulb, go to On Page and scroll down until you see Title Length and Title Identification. Click on any to see more detail.

To analyze page titles in bulk using Screaming Frog, go to Overview > SEO Elements > Page Title

 

Check if the web pages of the website have the < title > tag

In Sitebulb, go to On Page and then Title Identification and click on Missing. The link will be active if you have missing titles on any of the web pages on your site.

In Screaming Frog, to check if there are any web pages with missing page titles, simply navigate to Page Titles > Missing

Make sure to check all the web pages of the website. All valuable web pages of the website should have <title> tags with high-quality and relevant content. 

Check if the < title> tags are of the recommended length (30-60 characters)

Page titles should contain between 30 and 60 characters to look attractive in search results.

In Sitebulb, go to On Page and Title Length to check the length of the title tags on the site.

In Screaming Frog, go to Page Titles and check web pages appearing under over 60 Characters and Below 30 Characters
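
If you only want to spot-check a few pages without a crawl, a short script can pull the <title> tag and flag anything outside the 30-60 character range. The same approach works for meta descriptions and H1 tags. A minimal sketch; requests and beautifulsoup4 are assumed installed and the URLs are placeholders:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URLs used for illustration
urls = [
    "https://example.com/",
    "https://example.com/blog/some-article/",
]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    length = len(title)
    if not title:
        verdict = "MISSING"
    elif length < 30:
        verdict = "TOO SHORT"
    elif length > 60:
        verdict = "TOO LONG"
    else:
        verdict = "OK"
    print(f"{verdict:10} {length:3} chars  {url}  '{title}'")
```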

Check if the < title > tags used are duplicate

Just like meta descriptions, page titles should be unique across the entire website. Duplicate titles often point to duplicate or near-duplicate pages or to template issues.

In Screaming Frog, go to Page Titles > Duplicate to see the web pages with duplicate title tags.


Check if the < title > tags used have appropriate content

Page titles need to be unique and form a kind of summary of the content of the web page. 

In Sitebulb, you can quickly analyze the content of all the title tags by going to On Page > URLs and scanning the list.

Check if the < title > tags used contain important keywords

And page titles should also contain the most important keywords in them. But please, no keyword stuffing! 

Check if keywords are placed at the end of the < title > tag

Keywords should be placed at the beginning of the title (starting with the most important keyword). Again, no keyword stuffing!

Check if the brand name appears at the beginning of the < title > tag

If you want to put the brand name in the title, then it should appear at the end of the page title. The only exception is the homepage where it can appear at the start. 

Page Descriptions

And now a technical look at another content-related SEO element: page descriptions. If the page description is attractive and gets to the point quickly, then users will be more likely to click the website’s snippet in search results.  

In Sitebulb, go to On Page and scroll down until you see Meta Description Length and Meta Description Identification.

To analyze page descriptions in bulk using Screaming Frog, go to Overview > SEO Elements > Meta Description

Check if the web pages of the website have content in meta description tags

In Sitebulb, go to On Page and Meta Description Identification. Check what’s under Missing.

In Screaming Frog, go to Meta Description > Missing to see the web pages without meta descriptions. If there is no content in the meta description element, then Google will generate it on its own. 

Having no meta description is not a big issue because even if a web page has its unique description, Google will probably rewrite it more often than not. However, it is still a good SEO practice to have unique page descriptions for at least the most important web pages of the website. 

Check if the content of meta description tags is of the recommended length (140-155 characters)

Assuming that Google chooses our own meta description, we should keep it within 140-155 characters so that it does not get trimmed in search results and is not too short. 70 characters is an absolute minimum length. 

In Sitebulb, go to On Page and Meta Description Length. Click to see the details.

In Screaming Frog, go to Meta Descriptions > Over 155 Characters and Below 70 Characters.

 

Check if meta description tags are duplicate

Just like with page titles, we want meta descriptions to be unique. Go to Meta Description > Duplicate to see the web pages with duplicate meta descriptions. 

Note that it is better to have no page description at all than to have this element duplicated across many web pages. You can learn more about page titles and descriptions straight from Google in this article

Check if the content of page descriptions is appropriate

Unattractive or random content of meta description tags will make users much less likely to click the snippet in search results. 

Check if the page descriptions contain keywords

Page descriptions should include the page’s most important keyword, its variation, and, if possible, a synonym. This time it’s mainly for users, not search engines. 

Headings

Whatever they say about them at the moment, headings are important in terms of SEO. They help both users (especially users of screen readers) and search engine robots understand the topic and subtopics of the web page. In this technical SEO checklist, we are taking a more technical look at them. 

Check if the web pages of the website have an H1 tag.

Every web page of the website should have one H1 header.

To analyze headings in Sitebulb, go to On Page and check the hints.

To check H1 headings in bulk in Screaming Frog, go to Overview > SEO Elements > H1 > Missing and you will see the list of web pages without an H1 tag.  

A web page without an H1 tag is missing a huge opportunity to give search engine algorithms valuable information about itself. Each page of the website (including the homepage) should have exactly one unique H1 heading. 

Check if there are multiple H1 headings

In Sitebulb, go to On Page and scroll down until you see H1 Tag Identification. Click to see details.

In Screaming Frog, go to SEO Elements > H1 and Multiple to check if there are web pages with multiple H1 headings. 

It is certainly better to have multiple H1 tags than to have no H1 tags at all. But, if possible, stick to one H1 tag! 

Check if the content of H1 headers is SEO-friendly

H1 headers should contain the most important keyword for the web page to clearly communicate its topic both to users and search engines.

Check if headings are used on the web pages of the website

An H1 tag alone is not enough to provide users and search engine robots with information about the content structure of the web page. A web page that has no headings at all or just one heading is difficult to understand for both users and robots. 

You can use Chrome plugins, such as Web Developer or Detailed SEO Extension to check the structure of headers on a web page.  

Check if headings are used excessively

Headings should be used to highlight the most important content and individual sections of the website. Excessive use of headings will confuse people using screen readers just as it will confuse search engine algorithms. 

Check if the structure of headings is broken

Another great SEO rule of thumb is to have a logical order of headings. You should treat headings and subheadings as chapters and subchapters in a book. A web page is that book. 
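To make this concrete, here is a simplified (made-up) outline where the heading levels follow a logical order and no level is skipped:

  <h1>Technical SEO Audit</h1>
    <h2>Crawling and Indexing</h2>
      <h3>Robots.txt</h3>
      <h3>XML Sitemaps</h3>
    <h2>On-Page Elements</h2>
      <h3>Title Tags</h3>
      <h3>Meta Descriptions</h3>

A broken structure would, for example, jump straight from H1 to H4 or use H3 headings before any H2 appears.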

Chrome extensions for checking headers:

  • Web Developer Chrome extension
  • Detailed SEO Chrome extension

⚡ Make sure to check my list of SEO Chrome extensions (79 extensions reviewed).

Graphic Elements

Graphic elements, if optimized correctly, can give both search engine robots and users of screen readers a lot of extra information about the web page.

When in Sitebulb, go to Page Resources > Overview to see the overview of the page resources on your site.

Click on Images to check the details of the images (ALT attribute, file size, compression, etc.).

To check images in bulk in Screaming Frog, go to Overview > SEO Elements > Images.

Check if images are embedded correctly

Google will not treat images embedded with CSS (for example as background images) as part of the content of the web page. To check how a given image is embedded, simply right-click on it and click Inspect.

Except for the images forming the layout of the website, graphics should be embedded with the <img> tag. 
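As a quick illustration (file names and ALT text below are hypothetical), this is the difference in practice:

  <!-- Decorative, layout-only graphic: a CSS background is fine, but Google ignores it as content -->
  <div class="hero-banner"><!-- background-image set in the stylesheet --></div>

  <!-- Content image: embedded with <img> so it is treated as page content and can appear in image search -->
  <img src="/images/audit-report.png" alt="Sample technical SEO audit report summary" width="800" height="450">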

Check if there are images with low-value ALT attributes

Low-value ALT attributes will confuse both users of screen readers and search engine algorithms. ALT attributes provide important information about the content of the image. Each unique image on a web page should have a unique and high-value ALT attribute. 
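To illustrate (the file names below are made up):

  <!-- Low-value ALT text: generic or keyword-stuffed -->
  <img src="/images/img0231.jpg" alt="image">
  <img src="/images/img0232.jpg" alt="seo audit seo tools best cheap seo">

  <!-- High-value ALT text: describes what the image actually shows -->
  <img src="/images/sitebulb-hints.png" alt="Sitebulb On Page report listing H1 tag hints">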

Check if there are images with no ALT attribute at all

In Screaming Frog, go to Overview > SEO Elements > Images > Missing Alt Text to view all the images with missing ALT text.

You can also check images on a per-page basis using the Detailed SEO Chrome extension.

Each unique image on a web page should have a high-value unique ALT text. 

 

Check if there are images with low-value file names

To check image filenames in bulk, go to Overview > SEO Elements > Images > All. Image file names are not as important as ALT attributes: if a website has high-value ALT attributes, then filenames matter less. However, it's still a good practice to use SEO-friendly image filenames, at least from now on.

Check if the images used are of appropriate size

In an ideal world, images should be displayed in their original (already compressed) size. A very common error I see is when a web page serves huge (often PNG) images and scales them down with CSS/HTML.

Google PageSpeed Insights will notify you if a web page has this issue. If this is a common practice, then this can really decrease the speed and performance of the website. 

Check if the images used are optimized

Test the web page with Google PageSpeed Insights to check if there is room for improvement. The images on a website (especially if it has lots of them) should be compressed and optimized. Next-gen formats (such as WebP or AVIF) should be used if possible.

In the case of WordPress sites, there are lots of useful plugins for optimizing and compressing images. Also check Google's image best practices.

Check the code

This is very case-specific. Simply view the source code of the website audited and use your common SEO sense. You can view the code of any website by simply adding view-source before its address like in: view-source:https://seosly.com/ 

PRO TIP: To check the code of a website on a mobile device, simply add "view-source:" before the address in the mobile browser.

If the homepage is relatively small and does not have a lot of content, but there are tens of thousands of lines of code, then something is not right. 

Check if there are some unnecessary comments in the HTML code

Check if the website has some unnecessary or weird comments in the code. You may be really surprised at what sometimes gets there! 

Check if the HTML code is cluttered with inline JavaScript

The general rule is to place JavaScript <script> tags either in the <head> section (ideally with defer or async) or just before the closing </body> tag. Your task is to check that scripts are not scattered all over the markup.
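A clean setup usually looks something like this (the script names are placeholders):

  <head>
    <!-- Scripts in the head should not block rendering: load them with defer (or async) -->
    <script src="/js/analytics.js" defer></script>
  </head>
  <body>
    <!-- page content -->

    <!-- Non-critical scripts can go just before the closing body tag -->
    <script src="/js/carousel.js"></script>
  </body>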

Check if in-line styles are used

In-line styles are OK on rare occasions, but they should be the exception rather than the rule. The HTML code should not contain excessive numbers of in-line styles.
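For example (hypothetical markup), here is the same styling repeated inline versus defined once in the stylesheet:

  <!-- Inline styles repeated on every element bloat the HTML and are hard to maintain -->
  <p style="font-size:18px;color:#333;">First paragraph...</p>
  <p style="font-size:18px;color:#333;">Second paragraph...</p>

  <!-- Better: define the rule once in CSS and reuse a class -->
  <p class="body-text">First paragraph...</p>
  <p class="body-text">Second paragraph...</p>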

Content
In Sitebulb, go to Duplicate Content to see the technical analysis of the content on the site.

In Screaming Frog, go to Overview > SEO Elements > Content to take a technical look at the content side of things.  

 

Check if there are duplicate or near-duplicate web pages

Now go to Overview > SEO Elements > Content and check the web pages listed under Exact Duplicates and Near Duplicates. In most cases, you don't want Google to index those pages. These web pages should either be canonicalized to the preferred version or have a "noindex" robots meta tag added to them.
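In practice, the fix on the duplicate page is one of these two snippets (the URL is a placeholder):

  <!-- Option 1: point the duplicate at the preferred version -->
  <link rel="canonical" href="https://example.com/preferred-page/">

  <!-- Option 2: keep the duplicate out of the index entirely -->
  <meta name="robots" content="noindex, follow">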

Check if the homepage has at least some text

The homepage is by far the most significant web page of the website. That’s why it should have at least some text (at least a few hundred words) and a clear heading structure. Use the Detailed SEO Chrome plugin to check the heading structure and number of words on any webpage. 

Check if there are low-content web pages

Go to Overview > SEO Elements > Content > Low Content Pages. A low-content page is a page that has few words and no unique content of its own. No one likes low-content web pages. In many cases, low-content pages should either be optimized (if these are, for example, category pages) or pruned.

Check if text is implemented in the form of images

Google is getting better at understanding images but it is still a good practice to add text in the form of… text. To check if this issue is on the website simply view some of its most important web pages and analyze images used there. 

Check if flash elements are used instead of text

I know this is an obsolete question. But I still advise you to check if Flash is used on the website. To check that, go to Overview > SEO Elements > Internal > Flash. You should see (0) there. 

Check if the content of the website is added with the use of iframes

The actual content of the web page should not be placed with the use of iframes. Make sure this does not happen. 

Check if the website has relevant and topically-coherent content

Skim the main sections of the website and check whether the content actually matches the site's core topic. Thin or off-topic sections dilute topical relevance and make it harder for both users and search engines to understand what the site is about.

A Technical Look At Keywords
Keywords are content but a technical approach and technical knowledge are required to make them work! Here are a few things to check regarding keywords.

Check if correct tags are used for highlighting keywords

Highlighting keywords on a page can be quite helpful both for users and search engine robots. To make keyword highlighting meaningful, <strong> tags should be used instead of <b>. Your task here is to take a look at the most important web pages, articles, or guides and check if keywords are highlighted correctly.
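For example (the sentence itself is made up):

  <!-- <b> only changes the visual styling of the text -->
  <p>A <b>technical SEO audit</b> covers crawling, indexing, and performance.</p>

  <!-- <strong> marks the phrase as important, which is what keyword highlighting is meant to do -->
  <p>A <strong>technical SEO audit</strong> covers crawling, indexing, and performance.</p>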

Check if keyword research has ever been done for the website

Your task as an auditor is to check if anyone has ever done keyword research for the website and, if possible, take a look at the keywords selected for it. This provides some additional insight into the website examined.

Keyword research is not part of the technical SEO audit, but you may offer to do it for the client. You can use Semrush or Ahrefs for it. If you don't know how to do keyword research, you might want to take the free Semrush course on keyword research.

Check if specific keywords are mapped to specific web pages

If you don't have information about the keyword research done for the website, you can simply review a bunch of its web pages manually to check if they are mapped to keywords. A web page targeted at a specific keyword will usually contain this keyword in the title, URL, headings, and the first paragraphs of text.

The web page you are now reading is obviously targeting the keyword technical SEO audit. If you have any experience with SEO, this will be very easy to determine.

In the case of WordPress websites that have an SEO plugin installed, one way to check if a web page targets a specific keyword is to view the meta data of the page. Both Yoast SEO and Rank Math let you do that. Here is how it looks in Rank Math.

Note that to be able to check that you need WordPress admin access.

Check if web pages are optimized for their specific keywords

This is the follow-up to the previous step. This time you want to make sure that the web page targeting a given keyword is actually optimized for it. In addition to having the keyword in the title, URL, headings and the first paragraphs of text, the web page should also have valuable graphics (with ALT text), topically-relevant links to external resources, and more.

Structured data
Google uses structured data in order to better understand the content of the web page and to enable special search results features and enhancements (rich results). Every technical SEO audit should also analyze the website in terms of structured data. You can learn more about structured data in this Google article.

Check if structured data are used on the website

The best way to check if structured data are used on the website is to run a crawl. You can use Sitebulb or Screaming Frog.

In Sitebulb, go to Structured Data > Schema to check what types of Schema are used on the site (and whether there are warnings or validation errors).

If you are using Screaming Frog, make sure to check JSON-LD and Schema.org Validation under Structured Data in Spider Configuration, or the crawler will not check for structured data.

Once the crawl is done, you can check if structured data is present on the website. Navigate to Overview > Structured Data > Contains Structured Data.

This is the list of URLs that have structured data. To check if structured data is used on a per-page basis, you can use the Detailed SEO Chrome extension.
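If you are not sure what to look for in the page source, structured data is most often added as a JSON-LD block; here is a minimal, made-up Article example (all values are placeholders):

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Technical SEO Audit Checklist",
    "author": { "@type": "Person", "name": "Jane Doe" },
    "datePublished": "2021-01-15"
  }
  </script>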

Check if the structured data used is valid

To check if the structured data used on the website is valid, you can use the Rich Results Test (to check if the web page is eligible for rich results) or the Schema Markup Validator, which replaced the old Google Structured Data Testing Tool.

If you are using Sitebulb, run the crawl and go to Structured Data to see a very in-depth and great-looking report of the structured data used on the site.

If you are using Screaming Frog, navigate to Overview > Structured data to check if structured data are used on the website. If they are, then you will see the number of URLs next to Contains Structured Data and in Missing there will be nothing or almost nothing.

Check if other types of structured data could be added to the website

Here your task is to analyze the most important web pages of the website, check the types of structured data they contain, and decide whether other types could also be added. You can use a Chrome extension like Detailed SEO to check the types of structured data on a per-page basis.

If the website audited is based on WordPress, then you may think about upgrading to Rank Math Pro which allows for implementing different types of structured data.

Website Speed
I assume you have already tested the website with Google PageSpeed Insights. Let’s now get even more data about the speed of the website.

Note that you can check the PSI scores of all your pages in bulk with the help of Sitebulb. When setting up the crawler, make sure to check Page Speed, Mobile Friendly, and Front-end.

When the crawl is done, go to Page Speed to analyze this element in detail.

Check the website speed with GTmetrix

GTmetrix is another great tool for analyzing the speed of the website. It also gives a lot of actionable tips and highlights specific problems.

Check the website speed with the WebPageTest

And now check the web page with WebPageTest. Make sure to run the test with a mobile device profile as well.

Check the website speed with Google PageSpeed Insights if you have not done it yet.

Or rerun the test and compare its results with the results provided by other speed tools.

Website Security

Many websites do not implement even basic security best practices. Your task is to ensure that the website you are auditing is not one of them.

To take a thorough look at some security issues in Sitebulb, run the crawl and then navigate to Security where you will see a ton of different security elements and their assessment.

Does the website have an SSL certificate?

HTTPS has been a ranking factor since 2014. In 2020 (and 2021) each and every website should use HTTPS. A website not secured with HTTPS is marked as Not secure in Chrome and other browsers. 

Make sure the website uses HTTPS. If it does not, make it a priority to move it to HTTPS as soon as possible.

 

Is there mixed content on the website?

Mixed content occurs when website resources load both over HTTP and HTTPS. In Sitebulb, go to Security and see the issues. If the site does not have mixed content, then you will see it in the No Issue section.

All resources should be updated or redirected so that they load over HTTPS. Note that recent versions of Chrome automatically upgrade or block most mixed content, but you should still fix it at the source.
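A typical case looks like this (placeholder URLs): the page itself loads over HTTPS, but one resource is still requested over plain HTTP.

  <!-- Mixed content: an HTTP resource on an HTTPS page -->
  <img src="http://example.com/images/logo.png" alt="Example logo">

  <!-- Fix: reference the HTTPS version of the resource -->
  <img src="https://example.com/images/logo.png" alt="Example logo">

  <!-- Optional safety net: ask browsers to upgrade any remaining HTTP references -->
  <meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">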

Are there at least some basic security best practices implemented?

It is hard to define the exact scope of basic security practices. Generally, the more, the better. A few simple and effective security practices include:

  • HTTPS,
  • two-factor authentication for login panels,
  • password protection of the login panel,
  • strong passwords,
  • regularly scanning the website with security software,
  • doing regular backups,
  • making sure the website is not flagged by Google Safe Browsing.

Check server logs

Server logs will always tell you the truth and nothing but the truth. If possible, do a server log analysis. The areas to focus on include crawl volume, response code errors, crawl budget waste, temporary redirects, and last crawl dates. Semrush includes a Log File Analyzer that will help you analyze the raw data and make sense of it.

If you cannot access server logs, make sure to analyze the crawl stats report in Google Search Console.

WordPress Technical Checks & Quick Fixes
Here are some of the plugins that will let you fix some of the issues discussed above. I am a heavy user of WordPress, so I can recommend some of my favorite plugins. You don't necessarily need to install all of them if you are only doing a one-time technical audit, but you should at least indicate that they are an option.

If, on the other hand, you are auditing a website that you will be monitoring on a regular basis, these plugins will make your life a lot easier.

Install a backup plugin.

Before making any changes to the website, back it up. There are a lot of ways to back up a site. One possible way is to install a backup plugin (like UpdraftPlus) that will automatically create a copy of all the files and database. Make sure that the backup is not stored in the same place as the rest of the files. If the website does not have any backup plan with its hosting, take care of this. 

Install a security plugin

WordPress websites are especially vulnerable to attacks and hacks. There are a few good WordPress security plugins (like iThemes Security) that will let you implement at least a basic level of security on a website. If you decide to buy a pro version of a security plugin, you can pretty much forget about this aspect.

UPDATE: Security plugins often can also slow down your site. Make sure to test this. Adding Cloudflare CDN to your site also increases its security.

Install Really Simple SSL

The mixed content issue is quite common on WordPress websites. There is very often something not quite right with the HTTP > HTTPS redirects. Fortunately, you can fix it with one click with the help of a plugin called Really Simple SSL.

UPDATE: According to my recent tests, this plugin slows sites down. I recommend installing it, letting it apply the fixes, and then uninstalling it while keeping the settings.

Update WordPress and plugins

Both WordPress and plugins should be updated regularly. This is also a very important security best practice. Some websites, unfortunately, like to break with updates. That’s why doing regular backups is so crucial.

Your task here is to back up the website and run updates (if you have the power to do it).  

Install Google Site Kit

I know that having too many plugins is not good but Google Site Kit is really worth installing. This is an official Google plugin that provides insights from different Google tools in one place: the WordPress dashboard.

UPDATE: I have recently been moving in the direction of minimizing the number of plugins. Unless you really need to have a dashboard in the WordPress panel, I don’t recommend installing this plugin any more.

Install an SEO plugin

Check if the website is using an SEO plugin. Install and configure one if needed. There are basically two major players here: Rank Math and Yoast SEO. It’s up to you to decide which one to choose. 

UPDATE: I recommend using Rank Math which does not slow down sites and offers many of the features that you need to pay for in Yoast.

Optimize all the images in bulk.

Images can really slow down the website if they are not optimized. Fortunately, there are a lot of plugins that will let you optimize images in bulk. I most often use Imagify. There are other good options like WP Smush. 

Improve the speed and performance.

I have probably tested hundreds of different site speed and caching plugins. Some improved the Google PageSpeed Insights score a bit, others even decreased it. From my own experience, there is only one plugin for speed. WP Rocket!

Regularly check for broken links.

Links change, come and go. Broken links can lead to a bad user experience. Fortunately, there is an easy way to monitor all your links and get notifications about broken links with a quick option to update the link. I use Broken Link Checker to regularly check my links. 

UPDATE: I don't recommend using this plugin any more. In my tests it slowed sites down a lot. You can monitor your sites for broken links with the help of Sitebulb.

Check if the website needs some cleaning in terms of unused plugins.

When in WordPress, navigate through the list of the plugins installed. Make sure that each plugin is actually used. If it is not, remove it. Make sure the website does not have multiple plugins doing the same things (e.g. multiple security or SEO plugins).

Technical SEO Audit: E-A-T

E-A-T (Expertise, Authoritativeness, Trustworthiness) is big. That's why I think you should analyze it at least briefly, even when doing a technical audit.

Check if the website has backlinks from authoritative sites from the same field

In the case of an SEO blog, these would be backlinks from SEO authorities like SEJ or Moz. A quick way to get an overview of backlinks is to run a Backlink Audit in Semrush. Once complete, you can sort link types by their authority score. In the example above, there are unfortunately no highly authoritative links for the domain examined.

Check if the website is mentioned on other authoritative websites

Mentions are not always links but are very important as well. Check if the website (or the brand name) is mentioned on authoritative websites. The simplest way is to perform an exact match search for the brand name. In the case of my website that would be "seosly".

Check if the content of the website is up to date

Depending on the topic of the website, there may not be a clear way to check if the content is up to date. One possible way is to look at the publication or update dates on articles. Here is how it looks in the case of my website.

You can also check the last modified date in the XML sitemap for a given URL or all the URLs.
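If the sitemap includes the <lastmod> element, a single entry looks roughly like this (the URL and date are made up):

  <url>
    <loc>https://example.com/technical-seo-audit/</loc>
    <lastmod>2021-02-10</lastmod>
  </url>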

Check if the content is factually accurate

Of course, you may not be able to verify the factual accuracy of the content of the website (especially if its topic is very specific) but you should check if the website makes claims that contradict scientific consensus. This includes websites that promote conspiracy theories or alternative medicine treatments.

Check if the authors of the websites are recognized authorities in the field

One of the elements of E-A-T is authority, which means that the website and its authors should be recognized authorities in the field. A quick way to check that is to perform an exact match search for the author or authors of the website. Do their names appear in other trustworthy publications or websites? Are they referred to as authorities?

Check if the website presents its credentials (awards, certifications, badges, etc.)

If the website or the brand has any kind of achievements (like awards, certifications, trust badges, etc.), they should all be presented on the website. The best place to show them is the home page or the about page. In the case of my website, I put all of my achievements on the about page and on the home page. Here is how it looks on my about page.

Check if the website has genuine reviews (and if they are positive or negative)

There is nothing worse than a website or brand writing its own (fake) reviews. Again, this may not be easy to verify, but some digging should let you determine the quality and authenticity of the reviews. One or two negative reviews are not a problem and are part of the web. However, if after typing the brand name into the search box you see nothing but negative reviews and dissatisfied customers, then this needs to be addressed first.

Check if the website has information about its authors (like author bios)

If the website has one author, then the information on the about page should do. However, if there are multiple authors, each author should have a bio in each of their articles. I am the only author of the content on SEOSLY, but I still add my bio at the end of my articles.

Check if the website has contact details (and contact me page)

Any trustworthy website must give you an option to contact its owner. Ideally, there should be a contact page listing all the possible ways to reach the owner: an e-mail address, a phone number, and a physical address. Some contact details may also be placed in the footer. If there is no way to contact the website's owner, then this is a red flag.

Check if the website has a Wikipedia page

Most websites do not have a Wikipedia page, and that's OK. It is extremely difficult to get one, but if the audited website (or its brand or author) has a Wikipedia page, then it has a decent level of E-A-T.