Chapter 1. SEO Foundation

What Is a Website Audit and How To Do It?

In an earlier learning section, we covered how to develop an SEO plan, but detailed action plans should be defined through a website audit. Whether you do it yourself or hire a specialist, it is worth understanding how a website audit works. This learning section explains what a website audit is and how to do it with key SEO tools.

What Is a Website Audit?

A website audit in the SEO context is an assessment of an existing website used to identify the gaps between the current status and target status, improve search engine rankings, and increase organic search traffic.

Because a website audit aggregates the audits of individual web pages, it is usually time-consuming. If your website has many pages, consider using paid tools or hiring an SEO specialist who uses the proper tools.

Generally, a typical website audit covers the following four areas, although grouping can differ by case.

  • Overall Assessment
  • On-Page SEO Assessments
  • Technical SEO Assessments
  • Off-Page Assessments

Overall Assessment

The first thing you need to understand about your website's current status is its general performance, including organic traffic, keyword rankings, indexed pages, and user-related metrics. These key performance indicators (KPIs) may fall outside the narrow scope of a website audit, but monitoring them is critical because they are directly linked to your business goals.

Organic search traffic generation funnel – Google Search Console

Usually, maximizing organic search traffic is the objective of SEO. To achieve the goal, you need to know the current volume of organic search traffic and your web pages' readiness to generate organic search traffic.

There are four key stages of generating a high volume of organic search traffic.

  1. Pages are indexed (# of indexed pages)
  2. Pages are ranked for particular keywords (# of keywords and pages ranked)
  3. Pages are ranked high (e.g., # of pages ranked within the top 10)
  4. Pages are gaining traffic from SERPs (# of Clicks)

This is an illustration of the four stages.

Four Stages to Generate Organic Search Traffic

For stages #3 and #4, you can also check the following metrics as interim indicators.

  • # of impressions
  • CTR (Click Through Rate)

Most of the data above is available in Google Search Console. For example, you can check the indexing status in the Pages section.

Google Search Console Example for Page Indexing

Under the Performance section, you can get the # of clicks, # of impressions, average CTR, and average position.

Google Search Console Example for Performance Monitoring

You can also see keywords (Queries) and pages ranked in the specified timeframe on the Search Console.

User lifecycle status – Google Analytics

To understand a website's overall performance, you need to know how your website is acquiring, engaging, and retaining users. User experience metrics are also crucial for improving page rankings on SERPs and monitoring the achievement of your business goals.

Google Analytics can be used to check user lifecycle status. Since Google Analytics has many features that warrant detailed explanation, we cover them in a separate tutorial.

Google Analytics Report Example for Average Engagement Time

Domain and Page Authority - MozBar

Domain Authority (DA) and Page Authority (PA) are search engine ranking scores developed by Moz, and the concept is widely used in the SEO professional community.

Google may not calculate DA in the same way as SEO tool service providers; however, the metrics are useful for checking how likely a page and domain are to rank on SERPs.

The higher your domain or page authority score (on a scale up to 100) relative to competing pages and domains, the more likely your page is to rank well.

There are several ways to check DA and PA, but the quickest is MozBar. Install the MozBar Chrome extension and run it on the page where you want to check the score.

MozBar UI Example on Each Web Page

Competitor Analysis

Traffic and user metrics comparison

You should use paid tools to conduct robust competitor analysis for traffic and user metrics. For example, Semrush and Similarweb provide detailed competitor traffic and user-related data, such as average visit duration and bounce rate.

Semrush Competitor Traffic Data

If you want to check traffic data with a free tool, you can use Ahrefs' Website Traffic Checker.

ahrefs Website Traffic Checker Landing Page

You can see traffic data by entering the domain you want to check.

ahrefs Website Traffic Checker Output Example

DA and PA

Using MozBar, you can also check the Domain Authority (DA) and Page Authority (PA) of competitors' pages. Run the MozBar Chrome extension on the SERP of your target keyword.

MozBar UI Example on the SERP

On-Page SEO Audit

The objectives of on-page SEO are to optimize web pages, improve a website's visibility in search engines, and attract target users. Some SEO tools do not separate on-page and technical audits; in this tutorial, on-page SEO covers the aspects mainly related to content and keywords.

As on-page SEO includes optimizing the content itself, it can require considerable effort. In the on-page audit, first focus on the minimum requirements that can be checked systematically. Content-related aspects, such as keyword optimization, are covered in detail in Chapter 2.

Below are the basic checkpoints you need to assess.

  • Missing tags and character length
  • Links with anchor tag
  • URL structure
  • Image file name and alt (alternative text)
  • Keyword Optimization
  • Content duplication
  • UI/UX

Below, we explain how to assess each of these checkpoints with SEO tools.

Missing tags and character length

As search engines understand the meaning of web pages through HTML tags, it is important to use them appropriately. The following are the key HTML tags from an SEO point of view.

Meta Title

Each page should have one meta title. The meta title is defined by the <title> tag in the <head> section of the HTML document.

Several sources suggest an optimal meta title length of 50-60 characters, since the meta title is expected to be displayed on SERPs and longer titles may be truncated.
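
As a reference, here is a minimal sketch of a meta title within that range (the page name and brand are illustrative):

Meta Title Example
<head>
  <title>What Is a Website Audit and How To Do It? | ExampleSite</title>
</head>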

Meta Description

Each page should have one meta description. The meta description is defined by the <meta name="description"> tag in the <head> section in the HTML document.

Several sources suggest an optimal meta description length of 150-160 characters, since the meta description is expected to be displayed on SERPs.
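
Along the same lines, a minimal sketch of a meta description within the suggested range (the wording is illustrative):

Meta Description Example
<head>
  <meta name="description" content="Learn what a website audit is and how to conduct one with key SEO tools, covering on-page, technical, and off-page checks to improve your search rankings.">
</head>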

H1 tag

Unlike other heading tags (e.g., h2 and h3 tags), each page should have only one H1 tag.

The H1 tag is usually used as the page's displayed title. Unlike the meta title, the phrase in the H1 tag is visible on the web page itself.
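
For illustration, a minimal sketch of an H1 tag (the heading text is illustrative):

H1 Tag Example
<body>
  <h1>What Is a Website Audit and How To Do It?</h1>
</body>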

Other meta tags

From a technical SEO point of view, each page should also include other tags in the <head>, such as the canonical tag and the meta robots tag. These are explained in the technical SEO section.

Links with anchor tag

Having a sufficient number of internal and external links is essential in SEO. As search engines recognize links through the <a> tag, be careful when creating links with JavaScript.

Properly written anchor text is also important to deliver key information to crawlers.

You also need to make sure that there are no broken links.
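
To illustrate the points above, here is a minimal sketch contrasting a crawlable link with a JavaScript-only link (URLs and anchor text are illustrative):

Anchor Tag Example
<!-- Crawlable: a standard <a> tag with descriptive anchor text -->
<a href="https://example.com/website-audit-guide">website audit guide</a>

<!-- Risky: no <a> tag or href attribute, so crawlers may not treat it as a link -->
<span onclick="location.href='/website-audit-guide'">website audit guide</span>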

SEOquake

You can use the SEOquake Chrome extension to check how your page implements tags and links. Go to the Chrome Web Store and add the extension to Chrome.

Run the extension on the page you want to assess. For example, under the DIAGNOSIS tab, you can see the status of tags on the page.

SEOquake Diagnosis Example

You can also check your link status under the INTERNAL and EXTERNAL tabs.

SEOquake Internal Link Check Example

URL structure

Google recommends using simple and descriptive words in URLs (Google Search Central - URL structure best practices for Google).

As a word separator, a hyphen (-) is recommended. Search engines may misread word boundaries if you use an underscore (_).
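
For illustration, a hypothetical comparison:

URL Structure Example
# Recommended: simple, descriptive, hyphen-separated
https://example.com/blog/website-audit-guide

# Not recommended: underscores obscure word boundaries
https://example.com/blog/website_audit_guide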

Image file name and alt

Alt: As crawlers cannot interpret images the way humans do, the alt attribute is essential for providing a descriptive explanation of the image.

File name: The image's file name also matters for delivering information to crawlers. Use simple and descriptive words, as file names become part of URLs. If you need an ID to manage image files, add it at the end of the file name, not at the beginning.

Title: Adding the title attribute helps improve usability, but the alt attribute is more important for conveying key information to crawlers.
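
Putting these points together, a minimal sketch (the file name and attribute text are illustrative):

Image Markup Example
<img src="/images/website-audit-process-001.png"
     alt="Four-stage process of generating organic search traffic"
     title="Website audit process">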

SEO META in 1CLICK

You can use the SEO META in 1 CLICK Chrome extension to check how your page implements images.

Go to the Chrome Web Store and add the extension to Chrome.

Run the extension on the page you want to assess. Under the Images tab, you can see the status of the image tags on the page.

SEO META in 1CLICK Image Analysis

This tool also provides other helpful information, such as header structure and OGP (Open Graph Protocol) implementations.

Keyword Optimization

Keyword optimization is one of the most critical aspects of on-page SEO. You need to include target keywords in specific locations so that crawlers can effectively assess the page's relevance to a user's query. To learn about keyword optimization, see Chapter 2.

UI/UX

UI/UX is a broad theme in website design, and it has many implications for SEO. For example, navigation design is important to usability. Generally, adding a sitemap, breadcrumbs, and navigation bars (top, side, or bottom) is recommended, as sketched below.
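
For illustration, a minimal breadcrumb sketch in HTML (the structure and labels are illustrative):

Breadcrumb Navigation Example
<nav aria-label="Breadcrumb">
  <ol>
    <li><a href="/">Home</a></li>
    <li><a href="/blog/">Blog</a></li>
    <li>Website Audit Guide</li>
  </ol>
</nav>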

Technical SEO Audit

The coverage of technical SEO versus on-page SEO varies by source, as the two are interlinked. In this tutorial, we cover the following three areas of technical SEO.

  1. Indexability
  2. Page experience issues (primarily speed, mobile-friendliness, and security)
  3. Implementation of schema markups (structured data, rich results)

Indexability

Indexing is often the first challenge for new websites. You can check the number of indexed pages on your website using Google Search Console.

You can also use Google Search itself to check which pages are indexed under a particular domain by typing site:DOMAIN (e.g., site:example.com) in the search box. As information in Google Search Console may not be up to date, this approach lets you check the latest indexing status.

Checking indexability takes more than knowing how many pages are indexed. Pages must be crawlable and valuable enough for search engines to index them.

Several factors accelerate or decelerate page indexing. In this section, we'll explain the main factors at a high level. Chapter 3 will cover the actual implementation guide.

Sitemap.xml

Sitemap.xml is a sitemap for crawlers. It should contain all the web page URLs that need to be indexed.

If the website does not have many pages (e.g., fewer than 100), you may not need to implement sitemap.xml. However, if your website has many pages, having a sitemap.xml improves its crawlability.

The file should be placed in the root directory of the website. To signal the existence of sitemap.xml to search engines, you can submit the URL of the sitemap on Google Search Console.

Sitemap.xml Example
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2024-02-20</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
    :
</urlset>

Robots.txt

Robots.txt is a text file that instructs robots (crawlers) which pages should or shouldn't be crawled. The file should be located at the root of the website. Unlike the meta robots tag, which works at the page level, robots.txt instructions apply to the entire website. As robots.txt can block crawlers from crawling particular pages, you need to check that its instructions are written properly.

Robots.txt Example
# robots.txt for http://www.apple.com/
User-agent: *
Disallow: /*/includes/*
Disallow: /*retail/availability*
Disallow: /*retail/availabilitySearch*

  :
Sitemap: https://www.apple.com/shop/sitemap.xml
Sitemap: https://www.apple.com/autopush/sitemap/sitemap-index.xml
Sitemap: https://www.apple.com/newsroom/sitemap.xml
Sitemap: https://www.apple.com/retail/sitemap/sitemap.xml
Sitemap: https://www.apple.com/today/sitemap.xml

Noindex and nofollow (meta robots tag)

If you don't want a specific page indexed, you can add instructions to the page using the meta robots tag. The noindex directive is often used together with the nofollow directive. The noindex directive tells search engines not to include the page in their search results. The nofollow directive tells search engines not to follow the page's links or pass any link value (link juice) to other pages.

As the noindex directive can keep particular pages out of search results, you need to check that it is used properly.

Noindex and Nofollow Example
<meta name="robots" content="noindex, nofollow">

Duplicate pages and canonical tag

Search engines don't want to index pages with duplicate content; such content can confuse them and may even trigger a penalty.

To tell search engines which version of a page is preferred, and to avoid duplicate-content penalties, use the canonical tag.

It is good practice to use the canonical tag on all indexed pages. You also need to ensure that the canonical URLs match those in sitemap.xml.

Search engines treat HTTP vs. HTTPS URLs and www vs. non-www URLs as different URLs, so you should add the canonical tag even on the homepage.

The snippet below shows an example of a canonical tag.

Canonical tag example
<link rel="canonical" href="https://example.com/example-page">

404 error and redirection

A 404 error occurs when a page cannot be found, which often happens after you change a web page's URL.

If links to the page or URLs in sitemap.xml still point to the old version, crawlers may encounter 404 errors.

404 Not Found Example

If you change a web page's URL, you need to set up redirects (such as 301 redirects) to send users and crawlers from the originally requested URL to the new one.
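
As a reference, here is a minimal sketch assuming an Apache server with an .htaccess file (the paths are hypothetical):

301 Redirect Example (.htaccess)
# Permanently redirect the old URL to the new one
Redirect 301 /old-page https://example.com/new-page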

Site structure and internal links

As crawlers crawl web pages through hyperlinks, a well-structured website with proper internal links helps them crawl the site efficiently.

If you have addressed the points above but your pages are still not indexed, the website may have structure or internal link issues. Be especially careful when using JavaScript to build your website.

As explained in the on-page SEO assessment section, if you are not using <a> tags, crawlers may not properly recognize your links. You can easily check link status using Chrome extensions such as SEOquake and SEO META in 1 CLICK.

To understand how to manage indexing better, go to this tutorial: "Index Your Website - How To Fix Page Indexing Issues?".

Page performance

As Google emphasizes that page experience is an important ranking factor, improving website performance is a critical, and growing, part of SEO.

Core Web Vitals

Google introduced Core Web Vitals as a set of specific factors for assessing a webpage's overall user experience in terms of loading speed, interactivity, and visual stability. Core Web Vitals include:

  • Largest Contentful Paint (LCP): Measures loading performance. To provide a good user experience, LCP should occur within 2.5 seconds of when the page first starts to load.
  • Interaction to Next Paint (INP): Measures interactivity, the responsiveness of a web page, by quantifying the delay between user interactions (such as clicks, taps, or key presses) and the page's visual response. INP replaced First Input Delay (FID) as a Core Web Vital on March 12, 2024.
  • Cumulative Layout Shift (CLS): Measures visual stability. To ensure a good user experience, pages should maintain a CLS of 0.1 or less. CLS quantifies how often users experience unexpected layout shifts—a low CLS ensures that the page is stable as it loads.

We'll explain these metrics in "How To Improve Website Performance (Page Experience)".

Google Search Console shows how many pages on a website perform poorly, need improvement, or are good. To get more insights on Core Web Vitals, you should use Lighthouse or PageSpeed Insights.

Lighthouse and PageSpeed Insights

Both Lighthouse and PageSpeed Insights are tools for assessing page performance. Although their assessment coverage and user interfaces are different, they can be used for the same purposes.

Lighthouse is available as a Chrome extension and in Chrome Developer Tools, while PageSpeed Insights is accessible through a dedicated web page.

PageSpeed Insights Report Example

GTmetrix

If you need more detailed insights, GTmetrix is a good tool. There is a quota limitation, but you can use it for free after signing up.

GTmetrix Report Example

Mobile-friendliness

Google prioritizes mobile-friendly sites in its search rankings (mobile-first indexing). Check Google Search Central - Mobile site and mobile-first indexing best practices.

From an audit point of view, key checkpoints are whether the page applies responsive design and whether the content is clearly visible on small devices.

Responsive Design Illustration

HTTPS

For security reasons, using HTTPS is a must when launching a website.

You can check the HTTPS status of your website using Google Search Console.

To use HTTPS, you need to set up SSL (Secure Sockets Layer). To learn how, refer to SSL Setup.

HTTPS Illustration

Schema markups (Google Search Console)

Schema markup is code in a particular format (Structured Data) that helps search engines understand the meaning of the information and allows them to return more informative results for users (Rich Results).

Implementing schema markup is not the first priority if you haven't covered the SEO basics; however, it is a powerful tool for attracting users on SERPs and improving CTR.

From the technical SEO assessment point of view, you can check how many pages have schema markup and whether it is implemented appropriately. Google also provides the Rich Results Test to verify whether web pages are eligible for rich results in search.
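
As a reference, here is a minimal JSON-LD sketch for an article page (the values are illustrative):

Schema Markup Example (JSON-LD)
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "What Is a Website Audit and How To Do It?",
  "datePublished": "2024-02-20",
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  }
}
</script>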

To learn more about schema markup, go to this tutorial: "What Is Schema in SEO? | Structured Data and Rich Results".

Off-Page SEO Audit

An off-page SEO audit mainly involves assessing backlink quality and quantity. Social and brand signals are also off-page factors, but they may require a tailored assessment approach.

Backlinks

The quickest way to check backlinks is Google Search Console. In the Links section, you can see the number of backlinks; however, it doesn't provide quality information.

If you want a comprehensive backlink audit, consider using a paid tool such as Semrush, Ahrefs, Moz Pro, or Majestic.

These tools usually provide an analysis of the backlink profile, including the quantity and quality of links (e.g., toxicity level).

Semrush Backlink Audit Example

To learn more about backlinks, go to this tutorial: "What Are Backlinks in SEO and How To Build Them?".

Social Media Marketing

Although you may not conduct a social media assessment or brand recognition survey, you can check how your website implements some social media features.

For example, OGP (Open Graph Protocol) can help promote your web pages with short descriptions and images on social media such as Facebook, LinkedIn, and X (Twitter).

OGP is implemented through og: meta tags (with twitter:card meta tags for X). These can also be considered part of on-page or technical SEO.
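
For illustration, a minimal sketch of OGP tags in the <head> (the values are illustrative):

OGP Meta Tags Example
<head>
  <meta property="og:title" content="What Is a Website Audit and How To Do It?">
  <meta property="og:description" content="Learn how to audit a website with key SEO tools.">
  <meta property="og:image" content="https://example.com/images/audit-cover.png">
  <meta property="og:url" content="https://example.com/website-audit-guide">
  <meta name="twitter:card" content="summary_large_image">
</head>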

WhatsApp OGP (Open Graph Protocol) Example

To learn more about Social Media Marketing, go to this tutorial: "SEO and Social Media Marketing - What Are Social Signals?".

Conclusion

There are many audit points, and you may feel overwhelmed. If you want to save time, consider hiring an SEO specialist or using paid tools. Even then, you still need basic knowledge to get the most out of an SEO audit.

