A Complete Google Search Console Guide For SEO Pros

Get to know Google Search Console and use its power to improve the health and search performance of your website.

Google Search Console provides the data needed to monitor site search performance and improve search rankings, information that is only available through Search Console.

This makes it essential for online businesses and publishers looking to maximize their success.

Controlling your search presence is easier when you use free tools and reports.

If you are interested and want to read up on this subject, I recommend checking my other post Top 5 Automotive SEO Best Practices For Driving Business In 2022

Table of Contents

What Is Google Search Console?

Google Search Console is a free web service hosted by Google that offers publishers and search engine marketers the ability to monitor the overall health and performance of their website in Google Search.

It provides an overview of metrics related to search performance and user experience to help publishers improve their sites and generate more traffic.

Search Console also provides a way for Google to communicate when it discovers security issues (such as hacking vulnerabilities) and when the Search Quality team has imposed a manual action penalty.

Important features:

Monitor indexing and crawling.

Identify and fix errors.

Overview of search performance.

Request indexing of updated pages.

Review internal and external links.

It is not necessary to use Search Console to rank better, nor is it a ranking factor.

However, the usefulness of Search Console makes it indispensable for improving search performance and driving more traffic to a website.

If you are interested and want to read up on this subject, I recommend checking my other post Top 12 Essential SEO Data Points For Any Website

How To Get Started

The first step in using Search Console is to verify website ownership.

Google offers different ways to perform website verification, depending on whether you are verifying a website, a domain, a Google site, or a site hosted by Blogger.

Domains registered with Google Domains are automatically verified when they are added to Search Console.

Most users verify their websites using one of four methods:

  1. HTML file upload.
  2. Meta tag.
  3. Google Analytics tracking code.
  4. Google Tag Manager.

Some website hosting platforms limit what can be uploaded, which restricts the verification methods available to site owners.

But that’s less and less of an issue, as many hosted site services offer an easy-to-follow verification process, covered below.

If you are interested and want to read up on this subject, I recommend checking my other post How to Increase The Traffic For YouTube Channels | June 2022 Update

How To Verify Site Ownership

There are two standard ways to verify site ownership with a regular website, like a standard WordPress site.

  1. HTML file upload.
  2. Meta tag.

When you verify a website using either of these two methods, you are creating what Google calls a URL prefix property.

Let’s stop here and acknowledge that the phrase “URL prefix properties” means absolutely nothing to anyone but the Googler who invented the phrase.

Don’t let this make you feel like you’re entering a maze blindfolded. Verifying a website with Google is easy.
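As a rough illustration of the meta tag method: the tag Google gives you goes inside the page’s head section, and you can sanity-check that it is present with a few lines of standard-library Python (the token below is a made-up placeholder, not a real verification code):

```python
from html.parser import HTMLParser

class VerificationTagFinder(HTMLParser):
    """Collects the content of any google-site-verification meta tag."""
    def __init__(self):
        super().__init__()
        self.tokens = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "google-site-verification":
            self.tokens.append(attrs.get("content"))

def find_verification_tokens(html: str) -> list:
    """Return all verification tokens found in the given HTML."""
    finder = VerificationTagFinder()
    finder.feed(html)
    return finder.tokens

# Hypothetical page head with a placeholder token:
sample = '<head><meta name="google-site-verification" content="EXAMPLE-TOKEN"></head>'
print(find_verification_tokens(sample))  # ['EXAMPLE-TOKEN']
```

If the token Google expects shows up in the list, the meta tag is in place and the Verify button in Search Console should succeed.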

HTML File Upload Method

Step 1 – Go to Search Console and open the Property Picker dropdown menu visible in the top left corner of every Search Console page.

[Screenshot: Search Console property selector dropdown]

Step 2 – In the “Select Property Type” pop-up window, enter the website URL and then click on the “Next” button.

[Screenshot: Select property type window]

Step 3 – Choose the HTML file upload method and download the HTML file.

Step 4 – Upload the HTML file to the root of your website.

Root means https://example.com/. So if the downloaded file is called verification.html, the uploaded file should be at https://example.com/verification.html.
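In other words, the file’s URL is just the site root plus the file name. A tiny sketch (the file name is a placeholder; yours will differ):

```python
from urllib.parse import urljoin

def verification_url(site_root: str, filename: str) -> str:
    """Build the URL where the uploaded verification file must be reachable."""
    # Ensure a trailing slash so urljoin appends rather than replaces.
    if not site_root.endswith("/"):
        site_root += "/"
    return urljoin(site_root, filename)

print(verification_url("https://example.com", "verification.html"))
# https://example.com/verification.html
```

If requesting that URL in a browser returns the downloaded file’s contents, the upload landed in the right place.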

Step 5 – Complete the verification process by clicking Verify again in Search Console.

Verifying a default website with your own domain on website platforms like Wix and Weebly is similar to the steps above, except that you add the verification meta tag through the platform’s settings.

Duda has a simple approach using a Search Console application that easily verifies the site and helps users get started.

If you are interested and want to read up on this subject, I recommend checking our other post Google Analytics & Search Console Data Never Match – And Here’s Why

Troubleshooting With GSC

Ranking in search results depends on Google’s ability to crawl and index web pages.

Search Console’s URL Inspection tool warns you of crawling and indexing problems before they become a bigger problem and pages disappear from search results.

URL Inspection Tool

The URL inspection tool indicates whether a URL can be indexed and displayed in a search result.

For each submitted URL a user can:

  1. Request indexing for a recently updated webpage.
  2. View how Google discovered the webpage (sitemaps and referring internal pages).
  3. View the last crawl date for a URL.
  4. Check if Google is using a declared canonical URL or is using another one.
  5. Check mobile usability status.
  6. Check enhancements like breadcrumbs.

Coverage

The Coverage section shows Discovery (how Google discovered the URL), Crawl (whether Google successfully crawled the URL and, if not, the reason why), and Enhancements (the status of any structured data).

The Coverage section is accessed from the menu on the left:

[Screenshot: Coverage section in the left-hand menu]

Coverage Error Reports

Although these reports are flagged as errors, that doesn’t necessarily mean there’s anything wrong. Sometimes it just means indexing can be improved.

For example, in the screenshot below, Google shows a 403 Forbidden Server response to nearly 6,000 URLs.

The 403 error response means that the server is telling Googlebot that it is forbidden from crawling these URLs.

[Screenshot: 403 server responses]

The above errors occur because Googlebot cannot crawl the member pages of a web forum.

Every member of the forum has a members page with a list of their latest posts and other statistics.

The report contains a list of the URLs that are causing the error.

To the right of each URL is a magnifying glass icon that opens a context menu, which also offers an option to Inspect URL.

[Screenshot: Inspect URL option]

It also shows the following data points:

Last crawl.

Crawled as.

Crawl allowed?

Page fetch (if failed, provides the server error code).

Indexing allowed?

There is also information about the canonical used by Google:

User-declared canonical.

Google-selected canonical.

For the forum website in the above example, the important diagnostic information is located in the Discovery section.

This section tells us which pages are showing Googlebot the links to the member profiles.

With this information, the publisher can now code a PHP statement that hides the links to the member pages when a search engine bot comes crawling.

Another way to fix the problem is to write a new entry in robots.txt to stop Google from attempting to crawl these pages.

By making this 403 error go away, we free up crawling resources for Googlebot to index the rest of the website.

Google Search Console’s Coverage report makes it possible to diagnose Googlebot crawling issues and fix them.
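For example, if the member profile pages all lived under a hypothetical /members/ path (the real path depends on the forum software), the robots.txt entry would look something like this:

```
User-agent: Googlebot
Disallow: /members/
```

Note that robots.txt blocks crawling, not indexing; for this use case, keeping Googlebot away from thousands of 403 pages, blocking the crawl is exactly what’s wanted.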

Fixing 404 Errors

The Coverage report can also alert a publisher to 404 and 500 series error responses, as well as communicate that everything is just fine.

A 404 server response is called an error only because the browser’s or crawler’s request for a page was made in error: the page does not exist.

It doesn’t mean that your site is in error. If some other site (or an internal link) links to a page that doesn’t exist, the Coverage report will show a 404 response.

Clicking on one of the affected URLs and choosing the Inspect URL tool will show which pages (or sitemaps) are linking to the non-existent page.

From there you can decide if the link is broken and needs to be fixed (in the case of an internal link) or redirected to the correct page (in the case of an external link from another website).

Or, it could be that the webpage never existed and whoever is linking to that page made a mistake.

If the page doesn’t exist anymore or it never existed at all, then it’s fine to show a 404 response.
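When the page has moved and an external site still links to the old URL, a server-side 301 redirect fixes it. An Apache .htaccess sketch with made-up paths:

```
# Permanently redirect the dead URL to its replacement
Redirect 301 /old-page/ https://example.com/new-page/
```

On other servers (Nginx, hosted platforms) the mechanism differs, but the idea is the same: send both the visitor and the crawler to the page that replaced the missing one.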

If you are interested and want to read up on this subject, I recommend checking our other post Google Update Core

Taking Advantage Of GSC Features

The Performance Report

The top part of the Search Console Performance Report provides multiple insights on how a site performs in search, including in search features like featured snippets.

There are four search types that can be explored in the Performance Report:

  1. Web.
  2. Image.
  3. Video.
  4. News.

Search Console shows the web search type by default.

Change which search type is displayed by clicking the Search Type button:

[Screenshot: Search Type button]

A pop-up menu will display, allowing you to choose which search type to view:

[Screenshot: Search Type menu]

A useful feature is the ability to compare the performance of two search types within the graph.

Four metrics are prominently displayed at the top of the Performance Report:

  1. Total Clicks.
  2. Total Impressions.
  3. Average CTR (click-through rate).
  4. Average position.

[Screenshot: Performance Report metrics]

By default, the Total Clicks and Total Impressions metrics are selected.

By clicking on the tabs associated with each metric, one can choose to display those metrics in the bar chart.
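The same four metrics are also available programmatically through the Search Analytics API. A minimal sketch of building the request body (the dates and dimension choices below are arbitrary examples, and the actual call requires OAuth credentials plus the google-api-python-client library):

```python
def build_search_analytics_query(start_date: str, end_date: str,
                                 dimensions=("query",), row_limit=1000) -> dict:
    """Build the JSON body for a Search Analytics API query.

    The response rows include clicks, impressions, ctr, and position --
    the same four metrics shown in the Performance Report.
    """
    return {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": list(dimensions),
        "rowLimit": row_limit,
    }

body = build_search_analytics_query("2022-01-01", "2022-01-31")
# With an authorized service object, this would be sent roughly as:
#   service.searchanalytics().query(siteUrl="https://example.com/", body=body).execute()
print(body["dimensions"])  # ['query']
```

This is handy when you need more than the report UI exposes, such as pulling query data for many date ranges in a loop.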

Impressions

Impressions are the number of times a website appears in search results. As long as a user doesn’t have to click a link to view the URL, it counts as an impression.

Even if a URL ranks at the bottom of the page and the user doesn’t scroll to that section of the search results, it still counts as an impression.

High impressions are good because they mean Google is showing the page in search results.

However, the Impressions metric only becomes meaningful when considered together with the Clicks and Average Position metrics.

Clicks

The Clicks metric shows how many times users clicked through to your website from search results. A high number of clicks in addition to a high number of impressions is good.

A low number of clicks and a high number of impressions is less good, but not bad. This means that the website may need to be improved in order to get more traffic.

The clicks metric is more meaningful when considered with the Average CTR and Average Position metrics.

Average CTR

Average CTR is the percentage of impressions that resulted in users clicking through to your website from search results.

A low CTR means something needs to be improved to increase visits from search results.

A higher CTR means the site is doing well.

This metric gains importance when viewed in conjunction with the Average Position metric.
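The arithmetic behind Average CTR is simple: clicks divided by impressions, expressed as a percentage. A sketch with made-up numbers:

```python
def average_ctr(clicks: int, impressions: int) -> float:
    """Click-through rate as a percentage of impressions."""
    if impressions == 0:
        return 0.0  # avoid division by zero for pages with no impressions
    return 100.0 * clicks / impressions

print(average_ctr(50, 1000))  # 5.0
```

So a page with 1,000 impressions and 50 clicks has a 5% CTR; whether that is good depends heavily on where the page ranks, which is why CTR is read alongside Average Position.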

Average Position

The Average Position metric shows the average position at which the website tends to appear in search results.

An Average in positions 1 to 10 is excellent.

An average position in the twenties (20-29) means that the website tends to appear on page two or three of the search results. That is not that bad. It just means the site needs some extra work to give it that extra boost into the top 10.

Average positions of 30 or greater can (generally) mean that the site could benefit from significant improvements.

Or it could be that the site ranks for many low-ranking keyword phrases and some very good keywords that rank exceptionally high.

In any case, it can mean dealing with the content more intensively. This can be an indication of a content gap in the site where the content ranking for certain keywords is not strong enough and you may need a dedicated page for that keyword set to rank better.

All four metrics (impressions, clicks, average CTR, and average position) taken together provide a meaningful overview of site performance.

What’s great about the performance report is that it’s a starting point for quickly understanding how your site is performing in search.

It’s like a mirror that reflects how well or poorly the site is doing.

Performance Report Dimensions

Scrolling down to the second part of the Performance page reveals several of what are called Dimensions of a website’s performance data.

There are six dimensions:

1. Queries: Shows the top search queries and the number of clicks and impressions associated with each keyword phrase.

2. Pages: Shows the top-performing web pages (plus clicks and impressions).

3. Countries: Top countries (plus clicks and impressions).

4. Devices: Shows the top devices, segmented into mobile, desktop, and tablet.

5. Search Appearance: This shows the different kinds of rich results that the site was displayed in. It also tells if Google displayed the site using Web Light results and video results, plus the associated clicks and impressions data. Web Light results are results that are optimized for very slow devices.

6. Dates: The dates tab organizes the clicks and impressions by date. The clicks and impressions can be sorted in descending or ascending order.

Keywords

Keywords appear in queries as one of the dimensions of the performance report (as mentioned above). The query report shows the top 1,000 search queries that generated traffic.

Of particular interest are low-performance queries.

Some of these queries show little traffic because they are infrequent, known as long-tail traffic.

But others are queries for pages that could be improved: maybe they need more internal links, or it could be a sign that the keyword phrase deserves a page of its own.

It’s always a good idea to check underperforming keywords as some of them can be quick wins which, when the issue is resolved, can result in a significant increase in traffic.
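One way to spot those quick wins is to filter exported query data for rows with plenty of impressions but a weak click-through rate. A sketch with made-up numbers and thresholds:

```python
def underperforming_queries(rows, min_impressions=500, max_ctr=1.0):
    """Return queries with high visibility but a low click-through rate.

    Each row is a dict with 'query', 'clicks', and 'impressions' keys,
    as exported from the Performance Report. max_ctr is a percentage.
    """
    out = []
    for row in rows:
        if row["impressions"] < min_impressions:
            continue  # too little data to judge
        ctr = 100.0 * row["clicks"] / row["impressions"]
        if ctr <= max_ctr:
            out.append(row["query"])
    return out

sample = [
    {"query": "blue widgets", "clicks": 3, "impressions": 1200},  # CTR 0.25%
    {"query": "red widgets", "clicks": 90, "impressions": 1000},  # CTR 9%
    {"query": "rare widgets", "clicks": 0, "impressions": 40},    # too few impressions
]
print(underperforming_queries(sample))  # ['blue widgets']
```

The thresholds here are arbitrary; tune them to your site’s typical CTR at each ranking position.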

Links

Search Console provides a list of all links pointing to the site.

However, it is important to note that the link report does not represent links that help the site to rank.

It simply reports all links pointing to the website.

This means that the list contains links that do not help the page rank. That explains why the report may show links with a nofollow link attribute.

The Links report is accessible from the bottom of the left-hand menu:

[Screenshot: Links report]

The Links report has two columns: External Links and Internal Links.

External Links are links from outside the website that point to the website.

Internal Links are links that originate within the website and link to somewhere else within the website.

The External links column has three reports:

  1. Top linked pages.
  2. Top linking sites.
  3. Top linking text.

The Internal Links report lists the Top Linked Pages.

Each report (Top linked pages, Top linking sites, etc.) has a More Results link that you can click to view an expanded version of that report.

For example, the expanded Top linked pages report shows the top target pages, which are the pages on your website that are linked to the most.

Clicking on a URL will change the report to show all external domains linking to that page.

The report shows the domain of the external site but not the exact page that links to the site.

Sitemaps

A sitemap is generally an XML file listing URLs that helps search engines discover the web pages and other types of content on a website.

Sitemaps are especially useful for large websites, sites that are difficult to crawl, and sites where new content is added frequently.

Crawling and indexing are not guaranteed. Things like page quality, overall site quality, and links can all affect a site’s crawling and page indexing.

Sitemaps make it easy for search engines to find these pages, and that’s it.
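A minimal sitemap file looks something like this (the URL and date are placeholders):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/sample-page/</loc>
    <lastmod>2022-06-01</lastmod>
  </url>
</urlset>
```

Each url entry lists one page; the optional lastmod date hints at when the page last changed.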

Creating a sitemap is easy as most are automatically generated by the CMS, plugins, or website platform on which the website is hosted.

Some hosted website platforms generate a sitemap for each website hosted on their service and automatically update the sitemap when the site changes.

Search Console provides a sitemap report and offers publishers the option to upload a sitemap.

To access this function click on the link located on the left-side menu.

[Screenshot: Sitemaps link in the left-side menu]

The sitemap section reports any errors with the sitemap.

Search Console can be used to remove a sitemap from the reports. However, it is important to also remove the sitemap from the website itself; otherwise, Google may remember it and revisit it.

After submission and processing, the Coverage report populates a sitemap section that helps troubleshoot issues with URLs submitted through sitemaps.

Search Console Page Experience Report

The page experience report offers data related to the user experience on the website relative to site speed.

Search Console displays information on Core Web Vitals and Mobile Usability.

This is a good starting place for getting an overall summary of site speed performance.

Rich Result Status Reports

Search Console provides feedback on rich search results through the performance report. It is one of the six dimensions listed under the chart at the top of the page, listed as Search Appearance.

Selecting the Search Appearance tab displays click and impression data for the different types of rich results shown in the search results.

This report highlights the importance of rich search result traffic to your site and can help identify the reason for specific site traffic trends.

The Search Appearance report can help diagnose problems related to structured data.

For example, a drop in traffic for rich search results could be a sign that Google has changed their structured data requirements and the structured data needs to be updated.

It’s a starting point for diagnosing a change in rich results traffic patterns.

If you are interested and want to read up on this subject, I recommend checking our other post Google Ads implements 3 Strike policy rules

Search Console Is Good For SEO

In addition to the above benefits, publishers and SEOs can also use Search Console to upload link disavow files, fix penalties (manual actions), and get alerts about security events like website hacking, all of which contribute to better search presence.

This is a valuable service that every web publisher concerned with visibility in search results should take advantage of.

If you are interested and want to read up on this subject, I recommend checking our other post The Most Profitable Blogging Niches in 2022
