This guide explains each feature of Google Search Console so beginners can use it properly and improve their website's search appearance and SEO.
Google Search Console, previously known as Google Webmaster Tools, is a free web service introduced by Google. Its purpose is to give webmasters insight into how their website appears in search results and what they need to do to improve its search appearance.
Google introduced a new version of Google Search Console in 2018. However, in this guide, we are going to use the old version.
So without further ado, let’s get started.
I am dividing this guide into multiple sections for ease of use. It may take you some time to work through.
- How to Set up Google Search Console?
- How to Verify Google Search Console Property?
- How to Share Google Search Console Access?
- How to Submit a Sitemap in Google Search Console?
- Google Search Console Features
- Search Appearance
- Search Traffic
- Google Index
- Crawl Status Report
- Security Issues
- Web Tools
How to Set up Google Search Console?
Step 1: Log in to Google Search Console. You will need a Google account in order to add a website property. Once you are logged in, click the ‘Add a Property’ button and type the URL of the site you want to verify.
Step 2: Make sure to enter the URL exactly as a user sees it in the browser’s address bar after any redirects. For example, in my case, I use HTTPS and have removed ‘www’ from the domain, so I have to enter my site address as ‘https://azibyaqoob.com/’.
Step 3: Clicking the ‘Add’ button will open the site verification page. Now, you need to select a verification method to verify that you own the website.
- HTML file upload
- HTML meta tag
- Google Analytics tracking code
How to Verify Google Search Console Property?
HTML file upload is the method Google recommends. Google provides a downloadable HTML file, which you need to upload to the root folder of your website. After that, clicking the verify button will verify the property.
Clicking on ‘Alternate methods’ will bring up the other verification options. If you select the HTML meta tag method, Google Search Console will provide an HTML meta tag containing a verification ID. You need to add this tag to the HEAD section of your website’s HTML.
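The verification tag Google gives you looks something like this; the `content` value below is just a placeholder for the unique ID you will receive:

```html
<head>
  <meta charset="utf-8" />
  <!-- Google Search Console verification tag; replace the content
       value with the ID shown in your own verification screen. -->
  <meta name="google-site-verification" content="YOUR_VERIFICATION_ID" />
  <title>Example Page</title>
</head>
```

The tag must appear on your homepage's HEAD section and stay in place, because Google periodically re-checks that it is still there.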
Google Analytics is an easier method if you have already verified your Google Analytics account for the same website. If you haven’t, check out this post on how to set up Google Analytics.
First, select Google Analytics as the verification method. After that, clicking the verify button will verify the property.
You may want to add new users to your Google Search Console property at some point. Let’s suppose you are hiring an SEO team to fix SEO issues. In that scenario, they will most likely ask you to share access to your Google Search Console property.
You can read my blog post on this topic to share Google Search Console access with anyone.
How to Submit a Sitemap in Google Search Console?
Submitting a sitemap in Google Search Console helps your website get crawled more frequently and new pages/posts indexed much faster.
I highly recommend submitting an XML sitemap as soon as your website is ready. I have recently written a blog post on how to submit an XML sitemap to Google, Bing, and Yahoo. Please check it out.
When you are done submitting a sitemap, it will take Googlebot some time to start indexing pages. After that, Google Search Console will start reporting useful information about your site.
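For illustration, a minimal XML sitemap looks something like this (the URLs are placeholders; a plugin or generator usually produces this file for you):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2018-06-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/about/</loc>
  </url>
</urlset>
```

Each `<url>` entry lists one page; `<lastmod>` is optional but helps crawlers spot updated content.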
Google Search Console Features
When data starts getting populated in your Google Search Console account, all this new information can be confusing. So, let’s go through each feature step by step.
The first page you will see on a verified property is the Dashboard. Here you can find an overview of your website: crawl errors, indexing issues, the search analytics graph, and a sitemap overview.
The Dashboard only gives you a high-level overview of website issues, but it is a handy place to check for recent errors when you are in a hurry.
This tab shows critical website errors such as indexing issues, critical sitemap errors, manual actions, site hacking warnings, and so on.
Google also usually notifies you by email when a message arrives in the message center. Whenever you receive a message here, act quickly.
This section provides information about how your website appears in search results. You can find out different aspects of your site search appearance in different reports.
If you are using structured data on your website, this report will notify you of structured data errors and show how many structured data items were indexed without problems, along with the data type of each top-level item.
If this page shows structured data errors, you should consider fixing them. The most common errors are missing fields. You can use Google’s Structured Data Testing Tool to find out which fields are missing, then add them manually or hire someone to get the job done.
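As a sketch of what such a fix looks like, here is a minimal JSON-LD block for an article; all values are placeholders, and the exact fields Google expects depend on the content type:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Use Google Search Console",
  "datePublished": "2018-05-10",
  "author": { "@type": "Person", "name": "Author Name" },
  "image": "https://example.com/cover.jpg"
}
</script>
```

If the report flagged a missing `image` or `author` field, adding the corresponding property to this block (and re-testing in the Structured Data Testing Tool) would typically clear the error.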
Rich cards were introduced by Google in 2016. They show search snippets in a more engaging visual format and provide a great overall experience for mobile users. Of course, rich cards require structured data markup.
If your website is not using any rich cards, this tab won’t show you anything. However, if you do use structured data markup supported by rich cards, you will see rich-card-related errors, possible enhancements, and cards that are working correctly and showing up as rich cards in search results.
The Data Highlighter is a helpful tool for tagging structured data on your pages, so Google can understand how a page is structured and index those changes on its next crawl.
You can easily use this tool to highlight structured data on a page. Once all the information is highlighted, press the publish button.
You can do this for a single page or create a page set, which is basically a collection of similar pages.
Google will take some time to index those new changes. After a while, you will start seeing new data in the Data Highlighter report.
The HTML Improvements report shows issues related to page titles, meta descriptions, and non-indexable content that Google found while crawling and indexing your site. It is quite an important page in Google Search Console, and every webmaster should check it regularly. Here are some examples of issues you are likely to see in the report:
- Short meta descriptions
- Duplicate meta descriptions
- Long meta descriptions
- Duplicate title tags
- Missing title tags
- Long title tags
- Short title tags
- Non-informative title tags
- Non-indexable content
Accelerated Mobile Pages (AMP)
When Google launched the mobile-first indexing update, site owners began looking for ways to deliver a faster mobile experience. If pages load slowly in a mobile browser, users can be discouraged from enjoying the content you have to offer. One solution is Accelerated Mobile Pages, or AMP.
AMP is an open-source project, born out of a collaboration between Google and Twitter, designed to make mobile pages load faster. It renders pages quickly by restricting HTML, CSS, and JavaScript to a streamlined subset suited to mobile pages.
According to Google, AMP cuts download time by 15% to 85%. SEOs and webmasters have found several instances suggesting that AMP has a big impact on mobile search rankings: AMP-optimized pages rank better and convert more mobile visitors into customers.
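To give a feel for what the restricted subset looks like, here is a minimal AMP page skeleton; the URLs are placeholders, and the required AMP boilerplate CSS is omitted for brevity:

```html
<!doctype html>
<html amp lang="en">
  <head>
    <meta charset="utf-8" />
    <!-- The AMP runtime script, required on every AMP page -->
    <script async src="https://cdn.ampproject.org/v0.js"></script>
    <!-- Points back to the regular (non-AMP) version of this page -->
    <link rel="canonical" href="https://example.com/article.html" />
    <meta name="viewport" content="width=device-width,minimum-scale=1" />
    <!-- Required AMP boilerplate CSS goes here (omitted for brevity) -->
  </head>
  <body>
    <h1>Hello, AMP</h1>
  </body>
</html>
```

Note the `amp` attribute on the `<html>` tag and the canonical link; both are part of what makes a page eligible as a valid AMP document.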
This section of Google Search Console provides information about how your website gets traffic from organic search, which websites link to you, and how Google sees your internal link structure. Let’s discuss each subsection of Search Traffic.
Search Analytics is the most important page in Google Search Console for growing your site traffic. It shows a lot of useful information about how your website is actually getting traffic from Google: clicks from particular queries, impressions, and CTR.
I am planning to write a separate post explaining each feature on this page and will link it here, so this guide doesn’t get excessively long.
Links to Your Site
The Links to Your Site report lists links that Googlebot discovered during its indexing process, as well as the pages on your site with the most links. Not all links to your site may be listed; that is completely normal, as Google keeps improving how it gathers and displays link data, so you may see changes in the number of links displayed for your site.
The report shows what Googlebot discovered during its crawling process. If you block a page with robots.txt, links to that page won’t be shown, and if Google finds a broken or invalid link on your site, that link won’t be listed either. I recommend reviewing the Crawl Errors page to check for 404 errors that Googlebot might have encountered while crawling your site.
An internal link is a hyperlink from one page on a website to another page or resource, such as an image or document, on the same website or domain. Internal links are commonly used in navigation.
They are useful because they allow users to navigate a website, help establish an information hierarchy, and help spread link equity (ranking power) around the site.
This report shows how your website’s internal pages are linked together and how many times each page is linked from other pages. If you want to remove a page from your website, check this report first to see how many times that page has been linked internally, and remove those links before deleting the page. Here’s a more detailed article on how to fix broken links.
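In HTML terms, an internal link is just an anchor whose target lives on the same domain; the paths here are made up for illustration:

```html
<!-- An internal link: the target page is on the same website -->
<p>Read our <a href="/guides/seo-basics/">SEO basics guide</a> for more.</p>

<!-- By contrast, an external link points to a different domain -->
<p>See also <a href="https://example.org/resource/">this external resource</a>.</p>
```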
Manual actions are Google’s way of demoting or removing web pages, or entire sites, from search results. A manual action is applied when a human reviewer determines that pages on your site are not compliant with Google’s webmaster quality guidelines. Google’s algorithms detect and demote spam automatically, but Google also uses human reviewers to manually review pages and flag them if they violate the guidelines.
If your site isn’t performing as well as it once did, check your Manual Actions report. Once you’re sure you have fixed the issues it reports, you can submit a reconsideration request to Google.
The International Targeting report in Google Search Console helps webmasters debug common issues with their hreflang implementation. This markup enables Google and other search engines to serve the correct language or regional version of a page to searchers.
The report highlights the two most common hreflang problems. The first is missing return links: hreflang annotations must be confirmed from the pages they point to, otherwise they may be interpreted incorrectly. The second is incorrect hreflang values: the value of the hreflang attribute must be either a language code or a combination of language and country codes. If Google’s indexing system detects codes that are not in the right format, it will provide example URLs to help you fix them. You can read more about the hreflang attribute and how it impacts SEO in my recent blog post.
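The "return links" requirement is easiest to see in markup. In this hypothetical example, both the English and German versions carry the same pair of annotations, so each page confirms the other:

```html
<!-- On https://example.com/page.html (English version) -->
<link rel="alternate" hreflang="en" href="https://example.com/page.html" />
<link rel="alternate" hreflang="de" href="https://example.com/de/page.html" />

<!-- On https://example.com/de/page.html (German version) — the return links -->
<link rel="alternate" hreflang="en" href="https://example.com/page.html" />
<link rel="alternate" hreflang="de" href="https://example.com/de/page.html" />
```

If the German page omitted these tags, the report would flag the English page's annotations as having no return links.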
Mobile usability refers to how user-friendly your website is on mobile devices such as smartphones and tablets.
This report shows mobile usability issues that Googlebot found while crawling particular pages on your website. You should fix them as soon as possible.
Mobile usability can play an important role in mobile SEO. Make sure you are using a responsive web design to avoid most of these errors.
This section provides useful information about your website’s indexing: how many pages Google has indexed, whether any URLs or resources are being blocked, and how to remove URLs from the index. Let’s explain each subsection in detail.
Index Status shows the total number of your URLs available in search results, along with other URLs Google may have discovered by other means. The number changes over time as pages are constantly added and removed.
A steady rise in the number of indexed pages indicates that Google is regularly crawling your content and that new pages of the website are getting indexed.
If your website’s URL structure has issues, make sure to fix it.
The URL Removal tool enables you to temporarily block pages of sites you own from Google search results.
Also, check out this post on how to remove negative search results from Google.
The tool can remove snippets, specific content, and even whole pages. However, removals are only temporary. If you want to remove a page from Google permanently, you should use other methods, such as a noindex meta tag or password protection.
Crawl Status Report
This section of Google Search Console helps you discover crawling-related issues. You can also submit a sitemap here. Let’s check out each subsection of the Crawl status report.
Search engines use a program called a “spider”, or bot, to scan or crawl your website. Everything a search engine knows about your website comes from crawling. It is the fundamental mechanism by which search engines interact with your site, and the crawl reports are important enough to be checked daily.
Crawl Errors are issues reported by Google as it crawls your website. Whenever you see an unexpected error reported here, you should consider fixing it as quickly as possible.
Sometimes the report draws attention to things that aren’t really errors, such as pages that have been deliberately retired and whose content was never meant to be reinstated. In cases like this, they can simply be left alone and Google will eventually stop reporting them; in other cases, a web developer may need to implement redirects to resolve the crawl errors.
The Crawl Stats report shows Googlebot activity on your site over the last 90 days. Blocked resources or broken HTML can decrease the crawl rate, while fresh, useful content increases it. Crawl rate depends on how fast and bot-friendly your site is, and it is a good indicator of how crawlable your site is.
Fetch as Google
Fetch as Google lets you test whether a page can be crawled by Google on both desktop and mobile. The purpose of this tool is to show how Google crawls and renders a particular page.
Also, if a page is working fine but not getting indexed, you can use ‘Request indexing’ after the fetch.
Note: you can only fetch 10 pages daily. This limit is set by Google, so use the tool only when it is important.
The robots.txt file is located in the root folder of your website. Its purpose is to give web crawlers instructions not to crawl particular pages on your website. Most search engine bots, including those of Google, Bing, and Yahoo, respect these instructions.
However, robots.txt can only give instructions; a blocked page can still get indexed in other ways, so you should not use robots.txt to hide pages. To hide a page from Google, use a noindex meta tag or protect the page with a password.
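To see how crawlers interpret these rules, here is a small sketch using Python's standard-library `urllib.robotparser`; the robots.txt content and URLs are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: block all crawlers from /private/,
# allow everything else.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Well-behaved bots like Googlebot would skip the blocked path...
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))
# ...but remain free to crawl the rest of the site.
print(parser.can_fetch("Googlebot", "https://example.com/blog/post.html"))
```

This also illustrates the limitation above: the parser only answers "may I crawl this?", it says nothing about indexing, which is why noindex or password protection is needed to truly hide a page.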
You should submit an XML sitemap to Google Search Console when you create a new website. The sitemap helps Google easily crawl all the important pages on your site.
In the sitemap report, you will get an overview of how many pages have been indexed by Google, along with any sitemap-related errors. If your website is new, keep an eye on the number of indexed pages.
You can follow this post on how to submit an XML sitemap to Google.
When you have duplicate pages on a website and you want Google to identify and treat them correctly, consider using the URL Parameters tool.
So when should you use this tool exactly?
Use it if your URLs include session IDs or tracking codes, or if you have similar pages with identical content.
With this tool, you can tell Googlebot how to treat identical content on your website.
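For example, these hypothetical URLs all serve essentially the same page, differing only in their parameters:

```text
https://example.com/shoes?sessionid=12345
https://example.com/shoes?sessionid=67890
https://example.com/shoes?sort=price
```

In the URL Parameters tool, you could indicate that `sessionid` does not change page content, so Googlebot need not crawl each variation as a separate page, while `sort` merely reorders the same content.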
Websites get flagged by Google as soon as it finds malicious code on them, and these issues are usually reported to webmasters in the Security Issues tab. Users are also likely to see a warning on your website, and if you don’t fix the issues soon, your website might get de-indexed from Google Search.
Note: I am not claiming to be a security expert, so if you want to make your website more secure, I strongly recommend checking out this guide.
In this section, you will find tools to check your site’s usability, ad experience, user experience, page speed, and so on.
Ad Experience Report
The Ad Experience Report identifies ad experiences on your site that violate the Better Ads Standards, a set of ad experiences the industry has identified as highly annoying, misleading, or abusive to users. If your site presents any of these, the report will help you identify and fix them.
Abusive Experience Report
An abusive experience is one designed to mislead website visitors. For example, automatic redirects that take visitors to a new page without any action on their part are abusive, as are ads designed to trick users into interacting with them.
The Abusive Experiences Report lists experiences on your site identified as misleading to visitors. By removing them, you can improve your visitors’ experience and give them a reason to return.
I think we have already covered these tools, but in this section you will see three testing tools:
- Structured Data Testing Tool
- Structured Data Markup Helper
- Email Markup Tester
The Structured Data Testing Tool lets you check whether Google is correctly parsing your structured data markup and, if not, what you can do to fix it.
If you have trouble writing structured data markup by hand, the Structured Data Markup Helper will help you easily mark up important pages on your website.
The Email Markup Tester checks the markup in your email’s HTML source code. Email markup lets users perform specific actions in apps such as Google Calendar, Google Search, Inbox, and Gmail.
Google Search Console is an essential tool for every webmaster whose traffic comes mainly from Google Search. I personally use it on all my websites to track errors and find ways to improve traffic.
I hope you have learned something new from this guide. If you haven’t or feel like there’s something still missing from the guide, please let me know in the comments.
If you don’t have time to deal with all the SEO-related issues on your website, you can always hire an SEO specialist.