How to Save and Analyse Data from Search Console in Google Sheets

We at Netpeak have a strong department of SEO specialists who successfully automate complex tasks. The approaches and techniques we have developed and implemented in our workflow have significantly improved our speed and effectiveness. In this article, we share one of our most valuable tools for downloading and analysing Search Console data. It will be useful to SEO specialists of all kinds.

This tool makes Search Console data analysis easier. Once you set up the technique we've developed, you will be able to download data directly into a Google spreadsheet, analyse the effectiveness of brand and non-brand queries, see how query CTR correlates with search position (data can be grouped by device, query type, and site category), check crawl errors, and mark them as fixed. The tool is a script built on the Google Apps Script platform, which we have been using for work automation for over a year. It is very convenient and easy to configure, but it does have quotas, so it is best suited for simple projects like this one.

Setting up the script

First, copy the spreadsheet with the script to your Google Drive: click ‘File’ -> ‘Make a copy…’. Then set up the script. The setup procedure is adapted from a tutorial for the Blogger platform.

Steps for OAuth2 setup

1. Go to ‘Tools’ -> ‘Script editor…’ in the menu.

2. Enable Search Console API in Your Account

To interact with the Search Console API, you should enable it in the Google Developers Console project associated with your script.

2.1 Go to ‘Resources’ -> ‘Developers Console Project…’ and click the link in the dialogue.

2.2 Open the ‘API Manager’ page.

2.3 Type ‘Search Console API’ in the search box, find the Search Console API in the search results, click its name, and then press the ‘Enable’ button.

The Search Console API has its own quota of 100,000,000 requests per day. This is more than enough, so don't be afraid to run the script all day long.

By default, the OAuth 2.0 client IDs list should contain ‘Apps Script’. Click on ‘Apps Script’. For the OAuth2 library, we need to specify an ‘Authorized redirect URI’ that matches our script project.

The URL you need to enter is:

https://script.google.com/macros/d/{PROJECT KEY}/usercallback

Here {PROJECT KEY} is the key of your script project; you can find it in the Apps Script URL.
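To make the pattern concrete, here is a small helper that assembles the redirect URI from a project key. The function name is hypothetical; only the URL pattern comes from the text above.

```javascript
// Hypothetical helper: builds the 'Authorized redirect URIs' value
// for the OAuth2 library from an Apps Script project key.
function buildRedirectUri(projectKey) {
  return 'https://script.google.com/macros/d/' + projectKey + '/usercallback';
}
```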

Copy the ‘Client ID’ and ‘Client Secret’ to Notepad (you will need them at the next stage), then click ‘Save’.

3. Set Up the OAuth 2.0 Service in the Script

Now set up the service in the Apps Script code project. In the ‘Variables.gs’ file, replace the ‘YOUR_CLIENT_ID’ and ‘YOUR_CLIENT_SECRET’ values with your own client ID and secret from your Google Developers Console project.
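A minimal sketch of what ‘Variables.gs’ might look like, assuming the credentials are stored as plain top-level constants (the `getOAuthConfig` helper and the exact OAuth endpoints are assumptions; the real file may be organised differently):

```javascript
// Variables.gs (sketch): paste your own values from the
// Google Developers Console project here.
var CLIENT_ID = 'YOUR_CLIENT_ID';
var CLIENT_SECRET = 'YOUR_CLIENT_SECRET';

// Returns the OAuth2 settings the rest of the script could read.
function getOAuthConfig() {
  return {
    clientId: CLIENT_ID,
    clientSecret: CLIENT_SECRET,
    // Scope granting read/write access to Search Console data.
    scope: 'https://www.googleapis.com/auth/webmasters'
  };
}
```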

4. Authenticate user

You no longer need the script editor; close it and work in the spreadsheet.

4.1 Run ‘Search Console’ -> ‘List Account Sites’ in the menu. It will prompt you for authorisation.

Click ‘Continue’ and then ‘Allow’.

4.2 If everything is ok, you’ll see the following message box:

Copy the link from the message box and open it in your browser (if you don't see any message, go back to step 3 and check your setup).

4.3 Allow access to your Search Console account. Opening the copied link brings up a new authorisation window. Click ‘Allow’, then close the page and return to the spreadsheet. Your script now has access to your Search Console account. Rerun ‘Search Console’ -> ‘List Account Sites’ in the menu and you will see all your sites in the ‘Sites’ sheet.

Using the API Methods

We use this script to analyse the most frequently checked Search Console data: Search Analytics and Crawl Errors. You have probably heard of the Supermetrics add-on, which can also download Search Analytics data. Our tool cannot compete with that service in versatility, but it can download crawl errors and aggregate data for easier analysis.

1. List sites from your account

You can list all the sites in your account along with their access levels.

2. Download Search Analytics data

You can download Search Analytics data for a defined period. Choose the site and dates, then click ‘Search Console’ -> ‘Search Analytics’ -> ‘Get Search Analytics data’ in the menu.
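Behind this menu item, the script queries the Search Console (webmasters v3) Search Analytics endpoint. A sketch of how such a request could be assembled follows; the helper name, the chosen dimensions, and the row limit are assumptions, not the script's actual code:

```javascript
// Hypothetical sketch: builds the endpoint URL and POST payload for the
// webmasters v3 searchAnalytics.query method.
function buildSearchAnalyticsRequest(siteUrl, startDate, endDate) {
  var endpoint = 'https://www.googleapis.com/webmasters/v3/sites/' +
      encodeURIComponent(siteUrl) + '/searchAnalytics/query';
  var payload = {
    startDate: startDate,                    // 'YYYY-MM-DD'
    endDate: endDate,                        // 'YYYY-MM-DD'
    dimensions: ['query', 'page', 'device'], // rows grouped by these
    rowLimit: 5000
  };
  return { endpoint: endpoint, payload: payload };
}
```

In Apps Script this request would then be sent with `UrlFetchApp.fetch` using the OAuth2 access token.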

2.1 Define brand/nonbrand queries

You can define whether the downloaded queries are branded or not. To do that, paste comma-separated (without spaces) variations of your brand into cell B4. The script will check whether each query contains any of these variations. For example, for netpeak.net/blog/ the brand variations might be: netpeak, netpak, netpek. The ‘Brand Analysis’ sheet then shows what share of impressions and clicks branded queries receive.
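The brand check described above amounts to a simple substring test. A minimal sketch, assuming a case-insensitive match against the comma-separated variations (the function name is hypothetical):

```javascript
// Returns true if the query contains any of the comma-separated
// brand variations (case-insensitive substring match).
function isBrandQuery(query, variationsCsv) {
  var variations = variationsCsv.split(',');
  return variations.some(function (v) {
    return v !== '' && query.toLowerCase().indexOf(v.toLowerCase()) !== -1;
  });
}
```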

2.2 Analyse Impressions and Clicks by URL

The ‘Landing page’ sheet contains Impressions and Clicks aggregated by URL, so you can see how many clicks and impressions a particular URL receives.

2.3 Analyse Correlation of CTR and Average Search Position

The ‘CTR’ sheet contains a tool for analysing the correlation between CTR and average search position. Position values are rounded to the nearest tenth to reduce the data volume; you can change the rounding in the ‘Round Position’ column of the ‘Search Analytics’ sheet. You can group data by device or query type by changing the values in cells E2 or F2 respectively.
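The rounding step can be sketched as a one-liner; this is an illustration of rounding to the nearest tenth as described above, not the script's actual formula:

```javascript
// Rounds an average search position to one decimal place,
// so nearby positions collapse into the same bucket.
function roundPosition(position) {
  return Math.round(position * 10) / 10;
}
```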

If you want to group data by site categories, specify the values in cell G2 (category names are listed in column N, URLs in column O). The sheet includes an example with netpeak.net/blog/ categories. As a result, you will get the following chart.

chart

The chart above shows that some queries achieve a high CTR even at low search positions. You can analyse the snippets of such queries and get ideas for improving the other snippets on your site.

3. Download Crawl Errors

The URL Crawl Errors Samples listing method lets you compile a list of errors of a defined type and platform. Choose the site, category, and platform, then run ‘Search Console’ -> ‘Crawl Errors’ -> ‘List Crawl Errors’ from the menu.
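Under the hood, this listing maps onto the webmasters v3 urlCrawlErrorsSamples endpoint. A sketch of the request URL it could build (the helper name and the example parameter values, such as the 'notFound' category and 'web' platform, are assumptions):

```javascript
// Hypothetical sketch: builds the GET URL for listing crawl error
// samples of a given category and platform for one site.
function buildCrawlErrorsUrl(siteUrl, category, platform) {
  return 'https://www.googleapis.com/webmasters/v3/sites/' +
      encodeURIComponent(siteUrl) + '/urlCrawlErrorsSamples' +
      '?category=' + category + '&platform=' + platform;
}
```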

3.1 Get Crawl Errors Details for URLs

If you want detailed information about the URLs you've downloaded, run ‘Search Console’ -> ‘Crawl Errors’ -> ‘Get Crawl Errors Details’. The ‘Crawl Samples by URL’ sheet will list your URLs and the pages where Google found them.

You can use this to collect all the pages that link to problem URLs, as found by the Google search engine.

3.2 Check Whether a Problematic URL Is Still Present on the ‘Linked from’ Page

Crawl errors data is often out of date: a URL with a crawl error may no longer appear on its ‘Linked from’ page. To check this, run ‘Search Console’ -> ‘Crawl Errors’ -> ‘Check Links Presense’. You'll get the following result:

‘Yes’ - the page still links to the URL with the crawl problem; ‘No’ - the page no longer contains the link; ‘Error’ - the ‘Linked from’ page itself returns a “404”, “500”, or another error response code.

Note! The script has a 6-minute execution time quota, which is enough to check about 100 URLs. If you need to check more links, use a more powerful app or script.
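The core of the presence check can be sketched as a substring test on the fetched page HTML (in Apps Script the page would be fetched with `UrlFetchApp.fetch`; the function name here is hypothetical, and the real script likely also handles the ‘Error’ case for failed fetches):

```javascript
// Simplified sketch: given the HTML of a 'Linked from' page, report
// whether it still references the broken URL.
function checkLinkPresence(pageHtml, brokenUrl) {
  return pageHtml.indexOf(brokenUrl) !== -1 ? 'Yes' : 'No';
}
```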

3.3 Mark URL as Fixed

If a problematic URL has been fixed, you can report this to Search Console: choose ‘Yes’ in column E of the ‘Crawl Samples’ sheet, then press ‘Search Console’ -> ‘Crawl Errors’ -> ‘Mark as Fixed’.

If everything is OK, you will receive a notification confirming that the URLs have been successfully deleted.

In this article, we have walked through steps that significantly simplify working with Search Console data. The tool will help you analyse the effectiveness of brand and non-brand queries, and easily check how query CTR correlates with average search position; the chart data will give you ideas for improving snippets. Moreover, you can download crawl errors, check whether they are still relevant, and mark them as fixed in a few clicks, keeping your Crawl Errors data up to date. If you have found this script useful, please share it with your colleagues. We will be glad to receive feedback about bugs and features in the comments.

Comments (10)

  1. 0
    11 days ago

    Hi, super script, thanks. 

    One question, how could I add brand query with space? eg. coca cola.

    Any ideas? Thanks.

  2. 0
    15 days ago

    Nice article, congratulations! But, if you are not technically savvy (or don't have time to set everything up), you can use tools, such as SearchConsoleHelper.com.

    It connects to Google Search Console automatically and can retrieve up to 5000 rows per request and repeat the requests 200 per minute.

    It also has a much better user interface than the original Search Console, offers advanced filtering options and allows you to store historical data (past the 90 days you get from Google).

  3. 0
    3 months ago

    after the step 4.2 im getting the following error any idea where im getting wrong

    400. That’s an error.

    Error: redirect_uri_mismatch

    Application: Search Console

    You can email the developer of this application at:

  4. 0
    3 months ago

    Hi there. Great tutorial but I was under the impression it was possible to extract more than 1,000 crawl errors via the GOOGLE API based method. I've tried to edit the underlying script to toggle the appropriate 1000 values to 10000 but I am still only able to extract 1000 crawl errors. Any ideas guys?

  5. 0
    5 months ago

    Hi - this looks amazing, and very useful. I have the list of sites, but can't get the search analytics to work. I have entered a URL of a site from my list in the 'Site' section of the 'Search Analytics' page, but when I click Search Console > Search Analytics > Get Search Analytics data, I get the following error:

    Недопустимые координаты или размеры диапазона. [Russian: "Invalid range coordinates or dimensions."] (line 99, file "Search Console methods")

    Do you have any idea where I might be going wrong?

  6. 0
    5 months ago

    Great job guys! I'm playing around with the script. It all looks great, but I'm having problems running the "crawl errors" function. I always get: "SyntaxError: Unexpected token: N".

    Analytics data works well, though. Any ideas? Thanks!
