
How to Save and Analyse Data from Search Console in Google Sheets

This script is provided for illustrative purposes only and is no longer supported by the author. To work with Search Analytics, the author recommends using the Search Analytics for Sheets app.

At Netpeak, we have a strong department of SEO specialists who successfully automate complex tasks. The approaches and techniques we have developed and put into practice have noticeably improved our speed and effectiveness. In this article, we will share one of our most useful tools for downloading and analysing Search Console data. It will be useful to SEO specialists of all kinds.

This tool makes Search Console data analysis easier. Using the technique we've developed, you can download data directly into a Google Spreadsheet, analyse the effectiveness of brand and non-brand queries, see how query CTR correlates with search position (data can be grouped by device, query type, and site category), check crawl errors, and mark them as fixed.

The tool is a script built on the Google Apps Script platform. We've been using this platform for work automation for over a year. It's convenient and easy to configure, but it does have quotas, so it's best suited to simple projects like this one.

Setting up the script

First, copy the spreadsheet with the script to your Google Drive: click ‘File’ -> ‘Make a copy…’.

Then set up the script. The setup below is adapted from a tutorial for the Blogger platform.

Steps for OAuth2 setup

1. Go to the ‘Tools’ -> ‘Script editor…’ menu.

2. Enable the Search Console API in Your Account

To interact with the Search Console API, you must enable it in the Google Developers Console project associated with your script.

2.1 Run ‘Resources’ -> ‘Developers Console Project…’.

2.2 Open the ‘API Manager’ page.

2.3 Type ‘Search Console API’ in the search box, find the Search Console API in the results, click its name, and then click the ‘Enable’ button.

The Search Console API has its own quota of 100,000,000 requests per day. That is more than enough, so don’t be afraid to run the script all day.

2.4 Go to ‘Credentials’

By default, the OAuth 2.0 client IDs list contains ‘Apps Script’, but you should create a new one:

For the OAuth2 library, you need to specify an ‘Authorized redirect URI’ that matches your script project. The URL you need to enter is: {Script ID}/usercallback

Here, {Script ID} is the ID of your script project. You can find it in the Apps Script code editor under ‘File’ -> ‘Project properties’.
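Pieced together, the redirect URI can be built from the script ID like this. This is a sketch: the `https://script.google.com/macros/d/` prefix is an assumption based on the Apps Script OAuth2 library's documented callback format, and `buildRedirectUri` is a hypothetical helper.

```javascript
// Build the OAuth2 redirect URI for an Apps Script project.
// The host/path prefix is an assumption based on the apps-script-oauth2
// library's documented '.../usercallback' format.
function buildRedirectUri(scriptId) {
  return 'https://script.google.com/macros/d/' + scriptId + '/usercallback';
}

console.log(buildRedirectUri('ABC123'));
// https://script.google.com/macros/d/ABC123/usercallback
```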

Copy the ‘Client ID’ and ‘Client Secret’ to Notepad (you will need them in the next step), then click ‘Save’.

3. Set Up the OAuth 2.0 Service in the Script

Next, set up the service in your Apps Script code project.

Replace the ‘YOUR_CLIENT_ID’ and ‘YOUR_CLIENT_SECRET’ placeholders in the script with your own client ID and secret from your Google Developers Console project.

4. Authenticate user

You no longer need the script editor: close it and work in the spreadsheet.

4.1 Run ‘Search Console’ -> ‘List Account Sites’ in the menu. It will prompt you for authorisation.

Click ‘Continue’ and then ‘Allow’.

4.2 If everything is ok, you’ll see the following message box:

Copy the link from the message box and open it in your browser (if you don’t see the message, repeat step 4.1).

4.3 Allow access to your Search Console.

After opening the copied link, you will see a new authorisation window. Click ‘Allow’, then close the page and return to the spreadsheet.

Now the script has access to your Search Console account. Rerun ‘Search Console’ -> ‘List Account Sites’ in the menu and you’ll see all your sites on the ‘Sites’ sheet.

Using the API Methods

We use this script to analyse the most frequently checked Search Console data: Search Analytics and Crawl Errors.

You have probably heard of the Supermetrics add-on, which can also download Search Analytics data. Our tool cannot compete with that service in versatility, but it can download crawl errors and aggregate data for easier analysis.

1. List sites from your account

You can list all sites in your account together with their access level.

2. Download Search Analytics data

You can download Search Analytics data for a defined period.

Choose a site and dates, then click ‘Search Console’ -> ‘Search Analytics’ -> ‘Get Search Analytics data’ in the menu.
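Under the hood, such a download maps to a Search Analytics query call. A minimal sketch of the request body, assuming the Webmasters API v3 `searchanalytics.query` shape; the dates, the dimension list, and the 5,000-row limit here are illustrative assumptions, not the script's exact settings:

```javascript
// Sketch of a Search Analytics request body (Webmasters API v3,
// searchanalytics.query). Dates and dimensions are illustrative.
function buildSearchAnalyticsQuery(startDate, endDate) {
  return {
    startDate: startDate,                    // 'YYYY-MM-DD'
    endDate: endDate,                        // 'YYYY-MM-DD'
    dimensions: ['query', 'page', 'device'], // what to break rows down by
    rowLimit: 5000                           // assumed per-request maximum
  };
}

const body = buildSearchAnalyticsQuery('2016-01-01', '2016-01-31');
console.log(JSON.stringify(body));
```

In Apps Script, this body would be POSTed to the API with the OAuth2 token attached; here we only show the payload.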

2.1 Define brand/non-brand queries

You can flag each downloaded query as brand or non-brand. To do that, paste comma-separated (without spaces) variations of your brand name into cell B4. The script checks whether each query contains any of these variations.

For example, brand queries might contain: netpeak, tetpak, netpek.

The ‘Brand Analysis’ sheet then shows what share of impressions and clicks branded queries receive.
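The brand check can be sketched as a plain function; `isBrandQuery` is a hypothetical helper name, not the script's actual function:

```javascript
// Classify a query as brand/non-brand by checking whether it contains
// any of the comma-separated brand variations (as pasted into cell B4).
function isBrandQuery(query, brandVariants) {
  const variants = brandVariants.split(',').map(v => v.trim().toLowerCase());
  return variants.some(v => v !== '' && query.toLowerCase().indexOf(v) !== -1);
}

console.log(isBrandQuery('netpeak blog', 'netpeak,tetpak,netpek')); // true
console.log(isBrandQuery('seo tools', 'netpeak,tetpak,netpek'));    // false
```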

2.2 Analyse the Impressions and Clicks by URL

The "Landing page" sheet contains Impressions and Clicks aggregated by URL. You can see how many Clicks and Impressions a particular URL receives.
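The aggregation behind the "Landing page" sheet can be sketched as follows; the row shape (`page`, `clicks`, `impressions`) is an assumption for illustration:

```javascript
// Aggregate clicks and impressions by landing page URL,
// mirroring the 'Landing page' sheet.
function aggregateByUrl(rows) {
  const totals = {};
  rows.forEach(function (r) {
    if (!totals[r.page]) totals[r.page] = { clicks: 0, impressions: 0 };
    totals[r.page].clicks += r.clicks;
    totals[r.page].impressions += r.impressions;
  });
  return totals;
}

const sample = [
  { page: '/a', clicks: 3, impressions: 100 },
  { page: '/a', clicks: 2, impressions: 50 },
  { page: '/b', clicks: 1, impressions: 10 }
];
console.log(aggregateByUrl(sample)['/a'].clicks); // 5
```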

2.3 Analyse Correlation of CTR and Average Search Position

The ‘CTR’ sheet contains a tool for analysing the correlation between CTR and average search position.

Search position values are rounded to the nearest tenth to reduce data volume. You can change the rounding in the ‘Round Position’ column of the ‘Search Analytics’ sheet.

You can group data by device or query type by changing the values in cells E2 or F2, respectively.

To group data by site category, specify values in cell G2 (category names are listed in column N, URLs in column O). There is an example of categories in the sheet.

As a result, you will get the following chart.

In the chart above, you can see that some queries get a high CTR even at low search positions. You can analyse the snippets of such queries and get ideas for improving other snippets on your site.
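The rounding and per-position CTR grouping described above can be sketched like this; the function names and row shape are illustrative:

```javascript
// Round average position to the nearest tenth, as on the
// 'Round Position' column.
function roundPosition(pos) {
  return Math.round(pos * 10) / 10;
}

// Compute CTR (clicks / impressions) per rounded position,
// mirroring the 'CTR' sheet grouping.
function ctrByPosition(rows) {
  const buckets = {};
  rows.forEach(function (r) {
    const key = roundPosition(r.position);
    if (!buckets[key]) buckets[key] = { clicks: 0, impressions: 0 };
    buckets[key].clicks += r.clicks;
    buckets[key].impressions += r.impressions;
  });
  const ctr = {};
  Object.keys(buckets).forEach(function (k) {
    ctr[k] = buckets[k].clicks / buckets[k].impressions;
  });
  return ctr;
}

const data = [
  { position: 1.04, clicks: 30, impressions: 100 },
  { position: 0.98, clicks: 20, impressions: 100 },
  { position: 3.2, clicks: 5, impressions: 100 }
];
console.log(ctrByPosition(data)['1']); // 0.25 (50 clicks / 200 impressions)
```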

3. Download Crawl Errors

The URL Crawl Errors Samples: list method lets you compile a list of errors for a given platform and error type. Choose a site, category, and platform, then run ‘Search Console’ -> ‘Crawl Errors’ -> ‘List Crawl Errors’ from the menu.
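In the Webmasters API v3, this corresponds to a GET on the `urlCrawlErrorsSamples` collection. The endpoint shape below is an assumption based on the v3 reference, and `crawlErrorsListUrl` is a hypothetical helper:

```javascript
// Build the urlCrawlErrorsSamples.list request URL (Webmasters API v3).
// Endpoint shape is an assumption based on the v3 API reference.
function crawlErrorsListUrl(siteUrl, category, platform) {
  return 'https://www.googleapis.com/webmasters/v3/sites/' +
    encodeURIComponent(siteUrl) +
    '/urlCrawlErrorsSamples?category=' + category +
    '&platform=' + platform;
}

console.log(crawlErrorsListUrl('http://example.com/', 'notFound', 'web'));
```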

3.1 Get Crawl Errors Details for URLs

To get detailed information about the URLs you’ve downloaded, run ‘Search Console’ -> ‘Crawl Errors’ -> ‘Get Crawl Errors Details’.

You’ll get your URLs, along with the pages where Google found them, on the ‘Crawl Samples by URL’ sheet.

You can use this to collect all pages (link donors) that still link to problem URLs, as found by Google.

3.2 Check whether a problematic URL is still present on the ‘Linked from’ page

Crawl error data is often out of date: a URL with a crawl error may no longer be linked from its ‘Linked from’ page. To check this, run ‘Search Console’ -> ‘Crawl Errors’ -> ‘Check Links Presense’.

You'll get the following result:

‘Yes’ - the page still links to the URL with the crawl problem;

‘No’ - the page no longer links to it;

‘Error’ - the ‘Linked from’ page itself returned a 404, 500, or another error response code.


The script has a 6-minute execution time quota; within that time it can check up to 100 URLs. If you need to check many more links, use a more powerful app or script.
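The per-page check can be sketched as a pure decision function; fetching is omitted here, and the inputs (`statusCode`, `html`) and the exact `href` matching are assumptions for illustration:

```javascript
// Decide the 'Check Links Presense' result for one 'Linked from' page:
// 'Yes' if the page HTML still links to the broken URL, 'No' if not,
// 'Error' if the page itself returned an error status code.
function linkPresence(statusCode, html, brokenUrl) {
  if (statusCode >= 400) return 'Error';
  return html.indexOf('href="' + brokenUrl + '"') !== -1 ? 'Yes' : 'No';
}

console.log(linkPresence(200, '<a href="/old-page">x</a>', '/old-page')); // Yes
console.log(linkPresence(200, '<a href="/other">x</a>', '/old-page'));    // No
console.log(linkPresence(404, '', '/old-page'));                          // Error
```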

3.3 Mark URL as Fixed

Once a problematic URL is fixed, you can report it to Search Console. To do so, choose ‘Yes’ in column E of the ‘Crawl Samples’ sheet, then press ‘Search Console’ -> ‘Crawl Errors’ -> ‘Mark as Fixed’.

If everything is OK, you will receive a notification confirming that the URLs have been deleted.
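In the Webmasters API v3, marking a sample as fixed corresponds to a DELETE on the sample resource. The endpoint shape below is an assumption based on the v3 reference, and `markAsFixedRequest` is a hypothetical helper:

```javascript
// Build the urlCrawlErrorsSamples.markAsFixed request (Webmasters API v3):
// a DELETE on the sample resource. Endpoint shape is an assumption.
function markAsFixedRequest(siteUrl, pageUrl, category, platform) {
  return {
    method: 'DELETE',
    url: 'https://www.googleapis.com/webmasters/v3/sites/' +
      encodeURIComponent(siteUrl) +
      '/urlCrawlErrorsSamples/' + encodeURIComponent(pageUrl) +
      '?category=' + category + '&platform=' + platform
  };
}

const req = markAsFixedRequest('http://example.com/', 'http://example.com/old',
  'notFound', 'web');
console.log(req.method); // DELETE
```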

The steps above considerably simplify Search Console data analysis. The tool helps you analyse the effectiveness of brand and non-brand queries, and easily check how query CTR correlates with average search position. The chart data will give you ideas for improving snippets. You can also download crawl errors, check whether they are still present, and mark them as fixed in a few clicks, keeping your crawl error data up to date.

If you found this script useful, please share it with your colleagues. We would be glad to receive feedback about bugs and feature ideas in the comments.
