
Netpeak Checker 2.1 Review: Multifunctional Tool for Mass Analysis and Comparison of Websites


International association of SEO tools Netpeak Software presents… Laureate of SEOscar and Golden Analysis Award… A tool which is to become a must-have for online marketers all over the world...

Please welcome – Netpeak Checker 2.1:

1. What is Netpeak Checker?

Now you come, and you say: 'Don, give me data for the analysis.' But you don't ask with respect. You don't enter your API keys in the service access settings. You don't even know which URLs you want to get data for…
Don Checkerone

Netpeak Checker is a multifunctional tool for mass analysis and comparison of websites that helps you perform a wide range of online marketing tasks. The tool lets you get data from the most popular services around the world, such as Moz, Serpstat, Majestic, SEMrush, Alexa, Google, Bing, Whois, Facebook, Twitter, etc.

To start working with the tool, you need to:

  • have a list of URLs for which you want to get data
  • know which parameters you want to analyze (getting some parameters requires paid access to the corresponding service)

2. Issues with the previous versions

If you have used Netpeak Checker before, you have probably run into at least one of these issues:

  • no way to save and load project results → you could save the list of URLs, but it's often the results themselves that need to be saved so you can return to them later
  • odd empty cells in the results table after the analysis → the parameters are chosen and the analysis seems to be running, but you still see empty cells with no hint as to why there are no results
  • complicated interface → separate tabs for editing the list of URLs, adjusting the parameters, and viewing the results table; settings for the tool itself, the services, the parameters, and the threads were all in different places
  • low performance → caused by an outdated results table and unoptimized result retrieval
  • the analysis stopped completely when the Internet connection was lost → this is especially annoying when you start an analysis for a huge number of URLs and parameters and leave it overnight: if the Internet goes down even briefly, the analysis stops for good
  • only a dark color scheme for the interface → this isn't a technical shortcoming, but we often received feedback about it

3. Solution of the problems and new opportunities

3.1. Absolutely new interface

Now all the main actions in the tool are performed in one window:

Netpeak Checker: main window

Issues

If an issue occurred during the analysis, the corresponding cell is highlighted with a color depending on the issue status:

  • yellow – issues you can fix yourself. For instance, 'Proxy Error' means something is wrong with the proxy, so just add another proxy or disable proxy usage
  • red – issues beyond your control, such as 'Service Error'. This means the service you're querying is unavailable and there is nothing to do but wait

Color scheme

There are now three color schemes available: light, dark, and blue.

Netpeak Checker: dark color scheme of the interface

3.2. Saving and loading projects

We've implemented an option to save a project in the most space-efficient storage format. Save your project often and all your results will be safe and sound :)

Note that a saved project stores all the received data: you can freely turn parameters off and choose other ones, and when you turn parameters that were analyzed before back on, their results won't be retrieved again; they'll simply appear in the table.
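The skip-refetch behavior described above can be pictured as a cache keyed by (URL, parameter) pairs. This is a minimal sketch, not the tool's actual internals; all names here are hypothetical.

```python
# Sketch of result caching: previously analyzed (url, parameter) pairs
# are served from the saved project instead of being requested again.
# All names are hypothetical, for illustration only.

def fetch_parameter(url: str, parameter: str) -> str:
    """Stand-in for a real service request."""
    return f"value of {parameter} for {url}"

def get_results(urls, parameters, cache):
    results = {}
    for url in urls:
        for parameter in parameters:
            key = (url, parameter)
            if key not in cache:           # only missing data triggers a request
                cache[key] = fetch_parameter(url, parameter)
            results[key] = cache[key]      # cached values just reappear in the table
    return results

cache = {}                                  # persisted as part of the saved project
get_results(["https://example.com"], ["Title"], cache)
```

Turning a parameter off and on again maps to simply re-reading the cache: no new request is made for keys that are already present.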

3.3. Changes to parameters

Quantity

Netpeak Checker can now analyze more than 1000 parameters. To tell the truth, we almost stopped counting after crossing the 1000 mark :)

Learn All Parameters

Old parameters

All the old parameters have been revised and optimized. Let me point out our in-house Whois parser: we've put a lot of effort into developing it and we believe you'll be satisfied with it!

New parameters

Some new services were added to the tool:

  • Yahoo SERP – indexation, merge, and other info from the Yahoo results page
  • Bing SERP – the same as above, but for Bing
  • StumbleUpon – data on the indexation and the number of views of the target URL by users of this service

We've also added 'Title' and 'Description' parameters for all search engines. They let you see how search engines display your website on the results page, so don't miss your chance to improve the CTR of your pages!

Search and selection

As mentioned above, the tool now has 1000+ parameters, so being able to search for the ones you need is extremely useful. Just start typing a name and you'll get all the parameters that match.

Additional settings

Now all the parameters have 'target' and 'mode' properties.

Target defines which part of the URL is used in the request to a service. Targets can be of the following types:

  • URL → in this case, the URL exact match is sent to a service, for instance, https://subdomain.domain.com/page.html
  • Host → the host of the URL is sent to a service, in our example, it’ll be subdomain.domain.com
  • Root Domain → as the name implies, the root domain is sent; in the example above, it is domain.com
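The three target types above can be derived from a URL with a few lines of standard-library Python. This is only an illustrative sketch: the root-domain rule here is a naive "last two labels" heuristic, whereas production tools typically consult the Public Suffix List to handle domains like example.co.uk.

```python
from urllib.parse import urlparse

def targets(url: str) -> dict:
    """Derive the three target types from a URL.

    The root-domain rule is a naive last-two-labels heuristic;
    real tools use the Public Suffix List to handle cases like
    example.co.uk correctly.
    """
    host = urlparse(url).hostname
    root = ".".join(host.split(".")[-2:])
    return {"URL": url, "Host": host, "Root Domain": root}

targets("https://subdomain.domain.com/page.html")
# {'URL': 'https://subdomain.domain.com/page.html',
#  'Host': 'subdomain.domain.com', 'Root Domain': 'domain.com'}
```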

Mode is specific for each service and works together with the target in the following way:

  • Prefix (available only for Ahrefs) → the 'URL' target is sent with a 'Prefix' mode parameter, so data for the domain.com/path/* path is retrieved
  • Subdomains (available only for Ahrefs) → the 'Host' target is sent with a 'Subdomains' mode parameter, and data is retrieved for *.domain.com/*
  • Subdomain (available only for Moz) → essentially a technical alias for the 'Host' target: in this mode, all Moz subdomain parameters are grouped together for your convenience

So if a service provides data in different modes, you can get results for all of them at once by checking the parameters in the corresponding modes.

Also, some services (currently Serpstat and SEMrush) have a 'Search Engine' group of parameters where you specify which search engine / region the results should be retrieved for. This lets you compare the results for the same URL across different service databases.

Limitations

We've spent a great deal of time testing and finally found optimal values that let you get the maximum data from each service; these settings (the number of available threads and their speed) are built into the tool. You no longer have to set delays between requests manually, wondering what value would be optimal.

Note! We managed to greatly optimize table performance; however, there can still be some delays in the tool's response when analyzing 1000 parameters at once. We recommend enabling fewer parameters to keep the table responsive.
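The built-in per-service limits can be pictured as a thread pool with a fixed request delay. The services, thread counts, and delays below are invented for illustration; the actual built-in values are not published.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-service limits: (threads, delay between requests in seconds).
LIMITS = {"Serpstat": (5, 0.2), "Whois": (2, 1.0)}

def run_service(service, urls, fetch):
    """Query a service with its preset thread count and request delay,
    so the user never has to tune these values manually."""
    threads, delay = LIMITS[service]

    def task(url):
        time.sleep(delay)        # fixed pause keeps us under the rate limit
        return fetch(url)

    with ThreadPoolExecutor(max_workers=threads) as pool:
        return list(pool.map(task, urls))    # results come back in input order
```

Baking the (threads, delay) pair into a per-service table is what lets the tool ship "optimal" values instead of exposing a delay setting to the user.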

3.4. New application core

The core of the new Netpeak Checker is based on the core of Netpeak Spider 2.1. Its key features are:

  • the tool detects which cells are empty and which had issues retrieving results: these cells become pending for analysis
  • this feature lets you easily open project results and continue working with them
  • if the Internet connection is lost, the tool retries the analysis every 15 seconds until it resumes or you stop it manually
  • analysis of pending URLs uses only the necessary number of requests: if you added several URLs with the same root domain and chose only Root Domain parameters, just one request for URL data initialization is made
  • for some of their parameters, Moz and Majestic can return multiple rows with a single request, which is an important point of the core implementation: Majestic returns up to 100 results per request, while Moz returns 50 results per request on paid API plans and 10 on free subscriptions
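The batching figures in the last point boil down to simple chunking: N targets cost ceil(N / batch_size) requests instead of N. A minimal sketch, using only the batch sizes quoted above (the function and dictionary names are illustrative):

```python
# Sketch of per-service request batching, using the batch sizes
# quoted in the text: Majestic 100, Moz 50 (paid), Moz 10 (free).

BATCH_SIZES = {"Majestic": 100, "Moz (paid)": 50, "Moz (free)": 10}

def batches(targets, service):
    """Split pending targets into the largest chunks a service accepts,
    so N targets cost ceil(N / batch_size) requests instead of N."""
    size = BATCH_SIZES[service]
    return [targets[i:i + size] for i in range(0, len(targets), size)]

len(batches([f"site{i}.com" for i in range(250)], "Majestic"))  # 3 requests
```

With 250 targets, Majestic needs 3 requests, Moz on a paid plan needs 5, and Moz on a free subscription needs 25.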

3.5. Handling the list of URLs

You can add URLs for the analysis in several ways:

  • paste from clipboard
  • enter manually
  • upload from a TXT file
  • drag and drop – just drag a file or text and drop it into the table

Note that when you add URLs manually, paste them from the clipboard, or upload them from a file, the http protocol is added by default to lines like 'domain.com'. To avoid this, add the https protocol yourself.
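The default described above amounts to a one-line normalization rule. A sketch of that rule (the function name is made up, not part of the tool):

```python
def normalize(url: str) -> str:
    """Prepend http:// to scheme-less lines like 'domain.com',
    mirroring the default behavior described above; lines that
    already carry a scheme are left untouched."""
    if "://" not in url:
        return "http://" + url
    return url

normalize("domain.com")            # 'http://domain.com'
normalize("https://domain.com")    # unchanged
```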

3.6. Hotkeys

I can't imagine working with the new Netpeak Checker without hotkeys. I highly recommend using them: I'm sure you'll get used to them quickly and realize how much time they save:

  • Ctrl + V – paste URL from the clipboard to the table
  • F5 – start the analysis
  • Ctrl + F5 – stop the analysis
  • Ctrl + S – save the project (especially useful when you work with the same list of URLs: just save your project as a simple text file more often)
  • Ctrl + F – filter the results
  • Delete – clear selected cells
  • Shift + Delete – delete selected URLs from the table (careful: the URLs are deleted together with their results)
  • Ctrl + E – export active table
  • Ctrl + Shift + T – open the last saved project: I normally start work in the tool with these keys and suggest you do the same!

You can learn other combinations of hotkeys in the tool interface.

3.7. Multiscreen

Now you can open several windows of the tool and work with them simultaneously.

3.8. Displaying data and export

When reviewing data in the table, you may notice some new values:

  • (NULL) → there is no data
  • (Empty) → there are results, but they are empty: this applies only to string data like the 'Title' parameter
  • TRUE → the data complies with the parameter condition: this applies to 'Indexation'-type parameters (in this case, the page is indexed by the search engine)
  • FALSE → on the contrary, the results do not comply with the parameter condition; in this instance, the page is not indexed

All exported results conform to Google Sheets and Microsoft Excel standards and are optimized for display in these editors.

3.9. Filtering the results

Filter settings have been completely reworked, and the following functions were added:

  • parameters search
  • new types of data: conditional statements TRUE / FALSE and dates
  • filtering by (NULL) and (Empty) cells

3.10. Working with proxy

We've implemented two options to use proxy in the tool:

  • general proxy – applied to all parameters
  • list of proxies – applied only to the selected parameters; has a higher priority than the general proxy

Proxies from the list rotate on every request to a service. Note that parsing some services, such as Google, Yahoo, and Bing, depends heavily on the number of proxies: the more proxies you use, the faster the analysis.
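The rotation described above is classic round-robin: each request takes the next proxy from the list and wraps around at the end. A minimal sketch with made-up proxy addresses:

```python
from itertools import cycle

# Round-robin rotation over a proxy list: every request to a parsed
# service (Google, Yahoo, Bing) goes out through the next proxy, so
# more proxies mean more requests per unit of time before any single
# address gets throttled. The proxy addresses below are made up.

proxies = cycle(["10.0.0.1:8080", "10.0.0.2:8080", "10.0.0.3:8080"])

def next_proxy():
    return next(proxies)

[next_proxy() for _ in range(4)]
# ['10.0.0.1:8080', '10.0.0.2:8080', '10.0.0.3:8080', '10.0.0.1:8080']
```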

3.11. Captcha solving

Manual captcha solving is enabled by default. However, you can also use automatic captcha solving via the anti-captcha.com service (we've added API v2 support).

3.12. Other changes

  • window size and position are now saved
  • Multiple Document Interface: you can unpin the 'All results' and 'Filtered results' tabs from the main window, and the parameter selection and additional info panels can auto-hide

An offer you can't refuse

We invite you to try the new Netpeak Checker 2.1 and the latest version of Netpeak Spider for free for 14 days. Some day, and that day may never come, I will call upon you to do a service for me. But until that day, consider this trial period a gift on the birthday of the new Netpeak Checker.

Try Netpeak Software Products

We're always ready to help in case you need assistance getting to know the new tool… And I insist on hearing bad news immediately! :)

Feel free to reach out on any matter and follow us on social media.

In a nutshell

Netpeak Checker 2.1 is a tool you call on to solve problems in internet marketing. Add a list of URLs, choose the parameters, and get data from the most popular services around the world: Ahrefs, Moz, Serpstat, Majestic, SEMrush, Alexa, Google, Bing, Whois, Facebook, Twitter, etc.

