Please welcome Netpeak Checker 2.1!
1. What is Netpeak Checker?
Now you come, and you say: 'Don, give me data for the analysis.' But you don't ask with respect. You don't enter your API keys to the services access settings. You don't even know which URLs you want to get data for…
Netpeak Checker is a multifunctional tool for mass analysis and comparison of websites.
To start working with the tool, you need to:
- have a list of URLs for which you want to get data
- know the parameters you want to analyze (to get some parameters, you'll need paid access to the appropriate service)
2. Issues with the previous versions
If you have used Netpeak Checker before, you've certainly run into at least one of these issues:
- no way to save and load project results → you could save the list of URLs, but it's often the results themselves that you need to save so you can work with them later
- odd empty cells in the results table after the analysis → the parameters are chosen and the analysis seems to run, yet you see empty cells without the slightest idea why there are no results
- complicated interface → separate tabs for editing the list of URLs, adjusting the parameters, and viewing the results table; settings of the tool itself, of the services, and of the parameters and threads were scattered across different places
- low productivity → caused by the outdated results table and unoptimized retrieval of results
- the analysis stopped completely when the Internet connection was lost → this is especially annoying when you start an analysis for a huge number of URLs and parameters and leave it overnight: if the Internet goes down even for a moment, the analysis stops for good
- only a dark color scheme in the interface → this isn't a technical flaw, but we often received feedback about it
3. Solving the problems and new opportunities
3.1. A completely new interface
Now all the main actions with the tool are performed in one window:
If there was an issue during the analysis, it is highlighted with a color:
- yellow – an issue you can fix yourself. For instance, 'Proxy Error' means there is something wrong with your proxy, so just add another proxy or disable proxy usage
- red – issues that don't depend on you, such as 'Service Error'. This means the service you're querying is unavailable, and there is nothing left to do but wait
Now there are three color schemes available: light, dark, and blue.
3.2. Saving and loading projects
We've implemented an option to save the project using a space-efficient storage format. Save your project often and all your results will be safe and sound :)
Please note that a saved project stores all the retrieved data: you can easily turn parameters off and choose other ones, and when you turn previously analyzed parameters back on, they won't be retrieved again; they'll simply appear in the table.
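Conceptually, this works like a result cache keyed by URL and parameter: only missing cells ever trigger a request, and the project file persists the cache. Here's a toy Python sketch of the idea (all names are hypothetical, not the tool's actual code):

```python
# Toy sketch of result caching: re-enabling a previously analyzed
# parameter costs nothing, because its values are already cached.
cache: dict[tuple[str, str], str] = {}

def fetch_from_service(url: str, parameter: str) -> str:
    # Stand-in for a real API call to an external service.
    print(f"requesting {parameter} for {url}")
    return "value"

def get_value(url: str, parameter: str) -> str:
    key = (url, parameter)
    if key not in cache:  # only missing cells trigger a request
        cache[key] = fetch_from_service(url, parameter)
    return cache[key]

get_value("https://example.com", "Title")  # performs a request
get_value("https://example.com", "Title")  # served from the cache, no request
```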
3.3. Changes to parameters
Now Netpeak Checker can analyze more than 1,000 parameters. To tell you the truth, we almost stopped counting after we crossed the 1,000 mark :)
All the old parameters were revised and optimized. Let me point out our own Whois parser – we've made every effort to develop it, and we believe you'll be satisfied with it!
Some new services were added to the tool:
- Yahoo SERP – indexation, merge, and other info from the Yahoo results page
- Bing SERP – the same as the previous one, only for Bing
- StumbleUpon – data on indexation and the number of views of the target URL by users of this service
Also, we've added 'Title' and 'Description' parameters for all search engines. They let you see how search engines display your website on the results page, so don't miss your chance to improve the CTR of your pages!
Search and selection
As I've mentioned, there are now 1,000+ parameters in the tool, so being able to search for the ones you need is extremely useful. Just start typing a name and you'll get all the parameters that match.
Now all the parameters have 'target' and 'mode' properties.
The target defines which part of the URL is used in the request to a service. Targets can be of the following types:
- URL → the exact URL is sent to the service
- Host → the host part of the URL is sent to the service
- Root Domain → as the name implies, the root domain is sent
- Prefix (available only for Ahrefs) → the 'URL' target is sent together with a 'Prefix' mode parameter, so the data is retrieved for the path
- Subdomains (available only for Ahrefs) → the 'Host' target is sent together with a 'Subdomains' mode parameter, so the data covers the host and all its subdomains
- Subdomain (available only for Moz) → a kind of technical nickname for the 'Host' target: in this mode, all Moz subdomain parameters are grouped for your convenience
So if a service provides data in several modes, you can get the results for all of them simultaneously by checking the parameters in the corresponding modes.
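To illustrate how the basic targets relate to each other, here's a rough Python sketch (not the tool's actual code; the root-domain rule below is naive, and a real implementation would consult the Public Suffix List to handle cases like 'example.co.uk'):

```python
from urllib.parse import urlsplit

def targets(url: str) -> dict:
    """Derive the three basic targets from a URL.

    Simplified illustration only: the two-label root-domain rule
    below breaks on multi-part suffixes such as '.co.uk'.
    """
    host = urlsplit(url).hostname or ""
    root_domain = ".".join(host.split(".")[-2:])  # naive two-label rule
    return {"URL": url, "Host": host, "Root Domain": root_domain}

print(targets("https://blog.example.com/post/42"))
# {'URL': 'https://blog.example.com/post/42',
#  'Host': 'blog.example.com', 'Root Domain': 'example.com'}
```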
Also, some services (currently Serpstat and SEMrush) have a 'Search Engine' group of parameters – here you should specify which search engine the data should be retrieved for.
We've spent a great deal of time on testing and finally found optimal values that let you get the maximum amount of data from each service – these settings (the number of available threads and their speed) are built into the tool. So now you don't have to manually set delays between requests, wondering what number would be optimal.
3.4. New application core
The core of the new Netpeak Checker is based on the Netpeak Spider 2.1 core – we can point out the following peculiarities of how it works:
- the tool analyzes which cells are empty and which ones have issues with getting results: these cells become pending for the analysis
- the above-mentioned feature lets you easily open project results and continue working with them
- if the Internet connection is lost, the tool will try to resume the analysis every 15 seconds until the analysis is finally resumed or you stop it manually
- analysis of pending URLs uses only the necessary number of requests: if you added several URLs with the same root domain and chose only Root Domain parameters, just one request will be made to initialize the URL data
- for some of their parameters, Moz and Majestic allow retrieving multiple rows with a single request – an important point of the core implementation: Majestic returns up to 100 results per request, while Moz returns 50 results per request on paid API plans and 10 per request on free subscriptions
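Those row limits come down to simple batching arithmetic: the URL list is split into per-request chunks of whatever size the service allows. A minimal Python sketch (the batch sizes are the figures quoted above; everything else is illustrative, not the tool's actual code):

```python
# Illustrative batching: fewer requests when a service returns
# many rows at once. Batch sizes are from the text; the service
# keys and URLs are made-up examples.
BATCH_SIZE = {"majestic": 100, "moz_paid": 50, "moz_free": 10}

def chunks(items, size):
    """Split a list into consecutive chunks of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

urls = [f"https://example.com/page-{n}" for n in range(230)]
print(len(list(chunks(urls, BATCH_SIZE["majestic"]))))  # 3 requests instead of 230
print(len(list(chunks(urls, BATCH_SIZE["moz_free"]))))  # 23 requests
```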
3.5. Handling the list of URLs
You can add URLs for the analysis in several ways:
- paste from clipboard
- enter manually
- upload from a TXT file
- drag and drop – just drag a file or text and drop it into the table
Please note that you can add URLs manually, paste them from the clipboard, or upload them from a file.
I can't imagine working with the new Netpeak Checker without the hotkeys. I highly recommend using them: I'm sure you'll get used to them really quickly and realize how much time they can save:
- paste URLs from the clipboard into the table
- start the analysis
- stop the analysis
- save the project (especially useful when you work with the same list of URLs: just save your project more often)
- filter the results
- clear selected cells
- delete selected URLs from the table (careful: the URLs are deleted together with their results)
- export the active table
- open the last saved project: I normally start my work with the tool from this one and suggest you do the same
You can find the hotkey for each of these actions, along with other combinations, in the tool interface.
Now you can also open several windows of the tool and work with them simultaneously.
3.8. Displaying data and export
When reviewing data in the table, you can find some new values:
- (NULL) → means there is no data
- (Empty) → means there are results but they are empty: this refers only to string data like 'Title' parameter
- TRUE → means the data complies with the parameter condition: this applies to 'Indexation'-type parameters (in this case, the page is indexed by the search engine)
- FALSE → on the contrary, means the results do not comply with the parameter condition; in this instance, the page is not indexed
All exported results conform to Google Sheets and Microsoft Excel standards and are optimized for display in these editors.
3.9. Filtering the results
Filter settings have been completely reworked, and the following functions were added:
- parameters search
- new types of data: conditional statements TRUE / FALSE and dates
- filtering by (NULL) and (Empty) cells
3.10. Working with proxy
We've implemented two options to use proxy in the tool:
- general proxy – applied to all parameters
- list of proxies – applied only to the selected parameters; it has a higher priority than the general proxy
Proxies from the list are rotated on every request to a service. Note that parsing of some services like Google, Yahoo, and Bing strongly depends on the number of proxies: the more proxies you use, the faster the analysis.
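Alternating proxies like this is classic round-robin rotation: each request simply takes the next proxy in the list and wraps around at the end. A short Python sketch of the idea (the proxy addresses are made up, and this is an illustration, not the tool's code):

```python
from itertools import cycle

# Round-robin proxy rotation: each request to a service uses the
# next proxy from the list, wrapping around when the list ends.
proxies = ["10.0.0.1:8080", "10.0.0.2:8080", "10.0.0.3:8080"]
rotation = cycle(proxies)

for request_no in range(5):
    proxy = next(rotation)
    print(f"request {request_no} via {proxy}")
```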
3.11. Captcha solving
Manual captcha solving is enabled by default. However, you can switch to automatic captcha solving via the anti-captcha.com service (we've added API v.2 support).
3.12. Other changes
- window size and position are now saved
- a Multiple Document Interface lets you unpin the 'All results' and 'Filtered results' tabs from the main window and automatically hide the parameter selection and additional info panels
An offer you can't refuse
We invite you to try the new Netpeak Checker 2.1 and the latest version of Netpeak Spider for free for 14 days.
We're always ready to help in case you need assistance getting to know the new tool… And I insist on hearing bad news immediately! :)
Feel free to reach out on any matter and follow us on social media:
In a nutshell
Netpeak Checker 2.1 is the tool you call when you need a problem in internet marketing solved. Add a list of URLs, choose the parameters, and get data from the world's most popular services: Ahrefs, Moz, Serpstat, Majestic, SEMrush, Alexa, Google, Bing, Whois, Facebook, Twitter, etc.