SiteAnalyzer is a free site-audit tool for Windows. It is a powerful web crawler for scanning and analyzing website content. The utility not only scans sites and checks their basic technical and SEO parameters for errors, but also helps you fix those errors effectively.
With this program and its toolset you can test your link-building strategy, your website optimization, page content, and more.
SiteAnalyzer features and benefits
Scanning site pages, images, scripts and documents.
Crawling sites of almost any size.
Searching for and displaying duplicate pages, meta tags, and titles.
Server response codes for each page of the site.
Detecting the presence of the rel="canonical" attribute on each page of the site.
Following the directives of the robots.txt file, the robots meta tag, and the X-Robots-Tag header.
The ability to filter data by any parameter.
Low demands on computer resources and modest RAM consumption.
Distributed free of charge.
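The robots directives mentioned above (robots.txt, the robots meta tag, X-Robots-Tag) are standard crawler controls. As a generic illustration of how a crawler can honor robots.txt rules — this is a sketch using Python's standard library, not SiteAnalyzer's own code, and the rules and URLs are hypothetical:

```python
# Minimal sketch of honoring robots.txt, using only the Python
# standard library. The rules and URLs below are placeholders.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A compliant crawler checks each URL before fetching it.
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))         # True
```

The same idea extends to per-page directives: a crawler that respects the robots meta tag or the X-Robots-Tag response header skips indexing any page that carries `noindex`.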
Changes in the latest version
Added the ability to specify arbitrary HTTP headers when accessing the server.
Added the ability to use a virtual robots.txt instead of the real one (located at the site root).
Added a window to the content-uniqueness tab that lists the pages closest in uniqueness to the selected URL.
Added the ability to cancel the proxy-list health check at any point during testing.
Rescanning of arbitrary project URLs now runs in several threads, according to the program settings.
The SERPRiver service has been added to the Yandex XML settings section to check the indexing of pages in Yandex.
The Custom Search function, designed to search for content on the site, has been restored.
Added display of the date of the last scan of the project in the list of projects.
Added the ability to drag and drop several projects between folders with the mouse, as well as via the context menu.
Added additional buttons for checking Google PageSpeed and content uniqueness on the corresponding tabs.
Added the ability to open a site's robots.txt file in a browser.
Added the ability to open the selected URL on the Web.Archive.org site.
Added "Source" column to the "Images" tab.
Optimized handling of URL-exclusion rules when crawling sites and added support for RegEx regular expressions.
Optimized and improved the handling of robots.txt rules.
Fixed a bug that occurred when parsing incorrectly specified rules in robots.txt.
Fixed incorrect handling of subdomains when the "Consider subdomains" checkbox is enabled.
Fixed incorrect encoding when loading the HTML code of pages in the data-extraction testing form.
Fixed incorrect sorting on the "TOP Domains" tab and in other filters on the "Custom Filters" panel.
Fixed a bug that occurred when entering site addresses not present in the list into the project filter.
Fixed incorrect encoding display for sites that use Windows-1251 encoding.
Fixed incorrect data filtering when switching between regular tabs and Custom filters.
Fixed a bug that occurred when scanning a large number of sites in the list of projects.
Restored the detailed breakdown of data received from Google PageSpeed.
Restored display of error statistics for the Title, Description, and H1 tags.
Hid the redundant context menu in the Custom Filters section.
Optimized adding large numbers of URLs to the project list.
Fixed incorrect determination of URL nesting level.
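The changelog's first entry mentions sending arbitrary HTTP headers with crawl requests. As a generic illustration of what that means in practice — a sketch with Python's standard library, where the header names and values are examples only, not SiteAnalyzer's actual defaults:

```python
# Sketch of attaching arbitrary HTTP headers to a crawl request.
# All header values here are illustrative placeholders.
from urllib.request import Request

req = Request(
    "https://example.com/",
    headers={
        "User-Agent": "MyCrawler/1.0",        # custom crawler identity
        "Accept-Language": "en-US",           # request a specific language
        "X-Audit-Run": "42",                  # hypothetical custom header
    },
)
# urllib normalizes header keys via str.capitalize().
print(req.get_header("User-agent"))  # MyCrawler/1.0
```

Custom headers like these are typically used to identify the crawler to the server or to request pages the way a particular browser or locale would see them.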
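The changelog also mentions RegEx-based URL-exclusion rules. A minimal sketch of how such filtering works in general — the patterns and URLs are invented for illustration and do not reflect SiteAnalyzer's actual rule syntax:

```python
# Illustrative RegEx URL exclusion for a crawler. Patterns and URLs
# below are hypothetical examples, not SiteAnalyzer's rule format.
import re

exclusion_rules = [
    re.compile(r"\?sessionid="),   # skip session-tracking URLs
    re.compile(r"/tag/"),          # skip tag-archive pages
    re.compile(r"\.(pdf|zip)$"),   # skip binary downloads
]

def is_excluded(url: str) -> bool:
    """Return True if the URL matches any exclusion rule."""
    return any(rule.search(url) for rule in exclusion_rules)

urls = [
    "https://example.com/page.html",
    "https://example.com/article?sessionid=123",
    "https://example.com/files/report.pdf",
]
crawlable = [u for u in urls if not is_excluded(u)]
print(crawlable)  # ['https://example.com/page.html']
```

Checking each discovered URL against the rule list before queueing it keeps session duplicates and non-HTML files out of the crawl.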