Tuesday, 21 November 2017

General

In the general analysis settings of the WebsiteAnalyser you can set the maximum number of URLs to analyze and the maximum URL depth. You can also activate or deactivate the tag analysis (CSS, HTML, H1 to H6) as well as the content analysis quickly and simply by clicking the corresponding checkbox. Furthermore, you can enable the analysis of image, CSS and XML files, and you can honor or ignore the ALLOW/DISALLOW statements in the 'robots.txt' of your domain.

It is not always necessary to analyze all parts of a domain, and it is not always sensible to activate every available analysis option. For example, it rarely makes sense to parse the content of CSS and XML files on every run, and image and media files usually change only occasionally. You or your administrative contact will be best placed to decide whether analyzing these areas makes sense.

The settings for the analysis can be found in the settings dialog under Analysis ⇒ General.

Settings ⇒ Analysis ⇒ General: general settings of the WebsiteAnalyser

General analysis settings:

Setting / Description

just analyse the given URL: The analysis is limited to the entered URL. All links found are added to the database but are not analyzed recursively.
strip off URL parameter part: This option may be useful, especially for large shops and CMS systems, where the same URL would otherwise be analyzed several times because of varying parameter and anchor portions.
maximum URLs to analyse: Specifies the maximum number of URLs to analyze.
maximum URL level (maximum URL depth/crawl depth): Specifies the maximum URL level, i.e. how deep the crawler follows links from the start URL.
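The effect of the "strip off URL parameter part" option can be illustrated with a short Python sketch. The function name and the example URL are our own illustration, not part of the WebsiteAnalyser; they simply show why variants of the same page collapse into a single URL once the parameter and anchor portions are removed:

```python
from urllib.parse import urlsplit, urlunsplit

def strip_parameter_part(url: str) -> str:
    """Drop the query ('?...') and fragment ('#...') portions of a URL,
    so that parameter/anchor variants of one page count as a single URL."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

print(strip_parameter_part("https://example.com/shop/item?id=42&sort=asc#reviews"))
# https://example.com/shop/item
```

Both `?id=42&sort=asc` and `#reviews` are removed, so repeated crawls of parameterized links no longer produce duplicate analysis entries.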

Tags and content:

Setting / Description

detect H1 to H6 tags of the URL: The H1 to H6 tags of the URLs are identified and included in the analysis.
HTML tag analysis: The HTML tags are recognized and evaluated (usage, count, etc.).
CSS tag analysis: The CSS tags are recognized and evaluated (usage, count, etc.).
Extract URL content for further analysis: The content of the URL is extracted, analyzed separately, and evaluated.
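How a crawler detects H1 to H6 tags can be sketched with Python's standard `html.parser` module. This is our own minimal illustration of the concept, not the WebsiteAnalyser's implementation:

```python
from html.parser import HTMLParser

class HeadingCollector(HTMLParser):
    """Collects H1-H6 heading tags and their text from an HTML page."""

    HEADINGS = ("h1", "h2", "h3", "h4", "h5", "h6")

    def __init__(self):
        super().__init__()
        self.headings = []    # list of (tag, text) pairs found on the page
        self._current = None  # heading tag we are currently inside, if any

    def handle_starttag(self, tag, attrs):
        if tag in self.HEADINGS:
            self._current = tag

    def handle_data(self, data):
        if self._current and data.strip():
            self.headings.append((self._current, data.strip()))

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

collector = HeadingCollector()
collector.feed("<html><body><h1>Shop</h1><p>text</p><h2>News</h2></body></html>")
print(collector.headings)  # [('h1', 'Shop'), ('h2', 'News')]
```

The collected pairs correspond to the per-URL heading inventory that the H1 to H6 analysis option adds to the database.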

Files and images:

Setting / Description

detect images and files: Images and files are automatically recognized, added to the analysis database, and analyzed.
detect images and files in css files: Images and files referenced in CSS files are recognized by an additional analysis pass, added to the analysis database, and analyzed.
detect images and files in xml files: Images and files referenced in XML files are recognized by an additional analysis pass, added to the analysis database, and analyzed.
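The idea behind detecting images inside CSS files can be sketched as follows: CSS references assets through `url(...)` declarations, which a crawler can extract with a simple pattern match. This is an illustrative sketch (regex and function name are ours), not the WebsiteAnalyser's actual parser:

```python
import re

# Matches url(...) references in CSS, e.g. url("img/logo.png") or url(bg.jpg)
CSS_URL_RE = re.compile(r'url\(\s*["\']?([^"\')]+)["\']?\s*\)')

def find_css_assets(css_text: str) -> list:
    """Return all asset paths referenced via url(...) in a CSS document."""
    return CSS_URL_RE.findall(css_text)

css = """
body { background: url("images/bg.png") no-repeat; }
.logo { background-image: url(logo.svg); }
"""
print(find_css_assets(css))  # ['images/bg.png', 'logo.svg']
```

Each extracted path would then be resolved against the stylesheet's URL and added to the analysis database like any other image or file.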

'robots.txt':

Setting / Description

use Disallow statements from robots.txt: The Disallow statements from an existing robots.txt are included in the analysis, and a separate list of the DISALLOW statements and the explicitly blocked/disallowed URLs is created.
use Allow statements from robots.txt: The Allow statements from an existing robots.txt are included in the analysis, and a separate list of the ALLOW statements and the explicitly allowed URLs is created.
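How Allow/Disallow statements decide whether a URL may be crawled can be demonstrated with Python's standard `urllib.robotparser`. The robots.txt content and domain below are made up for the example; note that Python's parser applies rules in file order, so the more specific Allow line is listed first:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block /private/ but explicitly allow one report.
robots_txt = """\
User-agent: *
Allow: /private/public-report.html
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "https://example.com/private/secret.html"))         # False
print(rp.can_fetch("*", "https://example.com/private/public-report.html"))  # True
print(rp.can_fetch("*", "https://example.com/index.html"))                  # True
```

A crawler that honors these statements would skip `/private/secret.html` entirely while still analyzing the explicitly allowed report, which is exactly the distinction the two lists (ALLOW and DISALLOW) capture.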

If you have any questions regarding the general analysis settings of the WebsiteAnalyser, please feel free to contact our Support.

Hits: 499 Last updated: Thursday, 02 November 2017