Site Vigil / Upgrade
Version 5.4 Build 56 - Major upgrade
This version is an incremental upgrade to Site Vigil builds 31 through to 54. Monitoring settings from all of these builds remain compatible.
Configuration and data files are also compatible with this release. Please back up your configuration settings (using the Export feature on the Options screen) and the contents of the configuration folder before upgrading.
Release date 20th April 2005.
It contains the following major new features:
- New wizard to aid the creation of complex alert schemes
- Allows a check for a specific string in the HTML response so that error messages can be detected
- New monitoring feature to check the availability of mail servers (both checks are sketched after this list)
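Site Vigil performs these checks from within its own monitoring engine; purely as an illustration (not the product's implementation), a minimal Python sketch of the two checks might look like the following. The URL, search string and mail host are placeholders.

    import smtplib
    import urllib.request

    def page_contains_string(url, needle, timeout=10):
        # Fetch the page and report whether its HTML contains the given
        # string, e.g. an application error message in a 200 OK response.
        with urllib.request.urlopen(url, timeout=timeout) as response:
            html = response.read().decode("utf-8", errors="replace")
        return needle in html

    def mail_server_available(host, port=25, timeout=10):
        # Open an SMTP connection and issue NOOP to confirm the mail
        # server is up and answering.
        try:
            with smtplib.SMTP(host, port, timeout=timeout) as smtp:
                code, _ = smtp.noop()
            return code == 250
        except (OSError, smtplib.SMTPException):
            return False

    # Example calls (placeholder addresses):
    #   page_contains_string("http://www.example.com/order.html", "error occurred")
    #   mail_server_available("mail.example.com")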
It contains the following enhancements to existing functionality:
- Display statistics on changes in web traffic, including high, low and average values
- Display HTML validation errors inline with the HTML source from the Analyzer utility
- When the Site Position utility is activated from the Scheduler, switch the existing context to the new web site
- Allows a specific DNS name server to be specified for the DNS lookups of particular page monitors
- Warn in spider and analysis results if a page access is redirected to another site
- The keyword analysis allows both individual keyphrases and keywords to be analysed
- The robot analysis reports are arranged by robot name rather than by robot IP address
- As the Analyzer spider scan progresses, keep the new results visible
- New Analyzer option detects broken links from pages as it analyses
- A summary of the keyword analysis results is added to the general summary
- Detailed report for HTML validation problems within spider site results
- Make sure that the URL entered for site configuration spider scans is correct
- When a URL is selected from the Analyzer URL selector, the scan starts immediately
- If the watch results are sorted by name, those with blank names are placed after the named entries
- An alerting indication has been added to watches that are out of range
- The robot detailed results include a count of the number of robot visits to pages
- A URL's query string can be automatically varied to force non-caching of results (see the sketch after this list)
- The 'in progress' dialog uses different icons to indicate when the scan is complete
- The File menu allows access to the in-built WhoIs utility
- Automatically enable/disable the URL archive limit when its enabling check box is changed
- SitePosition makes sure that a 'Find' result is made visible
- SitePosition filters out duplicate keywords
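The query-string variation mentioned in the list above is a standard cache-busting technique. As a rough sketch only (the parameter name 'nocache' is an illustrative choice, not necessarily what Site Vigil uses), appending a throwaway parameter such as a timestamp makes each request look like a distinct URL to any intermediate cache:

    import time
    import urllib.parse
    import urllib.request

    def fetch_uncached(url, timeout=10):
        # Append a throwaway query-string parameter (a millisecond
        # timestamp here) so caches treat each request as a new URL.
        parts = urllib.parse.urlsplit(url)
        query = urllib.parse.parse_qsl(parts.query)
        query.append(("nocache", str(int(time.time() * 1000))))
        busted = parts._replace(query=urllib.parse.urlencode(query))
        with urllib.request.urlopen(urllib.parse.urlunsplit(busted),
                                    timeout=timeout) as response:
            return response.read()

    # e.g. http://www.example.com/status.html is requested as
    #      http://www.example.com/status.html?nocache=1113996000000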
The following fixes have been applied:
- Maximised window size is maintained when switching between Status and Configure
- On deletion of a monitored site, some configuration could sometimes be left behind
- If no resources were being watched, a default alert scheme could not be associated with general error conditions
- A page that redirects to another page in the web site could be redirected to the wrong folder in spider scans
- If an initial spider page redirects to a different domain, the scan could use the wrong domain for links
- Very occasionally search position scans could fail due to a multi-threading access conflict
- The referral name used by SiteVigil includes license information so accesses can be traced
- If log files are automatically compressed by the server, the log analysis could miss some log records
- If a Spider/browser scan URL contains a specific port specifier, it may not have been passed on to implied internal links (see the sketch after this list)
- The default SitePosition scan speed has been improved
- The keywords analysis results are now reported in the remote viewer
- If the Configuration site tab header was clicked to change the site page, the selection changed even on a validation error
- The HTML validation allows the TARGET attribute on the BASE tag
- The HTML validation allows NOFRAMES to contain body tags
- The remote viewer alert page had an extraneous TABLE tag
- Occasionally SitePosition could crash due to overrun of its scan buffer
- Some search engines could give incomplete results if result formats were mixed
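On the port-specifier fix above: links implied by relative URLs on a page served from a non-default port should resolve against the same host and port. A quick illustration of the expected behaviour, using example URLs only:

    from urllib.parse import urljoin

    # A page served from a non-default port...
    page = "http://www.example.com:8080/products/index.html"

    # ...should have its relative links resolved against the same
    # host and port, which is the behaviour described by the fix.
    print(urljoin(page, "detail.html"))    # http://www.example.com:8080/products/detail.html
    print(urljoin(page, "/contact.html"))  # http://www.example.com:8080/contact.html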