
Andreas Marx has published a new set of tests of antivirus products.

From Andreas:

The number of unique malware samples received by AV-Test.org increased from 333,000 in 2005 to 972,000 in 2006, and reached 5,490,000 in 2007. During January and February 2008 alone, we found more than 1.1 million samples spreading on the internet.

Therefore, we thought it would be a good idea to start a new test of anti-malware software to see how well the tools are currently performing, given the masses of malware “in the wild”. All products were tested in the best available 2008 security suite editions, in English (this includes AVG Internet Security 8.0 and ESET Smart Security). The tools were last updated on March 1, 2008 and tested on Windows XP SP2 (English).

A comprehensive review should not concentrate only on the detection scores of the on-demand scanner, as this would give the user a misleading and limited view of a product’s capabilities. When comparing the safety of cars, we would not focus only on the seat belts, but would also check the ABS (anti-lock braking system), one or more airbags, the crumple zones, the ESP (Electronic Stability Program) as well as structural design and many other features which make a car safe. The different detection types have to be taken together to make a valid statement about the detection mechanisms as a whole: neither static nor proactive detection mechanisms alone can catch all malware.

It is important to have good heuristics, generic signatures and dynamic detection and prevention in place to be able to handle new, unknown malware without any updates. It is crucial to have good response times, to be able to react to new malware when proactive mechanisms fail to detect it. And it is essential to have good static detection rates, to be able to handle already-known malware before it is ever executed on a system. Comparing single features therefore makes little sense: a user has not bought an AV product merely to find some viruses and report them; he has actually bought a service to keep his system malware-free.

You also do not necessarily need to shop for a new product if the tool you are currently using has limitations in certain categories. For example, if you have a very fast PC, the slow-down caused by a multi-engine product might be less noticeable. If the proactive detection is not so good, you have to update your scanner more frequently, and you may want to add a behavior-based product such as Norton AntiBot. If your scanner is not good at catching the ad- and spyware used in our test, you might consider a dedicated anti-spyware application. If detection of active rootkits is weak, you might want to use specialized anti-rootkit detection and removal tools like GMER. However, not all stand-alone products work properly together, so an integrated security suite from one vendor might fit best for users who are not currently running an anti-virus tool, or who want to buy a new one because the license for the current one will expire soon.

For the actual testing, we first checked the signature-based on-demand detection of all products against more than 1.1 million inactive samples which we found spreading, or which were distributed, during the last two months; this means we have not used any “historic” samples. We included all malware categories in the test: Trojan horses, backdoors, bots, worms and viruses. Instead of just presenting the results, we categorized the products this time, from “very good” (++) if the scanner detected more than 98% of the samples, to “poor” (–) when less than 85% of the malware was detected. (Ed: For the US version, I have changed this to letter grades: A, B, C, etc.)
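(Ed: To make the ranking scheme concrete, here is a minimal sketch in Python. Only the two published thresholds are known, more than 98% for “very good” and less than 85% for “poor”; the intermediate bands are not specified in the article and are deliberately left open.)

```
# Minimal sketch of the published ranking thresholds. The intermediate
# bands are NOT specified in the article and are left open here.

def rate_detection(percent_detected: float) -> str:
    """Map an on-demand detection rate to the test's rating symbols."""
    if percent_detected > 98.0:
        return "++ (very good)"  # roughly an "A" in the US letter-grade version
    if percent_detected < 85.0:
        return "- (poor)"
    return "between the published thresholds (bands not specified)"

for rate in (99.2, 91.0, 80.5):
    print(f"{rate:5.1f}% -> {rate_detection(rate)}")
```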

Not only malware (intentionally malicious software) poses a threat to the user; possibly unwanted applications like ad- and spyware also have to be detected. A collection of more than 80,000 inactive samples was used for this test, with the same ranking criteria as for the malware detection rates. While we tested security suites, we want to emphasize that free (personal) editions of AntiVir and AVG exist which offer only very limited ad- and spyware detection rates (less than 15%).

In addition, we checked the number of false positives each product generated during a scan of 100,000 known-clean files. This includes common files from different Microsoft Windows and Office versions as well as other well-known products and drivers. Only suites with no false positives received a “very good” (++) rating.

All products require quite some resources (including, but not limited to, memory and CPU power) on the installed system. It is important that the slow-down caused by a security suite is not too severe, because an annoyed user might simply deactivate the virus guard and leave his system unprotected. Products with more than one scanning engine usually perform slower than tools with just one engine, so a good trade-off between the required scanning time and the detection rates is important.
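(Ed: One rough way to quantify the scan-time side of this trade-off is simply to time an on-demand scan of a fixed test set, as in the sketch below. The scanner command line is a placeholder; every product ships its own CLI.)

```
import subprocess
import time

# Placeholder command line; substitute the real on-demand scanner CLI.
SCAN_COMMAND = ["scanner.exe", "/scan", r"C:\testset"]

start = time.perf_counter()
# check=False: many scanners signal "malware found" via a nonzero exit code.
subprocess.run(SCAN_COMMAND, check=False)
elapsed = time.perf_counter() - start

print(f"scan of the test set took {elapsed:.1f} s")
```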

For the proactive detection category, we focused not only on signature- and heuristic-based proactive detection (based on a retrospective test approach with a one-week-old scanner); we also checked the quality of the included behavior-based guard (e.g. DeepGuard in F-Secure, SONAR in Norton/Symantec products and TruPrevent in Panda). We used 3,500 samples for the retrospective test, as well as 20 active samples for the test of the “Dynamic Detection” (and blocking) of malware.
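(Ed: The retrospective idea can be expressed compactly: freeze the scanner's signatures at some date, then test it only against samples that first appeared after the freeze, so any detection must come from heuristics or generic signatures rather than exact signatures. A toy sketch, with illustrative dates and sample records:)

```
from datetime import date, timedelta

signature_freeze = date(2008, 2, 23)   # scanner last updated here
test_window = timedelta(days=7)        # "one week old scanner"

samples = [
    {"name": "sample_a", "first_seen": date(2008, 2, 25)},
    {"name": "sample_b", "first_seen": date(2008, 2, 20)},  # predates the freeze
    {"name": "sample_c", "first_seen": date(2008, 2, 28)},
]

# Only samples first seen after the freeze (and inside the window)
# qualify for the retrospective set.
retrospective_set = [
    s for s in samples
    if signature_freeze < s["first_seen"] <= signature_freeze + test_window
]

print([s["name"] for s in retrospective_set])  # ['sample_a', 'sample_c']
```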

Furthermore, we checked how long AV companies usually need to react to new, widespread malware (read: outbreaks), based on 55 different samples from the entire year 2007 and 3 samples seen in 2008. “Very good” (++) AV product developers should be able to react within two hours; we found a reaction time of more than 8 hours unacceptable and therefore “very poor” (–).
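(Ed: As with the detection rates, only the endpoints of this scale are published; the sketch below encodes just those two thresholds and leaves the middle bands open.)

```
def rate_response(hours_to_signature: float) -> str:
    """Map an outbreak reaction time to the published rating endpoints."""
    if hours_to_signature < 2.0:
        return "++ (very good)"
    if hours_to_signature > 8.0:
        return "- (very poor)"
    return "between the published thresholds (bands not specified)"

print(rate_response(1.5))   # ++ (very good)
print(rate_response(12.0))  # - (very poor)
```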

Another interesting test was the detection of active rootkit samples. While it is trivial for a scanner to detect an inactive rootkit using a signature, it can be really tricky to detect such nasty malware once it is active and hidden. We checked the scanners’ detection against 12 active rootkits.
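(Ed: Why is the active case so much harder? A common technique in anti-rootkit tools is “cross-view” detection: compare what the high-level API reports with what a lower-level view of the same data shows, and flag anything visible below but hidden above. The sketch below is purely schematic; both view functions are stubs standing in for an API-level and a raw-disk directory listing.)

```
# Schematic cross-view detection: objects present in the low-level view
# but missing from the API view have been hidden by something.

def api_view(path: str) -> set[str]:
    """Stub: file names as reported by the normal Windows API."""
    return {"notepad.exe", "report.doc"}

def raw_view(path: str) -> set[str]:
    """Stub: file names recovered by parsing the volume directly."""
    return {"notepad.exe", "report.doc", "rootkit.sys"}

hidden = raw_view(r"C:\Windows\system32") - api_view(r"C:\Windows\system32")
print(f"hidden objects: {sorted(hidden)}")  # ['rootkit.sys']
```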

Detection is only one point; removal and remediation are extremely important, too. It is usually not desirable to reinstall and set up a system after an infection has been detected, since this costs time, which in turn costs money. Therefore, we checked whether the security software was able to scan for and remove 20 active malware samples from the system: cleaning all files (or deleting the components), repairing the registry traces and undoing the changes to the ‘hosts’ file.
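(Ed: To make one piece of that remediation check concrete, the sketch below verifies that a Windows ‘hosts’ file has been restored to a sane state. It assumes the XP-era default, where the only live entry maps 127.0.0.1 to localhost; real cleanup verification would of course also cover files, processes and registry keys.)

```
import os

HOSTS_PATH = os.path.join(
    os.environ.get("SystemRoot", r"C:\Windows"),
    "System32", "drivers", "etc", "hosts",
)

def suspicious_hosts_entries(path: str = HOSTS_PATH) -> list[str]:
    """Return live hosts entries other than the default localhost line."""
    entries = []
    with open(path, encoding="ascii", errors="replace") as fh:
        for raw in fh:
            line = raw.split("#", 1)[0].strip()  # drop comments
            if not line:
                continue
            fields = line.split()
            ip, names = fields[0], fields[1:]
            # A clean XP-era hosts file maps only 'localhost'. Anything else
            # deserves a closer look, including security sites redirected
            # to 127.0.0.1 (a common malware trick).
            if ip == "127.0.0.1" and names == ["localhost"]:
                continue
            entries.append(line)
    return entries

leftovers = suspicious_hosts_entries()
print("hosts file clean" if not leftovers else leftovers)
```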

To get a more comprehensive impression of the products, one should not only look at this test, but also compare the results of various tests, the products’ performance over time, and their ongoing development. We have not reviewed more “subjective” criteria such as usability, support, (online) backup features and the like. We would therefore suggest trying these features with a trial version, which is usually available as a web download from the vendor’s website, before buying a security suite.

I have put the results on my site in a number of different ways:

My version, which I believe is simpler for American readers, uses a letter grading system. Grades here, spyware/adware tests here, malware detections here (HTML). Excel spreadsheet here.

Andreas’ original spreadsheet is here.

Alex Eckelberry