
As always, good stuff from Andreas Marx:

We have just finished a new comparison test of AV software. All products (in the “best” available Security Suite edition) were last updated on January 7, 2008 and tested on Windows XP SP2 (English).

First, we checked the signature-based on-demand detection of all products against more than 1 million samples we found spreading in the wild or being distributed during the last six months (that is, we did not use any “historic” samples). We included all malware categories in the test: Trojan horses, backdoors, bots, worms, and viruses. Instead of just presenting the raw results, we have ranked the products this time, from “very good” (++) if the scanner detected more than 98% of the samples to “poor” (–) if it detected less than 85% of the malware.

Second, we checked the number of false positives the products generated during a scan of 65,000 known clean files. Only products with no false positives received a “very good” (++) rating.
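As a rough sketch, the two rating rules quoted above could be expressed as follows. Note that only the “more than 98% → ++”, “less than 85% → –”, and “zero false positives → ++” thresholds are actually stated in the post; the intermediate bands below are placeholders, not AV-Test’s real scale.

```python
# Hypothetical sketch of the stated rating thresholds. The intermediate
# bands are NOT specified in the post and are placeholders only.

def rank_detection(rate):
    """Map an on-demand detection rate (0.0-1.0) to a rating symbol."""
    if rate > 0.98:
        return "++"  # "very good": more than 98% of samples detected
    if rate < 0.85:
        return "--"  # "poor": less than 85% of samples detected
    return "(intermediate band, not specified)"

def rank_false_positives(fp_count):
    """Only a completely clean run earns a "very good" rating."""
    if fp_count == 0:
        return "++"
    return "(lower rating, scale not specified)"

print(rank_detection(0.991))    # "++"
print(rank_false_positives(0))  # "++"
```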

For the proactive detection category, we did not focus solely on signature- and heuristic-based proactive detection (based on a retrospective test approach with a one-week-old scanner).

We also checked the quality of each product’s behavior-based guard (e.g., DeepGuard in the case of F-Secure and TruPrevent in the case of Panda). We used 3,500 samples for the retrospective test, as well as 20 active samples to test the “Dynamic Detection” (and blocking) of malware.

Furthermore, we checked how long AV companies usually need to react to new, widespread malware (read: outbreaks), based on 55 different samples from the entire year 2007. “Very good” (++) AV product developers should be able to react in less than two hours.
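The outbreak-response criterion can be sketched the same way; again, only the two-hour “very good” threshold is given in the post, so anything slower is left unrated in this illustration.

```python
# Hypothetical sketch: only the "< 2 hours -> ++" threshold is stated
# in the post; slower response bands are not specified.

def rank_reaction(hours_to_update):
    """Map an outbreak reaction time (in hours) to a rating symbol."""
    if hours_to_update < 2:
        return "++"  # "very good": update shipped within two hours
    return "(slower band, not specified)"

print(rank_reaction(1.5))  # "++"
```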

Another interesting test was the detection of active rootkit samples. While it’s trivial for a scanner to detect inactive rootkits using a signature, it can be really tricky to detect this nasty malware when it is active and hidden. We checked each scanner’s detection against 12 active rootkits.

Having such a multi-faceted test methodology is important — an antivirus engine could, for example, have extraordinarily high detection but also high false positives. And a retrospective test shows how well an antivirus product’s heuristics work. It’s good to look at all of these parameters in order to judge efficacy.

I’ve put the test results into PDF format. You can see the main results here and the details of the signature detection test here.

Alex Eckelberry