...and for once we're not one of the vendors getting hammered.

Secunia, a Danish company that sends out security notifications, has announced that it has tested a dozen security suites. Interestingly, Secunia used a number of exploits developed in-house for analysing vulnerabilities, rather than the sort of malware-sample-based testing that we're more used to discussing here. Secunia states that it chose to test "high-end" product suites marketed as "comprehensive Internet Security Suites" because their marketing gives the impression that users are "fully protected against all Internet threats."

I don't believe that it's altogether appropriate to blame vendor marketing for this impression. I'd agree that it is iniquitous to market any product in terms of "this is the only security measure you'll ever need, and you can do anything you like and click on anything you like, because we will fully protect you from yourself as well as from the bad guys": technology cannot protect anyone from everything, even someone who behaves responsibly on-line. However, I think you'll find that many researchers within the industry are of the same opinion. Certainly Graham Cluley was quoted by John Leyden of The Register as agreeing that patching is important and that there's no such thing as "the perfect security suite", while also remarking that it sounded as though the products tested would probably perform better in a more "real-life test".

However, Leyden - never the anti-malware industry's number one fan - clearly feels that a point has been made about the incompetence of this product sector, and has made some cutting remarks about chocolate teapots. Thomas Kristensen, of Secunia, was quoted as saying that "generic detection of exploits would be a better approach". Apart from the fact that it seems distinctly archaic to assume that a modern anti-malware program (let alone a suite) is wholly dependent upon signature scanning, this comment does bring to mind the old saw (pun intentional) that to a man with a hammer, every problem looks like a nail. It's to be expected that Secunia would focus on exploits, since that's their specialty. And it's perfectly true that there are classes of threat that can be forestalled generically (we do quite a lot of that ourselves...), but an awful lot of threats are not exploits of unpatched vulnerabilities, and it's misleading to focus purely on that approach as if it were The Answer(TM).
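To unpack that a little: a signature looks for an exact byte sequence taken from one known sample, while a generic check flags the structural anomaly that a whole class of exploits has to share. Here's a deliberately toy-sized sketch of the difference in Python, using the well-known animated cursor (ANI) vulnerability as the example; the "signature" bytes are invented for illustration, and real products are a great deal more sophisticated than this.

    import struct

    # Hypothetical signature: an exact byte string lifted from one sample.
    KNOWN_SAMPLE_SIGNATURE = b"\xde\xad\xbe\xef\x00anih"

    def signature_match(data: bytes) -> bool:
        # Catches only files containing this exact sequence; trivially
        # evaded by changing a single byte of the sample.
        return KNOWN_SAMPLE_SIGNATURE in data

    def generic_ani_check(data: bytes) -> bool:
        # Flags any animated-cursor file with an "anih" chunk whose
        # declared length differs from the 36 bytes the format specifies:
        # the structural oddity the ANI exploit class depended on.
        if data[:4] != b"RIFF" or b"ACON" not in data[:16]:
            return False
        pos = data.find(b"anih")
        while pos != -1:
            if pos + 8 <= len(data):
                (chunk_len,) = struct.unpack("<I", data[pos + 4:pos + 8])
                if chunk_len != 36:
                    return True
            pos = data.find(b"anih", pos + 4)
        return False

The second check fires on variants the signature has never seen, which is precisely the appeal of generic detection; but, as noted above, a great many threats aren't vulnerability exploits at all, and offer no such structural hook.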

In other forums, there's some disagreement about whether there was an element of dynamic analysis present in this test; we await further clarification from Secunia on that point. I have the permission of Andreas Marx, a very experienced tester and one of the leading lights of AV-Test, to quote him at length. (Thanks, Andreas!)

Andreas believes that there are a number of problems with the test, at least as described by Secunia:

  • some critical details are missing, for example, the time of the last update of the scanners, the exact product versions, and the like
  • only the on-demand scanner and the on-access guard were tested, so all that was checked was whether the file scanner would trigger an alert
  • the paper also speaks about a test with HTML/web pages, but there doesn't seem to be a single test case for that part in the review (is it missing, or was it excluded?)

He believes that a critical point here is that scanning "all data files for possible exploits...would slow down the scan speed dramatically." He states, quite rightly, that most companies have focused on common file-based exploits like the ANI exploits. (In a perfect world, all vulnerable systems would have been patched for these long ago, of course.) He goes on to point out that there are many other approaches to proactively preventing exploits that don't seem to have been tested here, such as:

  • filtering known "bad" URLs
  • browser exploit filtering
  • buffer, stack or heap overflow protection
  • HIPS, IDS or personal firewall components
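To make the first of those bullet points concrete, here's a minimal sketch of known-bad-URL filtering in Python. The blocklisted hostnames and the matching rule are invented for illustration; a real implementation would sit in the network or browser path and draw on a large, continuously updated reputation feed.

    from urllib.parse import urlparse

    # Hypothetical blocklist of hosts known to serve exploit code.
    KNOWN_BAD_HOSTS = {"exploit.example", "drive-by.example"}

    def is_blocked(url: str) -> bool:
        # Block the URL if its host is listed, or is a subdomain of a
        # listed host (so evil.exploit.example is caught as well).
        host = (urlparse(url).hostname or "").lower()
        return any(host == bad or host.endswith("." + bad)
                   for bad in KNOWN_BAD_HOSTS)

    for url in ("http://exploit.example/cursor.ani", "http://www.example.com/"):
        print("BLOCK" if is_blocked(url) else "allow", url)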

He concludes that "a better test setup would...have the vulnerable applications installed on the test PC, together with the security suite...Then the tester would need to trigger the exploit, and see whether the machine was exploited successfully or not...really focusing on the entire suites' features and not only on the 'traditional' scanner part of an AV product."
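For what it's worth, the test loop Andreas describes might look something like the sketch below. Every name in it - the machine controller, the compromise check - is a placeholder standing in for real VM snapshot/revert machinery and real forensic inspection, not any actual testing framework's API.

    from dataclasses import dataclass

    @dataclass
    class ExploitCase:
        name: str     # e.g. "animated cursor exploit"
        trigger: str  # how the exploit is launched inside the guest

    class TestMachine:
        # Stand-in for a real VM controller. A real harness would restore
        # a snapshot with the vulnerable application *and* the fully
        # updated security suite installed, execute the trigger inside
        # the guest, then inspect the guest for dropped files, new
        # processes, registry changes, unexpected connections, and so on.
        def revert_to_clean_snapshot(self) -> None:
            self.compromised = False

        def run(self, trigger: str) -> None:
            pass  # placeholder: open the crafted file / visit the page

        def shows_signs_of_compromise(self) -> bool:
            return self.compromised

    def test_suite(machine: TestMachine, cases: list[ExploitCase]) -> dict[str, bool]:
        # Returns {case name: True if the suite prevented compromise}.
        results = {}
        for case in cases:
            machine.revert_to_clean_snapshot()
            machine.run(case.trigger)
            results[case.name] = not machine.shows_signs_of_compromise()
        return results

    print(test_suite(TestMachine(), [ExploitCase("ANI exploit", "open evil.ani")]))

The essential point is in the last function: success is measured by whether the machine ends up compromised, not by which layer of the suite - scanner, HIPS, firewall, URL filter - did the blocking.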

This, of course, is an approach much closer to the dynamic testing models towards which the better testing organizations are moving, as providing a fairer assessment of the capabilities of the products tested. I'm hoping that Secunia will respond to the various requests for clarification of their methodology, and in particular to the question of whether it aligns with AMTSO recommendations for good testing practice.

David Harley
Malware Intelligence Team