Cascading False Positives

Security researchers collaborate and share information in many ways and in many contexts that aren't constrained by company boundaries, but it's unusual for researchers working for different vendors to join forces in a company blog.

However, John Leyden of The Register contacted us both when he was writing an article on the controversy following Kaspersky Lab's dramatic demonstration of the way in which false positives can cascade from one vendor to another. This is a major issue, because it can and does introduce a serious bias into comparative detection testing and analysis. After responding to John's questions, we continued the discussion by email and found that we (along with most of the AV industry) were in agreement on all major points, and decided that it was more important to clarify those points than to continue debating the detail of the demonstration.

The fact that the demonstration used Virus Total as a channel for cascading the "artificial" false positives to other vendors should not be seen as in any way detrimental to Virus Total. Hispasec have never endorsed the use of the service as a substitute for comparative testing or for sample validation, either of which is very likely to generate misleading results.

Multiple scanners are not in themselves the problem, whether they're hosted on public sites, specialist resources, or used by testers or anti-malware companies in-house. As tools for comparative analysis or precursors to more detailed analysis, they have a great deal of value. However, that value depends on the user's knowledge and understanding of how to make the most appropriate use of them.

Mainstream testers and security vendors have an extensive understanding of these issues; however, many tests do not take them sufficiently into account. The Kaspersky Lab experiment did at least bring the issue to the attention of some of the press and publishers who most need to be aware of it, and who would probably have taken far less notice of a less controversial presentation.

As supporters of AMTSO, the Anti-Malware Testing Standards Organization, we are in emphatic agreement that a move away from static testing and toward dynamic testing is a positive direction. We hope that more reviewers now appreciate that dynamic testing with small but properly validated sample sets offers a more realistic assessment of detection capability with less risk of unintended bias. If more people realized this, it would allow vendors to spend more time on real threats and less on making sure they detect samples that shouldn't be included in a test set.

David Harley, ESET Research Fellow & Director of Malware Intelligence
Magnus Kalkuhl, Senior Virus Analyst, Kaspersky Lab

Author David Harley, ESET

  • Laurence

    The link to the Register article is broken due to an extra character at the end: "]"
     
    You don't need to allow my comment, just wanted to help!

    • http://www.smallblue-greenworld.co.uk David Harley

      Thank you, Laurence. Your help is appreciated!
