False Positives: A Round of Applause…

The anti-malware industry isn't a suitable environment for the thin-skinned. We get used to receiving "more kicks than ha'pence" (see http://www.virusbtn.com/spambulletin/archive/2006/11/vb200611-OK).

In particular, I've grown accustomed to the fact that many people expect all the following from an AV product:

  • Absolute protection
  • Absolute convenience
  • Absolutely no false positives
  • Absolutely no charge

False positives (FPs) are not a minor issue: my experience is that many people (especially in corporate environments) are more infuriated by an FP than by a false negative (i.e. where security software fails to detect real malware).

So it was a pleasant surprise to come across a blog post that was actually rather positive, in a backhanded sort of way, from a source not usually associated with fulsome praise of the AV industry. The author states: "…while I rant quite a bit about the AV industry, you have to give this one to them: the number of false positives is really low. For example, in the AV-Comparatives test 20 false positives is considered many, even though the collection is over 1 500 000 samples (so the acceptable FP rate is below 0.0015%!)." [Sadly, this calculation doesn't altogether make sense, either to us or to AV-Comparatives. It appears to divide the number of "acceptable" false positives by the size of the AV-Comparatives malware test set, whereas false positives are measured against the clean set: AV-C don't publish the size of that set, precisely because they consider FP ratios expressed as percentages to be potentially misleading.]
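For those who like to see the arithmetic, here is a minimal sketch (in Python) of where the quoted figure appears to come from, and why the denominator is the wrong one. The sample and FP counts are taken from the quote above; everything else is illustrative.

    # A sketch of the calculation quoted above, using only figures from the
    # quote. Note that this divides by the malware set, whereas FPs are
    # measured against the clean set, whose size AV-Comparatives don't publish.
    malware_set_size = 1_500_000  # samples in the AV-Comparatives malware set
    fp_threshold = 20             # number of FPs considered "many" in the test

    naive_rate = fp_threshold / malware_set_size
    print(f"naive 'FP rate': {naive_rate:.4%}")  # 0.0013%, close to the quoted 0.0015%

    # A meaningful FP rate would be fp_threshold / clean_set_size,
    # which cannot be computed from the published information.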

The blog isn't actually about us, I should make clear: it's about the importance of false positives in intrusion detection, an area with just enough methodology and terminology shared with anti-malware to be confusing – for instance, both technologies talk about FPs and signatures, but often mean something slightly different by them. This led me to another blog that makes a useful distinction between "real" false positives – misdetection of innocent objects as malicious – and "noise" such as "inappropriate traffic that is legitimately detected that we just don't care about." Because modern anti-malware has a much wider remit than traditional virus detection, that is a concern we share with IDS and IPS vendors; indeed, there's a direct overlap, since most AV products now include some IPS (Intrusion Prevention System) functionality. But it also has an analogue in more traditional anti-malware detection, which tends to cover "noise" programs such as adware and "possibly unwanted" code as well as out-and-out malware.

Then there's the miscellaneous detritus that may be left behind when malware is removed: it may be harmless in itself, but what happens when two security programs are used on the same system and one is more cautious about what it removes than the other? Which is "correct" is a subjective judgement rather than a right/wrong issue.

To take another common case, no-one wants to see those misidentifications of innocent system files as malicious that affect even the most cautious products from time to time, but what about files that are detected as malicious because they're packed with a run-time packer that's often (but not necessarily always) used by malware authors? Well, that's not an issue I'm going to be able to give a short and simple answer to here. But I will say this (and often have before): given the complexity of the malware problem, not to mention the sheer volume of samples, the wonder is that mainstream anti-malware generates so few FPs. And it's a real pleasure to find someone agreeing on that point who can't be accused of undue bias in favour of this industry.

By the way, these blogs reminded me of an old but still relevant paper on "The Base-Rate Fallacy and its Implications for the Difficulty of Intrusion Detection" by Stefan Axelsson. If you're interested in the false positive conundrum and the words "Bayes' Theorem" don't strike terror into your heart, you might find it well worth reading.
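To illustrate Axelsson's point with a toy calculation (the prevalence and rates below are my own illustrative assumptions, not figures from the paper): even a detector with a very low per-event false positive rate will produce mostly false alarms when genuine intrusions are rare among the events it examines.

    # A minimal illustration of the base-rate fallacy in intrusion detection.
    # All figures are assumptions chosen for the example, not measured data.
    p_intrusion = 2e-5               # assumed prevalence: 1 intrusion per 50,000 events
    p_alarm_given_intrusion = 0.99   # assumed true positive (detection) rate
    p_alarm_given_benign = 0.001     # assumed false positive rate (0.1%)

    # Total probability of an alarm on any given event
    p_alarm = (p_alarm_given_intrusion * p_intrusion
               + p_alarm_given_benign * (1 - p_intrusion))

    # Bayes' Theorem: probability that an alarm signals a real intrusion
    p_intrusion_given_alarm = p_alarm_given_intrusion * p_intrusion / p_alarm
    print(f"P(intrusion | alarm) = {p_intrusion_given_alarm:.1%}")  # about 1.9%

On these assumptions, fewer than one alarm in fifty corresponds to a real intrusion, despite the 0.1% FP rate: the rarity of the condition, not the quality of the detector, dominates the outcome.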


David Harley BA CISSP FBCS CITP
Director of Malware Intelligence

ESET Threatblog (TinyURL with preview enabled): http://preview.tinyurl.com/esetblog
ESET Threatblog notifications on Twitter: http://twitter.com/esetresearch
ESET White Papers Page: http://www.eset.com/download/whitepapers.php

Securing Our eCity community initiative: http://www.securingourecity.org/


  • http://hype-free.blogspot.com/ Cd-MaN

    Thank you for continuing to read my blog. It is both an encouraging and a slightly frightening situation :-)

    You were of course 100% right: I was comparing apples to oranges. I’ve updated the post to reflect the criticism and came up with a new number, which is completely pulled out of my rear :-) The argument goes as follows: the Bit9 Global Software Registry claims to have more than 6 000 000 000 files. If we consider that they have “all” the clean executables (which of course they don’t, but let’s suppose it for the moment) and AV-Comparatives uses 0.5% of them for FP testing, 20 files still weigh in at just 0.003%.

    I hope that I managed to drive home the idea that current “AV” solutions have a very low FP rate, which inspires trust in users (popping up frequently and claiming to have found malware in applications which the user is convinced are clean can very quickly erode the trust in a product!).

    Thank you again and I hope that you will continue to follow my blog.

  • http://www.eset.com/threat-center/blog/ David Harley

    We are watching you. Be afraid, be very afraid. :-D

    Actual figures aside, your essential point is very well taken. And yes, I’ve been following your blog for a while, and intend to continue: you blog some pretty interesting stuff.
