...and it's still hybrid. Or multi-layered, if you prefer. What anti-malware companies (and malware authors, if it comes to that) are constantly doing is revisiting concepts that have worked before so that they fit the current environment better: there's nothing wrong with an evolutionary approach, but changing the terminology doesn't make it revolutionary. So what Larry Seltzer is describing in a recent eWeek article isn't exactly groundbreaking technology: it's what all anti-malware companies originating in traditional AV are doing, to a greater or lesser extent. "File reputation" is pretty much what we used to call integrity checking, and is close to a limited application of whitelisting that's been in common use since the 1990s. The main difference is that whereas earlier incarnations of anti-virus tended to bundle an integrity checker as a separate application along with a known-virus (signature) scanner, it's now common to build whitelisting or a near equivalent into the main application. And in those days, no-one described their own server networks as a cloud. ;-)
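
As a rough illustration of how close "file reputation" sits to old-school integrity checking, here's a minimal sketch of the classic approach: hash every file, store the hashes as a known-good baseline, and flag anything new or changed. The file names, paths and JSON format here are placeholders of mine, not any vendor's actual implementation.

```python
# Minimal sketch of classic integrity checking / limited whitelisting:
# hash files and compare against a stored baseline of known-good hashes.
import hashlib
import json
from pathlib import Path

BASELINE = Path("baseline.json")  # hypothetical known-good hash store


def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def build_baseline(root: Path) -> None:
    """Record a hash for every file under root (the 'known-good' set)."""
    baseline = {str(p): sha256_of(p) for p in root.rglob("*") if p.is_file()}
    BASELINE.write_text(json.dumps(baseline, indent=2))


def check(root: Path) -> None:
    """Flag files that are new or have changed since the baseline was taken."""
    baseline = json.loads(BASELINE.read_text())
    for p in (p for p in root.rglob("*") if p.is_file()):
        recorded = baseline.get(str(p))
        if recorded is None:
            print(f"NOT IN BASELINE: {p}")
        elif recorded != sha256_of(p):
            print(f"MODIFIED: {p}")
```

The modern twist is mostly about where the known-good data lives (on a vendor's servers, i.e. "the cloud") rather than about what the check fundamentally does.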

What's more interesting is Larry's critique of various "classic methods" of malware scanning.

  • Clearly, we aren't about to argue that it's feasible to have a signature for "every new variant": that's a model we moved away from many years ago.
  • I would argue, though, that "true heuristics" is an odd term to use for what we describe as "passive heuristics", where an object is scanned for malicious characteristics statically - that is, by looking at the code without executing it (there's a toy sketch of the idea after this list). (We have a fairly comprehensive, vendor-agnostic review of heuristic analysis on our white papers page, by the way.) The term heuristics has a far wider range of applications than that, even within the anti-malware industry, which uses it in quite a specialized sense. Clearly, there is still a place for this analogue to static analysis, as the term is used in forensics, but it isn't nearly as effective as it used to be, because the bad guys use a variety of obfuscatory techniques to hide malicious code from signature and basic heuristic detection. There's nothing "untrue" about heuristic analysis when it analyses code while it's actually running (an analogue to dynamic analysis): the point Larry seems to have missed is that when products like ours run that code, it's in a protected environment, so (assuming a sound implementation of the product) the code being run shouldn't present a risk to the system.
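
To make the distinction concrete, here's a toy illustration of passive (static) heuristics: score an object for suspicious characteristics without ever executing it. The indicators and weights are invented for this sketch; real engines use far richer features, and, as noted above, packing and obfuscation defeat anything this naive.

```python
# Toy "passive heuristics": statically score a file for suspicious
# characteristics. The indicators and weights below are invented examples.
from pathlib import Path

SUSPICIOUS_MARKERS = {
    b"CreateRemoteThread": 3,  # API name often associated with code injection
    b"VirtualAllocEx": 2,      # ditto: allocating memory in another process
    b"UPX!": 1,                # packer magic; packing alone isn't malicious
}
THRESHOLD = 4                  # arbitrary cut-off for this sketch


def heuristic_score(path: Path) -> int:
    data = path.read_bytes()   # static analysis: the code is read, never run
    return sum(w for marker, w in SUSPICIOUS_MARKERS.items() if marker in data)


def looks_suspicious(path: Path) -> bool:
    return heuristic_score(path) >= THRESHOLD
```

Active heuristics, by contrast, would execute the object in an emulator or other protected environment and score its behaviour: exactly the dynamic analogue described above.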

What interests me most, however, is his yearning for "a simple solution like absolute whitelisting." It does seem that we're always looking for the 100% solution that will render current anti-malware solutions unnecessary, in the way that firewalls, IDS, IPS, reputation services, NAC and a dozen other panaceas du jour were once seen as The Answer. But the fact is that whitelisting itself is hybrid (by which I mean that you can't whitelist an application without using other technologies to confirm that it's what AMTSO like to call "innocent"). And it works best as one layer of a defensive strategy, at any rate in the version of the internet in which we currently find ourselves.
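
To see why even whitelisting ends up hybrid, consider a sketch of what an "allow" decision actually rests on. Each check below stands in for a separate technology (a vetted hash database, code-signing verification, a reputation lookup); none of them alone is "the whitelist", and the policy combining them is entirely my invention for illustration.

```python
# Sketch: a whitelisting "allow" decision leaning on several technologies.
import hashlib
from pathlib import Path

KNOWN_GOOD_HASHES = {
    # hypothetical vetted hash (this one is just the SHA-256 of an empty file)
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}


def hash_is_known_good(path: Path) -> bool:
    return hashlib.sha256(path.read_bytes()).hexdigest() in KNOWN_GOOD_HASHES


def publisher_signature_valid(path: Path) -> bool:
    # Placeholder: a real check verifies a code-signing certificate chain.
    return False


def reputation_service_trusts(path: Path) -> bool:
    # Placeholder: a real check queries a vendor-run ("cloud") service.
    return False


def allow_to_run(path: Path) -> bool:
    # Allow if the exact file is vetted, or if it's both signed and reputable.
    return hash_is_known_good(path) or (
        publisher_signature_valid(path) and reputation_service_trusts(path)
    )
```
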

David Harley 
Director of Malware Intelligence