Recreational Virus Writing


Greetings, my loyal readers. How are you both? Have you noticed that I’ve been uncharacteristically quiet for the past month or two? A combination of sheer overwork (are you listening, boss?), a much-needed holiday, and some fairly serious surgery has prevented me from sharing my prejudices with you. And look at all the things that have been going on… You’re no doubt well aware of the Race to Zero competition that’s on the menu for the Defcon 16 conference in August, where contestants will be given samples of malware and compete to be the first to make enough modifications to get their sample set undetected past a range of AV products. Thus proving that if you take a sample of known malware and spend some time modifying it, eventually you’ll get something that anti-virus products don’t recognize. Well, I bet you never realized that…

There’s no big surprise, either, in an idea like this coming out of Defcon. While the range of people who run and attend the event is wider than you might think, it does attract a good few security fringe people who assume that the anti-malware industry is populated by dummies who know less about malware than anyone else, and will take any opportunity to rub our noses in the fact that we don’t have the 100% cure for malicious code. (Oddly, I don’t see many people complaining that no-one has a 100% cure for other security problems, or, come to that, cancer, crime, or world hunger). You might think, though, that readers of the world’s only significant specialist anti-malware publication might have a more balanced view. However, Virus Bulletin have run a survey on their web site on "Are virus-writing contests a good idea?" and it appears that most respondents don’t see a problem with them. (Of course, people who visit the web site are not necessarily readers of the magazine.) A quote or two from Graham Cluley of Sophos summarizes some of the reasons the anti-virus community objects on principle to malware creation for test purposes.

"Ah," you may say, "but this isn’t a test like that Consumer Reports fiasco: it’s the skills of the contestants that are being tested." Sorry, but it doesn’t work quite like that. It’s inevitable that conclusions will be drawn about individual products according to how well they survived the accelerating modifications thrown their way. At least some of those conclusions will be inappropriate because they’ll assume (1) the competence of the assessment of the malware (2) the incompetence of the anti-virus products.  

Anyway, Graham made some trenchant points, though he might not agree with every nuance of my interpretation:

  • There’s enough malware around already: creating new variants won’t tell us much about anything. (True: neither the "malware for profit" gangs who now dominate the malicious software scene nor the anti-malware specialists are likely to learn much from the efforts of some "up for a laugh" amateurs.)
  • Writing malware isn’t a way of writing improved anti-malware, and if the organizers are really keen to mitigate the problem rather than having a cheap laugh, perhaps they could think about ways of competing to improve defensive measures rather than offensive measures.
  • There’s a safety issue when new malware is created in scenarios like this, though to be honest, instances of malware escaping directly "into the wild" from inept tests (yes, I am going to keep referring to this as a test!) are few and far between. It does seem that even reputable security mavens like Bruce Schneier think we’ve overstated that objection, and few people outside the industry understand our ethical objections. I think perhaps we have, and, even worse, have failed to make clear that there are actually far more pressing practical objections: however, AMTSO will, I honestly believe, be able to make real headway on that problem.

Something he didn’t directly comment on is the fact that a startling 46% of respondents apparently believe that virus-writing contests are a useful way of "highlighting issues with antivirus." Well, I guess that depends on whose attention the "issues" are being brought to. If there are still companies who are totally reliant on malware-specific signature detection, or end users who think that an anti-virus product can protect them from all known and unknown viruses so they don’t have to think about or contribute to their own security, then perhaps they are. However, if they think that this particular contest will add much in the way of instant enhancement to anti-malware technology, I’m afraid I don’t share their optimism.

[3rd bullet point slightly edited for grammar and clarity after the blog was published. Sorry for any confusion I may have generated.]

David Harley, ESET Research Author

  • http://anti-virus-rants.blogspot.com kurt wismer

    i think the contest website makes it quite clear that the point of the contest is not really to test the contestants, but to highlight perceived problems with anti-virus technology…

    it’s a site, after all, that promulgates the “anti-virus is dead” propaganda… they aren’t looking to improve the situation, they’re looking to advance an anti-AV agenda…

  • eidolon sniper


    “There’s enough malware around already: creating new variants won’t tell us much about anything.”

    I think you answered that concern pretty well yourself. The antivirus vendors know how vulnerable they are. The virus writers know how vulnerable they are. Anyone who has the slightest bit of programming knowledge, or anyone who pays attention to these issues, knows how vulnerable antivirus software is. They’re not our concern here; what is, as you’ve said, are the end users who actually buy the spiel that the marketing departments of antivirus companies churn out (I wonder if you viruslab guys ever take the time to read them? Bet you’d have a fit). And I think this contest is going to do that pretty well, and provide some balance to all the “xx times VB100% award winner” and “rated A+++ at AV-Comparatives” crap flying around.

    As this contest is going to tell people, heuristics are not the answer; nope, not even ESET’s heuristics. Instead, we’re going to see antivirus vendors with some real substance rise to the challenge: those who have dedicated analysts and sufficient manpower to provide good turnaround times to the malware.

  • blezer

    VIRTUMONDE just arrived in my system and ESET didn’t find anything!!! S&D + S.A.S + A.A. Lavasoft cleaned up my system, can’t believe it.

  • Randy Abrams

    The contest will do nothing very well except garner attention for undeserved hype-mongering. That, and deceive some users into thinking that defense in depth is not a good strategy. There is no “the answer”. There are parts of the answer. AV is one part. Heuristics is one part. Education is one part. Better software is one part. The contest will do nothing to make end users any safer or better educated, and that really isn’t the point of the contest; it’s all about hype.

    Randy Abrams
    Director of Technical Education
    ESET LLC

  • eidolon sniper

    Mr Randy Abrams,

    Unfortunately, there is no hype involved in the contest. Instead, facts will be presented for what they are, facts that have too long been covered up and swept under the carpet by the marketing departments of antivirus vendors.

    How will users be deceived by being shown that antivirus products are not infallible? A false sense of security does not equate to security. Mr Joe Schmoe needs to know, before he hands his credit card over, that antivirus software is far from a perfect solution, instead of being told that by the tech support department only when he gets infected. Does the fact that almost everyone outside the antivirus industry – the party that stands to lose from this contest – thinks that this contest is a good idea not tell you anything at all?

  • http://www.smallblue-greenworld.co.uk David

    Hey, Kurt.

    You’re right, of course. This is a manifestation of a not-altogether-rational hatred of all things AV, not just on the slashdotty fringes but in some sectors of mainstream security. There’s no point in trying to make those guys love us, but I guess we’ll all keep trying to lessen the impact of out-and-out misinformation.

  • http://www.smallblue-greenworld.co.uk David

    blezer, sorry NOD32 didn’t do it for you on this occasion. It does detect many Virtumonde variants and variations, but as with all highly variable malware, heuristic analysis catches at least as many variations as signature scanning (probably far more, I’d guess). But it can only catch _everything_ if it’s set so aggressively that the false positive risk is higher than many people will tolerate.

  • http://www.smallblue-greenworld.co.uk David

    eidolon sniper,

    there is a gap sometimes between marketing and reality, or the version of reality that researchers work with. That’s inevitable with differing skillsets and knowledge-sets, and when we become aware of an issue in-house, we work with other teams to correct it. Unfortunately, that’s rarely an option when it’s someone else’s hype and misinformation, and that’s what I was most concerned with here.

    “Instead, we’re going to see antivirus vendors with some real substance rise to the challenge: those who have dedicated analysts and sufficient manpower to provide good turnaround times to the malware.”

    If I understand you correctly, you’re advocating improving signature detection by throwing more resources at the problem during the analysis phase. While more resources are always welcome, this can’t fix the glut problem. Even if you can refine your processes so that every incoming sample is turned around immediately (and bear in mind that the more hands you have on deck, the more resources you have to devote to coordination), that will only address a percentage of the _totality_ of new threats. Generic signatures and more advanced heuristics, behaviour analysis and so on _do_ increase the percentage detected. Neither heuristics nor improving signature turnaround are “the answer”. As Randy says, “the answer” is an aggregation of many partial answers, not a single 100% solution.

  • eidolon sniper

    David,

    I’ve seen the accusation of “hype” and “misinformation” thrown around a few times, so I think it might be helpful for someone to step forward and clarify exactly what this hype and misinformation is, and where it can be found.

    Regarding your second point, all I have to say is that there ARE antivirus companies doing exactly that right now. I don’t know how ESET’s manpower resources compare to theirs, but they’re doing it, and they’re producing results. And ultimately, results are what matter – not ideals.

  • Don

    Hi David,
    I want to know: is it very hard to detect banking trojans, Virtumonde, DNSChanger, and Zlob trojans heuristically? I catch many variants every day, but NOD32 misses most of them, detecting only some of them by signature, while some AVs detect most of them by generic detection.

  • David Harley

    I suggest that you check out Randy’s later post for more consideration of the hype and misinformation issue. Though I will say that any event based on the premise that modern AV is purely signature-based and that signature detection has no place in malware management is certainly either poorly informed or intentionally misleading.

    I’m confused, though. Even the Race to Zero organizers appear to be pro-heuristic, though they don’t seem to know much about the technology. Why are you defending the no longer tenable, effectively impractical route of total reliance on signature detection? Even if it were viable, it has nothing to do with the apparent intentions of the contest organizers.

  • David Harley

    Don,

    I can’t comment on your comparative results, as I don’t know how you’re getting them. NOD32 does catch many variants of the Trojans you mention heuristically: clearly, it won’t catch them all, but I don’t think other mainstream AV vendors do either. Test results vary enormously, of course, but it seems to me that our results are generally pretty good. I’m not sure what you mean by generic detection – not your fault, it’s just a term that has a number of potential meanings – but products that err on the side of caution and don’t mind generating some false positives will get nearer to 100% detection. Historically, though, AV customers have tended to resent FPs almost as much as they resent failures to detect real malware.
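
For readers who want to see the detection versus false-positive trade-off discussed in these comments made concrete, here is a minimal toy sketch in Python. Everything in it is invented for illustration: the two-layer detector (a hypothetical fixed-rate signature layer plus a hypothetical threshold-based heuristic layer), the suspicion scores, and all of the numbers. It bears no relation to NOD32 or any real scanning engine; it simply shows why lowering a heuristic threshold pushes detection towards 100% while the false-positive rate climbs, and why no single layer is "the answer".

```python
# A toy, illustrative model only: NOT how NOD32 or any real product works.
# It combines a hypothetical "signature" layer (catches a fixed fraction of
# known malware) with a hypothetical "heuristic" layer (flags anything whose
# suspicion score crosses a threshold), and shows how lowering the threshold
# raises both detection and the false-positive rate. All numbers are invented.

import random

random.seed(42)


def make_samples(n_malware=1000, n_clean=1000):
    """Generate labelled samples with a made-up 'suspicion score';
    malware tends to score higher than clean files, with overlap."""
    malware = [("malware", random.gauss(0.7, 0.15)) for _ in range(n_malware)]
    clean = [("clean", random.gauss(0.3, 0.15)) for _ in range(n_clean)]
    return malware + clean


def evaluate(samples, signature_hit_rate, heuristic_threshold):
    """Return (detection rate, false-positive rate) for the combined layers."""
    tp = fp = 0
    n_malware = sum(1 for label, _ in samples if label == "malware")
    n_clean = len(samples) - n_malware
    for label, score in samples:
        sig_hit = label == "malware" and random.random() < signature_hit_rate
        heur_hit = score >= heuristic_threshold
        if sig_hit or heur_hit:
            if label == "malware":
                tp += 1
            else:
                fp += 1
    return tp / n_malware, fp / n_clean


samples = make_samples()
for threshold in (0.9, 0.6, 0.4, 0.2):
    detection, false_positives = evaluate(samples, signature_hit_rate=0.5,
                                          heuristic_threshold=threshold)
    print(f"threshold={threshold:.1f}  detection={detection:6.1%}  "
          f"false positives={false_positives:6.1%}")
```

Run as-is, it prints detection and false-positive rates for a few thresholds; the point is the direction of the trend (the lower the threshold, the higher the detection and the more false positives), not the specific figures, which are entirely artificial.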
