Further to my "top ten of top tens" post, I was encouraged by some queries to revisit the “Top Ten Mistakes Made When Evaluating Anti-Malware Software” list quoted by Kevin Townsend here. As it was an AMTSO issue and most of the queries have related to an AMTSO blog post, I've returned to it (and
Well, not exactly, though actually a top ten of top tens isn't a bad idea: apparently, top tens usually attract plenty of readers. As do top fives, twenties and so on, though probably not top thirteens. Still, there is a touch of recursion to this post. I got a notification from
Of course, most vendors use in-house testing as a tool for monitoring and improving the capabilities of their own products. However, it’s also increasingly being used as a vehicle for showcasing a company’s AV products in the best possible light.
AMTSO (the Anti-Malware Testing Standards Organization) has published its review analysis of the Endpoint Security Test released by NSS Labs on September 8, 2009. The Review Analysis, published on March 17, 2010, compared the NSS Labs report against AMTSO’s Fundamental Principles of Testing and found that it doesn’t comply with two of the nine AMTSO
Kevin Townsend posted a blog in response to a piece by Mike Rothman at Securosis. Mike’s piece on “The Death of Product Reviews” makes some pretty good points about security product reviews in general. Kevin’s piece is more specific to anti-malware. He too makes some useful discussion points about the value or otherwise
Larry Seltzer posted an interesting item yesterday. The article on "SW Tests Show Problems With AV Detections" is based on an "Analyst's Diary" entry called "On the way to better testing." Kaspersky did something rather interesting, though a little suspect. They created 20 perfectly innocent executable files, then created fake detections for ten of them.
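To make the mechanics of that experiment a little more concrete, here is a minimal, purely illustrative sketch of how one might generate a batch of harmless, uniquely hashed test files. This is not Kaspersky's actual tooling or procedure: the directory name, the use of trivial batch files in place of compiled executables, and the unique-marker approach are all assumptions for illustration. The point is simply that each file is inert and has a distinct hash, so any later "detection" of it can only come from copied verdicts rather than real analysis.

```python
import hashlib
import os
import uuid

# Illustrative sketch only (not Kaspersky's actual method): generate a set of
# harmless files, each with a unique marker so every file has a distinct hash.
OUTPUT_DIR = "innocent_samples"   # hypothetical output directory
NUM_FILES = 20                    # matches the 20 files described in the post

os.makedirs(OUTPUT_DIR, exist_ok=True)

for i in range(NUM_FILES):
    marker = uuid.uuid4().hex                      # unique per-file marker
    body = (
        "@echo off\r\n"
        f"rem harmless test file {i}, marker {marker}\r\n"
        "echo This file intentionally does nothing.\r\n"
    ).encode("ascii")
    path = os.path.join(OUTPUT_DIR, f"innocent_{i:02d}.bat")
    with open(path, "wb") as fh:
        fh.write(body)                             # write the inert file
    print(f"{path}  sha256={hashlib.sha256(body).hexdigest()}")
```

In the scenario the post describes, half of such a set would then be given deliberate "detections" while the other half served as a control, making it easy to see which scanners were echoing someone else's verdicts.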
We have just come across a Buyer’s Guide published in the March 2010 issue of PC Pro Magazine, authored by Darien Graham-Smith, PC Pro’s Technical Editor. The author aims to advise consumers on which anti-malware product is best for them, and we acknowledge that the article includes some good thoughts and advice, but
The Hype-free blog at http://hype-free.blogspot.com/2009/12/congratulation-to-av-comparatives.html yesterday mentioned the latest AV-Comparatives round of test reports, including:
* The whole product dynamic test at http://www.av-comparatives.org/comparativesreviews/dynamic-tests
* The December 2009 performance test at http://www.av-comparatives.org/comparativesreviews/performance-tests
* The summary reports at http://www.av-comparatives.org/comparativesreviews/main-tests/summary-reports
I have a pretty jaundiced view of testing organizations in general: after all, I see some pretty awful tests proclaimed by the
I recently gave a presentation to the Special Interest Group in Software Testing of the BCS Chartered Institute for IT (perhaps still better known as the British Computer Society). The PDF version of the slide deck is now available at: http://www.eset.com/download/whitepapers/Curious_Act_Of_Anti_Malware_Testing.pdf The presentation outlines some of the problems with anti-malware testing and summarizes the mission and principles of