As Mac malware increases in prevalence, testing the security software that supplements OS X's internal security becomes both more important and more difficult.
It has happened before, it has just happened again, and it will happen in the future: it is inevitable. Some company that needs press coverage or public visibility will release yet another statement on how worthless anti-virus is, based on its own dysfunctional test. For this “test”, they used the VirusTotal service. VirusTotal itself makes clear that its service is not designed for comparative testing of anti-malware products.
AMTSO’s discussions of its own new directions, and updates to its testing-related resources.
The slides from an AMTSO-oriented presentation by Larry Bridwell and myself at this year's Virus Bulletin conference, "Daze of Whine and Neuroses (But Testing is FINE)", are now available here on the Virus Bulletin site (along with some other excellent presentations). The paper on which the presentation is based is on the ESET white papers page.
“Test Files and Product Evaluation: the Case for and against Malware Simulation” is a paper presented at the recent AVAR conference by Eddy Willems, Lysa Myers and myself. We were all at the EICAR conference and figured that it was a good moment to combine our experience of testing, EICAR, AMTSO and the anti-malware industry, and to cover the developments that had taken place since Sarah’s paper.
No one believes that AMTSO has all the answers or can “fix” testing all by itself, but it has compiled and generated resources that have made good testing practice far more practicable and understandable. The way for testers (and others) to improve those resources is by talking to and working with AMTSO in a spirit of co-operation: the need for transparency is not going to go away.
Further to my "top ten of top tens" post, I was encouraged by some queries to revisit the “Top Ten Mistakes Made When Evaluating Anti-Malware Software” list quoted by Kevin Townsend here. As it was an AMTSO issue and most of the queries related to an AMTSO blog post, I've returned to it (and
Well, not exactly, though a top ten of top tens isn't actually a bad idea: apparently, top tens usually attract plenty of readers. As do top fives, top twenties and so on, though probably not top thirteens.
Security Memes a Lot to Me
Still, there is a touch of recursion to this post. I got a notification from
Of course, most vendors use in-house testing as a tool for monitoring and improving the capabilities of their own products. However, it’s also being used increasingly as a vehicle for showcasing a company’s own AV products in the best possible light.
AMTSO (the Anti-Malware Testing Standards Organization) has published its review analysis of the Endpoint Security Test published by NSS Labs on September 8, 2009. The Review Analysis, published on March 17, 2010, compared the NSS Labs report to AMTSO’s Fundamental Principles of Testing and found that the report doesn’t comply with two of the nine Fundamental Principles.
Kevin Townsend posted a blog in response to a piece by Mike Rothman at Securosis. Mike’s piece on “The Death of Product Reviews” makes some pretty good points about security product reviews in general. Kevin’s piece is more specific to anti-malware. He too makes some useful discussion points about the value or otherwise
Larry Seltzer posted an interesting item yesterday. His article, "SW Tests Show Problems With AV Detections", is based on an "Analyst's Diary" entry called "On the way to better testing." Kaspersky did something rather interesting, though a little suspect: they created 20 perfectly innocent executable files, then created fake detections for ten of them.
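To make the shape of that experiment concrete, here is a minimal sketch (not Kaspersky's actual harness; the file names, the "MZ"-stub contents and the helper function are all illustrative assumptions) of generating twenty unique, harmless files and splitting them into the ten that would receive fake detections and the ten left as controls. Any scanner that later flags files from the first group has copied a verdict rather than analysed the file.

```python
import hashlib
import os
import tempfile

def make_innocent_samples(directory: str, count: int = 20) -> dict:
    """Create `count` distinct, harmless files and return {path: sha256}."""
    hashes = {}
    for i in range(count):
        path = os.path.join(directory, "innocent_%02d.exe" % i)
        # An "MZ" prefix plus unique padding makes each file distinct
        # while containing no executable logic at all.
        data = b"MZ" + ("harmless sample %d" % i).encode() + os.urandom(16)
        with open(path, "wb") as f:
            f.write(data)
        hashes[path] = hashlib.sha256(data).hexdigest()
    return hashes

with tempfile.TemporaryDirectory() as tmp:
    samples = make_innocent_samples(tmp)
    items = sorted(samples.items())
    flagged = dict(items[:10])   # the ten files given fake detections
    control = dict(items[10:])   # the ten files left unflagged
    print(len(flagged), len(control))  # -> 10 10
```

The point of keeping a control group is that it distinguishes verdict-copying from coincidence: detections should appear (if at all) only on the flagged set.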
We have just come across a Buyer’s Guide published in the March 2010 issue of PC Pro Magazine, authored by Darien Graham-Smith, PC Pro’s Technical Editor. The author aims to advise consumers on which anti-malware product is best for them, and we acknowledge that the article includes some good thoughts and advice, but