Malware simulators are not an appropriate basis for testing product detection capabilities.
Dynamic or on-access Mac testing of AV products is problematic with samples for which Apple has implemented signature detection.
As Mac malware increases in prevalence, testing security software that supplements OS X's internal security becomes both more important and more difficult.
It has happened before, it just happened again, and it will happen in the future. It is inevitable! Some company that needs press coverage or public visibility will release yet another statement on how worthless anti-virus is, based on its own dysfunctional test. For this “test”, they used the VirusTotal service...
AMTSO's discussions on its own new directions, and updates to its testing-related resources.
A new conference paper discusses whether AMTSO has the credibility to achieve its aims of raising testing standards on its own.
The paper by Julio Canto and myself on the use and misuse of multi-scanner malware-checking resources like VirusTotal is now available.
The slides from an AMTSO-oriented presentation by Larry Bridwell and myself at this year's Virus Bulletin conference, on "Daze of whine and neuroses (but testing is FINE)", are now available here on the Virus Bulletin site (along with some other excellent presentations). The paper on which the presentation is based is on the ESET white papers page.
Aryeh Goretsky interviewed, as his paper on Possibly Unwanted Applications is published.
...AMTSO's members have approved a document that offers guidelines to vendors on ways in which they can make it easier to test products accurately....
Summary of and link to an AVAR paper addressing some of the pitfalls of using malware simulation in product testing.
By kind permission of Virus Bulletin, we've already put two of the papers written or co-authored by ESET researchers up on the White Papers page.
No-one believes that AMTSO has all the answers and can “fix” testing all by itself, but it has compiled and generated resources that have made good testing practice far more practicable and understandable. The way for testers (and others) to improve those resources is by talking to and working with AMTSO in a spirit of co-operation: the need for transparency is not going to go away.
Further to my "top ten of top tens" post, I was encouraged by some queries to revisit the “Top Ten Mistakes Made When Evaluating Anti-Malware Software” list quoted by Kevin Townsend here. As it was an AMTSO issue and most of the queries have related to an AMTSO blog post, I've returned to it (and...
Well, not exactly, though actually a top ten of top tens isn't a bad idea: apparently, top tens usually attract plenty of readers. As do top fives, twenties, etc., though probably not top thirteens. Still, there is a touch of recursion to this post. I got a notification from...
Kevin Townsend asks whether AMTSO (the Anti-Malware Testing Standards Organization) is "a serious attempt to clean up anti-malware testing; or just a great big con?" I posted a lengthy response to that on the AMTSO blog here...
Of course, most vendors use in-house testing as a tool for monitoring and improving the capabilities of their own products. However, it’s also being used increasingly as a vehicle for showcasing a company’s own AV products in the best possible light.
AMTSO (the Anti-Malware Testing Standards Organization) has published its review analysis of the Endpoint Security Test that was published by NSS Labs on September 8, 2009. The Review Analysis, published on March 17, 2010, compared AMTSO’s Fundamental Principles of Testing to the NSS Labs report and found that it doesn’t comply with two of the nine AMTSO...
Kevin Townsend posted a blog in response to a piece by Mike Rothman at Securosis. Mike’s piece on “The Death of Product Reviews” makes some pretty good points about security product reviews in general. Kevin’s piece is more specific to anti-malware. He too makes some useful discussion points about the value or otherwise...
Larry Seltzer posted an interesting item yesterday. The article on "SW Tests Show Problems With AV Detections" is based on an "Analyst's Diary" entry called "On the way to better testing." Kaspersky did something rather interesting, though a little suspect: they created 20 perfectly innocent executable files, then created fake detections for ten of them.