Verizon has just done something rather brave. The company has released the "ICSA Labs Product Assurance Report" (http://www.icsalabs.com/sites/default/files/WP14117.20Yrs-ICSA%20Labs.pdf), which discusses the difficulties that most products have in meeting the requirements of ICSA Labs certification.

Why is it brave? Because those companies provide ICSA Labs with a healthy income, and might therefore be a little upset to have it suggested that some of them need to be nursed through the certification process. Well, I don't think security companies see it that way, though you might think that was the whole point, on a superficial reading of some of the news items inspired by the report.

John Leyden says in The Register that "Most security products not up to scratch. But most of all, you've let yourself down" (http://www.theregister.co.uk/2009/11/17/security_kit_testing_fail/).

Dan Raywood says in SC Magazine that "Over three quarters of security products fail an initial test and do not adequately perform" (http://www.scmagazineuk.com/over-three-quarters-of-security-products-fail-an-initial-test-and-do-not-adequately-perform/article/157883/).

Thomas Claburn says in InformationWeek that "Most Security Products Fail Initial Certification Tests. A study based on the testing of thousands of security products over 20 years finds that most require several rounds of testing before achieving certification" (http://www.informationweek.com/news/security/vulnerabilities/showArticle.jhtml?articleID=221800223&cid=alert_art_sec_d_m). And I think that's closer to the real process.

To look at the issue in terms of short-term failure would be to miss the point, though. There has been a certain amount of criticism of ICSA Labs, among others, in the past, because it gives companies with products under test latitude when it comes to re-testing and re-certification. (And that's where the bravery comes in...) That latitude runs contrary to the way that some testers work, stress-testing the product by "tricking" it into demonstrating its weaknesses rather than coaxing it into demonstrating its capability. [1] But that's precisely why it's a Good Thing.

ICSA Labs certification isn't just about declaring a product "good" or "bad": I'd argue that any detection-oriented test focused entirely on that distinction is probably not fully aware of the implementational difficulties and margin for error in even the best detection testing in the current threatscape. The value of the ICSA Labs certification process lies not just in the fact that it's tough (and it is: apparently, only 4% of tested products pass during the first testing cycle) but in the fact that it's a collaborative process that allows and encourages the vendor to work on the product until it passes, and then requires us to maintain those standards over time.

Read the report: it's about a lot more than product failure, and I can think of other testing and certification labs that could learn from it....

[1] "Antimalware Evaluation and Testing" (Harley and Lee) in "AVIEN Malware Defense Guide" (Ed. Harley, Syngress, 2007)

David Harley
Director of Malware Intelligence