EICAR (formerly known as the European Institute for Computer Anti-virus Research, though that title hasn't been used for a good while) is best known for its annual conference and for the EICAR test file, a harmless file that most anti-virus programs detect and which therefore serves as an installation check: a quick way to confirm that a product is installed and active.
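(For anyone who hasn't used it, the check itself is trivial: the test file is just a 68-byte ASCII string, and writing it to disk should provoke a reaction from any properly installed product with on-access scanning enabled. Here's a minimal sketch in Python; the string is assembled from two halves so that this snippet itself is less likely to upset a scanner prematurely, and the file name eicar.com is merely the conventional choice, not a requirement.)

    # Minimal sketch: write the standard 68-byte EICAR test string to a file.
    # The string is split in two so this snippet is less likely to be flagged
    # before you intend it to be; "eicar.com" is just the conventional name.
    EICAR = (r"X5O!P%@AP[4\PZX54(P^)7CC)7}$"
             r"EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*")

    with open("eicar.com", "wb") as f:
        f.write(EICAR.encode("ascii"))

    # A product that is installed and active should block or quarantine the
    # write on access, or flag the file on the next on-demand scan.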

Sadly, I've been in this business long enough to remember when the EICAR test file was not detected by most AV programs: one or two security vendors had test files with similar functionality, but those were detected (as test files, and as pseudo-malware) only by the vendors' own products.

Even more sadly, I've seen a hair-raising quantity of tests that use the EICAR file inappropriately. Not to mention, of course, other 'simulators' such as the Rosenthal utilities (I can hear the groans from here...) and Spycar. Sarah Gordon wrote a major paper, "Are good virus simulators still a bad idea?", back in the 1990s (Network Security, Volume 1996, Issue 9, September 1996, pages 7-13), which drew on Luca Sambucci's Virus Simulator Test, published by ICARO (the Italian Computer Antivirus Research Organization) in 1994.

At the EICAR 2010 conference in Paris, an interesting student paper* was presented that used the EICAR file to make some points about the ways in which AV software works (or is presumed to work). Eddy Willems and I both mentioned it in articles for the June 2010 edition of Virus Bulletin. (Mine is available here, by the way.)

"Test Files and Product Evaluation: the Case for and against Malware Simulation" is a paper presented at the recent AVAR conference by Eddy Willems, Lysa Myers and myself: we were all at the EICAR conference mentioned above and figured that it was a good moment to combine our experience of testing, EICAR, AMTSO and the anti-malware industry to cover the developments that had taken place since Sarah's paper. Here's the abstract:

Any researcher with even the most modest public profile is used to being asked for virus samples. Traditionally, we’ve advocated the use of alternatives, especially the EICAR test file, to anyone who doesn’t have access to malware through mainstream, trusted channels, as a way of simulating malware behaviour without the attendant risks of genuinely malicious behaviour. But is the EICAR file really suitable for the range of scenarios for which it is prescribed?

Of course, it’s always been difficult for aspirant testers outside the mainstream circle of trust to tap into the sample repositories and exchange mechanisms that benefit the major testers. However, as the increasing influence of AMTSO on testing-related issues has driven a move away from static testing towards some form of dynamic testing, it’s become even more difficult for such testers to establish the connections in the industry that would make it easier for them to tap into the evolving sample and information exchanges that the big players are formulating, or the knowledge and experience that would enable them to explore more realistic alternative methodologies for trapping and validating samples.

Ironically, while dynamic testing offers, in the abstract, something closer to real-world testing, there’s been increased and unanticipated use of the EICAR file in testing contexts for which it was neither designed nor appropriate: as a classic survivor of the strictest form of signature detection, it exercises nothing beyond the most basic detection functionality.

This paper draws on our combined experience of AV research, testing, and EICAR directorship to look at the genesis and development of the EICAR test file, from the rationalization of product-specific installation test files, through virus/malware simulation software, through its re-specification in 2003, to its recent rebirth as a test tool. Most importantly, it discusses, with examples, the separation between its intended function as an installation check and the circumstances in which it is (and, more often, isn’t) feasible to use it as a limited test tool, primarily as a check on detection functionality.
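(To give a flavour of the "limited test tool" point: the 2003 re-specification, as I summarize it, says that a valid test file must begin with the 68-byte string, that anything following it must be whitespace, and that the whole file must be no more than 128 bytes long. Here's a rough Python sketch of that check; the function name and the exact set of permissible trailing bytes are my own illustrative assumptions rather than quotations from the spec.)

    # Rough conformity check against the 2003 re-specification: the file must
    # start with the 68-byte test string, any trailing bytes must be whitespace,
    # and the total length must not exceed 128 bytes. The trailing-byte set
    # below (space, tab, CR, LF, CTRL-Z) is an assumption for illustration.
    EICAR = (rb"X5O!P%@AP[4\PZX54(P^)7CC)7}$"
             rb"EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*")
    TRAILING = set(b" \t\r\n\x1a")

    def looks_like_eicar_test_file(path: str) -> bool:
        with open(path, "rb") as f:
            data = f.read(129)  # one byte more than the 128-byte limit
        return (len(data) <= 128
                and data.startswith(EICAR)
                and all(b in TRAILING for b in data[len(EICAR):]))

That exact-match check is essentially all a scanner needs in order to "detect" the file: there is no behaviour here to exercise, which is why it can serve as a check on detection plumbing but not as a proxy for real malware in a dynamic test.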

[Added 17th January 2017: here is a link to a related AMTSO guideline paper put together subsequently.]

David Harley
ESET Senior Research Fellow

*Jonathan Dechaux, Jean-Paul Fizaine, Romain Griveau and Kanza Jaafar, “New trends in Malware Sample-Independent AV Evaluation Techniques with Respect to Document Malware”, 19th EICAR Annual Conference Proceedings, 2010.