With cyberthreats on the rise, cybersecurity professionals are, unsurprisingly, a hot commodity. According to a recent study by Cybersecurity Ventures, open cybersecurity positions will have grown by 350% between 2013 and 2021, and the resulting talent crunch is expected to leave 3.5 million job openings in the industry unfilled by 2021.
With that in mind, one of our articles to mark this year’s Antimalware Day features insights from several ESET security researchers. We asked them a series of questions to learn how they built their expertise and to gather their thoughts about the usefulness of formal education versus self-study for becoming a security practitioner.
Learn all by yourself?
While more and more colleges and universities worldwide offer degree programs in computer security, far from all academic institutions have launched such programs. Indeed, many experts in the field are self-taught and/or have acquired their skills through various non-academic courses and certifications.
ESET Distinguished Researcher Aryeh Goretsky, who embarked on a career in IT security in the late 1980s, notes that back then there weren’t actually any courses or certifications specifically focused on computer security.
“Computer security was taught, but it was largely in terms of models for access control, and I think tended to focus more on the concept of securing multiple-user computer systems and users’ access to them being seen as more of an atomic model than as bits and pieces of a larger, more globally-interconnected system. So, the people who were interested in the concept of cybersecurity, of how disparate computers and networks might behave towards each other, kind of had to self-teach. Some of that might come from reading standard computer science and engineering and reference tomes, and learning about computer and network operations, but some of that knowledge came from… shall we say, unofficial and very hands-on experimentation,” he explains.
This is echoed by Marc-Etienne M.Léveillé, a malware researcher at ESET’s lab in Canada who studied software development and computer engineering. “The things I have learned in college or university aren’t directly relevant for my position as a security researcher. I had to learn about many aspects of security on my own,” he says.
This is no doubt also the case with many other experts. These days there is a multitude of online learning resources, including countless massive open online courses (MOOCs) for people at various levels of skill and experience. Also, social networks, notably Twitter, and many other online services, including YouTube, offer great opportunities for people keen to exchange knowledge and experience, ultimately enabling them to learn from one another.
"It is true that the technology and security community is growing and many people are happy to share their knowledge, which allows newcomers to get support from established professionals," says ESET Brazil researcher Daniel Cunha Barbosa. “While self-learning is a possible path and it is how many experts in the industry received their training, it is not the only option,” he adds.
Indeed, while security professionals need to continue to learn on their own and sharpen their skills almost daily, many will agree that there’s an undeniable value in academic training.
“If I had to do it again, I’d still choose to go through college and university. Both gave me the opportunity to meet people and participate in extra-curricular activities such as competitions and security conferences that I enjoyed so much. Some schools also offer internships, which also helps with getting started in the field,” says Léveillé.
Formal cybersecurity programs
As online threats have increased dramatically, says Goretsky, so has the desire to standardize how those who would practice cybersecurity are educated.
“I think that overall it is a positive thing that the wide range of cybersecurity education at all levels – not just university – is out there, but I also worry about its quality. We need theorists as much as we need operationalists, and we need those people to be well versed in the building blocks of very complex and complicated systems. A lot of that can be learned, but there’s still a considerable need for being autodidacts who can take what they are learning and build complex structures and ideas with that learning. Do the postgraduate courses and certifications allow people to expand on what they learned in university, or was what they were taught too limited or brittle a framework for them to provide a solid foundation for cybersecurity concepts? I don’t know,” he adds.
Cunha Barbosa adds that “the fact that there are specialization and postgraduate programs on top of degrees is itself a positive thing, since having a degree that gives the future expert broader educational foundations will allow them to learn about aspects of technology that go beyond security and will ultimately help them become better prepared for the challenges”.
In Canada, says Léveillé, colleges and universities are now offering an increasing number of information security programs. “There are now degrees with specialization in computer security. Before, the only option was to do software development or computer networking. Cybersecurity experts need both, with a different approach,” he said, before adding: “There is still a growing need in our industry that we must fill. With the effort from the educational programs, perhaps we will see a more stable situation in a few years.”
A lack of cybersecurity career awareness
Young people often have a hard time deciding what career path to follow, and many finish high school without a clear idea of what they want to do next. Cybersecurity is often not on their radar in the first place, simply because many of them lack information about this – arguably less traditional – career path. Perhaps more importantly, their assumptions about what a career in cybersecurity actually involves may be very inaccurate.
“The trope or image of the disaffected youth being a hacker and attacking computers (or ‘conducting offensive cyber-operations’) and gaining fame and fortune or ‘full-spectrum information dominance’ is appealing to youth but what’s lacking is a realization that there is much, much more to cybersecurity as well,” says Goretsky.
That said, there is a sense that the general interest in pursuing a career in computer security has been trending higher in recent years, which may ultimately also help remove some of the common misconceptions.
“I see a lot more students interested in computer security than when I was a student myself. Before, it was something you’d have to be interested in on your own. Now there are enterprises and schools that encourage more students to enter the field. I think there’s a growing demand from the industry, perhaps due to the increase in attacks,” says Léveillé.
Turning briefly to the importance of incorporating security from the outset of software development, we asked Léveillé whether he thinks that college and university curricula give students enough opportunities to learn security-by-design principles.
“I think that, nowadays, secure development is pretty well taught. However, the problem is that developers need the incentive to apply what they learn. Insecure code should be caught during code review and blocked from being included in the project. If developers see that their code is repeatedly rejected for security reasons, they will pay extra attention and will develop the right ‘reflexes’,” he said.
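Léveillé's point about catching insecure code at review time can be illustrated with a classic case. The sketch below is purely hypothetical (the function names, table schema, and data are invented for illustration, not taken from the interview): it shows, in Python with SQLite, the kind of injectable query a security-aware reviewer should reject, next to the parameterized version that would pass review.

```python
import sqlite3

def find_user_insecure(conn, username):
    # FLAW a reviewer should flag: user input is concatenated into the SQL
    # string, so input like "x' OR '1'='1" changes the query's meaning
    # (SQL injection) and returns every row.
    query = "SELECT id FROM users WHERE name = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_secure(conn, username):
    # FIX: a parameterized query keeps data separate from SQL code, so
    # malicious input is treated as a plain value and matches nothing.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (username,)
    ).fetchall()

# Minimal in-memory database to demonstrate the difference.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice'), (2, 'bob')")

malicious = "x' OR '1'='1"
print(find_user_insecure(conn, malicious))  # leaks both rows
print(find_user_secure(conn, malicious))    # returns nothing
```

In review terms, the first function is exactly the kind of change that should be blocked until rewritten; once a developer has had such code rejected a few times, reaching for the parameterized form becomes the reflex Léveillé describes.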
Conclusion
Given the growing range and constant evolution of threats, there’s clearly an urgent need to train and educate the next generation of IT security professionals and help plug the industry’s talent gap. Options and opportunities abound; at the end of the day, the future is bright for people looking to build a career in cybersecurity.