Cybersecurity has a people problem.
No, that doesn’t mean cybersecurity professionals are rude to everyone else (we happen to be very nice people, I’ll have you know). The security industry is facing a very different kind of problem—not enough people are doing it.
It’s not as though security incidents are on the decline, either. Between January and August 2021 alone, there were over 95 significant cyber attacks that compromised multinational corporations and government organisations. And that doesn’t even begin to count the less widespread or high-profile incidents. Even if you’re not a billion-dollar Fortune 500 company, you’re still very much on the radar.
But there appears to be an odd disconnect between the kind of rampant cyber-abuse we’re enduring on a day-to-day basis, and the response from the major educational institutions that train the developers and engineers building the apps and networks we use. Or rather, the lack of one.
To put it more bluntly, we need to start asking, “Why don’t universities teach developers how to build software securely?”
To understand why this is an issue, we need to understand what the problem is in the first place. According to a 2021 survey by CSO Online, “57% of respondents said that their organisations have been impacted by the global cybersecurity skills shortage.”
This isn’t a new problem, either. Even back in 2016, “46 percent of organisations said they have a problematic shortage of cybersecurity skills.” The numbers alone suggest the trend is getting worse.
But the real problem, it would seem, isn’t being addressed here at all. As ever more organisations have come to depend on software and online connectivity, the demand for software developers has skyrocketed, along with a commensurate need for security professionals.
And there’s a lot of demand for that latter group — the median salary of software developers is around $100,000, while security engineers can expect as much as $200,000. Despite this, there are a reported 4 million unfilled cybersecurity jobs around the world.
The short answer? Universities don’t train computer science students in security.
Here’s the root of the issue: university is the last place people train in a controlled environment at a large scale. Once people graduate and enter the workforce, their opportunities to focus on training drop sharply.
University is the most efficient and scalable way to teach people new skills, and computer science is the single largest undergraduate major at most schools, which means a steady stream of software developers entering the job market each year.
But when these graduates don’t learn security, the result is an influx of people who know how to build software, but not how to build it securely.
“Think about how people learn to build airplanes,” says Jonathan Knudsen, senior security strategist at Synopsys. “Safety is part of every aspect of aviation — aerospace engineers don’t just learn how to make something fly, but how to make something that flies safely.”
“Software education is almost exactly the opposite: Students learn how to make things work, how to make them work faster, how to make them work more efficiently, but security is often neglected or ignored.”
Of the top 24 undergraduate programs in computer science in the US, only one university—UC San Diego—lists a security course as a core requirement. Everywhere else, students can obtain their degree without a single lesson in security.
This is where the application security industry faces its biggest challenge—but also its most significant opportunity—to make a dent in that dreaded security skill gap.
We want to show you how universities can completely transform the AppSec landscape, and it begins with these 5 steps.
Universities have the potential to be the biggest agents of change in the domain of application security, but they also tend to be the most susceptible to inertia and stagnation in a rapidly evolving software landscape.
If educational institutions are committed to offering the most technologically relevant and cutting-edge curricula possible, they have a responsibility to make security a part of their computer science programs at every level.
But we’ve all got to start somewhere. These are the 5 most important things universities can start doing to bring their software development programs up to speed.
Perhaps the biggest hurdle to wider adoption of security training is the notion that it’s less important than ‘core’ programming skills. In reality, security is as important a part of software development as any other. Historically, AppSec has suffered from being viewed as a secondary or ancillary part of the main development process.
Universities need to work on effecting a cultural change among students to view security not as a non-essential ‘nice-to-have’, but a necessary building block of good product engineering.
There’s a common misconception that a packed computer science program leaves no room for a course on secure programming.
“This statement assumes that a separate course is required,” says Matt Bishop, Professor of Computer Science at UC Davis. “But introductory and second programming classes teach the basics of secure programming: how to check for bad inputs, to do bounds checking, check return values, catch and handle exceptions, and so forth.”
The problem is when advanced classes assume students are writing secure code without ever checking it. If curricula pay attention to the mechanics of programming alongside the class material, students will learn to be more mindful of their code as they write it, regardless of what level they are.
When security becomes an organic part of the coding process, it will become second nature for the developer to “clean up after themselves.”
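The basics Bishop lists fit in a few lines of introductory coursework. Below is a minimal Python sketch of them applied to untrusted input; the function name and the age limits are illustrative assumptions, not part of any particular syllabus:

```python
# Secure-programming basics from an intro class: validate bad inputs,
# bounds-check values, and handle exceptions instead of assuming success.

def parse_age(raw: str) -> int:
    """Parse a user-supplied age string, rejecting bad input."""
    try:
        age = int(raw.strip())          # raises ValueError on non-numeric input
    except ValueError as exc:
        raise ValueError(f"not a number: {raw!r}") from exc
    if not 0 <= age <= 150:             # bounds check: don't trust the caller
        raise ValueError(f"age out of range: {age}")
    return age

print(parse_age("42"))                  # 42
```

The point is less the code itself than the habit: every value crossing a trust boundary gets checked the moment it arrives.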
Training that doesn’t have real-world application is rarely taken seriously (who even remembers most of what we learnt in school?). For training to have a long-term impact on the way students treat application security, the techniques students learn need to be aligned with the ones professionals actually use.
Students are far more likely to be interested in a class where they learn how to defend against the OWASP Top 10 vulnerabilities, than a class that explains the history of security vulnerabilities over the years.
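As a taste of what such a class covers, here is a small illustrative Python sketch of the OWASP Top 10’s injection category, using the standard library’s sqlite3 module; the table, data, and payload are invented for demonstration:

```python
import sqlite3

# Injection in miniature: splicing user input into SQL vs. binding it
# as a parameter, which the driver treats as a literal value.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"        # a classic injection payload

# Vulnerable: string formatting lets the payload rewrite the query.
vulnerable = conn.execute(
    f"SELECT role FROM users WHERE name = '{user_input}'"
).fetchall()

# Safe: the value is bound, so the payload matches nothing.
safe = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()

print(vulnerable)  # [('admin',)]  (the injection succeeded)
print(safe)        # []  (treated as a harmless literal string)
```

Seeing the same payload succeed against one query and fail against the other teaches more than any history of famous breaches.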
You know what’s more effective than showing students how to defend against container breakout attacks? Letting them try it for themselves. Application security is one of those skills that needs to be taught practically, through labs, cyber ranges, and exercises.
In fact, hands-on learning has been found to be 50% more effective than traditional methods, because the higher level of involvement activates the motor and sensory parts of the brain. Students measurably retain more information and apply it more effectively when they learn by doing rather than watching.
A large portion of vulnerabilities don’t require advanced knowledge of security to be fixed. Security researcher Jack Cable says, “Given that the majority of breaches can be readily prevented using industry best practices, a small amount of knowledge can go a long way.”
In fact, 95% of security vulnerabilities can be readily prevented. This points to a clear lack of even basic security knowledge among software developers, a void universities can fill without a significant overhaul of their existing curricula.
By implementing a security best practices class across computer science programs, universities can effectively create a baseline of application security knowledge that students can build on later in their careers. But more importantly, it ensures all software developers are equipped with at least the most basic toolkit to handle security incidents at their workplace.