One of the Biggest Mysteries in Cybersecurity: Why Don’t We Demand This?

Roger Grimes | Aug 29, 2025

“The problem is much, much worse than most people acknowledge.”

One of the biggest enduring mysteries for me in cybersecurity is why most cybersecurity curricula don’t teach secure coding to programmers.

I have no real answers, only speculation.

Secure coding goes by many other names, including secure by design and the security development lifecycle, but it means the same thing: the humans involved in the development of software, services, and firmware are trained in how to avoid inserting common security vulnerabilities.

Common vulnerability types include buffer overflows, insecure input handling, hard-coded authentication credentials, directory traversal errors, cross-site scripting, and so on. The OWASP Top Ten is a great summary of some of the most common issues.
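To make one of these concrete, here is a minimal sketch, using Python's standard-library sqlite3 module, of how an injectable query differs from a parameterized one. The table, column, and input values are hypothetical, chosen only for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0)")

user_input = "' OR '1'='1"  # attacker-controlled string

# VULNERABLE: string concatenation lets the input rewrite the query.
# The injected OR '1'='1' clause matches every row, bypassing the name check.
rows = conn.execute(
    "SELECT name FROM users WHERE name = '" + user_input + "'"
).fetchall()
print(len(rows))  # 1 -- the injected clause matched alice's row

# SAFE: a parameterized query treats the input strictly as data.
rows = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()
print(len(rows))  # 0 -- no user is literally named "' OR '1'='1"
```

The fix is a one-character change in spirit (a `?` placeholder instead of concatenation), which is exactly why it is so frustrating that this class of bug keeps appearing decades after it was first documented.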

Some programming issues, like buffer overflows, can be largely eliminated by using memory-safe programming languages where possible and practical. Non-memory-safe languages are involved in up to 70% of commonly exploited vulnerabilities.

Secure coding means giving programmers and others in the development pipeline, whether new to the profession or veterans, education about those common vulnerabilities and how to avoid them. Like any security challenge, it takes a combination of education, policies, and tools. And as with most computer security challenges, when the tools aren’t protective enough, education becomes the weakest link.

I’ve tried for years to get universities and colleges to add secure coding instruction, either as a required part of their existing programming curriculum or as a separate, dedicated required class. It seems like a no-brainer. And yet almost no programming curriculum does it. There are a few, but not many.

And let me say that I don’t teach programming for a living. I’m looking from the outside in. 

But I often reach out to programming teachers and to people who develop and control programming curricula for a living. Surprisingly, most long-time programming instructors DO NOT agree with me. When I suggest that they should teach secure coding as part of their curriculum, most push back and disagree, offering only objections to what seems like common sense that is long overdue. You would think that the people charged with creating the world’s newest programmers would have a personal desire to teach secure coding, but you would be wrong.

Some do. Most do not. The professors and instructors who teach programming classes and agree that we should teach secure coding are like unicorns. They are few, often isolated, and most face pushback from their institutions and from the powers that be who control what they teach.

Most programming instructors and curriculum creators simply don’t know or care about the subject. Most are indifferent. But when I broach the subject and ask them to consider teaching secure coding, the concept seems so new and outlandish to them that most have a gut reaction that is simply adversarial. Their default reaction always befuddles me.

We are in the middle of a huge crisis that causes hundreds of billions of dollars in damage…occasionally kills people…and I can’t get the teachers of the “paramedics” in the field to agree to teach them to apply tourniquets.

The few teachers who might somewhat agree with me often reply that there simply isn’t enough room in the available instruction time to add secure coding. They say there is nothing they can push out or replace that is less important than teaching secure coding skills.

I don’t care what you have to push out or replace: you simply SHOULD, or rather MUST, replace something in your programming curriculum with secure coding, or extend your curriculum hours!!

It is outright negligence to the world that you don’t.

To me, it’s like teaching engineers how to build buildings, roads and bridges, but not requiring them to learn how to do it safely. Can you imagine?

Every week, I read about some new exploit being used by bad people to illegally compromise companies and organizations, usually as reported by the news media or in CISA’s Known Exploited Vulnerabilities Catalog. Usually, it’s many newly exploited vulnerabilities a week. Last year we had over 40,000 separate, publicly announced software and firmware vulnerabilities. That’s over 109 every day, day after day. Every year is a record year, and this year is on pace to beat last year.

According to Google Mandiant, exploited software and firmware vulnerabilities likely accounted for at least 33% of successful compromises. That figure is from a few years ago, and I don’t have newer data, but anecdotally, I think the figure is likely closer to 40% today.

So, software and firmware vulnerabilities account for 33% to 40% of all successful digital compromises, resulting in hundreds of billions of dollars in damages (some studies say trillions), and yet we don’t teach our programmers how to avoid even the simplest and most common programming security vulnerabilities??

Every month, I read about some popular program or device being exploited through super simple and common types of vulnerabilities, like directory traversal attacks, hard-coded credentials, buffer overflows, or SQL injection attacks. This is stuff we have known about for decades, and yet our programmers, despite the best processes and tools in the programming pipeline, still don’t know to avoid coding those mistakes.
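The directory traversal defense is just as simple as the SQL injection fix. Here is a minimal Python sketch of the standard containment check; the serving root and file names are hypothetical, and `Path.is_relative_to()` requires Python 3.9 or later:

```python
from pathlib import Path

# Hypothetical directory a web app is allowed to serve files from.
BASE_DIR = Path("/var/www/files").resolve()

def is_safe(requested: str) -> bool:
    """Reject any requested path that escapes BASE_DIR once '..' segments
    and symlinks are resolved, which is the classic traversal trick."""
    target = (BASE_DIR / requested).resolve()
    return target.is_relative_to(BASE_DIR)

print(is_safe("report.pdf"))        # True  -- stays inside the root
print(is_safe("../../etc/passwd"))  # False -- classic traversal attempt
```

The key detail is resolving the combined path *before* checking containment; a naive substring or prefix check on the raw string is exactly how many real-world traversal bugs slip through.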

Every time I see a new common vulnerability announced, like a directory traversal attack or a hard-coded credential vulnerability, my long-running joke is, “Relax, just know that hundreds of other programmers are putting those exact same vulnerabilities into the software and firmware they are coding today!”

It’s the truth. The problem is much, much worse than most people acknowledge.

There are hundreds of thousands of programs protecting critical infrastructure, our money, and our lives that contain those same glaring flaws, and we don’t know it. It’s just that no one has checked for or found them yet. Somewhere today, hundreds to thousands of programmers are making the same mistakes, and likely many malicious hackers are taking advantage of those same exploits, and we just don’t know about it.

We know that our programmers are putting hundreds to thousands of commonly known vulnerabilities into our software every day, and that it leads to billions of dollars in damage and even lost lives. Yet, as a society, we can’t manage the simplest of common-sense defenses: educating those programmers in secure coding.

Additionally, almost no programmer or vendor threat models their solution, meaning systematically enumerating how it could be attacked and what mitigations are in place. Almost no product delivered by a cybersecurity company that is supposedly going to stop hackers and their malware creations has been threat modeled. I know.

I get contacted by hundreds of cybersecurity vendors a year to review and promote their “fantastic, great!!” products. I ask, “Have you threat modeled it, and can you share your threat model?” I have always, for the entirety of my 37-year career, heard crickets in reply.

Almost no company does it. Not even computer security vendors.

And when I think about why most programming curricula don’t teach secure coding and threat modeling, I realize that part of the problem, and another huge unanswered question, is that employers don’t require the programmers they hire to have secure coding skills.

In fact, the only company in the world that I’m aware of that asks prospective programming hires to have secure coding skills is my employer, KnowBe4 (see the example KnowBe4 job description excerpt below).

Over the years, I’ve reached out to other companies, large and small, and asked their hiring managers to add ‘secure coding skills’ to their job descriptions. And despite a few of them saying they agreed, as far as I know, not a single one has ever done it.

The largest software and service companies in the world, like Microsoft and Google, have hundreds of vulnerabilities a year, sometimes resulting in hundreds of thousands of customer compromises, and they don’t require that their programmers have secure coding skills before they hire them. The latest hot companies, like OpenAI, Anthropic, Palantir, Tesla and Salesforce, don’t require that programmers they hire have secure coding skills before they get hired.

Not a single other company besides KnowBe4 asks that their programmers have secure coding skills. Why?

I’m truly bewildered. We have this huge cybersecurity problem that can be at least partially mitigated with secure coding education, and yet programming curricula don’t teach it, and employers (who are directly impacted by it) don’t require it. Why??

Why am I nearly the lone voice in the wilderness, crying out for these two things (i.e., teaching secure coding and requiring it when hiring)? I’ve tried for years to get it included in dozens of curricula. I’ve tried to have it included in national initiatives. I’ve failed every time, usually because the very people involved in the curriculum or initiative are the ones fighting me on it.

Why is it so, so hard to get very common-sense guidance included, guidance that would benefit every person in the world? It’s one of the single biggest questions I have in cybersecurity. And until we fix these two problems, I will remain befuddled.


Request A Demo: Security Awareness Training

New-school Security Awareness Training is critical to enabling you and your IT staff to connect with users and help them make the right security decisions all of the time. This isn't a one-and-done deal; continuous training and simulated phishing are both needed to mobilize users as your last line of defense. Request your one-on-one demo of KnowBe4's security awareness training and simulated phishing platform and see how easy it can be!

Request a Demo!

PS: Don't like to click on redirected buttons? Cut & Paste this link in your browser:

https://www.knowbe4.com/kmsat-security-awareness-training-demo


