Are you still patching buffer overflows? It’s time to switch to a memory safe language (MSL) — and security agencies have released a report with plenty of advice to help make the switch.
Memory safety vulnerabilities are flaws caused by software mishandling memory — buffer overflows, use-after-free errors, and similar bugs — and they have long been a weakness in software, even as newer languages such as Rust, Python, and Java were designed to avoid them.
The massive Heartbleed vulnerability back in 2014 stemmed from such a flaw, as did the BadAlloc vulnerabilities disclosed in 2021.
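To illustrate the class of bug involved, here is a minimal Rust sketch (illustrative only, not taken from the report): a checked read of a buffer fails safely, whereas the equivalent unchecked read in a non-memory-safe language could silently return adjacent memory — the essence of an over-read bug like Heartbleed.

```rust
// Checked read: returns None instead of reading past the end of the
// buffer, which is the failure mode behind buffer over-read bugs.
fn safe_read(buf: &[u8], index: usize) -> Option<u8> {
    buf.get(index).copied()
}

fn main() {
    let payload = [0u8; 4]; // pretend this is a 4-byte request buffer

    // An in-bounds access works as expected.
    assert_eq!(safe_read(&payload, 0), Some(0));

    // An out-of-bounds index is rejected rather than leaking whatever
    // happens to sit next to the buffer in memory.
    assert_eq!(safe_read(&payload, 100), None);

    // Direct indexing is bounds-checked too: `payload[100]` would
    // panic at runtime instead of reading past the buffer.
    println!("out-of-bounds read rejected");
}
```

The point is not this particular helper but the language guarantee: every slice access is bounds-checked, so the bug class is closed off by default rather than by programmer discipline.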
Because of that, the US National Security Agency (NSA) and Cybersecurity and Infrastructure Security Agency (CISA) pulled together a report on the best way to deal with such flaws — calling for developers to switch to MSLs where possible.
“Achieving better memory safety demands language-level protections, library support, robust tooling, and developer training,” the report noted.
“While decades of experience with non-MSLs have shown that secure coding standards and analysis tools can mitigate many risks, they cannot fully eliminate memory safety vulnerabilities inherent to these languages as effectively as the safeguards used in MSL.”
Emilio Pinna, director at SecureFlag, has welcomed the publication of the guidance, but noted it’s been a long time coming. Moreover, enterprise willingness to make the shift hasn’t exactly helped the situation.
“It’s 2025, and yet, we’re still patching buffer overflows like it’s 1995. The newly released CISA and NSA report on Memory Safe Languages is a much-needed wake-up call (again) for the industry,” he said. “After decades of shipping software riddled with memory safety bugs, it’s clear: the problem isn’t new, we’re just stubborn.”
According to the report, two-thirds of CVEs for iOS 12 were caused by memory safety issues, and the researchers pointed to Google data that showed three-quarters of in-the-wild exploits used memory safety vulnerabilities.
Meanwhile, Microsoft data showed that 70% of CVEs related to its own solutions were attributed to memory safety in 2016, though that’s since fallen to 50%.
“That’s not a bug, that’s a systemic design flaw in how we write code,” added Pinna. “And yes, that means the C++ codebase your team inherited might just be a beautifully commented landmine. Young developers today are stuck fighting the vulnerabilities their coding ancestors thought they could ‘just be careful’ around.”
Time to make the switch to memory safe languages
CISA has long advised shifting to MSLs — languages built with secure-by-design mechanisms, such as automatic memory management and bounds checking, that prevent memory safety bugs.
However, the agencies admit in the report that it’s not always possible to do so, such as with legacy systems and working with existing codebases and third-party libraries.
“Reducing memory safety vulnerabilities requires understanding when MSLs are appropriate, knowing how to adopt them effectively, and recognizing where non-MSLs remain practical necessities,” the report adds.
Pinna added that rewriting systems wouldn’t be cheap or easy, but said there are costs associated with fighting zero-day flaws, too.
“The push toward Memory Safe Languages (MSLs) like Rust, Swift, and modern iterations of Go or even Java is a necessity now more than ever. You don’t get brownie points for reinventing the wheel with manual memory management anymore,” he said.
“As the CISA/NSA guidance makes clear, we’re at a crossroads: keep clinging to legacy languages with known safety issues, or modernize for the future.”
What can be done?
So what can be done on this front? First and foremost, companies should prioritize the adoption of MSLs in new projects where possible and consider incremental adoption for existing code.
Similarly, writing new components and features in MSLs for high-risk areas, such as network-facing services, is also advised. One key to making the shift is ensuring robust APIs that allow communication between the MSL and non-MSL components of a system.
This, researchers said, will ensure security while maintaining interoperability.
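As a sketch of what such a boundary can look like, the hypothetical Rust function below exposes a C-compatible ABI so a legacy C component can call it. The function name and signature are assumptions for illustration; the pattern is the point: raw pointers from the non-MSL side are validated once at the boundary, and everything beyond it uses safe, bounds-checked Rust types.

```rust
use std::slice;

// Hypothetical C-ABI entry point: the Rust side of an MSL/non-MSL
// boundary. The C caller passes a raw pointer and a length; the Rust
// side validates them at the boundary, then works only on safe slices.
#[no_mangle]
pub extern "C" fn buffer_checksum(data: *const u8, len: usize) -> u32 {
    if data.is_null() {
        return 0; // reject invalid input instead of dereferencing it
    }
    // SAFETY: the caller must guarantee `data` points to `len` readable
    // bytes — the contract this API boundary documents and contains.
    let bytes = unsafe { slice::from_raw_parts(data, len) };
    bytes.iter().map(|&b| b as u32).sum()
}

fn main() {
    // Calling through the same C-ABI signature from Rust to demonstrate.
    let buf = [1u8, 2, 3, 4];
    let sum = buffer_checksum(buf.as_ptr(), buf.len());
    assert_eq!(sum, 10);
    println!("checksum across the FFI boundary: {sum}");
}
```

Confining the single `unsafe` block to the boundary keeps the rest of the codebase auditable: the interop surface is small, explicit, and documented, which is what the report's call for robust inter-language APIs amounts to in practice.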
When developers must use languages that aren’t memory safe by default, they should make sure they understand the risks, with the report calling for better coverage of memory safety flaws and “secure coding” mitigations in training courses and university programs.
Thomas Richards, infrastructure security practice director at Black Duck, echoed Pinna’s comments, adding that the shift away from unsafe programming languages is an “excellent step toward reducing certain classes of software vulnerabilities”.
This won’t be a silver bullet to help prevent vulnerabilities in the future, however.
“This should not be taken as assuming all code written in MSLs is fully secure; they still have insecure coding practices that could create other vulnerabilities,” he warned.
“Developers should always be aware of secure coding guidelines for the specific language they are using to reduce the risk of vulnerabilities in their code and ensure uncompromised trust in their software.”
Beyond that, the report calls for companies to actively include MSL expertise in job requirements. “This action will signal demand for these skills that is felt not only throughout the job market, but also in studies and in future funding of academic and certification programs,” the report noted.