The technology market has evolved over the last few decades to the point where a certain level of defects in software and hardware is not only accepted but expected, and the responsibility for using those technologies safely and compensating for those flaws has somehow landed on customers. Jen Easterly wants to change that. Now.
“We’ve normalized the fact that technology products are released to market with hundreds or thousands of defects when that would be unacceptable in any other industry. We’ve normalized the fact that security is relegated to IT people or the CISO in enterprises, but few have the ability to incentivize the changes that would help,” Easterly, the director of the Cybersecurity and Infrastructure Security Agency (CISA), said during a speech at Carnegie Mellon University Monday.
“This pattern of ignoring increasingly severe problems is a signal of the normalization of deviant behaviors. Collectively, we’ve become accustomed to a deviance of what we’d all think would be a norm for manufacturers, which is to create a safe product.”
The problem is essentially twofold. The first part is that, despite decades of research and warnings from software security experts, many technology providers do not have the practices and norms in place to develop products securely from the beginning. The premium is often on time-to-market and adding features, which can push security and reliability concerns much further down the priority list, especially if they’re seen as obstacles to meeting deadlines or ship dates. Part of this also comes from the lack of formal education many developers have in secure coding practices, an issue that has been a concern for many years.
“We need security designed in from the beginning, right out of the box, without added cost. Memory safe languages, secure coding practices, the attributes of secure coding by design will evolve over time,” Easterly said.
“Strong security has to be a standard feature of virtually every technology product. The fact that we’ve accepted a monthly patch Tuesday as normal is more evidence of our acceptance of operating at the accident boundary.”
Building security in from the earliest stages of the product development lifecycle is a simple idea, but it’s not easy to execute and requires considerable investment from the company in terms of both time and resources. Many large technology companies have formal secure software development life cycle (SDLC) programs that define processes for making security a core part of the product development process, but that’s not the norm for even mid-tier technology providers, let alone small companies. Finding developers and engineers with secure coding and development training is not a simple matter, nor is preventing the introduction of vulnerabilities into code in the first place.
That’s where the shift to developing in memory safe languages comes in: languages that can prevent common memory safety vulnerabilities such as buffer overflows and use-after-free bugs. Moving to languages such as Rust, Go, and others that are considered memory safe can make a big difference in the security of software.
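To make the idea concrete, here is a minimal sketch in Rust of what memory safety guarantees look like in practice. Operations that would be undefined behavior in C, such as an out-of-bounds read or a use-after-free, are either checked at runtime or rejected at compile time (the values and variable names are illustrative, not from Easterly’s remarks):

```rust
fn main() {
    let data = vec![10, 20, 30];

    // Out-of-bounds access cannot silently corrupt memory: plain indexing
    // is bounds-checked, and .get() returns an Option instead of a raw read.
    assert_eq!(data.get(1), Some(&20));
    assert_eq!(data.get(99), None); // in C, data[99] would be undefined behavior

    // Use-after-free is a compile-time error: once `data` is moved into
    // `owned`, the compiler rejects any further use of `data`.
    let owned = data;
    assert_eq!(owned.len(), 3);
    // println!("{:?}", data); // would not compile: use of moved value
}
```

The point is not that such bugs are impossible to write in Rust, but that the default path makes the unsafe versions a compile error or a controlled failure rather than silent memory corruption.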
“We need to make memory safe languages ubiquitous in colleges globally. Make a security course a graduation requirement, make it part of every class,” Easterly said.
The second part of the problem, which in many ways derives from the first, is that much of the responsibility for addressing security problems in software and hardware falls on customers and consumers. A security vulnerability that leads to a compromise of a system or a breach of an organization often is seen as the fault of the person or organization using the product, rather than that of the manufacturer.
“We find ourselves blaming the user for failures of technology. Manufacturers are using us, the users, as crash test dummies and the situation isn’t sustainable. We need a new model in which responsibility for technology safety is shared based upon an organization’s ability to bear the burden. A model that emphasizes collaboration as a prerequisite for self preservation and a recognition that a cyber threat to one organization is a safety threat to all organizations,” Easterly said.
“This would begin with tech products that put the safety of the customer first, rebalancing risk onto organizations like major tech manufacturers much more suited to managing cyber risks.”
Addressing these issues requires a long-term approach, not simply a new set of regulations or industry standards. Easterly said it will require the leaders of technology companies to focus explicitly on building safer products, to provide transparency into their development and manufacturing processes, and to accept that the burden of safety should not fall solely (or even mainly) on customers. Part of that transparency commitment should be the use of software bills of materials (SBOMs) to provide insight into what components and libraries a given product includes, she said.
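For readers unfamiliar with the format, an SBOM is typically a machine-readable inventory of a product’s components. A minimal sketch in the CycloneDX JSON format, one of the common SBOM standards, might look like the following (the component name and version are invented for illustration):

```json
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.4",
  "version": 1,
  "components": [
    {
      "type": "library",
      "name": "openssl",
      "version": "3.0.8",
      "purl": "pkg:generic/openssl@3.0.8"
    }
  ]
}
```

When a vulnerability is disclosed in a component like the one listed, a customer with the vendor’s SBOM can immediately check whether a product they run is affected, rather than waiting on the vendor to investigate.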
“We must applaud and encourage progress while recognizing the need to do more. This threat environment is only getting more and more complex,” Easterly said.