“Microsoft reports that more than 50% of the problems the company uncovered during its ongoing security push are architectural in nature. Cigital data shows a 60/40 split in favor of architectural flaws.”
– Gary McGraw

Nearly 40% of the roughly 1,000 CWE (Common Weakness Enumeration) entries are architectural flaws. Architectural design is an often overlooked aspect of secure software development, so much so that the IEEE established the Center for Secure Design and released the document “Avoiding the Top 10 Software Security Design Flaws.”

Static analysis is not enough

Static analysis of software source code is necessary, but it is not sufficient. Architectural flaws are difficult to find with static analysis, and the complexity they add can obscure coding bugs that static analysis would otherwise have detected. Research by Rick Kazman at the Software Engineering Institute (SEI) shows that identifying design weaknesses is an effective way to reduce bug volume. By identifying structures in the design and codebase that are likely to harbor bugs, hidden dependencies, and structural design flaws, SEI found that architectural flaws and security bugs are highly correlated (a correlation of 0.9). This is because defective files seldom exist in isolation in large-scale software systems. They are usually architecturally connected, and their architectural structures exhibit significant design flaws that propagate bugs across many files.
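
To make the idea of a hidden, architecture-level dependency concrete, here is a small hypothetical C sketch (the file names, buffer sizes, and functions are invented for illustration). The two “files” never call each other, yet both encode the same assumption about a buffer size, so a change that looks safe in one introduces an overflow in the other:

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical sketch (names and sizes invented for illustration):
 * two "files" that never call each other, yet share a hidden
 * assumption about a buffer size, so a fix in one breaks the other. */

/* --- record.h: the owning module recently grew its buffer --- */
#define RECORD_MAX 256                 /* was 128 when net.c was written */

typedef struct {
    unsigned char payload[RECORD_MAX];
    size_t        len;                 /* always <= RECORD_MAX */
} record_t;

/* --- net.c: silently assumes the old size --- */
static unsigned char wire_buf[128];    /* hidden dependency on RECORD_MAX */

void send_record(const record_t *r)
{
    /* Overflows wire_buf whenever r->len > 128: the defect lives in the
     * coupling between the files, not in either file alone. */
    memcpy(wire_buf, r->payload, r->len);
    /* ... hand wire_buf to the transport layer ... */
}
```

Neither file is defective on its own; the defect lives in the coupling between them, which is exactly the kind of structure architectural analysis is meant to surface.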

Example: Heartbleed

In his essay “How to Prevent the next Heartbleed,” David Wheeler wrote that “OpenSSL uses unnecessarily complex structures, which makes it harder for both humans and machines to review.” He argues that there should be a continuous effort to simplify the code; otherwise, simply adding capabilities will slowly increase software complexity. The code should be refactored over time to keep it simple and clear even as new features are added. The goal should be code that is “obviously right,” as opposed to code that is so complicated that “I can’t see any problems.”

As we noted above, Heartbleed is a good example of static analysis not being enough: the techniques that should have found Heartbleed-like defects in OpenSSL were thwarted because the code was too complex. Security-sensitive code needs to be “as simple as possible.” Many security experts believe that using tools such as Lattix Architect to detect especially complicated structures, and then simplifying those structures, is likely to produce more secure software. Simplifying code is a mindset: it requires a continuous effort to simplify (refactor) the code. Otherwise, architectural erosion sets in as you add capabilities and slowly increase software complexity.
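
The Heartbleed defect itself (CVE-2014-0160) came down to trusting a length field supplied by the peer. The sketch below is a simplified illustration, not the actual OpenSSL code (the function names and parameters are invented): it shows the class of bug and how the “obviously right” version reads, with the fix reduced to a single explicit bounds check that a reviewer or a static analyzer can verify at a glance.

```c
#include <stdlib.h>
#include <string.h>

/* Simplified sketch of the Heartbleed class of defect (not the actual
 * OpenSSL code): echoing back the number of bytes the peer *claimed*
 * to send, without checking it against what was actually received. */
unsigned char *echo_heartbeat_buggy(const unsigned char *payload,
                                    size_t claimed_len,
                                    size_t received_len)
{
    unsigned char *reply = malloc(claimed_len);
    if (reply == NULL)
        return NULL;
    /* Bug: if claimed_len > received_len, this reads past the record
     * and leaks adjacent process memory back to the peer. */
    memcpy(reply, payload, claimed_len);
    return reply;
}

/* The "obviously right" version: one explicit bounds check that a
 * human reviewer or a static analyzer can verify at a glance. */
unsigned char *echo_heartbeat_fixed(const unsigned char *payload,
                                    size_t claimed_len,
                                    size_t received_len)
{
    if (claimed_len > received_len)    /* reject inconsistent lengths */
        return NULL;

    unsigned char *reply = malloc(claimed_len);
    if (reply == NULL)
        return NULL;
    memcpy(reply, payload, claimed_len);
    return reply;
}
```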

As Wheeler stated above, the goal should be code that is obviously right, as opposed to code that is so complicated that you can’t see any errors. I think Russ Cox said it best when talking about Heartbleed and complexity: “Try not to write clever code. Try to write well-organized code. Inevitably, you will write clever, poorly-organized code. If someone comes along asking questions about it, use it as a sign that perhaps the code is probably too clever or not well enough organized. Rewrite it to be simpler and easier to understand.”
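
A small hypothetical illustration of the contrast Cox describes (both functions are invented for this example): each rounds a length up to the next multiple of a block size, but only one of them is obviously right to a reviewer.

```c
#include <stddef.h>

/* "Clever": correct only when block is a power of two, and the reader
 * has to rediscover that precondition from the bit trick itself. */
size_t round_up_clever(size_t len, size_t block)
{
    return (len + block - 1) & ~(block - 1);
}

/* Well-organized: states its behavior plainly and works for any
 * non-zero block size; a reviewer can check it line by line. */
size_t round_up_clear(size_t len, size_t block)
{
    size_t remainder = len % block;
    if (remainder == 0)
        return len;
    return len + (block - remainder);
}
```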