Using Automated Static Analysis Tools for Code Reviews

August 20, 2013
Software Assurance: Post by Drew Buttner

In my first post on Software Assurance (SwA), I mentioned how our secure code reviews leverage both manual code inspection and automated analysis tools to highlight potential security problems.

In this post, I'll talk about the pluses and minuses of using automated static analysis tools for secure code reviews and discuss how those tradeoffs have shaped MITRE's approach to incorporating these types of tools.

What Are Automated Static Analysis Tools?

These tools scan source code to identify weaknesses that can lead to vulnerabilities. The weaknesses are potential security problems, such as those found in the CWE/SANS Top 25 Most Dangerous Software Errors. You can find a sampling of available tools in NIST's collection of Source Code Security Analyzers.

Automated Tools Are Not a Panacea

When deciding whether to include an automated tool in your SwA arsenal, it's important to realize that no tool provides a complete and comprehensive analysis. A tool may find certain types of flaws but completely miss others. Even so, automated tools can add significant value; their strengths, limitations, and costs should be weighed against the criticality of the code being reviewed and compared with manual code reviews.

The following sections break down what automated tools provide (strengths), what to consider when selecting a tool (limitations), and how cost bears not only on selecting a tool but also on the benefit it ultimately delivers.

Strengths

  • Volume: Automated tools can examine an entire large code base that would not be practical to review manually.
  • Speed: An entire application can be reviewed much faster than with a manual code review.

Limitations

  • Breadth: Automated tools work well for only certain categories of software flaws.
  • Coverage: Even within those categories, individual tools tend to specialize, so no single tool detects every flaw.
  • False Positives: Tools typically generate a fair number of suspected flaws that, when carefully reviewed by engineers, turn out not to be actual flaws.

Costs and Hidden Costs

  • Price: The better tools cost $50K-$100K, with annual maintenance fees of 10-20% of the purchase price.
  • Training: Engineers must be trained to interpret results; typically about 1 staff-month per engineer is needed to become proficient.
  • Time: Processing the results of a scan takes time, with roughly 50% of that time spent reviewing findings that turn out to be false positives.

Strengths

A manual review of an application's source code can take days if not weeks, depending on the size of the code base. At MITRE, we have found that a 10,000-line application takes no less than 3 days (leveraging 2 engineers) to review properly by hand. An automated static analysis tool can scan that same application in just hours. For larger applications, our manual reviews are forced to target only high-risk functions, meaning much of the application is never looked at. A tool, however, will scan every line of code, and do so in a fraction of the time.

In addition to their volume and speed advantages, automated tools generally work well at detecting certain categories of flaws, including:

  • Memory Corruption and Buffer Overflow type weaknesses
  • SQL Injection (see the sketch after this list)
  • Cross-Site Scripting
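
SQL injection is a good example of why these categories suit automation: the dangerous construct is visible in the code itself as untrusted input concatenated into a query string. The following is a minimal, hypothetical sketch (the table, column, and function names are invented for illustration); a typical analyzer flags the first version as CWE-89 and is satisfied by the parameterized second version.

    import sqlite3

    def find_user_unsafe(conn, username):
        # Flawed: user input is concatenated directly into the SQL text.
        # Static analyzers reliably flag this pattern (CWE-89); an input
        # such as "x' OR '1'='1" changes the meaning of the query.
        query = "SELECT id FROM users WHERE name = '" + username + "'"
        return conn.execute(query).fetchall()

    def find_user_safe(conn, username):
        # Fixed: a parameterized query keeps the data out of the SQL text,
        # so neither the analyzer nor the database sees injectable code.
        return conn.execute(
            "SELECT id FROM users WHERE name = ?", (username,)
        ).fetchall()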

Limitations

Although a tool can scan every line of code in a fraction of the time a human needs, it has limitations that often go unrecognized. Current automated analysis tools don't find every issue; rather, they specialize in certain categories or types of flaws. Flaws related to application-specific business logic are beyond the reach of automated tools, since the tools don't understand how the logic is supposed to work. For example, consider a proprietary authentication routine that is supposed to allow access only to specific groups. Without knowing which groups are intended to have access, an automated tool cannot determine whether the code has been implemented correctly.
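
To make that limitation concrete, here is a minimal, hypothetical sketch of such a routine (the group names and policy are invented for illustration). The code is well formed and contains none of the classic flaw patterns, so a tool finds nothing to flag; only a reviewer who knows the intended policy can say whether the allowed set is correct.

    # Hypothetical access-control policy. Whether these are the right
    # groups is business knowledge a static analysis tool cannot infer.
    ALLOWED_GROUPS = {"admins", "auditors"}

    def can_access(user_groups):
        # An analyzer can confirm this code is sound, but it cannot tell
        # whether "auditors" belongs in ALLOWED_GROUPS; that depends on
        # the application's intended behavior, not on the code itself.
        return not ALLOWED_GROUPS.isdisjoint(user_groups)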

Complex logic spanning multiple functions and source code files also challenges tools, because it exponentially increases the number of potential paths to be checked. The complexity also increases the demand on system resources, and tools often need to cut their analysis short to avoid running out of memory.
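
A toy sketch (invented for illustration) shows why the paths multiply: each data-dependent branch doubles the number of distinct paths an analyzer must consider, so a chain of n such branches produces 2^n paths.

    def step(value, flag):
        # One data-dependent branch: two paths through this function.
        return value.strip() if flag else value

    def pipeline(value, flags):
        # Chaining the branch n times yields 2**n distinct paths through
        # the program. An analyzer that tracks data flow along every path
        # faces exponential growth, which is why tools approximate or cut
        # deep analyses short to stay within memory limits.
        for flag in flags:
            value = step(value, flag)
        return value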

Costs and Hidden Costs

It is widely understood that good automated static analysis tools carry a relatively high purchase price. A common misconception, however, is that once purchased they are "free" to use. In reality, engineers must spend time learning the tool and then making sense of its results. At MITRE, we have found that it often takes an engineer as long to properly address all of a tool's findings (and produce a usable report for developers) as it takes to do a manual review of the application.

MITRE's Tool Approach

To help address the challenge of finding a single tool to effectively provide a wide range of coverage across the entire software flaw space, MITRE's approach is to use two complementary tools geared toward detecting different types of flaws.