HP’s acquisition of Fortify this week (which I am sure will make some people at Kleiner Perkins happy) has made me think some more about static analysis and the state of the technology.
I'm not a technical expert in static analysis, and I have only a superficial understanding of the computer science behind it and of the ongoing research in this area. So this is based more on a customer’s view of developments in static analysis over the past few years.
One of the static analysis tools that we rely on is Findbugs, an open source static analyzer for Java. I’ve said it before and I’ll say it again: if you are writing code in Java and not using Findbugs, you are a fool. It’s free, it’s easy to install and set up, and it works. However, the Findbugs detection engine was last updated almost a year ago (Aug 2009), and the updates leading up to that were mostly minor improvements to checkers. Bill Pugh, the University of Maryland professor who leads the development of Findbugs, admitted back in 2009 at JavaOne that the analysis engine “continues to improve, but only incrementally”. At that time, he said that the focus for future development work was to provide improved management and workflow capabilities and tools integration for large-scale software development using Findbugs. This was driven by work that Professor Pugh did while a visiting scientist at Google, a major user of the technology.
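For anyone who hasn’t tried it, here is a hypothetical snippet (the class and method names are made up for illustration) showing two of the classic bug patterns that Findbugs flags out of the box; the codes in the comments are Findbugs’ own detector names:

```java
// Hypothetical code (names made up) showing two classic bug patterns
// that Findbugs flags out of the box.
public class OrderLookup {

    // ES_COMPARING_STRINGS_WITH_EQ: == compares references, not contents,
    // so a logically equal string built at runtime won't match.
    public boolean isRush(String priority) {
        return priority == "RUSH"; // should be "RUSH".equals(priority)
    }

    // NP_NULL_ON_SOME_PATH: findOrder() can return null, and this path
    // dereferences the result without a check.
    public int quantity(String id) {
        Order order = findOrder(id); // may return null
        return order.getQuantity();  // possible NullPointerException
    }

    private Order findOrder(String id) {
        return null; // stub: a real lookup might come up empty
    }

    private static class Order {
        int getQuantity() {
            return 1;
        }
    }
}
```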
Anyways, these improvements look to be about a year late (they were originally supposed to be available in the fall of 2009), and there haven’t been any updates on product plans on the Findbugs mailing list for a while. But the direction (less work on the engine, more on tools) seems to be consistent with other static analysis suppliers. In the meantime, Findbugs continues to do the job – and hey, it’s free, who am I going to complain to even if I wanted to?
Coverity, another tool that we have used for a while, has reduced its investment in the core static analysis engine over the past couple of years. As I noted in a post last year, after receiving venture capital investment, Coverity went on a buying spree to fill out its product portfolio, and since then has focused on integrating these acquisitions and repackaging and re-branding everything. In the last major release (5.0) of its technology earlier this year, Coverity put a lot of effort into packaging, improving management and workflow capabilities and documentation – but there were only minor tweaks to the core static analysis engine. All nice and ready for the enterprise, all dressed up and ready for somebody to make them an offer. But with IBM and HP already matched up, who’s left to buy them? Symantec maybe?
Over the last year, Fortify has been filling out its product portfolio and working on integration with HP’s product suite leading up to the acquisition (much foreshadowing) and building an On Demand / Cloud / SaaS offering. Over the past year, Klocwork has integrated its static analysis tool with collaborative peer review and code refactoring tools (Klocwork Insight Pro) and improved its reporting – like some of the other tool providers, mapping problems against the CWE and so on. And Parasoft has been about providing an integrated toolset for a long time now – static analysis is only one small part of what they offer (if you can get it to work).
I haven’t done anything with Microsoft tools in a long time, so I don’t know what advances they are making with their static analysis tools like PREfast and FxCop, although they do have active research projects in static analysis and on debugging concurrent programs, all of which might be cool if we developed in .NET.
Now that Ounce Labs has become Rational Appscan something-or-other at IBM it will be difficult for anyone except IBM product specialists to keep up with its development, and most of IBM’s time and money for the next while will likely be spent on assimilating Ounce and getting its sales teams skilled up. IBM Research labs have been doing some advanced work on static analysis, mostly in the Java world, although it’s not clear how or if this work will make it into IBM’s product line, especially after the Ounce acquisition.
The most interesting development in the static analysis area recently is the O2 work that OWASP’s Dinis Cruz started while at Ounce Labs, to build more sophisticated security analysis, scripting and tracing capabilities on top of Ounce and other static (and dynamic) analysis engines. I haven’t had a chance to look at O2 in any depth yet. I am still not sure I entirely understand what it does or how it does it, and I am not sure it is intended to be used by mere mortals, but it looks like it is almost ready for prime time.
Even O2 isn’t about offering new advances in static analysis – at least from what I understand so far. It is about leveraging existing underlying static analysis technology and getting more out of it, building a technical security analyst’s workbench and tool platform.
All of this is a clear sign that the market has matured, and that the underlying technology has reached, or is reaching, its limits: the absence of innovative product announcements in static analysis, the investments made instead to fill out integrated product suites and to create better management tools for larger companies, and the emphasis on selling value-add consulting services.
Our experience has been that there are benefits to using static analysis tools, once you get through the initial adoption – which admittedly can be confusing and harder than it should be, and which I guess is one of the drivers for vendors to offer better management tools and reporting, to make it easier for the next guy. The tools that we use today continue to catch a handful of problems each week: a little sloppy coding here, a lack of defensive coding there, and a couple of simple, small mistakes. And every once in a while a real honest-to-goodness slap-yourself-in-the-head bug that made it through unit testing and maybe even code reviews. So as I have said before, the returns from using static analysis tools are real, but modest. And given where suppliers are spending their time and money now, it looks like we won’t be able to expect much more than modest returns in the future.
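To make that last category concrete, here is a made-up example of the kind of small mistake that sails through unit testing: String.trim() returns a new string rather than modifying the receiver, so the call below silently does nothing, and any test that happens to use pre-trimmed input will never notice. Findbugs reports this kind of thing as an ignored return value on an immutable object:

```java
// Made-up example: trim() returns a new string; the result is discarded.
// Tests that pass already-trimmed input will never catch this.
public class UserName {
    public String normalize(String raw) {
        raw.trim();               // return value ignored - does nothing
        return raw.toLowerCase(); // the untrimmed value flows onward
    }
}
```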
Look at Veracode's binary static analysis technology for some innovation in the static analysis space.
http://www.freddestin.com/blog/2010/08/veracode-innovation-on-multiple-levels-to-outcompete-hp-and-ibm.html
I would not say static analysis is dead. I would say that market adoption so far has not matched the investment put into it.
Fortify was too expensive and worked too differently from most other static analyzers. It is closest to Ounce.
There is plenty more to come, just not with AST-parser-style static analyzers. Instrumentation is key. However, no vendor has it right now. It may never come.
One security-focused static analyzer that used an AST parser had a scanning engine of about 30 LOC. Yes, LOC, not KLOC. (See the sketch after this comment for a sense of how small such a check can be.)
Do not dismiss Klocwork or Coverity. Understand the internals and how best to apply them to real-world code that needs an extra set of eyes.
McConnell studied all of this long ago. There is no need for us to defy his near-perfect analysis of the science of bug finding.
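For what it’s worth, here is a rough sketch of how small an AST-based check can be, using the open source JavaParser library. The rule itself (flag any call named exec(), a common command-injection sink) is made up for illustration, and a real engine would add type resolution and data flow analysis on top of it:

```java
import com.github.javaparser.StaticJavaParser;
import com.github.javaparser.ast.CompilationUnit;
import com.github.javaparser.ast.expr.MethodCallExpr;

import java.io.File;

// Toy AST-based scanner: flag any call named exec(), a common
// command-injection sink. Deliberately naive (no type resolution),
// but the whole "engine" is only a couple of dozen lines.
public class TinyScanner {
    public static void main(String[] args) throws Exception {
        CompilationUnit cu = StaticJavaParser.parse(new File(args[0]));
        for (MethodCallExpr call : cu.findAll(MethodCallExpr.class)) {
            if ("exec".equals(call.getNameAsString())) {
                int line = call.getBegin().map(p -> p.line).orElse(-1);
                System.out.println(args[0] + ":" + line
                        + " possible command-injection sink: " + call);
            }
        }
    }
}
```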
We're in the static analysis consulting business, partnered with Klocwork and Coverity, and what we're seeing in the industry is a fair amount of activity focused on making static analysis successful within the organization. I think you are right that we haven't seen any disruptive analysis techniques coming out recently, but there is a steady stream of new "checks" built on top of the existing engines that find more things. Still, a vendor focus on usability, integration, reporting, ecosystem, etc. helps make the tool a better solution to a business problem.
The flip side of what Andy (disclaimer: he’s my boss) said is that what you’re seeing isn’t so much static analysis reaching its limits as vendors focusing on limitations that are, from their customers’ point of view, more urgent than creating new checkers. If you don’t do a very good job on usability, integration, and reporting, then it may not matter how good your checkers are. I have seen some cool new checkers (details are NDA’d, sorry), but I agree with the vendors that they need to focus more on the usability of the infrastructure for the checkers they already have.