Stephen Walli works on the Helion converged cloud team at Hewlett-Packard, where he does business strategy and evangelism.
Jim Zemlin has always been quotable. His keynote at this year’s Linux Foundation Collaboration Summit provided a great summary (as always) of the growth of the Linux ecosystem, but it also focused on the serious problems in the security of the Internet in an era when major breaches have their own branding and logos. [Think Heartbleed and Shellshock.] He ran through some scary facts:
- OpenSSL secures most of the Internet, yet the small team at the OpenSSL Foundation (including the “two Steves”, Steve Henson and Steve Marquess) was only pulling in about US $2,000 in donations per year.
- Theo de Raadt supports OpenSSH part-time.
- Harlan Stenn “runs the clocks on the Internet”, i.e., he’s responsible for ntpd, and until recently was earning about US $25K/year.
- Werner Koch maintains GnuPG, which secures a lot of email and provides the confirmation that a software package is what it claims to be (a minimal sketch of that check follows this list). According to a recent interview, he was going broke.
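For readers who haven’t used GnuPG, here is a minimal sketch of what “confirming a package is what it says it is” looks like in practice: verifying a detached signature over a release tarball against the maintainer’s published key. The file names are hypothetical, and a production tool would use a library such as GPGME rather than building shell commands; this only shows the shape of the check.

```c
#include <stdio.h>
#include <stdlib.h>

/* Sketch: verify a detached GnuPG signature over a tarball.
 * gpg exits with status 0 only when the signature is valid for the
 * file and was made by a key present in the local keyring. */
static int package_is_authentic(const char *sig_file, const char *tarball)
{
    char cmd[512];
    snprintf(cmd, sizeof cmd, "gpg --verify %s %s", sig_file, tarball);
    /* system() returns 0 when gpg itself exits 0, i.e. a good signature.
     * A real tool would call GPGME instead of composing shell commands
     * from file names, which is unsafe with untrusted input. */
    return system(cmd) == 0;
}

int main(void)
{
    /* Hypothetical file names, standing in for a real release. */
    if (package_is_authentic("foo-1.0.tar.gz.sig", "foo-1.0.tar.gz"))
        puts("Good signature: the package is what it says it is.");
    else
        puts("BAD or missing signature: do not trust this package.");
    return 0;
}
```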
Before readers unversed in open source get concerned about the security of open source software, let us remember this is a software problem, not an open source problem. Closed proprietary products have their share of exploits too. But with open source licensed software, the broad community can do something about it.
Review creates secure code
If Linus’s Law is true, “Given enough eyeballs, all bugs are shallow,” it is perplexing that such security problems persist. Jim suggested, as he gave the above examples, that “there just aren’t enough eyes.” I’d offer a corollary: vibrant projects live a culture of review before code gets committed, because the developers have perspective and context that can never be built into a static analysis tool. Tools can find obvious portability breakage or some security-related issues (e.g. buffer overflow problems), that is, issues that are essentially syntactic, but a human can understand the semantic content of the code in front of them.
There’s even research to back this up:
- “Code review is more effective than test because in review the faults in the code are found directly, while testing uncovers only the symptoms of problems, requiring debugging to find the direct cause. The seriousness of the wrong behavior by the system does not have a relation to the type of mistake, since even simple mistakes can cause complex behaviors.”
– Walcélio Melo, Forrest Shull, Guilherme Horta Travassos, “Software Review Guidelines”, Technical Report ES 556/01, COPPE/UFRJ, August 2001.
- “Software inspections are indeed cost-effective: They find about 70% of the recorded defects, take 6% to 9% of the development effort, and yield an estimated saving of 21% to 34%. i.e., finding and correcting defects before testing pays off, so indeed ‘quality is free’.” And, from the conclusion: “Individual inspection reading and individual Code Reviews are the most cost-effective techniques to detect defects, while System Tests are the least cost-effective.”
– Reidar Conradi, Amarjit Singh Marjara, Øivind Hantho, Torbjørn Frotveit, and Børge Skåtevik, “A Study of Inspections and Testing at Ericsson, NO”, a rewritten edition of the paper presented at PROFES’99 (Oulu, Finland, June 1999).
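To make the tools-versus-reviewers distinction concrete, here is a minimal C sketch, loosely modeled on the Heartbleed pattern but in no way the actual OpenSSL code. It compiles cleanly and contains nothing a purely syntactic check is guaranteed to flag; a reviewer who knows the protocol spots the flaw at once.

```c
#include <stdint.h>
#include <string.h>

/* Hypothetical echo handler. The assumed packet layout is a two-byte
 * big-endian payload length followed by the payload itself. */
void echo_payload(const uint8_t *pkt, size_t pkt_len, uint8_t *out)
{
    /* Attacker-controlled: the payload length as claimed on the wire. */
    uint16_t claimed_len = (uint16_t)((pkt[0] << 8) | pkt[1]);

    /* BUG: claimed_len is never checked against the bytes actually
     * received (pkt_len - 2). A peer can claim a 64KB payload on a
     * 3-byte packet and get adjacent heap memory echoed back.
     * The fix is a single protocol-aware bounds check:
     *     if (pkt_len < 2 || claimed_len > pkt_len - 2) return;
     */
    memcpy(out, pkt + 2, claimed_len);
}
```

The missing line is trivial to write; seeing that it is missing requires knowing what the length field means and where it comes from, which is exactly the context a human reviewer carries and a generic tool does not.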
In the open source community bugs are found quickly, but this happens after the fact. Vibrant projects live a culture of review before code gets committed, so it’s much more likely they find the bugs before they ship. Many of the key projects that have had breaches are taken for granted and don’t necessarily have the vibrancy of a Linux or an OpenStack. These projects have simply become part of the fabric of the Internet.
Invest in Core Infrastructure
The Linux Foundation is stepping up to tackle these problems with the Core Infrastructure Initiative (CII). The Foundation is in an excellent position at this point in time to be the centre of gravity for such industry action. Jim talked about the Initiative in his keynote. A broad collection of players have banded together to provide a three-pronged approach to the problem of securing the open source software that secures the Internet.
1. Invest in the people that are the most knowledgeable about key projects.
2. Provide deep audits for the key projects to work to prevent the next security breach.
3. Develop and disseminate best practices and guidelines for developing and deploying secure software.
Jim’s big concerns are how not to perturb the market economics that drive open source software ecosystems and how to avoid creating an open source welfare state. He rightly pointed to the I-35W bridge collapse as an example of failing infrastructure that should have been fixed before a key transcontinental artery failed. I think that’s the right idea economically.
Governments invest in infrastructure to enable economic growth. Support and investment for rights of way for railroads, deep port infrastructure, or interstate highway systems creates the transportation infrastructure that enables economic growth and free markets for all. All the projects Jim discussed are fundamental Internet infrastructure. If a project under consideration implements or secures an underlying universal communications protocol or cryptographic algorithm then it is probably a good candidate for CII investigation.
Likewise, not being owned by a corporation seems to be a necessary attribute. A database, even one as broadly deployed as MySQL, shouldn’t be a candidate. A fabulous engine for application deployment (node.js) is owned by a company. I’m pretty sure the investors would love the vendor community to invest in securing node.js “because it’s so hugely important going forward at enabling blah blah marketing blah.” Sorry, if a company controls the copyright, then you’re off the list. Private roads didn’t get government-funded bridges.
The Linux Foundation is obviously not a government, but it is a well-funded, well-organized industry non-profit. As such it provides an excellent place for the vendors that best benefit from the Internet infrastructure to collectively support the infrastructure on which they all depend.
The Core Infrastructure Initiative efforts are fundamentally important. A complete list of participants to date exists on the Linux Foundation site. Jim’s excellent keynote is up on YouTube, the slides will hopefully be up soon, and Jim’s blog post introducing CII is published on the Linux Foundation blogs. If your company isn’t supporting the initiative, it is well worth exploring how best to participate.
This article is republished with permission from Stephen Walli’s blog.