Date: Tue, 07 Apr 2015 22:44:43 -0400 (EDT)
From: "David A. Wheeler" <dwheeler@...eeler.com>
To: "oss-security" <oss-security@...ts.openwall.com>
Subject: Re: Hanno Boeck found Heartbleed using afl + ASan!

On Tue, 7 Apr 2015 18:58:40 -0700, Michal Zalewski <lcamtuf@...edump.cx> wrote:
> I think that cases such as Coverity are more of an exception than a
> rule. Yup, they get credit for a steady trickle of issues (mostly
> through their self-service offering to developers, rather than any
> in-house analysis); but if you consider the size of the commercial and
> research "market" for static analysis and symbolic execution tools,
> it's not a common practice. Coverity and the singular case of
> Heartbleed aside, the mark left by others isn't as easy to find.

HP/Fortify and Coverity are two of the most common
source code weakness analyzers, and both *do* intentionally
support OSS developers by giving them gratis access to their tools.
There are probably others. I think it's a "common enough" practice.

True, they're primarily provided as self-service for use by developers
during development.  But that's how these tools are often used.
There's nothing wrong with self-help, and that doesn't reduce the tool's utility.
Providing access to a tool (that would otherwise be unavailable) is still a contribution.

Besides, I think if someone makes a tool that really helps find vulnerabilities, we
should give the toolmaker credit for that.  Perhaps you can agree ;-).

This typical usage also explains why the mark "isn't as easy to find".
The vulnerabilities tend to be detected and fixed *before* the official release.
This is exactly how it *should* be for OSS that many people depend on.
Typically an OSS project makes its source code available for review
(say, via a public repo managed by a version control system), then
people publicly examine it using tools, manual review, and localized tests,
and *then* the software is officially released (as a tarball or whatever).
Vulnerabilities that slip through that process may get more publicity,
but we don't *want* many to slip through.

> so to be clear, I'm not
> saying it provides no value. I'm just trying to be mindful of the fact
> that I wouldn't give a proprietary tool an easy pass in similar
> circumstances, so I don't want to give one to my own tool =)

Sure.  The only way to really know how effective this will be
is to apply this approach to more network protocols, and more thoroughly.
In practice that means it needs to become easier to use these tools on network protocol implementations.

Hanno's post is good evidence, though, that it's worth doing. There's already a lot of
evidence that afl finds bugs in file-format parsers, and that ASan helps surface
memory-safety vulnerabilities; it seems reasonable that the combination (and similar
setups) would also find vulnerabilities in network protocol implementations.
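
To make that concrete, here is a minimal sketch of what such a file-based
harness might look like. parse_record() below is a made-up toy parser with a
Heartbleed-style length-trusting bug, standing in for the real protocol code
you would actually call; the file names, directory names, and build/run
commands in the comment are illustrative only, and the exact flags depend on
your afl and ASan setup.

/*
 * Minimal sketch of an afl + ASan harness for a protocol parser.
 * parse_record() is a toy stand-in with a Heartbleed-style bug
 * (it trusts a length field taken from the input); in real use you
 * would call into the actual protocol implementation instead.
 *
 * Build (illustrative):  AFL_USE_ASAN=1 afl-gcc -g -o harness harness.c
 * Fuzz  (illustrative):  afl-fuzz -m none -i testcases/ -o findings/ -- ./harness @@
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Toy stand-in for the parser under test. */
static void parse_record(const unsigned char *buf, size_t len)
{
    if (len < 3)
        return;
    /* Length field read from attacker-controlled input. */
    size_t claimed = ((size_t)buf[1] << 8) | (size_t)buf[2];
    unsigned char out[65536];
    /* BUG: 'claimed' may exceed len - 3, so this reads past 'buf'.
     * With ASan the overread aborts immediately, which is exactly
     * the crash signal afl needs to record a finding. */
    memcpy(out, buf + 3, claimed);
}

int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s input-file\n", argv[0]);
        return 1;
    }
    FILE *f = fopen(argv[1], "rb");
    if (!f)
        return 1;

    /* Read the whole test case into memory. */
    fseek(f, 0, SEEK_END);
    long n = ftell(f);
    rewind(f);
    if (n < 0) {
        fclose(f);
        return 1;
    }
    unsigned char *buf = malloc(n ? (size_t)n : 1);
    if (!buf) {
        fclose(f);
        return 1;
    }
    if (fread(buf, 1, (size_t)n, f) != (size_t)n) {
        free(buf);
        fclose(f);
        return 1;
    }
    fclose(f);

    /* Hand the raw bytes to the parser. */
    parse_record(buf, (size_t)n);

    free(buf);
    return 0;
}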

--- David A. Wheeler
