Date: Mon, 29 Apr 2013 17:50:37 -0500
From: Rob Landley <>
Subject: Re: Licensing.

On 04/29/2013 03:47:31 PM, Rich Felker wrote:
> On Sun, Apr 28, 2013 at 04:34:38PM -0500, Rob Landley wrote:
> > On 04/26/2013 01:11:07 AM, Igmar Palsenberg wrote:
> > >
> > >>>> incompatible licenses.  The openssl library can't be used with
> > >a GNU
> > >>>> program unless there's a waiver for it because one of the
> > >clauses in the
> > >>>> openssl license goes against the GNU license principles.  The
> > >gnutls
> > >>> Not _used_ but _distributed_. The GPL does not restrict use
> > ....
> > >> What about explicitly loading the library at run-time using
> > >uselib(2) in a plug-in like fashion?  Is that also considered
> > >problematic from a GNU perspective?
> > >
> > >I consider this a grey area. I personally don't think it is
> > >considered a problem, but there are a number of interesting
> > >(theoretical) scenarios:
> >
> > Um, back up:
> >
> > You know how cryptographers point and laugh at non-cryptographers
> > trying to figure out whether something's breakable?
> >
> > You know how professional security auditors find most programmers'
> > code appallingly insecure, and the best of us have to put out
> > regular updates to fix exploits that we didn't personally find?
> >
> > Now imagine what lawyers think of programmers' legal theories.
> Your analogy would hold more water if the majority of lawyers doing
> software licenses had any clue about the law, but they don't. Nearly
> everything ever written in a proprietary software license has no basis
> in law; legally, such documents are not even licenses (a license gives
> you permission to do something that would otherwise not be permitted
> under the law; it doesn't take away your rights) but one-sided
> contracts that are never signed and that offer the victim nothing of
> value in exchange for surrendering their rights.

Sturgeon's Law applies everywhere, yes. Lots of self-proclaimed domain  
experts aren't. (And an MCSE doesn't qualify you to write secure posix  
code.)

> > Programming-side example: the /tmp dir has the sticky bit set so
> > other users running inotify to spot new files being created don't
> > immediately delete them and replace them with a symlink, so that
> > your mknod/open pair is now accessing the wrong file. What your
> > code was doing worked fine, but the context it was running in made
> > it insecure.
> I don't follow. Unless you do idiotic things (like omit O_EXCL) there
> is nothing unsafe about properly-configured public temp directories.

Agreed, but most programmers had somebody else set it up for them and  
often don't know why it was done that way. (And then there's the  
"predictable names" fun, where "they can't stomp you" doesn't mean "you  
can't be tricked into stomping them". Sure it's manageable, but the  
issue does exist.)
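To make the O_EXCL point concrete, here's a minimal C sketch of creating a private file in a shared, sticky-bit /tmp; the helper name and the "/tmp/demo." prefix are purely illustrative, and mkstemp() is just the stock libc wrapper around the O_CREAT|O_EXCL open Rich is talking about:

```c
/* Sketch: safe temp-file creation in a world-writable /tmp.
 * mkstemp() opens with O_CREAT|O_EXCL|O_RDWR (mode 0600) and
 * randomizes the XXXXXX suffix, so a symlink pre-planted at a
 * guessed name makes the open fail with EEXIST instead of being
 * silently followed. */
#include <assert.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

/* Writes the created name into 'path' (needs room for the template).
 * Returns the open fd, or -1 with errno set on failure. */
int make_private_tmpfile(char *path, size_t size)
{
    if (size < sizeof "/tmp/demo.XXXXXX")
        return -1;
    strcpy(path, "/tmp/demo.XXXXXX");  /* mkstemp fills in the Xs */
    return mkstemp(path);
}
```

Because open() with O_EXCL refuses to follow an existing symlink at the target name, the attack degrades from "your data lands in their file" to "your open fails and you retry with another random name".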

The reason for that digression is that it's a fairly simple, easily  
explained example of "you don't just have to understand what _you_ did,  
you have to understand the entire environment you did it in".

> > Now imagine telling a lawyer that your license usage is
> > unexploitable in all jurisdictions, and you know this because you
> > read the license text and you're sure you're using it ok. (The best
> > a lawyer or security professional can EVER say is "I can't spot
> > where you screwed up".)
> Even someone with no security skills at all can tell you there is no
> vulnerability *in your code* if your code is:
> int main() {}
> Similarly, a non-lawyer can tell you there's no vulnerability in a
> "0-clause" BSD license.

I agree in principle, sure.

(My friend Lamont "Piggy" Yaroll used an old unix system, beta of NeXT  
I think, where /bin/true was a 0 byte file with the executable bit set.  
That got interpreted as a shell script, does nothing, returns true.  
Clever, eh? Until he used true in his /etc/profile, which loaded  
/bin/sh, which parsed /etc/profile, which loaded /bin/sh... filled up  
all memory, brought down the system. He filed a bug. Their bug  
reporting system tried to calculate defect density in the file and  
threw a division by zero exception. I mean, I break stuff all the time,  
but I bow down to the master.)
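The zero-byte /bin/true trick still reproduces on modern systems, and the mechanism is worth spelling out: execve() rejects a file with no #! line and no recognized binary header (ENOEXEC), and a POSIX shell then falls back to running it as a script; an empty script does nothing and exits 0. A hedged sketch (the file name is made up, and system() is used just to get the shell's fallback behavior):

```c
/* Demo: an empty file with the executable bit set behaves like true.
 * execve() fails with ENOEXEC (no #!, no binary magic), and the shell
 * invoked by system() falls back to interpreting the file as a script;
 * an empty script exits 0. */
#include <stdlib.h>
#include <sys/stat.h>
#include <sys/wait.h>
#include <unistd.h>

/* Creates a zero-byte executable in the current directory, runs it via
 * the shell, removes it, and returns the command's exit status
 * (expected: 0), or -1 on setup failure. */
int empty_file_exit_status(void)
{
    char path[] = "./emptytrue.XXXXXX";  /* illustrative name */
    int fd = mkstemp(path);
    if (fd < 0)
        return -1;
    close(fd);                  /* leave it at zero bytes */
    chmod(path, 0700);          /* set the executable bit */
    int status = system(path);  /* sh execs it, hits ENOEXEC, falls back */
    unlink(path);
    if (status < 0 || !WIFEXITED(status))
        return -1;
    return WEXITSTATUS(status);
}
```

(What NeXT apparently got wrong was invoking it from /etc/profile: each fallback shell re-read /etc/profile, which ran true again, recursing until memory ran out.)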

(The point of that digression was that I've hung around enough system  
engineers and enough lawyers to wince when anybody says "there's no  
vulnerability in X". Reflex at this point, sorry. "I don't see how  
you'd exploit that" and "sucks less than the alternatives" are the best  
I can do...)

(The legal side of that includes software patents, disclaimer of  
liability, the author's "moral rights" in Germany that they _couldn't_  
disclaim until the law changed in December 2007, the not QUITE fully  
repealed US crypto export regulations, DMCA anti-circumvention  
nonsense... A 0-clause BSD does indeed suck least as a clear attempt to  
opt out, but IP law is inherently horrible for the foreseeable future.)

> The potential for vulnerability is only introduced with complexity.

Complexity such as mixing multiple licenses in the same program, unless  
they're all fully convertible to a single license.

I've learned enough of this game that I don't want to play it anymore...

> Anyway, this thread has gotten fairly off-topic.


