Date: Sun, 2 Sep 2012 16:09:06 +0400
From: Solar Designer <solar@...nwall.com>
To: john-users@...ts.openwall.com
Subject: Re: JTR against 135 million MD5 hashes

On Thu, Apr 12, 2012 at 10:49:53PM +0200, Simon Marechal wrote:
> It swaps on my 16Gb desktop. The only sane way to tackle this on a
> standard computer is to break it into pieces (look at the split
> command), crack a large part of it, and try to fit what is left in a
> single process.
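
For reference, that approach would look something like this (the file
and wordlist names are made up for illustration):

    # Split the big hash file into pieces of 10 million lines each
    # (produces piece-00, piece-01, ...):
    split -l 10000000 -d all-md5.txt piece-

    # Crack a large part of each piece in turn; cracked hashes
    # accumulate in john.pot and are skipped on subsequent loads:
    for f in piece-*; do
        john --format=raw-md5 --wordlist=words.lst --rules "$f"
    done

    # Then try to fit what is left in a single process: the loader
    # drops hashes already present in john.pot, so only the
    # still-uncracked ones are kept in memory:
    john --format=raw-md5 --incremental all-md5.txt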

As discussed on IRC, there's this recent blog post by m3g9tr0n and
Thireus, on cracking these same hashes:

http://blog.thireus.com/cracking-story-how-i-cracked-over-122-million-sha1-and-md5-hashed-passwords

It includes the resulting "wordlist" (83.6 million unique passwords, or
122 million counting duplicates across the different hash types,
according to the blog post).

I posted a few comments there, including one on how it was in fact
possible to load all of the hashes at once with John the Ripper.
Here's that comment:

"I've just confirmed that it is possible to load 146 million raw MD5
hashes at once.  I took bleeding-jumbo, changed
PASSWORD_HASH_SIZE_FOR_LDR in params.h from 4 to 6 to speedup loading of
the huge file (I'd have to wait tens of minutes otherwise), and ran John
on a file with exactly 146 million of raw MD5 hashes (32 hex chars per
line, 4818000000 bytes file size, hashes are of strings 0 through
145999999), using the --external=DumbForce mode (could as well be
incremental or wordlist, but not single crack or batch mode as these
need login names or the like).  The machine I ran this on is a 2xE5420
with 16 GB RAM, but only one CPU core was used and only 8 GB RAM ended
up being used by the john process (meaning this would work on an 8 GB
RAM machine with a little bit of swap as well, although performance
could suffer then).  The loading completed in under 3 minutes."
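
For anyone who'd like to reproduce that test, the input file can be
generated along these lines (the file name is made up, and the naive
shell loop below is only for illustration - spawning md5sum once per
string is far too slow for 146 million lines; a small program in a
compiled language does the same job much faster):

    # Each line is the MD5 of the decimal string: 32 hex chars plus
    # newline = 33 bytes; 146000000 * 33 = 4818000000 bytes total.
    seq 0 145999999 | while read i; do
        printf '%s' "$i" | md5sum | cut -c1-32
    done > 146M-md5.txt

    # After raising PASSWORD_HASH_SIZE_FOR_LDR from 4 to 6 in
    # params.h and rebuilding:
    john --format=raw-md5 --external=DumbForce 146M-md5.txt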

The number 146 million came from KoreLogic's announcement:

http://www.korelogic.com/InfoSecSouthwest2012_Ripe_Hashes.html

"We came up with a few--about 146 million."

I think the number of hashes in that dump that would potentially be
crackable as raw MD5 (32 hex digits) is actually slightly smaller -
maybe 135 million, as the Subject says.  (I never downloaded the
original file.)  That would fit in 8 GB RAM more comfortably (leaving a
few hundred MB for the system).
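
Back-of-the-envelope, from the numbers above (an estimate, not a
measurement):

    8 GB / 146M hashes  =  8*2^30 / 146e6  ~=  59 bytes per hash
    135M hashes * 59 bytes/hash  ~=  7.4 GB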

I don't know why 16 GB RAM didn't appear to be sufficient in Simon's
experiment.  bleeding-jumbo is supposed to use less RAM (than older
versions did) with --format=raw-md5 and a lot of hashes loaded, but not
by this much.  So if 8 GB is enough for bleeding-jumbo, 16 GB should
have been enough for older versions.  We might want to re-test, or maybe
just be happy that we'll make a release based on bleeding eventually.

Alexander
