Date: Sun, 10 Sep 2006 17:56:45 +0400
From: Solar Designer <>
Subject: Re: need advice on setting up cracking system for large organisation

On Sun, Sep 10, 2006 at 12:14:59PM +1200, Russell Fulton wrote:
> Long term if this proves valuable I will get
> dedicated resources for the project.  To get it off the ground I want to
> leverage those spare cpu cycles.  It is easier to make a business case
> when you have some solid evidence ;)

Sure.  Please note, however, that there's not much of a difference
between 1 CPU and 10 CPUs - even if you use all of them optimally - in
the percentage of passwords cracked, say, in a week.  The success rate
decreases rapidly the longer you let JtR run.  So if you're going to get
dedicated resources for this, my advice is that you settle for a single
fast workstation - e.g., a dual CPU x86 or PPC G5 system (x86s are
better for MD5, PPCs might be better for DES-based crypt(3) hashes).
You might also reuse any old/slow/retired machine for the dedicated
"secure drop box".

> > How fast are the "big machines" when it comes to password cracking?
> > They might have disk arrays, large memories and caches, but this is of
> > no use for John the Ripper.  Chances are that you can find a single
> > workstation that would be more suitable for the task.
> good point, particularly if JtR is not threaded.  Many of these boxes
> are multi cpu.

Well, multiple CPUs are of some help if you're going to be running
multiple instances of JtR anyway (one per hash type).  What I am saying
is that it might not be worth the risk to distribute the task across
multiple shared use systems.
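To run one instance per hash type, the input file can first be split by
hash type.  A minimal sketch (the sample entries and file names below are
made up for illustration; "$1$" marks MD5-based crypt(3) hashes, while
traditional DES-based hashes have no "$" prefix):

```shell
# Made-up shadow-style sample data, one MD5-based and one DES-based hash
printf 'alice:$1$ab$ccccccccccccccccccccc:0:0\nbob:AB0123456789.:0:0\n' > mixed.pw

grep -F  ':$1$' mixed.pw > md5.pw   # MD5-based crypt(3) entries
grep -vF ':$'   mixed.pw > des.pw   # entries with no "$"-prefixed hash

# Then one JtR process per file, e.g. (option names as in core John 1.7):
#   john --format=MD5 md5.pw &
#   john --format=DES des.pw &
```

This keeps each session's cracking state separate, so the per-CPU
processes do not interfere with each other.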

> >> I have tried a run with john using the mangled list against 10 users in
> >> a BSD MD5 hash format password file.  It took 15 hours on a fairly fast
> >> Intel processor.
> > 
> > That's because the pre-mangled wordlist is very large and the MD5-based
> > hashes are quite slow to compute.  If you intend to be running against a
> > substantially larger number of hashes of this type, you can opt to use
> > smaller wordlists.
> in a university environment we do get people using words from foreign
> languages (particularly forms of their names) in the belief that these
> are 'secure'.  So that's why I used the full list.

That's reasonable, and I am not suggesting that you drop the foreign
languages entirely.  I am merely suggesting that you drop the larger
versions of the wordlists for each language, including English, when
cracking the slower hashes.  For English, you could keep the wordlists
under 1-tiny and 2-small, but not those under 3-large and 4-extra.  For
other languages, where two different size versions of wordlists are
available, keep only those under 1-*, not those under 2-*.  Of course,
you need to start your combined wordlist with the common password lists -
these are to be included prior to any language-specific wordlists, maybe
even with mangling rules pre-applied.
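The ordering described above amounts to simple concatenation.  A minimal
sketch, with tiny stand-in files in place of the real Openwall wordlists
(the file names and contents here are invented for the example):

```shell
# Stand-in files; the real lists ship with the wordlist collection
printf 'password\n123456\n' > common.lst            # common password lists
printf 'apple\norange\n'    > english-1-tiny.lst    # small English list
printf 'amigo\n'            > spanish-1-small.lst   # small per-language list

# Common passwords go first, then the small per-language lists
cat common.lst english-1-tiny.lst spanish-1-small.lst > combined.lst

# To pre-apply mangling rules to the common-passwords part, JtR's
# wordlist mode can emit candidates to stdout:
#   john --wordlist=common.lst --rules --stdout > mangled-common.lst
```

Since JtR tries wordlist entries in order, this front-loads the
candidates most likely to succeed.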

> I've  just decided that I desperately need the Pro version ;)  I'll
> order it on Monday.

Feel free.  It also includes a newer revision of the all.lst wordlist,
so you'll want to refer to the #!comment lines from that revision when
own wordlists.
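For reference, Openwall wordlist files carry their documentation in
lines starting with "#!comment", which JtR's wordlist mode skips.  If
you post-process the lists with other tools, you may want to strip those
lines explicitly.  A sketch (the file contents below are made up; real
all.lst headers differ):

```shell
# Made-up stand-in for an Openwall-style wordlist with a #!comment header
printf '#!comment: sample header line\npassword\nletmein\n' > all.lst

# Keep only the actual candidate passwords
grep -v '^#!comment' all.lst > candidates.lst
```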

P.S. I've unsubscribed "ricardo anselmo".

Alexander Peslyak <solar at>
GPG key ID: 5B341F15  fp: B3FB 63F4 D7A3 BCCC 6F6E  FC55 A2FC 027C 5B34 1F15
- bringing security into open computing environments
