Date: Tue, 22 Nov 2005 23:36:58 +0100
From: "Frank Dittrich" <frank_dittrich@...mail.com>
To: john-users@...ts.openwall.com
Subject: Re: Speed up John

Solar Designer wrote:
>Yes.  You might be surprised that with 1 or 100,000 hashes loaded for
>cracking, you would be at around 2000 "passes" after a month, too.  The
>reason for this is that each subsequent pass takes longer to process.
>The first thousand of passes is done quickly, the second thousand takes
>a while, and the third thousand is almost never completed.

Assuming the password can contain up to 8 characters, and a
character set of 95 different characters, you'll get about 6.7e15
different passwords.
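For reference, that 6.7e15 figure is simply the sum of 95^i over lengths 1 through 8:

```python
# Keyspace for passwords of 1..8 characters drawn from a 95-character
# set (the printable ASCII range): sum of 95**length for each length.
total = sum(95**length for length in range(1, 9))
print(f"{total} (~{total:.1e})")  # ~6.7e15 candidates
```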

The master could precompute all password candidates just once,
saving a .rec file each time 100 million new candidates have
been generated.
Then store the status information from these .rec files in a DB.
Re-generate .rec files for the clients by adding the --format option ...
and let each client ask for a new .rec file after it has tried
100 million passwords.
(Of course, the cracking speed reported by the clients will be bogus,
but who cares.)
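The master's bookkeeping could be as simple as one table of saved
checkpoints. Here is a hypothetical sketch (table layout, the
"--format=dummy" placeholder, and all names are my own illustrative
assumptions, not anything john provides): store each .rec's contents in
a DB, and on request hand the lowest unclaimed chunk to a client with
its --format line patched in.

```python
# Sketch of the master's chunk DB; every identifier here is an
# assumption for illustration only.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE chunks (
    id INTEGER PRIMARY KEY,   -- chunk number (one per 100M candidates)
    rec_status TEXT,          -- saved contents of the .rec file
    claimed_by TEXT           -- client id, NULL while unassigned
)""")

def save_checkpoint(chunk_id, rec_text):
    """Master side: archive one .rec snapshot."""
    db.execute("INSERT INTO chunks (id, rec_status) VALUES (?, ?)",
               (chunk_id, rec_text))

def next_chunk(client_id, fmt):
    """Claim the lowest unassigned chunk and return its .rec body
    with the client's hash format substituted in."""
    row = db.execute("SELECT id, rec_status FROM chunks "
                     "WHERE claimed_by IS NULL ORDER BY id").fetchone()
    if row is None:
        return None  # keyspace exhausted
    chunk_id, rec = row
    db.execute("UPDATE chunks SET claimed_by = ? WHERE id = ?",
               (client_id, chunk_id))
    return rec.replace("--format=dummy", f"--format={fmt}")

save_checkpoint(1, "... --format=dummy ...")
print(next_chunk("client-a", "md5crypt"))
```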

I did not check whether john creates a .rec file with --stdout.
If not, you'd have to use a dummy password algorithm which doesn't
consume much run time.

This would still require computing the password candidates twice:
once on the master (though its .rec files could be re-used), and
once on the clients.

And if you want to use a newly generated .chr file instead of all.chr,
you'll have to repeat the precomputation of all password candidates
on the master.


Frank

