Date: Tue, 4 Apr 2006 22:26:17 +0200
From: "" <>
Subject: Re: JTR and Speed

> There are two reasons:
> 1. Matching salts.  In your second example, you have 527 password hashes
> with only 90 different salts - meaning that you have around 6 password
> hashes per salt on average.  For each candidate password (from your
> wordlist), John only needs to calculate 90 different hashes, not 527 -
> yet that is sufficient for checking the candidates against all of the
> loaded 527 password hashes.
> 2. Per-candidate "overhead".  John has to read each candidate password
> from your wordlist file and to set it up as a cryptographic key to be
> tried.  If you're using word mangling rules, the processing cost of that
> is added, too.  With only one password hash loaded, these operations are
> performed for each password hash calculated and checked.  With multiple
> hashes loaded, these per-candidate operations are only performed once
> for all password hash calculations and checks based on this candidate -
> for all salts.
> Both of these contribute to a higher effective combinations per second
> rate with more password hashes loaded for cracking simultaneously.
> BTW, it is uncommon to have only 90 different salts with 527 hashes.
> This indicates a problem with the way salts were being generated on the
> target system.

  Thank you for your fast reply :)
