Date: Fri, 25 Apr 2008 14:22:14 +0200
From: Simon Marechal <simon@...quise.net>
To: john-users@...ts.openwall.com
Subject: working with large files
Hello,
I noticed that john slows down a lot when working with large unsalted
password files. Early on, it spends nearly all of its time in the
functions that are called whenever a password is found:
Each sample counts as 0.01 seconds.
  %   cumulative   self              self     total
 time   seconds   seconds      calls  s/call  s/call  name
 53.99     35.69    35.69       1120    0.03    0.03  ldr_update_salt
 26.55     53.24    17.55       1119    0.02    0.05  crk_process_guess
 12.63     61.59     8.35     884044    0.00    0.00  ldr_load_pw_line
  4.18     64.35     2.76  817642756   0.00    0.00  binary_hash3
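(For scale: 35.69 s over 1120 calls is roughly 32 ms per cracked
password, which against 884,044 loaded hashes looks a lot like each
call walking a large part of the database.)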
This really isn't cool. I have already increased the size of the values
returned by the binary_hash functions (see the sketch below) and made
similar tweaks. However, it would be really useful to speed up
ldr_update_salt itself. Does anybody have a suggestion on how to do this
in a clean way?
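For illustration, the binary_hash change I mentioned just widens the
mask so that the cracker's lookup table gets more buckets and the
collision chains stay short. Something along these lines (the function
name and mask width are only an example, not the actual code):

	/* Illustrative only: a wider hash over the stored binary, so
	 * the lookup table grows to 2^20 buckets and collision chains
	 * shrink accordingly. Assumes binaries are at least 4 bytes. */
	static int binary_hash_wide(void *binary)
	{
		return *(unsigned int *)binary & 0xFFFFF;
	}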
PS: I couldn't give it much thought, but I guess that replacing the
ldr_update_salt call with a piece of code in crk_process_guess that
removes the found password from its next_hash chain would be enough to
gain a lot of speed (rough sketch below). Would there be side effects?
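Roughly what I have in mind (untested sketch; I'm assuming the salt's
hash table is an array of buckets chained through next_hash, and that
the entry struct looks more or less like this):

	/* Assumed shape of a password entry; only next_hash matters
	 * for this sketch. */
	struct db_password {
		struct db_password *next_hash;
		/* ... binary, source, etc. ... */
	};

	/* Splice the cracked entry out of its bucket's chain instead
	 * of rebuilding the salt's tables via ldr_update_salt. chain
	 * is the head pointer of the bucket that pw hashes into. */
	static void crk_remove_hash(struct db_password **chain,
	                            struct db_password *pw)
	{
		while (*chain) {
			if (*chain == pw) {
				*chain = pw->next_hash; /* unlink */
				return;
			}
			chain = &(*chain)->next_hash;
		}
	}

Walking one chain is cheap compared to rescanning the whole database,
but I don't know off-hand whether the per-salt counters or the
sequential password list would also need fixing up; that is part of
what I mean by side effects.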