Date: Fri, 08 Jun 2012 21:48:02 +0200
From: magnum <john.magnum@...hmail.com>
To: john-users@...ts.openwall.com
Subject: Re: JtR to process the LinkedIn hash dump

On 06/08/2012 02:41 PM, jfoug wrote:
> It could be handled in several ways:
> 1. Removing the dupe (nulling it), as you mentioned.
> 2. The workaround (re-run the found passwords again), as I mentioned.
> 3. Turning the format into a salted format with 2 salts: one for the
> 00000 hashes and one for the others.

For magnum-jumbo (lacking get_source) or the existing Jumbo-5, I think a
fourth option is worth contemplating: null the input hashes in split()
and the output hashes in cmp_one(). This will not slow anything down,
will completely avoid duplicates, and will have no problems with
reloading the .pot file (not even with old entries that lack nulls).
The only drawback is that the pot entries you collect with the LinkedIn
format can't be used to --show full SHA-1 hashes from other sources.
Actually they can, if you use the LinkedIn format for those too, but a
better alternative is to use the pot file as a seed.
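To make that concrete, here is a minimal C sketch of the idea, assuming
a raw SHA-1 style format. It is not an actual patch: the function and
macro names are mine, the real split()/cmp_one() take different
arguments depending on the tree, and the digest here merely stands in
for the candidate hash computed in crypt_all().

/*
 * Sketch: zero the first five hex digits of every loaded hash in
 * split(), and mask the same 20 bits of the computed digest in
 * cmp_one(), so a zeroed LinkedIn hash and a full SHA-1 of the same
 * password collapse into one canonical entry.
 */
#include <string.h>

#define HEX_ZEROED   5     /* leading hex digits wiped in the LinkedIn dump */
#define BINARY_SIZE 20     /* SHA-1 digest length in bytes */

/* split(): canonicalize the ciphertext so duplicates collapse */
static char *linkedin_split(char *ciphertext, int index)
{
    static char out[2 * BINARY_SIZE + 1];

    strncpy(out, ciphertext, sizeof(out) - 1);
    out[sizeof(out) - 1] = 0;
    memset(out, '0', HEX_ZEROED);  /* force the first 5 hex digits to '0' */
    return out;
}

/* cmp_one(): compare everything except the masked 20 bits */
static int linkedin_cmp_one(void *binary, int index)
{
    /* placeholder; a real format would use the candidate's SHA-1 here */
    unsigned char digest[BINARY_SIZE] = { 0 };

    /* null the same 20 bits on the output side, mirroring split() */
    digest[0] = 0;
    digest[1] = 0;
    digest[2] &= 0x0f;

    return !memcmp(digest, binary, BINARY_SIZE);
}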

magnum
