Date: Tue, 8 Feb 2011 00:59:40 +0000
From: Brandon Enright <bmenrigh@...d.edu>
To: john-users@...ts.openwall.com
Cc: solar@...nwall.com, bmenrigh@...d.edu, Ron <ron@...llsecurity.net>
Subject: Re: Gawker.chr (was: 1.7.6 with jumbo 11 patch (or omp-des-7) 64-Bit Fedora 14 RPM's)
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
On Tue, 8 Feb 2011 00:33:58 +0300
Solar Designer <solar@...nwall.com> wrote:
> On Sun, Feb 06, 2011 at 12:28:48AM +0000, Brandon Enright wrote:
> > I have been working on cracking Gawkers too. With the help from
> > others, I have cracked 667073 of 743863 unique hashes (89.67%). It
> > would be nice to sync up our cracks. Contact me off list if you're
> > up for doing so.
>
> This is impressive.
Thanks, I aim to impress ;-)
> Are you going to describe your approach?
Yes, that's the plan. I have a bad habit of not knowing when to stop
cracking and start writing. At this point I'm mostly just doing
--incremental cracking, so perhaps I can start writing since nothing
interesting is happening anymore.
> Are you going to make the results public?
Yes, that's the plan.
> - at least a .chr file and top N passwords, but preferably all of the
> actual passwords such that new .chr files and the like may be
> generated from them.
Yeah, I want to release good stats. A .chr file as well as some Markov
mode stats files.
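To illustrate the kind of statistics a .chr file is built from, here is a
minimal sketch in Python of per-position character frequency counting over a
cracked password list. This is only a simplified illustration: John the
Ripper's real .chr format is a binary format that also encodes length and
preceding-character context, and the sample passwords below are hypothetical.

```python
from collections import Counter

def position_stats(passwords):
    """Count character frequencies at each position across cracked
    passwords -- the raw kind of statistic a .chr charset file is
    derived from (the real format is richer: it also tracks
    password length and character context)."""
    stats = {}
    for pw in passwords:
        for i, ch in enumerate(pw):
            stats.setdefault(i, Counter())[ch] += 1
    return stats

# Hypothetical sample of cracked plaintexts:
cracked = ["password", "letmein", "monkey1"]
s = position_stats(cracked)
print(s[0])  # frequencies of first-position characters
```

Candidate ordering in incremental mode then prefers the characters (and
lengths) that these counts show to be most common.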
Ron (of skullsecurity.net) and I plan on posting the cracked list at
http://www.skullsecurity.net/wiki/index.php/Passwords
We're waiting on free time, more cracks, and time for the victims to
have changed their passwords. We'll also write a blog post providing
much more detail than what is in this email.
The very short version of my approach is that since day one I've had
between 24 and 32 cores crunching non-stop. On the nmap-dev list,
Fyodor mentioned that it would be nice to use the Gawker passwords in
Nmap's password stats; when I mentioned that I was cracking them,
several people contacted me offering CPU time. I farmed out some
quite large Markov mode jobs to various people.
I'd say that for more than 2 weeks straight I had various Markov jobs
split up among at least 64 cores.
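Splitting a Markov job across cores works because the jumbo patch's Markov
mode enumerates an indexed keyspace; as I understand its
--markov=LEVEL:START:END:LENGTH syntax, each worker can be handed a disjoint
start/end index range. A rough sketch of how such a range might be divided
(the worker count and range are made-up numbers, not my actual job sizes):

```python
def split_range(start, end, workers):
    """Divide an index range [start, end) into near-equal contiguous
    chunks, one per worker -- the way a large Markov-mode keyspace
    can be farmed out as independent start/end jobs."""
    size = end - start
    chunks = []
    for w in range(workers):
        lo = start + (size * w) // workers
        hi = start + (size * (w + 1)) // workers
        chunks.append((lo, hi))
    return chunks

# e.g. a hypothetical keyspace of 10^9 indices split across 64 cores
jobs = split_range(0, 10**9, 64)
print(len(jobs), jobs[0], jobs[-1])
```

Because the chunks are contiguous and disjoint, workers never duplicate
candidates and the results can simply be merged.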
Between volunteer CPU time and others syncing up their cracks with me
the task hasn't been all that difficult.
I now have an --incremental=rockyou run using Minga's .chr file going
on 24 cores. It has been running for 45 days.
Brandon
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v2.0.16 (GNU/Linux)
iEYEARECAAYFAk1QlYMACgkQqaGPzAsl94L44ACgrFIY6jDFVmH1UBLtUGtZGcXm
iFEAnjBxVfkAyysT3Fo1rLBDOR3GRxN2
=9vlr
-----END PGP SIGNATURE-----