Date: Fri, 2 Oct 2015 10:33:13 +0200
From: magnum <john.magnum@...hmail.com>
To: john-dev@...ts.openwall.com
Subject: Re: Kerberoast for John

On 02/10/15 09:52, Michael Kramer wrote:
> On Wednesday, 30 September 2015 at 22:39 CEST, magnum <john.magnum@...hmail.com> wrote:
>>
>> Thanks! I committed your patch as-is and then made significant changes
>> and enhancements in a separate commit:
>> https://github.com/magnumripper/JohnTheRipper/commit/05e5146
>> https://github.com/magnumripper/JohnTheRipper/commit/00bd1bb

>> You were using OpenSSL EVP, which is slow and not thread-safe. I bet
>> that bug was because of that, so it was probably squashed in the process.
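
To illustrate the thread-safety part with a minimal sketch (this is not 
the code from those commits; it assumes OpenSSL 1.1+ with RC4 enabled): 
an EVP_CIPHER_CTX carries per-stream state, so under OpenMP each thread 
has to allocate its own context rather than share one. The speed gap is 
a separate matter, largely down to EVP's per-call setup and indirection 
compared to direct low-level calls.

#include <openssl/evp.h>

/* Sketch: RC4-decrypt one ciphertext under many candidate keys.
 * The context is created inside the parallel region, so every thread
 * owns its own EVP_CIPHER_CTX; sharing a single context across threads
 * would corrupt the cipher state. */
static void rc4_try_keys(const unsigned char (*keys)[16],
                         const unsigned char *ct, int ct_len,
                         unsigned char *out, int count)
{
#ifdef _OPENMP
#pragma omp parallel
#endif
    {
        EVP_CIPHER_CTX *ctx = EVP_CIPHER_CTX_new(); /* per-thread */
        int outl;
        int i;

#ifdef _OPENMP
#pragma omp for
#endif
        for (i = 0; i < count; i++) {
            EVP_DecryptInit_ex(ctx, EVP_rc4(), NULL, keys[i], NULL);
            EVP_DecryptUpdate(ctx, out + (size_t)i * ct_len, &outl,
                              ct, ct_len);
        }
        EVP_CIPHER_CTX_free(ctx);
    }
}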

> Thanks again for fixing/enhancing my code! I was able to test it today and it works faster and better than before.
>
> But I still encounter this strange bug. If I just use ./john <myfile>, the speed gets slower over time.
>
> Some numbers:
>
> 0g 0:00:00:01 33.54% 1/3 (ETA: 08:32:05) 0g/s 405927p/s 405927c/s 405927C/s
>
> 0g 0:00:01:08 69.54% 2/3 (ETA: 08:33:40) 0g/s 43377p/s 535782c/s 535782C/s

> 0g 0:01:16:24  3/3 0g/s 2320p/s 526810c/s 526810C/s
>
> Is this behaviour normal?
> The file I've loaded has 311 hashes.

If you look at the c/s or C/s figures, it actually gets faster. The 
first stage is "single mode", which is expected to have a LOT better 
p/s with many salts than any other mode, due to its design. All other 
modes will have p/s ~= (c/s / number of unique salts), or equivalently 
c/s ~= (p/s * number of unique salts), and you can expect the c/s 
figure to match the benchmark speed figure.
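
As a rough model (a sketch of the arithmetic, not John's actual 
bookkeeping): with S unique salts loaded, each candidate password has 
to be crypted once per salt in wordlist or incremental mode, so the 
candidate rate is the crypt rate divided by S. Single mode aims its 
candidates at individual salts, which is why its p/s can stay close to 
its c/s. For example (the c/s value below is hypothetical, and the salt 
count assumes one unique salt per loaded hash):

/* Sketch of the p/s vs. c/s relationship, not John's actual code.
 * p/s = candidate passwords tested per second
 * c/s = crypts (hash/cipher computations) per second
 * With S unique salts, each candidate costs S crypts outside single
 * mode, so p/s ~= c/s / S. */
#include <stdio.h>

int main(void)
{
    double crypts_per_sec = 500000.0; /* hypothetical benchmark c/s */
    int unique_salts = 311;           /* assumes one unique salt per hash */

    printf("expected p/s with %d salts: ~%.0f\n",
           unique_salts, crypts_per_sec / unique_salts); /* ~1608 */
    return 0;
}

The posted status lines will not match such a calculation exactly, 
since those speeds are cumulative averages over the whole session, but 
the ratio is the point.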

If anything, the c/s of stage 2 (wordlist + rules) is curious. It seems 
to indicate stage 2 is slightly faster than incremental (stage 3). That 
is normally not the case.

magnum
