Date: Tue, 6 Mar 2012 09:11:04 +0200
From: Milen Rangelov <gat3way@...il.com>
To: john-dev@...ts.openwall.com
Subject: Re: RAR format finally proper

Great :) Did the memory buffering improve things a lot with your code? I
was surprised to find that it doesn't improve performance much here :(

It would be great if you could find some way to improve the checking and
share it, of course :)

As for AES/OpenSSL, I read somewhere that they implemented runtime AES-NI
detection/use. I don't think this has made it into the Debian packages I
use yet, though. It might improve things a lot.

I also scan the archive to find the smallest file to crack. I went a bit
too far by refusing to crack an archive if there is no file smaller than
4 MB in it; it just becomes so slow otherwise. I am seriously considering
doing some bulk AES decryption on the GPU, but the memory and bus
bandwidth requirements become so bad :(

On Tue, Mar 6, 2012 at 2:16 AM, magnum <john.magnum@...hmail.com> wrote:

> Thanks to Milen's advice I am now committing a working version of the RAR
> format that does not spawn unrar yet can attack all modes. It was fairly
> simple to get the clamav code running: I changed it to work on a memory
> buffer instead of a file handle, and removed legacy RAR stuff and
> autoconf macros. It now decrypts and CRCs on the fly. I also added OMP
> support to the format. We do not support solid archives yet, but I think
> that's doable with a little research (it's supported by the clamav code).
>
> I can now see the AES decryption becoming significant with larger files.
> For small files the speed is about the same as for the -hp mode. I
> tried a 2 MB file and the speed went down from ~36 c/s per core to
> single-digit figures :-/  But rar2john will now scan the whole archive
> and pick the smallest file possible.
>
> There are a lot more optimisations possible. I'm not sure the unpack
> code bails out immediately on any kind of bad data; it may continue
> longer due to lax checking. But in many cases it does bail out early:
> it averaged rejecting after 41% of the available data in a small test
> with various files. Anyway, we now have a decent base to work from, so
> things will only get better.
>
> We still need OpenCL of course. Samuele, hit it! :-)
>
> Note that input files created with any older version of rar2john need to
> be recreated.
>
> magnum
>
>
