Date: Thu, 20 Aug 2020 13:34:17 -0400
From: Powen Cheng <madtomic@...il.com>
To: john-users@...ts.openwall.com
Subject: Re: Performance John in the cloud

"presentation
https://www.openwall.com/presentations/Passwords12-Probabilistic-Models/"

This is very useful.

"Cool, but that's for PBKDF2-SHA256, not for scrypt."

I see now that you pointed this out.
I did some more reading and saw that CPU cracking is the only way to
crack an Ethereum wallet with scrypt parameters n:262144, r:8, p:1, as I
guess there currently isn't any GPU on the market with enough RAM for the
job.
This brings me to my next question: is there a way to convert scrypt
n:262144 to n:1024?


---

https://stealthsploit.com/2018/01/04/ethereum-wallet-cracking-pt-2-gpu-vs-cpu/
When GPUs Can and Can’t Crack

Please note that manufacturers refer to the basic unit of scheduling
differently, so the “Threads per compute unit” will differ. NVIDIA cards
have a warp size of 32 (a warp has 32 threads) and AMD cards have wavefront
size of 64 (a wavefront has 64 threads)… When it comes to “compute units”,
NVIDIA cards have stream multiprocessors (SM) and AMD cards just use
“compute units” (CU). This’ll be put into context further down…

First let’s use the example from the wallet I used in my previous post
<https://stealthsploit.com/2017/06/12/ethereum-wallet-cracking/>.

{"dklen":32,"n":1024,"r":8,"p":1} – cracking on a GTX 1080

Step 1: (128 * 8) * 1024 = 1024 * 1024 = 1,048,576 bytes = 1 MB

Step 2: 32 (NVIDIA card) * 20 (a 1080 has 20 SMs
<https://www.techpowerup.com/gpudb/2839/geforce-gtx-1080>) = 640 parallel
computations

Step 3: 1 MB * 640 = *640 MB RAM required per GTX 1080*
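The three steps above boil down to one formula: scrypt needs 128 * r * N
bytes per instance, times the number of threads the GPU runs in parallel.
A minimal Python sketch (the function name is mine, not part of the post):

```python
def scrypt_ram_bytes(n, r, threads_per_cu, compute_units):
    """Estimated GPU RAM to run one scrypt instance per hardware thread."""
    per_instance = 128 * r * n              # Step 1: bytes per scrypt computation
    parallel = threads_per_cu * compute_units  # Step 2: parallel computations
    return per_instance * parallel          # Step 3: total RAM needed

# GTX 1080: warp size 32 (NVIDIA), 20 SMs; wallet with n=1024, r=8
ram = scrypt_ram_bytes(n=1024, r=8, threads_per_cu=32, compute_units=20)
print(ram // (1024 * 1024), "MB")  # prints "640 MB"
```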

As a GTX 1080 has 8 GB of RAM, which is > 640 MB, we can crack the above
wallet without issue. Now let's look at another example wallet:



{"dklen":32,"n":262144,"r":8,"p":1} – cracking on a Radeon RX Vega 64

Step 1: (128 * 8) * 262144 = 1024 * 262144 = 268,435,456 bytes = 256 MB

Step 2: 64 (AMD card) * 64 (an RX Vega 64 has 64 CUs
<https://www.techpowerup.com/gpudb/2871/radeon-rx-vega-64>) = 4,096
parallel computations

Step 3: 256 MB * 4,096 = 1,048,576 MB RAM = *1,024 GB RAM required per
Vega 64*
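Plugging the Vega 64's figures into the same arithmetic makes the blow-up
obvious. A quick sketch (helper name is mine, mirroring the steps above):

```python
def scrypt_ram_mb(n, r, threads_per_cu, compute_units):
    """Total MB of GPU RAM: (128 * r * N bytes) per instance, times threads."""
    return (128 * r * n) * threads_per_cu * compute_units // (1024 ** 2)

# RX Vega 64: wavefront size 64 (AMD), 64 CUs; wallet with n=262144, r=8
print(scrypt_ram_mb(262144, 8, 64, 64))  # prints 1048576, i.e. 1,024 GB
```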

Last time I checked, a Vega 64 has less than a terabyte of RAM! So this
will crash and burn, often ending in a BSOD if the system doesn’t handle
the memory failure well.
