Date: Thu, 14 Jul 2011 21:26:04 +0400
From: Solar Designer <>
Subject: entropy loss with narrow-pipe iterated hashes


Just to bookmark these, so to speak:

This is relevant in case we choose to use crypto cores with relatively
little internal state (to fit more cores per chip).

Summary: the entropy loss rate is low, but we need to be aware of what
it is or may be, and keep it in consideration for our decision-making.

Some excerpts from the above:

"Danilo Gligoroski, Vlastimil Klima: Practical consequences of the
aberration of narrow-pipe hash designs from ideal random functions,
IACR ePrint Report 2010/384.

The theoretical loss is -log2(1 - 1/e) = about 0.66 bits of entropy per
log2(N additional iterations)."
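To make the 0.66-bit figure concrete, here is a small empirical sketch of my
own (not from the quoted papers): iterate one fixed random mapping over a toy
16-bit space and watch its image shrink.  A single application of a random
function reaches about a 1 - 1/e (~63.2%) fraction of the space, i.e. about
-log2(1 - 1/e) ~ 0.66 bits lost on the first step, with the loss slowing down
on later iterations.

```python
import math
import random

# Toy model: a fixed random mapping f on a 16-bit space stands in for a
# narrow-pipe hash truncated to its internal state width.
random.seed(1)
N = 1 << 16                                  # 2^16 possible states
f = [random.randrange(N) for _ in range(N)]  # one fixed random mapping

image = set(range(N))
fractions = []
for k in range(4):
    image = {f[x] for x in image}            # image of the k+1-th iterate
    fractions.append(len(image) / N)
    print(f"iteration {k+1}: {fractions[-1]:.4f} of space reachable, "
          f"loss so far = {-math.log2(fractions[-1]):.3f} bits")
```

The first printed fraction lands near 1 - 1/e = 0.632 (loss ~ 0.66 bits), and
each further iteration shrinks the reachable set by a smaller factor, matching
the "low loss rate" summary above.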

"See "Random Mapping Statistics", P. Flajolet and A. M. Odlyzko, Advances
in Cryptology - EUROCRYPT '89, 1990.

The paper gives the recurrence
   t(k+1) = e^(t(k)-1), with t(0) = 0,
where 1 - t(k) is the fraction of the space still reachable after k
iterations, so the entropy lost is -log2(1 - t(k)).

So, for instance, by the 256th iteration, you have only lost 7.01 bits
of entropy, not 8 bits. And, you will never get below
  sqrt( (pi * 2^n) / 2 )
where 'n' is the number of bits in the hash you iterate over. This is
about 128.3 bits for SHA-256."

"These entropy discussions are moot because in the real world we don't
care about 'entropy'; we care about what I have heard referred to as
'conditional computational entropy', or the entropy experienced by
somebody with a real device, not a device that can enumerate all
states in an iterated 256-bit hash and know which states can be reached.

Back in the real world, we don't lose any 'conditional computational
entropy' upon iteration."

