RNG Whitening Bug Weakened All Versions of GPG

Werner Koch, maintainer of Libgcrypt and GnuPG, announced today:

"Felix Dörre and Vladimir Klebanov from the Karlsruhe Institute of Technology found a bug in the mixing functions of Libgcrypt's random number generator: An attacker who obtains 4640 bits from the RNG can trivially predict the next 160 bits of output. This bug exists since 1998 in all GnuPG and Libgcrypt versions. … All Libgcrypt and GnuPG versions released before 2016-08-17 are affected on all platforms. A first analysis on the impact of this bug in GnuPG shows that existing RSA keys are not weakened."

However, in the text of one of the patches (archived) which accompanied this announcement, we find a slightly different statement:

"This bug does not affect the default generation of keys because running gpg for key creation creates at most 2 keys from the pool: For a single 4096 bit RSA key 512 byte of random are required and thus for the second key (encryption subkey), 20 bytes could be predicted from the first key. However, the security of an OpenPGP key depends on the primary key (which was generated first) and thus the 20 predictable bytes should not be a problem. For the default key length of 2048 bit nothing will be predictable."

In effect, this means that no key created with GPG to date carries more than 580 bytes of effective entropy (e.g., all 4096-bit and above RSA keys have 'subkeys' which – we now find – mathematically relate, in a possibly-exploitable way, to the primary key).
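The announced failure mode can be illustrated with a toy model. This is emphatically not Libgcrypt's actual mixing code; the function names and structure are illustrative assumptions, and SHA-1 (also 160-bit) stands in for the RIPEMD-160 whitener GPG actually used. The point is only the shape of the flaw: after an attacker observes 4640 bits (580 bytes) of pool output, the next 160 bits (20 bytes) are a deterministic function of what was already seen.

```python
# Toy model of the announced flaw -- NOT Libgcrypt's actual code.
# SHA-1 stands in for GPG's RIPEMD-160 whitener (both emit 160 bits).
import hashlib
import os

POOL_READ = 580   # bytes the attacker observes (4640 bits)
PREDICTED = 20    # bytes the attacker can then predict (160 bits)

def flawed_rng_stream(seed: bytes) -> bytes:
    """Emit 600 bytes where the last 20 depend only on the first 580."""
    # Stand-in for 580 bytes of pool output:
    observed = hashlib.shake_256(seed).digest(POOL_READ)
    # The flaw: the next 'fresh' block is derived from bytes already
    # emitted, rather than from new pool entropy.
    leaked = hashlib.sha1(observed).digest()
    return observed + leaked

stream = flawed_rng_stream(os.urandom(32))
seen, nxt = stream[:POOL_READ], stream[POOL_READ:]

# The attacker, holding only the 580 observed bytes, reproduces the
# next 160 bits of 'random' output exactly:
prediction = hashlib.sha1(seen).digest()
assert prediction == nxt
```

In this sketch the relationship is a bare hash of prior output; in the real generator the dependency ran through the pool-mixing step, but the attacker's position is the same: observed output fully determines the "next" 20 bytes.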

It should be remembered that, due to the structure of the OpenPGP format, breaking a GPG subkey is often quite nearly as good as breaking the primary key – i.e. it will allow the attacker to create valid signatures, in the case of a signature-only subkey, or else to read intercepted ciphertext, or both.

And thus we find that, due to the staggeringly-braindamaged design of the protocol and of this implementation, GPG users who elected to use longer-than-default GPG keys (Phuctor presently contains 1,090,450 RSA moduli which exceed 2048 bits in length [1]) ended up with smaller-than-default effective cryptographic strength.

Likewise noteworthy is the fact that this bug was contained in an RNG 'whitening' routine. The popular but wholly-pseudoscientific practice of RNG 'whitening' creates the appearance of an effective source of entropy at times when – potentially – none exists [2], at the cost of introducing a mathematical relationship (sometimes, as in the case at hand, a very exploitable one) between RNG output bits, which by their nature are intended to be wholly uncorrelated.

  1. Not all of these moduli were generated using GPG. 

  2. A whitened (walked over with, e.g., RIPEMD – as in GPG, or SHA2, or AES) stream of zeroes will typically pass mathematical tests of entropy (e.g., the Diehard suite) with flying colors, while at the same time containing no meaningful entropy in the cryptographic sense.
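Footnote 2 is easy to demonstrate. The sketch below (a minimal illustration, using SHA-256 in a simple counter construction rather than any particular RNG's actual whitener) hashes a stream of zeroes: the output is statistically balanced and covers the byte range, yet anyone can regenerate it from nothing, because the input entropy was zero.

```python
# 'Whiten' an all-zero input stream and observe that the result looks
# statistically random while containing no secret whatsoever.
import hashlib
from collections import Counter

def whiten_zeroes(n_blocks: int) -> bytes:
    """Hash zero-bytes-plus-public-counter with SHA-256 -- a toy whitener."""
    out = b''
    for i in range(n_blocks):
        # Input is fully predictable: 64 zero bytes and a public counter.
        out += hashlib.sha256(b'\x00' * 64 + i.to_bytes(8, 'big')).digest()
    return out

stream = whiten_zeroes(1024)          # 32 KiB of 'whitened' zeroes
ones = sum(bin(b).count('1') for b in stream)
bit_bias = ones / (len(stream) * 8)   # hovers near 0.5, as real entropy would
distinct = len(Counter(stream))       # covers all 256 byte values

print(f"bit bias:       {bit_bias:.4f}")
print(f"distinct bytes: {distinct}/256")

# Yet the 'entropy' is nil: the identical stream is computable by anyone.
assert whiten_zeroes(1024) == stream
```

A statistical test suite sees only the output distribution; it cannot distinguish a hashed counter from a true entropy source, which is precisely why passing such tests proves nothing about cryptographic strength.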

9 thoughts on “RNG Whitening Bug Weakened All Versions of GPG”

  1. > ended up with smaller-than-default effective cryptographic strength

    At least in theory (so far supported by reading the code), the Koch version of pgp does the following :
    takes 512 bytes of entropy and creates the main key;
    takes a further 68 bytes of entropy + 20 bytes of junk derived from the previous + a further 424 bytes of entropy and creates the "subkey".
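    The byte accounting above can be checked in a few lines (a sketch using the figures as given: a 4096-bit RSA key draws 512 bytes of pool output, and the 20 predictable bytes fall inside the subkey's draw):

    ```python
    # Sanity-check the pool-byte accounting for two back-to-back
    # 4096-bit RSA key generations, per the figures quoted above.
    PRIMARY = 512       # bytes of pool output consumed by the primary key
    FRESH_BEFORE = 68   # fresh bytes read for the subkey before the gap
    PREDICTED = 20      # bytes predictable from the primary key's output
    FRESH_AFTER = 424   # fresh bytes read after the predictable gap

    subkey_total = FRESH_BEFORE + PREDICTED + FRESH_AFTER
    assert subkey_total == PRIMARY        # same-size draw as the primary key

    fresh_fraction = (FRESH_BEFORE + FRESH_AFTER) / subkey_total
    print(f"subkey bytes actually fresh: {fresh_fraction:.1%}")  # 96.1%
    ```

    So the primary key's 512 bytes are untainted, and the subkey loses 20 of its 512 bytes (about 3.9%) to predictable material.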

    Leaving aside the many various ways in which Koch-pgp is braindamaged both in its design and implementation, this particular flaw by itself does not amount to someone asking for 4096 keys obtaining a weaker item than someone asking for the default (which is still, to this day, 2048).

    Nevertheless, I find it astounding that the man would have the gall to write what he wrote in the comments after he did what he did. It's perhaps not surprising that he doesn't have the intelligence to avoid tagging the mess with the remarkably USG-smelling "this doesn't matter anyway because you don't know how my sponsor is using it in practice".

    In any case, there's no place in civilised society for Werner Koch. Then again we saw this before.

    • > 20 bytes of junk derived from the previous + a further 424 bytes of entropy

      This is sufficient to economically break the subkey's RSA modulus if the 20 known bytes can be mapped to known portions of the resulting modulus.

      • Seems rather probable that there's a lot more to this pile of garbage than the exposed tip, in general.

        In particular, it might be an idea to look at just how much of the entropy used can be reconstructed on the basis of the pubkey.

  2. So, should I generate new subkeys?
