The solution to a mathematical problem could render all public key cryptography obsolete – should people and businesses be worried about the "Cryptopocalypse"?

Nov 2, 2013 in Blog

Earlier this year, a presentation at the Black Hat conference warned of a "factoring cryptopocalypse" – a scenario in which factoring large numbers and solving the discrete logarithm problem become easy, rendering two of the most widely used public-key methods (RSA and Diffie-Hellman) practically useless.

The consequences of such an occurrence would be profound, because data protected with these methods – which, via key exchange, covers the bulk of encrypted traffic on the Internet – could then be exposed with relatively modest resources. Public-key cryptography also underpins very widely used Internet standards such as Transport Layer Security (TLS) and OpenPGP encryption (PGP/GPG).

Understandably this presentation quickly became controversial in security circles, and it has already generated a lot of commentary around the internet. The talk is controversial because it calls into question two important mathematical assumptions that allow public-key encryption to work (at least one of which is relied on by virtually every public-key system in common use):

  1. That it is difficult (or infeasible) to find the prime factors of a large integer when that integer is the product of two or more large primes.
  2. That it is likewise difficult (or infeasible) to compute discrete logarithms – to recover the secret exponent x from g^x mod p given the publicly known base g – whether in a finite field (classic Diffie-Hellman) or in an elliptic curve group (a toy sketch of both problems follows below).
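To make these assumptions concrete, here is a toy sketch in Python (not taken from the talk – the numbers are deliberately tiny and purely illustrative). It brute-forces the factors of a small semiprime and a small discrete logarithm, which are exactly the computations that real key sizes are chosen to make hopeless:

```python
# Toy sketch only - real keys use numbers hundreds of digits long, and these
# brute-force loops are exactly the computations that size is meant to defeat.

def factor(n):
    """Return the prime factors of a semiprime n by trial division."""
    p = 2
    while p * p <= n:
        if n % p == 0:
            return p, n // p
        p += 1
    return None

def discrete_log(g, h, p):
    """Return the x with pow(g, x, p) == h by exhaustive search."""
    value = 1
    for x in range(p):
        if value == h:
            return x
        value = (value * g) % p
    return None

# RSA-style assumption: given n = p*q, recovering p and q should be infeasible.
print(factor(62615533))                              # (7907, 7919)

# Diffie-Hellman-style assumption: given g, p and g**x mod p, recovering x should be infeasible.
print(discrete_log(5, pow(5, 1234, 10007), 10007))   # 1234
```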


What happens if these assumptions are no longer valid?

In practical terms, this means that anyone who wanted to decrypt your files could do so with minimal (or at least feasible) computing resources. Encryption technology is essentially built on the idea that even the most powerful computers would need years, decades, centuries or eons to crack a key, even by continuous brute force.
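As a rough back-of-envelope illustration of that idea – the guess rate below is an assumption picked for argument's sake, not a benchmark – here is how long exhausting different key sizes would take:

```python
# Back-of-envelope only: the guess rate is an assumption, not a benchmark.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365
guesses_per_second = 10**12   # a very generous trillion guesses per second

for bits in (56, 80, 128, 256):
    years = 2**bits / guesses_per_second / SECONDS_PER_YEAR
    print(f"{bits:>3}-bit key space: ~{years:.2e} years to exhaust by brute force")
```

A 56-bit key space (old DES-sized) falls in about a day at that rate; a 128-bit one takes on the order of 10^19 years. That asymmetry is the whole point of well-chosen key sizes.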


Why is this becoming an issue now? Surely the experts already know this and technology will solve it?

This is true to a degree, and these issues have in fact been studied intensively for decades – some examples include the RSA problem, the discrete logarithm problem, and the P vs NP problem. The issue is not really about people or technology at all – it is about mathematics.

Bruce Schneier wrote about this issue in 1999, and his explanation still stands today. There are four basic reasons why breaking encryption (or, more precisely, factoring) gets easier over time:

  1. Computers are getting faster i.e. more computation power
  2. Computers are better networked i.e. more computation power
  3. Factoring algorithms are getting more efficient i.e. less computational power needed
  4. Fundamental advances in mathematics are giving us better factoring algorithms


The first three reasons are actually not much of an issue at all (in relative terms), because these kinds of technological advances are steady and easy to predict – you can simply increase your key sizes to keep ahead of them, e.g. moving from 1024-bit to 2048-bit RSA keys, or from 128-bit to 256-bit symmetric keys (a rough illustration follows below).
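A quick sketch of why that works. Assuming, purely for illustration, that an attacker's computing power doubles roughly every two years, each extra key bit doubles the work a brute-force attacker faces, so a handful of added bits cancels out decades of hardware progress:

```python
# Illustrative only: assume attacker compute doubles roughly every 2 years.
doubling_period_years = 2

for horizon in (10, 20, 50):
    doublings = horizon / doubling_period_years
    hardware_speedup = 2 ** doublings
    print(f"In {horizon} years hardware is ~{hardware_speedup:,.0f}x faster; "
          f"adding ~{doublings:.0f} key bits cancels that advantage out")
```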

It is in fact the fourth point that causes the dystopian nightmares. Fundamental advances in mathematics are rare, but their effects can be profound – and if anything were to render a cryptographic algorithm such as RSA completely useless, it would be exactly this.
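To see how much bigger an algorithmic advance is than faster hardware, here are the standard heuristic cost formulas for two factoring algorithms, the quadratic sieve and the general number field sieve. The o(1) terms are ignored, so the exponents below are only ballpark figures, not precise security levels:

```python
import math

def qs_cost(bits):
    """Heuristic quadratic sieve cost: exp(sqrt(ln N * ln ln N))."""
    ln_n = bits * math.log(2)
    return math.exp(math.sqrt(ln_n * math.log(ln_n)))

def gnfs_cost(bits):
    """Heuristic number field sieve cost: exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3))."""
    ln_n = bits * math.log(2)
    return math.exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

for bits in (512, 1024, 2048):
    print(f"{bits}-bit modulus: quadratic sieve ~2^{math.log2(qs_cost(bits)):.0f} ops, "
          f"number field sieve ~2^{math.log2(gnfs_cost(bits)):.0f} ops")
```

Even at this crude level, the jump from the quadratic sieve to the number field sieve shaved far more off the cost of factoring a 1024-bit modulus than a decade of faster computers ever could – which is why a genuinely new mathematical idea is the thing to watch.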


Should people and businesses be worried about this development?

The short answer is no – at least for the foreseeable future – and here is why.

The authors of the talk base their warnings on recent fundamental advances in solving the discrete logarithm problem (the work of Joux and others). However, those results apply only to discrete logarithms in finite fields of small characteristic, and the cryptographic systems in wide use today do not rely on such fields – so the new algorithms described in the Joux papers do not break them.

Yes – there is still a chance that a fundamental mathematical breakthrough could break an entire class of cryptosystems, regardless of key length. And yes – technology will continue to improve, so cracking encrypted systems will continue to get cheaper.

At the moment, however, it is still far easier to simply increase key lengths to stay ahead of these advances. For example, modern RSA deployments use 2048-bit keys as the current standard (which is still plenty for most security purposes), and 4096-bit keys are already widely supported (a short sketch of generating one follows below).
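As a concrete sketch – assuming the third-party Python `cryptography` package, which is just one of several ways to do this – moving from the 2048-bit baseline to a 4096-bit RSA key is a one-parameter change:

```python
# Sketch, assuming the third-party "cryptography" package is installed
# (pip install cryptography): generate a 4096-bit RSA key instead of the
# more common 2048-bit baseline.
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.hazmat.primitives import serialization

private_key = rsa.generate_private_key(
    public_exponent=65537,   # standard choice of public exponent
    key_size=4096,           # 2048 is the common baseline; 4096 buys extra margin
)

pem = private_key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.PKCS8,
    encryption_algorithm=serialization.NoEncryption(),
)
print(pem.decode()[:64], "...")
```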

It is still a little scary to imagine, though, that so much of the internet security we rely on depends on the intractability of a couple of particular mathematical problems. The risks are enough to keep researchers watchful, but they are nowhere near high enough to be a real public concern.


Recommendations:

  • In the short term – check what encryption technologies you are using on your systems, and upgrade them to the latest versions accordingly (a small sketch for checking a server's key size follows after this list).
  • In the medium term – make sure that any encryption technologies you use are integrated in such a way that they can easily be replaced later.
  • In the longer term – the industry consensus is to begin switching to elliptic curve cryptography (ECC) as it becomes available and feasible. There are still some issues with the availability of ECC implementations, but this is likely to change quickly over the next year or so.
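For the short-term check above, here is a hedged helper sketch. The hostname is just a placeholder, and it again assumes the third-party `cryptography` package and a server presenting an RSA or elliptic curve key:

```python
# Sketch: see what public key size a TLS server actually presents.
# Assumes the third-party "cryptography" package; the hostname is a placeholder.
import ssl
from cryptography import x509

hostname = "example.com"   # placeholder - substitute your own server

pem_cert = ssl.get_server_certificate((hostname, 443))
cert = x509.load_pem_x509_certificate(pem_cert.encode())
public_key = cert.public_key()

# key_size is available on RSA and elliptic curve public keys.
print(f"{hostname} presents a {public_key.key_size}-bit {type(public_key).__name__}")
```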
