Computational vs. information-theoretic security


Page 1: Computational vs. information-theoretic security


Preventing crypto-apocalypse: Securing our data against the quantum computer threat

Peter Knight, Imperial College London; RS Chicheley Hall and NPL

•  Rapid developments in quantum computing threaten aspects of encryption.

•  Two approaches to tackle this threat:

–  quantum key distribution (QKD) addresses the problem of key distribution directly using principles of quantum physics.

–  post-quantum cryptography (PQC) develops novel classical ciphers thought to be invulnerable to quantum computers.

–  Although its definition would allow it, QKD has historically not been considered part of PQC. ETSI has taken cognisance of this by bringing both QKD and PQC under the umbrella term “quantum-safe cryptography”. The UK Blackett Review also supports developing the two in parallel.

•  Quantum Safe Cryptography and Security: An Introduction, Benefits, Enablers and Challenges, European Telecommunications Standards Institute (ETSI), White Paper V1.0.0 (2014-10), ISBN 979-10-92620-03-0.

Computational vs. information-theoretic security

•  DES, AES, RSA and the one-time pad (OTP) represent the contemporary cryptographic era. They feature varying key lengths and differ algorithmically. In this fictitious example, it is assumed that only the legitimate parties possess the keys for the four ciphers.

•  An adversary captures the corresponding ciphertexts ctX (with X denoting the cipher) and performs cryptanalysis with whatever capabilities he or she has in order to recover the original message. Depending on the cipher, the adversary may or may not be able to breach the cryptographic security today or in the (near) future (a toy sketch contrasting the two notions of security follows this list).

•  Note that longer keys do not necessarily imply a higher level of security.

•  From Jain et al., Contemporary Physics, 57:3, 366-387, 2016.
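
To make the contrast concrete, here is a minimal Python sketch (not from the slides; the toy cipher and all parameters are purely illustrative): a repeating-key XOR cipher with a 16-bit key falls to exhaustive key search, whereas for a one-time pad every plaintext of the right length remains equally consistent with the intercepted ciphertext, so no amount of computation helps.

```python
# Toy sketch (illustrative parameters only): computational vs. information-theoretic security.
import os
from itertools import product

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Encrypt/decrypt by XOR-ing data with the (repeating) key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

message = b"PAY ALICE 100 POUNDS"

# Toy cipher with a 16-bit key: the adversary tries all 2**16 keys and keeps those
# whose decryption starts with a known header ("PAY ") -- a standard known-plaintext attack.
ct_toy = xor_bytes(message, bytes([0x5A, 0xC3]))
hits = [bytes(k) for k in product(range(256), repeat=2)
        if xor_bytes(ct_toy, bytes(k)).startswith(b"PAY ")]
print(len(hits), "key(s) survive brute force")        # -> 1: key and message recovered

# One-time pad: a uniformly random key as long as the message, used once.
ct_otp = xor_bytes(message, os.urandom(len(message)))
# Every plaintext of the same length is consistent with ct_otp under SOME key,
# so the ciphertext alone tells the adversary nothing about the message.
other = b"PAY ALICE 999 POUNDS"
key_explaining_other = xor_bytes(ct_otp, other)
assert xor_bytes(ct_otp, key_explaining_other) == other
print("the same OTP ciphertext is equally consistent with:", other.decode())
```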

Page 2: Computational vs. information-theoretic security


Quantum Cryptography and Communication

•  Quantum cryptography may offer greater online security and protection for digital networks, since advances in quantum computing threaten conventional cryptography.

•  Modern public key cryptography uses a “public key” to encrypt information and a matching private key to decrypt it. In RSA, for example, the public key is based on the product of two large prime numbers, and the difficulty of breaking the encryption comes from the difficulty of reversing that multiplication (i.e. factoring). Factoring is a task that quantum computers should be able to perform relatively easily; this would invalidate today’s public key methods used to secure the internet. If left unresolved, this will affect private data, classified data, public service platforms and legacy data (a toy RSA sketch appears after this list).

•  Quantum Key Distribution (QKD) appears to be a riposte: it uses the laws of quantum physics to establish a key remotely in such a way that it cannot be intercepted or copied without detection. It is not susceptible to attack by quantum computers. However, care must be taken to ensure that engineering imperfections do not weaken the system.
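
As a concrete, deliberately toy-sized illustration of the factoring point above: once the public modulus n is factored, the RSA private key follows immediately. The primes below are tiny and purely illustrative; real moduli use primes of roughly 1024 bits, which is where Shor's algorithm would be needed.

```python
# Minimal RSA sketch with toy primes (illustrative only; far too small to be secure).
p, q = 61, 53                       # secret primes (real keys use ~1024-bit primes)
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (modular inverse, Python 3.8+)

m = 42                              # a message, encoded as a number < n
c = pow(m, e, n)                    # encrypt with the public key (n, e)
assert pow(c, d, n) == m            # decrypt with the private key d

# The attacker's view: only (n, e) and c are known. Factoring n (trial division
# here; Shor's algorithm would do this efficiently even at real key sizes)
# yields p and q, hence the private exponent -- the encryption is broken.
p_found = next(f for f in range(2, n) if n % f == 0)
q_found = n // p_found
d_found = pow(e, -1, (p_found - 1) * (q_found - 1))
print("recovered plaintext:", pow(c, d_found, n))     # -> 42
```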


Quantum Cryptography and Communication


•  Quantum cryptography uses light that has been dimmed to the point that it becomes a sparse stream of single photons.

•  The BB84 protocol, proposed by Bennett and Brassard in 1984, uses single photons that represent “1”s and “0”s depending on their polarisation. The exchange protocol between transmitter and receiver allows secure transmission of a key (a toy simulation of the basis-sifting step appears after this list).

•  A method using a more subtle aspect of quantum mechanics was proposed by Artur Ekert in 1991. Two “entangled” photons are generated as a correlated pair for each bit. Each photon “knows” only that its partner has the opposite polarisation to itself; there are fundamentally no labels describing which is which until at least one of them is measured.

•  Thus, if the transmitter and receiver each possess one photon of the pair, they know that their measurement outcomes will be opposite. Any measurement by a third party destroys the delicate entangled state.
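
The sketch below is a highly idealised Monte-Carlo illustration of the BB84 basis-sifting step referred to above (no channel noise, loss, eavesdropper or authentication; it is not taken from the talk): where the randomly chosen bases happen to agree, the receiver's bit matches the transmitter's, and those positions form the shared key.

```python
# Toy BB84 sifting sketch (idealised: no noise, no loss, no eavesdropper, no authentication).
import secrets

N = 32
alice_bits  = [secrets.randbelow(2) for _ in range(N)]     # Alice's raw key bits
alice_bases = [secrets.choice("+x") for _ in range(N)]     # '+' rectilinear, 'x' diagonal
bob_bases   = [secrets.choice("+x") for _ in range(N)]     # Bob guesses a basis per photon

# Bob measures each photon: if his basis matches Alice's he reads her bit,
# otherwise his outcome is random (the polarisation detail is lost).
bob_bits = [a if ab == bb else secrets.randbelow(2)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: over the classical channel they compare bases only (never the bits)
# and keep the positions where the bases agree -- on average half of them.
kept = [i for i in range(N) if alice_bases[i] == bob_bases[i]]
key_alice = [alice_bits[i] for i in kept]
key_bob   = [bob_bits[i] for i in kept]
assert key_alice == key_bob
print(f"sifted key ({len(kept)} of {N} bits):", "".join(map(str, key_alice)))
```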

Page 3: Computational vs. information-theoretic security


From the NCSC Whitepaper

•  Public key cryptography is used for digital signatures and for key establishment. Each user holds a private key and an associated public key. The public key enables the public functions (verifying a digital signature or encrypting data) and so is shared with others. The private key enables the complementary private functions (creating a digital signature or decrypting encrypted data) and so is kept private.

•  The security of public key cryptography depends on the infeasibility of computing the private key from the associated public key. Keys must be of sufficient size to ensure that such cryptanalytic computation is infeasible.

•  Common public key primitives in use include RSA.

•  Shor’s algorithm is an efficient quantum algorithm for recovering private keys from public keys. It remains efficient for all practical key sizes, so the threat cannot be mitigated by using larger keys (a rough scaling sketch appears after these bullets).

•  There is a need to end reliance upon cryptographic primitives vulnerable to Shor’s algorithm.

•  The NCSC believes this threat must be taken seriously, and that there is a need to identify mitigations.

•  The phrase “quantum-safe cryptography” is used to describe three distinct kinds of cryptography:

–  primitives for ‘post-quantum’ public key cryptography, i.e. public key cryptography not vulnerable to Shor

–  primitives based on symmetric cryptography, e.g. for bulk encryption

–  hardware for quantum key distribution or for processing pre-placed authentication keys where information-theoretic security is required

The NCSC does not presently endorse quantum key distribution for government use. Information-theoretic security is not considered necessary by most stakeholders for the vast majority of applications.
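
A rough back-of-envelope sketch of the "larger keys do not help" point above: the classical cost of factoring (the heuristic general number field sieve estimate) grows sub-exponentially in the modulus size, while an idealised Shor cost grows only polynomially (assumed here, for simplicity, to be cubic in the bit length), so doubling the RSA key size barely inconveniences a quantum attacker. The constants are assumptions; only the growth rates matter.

```python
# Back-of-envelope scaling comparison (assumed constants; only growth rates matter).
from math import exp, log, log2

def gnfs_ops(bits: int) -> float:
    """Heuristic classical factoring cost, L_n[1/3, (64/9)**(1/3)] with n ~ 2**bits."""
    ln_n = bits * log(2)
    return exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * log(ln_n) ** (2 / 3))

def shor_ops(bits: int) -> float:
    """Idealised quantum cost for Shor's algorithm, taken here as ~bits**3 operations."""
    return float(bits) ** 3

for bits in (1024, 2048, 4096, 8192):
    print(f"RSA-{bits}: classical ~2^{log2(gnfs_ops(bits)):6.1f} ops, "
          f"quantum ~2^{log2(shor_ops(bits)):4.1f} ops")
```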

NCSC Whitepaper view 2

•  Symmetric Cryptography

–  Many extant standards for symmetric cryptography offer good security and are regarded as quantum-safe provided they are used with appropriate key sizes; these include AES for bulk encryption, and modern hash functions. While quantum computing may have some impact on the security of these primitives, the impact is unlikely to be significant for good cryptographic designs (a rule-of-thumb sketch appears after this list).

•  Post-Quantum Public Key Cryptography

–  Various mathematical structures have been suggested for building post-quantum public key primitives. Each structure requires acceptance of some computational-hardness assumption. Different structures lead to primitives with different characteristics when measured in terms of key size or algorithm efficiency. Older proposals tend to have well-understood security arguments but are somewhat inflexible to use, while newer proposals offer better functionality and performance but have less well understood security characteristics.

•  NCSC believes that a transition from current Public Key Cryptography to Post-Quantum Public Key Cryptography will be the most appropriate way forward for the majority of Government users of cryptographic solutions.

•  The Blackett Review recommends that research on both post-quantum cryptography and QKD should progress, and that the vulnerabilities and advantages of each be identified.
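
The rule of thumb behind "appropriate key sizes" for symmetric cryptography, referenced in the list above, is that Grover's search gives at most a quadratic speed-up over brute-force key search, roughly halving the effective key length; moving to 256-bit keys restores a comfortable margin. The tiny sketch below simply tabulates that assumption.

```python
# Rule-of-thumb sketch (assumption: Grover's quadratic speed-up and nothing better):
# the effective strength of a symmetric key is roughly halved by a quantum attacker.
def effective_bits(key_bits: int) -> dict:
    return {
        "classical brute force": key_bits,        # ~2**key_bits trial decryptions
        "Grover-assisted search": key_bits // 2,  # ~2**(key_bits/2) Grover iterations
    }

for key_bits in (128, 192, 256):
    print(f"AES-{key_bits}:", effective_bits(key_bits))
```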

Page 4: Computational vs. information-theoretic security


Quantum Cryptography and Communication


•  Fast development using fibres and metro links

•  Free space

•  Satellites

•  Industry involved (Toshiba, NEC, idQuantique, etc.)

•  Trials in financial districts

•  Vulnerability analyses

•  Nitin Jain, Birgit Stiller, Imran Khan, Dominique Elser, Christoph Marquardt & Gerd Leuchs (2016) Attacks on practical quantum key distribution systems (and how to prevent them), Contemporary Physics, 57:3, 366-387.

Authentication and the man-in-the-middle attack

•  Alice and Bob must be authenticated to each other.

•  During the operation of the QKD protocol, Alice and Bob can use a (typically) small amount of the distilled key from the previous run for authentication purposes. However, this poses a difficulty for the very first key exchange: Alice and Bob need a pre-shared key before they have initiated the key exchange process (see the sketch after this list).

•  QKD is more accurately described as a key growing (instead of generation) process: an initially small key is grown to arbitrary length.
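
Below is a sketch of that key-growing bookkeeping, under stated assumptions: the quantum exchange itself is stubbed out with random bytes, and HMAC-SHA256 stands in for the information-theoretically secure authentication a real QKD system would use. A small pre-shared secret authenticates the first round; each round then reserves part of its distilled key to authenticate the next round and adds the remainder to the usable key pool.

```python
# Sketch of QKD as a "key growing" process (quantum exchange stubbed with random bytes;
# HMAC-SHA256 stands in for the authentication scheme a real system would use).
import hmac, hashlib, os

AUTH_RESERVE = 32   # bytes of each round's output set aside to authenticate the next round

def authenticate(auth_key: bytes, transcript: bytes) -> bytes:
    """Tag the classical sifting/reconciliation messages with the current auth key."""
    return hmac.new(auth_key, transcript, hashlib.sha256).digest()

def qkd_round(auth_key: bytes, distilled_bytes: int = 96):
    """One (stubbed) QKD round: returns (next auth key, freshly grown key material)."""
    transcript = b"basis choices, error estimate, reconciliation data"
    tag = authenticate(auth_key, transcript)
    # Bob recomputes and checks the tag before trusting the classical messages.
    assert hmac.compare_digest(tag, authenticate(auth_key, transcript))
    distilled = os.urandom(distilled_bytes)          # stand-in for the distilled key
    return distilled[:AUTH_RESERVE], distilled[AUTH_RESERVE:]

auth_key = os.urandom(AUTH_RESERVE)                  # the pre-shared key: needed once, up front
key_pool = b""
for _ in range(3):
    auth_key, grown = qkd_round(auth_key)            # each round refreshes the auth key...
    key_pool += grown                                # ...and grows the usable key pool
print(f"usable key grown from a {AUTH_RESERVE}-byte pre-shared secret: {len(key_pool)} bytes")
```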

Page 5: Computational vs. information-theoretic security


How close is quantum computing to being a real threat?

•  Small prototypes exist

•  Scalable?

•  Above the error-correction threshold achieved in promising candidates (ion traps, superconducting circuits)


Quantum Computation

•  Moore’s Law: the density of integrated circuit components doubles approximately every 18 months. Over the last few decades the law has held, but several factors prevent it from continuing for many more years: transistor sizes would reach the atomic scale, and issues due to heating and current leakage would become overwhelming.

•  Certain types of problem are intrinsically more complicated than conventional logic or arithmetic problems. Optimisation and integer factorisation are common examples, and efficient solutions would be very valuable.

•  A quantum computer would be able to address many of these with comparatively little effort and possibly act as a co-processor to conventional machines.

•  They would not necessarily be faster or possess more computational elements, but they would be able to deal with different and important challenges much more efficiently.

 


Page 6: Computational vs. information-theoretic security


Quantum Computation

•  Modern computer technology works on the principle of storing and reading binary bits, i.e. 0’s and 1’s that can represent definite numbers or logical states.

•  Quantum computing uses qubits. The internal state of each qubit can be represented by a point on the surface of a sphere. However, when measured, a qubit outputs a “1” (north pole) or a “0” (south pole), with the finer detail being lost (a toy measurement sketch follows the list below).

•  A calculation takes place when a number of qubits interact. Some will be entangled with each other, i.e. share some knowledge unavailable to others. A quantum computer produces only one output when measured, which is engineered to be correct with a high degree of probability.

•  There are several different paradigms of quantum computer, each using a variety of qubit families:

•  Circuit model – many possible types, 10yr+
•  Measurement based or cluster state, 10yr+
•  Topological (based on surface properties), unknown timescale
•  Analogue quantum computers, e.g. optical lattice, 5yr+
•  Adiabatic (slow change, optimisation), 10yr+
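
The toy sketch below illustrates the measurement point made above, using the standard Bloch-sphere parameterisation (textbook formulas; the particular angles are arbitrary): the internal state is a point (theta, phi) on the sphere, but each measurement returns a single bit whose statistics depend only on theta, so the finer detail of the state cannot be read out directly.

```python
# Toy single-qubit sketch: the state is a point (theta, phi) on the Bloch sphere,
# but a measurement yields only 0 or 1, with probabilities fixed by theta.
import random
from math import cos, pi

def measure(theta: float, phi: float) -> int:
    """Measure |psi> = cos(theta/2)|0> + e^(i*phi) * sin(theta/2)|1> in the 0/1 basis."""
    p0 = cos(theta / 2) ** 2                 # probability of the outcome at the '0' pole
    return 0 if random.random() < p0 else 1  # otherwise the outcome at the opposite pole

theta, phi = 2 * pi / 3, 0.4                 # a state well away from either pole
shots = [measure(theta, phi) for _ in range(10_000)]
print("fraction of 1s:", sum(shots) / len(shots))   # ~ sin(theta/2)**2 = 0.75
# Each run gives a single bit, and phi never influences these statistics at all:
# that detail of the state is lost unless a different measurement basis is chosen.
```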


How good are we?

Page 7: Computational vs. information-theoretic security


UK approaches: Blackett Review

•  Recommendation 5: Regulation should not present a barrier to the use, deployment and commercialisation of quantum technologies. The National Programme should ensure regulators and standards bodies are aware of the capabilities of the technologies under development, so that regulations are formulated to realise the full potential of these technologies. Test-beds and road-mapping should be considered as a route to development of the regulations by government.

•  Recommendation 7: The National Quantum Technologies Programme should fund collaborative work between UK quantum communications and cryptography research groups, leading to joint technical developments of both quantum key distribution (QKD) and post-quantum cryptography (PQC), as well as work on digital signatures and other uses of these technologies.

•  Recommendation 8: The National Cyber Security Centre should support a pilot trial of QKD using realistic data in a realistic environment, with the facilities for the trial being provided by the Quantum Communications Hub. Such a trial should serve to stimulate the supply chain and show UK leadership in secure communications.

•  Recommendation 9: The National Physical Laboratory, the National Cyber Security Centre and academia should form a partnership to perform conformance tests and issue accreditation certificates. This process would need to involve engagement with other interested parties from industry, such as the communications and financial services sectors, and could lead to the establishment of an independent national facility.