This is the first of a series of
technical blogs that DigiCert is publishing on quantum computing and the
coming post-quantum transition. Upcoming articles will provide
additional, easy-to-understand information about what is happening and
steps that can be taken now to prepare for the future. Bookmark our blog
and follow us on Twitter @digicert to stay informed.

The Committee on Technical Assessment of the Feasibility and
Implications of Quantum Computing, part of the National Academies of
Sciences, Engineering, and Medicine, recently released a report entitled
“Quantum Computing: Progress and Prospects.”
The 200-page report gathers the consensus of industry experts and
conveys an important message about the current state of quantum
computing and its threat to modern cryptography: the time to start
preparing for a quantum-safe future is now.

DigiCert has estimated that it would take several quadrillion years to
factor a 2048-bit RSA key using classical computing technology, an
estimate that is referenced in the National Academies’ report.
However, a sufficiently capable quantum computer could break the same key
much faster, perhaps in only a few months. There are still many
technical challenges that must be overcome before it is possible to
build a quantum computer that threatens RSA and ECC, the two main
asymmetric cryptographic algorithms that the internet’s security is
based on. The report estimates that such a quantum computer would need to be five
orders of magnitude larger, with two orders of magnitude lower error
rates, than the first-generation quantum computers that exist today, and
would likely require technological advances that haven’t yet been made.
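
For intuition about where numbers like that come from, here is a rough
back-of-the-envelope Python sketch of the standard heuristic cost
estimate for the general number field sieve, the best known classical
factoring algorithm. The formula’s constant factors are ignored, and the
assumed hardware (a million 1 GHz cores) is an illustrative assumption,
not DigiCert’s actual methodology:

```python
import math

# Heuristic GNFS cost: exp((64/9)^(1/3) * (ln n)^(1/3) * (ln ln n)^(2/3)).
# Constant factors are ignored, so treat this as order-of-magnitude only.
bits = 2048
ln_n = bits * math.log(2)              # ln(n) for a 2048-bit modulus
work = math.exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3)
                * math.log(ln_n) ** (2 / 3))

ops_per_sec = 1e9 * 1e6                # a million 1 GHz cores (assumption)
years = work / ops_per_sec / (60 * 60 * 24 * 365)
print(f"~10^{math.log10(work):.0f} operations, ~10^{math.log10(years):.0f} years")
```

Even under these generous assumptions, the answer comes out to trillions
of years; reasonable changes to the constants move the result by a few
orders of magnitude but never anywhere near feasibility. Shor’s
algorithm on a large, fault-tolerant quantum computer would reduce the
work to a polynomial function of the key size, which is what collapses
the timescale so dramatically.
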
Given the early state of the field, with rapid progress towards being
able to build quantum computers only beginning to accelerate within the
last few years, the report concludes that it is still too early to
predict when it will be possible to build a scalable quantum
computer. Progress towards that goal must track not only how quickly
the number of physical qubits scales, but also how quickly error rates
fall.

Error rates are important because they have a big impact on the
number of physical qubits required to make a logical qubit. Physical
qubits are the individual quantum systems that represent either a ‘0’ or
a ‘1’. However, physical qubits are prone to errors caused by
unavoidable interactions with their environment, even at temperatures
approaching absolute zero. Left uncorrected, these errors quickly
overwhelm any large, complex calculation.
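
As a rough illustration of how strongly error rates drive overhead, here
is a small Python sketch using a commonly cited approximation for
surface-code error correction, in which the logical error rate scales
roughly as 0.1 × (p/p_th)^((d+1)/2) for code distance d, with about 2d²
physical qubits per logical qubit. The 0.1 prefactor, the 1% threshold,
and the 10⁻¹⁵ target logical error rate are illustrative assumptions,
not figures from the report:

```python
# Illustrative surface-code overhead model (all constants are assumptions):
# logical error per cycle ~ 0.1 * (p / p_th) ** ((d + 1) / 2), with roughly
# 2 * d**2 physical qubits needed per logical qubit at code distance d.
def physical_qubits_per_logical(p, target=1e-15, p_th=1e-2):
    d = 3                                  # smallest useful code distance
    while 0.1 * (p / p_th) ** ((d + 1) / 2) > target:
        d += 2                             # distances are odd: 3, 5, 7, ...
    return d, 2 * d * d

for p in (1e-3, 1e-4):                     # physical error rates to compare
    d, qubits = physical_qubits_per_logical(p)
    print(f"error rate {p:g}: distance {d}, ~{qubits} physical qubits/logical")
```

In this model, cutting the physical error rate by a factor of ten shrinks
the required code distance roughly in half and the qubit overhead by
about four times, which is why the report tracks error rates alongside
qubit counts.
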
Many physical qubits can be combined into a single logical qubit, much
in the same way that classical error-correcting codes use several
classical bits to encode a single classical bit. However, the overhead
for quantum error-correcting codes is much larger. Researchers have yet
to produce even a single logical qubit, though progress is rapidly
being made towards that goal. Once logical qubits are available,
tracking the number of logical qubits will be the preferred metric. The
committee concluded that there is “no fundamental reason why a large,
fault-tolerant quantum computer could not be built in principle.”
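
The classical half of that encoding analogy is easy to demonstrate.
Below is a toy Python repetition code that encodes one logical bit as
five physical bits and decodes by majority vote; the bit count and the
5% channel error rate are arbitrary choices for illustration:

```python
import random
from collections import Counter

# Toy classical repetition code: encode one logical bit as n physical bits,
# send them through a noisy channel, and decode by majority vote.
def encode(bit, n=5):
    return [bit] * n

def noisy_channel(bits, p=0.05):
    return [b ^ (random.random() < p) for b in bits]  # flip each bit w.p. p

def decode(bits):
    return Counter(bits).most_common(1)[0][0]         # majority vote

trials = 100_000
errors = sum(decode(noisy_channel(encode(1))) != 1 for _ in range(trials))
print(f"logical error rate: {errors / trials:.3%} (physical rate: 5%)")
```

A 5% physical error rate drops to roughly a 0.1% logical error rate at
the cost of five bits per bit. Quantum codes follow the same principle,
but protecting a qubit against both bit-flips and phase errors, without
measuring and destroying its state, drives the overhead far higher.
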
While it will take a long time to overcome those obstacles, it will
also take a long time to develop, standardize, and deploy post-quantum
cryptographic techniques. In fact, the timescales are probably roughly
the same. Especially for high-assurance applications, it is critical to
begin the transition effort now, to avoid the possibility that quantum
computers will arrive before our cryptographic techniques are capable of
protecting critical information.

In the near term, research and development into commercial
applications of noisy intermediate-scale quantum computers will probably
drive progress in this area. How useful these computers turn out to be,
and which problems they can solve, will likely determine the level of
investment in improving quantum computing technologies.
Right now, the field is extremely active, with billions of dollars of
research funding being committed to the race to produce larger and more
capable quantum computers.

Industry standards groups are also preparing for a post-quantum
future, and DigiCert is very active in these efforts. The best known is
the NIST post-quantum cryptography project,
which is working with researchers around the world to develop new
cryptographic primitives that are not susceptible to attack by quantum
computers. However, it will be several years before those algorithms are
ready for standardization. A simpler technology (hash-based signatures,
specifically XMSS, defined in RFC 8391) has been standardized by the
Internet Engineering Task Force
and will soon be standardized by NIST. While it has some drawbacks
compared to more advanced algorithms, namely larger signatures and a
limit on the total number of signings, it has the advantage of being
well-understood, quantum-safe, and available now.
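
To make the idea concrete, here is a minimal Python sketch of a Lamport
one-time signature, the classic building block behind hash-based
schemes. This illustrates the principle rather than the actual XMSS
construction specified in RFC 8391:

```python
import hashlib, secrets

H = lambda data: hashlib.sha256(data).digest()

def keygen():
    # Two random secrets per digest bit; the public key is their hashes.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bits_of(msg):
    digest = H(msg)
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg):
    # Reveal one of the two secrets per bit -- this is why the key is one-time.
    return [sk[i][b] for i, b in enumerate(bits_of(msg))]

def verify(pk, msg, sig):
    return all(H(s) == pk[i][b] for i, (s, b) in enumerate(zip(sig, bits_of(msg))))

sk, pk = keygen()
sig = sign(sk, b"post-quantum")
print(verify(pk, b"post-quantum", sig))   # True
```

Each signature reveals half of the private key, so a key pair must never
sign two different messages. Schemes like XMSS manage many one-time keys
under a Merkle tree, which is the source of both the larger signatures
and the cap on total signings mentioned above.
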
Other efforts by standards groups include ANSI X9’s Quantum Risk Study
Group, which is preparing an information report specifically
examining the threat quantum computing poses to cryptography being used
in the financial services industry. The report is expected to be
available in early 2019. The European Telecommunications Standards
Institute (ETSI) also has a Quantum Safe Cryptography group, which has
been producing information reports for the past nine years.

Two parallel efforts are rapidly heating up: one by those who are
exploring how to build large, fault-tolerant quantum computers, and the
other by those who are working to make sure quantum-safe cryptography is
available and ready to be deployed before that happens.
DigiCert will be producing a series of blog posts to help readers
understand this important transition, and what they can do to protect
their systems from the upcoming threat to existing asymmetric
cryptographic algorithms, like RSA and ECC.