The US National Institute of Standards and Technology (NIST) has chosen the first group of encryption tools designed to withstand the attack of a future quantum computer, which could potentially crack the security used to protect privacy in the digital systems we rely on today.

Léo Ducas, senior researcher in the cryptology group at the Netherlands’ Centrum Wiskunde & Informatica (CWI), the national research institute for mathematics and computer science, is involved in the two most important algorithms of the upcoming NIST portfolio – one for public key encryption and one for digital signatures.

According to Ducas, who is also a professor at the University of Leiden, these new standards are inevitable because there is nervousness about the arrival of quantum computing. “We know quantum computing will not be rife tomorrow, but this standardisation procedure and its deployment take time,” he said. “Obviously there is certain sensitive information that needs to be secure and confidential – not just at present, but in the future as well. Take state secrets, for instance.”

Cyber security experts have warned that hackers are stealing encrypted data now in order to decrypt it in the future, when quantum computing could render modern encryption methods obsolete. A report published by NIST in April 2016 cited experts who acknowledged that quantum technology could render the commonly used RSA algorithm insecure by 2030. “We need to be ready for that,” said Ducas. “This means we have to anticipate now.”

The announcement of the chosen tools follows a six-year effort managed by NIST, which began in 2016 with a call for the world’s cryptographers to devise and then vet encryption methods that could resist an attack from a future quantum computer. A total of 23 signature schemes and 59 encryption schemes were submitted; of these 82 submissions, 69 were deemed complete and proper. The NIST process consists of four rounds, during which some schemes are discarded and others studied more closely.

In July this year, NIST announced the first group of winners from its competition, which included CRYSTALS-Kyber and CRYSTALS-Dilithium, both developed by an international collaboration in which CWI participated. The other team members are ENS Lyon, Radboud University, Ruhr University Bochum, University of Waterloo, IBM, NXP, ARM, SRI International, Florida Atlantic University and Tsinghua University.

“It was a rather big team, but that was the key aspect,” said Ducas. “It consisted of both industrial and academic people, and all their knowledge was necessary to develop the algorithms we have. Take NXP, for example – they build chips and already use cryptology to embed in those chips. We needed their knowledge for the design, because it is essential that what we develop not only fits into devices like smartphones and laptops, but also in other places where chips are being used, like in the automotive industry. Fitting cryptology can be a big challenge.”

Apart from the two algorithms in which CWI was involved, two further signature algorithms were selected by NIST – Falcon and SPHINCS+. SPHINCS+ was also partly conceived in the Netherlands, led by Andreas Hülsing from TU Eindhoven.

Ducas added: “The selection of our schemes as a standard means they will be deployed globally, protecting the privacy of billions of users. Fundamental research rarely has such a direct and broad impact. The credit should go to the whole cryptographic research community. The schemes we proposed are merely the crystallisation of decades of scientific effort.”

The algorithms developed by the international team are based on lattices, one of Ducas’ specialities. “Both were designed together and share more than just the same mathematical platform,” he said. “We tried to make them look alike, so they will be easy to implement together.” The Falcon algorithm designed for signatures also uses a lattice platform.
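The lattice setting Ducas describes can be illustrated with a toy Learning With Errors (LWE) scheme. This is a minimal sketch only: the parameters and function names below are hypothetical, and the real Kyber and Dilithium constructions work over structured module lattices with carefully chosen parameters, not this drastically simplified (and insecure) miniature.

```python
import random

# Toy LWE-style bit encryption -- illustrative only, far too small
# to be secure and NOT the actual Kyber/Dilithium design.
q = 97   # small modulus (hypothetical parameter)
n = 8    # secret dimension (hypothetical parameter)
m = 20   # number of public samples

def keygen():
    s = [random.randrange(q) for _ in range(n)]            # secret vector
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
    e = [random.choice([-1, 0, 1]) for _ in range(m)]      # small noise
    # Each b_i = <A_i, s> + e_i mod q: recovering s from (A, b) is the
    # hard lattice problem the security rests on.
    b = [(sum(a * si for a, si in zip(row, s)) + ei) % q
         for row, ei in zip(A, e)]
    return (A, b), s                                       # public, secret

def encrypt(pk, bit):
    A, b = pk
    idx = random.sample(range(m), 5)                       # random subset
    u = [sum(A[i][j] for i in idx) % q for j in range(n)]
    v = (sum(b[i] for i in idx) + bit * (q // 2)) % q      # encode bit
    return u, v

def decrypt(sk, ct):
    u, v = ct
    # v - <u, s> = (sum of small errors) + bit * q//2 mod q,
    # so the bit is recovered by checking which half of Z_q we land in.
    d = (v - sum(ui * si for ui, si in zip(u, sk))) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0
```

Because each ciphertext accumulates at most five errors of magnitude 1, the noise never outweighs the q//2 offset encoding the bit, so decryption here always succeeds; the real schemes make the analogous noise analysis rigorous.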

“But that is where the similarity ends,” said Ducas. “This algorithm has different advantages and drawbacks.”

One of his biggest concerns is that this algorithm computes with floating-point numbers rather than integers. “Computers are obviously equipped to do this, but it is a real challenge for cryptology,” said Ducas. “Rounding can differ from computer to computer, which complicates implementation. But because of its shorter keys, it was still selected for the NIST portfolio.”
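The rounding concern is easy to demonstrate: IEEE-754 floating-point addition is not associative, so two implementations that accumulate the same values in a different order can disagree in the low bits, exactly the kind of cross-platform divergence that is dangerous inside a signature scheme.

```python
# IEEE-754 doubles: addition is not associative, so evaluation
# order changes the result in the last bits.
a, b, c = 0.1, 0.2, 0.3

left = (a + b) + c    # 0.6000000000000001
right = a + (b + c)   # 0.6

print(left == right)  # False
```

Integer arithmetic modulo q, as used in Kyber and Dilithium, has no such platform-dependent behaviour, which is part of why fitting Falcon onto small devices takes extra care.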

Now that the four algorithms have been selected, they need to be written into formal standards. “This is obviously where NIST comes in, whereas we are mainly academics and technicians,” said Ducas. “NIST will draft the final text of the standard, but in coordination with us.”

NIST hopes to publish the standardisation documents by 2024, but may speed up the process if there are major breakthroughs in quantum computing.

After the release of the standards, the industry needs to be pushed to put them to use, said Ducas. “I have a suspicion that most companies will want to be post-quantum resistant, so I think these standards will be easier to push than, for example, the hash function update from SHA-1 to SHA-2,” he said. “Moreover, I think IBM and NXP will incorporate their own designs within their own products.

“Eventually, NIST is pushing the core of the new standard, the mathematical knowledge, but on top of that, there are a lot of things that are involved, like protocols, documentation, and so on. It might even evolve into an ISO standard, who knows, but NIST is leading the crowd.”
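The SHA-1 to SHA-2 update Ducas cites as a comparison point was, at the code level, largely a matter of swapping one standardised primitive for another, as a sketch using Python’s standard hashlib shows (the message here is just an illustrative placeholder):

```python
import hashlib

msg = b"example message"

# SHA-1: 160-bit digest, deprecated after practical collisions
# were demonstrated.
print(hashlib.sha1(msg).hexdigest())

# SHA-256 (SHA-2 family): 256-bit digest, the common replacement.
print(hashlib.sha256(msg).hexdigest())
```

The post-quantum migration is harder than this because key sizes, protocol messages and hardware constraints all change, which is why the surrounding protocols and documentation Ducas mentions matter as much as the mathematics.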

So, will the new standards ensure we are safe from quantum computers’ potential ability to crack RSA encryption? “This is related to the P versus NP problem,” said Ducas. “The best guarantee we can have is the years of documented failures. This is the case with existing cryptology, and it is still the case with post-quantum cryptology.

“There is reasonable confidence to deploy, but no absolute mathematical guarantee. This is why we often say that cryptographers seldom sleep at night.”
