THREE NEW QUANTUM CRYPTOGRAPHY PROTOCOLS

Welcome to the exciting world of cryptography, where the minds of cryptographers and cryptanalysts are locked in an endless battle over secrets and ciphers. We have tried to explain clearly how quantum cryptography works – as always, with examples and humor!

In ancient times, people wondered how to secure the transmission of confidential information so that no one else could read it. It was then that the Caesar cipher appeared – a scheme so "sophisticated" that the characters of the alphabet were simply shifted a few positions to the right or left. For example, with a shift of two, the letter A became C, and B became D. Unsurprisingly, this encryption method was quite easy to break.
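The shift described above fits in a few lines of code. A minimal sketch (the shift of two and the sample phrase are just illustrations):

```python
# A toy Caesar cipher: shift each letter a fixed number of
# positions through the alphabet (here, by 2, so A -> C).
def caesar(text, shift):
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            # wrap around the 26-letter alphabet
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation alone
    return ''.join(result)

secret = caesar("ATTACK AT DAWN", 2)   # -> "CVVCEM CV FCYP"
plain = caesar(secret, -2)             # shifting back decrypts
```

Decryption is just encryption with the opposite shift – which is exactly why a 25-key cipher falls to trivial brute force.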

Nowadays, information is a valuable asset not only for states and corporations, but also for ordinary people, including us. We all use instant messengers and want our personal messages to remain confidential, accessible only to us and our interlocutor, and so that our financial data when we make online payments is not vulnerable on the Internet. All this is possible thanks to cryptography, which, like an invisible bodyguard, protects our secrets.

What is regular cryptography?

Conventional cryptography is the art of protecting information by encrypting and decrypting data. To visualize this process, let's imagine ourselves in the role of Stirlitz, the fictional Soviet spy, who intercepts radio signals and writes down strange numerical sequences for later analysis and decoding of a secret message. To turn these numbers into understandable words, a key is needed – for example, a page of a book or a card with unique symbols. The main thing is that this key is known only to the sender and recipient of the message, thus ensuring confidentiality.

In the middle of the 20th century, scientists proved that if the key is random, as long as the message, and used only once, such a message is essentially impossible to decrypt. This principle is the "one-time pad": each page of the pad is used for exactly one message and then destroyed, so that no third party can ever recover the key.

To use this encryption method, you must not only have the appropriate infrastructure, but also have at your disposal an entire army of couriers carrying one-time pads among all participants in confidential communications. And this is only the first step in a complex process. Encryption techniques like these require such careful organization that it sometimes makes more sense to hand over the key in person. But dangers lurk here too – experienced cryptanalysts, like cunning foxes, will always find a way to deceive and steal keys. In short, encryption is an art in which every step must be weighed with security in mind.

Exchanging keys at a personal meeting can be compared to sending the code to a safe by carrier pigeon – inconvenient and unreliable. Even the most sophisticated government spies in the world's richest countries find it difficult to arrange regular meetings just to exchange keys. However, thanks to scientific discoveries, a method of exchanging keys over a public channel emerged that preserves confidentiality without revealing any secrets (the Diffie-Hellman protocol). This cryptographic breakthrough was a revolution in security and, with some improvements, is still widely used today.

This protocol rests on the assumption that computing a discrete logarithm is extremely hard: its security today comes from the lack of sufficient computing power and of efficient classical algorithms. However, the time will come when quantum computers become powerful enough to change everything. Peter Shor has already developed a quantum algorithm that can not only factor large numbers but also find discrete logarithms – a pivotal moment in the development of quantum computing.
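The mechanics of Diffie-Hellman fit in a few lines. A toy sketch over a tiny prime – the numbers here are purely illustrative and offer no security; real deployments use moduli of 2048 bits and more:

```python
# Toy Diffie-Hellman key exchange over a small prime.
p, g = 23, 5            # public prime modulus and generator

a = 6                   # Alice's private exponent (kept secret)
b = 15                  # Bob's private exponent (kept secret)

A = pow(g, a, p)        # Alice publishes g^a mod p on the open channel
B = pow(g, b, p)        # Bob publishes g^b mod p

# Both sides arrive at the same shared secret g^(a*b) mod p.
# An eavesdropper who sees only p, g, A and B must solve a
# discrete logarithm to recover a or b.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
```

With these toy numbers anyone could brute-force the exponent in microseconds; the whole point of the next paragraph is that, at real key sizes, only a quantum computer running Shor's algorithm is expected to manage it.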

With one modification to the quantum circuit, Shor achieved a double blow against the foundations of cryptography – RSA and the Diffie-Hellman protocol. With the prospect of a universal quantum computer approaching, all such traditional encryption methods are vulnerable. But don't panic! Quantum computing has not only shocked cryptographers, it has also opened new perspectives. In particular, an innovative key distribution method has emerged that solves many of the problems of the Diffie-Hellman protocol. Even a simple intercept-and-resend attack becomes useless, because the fundamental principles of quantum mechanics impose their own insurmountable limitations.

First quantum protocol

The first quantum protocol BB84 for secret key distribution was developed by outstanding experts in the field of quantum technologies – Charles Bennett and Gilles Brassard back in 1984. Imagine the scene: on one side is Alice, on the other is Bob, exchanging information, while Eve is hiding somewhere in the shadows, eager to intercept the data and use it for her own selfish purposes.

Alice creates random numbers using a quantum generator and encodes each bit of information into a single photon. She then sends the photon to Bob, who measures its state. Unlike a regular key, which can be copied and read, a single photon is a true master of stealth: any attempt by an eavesdropper to read its state instantly disturbs it, leaving her empty-handed.

Theorists have established a quantitative link between the errors and disturbances observed in the channel and the proportion of information that could have been intercepted. If the possible leak is too large, the key must be considered compromised and is discarded. But in reality only random bits were transmitted, carrying no valuable information! The attacker ends up with useless random numbers, as worthless as snow in Antarctica. If, on the other hand, the key passes all checks and contains few enough errors, a secret symmetric bit sequence can be distilled from it and used to protect real information. Thus we return to symmetric cryptography, but now with a securely distributed key.

In an ideal world of security and privacy, the key for encrypting a message would be the same length as the message itself. A simple example: transmitting a few sentences of 200 characters requires sending 200 × 8 = 1600 bits. Alice, being resourceful, uses single photons as data carriers: she encodes each bit as a light signal and sends it down the line. Of course, photons travel not through ideal conditions but through real fiber-optic cables, where they are subject to loss. This is not fatal, however: since each photon carries one random bit, a lost photon's bit is simply discarded – losses lower the key rate but do not corrupt the key.

In quantum communications there is an interesting constraint: photons must "rest" roughly every 100 kilometers, which today is handled by trusted nodes that relay the key along the route. Of particular interest, however, is the quantum repeater – a device that combines entanglement, quantum teleportation and quantum memory. If quantum states can be teleported and preserved, quantum keys could be transferred over thousands of kilometers while keeping cryptographic security even in the presence of eavesdroppers. For all its promise, this technology has not yet reached maturity.

To transmit data through quantum channels, you need a light source capable of producing single-photon states, and building such a source is a non-trivial task. One approach is to take a laser pulse containing on average a million photons and attenuate it by a factor of roughly ten million. The result approximates the required single-photon regime, opening the way for transmission over quantum channels. Even seemingly exotic processes can thus be implemented with today's technology.

On average there are only 0.1 photons per pulse of light – finding one can be compared to searching for a precious meme among ordinary posts on social networks. Only about one pulse in ten contains a photon, and roughly one in 200 contains two photons! This drawback must be taken into account in subsequent signal processing, like filtering out unnecessary selfies. BB84 itself uses four possible states: two that encode zero and two that encode one, drawn from two different bases. Within one basis, zero and one behave like two parallel worlds that never intersect – they are orthogonal and perfectly distinguishable. But take states from different bases, for example vertical-horizontal and diagonal, and they suddenly behave unpredictably relative to each other, as if trying to find a common language: measuring in the wrong basis yields a random result.
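The "one in ten, one in 200" figures follow directly from Poisson statistics, which weak laser pulses obey. A quick check with mean photon number mu = 0.1:

```python
import math

# Weak coherent pulses obey Poisson statistics: the probability
# of finding exactly n photons in a pulse with mean mu is
# P(n) = exp(-mu) * mu^n / n!
def poisson(n, mu):
    return math.exp(-mu) * mu**n / math.factorial(n)

mu = 0.1
print(poisson(0, mu))  # ~0.905 - most pulses are empty
print(poisson(1, mu))  # ~0.090 - the useful single-photon pulses
print(poisson(2, mu))  # ~0.0045 - the dangerous two-photon pulses
```

The two-photon pulses are exactly the ones an eavesdropper can exploit (see the photon-number-splitting attacks mentioned later), which is why they must be accounted for in post-processing.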

The photon, as a quantum particle, is indivisible. What happens to it during measurement? Before detection it can paradoxically take both paths at once, yet only one detector ever clicks. With single-photon detectors installed at the outputs, each detector corresponds to a specific logical state – zero or one. After the measurements, Alice and Bob publicly compare their bases and discard the rounds where the bases did not match. If, for example, Alice sent 20 bits and Bob registered only 10, they jointly decide whether to continue the process or abort it. The quantum information itself is not stored: Bob's detections are immediately converted into classical records on his computer, where quantum bits become ordinary ones. Only after this are the bases checked, the data "post-processed", and the participants receive their quantum key. An eavesdropper, Eve, faces a fundamental problem: she has no information about the basis in which each photon was sent, so she must measure in a randomly chosen basis and retransmit the result. Half the time she picks the wrong basis, resends the wrong state, and creates errors on the receiving end. Instead of the key, Eve obtains little more than white noise – and betrays her presence.
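The basis-sifting logic above can be sketched as a simple simulation. This is a minimal model with an ideal lossless channel and no eavesdropper – function and variable names are ours, not part of any standard library:

```python
import secrets

# Minimal sketch of BB84 sifting. Alice picks a random bit and a
# random basis for each photon; Bob measures in his own random
# basis. When bases differ, his outcome is a coin flip; only the
# rounds with matching bases survive into the sifted key.
def bb84_sift(n_rounds):
    sifted = []
    for _ in range(n_rounds):
        bit = secrets.randbelow(2)
        alice_basis = secrets.randbelow(2)   # 0 = rectilinear, 1 = diagonal
        bob_basis = secrets.randbelow(2)
        if bob_basis == alice_basis:
            measured = bit                   # same basis: outcome is certain
        else:
            measured = secrets.randbelow(2)  # wrong basis: random result
        if bob_basis == alice_basis:         # bases are compared publicly
            sifted.append(measured)
    return sifted

key = bb84_sift(1000)   # on average about 500 bits survive sifting
```

Adding an Eve who measures and resends every photon in a random basis would, in the same model, flip about a quarter of the sifted bits – the telltale error rate that exposes her.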

A short list of quantum cryptography protocols

The E91 protocol (1991) generates keys for secure communication from entangled pairs of photons. A Bell-inequality test, in turn, reveals any eavesdropper: interception destroys the ideal correlations between the parties.

Protocol BBM92 (1992) is a fascinating slice of the world of quantum cryptography in which entangled photon pairs play the key characters. It is essentially an entanglement-based version of BB84: Alice and Bob each measure one photon of a shared pair and obtain correlated bits without Alice having to prepare and send states herself.

With the B92 and MSZ96 protocols we dive into the intrigue and complexity of quantum cryptography. Thought-provoking terms appear here: non-orthogonal quantum states, entanglement distillation, local filtering. All of them serve one purpose – to guarantee secure transmission even with loss and noise on the communication channel. B92 makes do with just two non-orthogonal states, while MSZ96 encodes the cryptographic key in non-orthogonal states of a weak optical field.

There is a protocol developed in 1998 that uses six states for information transfer. It differs from BB84 in that it employs six polarization states drawn from three mutually unbiased bases, which forces an eavesdropper to introduce more errors and so makes eavesdropping easier to detect. As a result, it operates reliably even in noisy transmission channels.

There is also the DPS (differential phase shift) protocol, developed in 2002, which is a simple and efficient method of quantum key distribution. Unlike the traditional BB84, no basis choice is needed: the key is encoded in the phase difference between adjacent pulses in a coherent pulse train, allowing fast key generation. Moreover, the protocol is resistant to photon-number-splitting attacks even with weak pulses.

SARG04 (2004) is an improved version of BB84 tailored to attenuated laser sources, whose pulses follow Poisson statistics. If BB84 can be compared to a standard lie detector, then SARG04 is a professional polygraph examiner, able to catch deception even on weak signals: it resists photon-number-splitting attacks on multi-photon pulses.

The COW protocol (2005) provides secure communication between parties using low-intensity coherent light pulses to transmit keys. The only thing required on the client side is a random number generator.

The three-stage quantum cryptography protocol (2006) uses random polarization rotations to encrypt data and exchange keys securely: the signal is passed back and forth three times, each party applying and then removing its own secret rotation, like two padlocks on one box. Any attempt to interfere reveals itself through the disturbance it introduces.

The KMB09 protocol (2009) is an innovative scheme that allows quantum keys to be transmitted over long distances with minimal losses and errors. The method is based on two mutually unbiased bases and an indexing scheme that forces an eavesdropper to introduce detectable errors, ensuring high reliability of data transmission.

HDQKD (high-dimensional quantum key distribution) is an advanced approach to protecting quantum information. It packs more than one bit into each photon using large-alphabet degrees of freedom such as orbital angular momentum, and can run over multi-core fiber-optic or free-space links.

New quantum protocols

In August 2024, the US National Institute of Standards and Technology (NIST) published new post-quantum cryptography standards, offering an alternative to legacy encryption methods such as RSA. Lily Chen, who heads NIST's cryptographic technology group, is calling for the new encryption methods to be adopted and the outdated ones retired. Most scientists agree that large-scale quantum computers are still a long way off – at least ten years. However, there are already two significant reasons for concern and caution.

First, many devices, from cars to smart homes, will be used for a long time and require quantum-resistant cryptography for future protection.

Second, attackers can already harvest encrypted data today and decrypt it once quantum computers with sufficient power appear. This scenario poses the very real "harvest now, decrypt later" threat.

Today, security experts across industries are concerned about the impact of quantum computers, and they warn categorically: "Do not underestimate the potential of quantum technologies!" The latest encryption methods are being designed with quantum computers in mind – as a tool both for security and for breaking it. In their ability to solve certain problems, quantum computers are far ahead of their classical counterparts. Take, for example, lattice cryptography, which rests on a hard geometric problem – the shortest vector problem: finding the nonzero lattice point closest to the origin. In high dimensions this task is so difficult that even mathematicians cannot always cope with it.
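In two dimensions the shortest vector problem is easy to brute-force, which makes for a nice toy illustration of why it is interesting at all: a "skewed" basis can hide a much shorter vector. The basis below is our own contrived example (it spans the ordinary integer grid, so the hidden shortest vector has length 1); real lattice cryptography works in hundreds of dimensions, where no efficient search is known:

```python
# Toy shortest-vector search: enumerate small integer combinations
# i*b1 + j*b2 of the basis vectors and keep the shortest nonzero one.
def shortest_vector(b1, b2, bound=20):
    best, best_len = None, float("inf")
    for i in range(-bound, bound + 1):
        for j in range(-bound, bound + 1):
            if i == 0 and j == 0:
                continue  # the zero vector does not count
            v = (i * b1[0] + j * b2[0], i * b1[1] + j * b2[1])
            length = (v[0] ** 2 + v[1] ** 2) ** 0.5
            if length < best_len:
                best, best_len = v, length
    return best, best_len

# Both basis vectors look "long", yet the lattice they generate
# contains a vector of length 1 hiding in plain sight.
vec, length = shortest_vector((5, 1), (9, 2))
```

Exhaustive search like this scales exponentially with dimension – precisely the hardness that lattice-based schemes lean on.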

There are many innovative approaches in cryptography, such as isogeny-based cryptography, which builds encryption on mappings between elliptic curves. This makes the encryption particularly strong and difficult to break.

Then there is error-correcting code-based cryptography, where recovering the code structure from error messages becomes a real art.

However, lattice cryptography is currently considered the most promising method. For example, back in 2016, NIST announced a competition for the best post-quantum encryption algorithm, which emphasizes the importance and relevance of the development of this area.

In the long-awaited 2022, the winners were announced – CRYSTALS-Kyber, CRYSTALS-Dilithium, SPHINCS+ and FALCON. Sounds great, doesn't it? However, they were then renamed to the boring FIPS 203–206. Now NIST has finalized the FIPS 203, 204 and 205 standards, and rumor has it that FIPS 206 is just around the corner. FIPS 203, 204 and the upcoming 206 are based on lattice cryptography, while FIPS 205 is hash-based. The standards include not only the algorithms themselves, but also recommendations for implementing them and scenarios for their use. Each standard offers several parameter sets at different security levels, so that future systems have headroom if vulnerabilities appear in the algorithms.

This year, the cryptographic community was shaken by a preprint from Yilei Chen of Tsinghua University. His claim that lattice problems are vulnerable to a new quantum attack raised serious doubts even among the most experienced cryptographers. However, after the community carried out a thorough analysis of Chen's argument, a significant gap was found in the proof, and the reputation of lattice cryptography was restored.

This episode raises a fundamental question about the complexity of the mathematical problems underlying cryptographic schemes. There is no hard mathematical proof yet. The only real indicator of the strength of encryption, even with standard algorithms such as RSA, is the numerous failed attempts to break it over the years.

Conclusion

The struggle between those who keep their secrets and those who seek to reveal them by any means has not subsided for millennia. As technology develops, the security requirements for quantum cryptographic systems will have to be revised again and again. Without efficient, reliable and accessible quantum cryptography, the digital space in which we all live and work today is simply unimaginable in the future.
