
An Introduction to Quantum Computing


New technologies continue to revolutionize the world and the way humans are able to improve their lives. IoT and AI are the current frontrunners among these technologies, but even greater opportunities lie ahead. One such opportunity is quantum computing, which leverages the principles of quantum mechanics to surpass classical computing on certain problems.


In this article, we introduce the fundamentals of quantum computing, explain its implications for technology, and briefly discuss current progress in the field.


A Refresher on Classical Computing





Modern computers use classical methods of computation, pioneered by the work of computer scientists such as Alan Turing. In this model, bits are treated as "atomic": irreducible units of information that serve as the foundation of more complex structures. Practically speaking, bits are binary digits with a value of either 0 or 1. Combinations of zeros and ones can represent numbers in base-2 notation; those numbers can in turn represent letters, symbols, and words through encoding systems such as ASCII and Unicode; and those words can represent far more complex artifacts such as articles and books. Thus, bits can be seen as the "atoms" of the computing world.
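To make this concrete, here is a minimal Python sketch showing the same bit pattern read first as base-2 numbers and then as text through the ASCII encoding:

```python
# Interpret the same bit pattern as a base-2 number and as text.
bits = "01001000 01101001"  # two 8-bit groups

# Read each group as a base-2 number...
values = [int(group, 2) for group in bits.split()]
print(values)  # [72, 105]

# ...then decode those numbers as characters via ASCII.
print(bytes(values).decode("ascii"))  # Hi
```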


However, bits are capable of more than just storing information: through Boolean logic, complex behavior and functionality can be built on top of them. It is important to note that classical computers are deterministic; for any exact input, only one output is possible. In other words, there is no randomness at a fundamental level. Nondeterminism does exist in computing (e.g. in parallel execution), but it arises at higher levels of complexity and is not a concern for the topics discussed in this article.
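As a small illustration of both points, here is a sketch that composes Boolean operations into a half adder (the `half_adder` name is just for illustration); the same inputs always produce the same outputs:

```python
# Boolean logic built from bits: a half adder composed of XOR and AND.
# For any exact input, only one output is possible (determinism).
def half_adder(a: int, b: int) -> tuple[int, int]:
    total = a ^ b  # XOR gives the sum bit
    carry = a & b  # AND gives the carry bit
    return total, carry

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))
```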


Quantum Computing vs Classical Computing


Quantum computers differ from classical computers at a fundamental level. Although quantum computers also make use of bits, a quantum bit (known as a qubit) is not forced to be either 0 or 1. Instead, it has some probability of being read as 1 and some probability of being read as 0. In other words, a qubit exists in a blend of the 0 and 1 states, with infinitely many possible combinations between them; this property is known as superposition.
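One way to build intuition is to model a single qubit as a pair of complex amplitudes. The following is a minimal NumPy sketch, purely a classical simulation rather than how real quantum hardware works:

```python
import numpy as np

# A qubit modeled as two complex amplitudes: one for the 0 state,
# one for the 1 state. The chance of reading each value is the
# squared magnitude of its amplitude, and the two chances sum to 1.
state = np.array([1, 1], dtype=complex) / np.sqrt(2)  # equal superposition

probabilities = np.abs(state) ** 2
print(probabilities)  # [0.5 0.5] -> 50% chance of 0, 50% chance of 1
```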





Although this superposition exists, when you read a qubit you will get a result of either 0 or 1, never an intermediate value. This is because the act of measuring collapses the qubit's superposition, which is what allows us to perform classical-style processing and computation with qubits.
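Extending the sketch above, measurement can be simulated by sampling an outcome from the probabilities and then replacing the state with the definite result (the `measure` helper is hypothetical, for illustration only):

```python
import numpy as np

def measure(state):
    """Sample 0 or 1 from the amplitudes, then collapse the state."""
    probabilities = np.abs(state) ** 2
    outcome = np.random.choice([0, 1], p=probabilities)
    collapsed = np.zeros(2, dtype=complex)
    collapsed[outcome] = 1.0  # after measurement, the superposition is gone
    return outcome, collapsed

state = np.array([1, 1], dtype=complex) / np.sqrt(2)
outcome, state = measure(state)
print(outcome)  # always a definite 0 or 1, never an in-between value
```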


This may give the impression that a quantum computer is nothing more than a noisy classical computer. However, the superposition of qubits, in conjunction with the principle of quantum entanglement, allows quantum computers to solve certain problems much faster than classical computers can. Entanglement is the property that allows a qubit to be strongly correlated with other qubits: their readings will correlate with each other, even though no information is exchanged between them.


As a concrete example of entanglement, suppose a two-qubit state is created such that exactly one of the qubits will always read 1 (their exclusive OR is always 1). If these qubits are entangled and one qubit is measured to be 1, the other qubit, if measured, will return a reading of 0. The same holds in reverse: if the first qubit is measured to be 0, the other will be measured to be 1. In a way, both qubits "know" what the result will be, despite exchanging no information, and this correlation holds no matter how far apart the qubits are.
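This anti-correlated state, often written (|01⟩ + |10⟩)/√2, can be simulated with a four-entry state vector; a minimal NumPy sketch:

```python
import numpy as np

# Two-qubit state vector over the basis |00>, |01>, |10>, |11>.
# The entangled state (|01> + |10>)/sqrt(2) puts all probability on
# the outcomes where exactly one qubit reads 1.
state = np.array([0, 1, 1, 0], dtype=complex) / np.sqrt(2)

probabilities = np.abs(state) ** 2
for _ in range(5):
    outcome = np.random.choice(4, p=probabilities)  # joint measurement
    first, second = outcome >> 1, outcome & 1       # split into two readings
    print(first, second)  # always (0, 1) or (1, 0): perfectly anti-correlated
```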


By entangling qubits, it becomes possible to have a collection of qubits that express many states at once, whereas classical bits can only express one state at a time. For example, consider 2 bits: there are four possible states for them (00, 01, 10, and 11), and two classical bits can hold only one of these states at any moment. Two qubits, however, can be in a superposition of all four of those states at once. This is what makes quantum computing so powerful: the simultaneous representation of multiple states speeds up certain computationally heavy tasks.
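For instance, an equal superposition over all four two-qubit basis states is simply a state vector with four equal amplitudes; a small NumPy sketch:

```python
import numpy as np

# Two classical bits hold exactly one of 00, 01, 10, 11 at any moment.
classical_state = (1, 0)

# Two qubits can carry amplitude on all four basis states at once:
# an equal superposition over |00>, |01>, |10>, |11>.
quantum_state = np.array([1, 1, 1, 1], dtype=complex) / 2

print(np.abs(quantum_state) ** 2)  # [0.25 0.25 0.25 0.25]: all four at once
```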


Implications of Quantum Computing


Quantum computing has far-reaching power and is expected to revolutionize the science and technology sectors. Due to superposition and entanglement, quantum computers are significantly more efficient than classical computers at certain calculations, and they are expected to see use in simulations within the fields of chemistry and physics. Of particular importance to the technology industry is the fact that quantum computers will change the way cryptography is done.


Quantum Computing’s Impact on Cryptography


Cryptography plays a critical role in the security of digital transactions, allowing retailers and shoppers to transact safely online. Modern cryptography is built around one-way functions: functions that are easy to compute but impractical to reverse (i.e. to obtain the original inputs, given the output). If a function cannot feasibly be reversed, its inputs can be used as "secrets". However, quantum computers will be able to reverse some of these functions efficiently, forcing cryptography to undergo a large-scale transformation.


A concrete example of how quantum computing will achieve this can be seen in the RSA cryptosystem. RSA is built around prime factorization and is currently one of the most widely used cryptography systems in the world.


As a refresher on prime factors, recall that a prime number is a number that can only be divided (without a remainder) by 1 and itself. For example, 23 is not divisible without a remainder unless the divisor is 1 or 23; dividing it by any other number produces a fraction, so 23 is prime. 17 and 101 are further examples of prime numbers.
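This definition translates directly into a trial-division check; a minimal Python sketch (the `is_prime` name is just illustrative):

```python
def is_prime(n: int) -> bool:
    """Check whether any number from 2 up to sqrt(n) divides n evenly."""
    if n < 2:
        return False
    divisor = 2
    while divisor * divisor <= n:
        if n % divisor == 0:
            return False  # found a divisor other than 1 and n
        divisor += 1
    return True

print([n for n in (17, 23, 101) if is_prime(n)])  # [17, 23, 101]
print(is_prime(21))                               # False: 21 = 3 * 7
```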


Additionally, recall that integer factorization is the process of breaking a number down into a product of smaller numbers; for example, 2 and 3 are factors of 6. Prime factorization, then, is the process of breaking a number down into a product of prime numbers only, splitting any composite factors further until only primes remain. Continuing the example, 2 and 3 are the prime factors of 6.
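Repeated trial division also yields the prime factorization; a short sketch (the `prime_factors` helper is hypothetical):

```python
def prime_factors(n: int) -> list[int]:
    """Break n into a product of primes by repeated trial division."""
    factors = []
    divisor = 2
    while divisor * divisor <= n:
        while n % divisor == 0:
            factors.append(divisor)
            n //= divisor
        divisor += 1
    if n > 1:
        factors.append(n)  # whatever remains is itself prime
    return factors

print(prime_factors(6))   # [2, 3]
print(prime_factors(60))  # [2, 2, 3, 5]
```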


RSA cryptography depends on the fact that it takes an unrealistic amount of time to break a very large number down into a product of prime factors. A classical computer would take around 300 trillion years to do this for an RSA-2048 key (a 617-digit number). This is the essence of a one-way function, and of modern cryptography: multiplying two large primes takes seconds, but recovering the original primes from the product alone is practically impossible. This is also what makes RSA safe: the original numbers cannot be deduced by any known classical method, even brute force, in a reasonable amount of time.
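The asymmetry can be felt even at toy scale: multiplying two primes is a single operation, while undoing it by brute-force trial division takes a number of steps that grows with the size of the smaller prime. A minimal Python sketch (p and q here are small example primes; real RSA primes have hundreds of digits):

```python
# The easy direction: multiplying two primes is a single operation.
p, q = 104729, 1299709  # two known primes
n = p * q

# The hard direction: recovering p by brute-force trial division.
divisor, steps = 2, 0
while n % divisor != 0:
    divisor += 1
    steps += 1
print(divisor, steps)  # 104729 found only after ~100,000 trial divisions
```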


Quantum computing, however, would be able to reverse this function significantly faster, using Shor's algorithm, because qubits can exist in multiple states at once and therefore represent multiple numbers at once. A 2019 research paper by Gidney and Ekerå posits that an RSA-2048 key could be cracked by a quantum computer in approximately 8 hours. While this is still some way off (as of 2019, the largest number factored on a quantum computer was a 7-digit number, and even that relied partly on classical computing), awareness of the coming changes is an important topic of conversation for any business looking to future-proof itself.


Conclusion


Although quantum computing is still in the early stages of research and development, it presents a great opportunity for those who desire the highest level of efficiency and algorithmic power. Classical computers are fast approaching their physical limits: as process nodes shrink, the risk of electrons misbehaving grows, ironically due to a quantum-mechanical principle known as quantum tunneling. Tunneling can cause bits to switch on even when they are not supposed to, an issue with no easy fix.


Consequently, scientists and industry are being encouraged to think from a quantum point of view, while also pursuing business strategies to cope with the coming limits, such as developing industry-specific hardware. Those who adopt quantum technology early may well find themselves pioneers of the coming era.
