With the development of artificial intelligence, virtual reality, and other fields, the amount of data to be processed keeps growing, while integrated circuits are approaching their physical limits. So instead of gates built from transistors, quantum computers, which take the principles of quantum mechanics as their rules of operation, are emerging as an alternative. What exactly is a quantum, and how can it be an alternative? A reporter with no background in science takes a close look at everything from the quantum to the recently popular quantum computer, in the spirit of learning.

How conventional computers work

A computer is, literally, a machine that computes. What distinguishes it from the calculators we commonly use is that it can perform far larger and more varied calculations.
The language of computers is 0 and 1 alone. A computer expresses numbers by changing the states of the elements inside it, and to make this simple it uses the binary system. The reason computers do not use the familiar decimal system is straightforward: decimal would require ten distinguishable states per element, whereas binary needs only two. If a semiconductor diode is closed, it represents 0; if it is open, it represents 1. A diode can therefore hold only the value 0 or 1, and this basic unit of information is the bit.
All integers can be expressed in binary as well. If diodes are lined up in a long row, each representing a 0 or a 1, a large number can be expressed, and several such rows can represent many numbers. Leaving many diodes set this way, each holding a 0 or a 1, lets arbitrarily large numbers be stored. This is the idea behind memory devices such as HDDs and SSDs.
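As a rough sketch (ours, not part of the original article), here is how a row of two-state elements encodes an integer; the function names to_bits and from_bits are illustrative:

```python
def to_bits(n, width=8):
    """Encode a non-negative integer as a fixed-width row of bits."""
    return [(n >> i) & 1 for i in reversed(range(width))]

def from_bits(bits):
    """Decode a row of bits back into the integer it represents."""
    value = 0
    for b in bits:
        value = (value << 1) | b
    return value

print(to_bits(42))             # [0, 0, 1, 0, 1, 0, 1, 0]
print(from_bits(to_bits(42)))  # 42
```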
The opening and closing of the diodes, called switching, also makes calculation possible. The most basic operation of a computer is addition: multiplication is accumulated addition, and subtraction and division are the reverses of addition and multiplication. A calculation is performed by sending a command to the processing unit to fetch a number stored in memory and change the switching states of the diodes in a prescribed way. This is what happens in the CPU.
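To make the idea concrete, here is a minimal, illustrative sketch of how addition emerges from switching logic, using a full adder built from AND, OR, and XOR gates (the names full_adder and add are ours, not the article's):

```python
def full_adder(a, b, carry_in):
    """Add one bit of each operand using only logic gates."""
    s = a ^ b ^ carry_in                        # XOR gives the sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))  # AND/OR give the carry
    return s, carry_out

def add(x_bits, y_bits):
    """Ripple-carry addition over two equal-width bit rows (MSB first)."""
    result, carry = [], 0
    for a, b in zip(reversed(x_bits), reversed(y_bits)):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)
    return list(reversed(result))

print(add([0, 1, 1], [0, 0, 1]))  # 3 + 1 = 4 -> [0, 1, 0, 0]
```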
Information flow in computers

Computers simply turn numbers into other numbers: a computer can be thought of as a calculator that changes input bits into output bits. Computers have made remarkable progress in line with Moore's law, the observation that the number of elements that fit on a semiconductor chip doubles roughly every two years.
However, recent computer development has reached its limit. With the development of artificial intelligence, virtual reality, etc., the amount of data to be processed is increasing day by day, but it is becoming more difficult to make the components smaller. It is impossible to implement a component smaller than one atom. Moreover, performance can no longer be improved using existing methods because unexpected problems arise in the microscopic world of atoms.
Accordingly, quantum computers that operate according to the principles of quantum mechanics are emerging as an alternative.
Design of a quantum computer

In December 1959, physicist Richard Feynman gave a lecture to the American Physical Society titled "There's Plenty of Room at the Bottom." In it, Feynman made a series of remarks that hinted at the possibility of computation using the principles of quantum mechanics.
Two decades later, in 1981, Feynman presented the idea of a quantum computer at a conference jointly hosted by MIT and IBM, saying, "Nature is not classical. If we want to simulate nature, we have to do it quantum mechanically."
In 1985, David Deutsch, an Israeli-born British physicist, proposed a computer that operates on the principles of quantum mechanics.
How Turing machines work

Earlier, in 1936, British mathematician Alan Turing had designed the Turing machine: a machine that reads a sequence of symbols, such as 0s and 1s, recorded on a one-dimensional tape and processes them step by step according to a table of rules. Turing argued that every mathematical computation could be carried out by such a machine, and a Turing machine able to simulate any other is called a universal machine.
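A Turing machine is simple enough to simulate in a few lines. This sketch is ours rather than the article's; the machine below flips every bit on the tape and halts at the first blank:

```python
# Transition table: (state, symbol) -> (symbol to write, head move, next state).
RULES = {
    ("start", "0"): ("1", 1, "start"),
    ("start", "1"): ("0", 1, "start"),
    ("start", "_"): ("_", 0, "halt"),   # "_" is the blank symbol
}

def run_turing_machine(tape, rules):
    tape = list(tape)
    state, head = "start", 0
    while state != "halt":
        if head == len(tape):
            tape.append("_")            # extend the tape with blanks as needed
        write, move, state = rules[(state, tape[head])]
        tape[head] = write
        head += move
    return "".join(tape)

print(run_turing_machine("0110", RULES))  # "1001_"
```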
Deutsch extended the concept of the Turing machine from mathematics to physics: he reasoned that a truly universal machine should be able to simulate every physical process. Since nature is quantum mechanical, such a machine must itself implement quantum mechanics, and so it must be a quantum computer.
Principles of quantum computers

Quantum computers, which run on the principles of quantum mechanics, operate on completely different principles from existing, or classical, computers. In quantum mechanics, a system can exist in a superposition of states with different properties, and a measurement then yields one of those states only probabilistically. Quantum computers exploit this to let a single bit be 0 and 1 at the same time. Such a bit is called a quantum bit, or qubit for short.
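In the standard notation of quantum mechanics (our addition, not spelled out in the article), a qubit state is a weighted sum of the two basis states:

\[
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,
\]

where a measurement returns 0 with probability $|\alpha|^2$ and 1 with probability $|\beta|^2$.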
Qubits are usually explained with a diagram called the 'Bloch sphere'. Phenomena such as superposition have no counterpart in the macroscopic world we live in, so there is no fully faithful way to visualize them; the Bloch sphere is simply a picture that lets us reason about a qubit in terms of everyday experience.
[Figure: Bloch sphere model illustrating a qubit, the basis of a quantum computer]

On the Bloch sphere, the surface represents all possible states probabilistically, with the probabilities of the two outcomes always summing to 1.
While a bit can only take the values 0 and 1, which correspond to the two poles of the sphere, a qubit can sit anywhere on the surface. Everywhere except the north and south poles, the probabilities of the two outcomes, 0 and 1, coexist.
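For reference (again our notation, not the article's), a point on the Bloch sphere with polar angle $\theta$ and azimuthal angle $\varphi$ corresponds to the state

\[
|\psi\rangle = \cos\tfrac{\theta}{2}\,|0\rangle + e^{i\varphi}\sin\tfrac{\theta}{2}\,|1\rangle,
\]

so the north pole ($\theta = 0$) is $|0\rangle$, the south pole ($\theta = \pi$) is $|1\rangle$, and every other point mixes the two.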
If there is a 10% chance that an object A is at location B, that ordinarily means A will be found at B one time out of ten and elsewhere nine times out of ten; A can never be in two places at once, and if A is found at B, it exists nowhere else. In the microscopic world, however, A exists 'simultaneously' at B and at places other than B, with B simply carrying 10% of the probability.
Qubits can process data in parallel, and as their number grows, the amount of information they can handle grows exponentially. Two qubits can superpose the four patterns 00, 01, 10, and 11, and n qubits can superpose 2^n patterns. This parallel processing of the input makes the computation incomparably faster than on existing digital computers for certain problems.
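A small simulation (ours, using NumPy) makes the exponential bookkeeping visible: the state of n qubits is a vector of 2^n amplitudes, and one Hadamard gate per qubit puts all 2^n patterns into superposition at once:

```python
import numpy as np

n = 2
state = np.zeros(2**n)
state[0] = 1.0                                 # start in |00>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

# Applying H to every qubit: the full operator is a Kronecker product.
H_all = H
for _ in range(n - 1):
    H_all = np.kron(H_all, H)

state = H_all @ state
print(state)               # four equal amplitudes of 0.5
print(np.abs(state)**2)    # 00, 01, 10, 11 each with probability 1/4
```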
The utility of quantum computers

In 1994, Peter Shor, an American theoretical computer scientist, showed that a quantum computer can factor integers far more efficiently than any known classical method. His result, Shor's algorithm, factors an arbitrary integer in polynomial time on a quantum computer.
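The quantum heart of Shor's algorithm is finding the order of a number modulo N; the rest is classical arithmetic. The sketch below is ours, with the order brute-forced classically for a tiny N in place of the quantum step, to show that classical skeleton:

```python
from math import gcd

def order(a, N):
    """Smallest r with a**r = 1 (mod N). A quantum computer finds this
    efficiently; here we brute-force it for a tiny example."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_via_order(N, a):
    """Shor's classical post-processing: factors of N from the order of a."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g        # a already shares a factor with N
    r = order(a, N)
    if r % 2 == 1:
        return None             # odd order: retry with a different a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None             # trivial square root: retry
    return gcd(y - 1, N), gcd(y + 1, N)

print(factor_via_order(15, 7))  # (3, 5): the order of 7 mod 15 is 4
```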
RSA encryption, which is widely used today, is a public-key cryptosystem, and it is known as the first algorithm usable not only for encryption but also for digital signatures. Its security rests on factoring: multiplying two large primes is easy, but factoring their product back into primes is hard. If an algorithm were discovered that factors large numbers dramatically faster, this cryptosystem would be rendered worthless.
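A toy key pair (ours, with deliberately tiny primes; real keys use primes hundreds of digits long) shows why: everything private follows from the factors of N:

```python
p, q = 61, 53                 # the secret primes (absurdly small here)
N = p * q                     # public modulus: 3233
phi = (p - 1) * (q - 1)       # Euler's totient, computable only from p and q
e = 17                        # public exponent, coprime to phi
d = pow(e, -1, phi)           # private exponent (Python 3.8+ modular inverse)

msg = 42
cipher = pow(msg, e, N)       # encrypt with the public key (e, N)
plain = pow(cipher, d, N)     # decrypt with the private key (d, N)
print(cipher, plain)          # plain == 42

# Anyone who can factor N back into p and q can recompute d and decrypt.
```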
[Photo: Adi Shamir, the 'S' in RSA]

A famous benchmark is RSA-768, a 232-digit (768-bit) composite number made up of the product of two large primes. It was factored in 2009.
The factorization took hundreds of computers connected in parallel about two years; had it been run on a single computer with one 2.2 GHz AMD Opteron core, the team estimated, it would have taken some 2,000 years.
The defining feature of quantum computers is that they exploit superposition, which allows simultaneous processing. A quantum computer carries out its calculation while keeping the state in superposition, and only at the end obtains a result by measurement.
In the world of quantum mechanics, observation affects the object being observed. The problem is that for the superposition to be maintained, decoherence must be prevented: the system must not be measured mid-computation. Avoiding decoherence, however, is very difficult.
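A short simulation (ours, in NumPy) shows what measurement does to a superposed state: the outcome is random, and afterwards the superposition is gone:

```python
import numpy as np

rng = np.random.default_rng()

def measure(state):
    """Sample an outcome with probability |amplitude|^2, then
    collapse the state vector onto that outcome."""
    probs = np.abs(state)**2
    outcome = rng.choice(len(state), p=probs)
    collapsed = np.zeros_like(state)
    collapsed[outcome] = 1.0
    return outcome, collapsed

state = np.array([1.0, 1.0]) / np.sqrt(2)  # equal superposition of 0 and 1
outcome, state = measure(state)
print(outcome)  # 0 or 1, each with probability 1/2
print(state)    # all weight now sits on the measured outcome
```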
A computer's work is a combination of sequential execution and conditional branching. Conditional branching means deciding whether to execute a set of instructions depending on whether some condition holds. But to make that decision, the machine must check whether the condition occurred, and checking is an observation, so it causes decoherence. In other words, a method is needed for deciding whether to execute instructions without measuring.
In the next article, we will learn about the method and commercialization of quantum computers.