The scientific and industrial revolutions will burst forth all at once, and their impact will certainly be witnessed in the coming five or ten years. Quantum computing, artificial intelligence (AI), and blockchain are among the emerging technologies that will gain significant traction as this new worldwide scientific revolution progresses. Do these disruptive inventions have any connection with China's traditional culture? What relationship do they have with basic sciences such as mathematics and physics?
In this article, we will be sharing a report compiled by Shoucheng Zhang, winner of the China International Science and Technology Cooperation Award in January 2018, and a foreign member of the Chinese Academy of Sciences.
"Quantum computing, artificial intelligence (AI), and blockchain are the three most fundamental technologies in the field of information technology today. An emphasis on basic science will enable IT to develop across disciplines. Physics and mathematics are intertwined, and they have clearly contributed significantly to the current IT revolution," said Shoucheng Zhang.
I will first tell a story of scientific discovery related to the "angel particle" before speaking about quantum computing. Many interesting modern scientific discoveries have some connection with changes in philosophical concepts, including several ancient and deeply rooted philosophical theories of the Chinese people. For example, the world has always seemed to be one of opposition between positive and negative: if there are positive numbers, there must be negative ones; if there is yin, there is yang; and if there is good, there must also be evil. This worldview of opposition also manifests in the world of fundamental particle physics.
Paul Dirac was a great, historically important theoretical physicist who unified Einstein's special theory of relativity with quantum mechanics. In doing so, he performed a simple mathematical operation: taking a square root. When you take a square root, there are always both a positive and a negative solution.
A typical person might pay attention only to the positive solution and ignore the negative one. Dirac instead gave the negative solution a physical meaning: every particle inevitably has an antiparticle, and he predicted that all particles would have such antiparticles.
Physics had not yet discovered any antiparticle by 1928, and people viewed Dirac's idea of antiparticles with considerable skepticism; they said that his equation must have gone wrong. Dirac maintained that it was correct. Five years later, Dirac was vindicated when physicists discovered the antiparticle of the electron - the positron - in cosmic rays. Dirac's interpretation of the negative-energy solutions came to be known as the Dirac sea.
Subsequently, particle physicists discovered the antiproton and the antineutrino and found applications for them. For example, positrons already have wide use in medicine: positron-emission tomography (PET) scans use the annihilation of positrons and electrons to produce an image, and they are among the best methods for detecting Alzheimer's disease.
Today, Chinese people pay close attention to scientific developments. So what is the strongest driver of science? I believe it is curiosity about life. The history of theoretical physics demonstrates this curiosity. For example, a falling apple inspired Sir Isaac Newton to discover gravity. When riding in an elevator, Einstein realized that the up-and-down motion of the elevator played a role similar to gravitation, and from this insight he was able to develop the great general theory of relativity.
However, scientific development should not blindly trust authority. After Dirac became a famous theoretical physicist, scientists firmly believed that every particle in the world must have an antiparticle. But another theoretical physicist, Ettore Majorana, wondered out of curiosity whether some particles might exist without antiparticles. He developed the Majorana equation, which intriguingly describes a particle that is its own antiparticle - that is, one with no distinct antiparticle.
Later, the world of physics engaged in a feverish search for two particles. The first, known as the "God particle," was detected by a European particle accelerator in 2012, and Peter Higgs, the physicist who predicted its existence, won a Nobel Prize. The second was the Majorana fermion.
I am a theoretical physicist, and a theoretical physicist generally makes predictions that experimental physicists can then test. In 2010, my group predicted that a Majorana fermion should be detectable in a suitably assembled device. However, we still needed to find a signal that could prove the existence of this particle.
One day I was thinking: if a Majorana particle has only one side, with no opposing side, then it is half of a regular particle. Our theoretical group made a bold prediction: since a Majorana particle is only half of a regular particle, its conductance will also differ from that of ordinary particles.
While the conductance associated with ordinary particles is measured in integer multiples - 0, 1, 2, 3 - a Majorana particle must produce a half-integer value. We predicted a plateau at 0.5, or 1/2. Later, in close cooperation with an experimental group, we made the empirical observation that this plateau at 0.5 existed. In the original figure from the experiment, you can see how a plateau appears at 0.5, proving the existence of the Majorana fermion. We named it the "angel particle," a name which people quite liked.
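In standard notation, the contrast between the two kinds of plateau can be sketched as follows (the unit, the conductance quantum e²/h, is an assumption of this sketch; the text above states only the dimensionless values):

```latex
% Ordinary quantized conductance plateaus
\sigma = n \, \frac{e^2}{h}, \qquad n = 0, 1, 2, 3, \dots

% Half-quantized plateau signaling the Majorana ("angel particle") mode
\sigma = \frac{1}{2} \, \frac{e^2}{h}
```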
Today's computers belong to one of two categories: classical computers and quantum computers. Classical computers easily solve some types of problems; for example, a conventional computer can rapidly calculate the product of two large numbers. The reverse problem, however - determining the two factors whose product is a given number - is much harder. For a small number the calculation is easy: you can quickly see that 15 = 3 x 5.
But if I gave you a very large number, a classical computer would have to spend a long time finding its two factors. The reason is that it uses a brute-force algorithm, dividing the number by every possible candidate until it finally finds a factorization. A classical computer does this very slowly.
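A minimal sketch of this contrast in Python (the function name and the numbers are illustrative, not from the original text):

```python
def factor(n):
    """Brute-force trial division: try every candidate divisor up to sqrt(n)."""
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return d, n // d  # first factor pair found
    return None  # no divisor found: n is prime

# Multiplying is instant; factoring must search divisor by divisor,
# which becomes hopeless for very large numbers.
print(101 * 103)      # 10403
print(factor(10403))  # (101, 103)
```

For cryptographically sized numbers (hundreds of digits), this divisor-by-divisor search is exactly why factoring is considered intractable for classical machines.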
Classical computers can only use this brute-force method. The quantum world, however, is a mysterious world of parallelism. In the famous double-slit experiment, if I release a particle such as a photon in front of two slits - a left slit and a right slit - classical intuition says it will go through one or the other. But the quantum world contains inherent parallelism: an elementary particle can simultaneously pass through both slits at the same instant. If it passed through only the left or only the right slit, it could not create the resulting interference pattern.
The quantum world is innately parallel, so it can in principle compute such a problem in a second: it instantly explores all possibilities at once, and this inherent parallelism is the fundamental concept of quantum computing. However, a quantum computer is very difficult to construct because of its fundamental unit.
The fundamental unit of a classical computer is the bit: a piece of information that is either 0 or 1. These 0s and 1s can express all information, which is the fundamental concept of classical computing. In the quantum world, however, just as a particle can simultaneously pass through a left and a right slit, a quantum bit is in a state of superposition: it is not simply a 0 or a 1, but a superposition of 0 and 1.
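A tiny illustrative sketch in Python of this amplitude bookkeeping (not a real quantum system; the names and numbers are assumptions of the sketch):

```python
import math
import random

# A qubit as two amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
alpha = 1 / math.sqrt(2)  # amplitude of |0>
beta = 1 / math.sqrt(2)   # amplitude of |1>: an equal superposition

def measure(alpha, beta):
    """Measurement collapses the superposition: 0 with probability |alpha|^2."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

counts = [0, 0]
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
print(counts)  # roughly [5000, 5000]
```

Until it is measured, the qubit carries both amplitudes at once; measurement forces it into a definite 0 or 1, which is why observation (or noise) is so disruptive.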
We have all heard of Schrödinger's cat, superposed between life and death. This phenomenon is extremely curious, and it explains why a quantum bit's state is inherently unstable: when you observe it to determine whether it is a 0 or a 1, any slight noise from the surroundings will significantly interfere with the quantum bit.
Quantum computing has recently drawn interest from around the world, especially from famous American companies. Although Google, Microsoft, IBM, and Intel have all invested in quantum computing, they have been unable to resolve the instability of the quantum bit. If someday they tell us they have made 50 quantum bits, the primary question remains: how many of them are actually useful? In current quantum computing frameworks, just one useful quantum bit requires 10, 20, or even 50 error-correcting bits to assist it, making practical quantum computing very difficult to achieve.
However, the discovery of the "angel particle" has fundamentally altered the challenge of quantum computer development, turning it from a question of quantity into one of quality. A quantum bit can be split into two angel particles, and this pair contains its own error-correcting mechanism. A typical particle has two sides, but an angel particle has only one; an angel particle is therefore equivalent to just half a particle.
A quantum bit can thus be stored in two angel particles. Once stored this way, the two particles remain entangled even at a great distance. Because the noise of the classical world plays no part in this entanglement, it cannot disrupt a quantum bit stored in angel particles. This change is revolutionary.
For this reason, in a talk at a recent meeting of physicists in the United States, I said the discovery of the angel particle is fascinating. However many quantum bits a quantum computer contains, these particles are useful: no bits need be occupied with error correction when each bit can instead provide its own. This will rapidly increase the pace of development of quantum computing.
The concept of artificial intelligence (AI) was first proposed in the 1960s. Today, AI is developing so rapidly primarily because of the convergence of several new technologies. Moore's Law describes a doubling of computing capability every 18 months; if quantum computing enters the picture, however, there will not merely be a doubling every 18 months but a complete shift from quantity to quality.
Humanity's computing capability has been increasing continuously for years. The birth of the Internet and the Internet of Things (IoT) has produced a massive amount of data, and intelligent algorithms are evolving at a rapid rate. Big data can assist machine learning, and data of all kinds is the foundation of AI: without data, there could be no artificial intelligence, regardless of the effectiveness of algorithms or the power of computers.
Although I have seen it develop rapidly, I feel that AI is still in its earliest stages. Why do I say that? I'll give you an analogy. Imagine humankind first seeing birds and wanting to fly. Early research into flight was a simple imitation of biology: people would attach wings to their arms. The real airplane, however, was invented only when humans understood the principles of flight, such as aerodynamics. Only once the laws of physics and the necessary mathematical equations were in place could they design optimal artificial flying machines. Today's airplanes fly high and fast, yet they look nothing like birds - and this is a crucial difference.
Today's AI largely imitates human neurons. It is more important to recognize the opportunity here for breakthroughs in basic science: we can create a radical change in artificial intelligence only by truly understanding the fundamental principles of intelligence.
How can we determine that AI has reached a human level of intelligence? Perhaps some of you know about the Turing test, in which a human converses with a machine without knowing whether the conversation partner is a human or a machine. If the human can continue the conversation for a day without telling whether the partner is human or machine, the machine has reached the level of humans. Although Turing was a great computer scientist, I do not agree with this criterion. Many human emotions are not at all rational, and it is quite challenging to get a rational machine to learn and imitate an irrational human brain.
Therefore, I want to propose a new criterion for determining when robots have intelligence that truly surpasses that of humans. I believe that the most significant thing we humans do is make scientific discoveries. The day a robot makes a real scientific discovery will be the day robots surpass humans.
I recently wrote a paper, to be published in the Proceedings of the National Academy of Sciences, proposing that humanity's most significant scientific discoveries are the theory of relativity, quantum mechanics, and the periodic table. Could AI automatically discover the periodic table without any assistance? Similarly, could it help humanity discover new medicines, or could machine learning methods be used to find new materials? These are the standards for ascertaining the level of AI.
In today's world, individuals produce large quantities of data: genetic data, health data, educational data, and behavioral data. Such data is essential for the development of AI, yet much of it is held within central organizations and is not genuinely decentralized. The rise of blockchain has made the creation of a decentralized data market possible.
I describe the whole idea of blockchain with a single phrase - "in math we trust" - because it is founded on mathematics. Basic math is the most fundamental aspect of the entire blockchain and information technology fields: it can protect personal privacy in the data market while still allowing reasonable statistical calculations on the data. For example, a marvelous computational method called the zero-knowledge proof can prove that my data is valuable without revealing the private data itself.
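As a toy sketch of the zero-knowledge idea, here is a Schnorr-style proof of knowing a secret exponent, written in Python. The parameters and numbers below are illustrative assumptions, far too small to be cryptographically secure; this merely demonstrates the pattern of proving knowledge without revealing it:

```python
import random

# Public parameters: a prime modulus p and a generator g (toy values only).
p = 2147483647          # the Mersenne prime 2^31 - 1
q = p - 1               # modulus for exponent arithmetic
g = 7                   # a primitive root modulo p

x = 123456789           # the prover's SECRET
y = pow(g, x, p)        # public value derived from the secret

# Prover commits to a random nonce.
r = random.randrange(1, q)
t = pow(g, r, p)

# Verifier issues a random challenge.
c = random.randrange(1, q)

# Prover responds; s reveals nothing about x because r is random.
s = (r + c * x) % q

# Verifier checks g^s == t * y^c (mod p), which convinces them the
# prover knows x without ever seeing x itself.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof verified without revealing the secret")
```

The check works because g^s = g^r · (g^x)^c = t · y^c modulo p, yet the transcript (t, c, s) can be simulated without knowing x, which is what makes the proof "zero-knowledge."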
The public does not yet understand blockchain in terms of its fundamental first principles. Speaking in terms of the most basic physics, reaching consensus means everyone agreeing on the same "account book," which resembles a magnetic material: its elements point in disordered directions at first, but all point in the same direction once it is magnetized.
Consensus is also reached in the natural world, through a phenomenon called a reduction in entropy. The entropy of a state in which everyone faces the same direction after reaching agreement is far less than that of a disordered state. Achieving such a consensus is challenging because entropy always tends to increase.
Reaching consensus on a blockchain via an algorithm requires the consumption of energy. This may sound irrational - why should keeping an account book consume energy? - but it makes eminent sense in terms of the second law of thermodynamics. Since reaching consensus is a decrease in entropy, while the world's total entropy must increase, some entropy must be created as a byproduct of reaching consensus. This decentralized mechanism closely resembles the natural process of a disordered magnetic material becoming orderly and magnetized, for which a price in energy must be paid.
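A back-of-the-envelope sketch of this entropy argument in Python, using the Boltzmann form S = k ln W with k set to 1 for illustration (the spin count N is an arbitrary assumption):

```python
import math

# N binary "opinions" (spins). Disordered: each can point up or down,
# so W = 2^N microstates. Consensus ("all aligned"): W = 1.
N = 100
S_disordered = math.log(2 ** N)  # = N * ln 2
S_consensus = math.log(1)        # = 0

# Reaching consensus lowers the system's entropy by N * ln 2, so at
# least that much entropy (and hence energy dissipation) must be
# exported to the surroundings.
print(round(S_disordered - S_consensus, 2))  # 69.31
```

The point of the sketch is only the sign of the change: aligning every spin removes N ln 2 of entropy from the system, which the second law says must reappear elsewhere.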
Hence, in an ideal future world, each person will possess his or her own data in a completely decentralized, unhackable storage system. People will be able to use encrypted algorithms on a blockchain to make calculations that both protect personal privacy and remain useful, so that incidents like the data theft from Facebook will never happen.
Quantum computing, artificial intelligence, and blockchain technology pose problems for the entire human race, and they should be tackled today. Chinese scientists have a great opportunity: in addition to making progress in applied technology, they should also make original breakthroughs in basic sciences such as the physics and mathematics described above, even if things like the principle of increasing entropy or particles and antiparticles sound abstract. The marvel of the world lies in the fact that basic science can provide the entire information technology industry with vast new prospects for development.