Top 10 Tech Trends for 2022: Hear What the Experts Have to Say

On December 28, Alibaba DAMO Academy released its top ten technology trends for 2022, the fourth consecutive year it has published a forecast of cutting-edge technology trends.
To compile the list, DAMO Academy analyzed 7.7 million public papers and 85,000 patents from the past three years across 159 fields and conducted in-depth interviews with nearly 100 scientists, arriving at ten technology trends that may become reality in 2022, covering areas such as artificial intelligence, chips, computing, and communications. Let's hear what the experts have to say about these trends.

Trend 1: AI for Science

Artificial intelligence has become a new production tool for scientists, giving birth to a new paradigm of scientific research.

Weinan E (Academician of the Chinese Academy of Sciences; Professor at Peking University and Princeton University)

For hundreds of years, the data-driven Kepler paradigm and the first-principles-driven Newton paradigm have been the two basic paradigms of scientific research. The booming field of AI for Science has the potential to deeply integrate these two paradigms and spark a new scientific revolution. By listing AI for Science as an important trend for 2022, DAMO Academy recognizes the huge potential of combining artificial intelligence with traditional scientific research. I hope this encourages more researchers to join in and accelerates the progress of this scientific revolution.

AI for Science brings not just breakthroughs at a few points but a comprehensive change in scientific research methods. To adapt to this new environment, scientists need a deeper understanding of AI in order to use it well. Industry has accumulated substantial AI R&D capabilities and resources: it can provide the computing resources that academia urgently needs, and it can help build basic scientific research tools. Academia and industry undoubtedly need more collaboration, upholding the spirit of open source and openness, setting aside factional biases, and building a research community around AI for Science.

DAMO Academy's top ten technology trends may be one such effort. I look forward to this effort, led by DAMO Academy, accelerating the deep integration of information science and traditional science. I also hope that AI for Science proves to be not just a new wave but a brand-new scientific era.

Wu Fei (Director, Institute of Artificial Intelligence, Zhejiang University)

I think AI will definitely be a tool for scientists, but I hope it's not limited to that.

We are in the age of the data-intensive computing paradigm. We have massive data, and scientists can apply their own methodologies to conduct scientific exploration over it. Using such methodology for scientific discovery must be grounded in artificial intelligence, so AI will certainly become a tool for scientists.

Why do I say I hope it is not limited to this? Behind the tool lies the infiltration of AI's computational thinking, so I hope that in the process of using these tools, scientists form a computational mindset centered on design, construction, and calculation.

Take AlphaFold, for example. This great research result came not only from using deep neural networks as a tool, but also from bringing together scientists from different disciplines, who first designed a clear computational approach to predicting the three-dimensional structure of a protein from its amino acid sequence.

Therefore, we need to form computational thinking to carry out scientific exploration, and at the same time apply artificial intelligence as a tool to create a new future for science. AI for Science is a promising direction for artificial intelligence.

Hua Xiansheng (Head of the City Brain Laboratory, Alibaba DAMO Academy)

At present, there have been notable breakthroughs in AI for Science, but they are mostly isolated results in areas such as molecular biology and quantum mechanics; large-scale results have not yet formed, and there is still a lot of room for breakthroughs.

Using AI to help scientific research rests on two foundations: data and computation, because AI capabilities are built on data and computing power. A discipline with relatively good, large, and abundant data, whose problems involve large-scale computing, is therefore likely a place where breakthroughs can come quickly. Molecular biology, astronomy, geographic science, and atmospheric science all have large amounts of data and very complex problems that demand powerful computation, so AI capabilities may yield breakthroughs there faster.

In essence, AI for Science is not much different from AI for Industry: AI is used as a tool to advance the field. The difference is that the threshold is relatively high, because the work must be done by scientists rather than by an ordinary person or an ordinary technical worker. But fundamentally, given the data in a field, algorithms can be designed to mine the "mysteries" in that data and solve the field's problems.

Today, artificial intelligence technology has taken a big step forward. It can move scientific research from computing-assisted to intelligence-assisted, and intelligence will change research methods. The effect should be comparable to what happened in industry: improving the efficiency of scientific research, producing more results, even shifting from manual workshops to mass production. Of course this is not easy, but the trend may be there.

For scientific research, discovery could change from a low-probability event into a higher-probability one, becoming more systematic and certain rather than largely serendipitous. That is what AI for Science is about, and of course there is still a long way to go.

AI has been running in other fields for nearly ten years; in science, it is just beginning, with only some isolated point technologies so far. What is needed boils down to two things.

The first is that AI experts need to understand scientific problems, and this threshold is relatively high.

The second is that scientists need to understand the principles of AI: what it can and cannot do, and where its advantages lie. This does not mean only applying existing AI capabilities to a research problem; better AI capabilities may also be developed in the process of collaboration between AI experts and scientists to solve the problem at hand. This is the same process as AI for Industry, where many of our AI technologies emerged while solving industry problems. The two sides must therefore be combined, the only difference being that the threshold for science is relatively high.

When AI is applied to industry, it gradually moves from single-point technologies to platforms. I think AI for Science will likewise gradually become a platform: AI experts, together with scientists, build a research platform for a particular field, discipline, or even a particular class of problems within a discipline. Scientists then gain greater freedom and more powerful tools, enabling them to conduct research at larger scale and achieve richer and more important scientific breakthroughs.

Going further, I would draw on the trajectory of AI for Industry. After moving from single-point technology to a platform, I expect the next stage to be a system, an evolutionary or co-evolutionary one. A platform solves the problem of generating capabilities and deploying applications at scale; a system solves an industry's problems in a long-term, continuous, and in-depth way and generates core value. The same holds for scientific research: if such an AI system can be built in every field, scientific discovery may proceed in an automatic or semi-automatic mode, making discoveries continuously, durably, deeply, and broadly. That may be the longer-term future. Of course, the automatic mode will handle relatively simple discoveries and demonstrations, not the most cutting-edge and complex problems; those will be tackled by combining powerful AI systems with deep scientific expertise.

Yin Wotao (Head of the Decision Intelligence Laboratory, Alibaba DAMO Academy)

DeepMind and collaborators recently published a paper in Nature in which AI helped solve mathematical problems, attracting a lot of attention. What everyone cares about is not the results themselves, but how AI functions in mathematical research.

Let me introduce the background first. The paper concerns low-dimensional topology. It involves more than a dozen mathematical quantities, of which the key ones are obtained through neural network fitting. The authors conjectured that an unknown nonlinear relationship exists in low-dimensional topology, generated a large amount of data, and fitted an approximate function with a neural network. They found that three of the quantities played an important role in the fitting, and that using only those three still fit very well. Through repeated fitting experiments, generating new data and obtaining new observations, the mathematicians finally conjectured an inequality and then gave a rigorous proof.

What everyone cares about is the process of human-machine interaction: a large number of neural network experiments and multiple rounds of interaction between two mathematicians, a process Nature describes in detail. Since Kepler's time, applied scientists have repeatedly made experimental observations to find patterns; now AI algorithms and AI experts play that role. The innovation here mainly uses neural network techniques, including multivariate nonlinear function approximation and black-box attribution (interpretation) techniques.
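To make that workflow concrete, here is a minimal sketch (not DeepMind's actual code) of the loop described above: fit a neural network to generated (invariants, target) data, then rank the inputs by gradient saliency to see which few quantities actually matter. The toy relationship and all sizes are illustrative assumptions.

```python
# Toy version of "fit, then attribute": the target secretly depends on only
# 3 of 12 candidate quantities; saliency should recover exactly those three.
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(4096, 12)                              # 12 candidate invariants
y = (2.0 * X[:, 0] - X[:, 1] * X[:, 2]).unsqueeze(1)   # depends on cols 0-2 only

model = nn.Sequential(nn.Linear(12, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for _ in range(2000):                                  # fit the approximate function
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X), y)
    loss.backward()
    opt.step()

# Saliency: average |d prediction / d input| over the dataset.
Xg = X.clone().requires_grad_(True)
model(Xg).sum().backward()
saliency = Xg.grad.abs().mean(dim=0)
print(saliency.numpy().round(2))                       # columns 0-2 dominate
```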

In conclusion, this success story will inspire pure mathematicians to cooperate with AI in proving new conjectures and discovering new structures.

Extending to AI for Science more broadly, my view is that AI can indeed accelerate scientific experiments. Beyond computer simulation, AI can also suggest the direction of experiments. In recent astrophysics work, AI has optimized the pointing of space telescopes, collecting more and more interesting data. It is a bit like autopilot: using AI as an "autopilot" for the telescope to speed up routine discovery.

Second, AI promotes human-machine collaboration. This is simple to say, but the specific operations are much more complicated. Scientists and AI experts must interact closely on steps such as data generation, problem mapping, building and training neural networks, and using neural networks to verify results, all of which are tightly coupled to the problem being solved.

Last but not least is the development of explainable AI tools. The conclusions generated by AI need to be easy to understand and traceable, so as to build a bridge with science and gain the trust of scientists.

Huang Fei (Researcher, Language Technology Laboratory, Alibaba DAMO Academy)

My work is mainly in natural language understanding and dialogue, including AI models. AI for Science is a very new direction; our team's work in this area has not been under way for long. Based on the pre-trained model system AliceMind, using limited supervised data combined with reinforcement learning, we have been able to prove nearly 400 theorems.

There are some differences between AI for Industry and AI for Science. The former mainly solves practical problems in industry, finding patterns in real-world data. For science-oriented AI, the goal is not only to find patterns in the data but also to find the underlying laws that generate those patterns and explain different phenomena. Common prediction methods, such as mapping data to labels, and traditional classification approaches, such as predicting a disease from an image, can still be used, but the model pays more attention to understanding the underlying mathematics and physics, to insights that support interpretability, and to finding more appropriate representations for the data in the research question. We need to build better representations from the source data and a better understanding of the problems in the field.

Current AI is mainly used in industrial scenarios and relies on large amounts of data. If scientific research likewise has abundant data, for example on proteins or in other specific fields, AI may play a relatively large role. However, representing and applying knowledge in a specific domain involves symbolic logic, including knowledge graphs, and even human experience and textual knowledge; AI work on how to represent and apply such domain knowledge is currently relatively limited.

Another issue is cross-team collaboration. The current model is that physicists and mathematicians pose problems and needs, and AI experts help solve them. More effective work happens when both parties take a step toward each other: AI experts acquire a deep understanding of the relevant subject, which enables better computational modeling, while physicists, mathematicians, chemists, and biologists who better understand trends in computing and AI can offer very important suggestions, which greatly helps the modeling of AI systems and the construction of the overall AI pipeline.

Trend 2: Co-evolution of Large and Small Models



The large model parameter competition has entered a cooling-off period, and large and small models will co-evolve across cloud, edge, and device.

Zhou Zhihua (Head of the Department of Computer Science and Technology and Dean of the School of Artificial Intelligence, Nanjing University)

On the one hand, large models have achieved successes on many problems that could hardly have been anticipated; on the other hand, their huge training energy consumption and carbon emissions cannot be ignored. Personally, I think large models will play a role in major tasks related to the national economy and people's livelihood, while in other scenarios small models may be used through methods similar to ensemble learning, in particular "reusing" and integrating existing small models with a small amount of training to achieve good performance.

We have proposed an idea called "learnware" and are currently exploring it. The general idea is this: suppose many people have already built models and are willing to share them in a market, and the market organizes and manages the learnwares by establishing specifications. When someone builds a new application later, instead of collecting data and training a model from scratch, they can use the specification to search the market for a model close to their need, take it home, and polish it a little with their own data before using it. There are still some technical challenges to solve, and we are working in this direction.
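As a toy illustration of this idea (the real learnware proposal uses far richer specifications; this sketch summarizes each model's training data as a feature-mean vector and matches by Euclidean distance, and all names are hypothetical):

```python
# A minimal, hypothetical model market: register models with specification
# vectors, then match a new user's data statistics against them.
import numpy as np

class ModelMarket:
    def __init__(self):
        self.entries = []  # list of (specification vector, model object)

    def register(self, spec, model):
        self.entries.append((np.asarray(spec, dtype=float), model))

    def search(self, user_spec):
        """Return the model whose specification is closest to the user's data."""
        user_spec = np.asarray(user_spec, dtype=float)
        dists = [np.linalg.norm(spec - user_spec) for spec, _ in self.entries]
        return self.entries[int(np.argmin(dists))][1]

market = ModelMarket()
market.register(spec=[0.1, 0.9], model="sentiment-model-A")
market.register(spec=[5.0, 2.0], model="forecast-model-B")

# A new user summarizes their own (small) dataset, queries the market,
# and would then fine-tune the returned model with local data.
print(market.search(user_spec=[4.7, 1.8]))  # -> "forecast-model-B"
```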

On the other hand, models can be made leaner by leveraging human common sense and domain expertise, which requires combining logical reasoning and machine learning. Logical reasoning is better at using human knowledge; machine learning is better at using data facts. How to combine the two organically has always been a major challenge in artificial intelligence. The difficulty is that logical reasoning is a rigorous deductive process from the general to the specific based on mathematical logic, while machine learning is a less rigorous, probably-approximately-correct inductive process from the specific to the general; the two differ greatly in methodology. Existing explorations generally rely on one of them and introduce elements of the other. Recently, we have been exploring ways for the two to work together in a more balanced manner.
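One common pattern for such a combination, sketched minimally and hypothetically below, lets a learned model propose labels with confidences while a logical rule abduces a correction when the labels violate it (here the rule is the arithmetic constraint a + b = c; this is an illustration of the general idea, not the authors' actual system):

```python
# A learned model proposes labels with confidences; a logical rule checks
# consistency and revises the least trusted label when the rule is violated.
def abduce(preds, confs):
    a, b, c = preds
    if a + b == c:                      # deduction: labels already consistent
        return preds
    i = confs.index(min(confs))         # abduction: revise least trusted label
    fixed = list(preds)
    fixed[i] = {0: c - b, 1: c - a, 2: a + b}[i]
    return tuple(fixed)

# Hypothetical classifier output for an image of the equation "3 + 4 = 7",
# where the last digit was misread as 8 with low confidence:
print(abduce(preds=(3, 4, 8), confs=(0.99, 0.95, 0.60)))  # -> (3, 4, 7)
```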

Academic Vice President, Beijing Zhiyuan Artificial Intelligence Research Institute

Although great progress has been made on ultra-large-scale pre-trained models at home and abroad, many important problems remain. First, the theoretical basis of pre-trained models is not yet clear (for example, is there a limit to the intelligence attainable as parameter scale grows?), nor is it known how large models can be applied to real systems efficiently and at low cost. Second, building large models requires overcoming obstacles in data quality, training efficiency, computing power consumption, model delivery, and more. Finally, most current large models lack cognitive ability, which is one reason some scholars question whether such models can be called "foundation models". Can more general intelligence be achieved through large models, and how? All of this requires continuous exploration by academia and industry. DAMO Academy's top ten technology trends for 2022 propose that large and small models will co-evolve across cloud, edge, and device: the large model exports capabilities to small models at the edge and on devices, the small models handle actual inference and execution, and the small models in turn feed algorithms and execution results back to the large model, continuously strengthening it and forming an organically circulating intelligent system. This view is instructive and helps large models move from the laboratory to large-scale industrial application.

As for the development of large models, on the cognitive-intelligence side, further increases in parameter count cannot be ruled out, but the parameter race is not an end in itself; it is a way to explore further performance gains. Research on large models should also focus on original architectural innovation, improving the cognitive capabilities of trillion-parameter models through continual learning, added memory mechanisms, and breakthroughs in triple-based knowledge representation. On the model side, new multimodal, multilingual, and programming-oriented models will also become research focuses.

In terms of efficient application, the threshold for using large models will be greatly lowered so that everyone can use them, promoting an AI industrialization model for small and medium-sized enterprises of "large model + fine-tuning on a small amount of data".

This will mainly be achieved by:
1) Lowering cost: reducing the model's computing power consumption during pre-training, adaptation to downstream tasks, and inference;
2) Improving speed: increasing the inference speed of models at the 100-billion-parameter scale and above by two orders of magnitude through model distillation, model pruning, and similar means (a distillation sketch follows this list);
3) Building platforms: providing full-process pre-training services, from online model construction to online model deployment and application release, through a one-stop development and application platform that can support the development and deployment of hundreds of applications.

The wide application of this technology will become a key booster for the high-quality development of China's economy.
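As a minimal sketch of the distillation mentioned in item 2 (model sizes, temperature, and data here are illustrative assumptions, not a production recipe), a small "student" can be trained to match a large "teacher's" softened output distribution:

```python
# Knowledge distillation in one training step: the student mimics the
# teacher's softened class distribution instead of hard labels.
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(128, 512), nn.ReLU(), nn.Linear(512, 10))
student = nn.Sequential(nn.Linear(128, 32), nn.ReLU(), nn.Linear(32, 10))
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
T = 4.0  # temperature: softens the teacher's distribution

x = torch.randn(256, 128)             # stand-in for a batch of real inputs
with torch.no_grad():
    soft_targets = F.softmax(teacher(x) / T, dim=-1)

opt.zero_grad()
# KL divergence between student and teacher distributions at temperature T;
# the T*T factor keeps gradient magnitudes comparable across temperatures.
loss = F.kl_div(F.log_softmax(student(x) / T, dim=-1),
                soft_targets, reduction="batchmean") * T * T
loss.backward()
opt.step()
```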

Yang Hongxia (Scientist, Intelligent Computing Laboratory, Alibaba DAMO Academy)

In recent years, with the rapid development of pre-training technology in deep learning, ultra-large-scale models have gradually entered people's field of vision and become a focus of artificial intelligence. Pre-trained models have achieved great breakthroughs in multiple AI fields such as text, image processing, video, and speech, and are gradually becoming AI's foundation models. Pre-trained models are also being actively combined with the life sciences, including work on proteins and genes, and have broad prospects in tasks such as cell classification, discovery of gene regulatory relationships, and bacterial drug-resistance analysis. However, several issues with large pre-trained models still need to be overcome:

1. The current mainstream practice is to first train a large model (a pretrained model) with a large parameter count and high accuracy, and then, based on downstream task data, prune and fine-tune it to compress its size and ease deployment without losing accuracy (see the pruning sketch after this list). The industry has not yet found a general way to train small models directly to satisfactory accuracy;

2. Training models with hundreds of billions or trillions of parameters often requires thousands of GPUs, which poses a great challenge to the promotion and accessibility of large models;

3. Because of their enormous parameter counts, the pre-training stage of these models mainly consumes large amounts of unstructured data. How to combine this with structured data such as knowledge graphs for more effective cognitive reasoning is also a very big challenge.
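Item 1 above describes the prune-then-finetune recipe; the following minimal sketch (unstructured magnitude pruning on a single layer, purely illustrative; real systems use structured and iterative pruning) shows the basic mechanics:

```python
# Magnitude pruning: zero out the smallest weights, then keep training so
# the surviving weights compensate; reapply the mask to keep sparsity.
import torch
import torch.nn as nn

layer = nn.Linear(512, 512)

with torch.no_grad():
    w = layer.weight
    threshold = w.abs().quantile(0.9)        # keep only the largest 10% of weights
    mask = (w.abs() >= threshold).float()
    w.mul_(mask)                             # prune: zero the small weights

# One fine-tuning step on stand-in data; pruned weights stay zero via the mask.
opt = torch.optim.SGD(layer.parameters(), lr=1e-2)
x, y = torch.randn(64, 512), torch.randn(64, 512)
loss = nn.functional.mse_loss(layer(x), y)
loss.backward()
opt.step()
with torch.no_grad():
    layer.weight.mul_(mask)                  # re-enforce sparsity after the update
```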

These issues have brought the large-model parameter race into a cooling-off period, and the co-evolution of large and small models across cloud, edge, and device offers new possibilities for breakthroughs. Cloud-edge-device collaboration makes it easier for small models to acquire general knowledge and capabilities, while small models focus on extreme optimization for specific scenarios, improving both performance and efficiency; it also solves the problem of overly narrow datasets for past large models. Ultimately, society will not need to repeatedly train similar large models; models can be shared to maximize the efficiency of computing power and energy. The co-evolution of large and small models can also better serve complex new scenarios such as virtual reality and digital humans, which require simultaneous deployment in, and interaction between, cloud and edge. At the same time, the architecture is more flexible for protecting user data privacy, since users maintain their own small models on different devices.

Trend 3: Silicon Photonic Chips

Optoelectronic fusion combines the advantages of photons and electrons, breaking through the limitations of Moore's Law.

Zhou Zhiping (Professor at Peking University; Distinguished Principal Investigator, Shanghai Institute of Optics and Fine Mechanics)

DAMO Academy chose "silicon photonic chips" as one of the ten major technology trends for 2022, which confirms the huge application value of this technology in information and communications.

People's understanding of electrons and photons, and the technological eras that understanding produced, led to the emergence and development of the information society. The information society's preference for small, inexpensive, low-power devices and systems has in turn spawned a wide variety of semiconductor chips.

The earliest microelectronic chips (integrated circuits) benefited from the perfect combination of silicon materials and CMOS devices. With their small size, low cost, and high integration, they are now an integral part of the global economy; they have become a traditional industry, and their pace of development has slowed.

Optoelectronic chips focus on the interaction between photons and electrons and feature multiple channels, large bandwidth, and high speed; they are the key technology supporting high-speed communication. However, because they are fabricated on gallium arsenide and indium phosphide, low cost and high integration are difficult to achieve.

Silicon photonic chips subtly realize the characteristics of optoelectronics on a brand-new silicon-based platform, achieving breakthroughs in cost and integration and promising to greatly improve communication efficiency within data-center clusters, between servers, and even between chips.

A silicon photonic chip is a silicon-based optoelectronic chip: using the design methods and manufacturing processes of integrated circuits, micro- and nano-scale photonic, electronic, and optoelectronic devices are heterogeneously integrated on the same silicon substrate to form a new type of large-scale optoelectronic integrated chip with comprehensive functions. It clearly reflects humanity's continued push in nanotechnology and its deep interest in smaller devices and more compact systems.

Trend 4: Green Energy AI

Artificial intelligence helps integrate green energy at scale, enabling a multi-energy complementary power system.

Zhou Erzhuan (Chief System Architect, China Electric Power Research Institute)

To achieve "carbon peaking and carbon neutrality", the State Grid adheres to the concept of "energy transformation and green development", promoting the transition of energy and electric power from high-carbon to low-carbon and from fossil-based to green-energy-based.

However, green energy is connected to the grid in large quantities, and it is intermittent, random, and fluctuating, giving the large grid more complex nonlinear stochastic characteristics, multi-state-variable coupling, and multi-timescale dynamics. At the same time, the development of UHV AC and DC projects has made the system's form and operating characteristics increasingly complex: the coupling between UHV AC and DC, and between the sending and receiving grids of UHV DC transmission, is intricate, leading to large changes in network structure, operation modes that are changeable and hard to formulate, a growing number of monitored sections, and ever more complex control rules. In short, China's power grid has become strongly coupled and highly volatile. Grid operation faces the challenges of limited adjustable resources and security constraints, placing more refined requirements on power dispatching and the formulation of operation modes.

Therefore, AI has broad application scenarios in the operation of complex new power systems. The China Electric Power Research Institute has carried out some explorations in this area:
(1) At present, a complete online analysis cycle of the power grid takes about 5-7 minutes, but the grid's dynamics change on a timescale of seconds, so in some scenarios the response is too slow. The China Electric Power Research Institute has therefore built a digital twin model of the grid and, combining in-memory computing, parallel computing, graph computing, complex event processing, and AI, constructed an online grid analysis platform with second-level response. It can track the grid's operating state in real time at millisecond speed and provide high-fidelity power-flow section data with millisecond latency for grid analysis applications, offering breakthrough technical support for a next-generation grid control system with second-level response.
(2) At present, dispatch control is still dominated by human memory and experience, with low levels of automation and intelligence. Take the Hunan Power Grid in 2021 as an example: it exchanges power with the outside world through one UHV DC circuit and three 500 kV AC tie lines. To maximize external power receiving capacity, the Qishao DC limit is managed with graded control and dynamic adjustment. The control requirements involve 16 factors, including the Hunan grid's load level, the start-up of thermal power at the load center and in central Hunan, hydro-thermal spinning reserve, and load shedding for security control, with 93 control strategies and very complex rules. The China Electric Power Research Institute has used digital twins, decision maps, and related technologies to digitize these regulations, and the result is in practical production use. It can comprehensively monitor, at millisecond level, all operating modes specified in the grid stability regulations; when operating conditions change, it immediately pushes the limits and alarms for the current mode, shortening response time and effectively reducing the risk of cascading failures.
(3) With Hunan's combination of UHV AC and DC, multi-point external power feed-in, sharded and zoned networking, and large-scale distributed renewable access, the grid's operating characteristics keep changing and the complexity of stability control has risen sharply. Day-ahead operation-mode planning and intra-day adjustment face problems such as one-sided basic information, insufficient quantitative comparison, inadequate professional communication, unclear grid risks, and coarse dispatch decisions, and can no longer meet the requirements of grid analysis and decision-making. Based on digital twins and digitized regulations, the China Electric Power Research Institute has developed a rolling deduction system for grid operation modes, moving operation-mode planning and verification from traditional offline calculation and manual decision-making to online intelligence, and supporting an application platform that visually deduces the entire process of the operating modes required for safety and stability, ensuring the secure, stable operation and reliable supply of a complex large grid. In this process, the Hunan grid faces questions such as how to optimize decisions across all control means; how to predict reservoir levels, coal stocks, supply gaps, and electricity market quotation trends; how to schedule hydropower to ensure secure and economical operation; how to allocate hydro and wind generation; and how to fully absorb green energy. All of these require the support of AI technology.

In general, AI technology is indispensable to the intelligent control and operation of new power systems. With AI support, multiple digital twins are built that interact between the physical grid and IT applications, each solving one scenario or one aspect of grid operation. When there are enough twins to form a digital twin system for grid regulation that covers all aspects of grid operation, intelligent regulation can be achieved.

Trend 7: Global Privacy Computing

To solve the dilemma of data protection and circulation, privacy computing moves from small data to big data.

Ren Kui (Professor at Zhejiang University and Dean of its School of Cyberspace Security)

Data regulatory compliance has become a global trend: China has promulgated the Data Security Law and the Personal Information Protection Law, and the EU has enacted the GDPR (General Data Protection Regulation). With China elevating data to a new factor of production, how to make good use of the value of data has become a question of the times. Privacy computing protects data privacy while still completing computing tasks. If the Internet is the infrastructure of the new era, privacy computing is the key infrastructure security technology that can solve most problems of data protection and secure utilization.

Privacy computing dates back to secure multi-party computation, proposed in 1982, followed by homomorphic encryption, trusted computing, differential privacy, and so on. However, privacy computing had little practical value early on: fully homomorphic encryption, for example, is theoretically appealing, but its performance overhead is too large for practical use. Now, with hardware acceleration and software innovation, we are gradually seeing a trend toward practicality, though there is still a process ahead.
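As a minimal sketch of the oldest of these primitives: additive secret sharing, a building block of secure multi-party computation, lets parties compute a joint sum while no single party ever sees another's raw input (the hospital example below is hypothetical):

```python
# Additive secret sharing: split each value into random shares that sum to
# it modulo a large prime; sums can be computed share-by-share, and only
# the final total is ever reconstructed.
import secrets

Q = 2**61 - 1  # all arithmetic is done modulo a large prime

def share(value, n_parties=3):
    """Split `value` into n random shares that sum to it mod Q."""
    shares = [secrets.randbelow(Q) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % Q)
    return shares

def reveal(shares):
    return sum(shares) % Q

# Two hospitals compute a joint patient total without revealing their own counts:
a_shares, b_shares = share(120), share(75)
sum_shares = [(x + y) % Q for x, y in zip(a_shares, b_shares)]
print(reveal(sum_shares))  # -> 195; no single party ever saw 120 or 75
```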

We are currently focusing on three important technologies: secure multi-party computation, differential privacy, and data desensitization. All three directions are seeing very exciting technical breakthroughs and substantial application scenarios.
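For differential privacy, the textbook primitive is the Laplace mechanism: noise scaled to sensitivity/epsilon is added to a query answer so that no single record's presence is statistically detectable. A minimal sketch follows (the query and data are illustrative):

```python
# Laplace mechanism for a counting query: a count has sensitivity 1, so
# Laplace noise with scale 1/epsilon makes the answer differentially private.
import numpy as np

rng = np.random.default_rng(0)

def private_count(data, predicate, epsilon=0.5):
    true_count = sum(1 for x in data if predicate(x))
    sensitivity = 1.0  # adding/removing one record changes a count by at most 1
    return true_count + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

ages = [23, 35, 41, 29, 62, 57, 33]
print(private_count(ages, lambda a: a > 40))  # noisy answer near the true count, 3
```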

Trend 8: Satellite-Ground Computing



Integrated satellite-ground communication and computing promotes the comprehensive digitization of space, air, land, and sea.

Zhang Ming (Head of the XG Lab, Alibaba DAMO Academy)

The world is in a rising period of satellite Internet development, and the development of commercial aerospace industries in various countries is accelerating.

China attaches great importance to developing satellite launch and manufacturing technology, as well as to integrating satellite communication with cloud computing and Internet technology. In April 2020, satellite Internet, together with 5G, the Internet of Things, and the industrial Internet, was included in the "new infrastructure" category of communication network infrastructure, and low-orbit satellite Internet entered a stage of rapid development. In 2021, the 14th Five-Year Plan called for building globally covering, efficiently operating space infrastructure systems for communication, navigation, and remote sensing, and for constructing a commercial space launch site to further promote commercial aerospace.

Overseas, SpaceX is representative: as of December 2021, the Starlink constellation had launched 1,890 satellites in total. Its public beta ended in October 2021, and it is officially licensed in 12 countries and operating in 16, with more than 100,000 users worldwide. According to the speed-test site Ookla, in the second quarter of 2021 Starlink's download speed averaged 108 Mbps and its network latency averaged 37 ms.

Thanks to the comprehensive improvement in the low-latency, low-cost connectivity of low-orbit satellites, satellite-ground computing (SGC) will also broadly expand applications. Beyond traditional personal communication and Internet services, fixed satellite communication services, and government applications in public security, emergency response, and scientific research, SGC will serve the digital transformation of industries across society, including maritime, aviation, railways, the Internet of Vehicles, agriculture, drones and unmanned equipment, digital government, and smart cities.

For example, in the deep sea, more comprehensive and timely marine information can be obtained through networked control and real-time data analysis; drilling platforms in deserts or oceans can improve safety and efficiency through real-time remote control and data analysis; and in agriculture, data aggregation and analysis can greatly improve control over crop growth to raise yields.

Of course, to achieve truly successful commercial use and large-scale development, satellite-ground computing still requires breakthroughs in many core technologies. Take low-orbit satellite terminals as an example. First, development must be oriented by scenario requirements and commercial value. In terms of key technologies, how can a new millimeter-wave phased-array antenna and the corresponding beamforming control algorithm be designed to meet performance targets at low cost? How can a new satellite-ground communication protocol be designed to meet the multi-user, mobile, complex, and dynamic business needs of the satellite Internet? In addition, many engineering problems in terminal integration and optimization remain to be solved in order to meet the varied needs of sea, land, and air scenarios.
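To make the beamforming point concrete, here is a minimal sketch of delay-and-sum beam steering for a uniform linear array; the element count, spacing, and carrier frequency are illustrative assumptions, not a terminal design:

```python
# Phased-array beam steering: per-element phase weights align signals from
# the desired direction, so the array gain pattern peaks at that angle.
import numpy as np

c = 3e8                       # speed of light, m/s
f = 28e9                      # an illustrative millimeter-wave carrier, Hz
lam = c / f
n_elements = 16
d = lam / 2                   # half-wavelength element spacing
steer_deg = 30.0              # desired beam direction from broadside

k = 2 * np.pi / lam
n = np.arange(n_elements)
weights = np.exp(-1j * k * d * n * np.sin(np.radians(steer_deg)))

# Array gain versus angle: the pattern should peak at 30 degrees.
angles = np.radians(np.linspace(-90, 90, 721))
steering = np.exp(1j * k * d * np.outer(n, np.sin(angles)))
pattern = np.abs(weights @ steering) / n_elements
print(np.degrees(angles[np.argmax(pattern)]))  # -> ~30.0
```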

We believe that satellite-to-ground computing, which combines cloud computing, communication technology and satellite Internet, will become a new generation of infrastructure in the future.

