Six major trends in artificial intelligence development
Artificial intelligence is one of the strategic technologies driving social and economic development, and it sits at the core of many industries' growth.
Artificial intelligence is developing at a breathtaking pace, with dramatic changes arriving every day. In this article, Xiaobao offers six insights into the development of artificial intelligence in 2022 and the years beyond, as we embrace the great era of AI together.
Massive models are the general trend of AI development
Though no immortal exiled to the Toad Palace, why fear the Ice Palace's bone-chilling cold? Peeking past the curtain at the golden crow's first light, how many talents gather here?
The poem above is not the work of an ancient Chinese master but of "Yuan 1.0", the world's largest AI language model, recently released by Inspur. Yuan 1.0 has 245.7 billion parameters and was trained on a 5 TB high-quality Chinese dataset obtained with the help of a text analysis model. As a language model, it performs particularly well on Chinese tasks such as reading comprehension, reasoning, and logical judgment.
Why build such massive models? As AI is applied in more and more industry scenarios, AI models are becoming increasingly diverse and complex. Small AI models can handle practical applications in individual industries, but their generality is poor and their accuracy limited: change the application scenario, and a small model may no longer apply.
"How artificial intelligence can develop human-like cognitive abilities such as logic, consciousness, and reasoning is a direction that AI research has long been exploring. At present, training a massive model with a huge number of parameters on large-scale data is considered a promising path toward general artificial intelligence." Academician Wang Endong believes that with the rise of massive models, ever-larger scale has become a very important trend in the future development of artificial intelligence.
Massive models will be the foundation of paradigm innovation. Over the past decade, AI model parameter counts have climbed step by step from tens of millions to hundreds of billions. In 2020, OpenAI released GPT-3, a deep learning model with 175 billion parameters, officially bringing language models into the hundred-billion-parameter era. 2021 saw several more massive models around the world: in the English domain, the MT-NLG model launched by Microsoft and NVIDIA, with 530 billion parameters; in the Chinese domain, Yuan 1.0 with 245.7 billion parameters and ERNIE 3.0 Titan with 260 billion parameters, launched by Baidu and the PCL team; and even models with 1.6 trillion parameters such as Switch Transformer (which uses Mixture of Experts, MoE, rather than a single dense model).
Judging from massive-model R&D in 2021, this trend has only just begun; trillion-parameter single models may well be launched in 2022. At the same time, optimizing the resource usage of massive models will become a new research direction, and models with more parameters trained on fewer GPUs can be expected in the future.
For AI development, once the desired generalization ability is achieved, the smaller the model, the more broadly and flexibly it can adapt to application scenarios. On the road to this ideal, however, it is first necessary to train a large model thoroughly on a large amount of valuable data, gradually optimize it toward the best result, and then train a small model from it to achieve flexible, wide-ranging deployment.
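The large-to-small path described above is often realized in practice through knowledge distillation, where a small "student" model learns to match the softened outputs of a large "teacher" model. The sketch below is a minimal illustration in plain NumPy; the temperature value and logits are hypothetical, not taken from any model named in this article.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields softer distributions."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)   # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """KL divergence between the teacher's and student's softened outputs.

    Minimizing this loss transfers the large model's knowledge
    (relative class similarities) into the smaller student model.
    """
    p = softmax(teacher_logits, T)           # teacher soft targets
    q = softmax(student_logits, T)           # student predictions
    return float(np.sum(p * (np.log(p) - np.log(q))) * T * T)

teacher = [2.0, 1.0, 0.1]
assert distillation_loss(teacher, teacher) == 0.0        # perfect match costs nothing
assert distillation_loss(teacher, [0.1, 1.0, 2.0]) > 0   # mismatch is penalized
```

In a real pipeline this term is combined with the ordinary hard-label loss, so the student learns both from the data and from the teacher's soft targets.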
AI continues to penetrate beyond the edge
Edge computing refers to processing and analyzing data at the edge nodes of a network. An edge node is any node with computing and network resources that sits between the data source and the cloud center. For example, a mobile phone is an edge node between a person and the cloud, and a gateway is an edge node between a smart home and the cloud.
With the rapid development of IoT technology and ever-higher real-time requirements on the business side, edge-side and end-side computing capabilities are becoming more and more important. In the Industrial Internet, for example, data collection demands ever-greater accuracy and timeliness, and the volume of collected data keeps growing. To analyze data in real time, process massive terminal data, and relieve the pressure on cloud network transmission, AI computing power will continue to penetrate toward the edge. Whether the light edge closer to end-side data or the heavy edge closer to the core data center, both will usher in a golden window of development in 2022.
IDC predicts that by 2023, nearly 20 percent of processors will be deployed at the edge to handle AI workloads; 70 percent of enterprises will be running various levels of data processing at the IoT edge.
The terminal side will also usher in golden development opportunities. Terminal intelligence offers high real-time performance, low latency, and strong privacy; in recent years it has developed rapidly and is widely used in face recognition, gesture recognition, image search, interactive games, and more. Although terminal intelligence is still constrained by limited computing power, IDC predicts that the market share of servers used for inference workloads will overtake that of training in the near future and maintain this lead over the forecast period. Moreover, as major vendors keep introducing XPUs with higher computing power, the compute ceiling on terminal intelligence will matter less and less.
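One common way to fit inference workloads within the limited compute and memory of edge and terminal devices is post-training quantization. The sketch below is a minimal, self-contained illustration of the idea (symmetric per-tensor int8 quantization), not any particular vendor's toolchain; the weight values are hypothetical.

```python
import numpy as np

def quantize_int8(weights):
    """Symmetric post-training quantization: map float weights to int8.

    The largest weight magnitude is mapped to 127, so every weight fits
    in one byte -- a 4x size reduction versus float32 that also enables
    faster integer arithmetic on edge hardware.
    """
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 representation."""
    return q.astype(np.float32) * scale

weights = np.array([0.5, -1.27, 0.02, 1.0], dtype=np.float32)
q, scale = quantize_int8(weights)
error = np.max(np.abs(weights - dequantize(q, scale)))
print(q.dtype, error)   # int8 storage with only a small reconstruction error
```

The reconstruction error is bounded by half the scale step, which is why quantized models usually lose little accuracy while gaining substantial speed and memory headroom on terminal devices.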
AI promotes the diversified development of chips
In the field of artificial intelligence, data, algorithms, and computing power are the three key elements, and computing power is inseparable from chips. In recent years, AI applications have become increasingly diverse, so AI chips are showing a diversified development trend, providing a steady stream of power for next-generation computing through continuously evolving architectures.
• From the demand side: With the rapid development of smart cities, smart manufacturing, smart finance, autonomous driving, and other fields, application scenarios for speech recognition, computer vision, and natural language processing are becoming ever more widespread, and enterprise demand for AI chips keeps growing.
• From the supply side: The differentiated use of AI across industries and scenarios has in turn spawned AI chips with differentiated features. The wide application of AI chips and the continuous enrichment of common applications have brought excellent opportunities to manufacturers specializing in AI chip development; companies such as Cambricon and Horizon Robotics have entered the chip industry, accelerating chip R&D and progress.
Changes in both demand and supply drive the diversified and innovative development of the AI chip industry and technology:
From the perspective of technical architecture, AI chips can be roughly divided into two types:
• Algorithm acceleration chips: Built on commonly used chip architectures with added acceleration units for AI algorithms, for example CPU, GPU, ASIC, and DSP architectures used to accelerate existing AI algorithms.
• Adaptive smart chips: These chips are more flexible, able to adjust and change themselves to adapt to new workloads, and may even have some self-learning capability. Examples include neuromorphic chips and software-defined reconfigurable chips.
One idea for solving the excessive power consumption of AI chips is to imitate the working mechanism of biological neurons more closely than mainstream artificial neural networks do; this approach is called "neuromorphic" computing.
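To make the neuromorphic idea concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the basic unit that neuromorphic chips implement in silicon. The parameter values are hypothetical; the point is that the neuron only emits a discrete "spike" event when accumulated input crosses a threshold, which is the event-driven behavior behind the low power consumption described above.

```python
def lif_neuron(input_current, threshold=1.0, leak=0.9, v_reset=0.0):
    """Simulate a leaky integrate-and-fire (LIF) neuron over discrete steps.

    Each step, the membrane potential decays by `leak`, integrates the
    incoming current, and emits a spike (1) on crossing the threshold,
    then resets -- an event-driven scheme, unlike the dense
    multiply-accumulate of mainstream artificial neural networks.
    """
    v = v_reset
    spikes = []
    for current in input_current:
        v = leak * v + current     # leaky integration of the input current
        if v >= threshold:
            spikes.append(1)       # fire a spike
            v = v_reset            # reset the membrane potential
        else:
            spikes.append(0)
    return spikes

# A constant weak input charges the neuron up until it periodically fires.
print(lif_neuron([0.4] * 8))   # [0, 0, 1, 0, 0, 1, 0, 0]
```

Because no work is done between spikes, hardware built around such neurons can stay idle most of the time, in contrast to a GPU that multiplies every weight on every input.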
The AI industry changes by the day, and chip manufacturers must continuously develop and upgrade new chip products to meet the challenge: above all GPUs, but also FPGAs, ASICs, and NPUs.
IDC research found that in the first half of 2021, GPUs remained the first choice for data center acceleration among Chinese AI chips, holding more than 90% of the market, while non-GPU chips such as ASICs, FPGAs, and NPUs are seeing increasing adoption across industries; their combined share is close to 10% and is expected to exceed 20% by 2025.
Neuromorphic chips have the advantages of low power consumption, low latency, and high processing speed, though their industrialization and commercialization are still evolving. Advances in machine learning and in-depth brain research will open up many more possibilities for their further development.
AI chips can be deployed in the cloud, at the edge, and on terminals. The cloud is the center of data and large-scale computing power, responsible for massive data processing and large-scale computation, so cloud AI chips need high storage capacity, high floating-point throughput, and high scalability. To share the cloud's computing load and improve the real-time responsiveness of application scenarios, AI chips will in future be deployed at the edge on a large, decentralized scale, which requires chips with strong adaptability to all kinds of complex scenarios.
With the widespread application of artificial intelligence, the demand for various types of AI chips emerges in an endless stream.
• Computing power requirements: Artificial intelligence must perform massive computation on unstructured data, so AI chips need to provide powerful computing power to handle all kinds of usage scenarios efficiently.
• Efficient cooling: As computing power rises, efficient cooling methods become ever more important. High computing power with low energy consumption is the trend in AI chip development.
• Algorithm flexibility: New algorithms beyond deep learning will emerge in the future, which requires AI chips not only to fit deep learning algorithms but also to accommodate different algorithms.
AI and Cloud Accelerated Convergence
AI and cloud computing are a natural pair, and the combination of AI and cloud computing is the general trend.
Top manufacturers at home and abroad have begun to deploy in the field of cloud intelligence. Baidu Cloud and Alibaba Cloud were upgraded to Baidu AI Cloud and Alibaba Cloud Intelligence respectively, and JD's Cloud and AI division unified the three original brands of JD Cloud, JD AI, and JD IoT under the "JD Zhilian Cloud" brand. Meanwhile, Amazon AWS, Google Cloud, and Microsoft Azure, though they have not added "smart" to their names, have all made AI a strategic priority.
1. AI makes cloud computing stronger, and the most direct way is to increase computing power.
With the rapid development of AI industrialization and the rapid development of new technologies such as IoT, 5G, and edge computing, the amount of data provided has increased rapidly and the data structure has become increasingly complex. The fusion of AI and cloud computing has spawned many new architectural approaches.
• Edge Computing: Forming a new cloud-edge computing architecture
• Multimodal computing that integrates AI capabilities such as vision, speech, and semantics
• Real-time AI applications demanding low latency, such as autonomous driving
2. Cloud computing makes AI stronger. The three elements of AI are computing power, data, and scenarios, and cloud computing can provide all three: it aggregates data, offers large-scale cluster computing capability, and accumulates a large base of customers and application scenarios, all of which greatly facilitate AI.
The integration of AI and cloud is an inevitable trend. AI public cloud services let enterprises deploy AI applications efficiently, obtain AI capabilities on the cloud easily, and access and use AI technology effectively. As AI technology keeps accelerating, adopting AI public cloud services allows rapid, low-cost iteration in the early stages.
Industry leaders are also beginning to deploy private clouds to support their emerging business applications, including artificial intelligence. The trend toward hybrid IT architectures with public cloud, private cloud, and traditional data centers has a significant impact on enterprise technology and business innovation.
Artificial intelligence scenarios blossom in more fields
With the development of artificial intelligence technology, artificial intelligence has gradually realized the deployment and even large-scale application in production and business environments, assisting all walks of life to develop in a more intelligent, green, integrated and diversified direction.
According to IDC's 2021 survey on the adoption of AI technologies, computer vision is currently the most widely applied technology, with video surveillance, image recognition, smart cameras, and face recognition among the most common enterprise applications. Enterprises are expected to deepen their use of technologies such as speech recognition and natural language processing in the future. The chart below shows the proportion of AI scenarios that enterprises have already deployed or plan to deploy within three years.
From the perspective of industry applications, AI application scenarios have moved from fragmentation to deep integration, and from single-point applications to diversified scenarios. AI is widely used in finance, energy, manufacturing, transportation, and more. Specifically:
• The financial industry invests heavily in computing power with wide coverage, and has entered the mature application stage of anti-fraud, risk assessment, intelligent recommendation, etc.
• Manufacturing is developing rapidly in areas such as intelligent supply chains, intelligent interconnection, and intelligent equipment operation and maintenance
• Application scenarios such as visual perception and smart oil fields, constrained by development time, computing power, models, technology, and funding, are still in the early stages of development but have broad prospects
• Answering the call for low energy consumption, low emissions, recyclability, and sustainable development, AI can be applied to green energy and environmental and ecological governance
In addition, IDC comprehensively assessed AI investment, the maturity of application scenarios, and the maturity of data platforms across industries: telecommunications and manufacturing rank among the leaders, while transportation, medical care, energy, and education rank sixth to ninth. The figure below shows the AI penetration of Chinese industries in 2021.
It is clear that putting AI technology into practice brings great value to industry: it not only improves enterprises' operational and production efficiency, but also strengthens their capacity to innovate. In 2022, artificial intelligence will see wider and more comprehensive application, blossoming in more fields and comprehensively advancing social progress and development.
Favorable policies: AI soars like the Kunpeng
As a strategic technology, the AI industry can effectively drive global economic recovery and enterprise innovation during the pandemic and the new normal. AI has attracted great attention from countries around the world, which have accelerated their national strategic plans for it. Driven by favorable policies, AI has developed like the mythical Kunpeng soaring ninety thousand li, and its application in every field continues to deepen.
From 2017 to 2021, China repeatedly formulated policies to encourage AI to develop from basic theoretical research through to industrial application, building up the entire industry chain. The "14th Five-Year Plan" outline lists the new generation of artificial intelligence as one of seven frontier fields for concentrated breakthroughs. It encourages breakthroughs in basic AI theory, the development of dedicated chips, and the construction of open-source algorithm platforms such as deep learning frameworks; promotes innovation in learning, reasoning, and decision-making, image and graphics, speech and video, and natural language recognition and processing; and accelerates the integration of AI with digital information technologies such as big data, the Internet of Things, and edge computing to drive industrial optimization, upgrading, and overall productivity growth.
In August 2021, the Science and Technology Directorate (S&T) of the U.S. Department of Homeland Security (DHS) released its "Artificial Intelligence and Machine Learning Strategic Plan", which set three major goals: advance next-generation AI and machine learning technologies within DHS, increase R&D investment, and leverage these technologies to build a secure cyber infrastructure; facilitate the deployment of existing mature AI and machine learning capabilities across DHS missions; and build and nurture an interdisciplinary AI/ML workforce.
The European Commission's 2020 AI White Paper set a clear vision for AI in Europe: an ecosystem of excellence and trust. In April 2021, the Commission proposed a draft regulation to strengthen oversight of AI technologies. The draft would create a list of "high-risk AI application scenarios" and set new standards, with targeted regulation, for the development and use of AI in applications identified as high-risk, such as critical infrastructure, college admissions, and loan applications.
According to data from the Organisation for Economic Co-operation and Development, more than 60 countries and regions around the world have issued AI policies and development priorities and released national AI strategies, supporting related industries, enhancing the competitiveness and innovation of their enterprises, and actively exploring AI development paths suited to their own needs and advantages.
Knowledge Base Team