
Why Is Serverless the Main Battlefield in the Next Decade?

This blog discusses the trends and challenges of serverless adoption and shares two case studies on Alibaba Cloud's serverless platform, Function Compute.

By Buchen, Alibaba Cloud Serverless Team


"Only keep going beyond can drive us forward."

Buchen has been working at Alibaba Cloud for 10 years. Since joining Alibaba Cloud in 2010, he has participated in the R&D of the Apsara distributed system, served as a Batch Compute architect and the R&D manager of Tablestore (NoSQL), and has been deeply involved in Alibaba Cloud system R&D and product iteration. In 2016, Buchen became the R&D owner of Alibaba Cloud Function Compute, dedicated to building a next-generation elastic and highly available serverless computing platform.

For Buchen and many experts in this field, serverless is the technical challenge to overcome in the next decade. In the serverless wave, Alibaba Cloud has always stood at the forefront, with the richest set of serverless technologies and products in China. "Serverless is still at an early stage in China, so we do not dare to take it lightly. To win this battle, we need to polish our technologies and products and provide a better user experience."

We have interviewed Buchen to explore his ideas about serverless development, technical challenges, and implementation.

Attitude About Serverless

Many predict that cloud computing will become the infrastructure of society and business in the future. By then, cloud computing resources will be available and accessible to the general public, just like water and electricity, without the need to understand the underlying technicalities. For example, when we use water, we do not need to know where it comes from, how it is filtered, or how pipes are laid to transfer it. Likewise, serverless lets you focus on application logic without caring about server-related matters such as management, configuration, and O&M; you simply pay as you go.

As you can see, serverless provides a way to turn cloud computing into social and commercial infrastructure. It is also closer to the cloud-native approach advocated in the industry. Therefore, using cloud computing in a serverless way is only natural.

Developers outside China are more confident about serverless than developers in China. Many companies outside China have built their businesses on the Lambda ecosystem, and some large enterprises in China have gradually started to adopt serverless tools and products. However, a large number of enterprises are still weighing whether to use serverless.

Generally, there is an adaptation period after any new product emerges. Now that a series of serverless products is available, users have many concerns: whether to adopt serverless products, whether to migrate traditional projects to a serverless architecture, and how to migrate them. Enterprises often ask how the security and stability of Function Compute are ensured and whether migrating traditional projects to a serverless architecture incurs significant reconstruction costs and risks. These concerns are reasonable.

We believe these problems will be solved gradually as serverless develops, the function as a service (FaaS) definition is extended, and the toolchain becomes more complete. In theory, problems that can be solved by technology are not problems.

Building a Serverless Architecture

Serverless brings attractive features, such as extreme flexibility, reduced costs, and improved development efficiency. In the traditional mode, developing and launching a business requires team cooperation across code development and integration, joint debugging, resource evaluation, building test and production environments, and debugging, release, and O&M. In the serverless era, developers only need to develop the features or functions they are responsible for and deploy them to the test or production environment; they no longer need to worry about subsequent O&M work.

Generally, database services that enterprises build themselves on ECS instances have lower availability than database services provided by cloud vendors. In addition, the API gateways and data storage services provided by cloud vendors deliver better performance, security, and reliability.

Is Serverless for Everyone?

We do not recommend that small enterprises build a serverless architecture themselves. The core element of serverless is pay-as-you-go, that is, resources are scheduled based on usage. During Double 11, traffic soars to hundred-million-level concurrency. Without a resource pool large enough for such traffic, resources cannot be scheduled for others to use when the peak arrives. If resources cannot be scheduled on demand, serverless cannot be implemented. Therefore, we do not recommend that enterprises without abundant resources build serverless capabilities themselves; they can use public cloud products to adopt serverless instead.

Major vendors are confident that serverless is the final state of cloud computing, or at least a path to it. Serverless solves many problems and is closer to the essence of cloud computing. In addition, no one wants to be left behind in the wave of cloud computing development. Therefore, serverless has become the main battlefield.

How Robust Are Serverless Implementations?

The robustness of serverless implementations lies in the following key factors:

Performance: This covers security, stability, and elasticity. If performance is poor, cloud computing cannot be implemented, let alone serverless. Security, stability, and performance are the prerequisites for implementing serverless.

Functionality: To implement serverless, rich functionality is indispensable. Serverless is more than FaaS, and even FaaS involves many components beyond online execution, such as backend as a service (BaaS), triggers, logs, monitoring, and alerts. Developers will adopt serverless only when its functionality meets their requirements.

Experience: User experience is critical to serverless; it covers the ease of use, stability, and security of functions, product flexibility, and toolchain integrity. Beyond these three factors, I think community, ecosystem, and openness are also important.

What Is Function Compute?

Alibaba Cloud is one of the first public cloud vendors to launch a serverless platform in China; its FaaS product is Function Compute. Function Compute has many highlights in event triggering, supported languages, user experience, and other aspects.

Event triggering: Alibaba Cloud Function Compute can be triggered by Alibaba Cloud services, such as Object Storage Service (OSS), Log Service, Message Service, Tablestore, API Gateway, and CDN. Its unique callback mechanism greatly reduces architectural and coding costs for asynchronous models.

Supported languages: Alibaba Cloud Function Compute supports mainstream development languages, such as Node.js, Java, and Python. It also supports Go, C and C++, Ruby, and Lua under custom runtime environments.

User experience: Alibaba Cloud Function Compute provides a web console and SDK. You can manage function applications in the web console or by running interactive commands.

Service mode: Functions are organized into services and applications. A single function instance can process multiple requests concurrently, which reduces computing resource costs.

Learn more about Function Compute at https://www.alibabacloud.com/help/doc-detail/52895.htm
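As a sketch of the event-triggered model described above, the following minimal Python handler shows what an OSS-triggered function might look like. The `handler(event, context)` signature follows the Function Compute Python convention; the exact field names in the event payload here are an assumption based on the typical OSS trigger format, not something specified in this article:

```python
import json

def handler(event, context):
    # Function Compute passes the trigger payload as a byte string.
    # An OSS trigger is assumed to deliver a JSON document listing events.
    evt = json.loads(event)
    results = []
    for record in evt.get("events", []):
        bucket = record["oss"]["bucket"]["name"]
        key = record["oss"]["object"]["key"]
        # Business logic would go here, e.g. generating a thumbnail
        # for the newly uploaded object.
        results.append(f"processed oss://{bucket}/{key}")
    return json.dumps(results)
```

The platform invokes the handler once per delivered event batch, so the function itself carries no polling or scheduling code; that is exactly the architectural cost the trigger mechanism removes.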

Technical Challenges

To implement serverless, we need to solve many problems: how to quickly and smoothly migrate traditional projects to a serverless architecture, how to build that architecture, how to debug effectively on it, and how to reduce costs. My colleague Xu Xiaobin has described these challenges in his article "The Concept and Challenges of Serverless."

Implementing serverless on a large scale in mainstream scenarios is not easy and faces many challenges. Next, I will analyze these challenges in detail.

Challenge 1: Difficult to Make Businesses Lightweight

To enable auto scaling and pay-as-you-go, a platform must be able to scale out business instances in seconds or even milliseconds. This is challenging to the infrastructure and imposes demanding requirements on businesses, especially large business applications. If it takes 10 minutes to distribute and start an application, auto scaling cannot promptly respond to changes in business traffic.

Challenge 2: High Responsiveness Requirements for the Infrastructure

Once the instances of a serverless application or function can be scaled out in seconds or even milliseconds, the related infrastructure comes under great pressure. The most commonly affected infrastructure is service discovery and log monitoring systems. In the past, instances in a cluster might change a few times per hour; now they can change several times per second. If the responsiveness of these systems cannot keep up with the evolution of instances, the overall experience is greatly compromised.

Challenge 3: Inconsistent Lifecycles Between Business Processes and Containers

A serverless platform relies on a standardized application lifecycle to implement features such as automatic container removal and application self-recovery. In a standard container- and Kubernetes-based system, the platform can only control the container lifecycle. Therefore, business teams must ensure that the business process lifecycle stays consistent with the container lifecycle, including startup, stop, and the semantics of readiness and liveness probes.
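One common way to keep the business process aligned with the container lifecycle is to expose a tiny health endpoint that readiness and liveness probes can poll, and to shut down cleanly when the platform sends SIGTERM before removing the container. The following is a minimal Python sketch under those assumptions; the port, the `/healthz` and `/readyz` paths, and the warm-up signaling are illustrative choices, not prescribed by the article:

```python
import signal
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

ready = threading.Event()  # flipped once the business process has warmed up

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # /healthz answers liveness probes; /readyz answers readiness probes.
        ok = self.path == "/healthz" or (self.path == "/readyz" and ready.is_set())
        self.send_response(200 if ok else 503)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep probe traffic out of the business logs

def serve(port=8080):
    server = HTTPServer(("127.0.0.1", port), HealthHandler)
    if threading.current_thread() is threading.main_thread():
        # Stop in step with the container: the platform sends SIGTERM
        # before removal, so shut the server down cleanly at that point.
        signal.signal(signal.SIGTERM,
                      lambda *_: threading.Thread(target=server.shutdown).start())
    ready.set()  # warm-up finished; start passing readiness checks
    server.serve_forever()
```

With this shape, the platform's probes, not the business code, decide when the instance receives traffic and when it is recycled, which is exactly the lifecycle consistency the challenge describes.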

Challenge 4: Observability Must Be Improved

In serverful mode, if a problem occurs in the production environment, users naturally think of logging on to the server. In serverless mode, users do not need to worry about servers and cannot see them by default. When the system encounters a fault that the platform cannot recover from automatically, comprehensive observability is required. If observability in serverless mode is insufficient, users will not feel secure.

Challenge 5: R&D and O&M Personnel Must Adopt New Habits

When developers deploy their own applications for the first time, almost all of them deploy the applications on a single server or at a single IP address. This is a hard habit to break. When a serverless platform is gradually implemented, developers have to change their thinking to gradually adapt to the fact that the IP address may change at any time. They need to operate and maintain systems based on considerations of service versions and traffic.

Currently, serverless only provides a framework, and many problems remain to be solved. That is why people hesitate to use serverless: they do not see enough successful cases. In fact, Alibaba implemented serverless at scale during Double 11 in 2020. In addition, Alibaba Cloud has led a number of enterprises to use Function Compute to reduce their IT costs. You can learn more about our serverless implementation for Double 11 in the blog Alibaba Cloud Achieved Large-Scale Implementation of Serverless in Core Business Scenarios.

Becoming the Serverless Everyone Needs

Function Compute is widely used in scenarios such as inference, audio and video processing, image and text processing, real-time file processing, and real-time stream processing. It has already won a large number of customers, including Shimo Docs, Mango TV, Sina Weibo, and Malong Technologies.

Case Study: Sina Weibo

Take Sina Weibo as an example. Function Compute handles billions of requests for Weibo per day on average. Its millisecond-level scaling of computing resources keeps application latency controllable, so a surge of visits during a hot event does not affect the user experience. Function Compute also helps Weibo continuously reduce image processing costs: Weibo no longer needs to maintain a large fleet of idle servers to absorb surge traffic at business peaks. Freed from maintaining complex server states, engineers can shift their focus from infrastructure management to cooperating with product teams to increase business value.

Case Study: MosoInk

Not only have Internet companies like Sina implemented serverless; new startups are also joining the serverless camp.

MosoInk is a high-tech company founded by Chinese students after they graduated from colleges and universities in the United States. The company focuses on new technology research and platform operations in digital publishing and mobile learning in the mobile Internet era. With the explosive demand for online education, MosoInk has stepped up efforts to integrate high-quality course resources and continuously expand its business boundaries. However, along with these opportunities, the technical team has faced unprecedented challenges.

Video processing is one of the most difficult problems the MosoInk technical team has encountered. MosoInk processes a large number of teaching videos every day, which involves a series of complex technical tasks such as video editing, structuring, combination, transcoding, resolution adjustment, and client adaptation. Through several years of technical practice, the team built a complete and controllable set of video processing mechanisms based on FFmpeg and other technologies, which supported the rapid development of the business. However, MosoInk engineers did not anticipate this year's business growth. At peak hours, video processing demand reached dozens of times that of previous years, which overwhelmed the existing architecture and seriously affected the user experience.

MosoInk now has three core requirements: reduced costs, extreme flexibility, and zero O&M. Fortunately, these are exactly what serverless is best at. After extensively investigating the serverless services provided by Chinese cloud vendors, the MosoInk technical team concluded that Alibaba Cloud Function Compute is the most suitable solution for its video processing.

Function Compute is compatible with existing code logic and supports mainstream development languages. With these benefits, the MosoInk technical team could seamlessly migrate its code logic from the original architecture to Function Compute at low cost. Function Compute can be connected to an OSS trigger: when new video files are uploaded to OSS, a Function Compute instance is automatically started to run the video processing lifecycle.

By integrating serverless workflows, the MosoInk technical team can centrally orchestrate distributed tasks: slice large files, process the slices concurrently, and merge the results. The team can also schedule computing resources across tens of thousands of instances in a short time to execute video processing tasks quickly.
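The slice-process-merge pattern above can be sketched locally with a thread pool standing in for the fleet of function instances. Here `transcode_segment` is a hypothetical placeholder for the real FFmpeg step, and the pool merely simulates the concurrency a workflow engine would fan out across instances:

```python
from concurrent.futures import ThreadPoolExecutor

def transcode_segment(segment):
    # Hypothetical stand-in for the FFmpeg work one function instance does.
    return segment.upper()

def process_video(segments, parallelism=8):
    # Fan out: each slice is handed to its own (simulated) function instance.
    with ThreadPoolExecutor(max_workers=parallelism) as pool:
        processed = list(pool.map(transcode_segment, segments))
    # Fan in: pool.map preserves input order, so merging is a simple join.
    return "".join(processed)
```

The key property the workflow relies on is that slices can be processed independently and reassembled in order, which is what lets the platform scale the middle step to thousands of instances without changing the merge logic.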

Compared with traditional methods, the serverless solution based on Function Compute reduces MosoInk's IT costs in video processing scenarios by about 60%.

The Main Battlefield in the Next Decade

Serverless is expected to offer more comprehensive product forms, extreme flexibility, better toolchains, lower costs, higher development efficiency, and simpler, faster, and more powerful cloud migration. With serverless, developers can focus on business code in the same pattern and run it on different platforms without considering their differences. In other words, developers can switch between businesses with one method and no learning curve.

For developers, the serverless R&D model also challenges the R&D system. For the frontend team, serverless not only gives frontend engineers more capabilities but also changes the positioning of the entire frontend industry. People often think frontend work is simple: frontend engineers only develop webpages based on UI designs, and the rest is done by the backend team. Once the frontend is integrated with serverless, the frontend team is expected not only to develop webpages but also to deliver the entire application.

The backend team may then wonder whether there is anything left for them to do. In fact, this is not the case. The evolution of the serverless R&D model helps the backend team dive into the underlying layers and focus on technical research, for example, how to make data and service capabilities stronger and more reliable.

Alibaba Cloud is integrating toolchains, community, and product capabilities to provide a solution that promotes serverless development. The goal of Alibaba Cloud serverless is to become "the serverless everyone needs," which distinguishes it from other cloud vendors. Only serverless vendors that prioritize user requirements can deliver good serverless products.

In the future, serverless will be ubiquitous. Any complex technical solution can be delivered as a fully managed, serverless backend service. Not only cloud products but also services, clouds, and ecosystem capabilities from partners and third parties will be delivered in the API-plus-serverless mode. Serverless is the most important part of the platform strategies of products and organizations that expose functionality through APIs, such as DingTalk, Didi, and WeChat.


Alibaba Clouder

2,631 posts | 635 followers

