ApsaraVideo Live provides comprehensive features for ingesting and playing live streams. This topic describes how to ingest and play a live stream.

Basic service process of ApsaraVideo Live

  1. The caster uses a video capture device to collect the content for live streaming and uses the stream ingest SDK to ingest the stream to Alibaba Cloud CDN nodes.
  2. ApsaraVideo Live uses edge ingest to ingest the stream to a specific live center of ApsaraVideo Live. CDN nodes accelerate the ingested stream to ensure the stability of the upstream transmission.
  3. ApsaraVideo Live distributes the live stream from the live center to CDN nodes.
  4. The audience uses the ApsaraVideo Player SDK to watch the ingested stream.


The following list describes the operations required to ingest and play a live stream.

1. Make preparations
   • Implementation channel: Console (ApsaraVideo Live console)
   • Operation:
     1. Create an Alibaba Cloud account
     2. What is Domains?
     3. Activate ApsaraVideo Live and purchase resource plans
2. Add domain names
   • Implementation channel: Console
   • Operation:
     1. Add a domain name
     2. Verify the ownership of a domain name
     3. Configure a CNAME record
   • Related API operation: AddLiveDomain
   • Documentation: Domain names for CDN
3. Bind the domain names
   • Implementation channel: Console or API
   • Operation: Bind the domain names
4. Configure access control policies
   • Implementation channel: Console or API
5. Generate an ingest URL and a streaming URL
   • Implementation channel: Console
   • Operation: Use the URL generator
   • Related API operation: N/A
   • Documentation: Construct an ingest URL and a streaming URL
6. Ingest the stream
   • Implementation channel: Stream ingest tool
   • Operation:
     1. Download and install Open Broadcaster Software (OBS)
     2. Specify the ingest URL and stream name
     For more information, see Push the stream.
   • Related API operation: N/A
   • Documentation: Stream pushing, stream pulling, and streaming
7. Play the stream
   • Implementation channel: Media player
   • Operation:
     1. Download and install the VLC media player
     2. Specify the streaming URL
     For more information, see Play and watch the stream.
   • Related API operation: N/A
   • Documentation: Stream pushing, stream pulling, and streaming
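Step 5 generates the ingest and streaming URLs in the console. As a hedged sketch of what such a URL looks like, the following Python code assembles a signed URL following the common auth_key URL-signing format used by Alibaba Cloud (the signature is the MD5 hash of "URI-timestamp-rand-uid-PrivateKey"); the domain names, AppName, StreamName, and keys are hypothetical placeholders.

```python
import hashlib
import time

def build_signed_url(scheme: str, domain: str, app: str, stream: str,
                     key: str, valid_seconds: int = 1800) -> str:
    """Build a signed ingest or streaming URL using the auth_key scheme.

    The signature is md5("URI-timestamp-rand-uid-PrivateKey"), where URI is
    the path part of the URL ("/AppName/StreamName").
    """
    uri = f"/{app}/{stream}"
    timestamp = int(time.time()) + valid_seconds  # expiry time of the URL
    rand, uid = "0", "0"  # "0" means these fields are not used
    digest = hashlib.md5(
        f"{uri}-{timestamp}-{rand}-{uid}-{key}".encode()).hexdigest()
    return f"{scheme}://{domain}{uri}?auth_key={timestamp}-{rand}-{uid}-{digest}"

# Hypothetical domain names, AppName, StreamName, and keys for illustration.
ingest_url = build_signed_url("rtmp", "push.example.com", "app", "stream01", "ingestKey")
play_url = build_signed_url("http", "play.example.com", "app", "stream01.flv", "playKey")
```

The ingest URL uses the RTMP scheme, while the streaming URL's extension selects the playback protocol (for example, .flv for HTTP-FLV).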

Supported streaming protocols

ApsaraVideo Live allows you to use the Real-Time Messaging Protocol (RTMP) for stream ingest and use the following protocols for live streaming: RTMP, HTTP-FLV, HTTP Live Streaming (HLS), and Advanced Real-Time Communication (ARTC).

  • RTMP

    RTMP is an open protocol developed by Adobe for the transmission of audio, video, and data between Adobe Flash Platform technologies, including Adobe Flash Player and Adobe AIR.

    RTMP can be used for both stream ingest and live streaming. It splits audio and video streams into segments and transmits them in the form of data packets. You can encrypt the packets for transmission over the Internet to ensure confidentiality. However, the process of splitting and assembling data into packets is complicated, which may lead to unstable transmission in high concurrency scenarios.
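As a concrete sketch, an encoder such as ffmpeg can push a local file to an RTMP ingest URL; this simulates a live source without a capture device. The ingest URL below is a hypothetical placeholder, and ffmpeg must be installed for the command to actually run.

```python
import subprocess

# Hypothetical ingest URL; replace with the URL from the URL generator.
ingest_url = "rtmp://push.example.com/app/stream01"

# -re reads the input at its native frame rate (simulating a live source),
# -c copy avoids re-encoding, and -f flv is the container format RTMP expects.
cmd = ["ffmpeg", "-re", "-i", "input.mp4", "-c", "copy", "-f", "flv", ingest_url]

# subprocess.run(cmd, check=True)  # uncomment to start pushing the stream
print(" ".join(cmd))
```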

  • HTTP-FLV

    HTTP-FLV streams audio and video over HTTP in the FLV container format, which was developed by Adobe.

    It encapsulates streams into the FLV format and adds header information to audio and video frames. Thanks to this simple design, HTTP-FLV delivers high concurrency and low latency. HTTP-FLV is not widely supported by mobile browsers, but it is well suited for live streaming applications built into mobile clients.

  • HLS

    HLS is developed by Apple.

    It breaks a video stream into segments, each 5 to 10 seconds in length, and generates an M3U8 playlist to manage the segments. During HLS-based playback, player clients download complete segments, which ensures smooth playback. However, playback usually starts only after a specific number of segments are cached, so the latency is high, typically 10 to 30 seconds. Unlike HTTP-FLV, HLS is widely supported by iPhone and Android browsers and is commonly used for URLs shared in QQ and WeChat.
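To make the mechanism concrete, the following sketch shows what a minimal live M3U8 playlist looks like and how a player reads the segment list from it. The segment names and durations are illustrative only.

```python
# A minimal live M3U8 playlist, as an HLS player would fetch it (illustrative).
playlist = """\
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:7
#EXTINF:10.0,
segment7.ts
#EXTINF:10.0,
segment8.ts
#EXTINF:10.0,
segment9.ts
"""

# During live playback, the player re-downloads this playlist periodically
# and fetches any segment URIs it has not yet played.
segments = [line for line in playlist.splitlines() if line.endswith(".ts")]
print(segments)  # the three segment URIs listed above
```

The need to buffer several complete segments before playback starts is what produces the 10- to 30-second latency described above.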

  • ARTC

    ARTC is developed by Alibaba Cloud for real-time streaming (RTS).

    Alibaba Cloud developed ARTC based on the User Datagram Protocol (UDP), drawing on the interaction design of Web Real-Time Communication (WebRTC). ARTC provides end-to-end live streaming with high concurrency and latency within one second. Interactions can be implemented instantly, and more audio and video features are supported, such as the AAC coding format and B-frames. Because the RTS feature is deployed on CDN nodes, it strikes a good balance among cost, coverage, and capacity, enabling more advanced live streaming.

The following comparison summarizes the preceding protocols.

• RTMP
  - Latency: 1s to 3s
  - Benefit: Low latency
  - Drawbacks:
    • Requires a self-developed player that supports RTMP for iOS
    • Uses non-standard TCP ports
  - Characteristic: TCP-based persistent connection
  - Applicable clients: PCs
  - Applicable scenarios: Live streaming without high requirements for timeliness

• HTTP-FLV
  - Latency: 1s to 3s
  - Benefits:
    • Low latency
    • Allows you to use HTML5 to encapsulate and decapsulate packets for playback
  - Drawback: Requires integration with the ApsaraVideo Player SDK
  - Characteristic: TCP-based persistent connection
  - Applicable clients: PCs
  - Applicable scenarios: Live streaming without high requirements for timeliness

• HLS
  - Latency: More than 10 seconds
  - Benefits:
    • Provides native support for iOS, Android, and HTML5
    • Allows you to use HTML5 to encapsulate and decapsulate packets for playback
  - Drawback: High latency
  - Characteristic: HTTP-based short-lived connections
  - Applicable clients: PCs and mobile clients
  - Applicable scenarios: Live streaming without high requirements for timeliness, mobile clients, and HTML5 players

• ARTC
  - Latency: Within 1s
  - Benefits:
    • Ultra-low latency
    • Excellent response to unstable network connections
  - Drawback: Does not support the AAC coding format or B-frames in HTML5 players. (To address this drawback, you can use the real-time transcoding feature to remove B-frames and generate audio streams in the Opus format.)
  - Characteristic: UDP-based
  - Applicable clients: PCs and mobile clients
  - Applicable scenarios: Live streaming with high requirements for timeliness, such as live streaming sales in e-commerce, online education, and online social networking
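In practice, the scheme and file-name extension of the streaming URL select which of these protocols the player uses. The following is a minimal sketch, assuming the common conventions (the rtmp:// scheme for RTMP, an .flv path over HTTP for HTTP-FLV, an .m3u8 path for HLS, and the artc:// scheme for ARTC); the domain, AppName, and StreamName are hypothetical placeholders.

```python
def playback_urls(domain: str, app: str, stream: str) -> dict:
    """Derive one playback URL per protocol from a domain/app/stream triple."""
    base = f"{domain}/{app}/{stream}"
    return {
        "RTMP": f"rtmp://{base}",       # RTMP scheme
        "HTTP-FLV": f"http://{base}.flv",   # FLV over HTTP
        "HLS": f"http://{base}.m3u8",  # M3U8 playlist over HTTP
        "ARTC": f"artc://{base}",       # real-time streaming (RTS)
    }

urls = playback_urls("play.example.com", "app", "stream01")
```

A player such as VLC can then be pointed at whichever URL matches the latency requirements of the scenario.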