This topic describes how to implement stream ingest over Real-Time Streaming (RTS).
Overview
To improve stream ingest performance in poor network conditions, Push SDK supports stream ingest over RTS, which is based on Real-Time Communication (RTC), in addition to traditional stream ingest over Real-Time Messaging Protocol (RTMP). For a comparison of the two stream ingest methods, see Differences between RTS and standard streaming. You can play an RTS stream over the RTMP, Flash Video (FLV), or HTTP Live Streaming (HLS) protocol, or use the ApsaraVideo Real-Time Communication (ARTC) protocol for RTS-based playback.
The billing of RTS is different from that of standard streaming. For more information, see Billing of RTS.
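For reference, the following hypothetical URLs illustrate the ingest and playback schemes described above. The domain names, application name, and stream name are placeholders; replace them with your own values.
Ingest over RTS:     artc://push.example.com/app/stream
Playback over RTMP:  rtmp://play.example.com/app/stream
Playback over FLV:   http://play.example.com/app/stream.flv
Playback over HLS:   http://play.example.com/app/stream.m3u8
Playback over ARTC:  artc://play.example.com/app/stream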
Prerequisites
The required operations, such as adding domain names, configuring CNAME records for the domain names, and associating the domain names, are completed in the ApsaraVideo Live console or by calling the ApsaraVideo Live API. For more information, see Get started with ApsaraVideo Live.
Limits
Take note of the following limits on stream ingest over RTS (a configuration sketch that matches the audio limits is provided after the list):
Only mono audio is supported.
Only the Low Complexity (AAC-LC) profile is supported for audio encoding.
The audio sampling rate must be 48 kHz.
Automatic ingest of a placeholder image in case of poor network conditions is not supported.
Audio-only and video-only stream ingest is not supported.
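As referenced above, these audio limits must be matched in the Push SDK configuration before you start stream ingest. The following is a minimal sketch for Push SDK for Android; the class, setter, and enumeration names (AlivcLivePushConfig, setAudioChannels, setAudioSampleRate, setAudioProfile, and the Alivc* enums) are assumptions for illustration, so verify them against the API reference of your SDK version.
// Hypothetical configuration sketch: align audio settings with the RTS limits listed above.
// The setter and enum names are assumptions; check the API reference of your Push SDK version.
AlivcLivePushConfig pushConfig = new AlivcLivePushConfig();
pushConfig.setAudioChannels(AlivcAudioChannelEnum.AUDIO_CHANNEL_ONE);            // mono audio only
pushConfig.setAudioSampleRate(AlivcAudioSampleRateEnum.AUDIO_SAMPLERATE_48000);  // 48 kHz sampling rate
pushConfig.setAudioProfile(AlivcAudioAACProfileEnum.AAC_LC);                     // AAC-LC encoding profile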
Implementation
Enable the RTS feature. For more information, see Enable RTS.
Generate an ingest URL. For more information, see Generate ingest and streaming URLs.
Specify the ingest URL that starts with artc:// in Push SDK to ingest the stream. Sample code:
mAlivcLivePusher.startPush(mPushUrl);
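For context, the following is a hedged end-to-end sketch of the calls that typically surround startPush in Push SDK for Android. Only startPush with an artc:// ingest URL comes from the step above; the configuration object, the init call, and the variable names are assumptions for illustration, so verify them against your SDK version.
// Hypothetical sketch; class and method names other than startPush are assumptions.
AlivcLivePushConfig mAlivcLivePushConfig = new AlivcLivePushConfig(); // assumed config class
AlivcLivePusher mAlivcLivePusher = new AlivcLivePusher();
mAlivcLivePusher.init(mContext, mAlivcLivePushConfig);                // assumed initialization call
// The ingest URL must start with artc:// for stream ingest over RTS.
String mPushUrl = "artc://push.example.com/app/stream";               // hypothetical ingest URL
mAlivcLivePusher.startPush(mPushUrl);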