
ApsaraVideo Live:Use Push SDK for Flutter

Last Updated:Dec 18, 2025

This topic introduces how to use Push SDK for Flutter with examples, covering its core interfaces and basic workflow.

Features

  • Supports stream ingest over Real-Time Messaging Protocol (RTMP). 

  • Supports stream ingest over Alibaba Real-Time Communication (ARTC) based on User Datagram Protocol (UDP).

  • Uses H.264 for video encoding and AAC for audio encoding.

  • Supports custom configurations for bitrate control, resolution, and display mode.

  • Supports various camera operations.

  • Supports real-time retouching and custom retouching effects. 

  • Allows you to add and remove animated stickers as watermarks. 

  • Supports external audio and video inputs in different formats such as YUV and pulse-code modulation (PCM).

  • Supports ingest of audio-only and video-only streams and stream ingest in the background.

  • Supports background music. 

  • Supports video snapshot capture. 

  • Supports automatic reconnection and error handling. 

  • Supports Automatic Gain Control (AGC), Automatic Noise Reduction (ANR), and Acoustic Echo Cancellation (AEC) algorithms. 

  • Allows you to switch between software and hardware encoding modes for video. This improves the stability of the encoding module. 

Limitations

Take note of the following limits before you use Push SDK for Flutter:

  • You must configure screen orientation before stream ingest. You cannot rotate the screen during live streaming. 

  • You must disable auto screen rotation for stream ingest in landscape mode. 

  • In hardware encoding mode, the value of the output resolution must be a multiple of 16 to be compatible with the encoder. For example, if you set the resolution to 540p, the output resolution is 544 × 960. You must scale the screen size of the player based on the output resolution to prevent black bars. 
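
For reference, the following is a minimal sketch of how you might derive the 16-pixel-aligned output size for a requested resolution and scale the player view accordingly. The alignTo16 helper and the portrait 540p values are illustrative, not part of the SDK.

/// Illustrative helper: round a dimension up to the next multiple of 16.
int alignTo16(int value) => ((value + 15) ~/ 16) * 16;

void previewAlignment() {
  // Requested portrait 540p ingest resolution.
  const int requestedWidth = 540;
  const int requestedHeight = 960;

  // In hardware encoding mode, the encoder outputs 16-aligned dimensions.
  final int outputWidth = alignTo16(requestedWidth);   // 544
  final int outputHeight = alignTo16(requestedHeight); // 960

  // Scale the player view to this aspect ratio to prevent black bars.
  print('Output resolution: $outputWidth x $outputHeight');
}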

API reference

Push SDK for Flutter API reference

Procedure

  1. Register the SDK

  2. Configure stream ingest parameters

  3. Start stream ingest

Feature usage

Register the SDK

Before registration, you must configure the SDK license. Push SDK for Flutter supports the unified license. To apply for and configure a license, see Integrate a Push SDK license.

Then, register the SDK before stream ingestion.

  1. Register the SDK.

    AlivcLiveBase.registerSDK();
  2. Set a listener for registration.

    AlivcLiveBase.setListener(AlivcLiveBaseListener(
      onLicenceCheck: (AlivcLiveLicenseCheckResultCode result, String reason) {
        if (result == AlivcLiveLicenseCheckResultCode.success) {
          /// The SDK is registered.
        }
      },
    ));
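If later setup steps depend on a successful license check, you can gate them on the registration callback. The following is a minimal sketch that uses a Completer of your own; the Completer and the registerPushSDK function are illustrative and not part of the SDK.

import 'dart:async';

/// Completes when the license check succeeds (illustrative helper, not an SDK API).
final Completer<void> sdkRegistered = Completer<void>();

void registerPushSDK() {
  // Set the listener before registration so the license check result is not missed.
  AlivcLiveBase.setListener(AlivcLiveBaseListener(
    onLicenceCheck: (AlivcLiveLicenseCheckResultCode result, String reason) {
      if (result == AlivcLiveLicenseCheckResultCode.success && !sdkRegistered.isCompleted) {
        sdkRegistered.complete();
      }
    },
  ));
  AlivcLiveBase.registerSDK();
}

/// Elsewhere, before you configure and start stream ingest:
/// await sdkRegistered.future;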

Configure stream ingest parameters

All basic parameters have default values. We recommend that you use the default values.

/// Create an AlivcLivePusher instance.
AlivcLivePusher livePusher = AlivcLivePusher.init();

/// Create a Config object to associate AlivcLivePushConfig with AlivcLivePusher.
livePusher.createConfig();

/// Create an AlivcLivePushConfig instance.
AlivcLivePushConfig pusherConfig = AlivcLivePushConfig.init();

/// Configure stream ingest parameters.
/// Set the resolution to 540P.
pusherConfig.setResolution(AlivcLivePushResolution.resolution_540P);
/// Specify the frame rate. We recommend that you set it to 20 frames per second (FPS).
pusherConfig.setFps(AlivcLivePushFPS.fps_20);
/// Specify whether to enable adaptive bitrate streaming. The default value is true.
pusherConfig.setEnableAutoBitrate(true);
/// Specify the group of pictures (GOP) size. A larger value indicates a higher latency. We recommend that you set it to a number from 1 to 2.
pusherConfig.setVideoEncodeGop(AlivcLivePushVideoEncodeGOP.gop_2);
/// Specify the reconnection duration. The value cannot be less than 1000. Unit: milliseconds. We recommend that you use the default value.
pusherConfig.setConnectRetryInterval(2000);
/// Disable the mirroring mode for preview.
pusherConfig.setPreviewMirror(false);
/// Set the stream orientation to portrait.
pusherConfig.setOrientation(AlivcLivePushOrientation.portrait);

Start stream ingest

  1. Create a livePusher engine.

    livePusher.initLivePusher();
  2. Register listeners for stream ingest events.

    /// Set the listener for the stream ingest status.
    livePusher.setInfoDelegate();
    /// Set the listener for stream ingest errors.
    livePusher.setErrorDelegate();
    /// Set the listener for the network status during stream ingest.
    livePusher.setNetworkDelegate();
  3. Configure callbacks related to stream ingest.

    /// Listener for stream ingest errors
    /// Configure the callback for SDK errors
    livePusher.setOnSDKError((errorCode, errorDescription) {});
    /// Configure the callback for system errors
    livePusher.setOnSystemError((errorCode, errorDescription) {});
    
    /// Listener for the stream ingest status
    /// Configure the callback for preview start
    livePusher.setOnPreviewStarted(() {});
    /// Configure the callback for preview stop
    livePusher.setOnPreviewStoped(() {});
    /// Configure the callback for first frame rendering
    livePusher.setOnFirstFramePreviewed(() {});
    /// Configure the callback for start of stream ingest
    livePusher.setOnPushStarted(() {});
    /// Configure the callback for pause of camera stream ingest
    livePusher.setOnPushPaused(() {});
    /// Configure the callback for resume of camera stream ingest
    livePusher.setOnPushResumed(() {});
    /// Configure the callback for restart of stream ingest
    livePusher.setOnPushRestart(() {});
    /// Configure the callback for end of stream ingest
    livePusher.setOnPushStoped(() {});
    
    /// Listener for the network status during stream ingest
    /// Configure the callback for failed connections
    livePusher.setOnConnectFail((errorCode, errorDescription) {});
    /// Configure the callback for network recovery
    livePusher.setOnConnectRecovery(() {});
    /// Configure the callback for disconnection
    livePusher.setOnConnectionLost(() {});
    /// Configure the callback for poor network connections
    livePusher.setOnNetworkPoor(() {});
    /// Configure the callback for failed reconnections
    livePusher.setOnReconnectError((errorCode, errorDescription) {});
    /// Configure the callback for reconnection start
    livePusher.setOnReconnectStart(() {});
    /// Configure the callback for successful reconnection
    livePusher.setOnReconnectSuccess(() {});
  4. Create a preview view for stream ingest.

    var x = 0.0; // The custom value
    var y = 0.0; // The custom value
    var width = MediaQuery.of(context).size.width; // The custom value
    var height = MediaQuery.of(context).size.height; // The custom value
    AlivcPusherPreview pusherPreviewView = AlivcPusherPreview(
          onCreated: _onPusherPreviewCreated,
          x: x,
          y: y,
          width: width,
          height: height);
      return Container(
            color: Colors.black,
            width: width,
            height: height,
            child: pusherPreviewView);
  5. Start preview.

    /// Callback for preview creation
    _onPusherPreviewCreated(id) {
         /// Start preview
        livePusher.startPreview();
    }
    Note

    Assume that the screen orientation of the Flutter project is portrait, and you call setOrientation to set the screen orientation to landscape. After you create a preview and call startPreview to start preview, the video may not fill the preview window. We recommend that you add a short delay before invoking startPreview.

    For example: await Future.delayed(Duration(milliseconds: 100));
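
    Applied to the callback in the previous step, a minimal sketch looks as follows. The 100 ms value is only illustrative; tune it for your project.

    /// Callback for preview creation with a short delay before preview starts.
    _onPusherPreviewCreated(id) async {
      /// Let the platform view settle when the ingest orientation differs from
      /// the app's screen orientation. The delay value is illustrative.
      await Future.delayed(Duration(milliseconds: 100));
      livePusher.startPreview();
    }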

  6. Start stream ingest. You can start stream ingest only after the preview succeeds (see the sketch after the following notes).

    String pushURL = "Test ingest URL (rtmp://......)"; 
    livePusher.startPushWithURL(pushURL);
    Note
    • RTMP and RTS (artc://) ingest URLs are supported. To generate ingest URLs, see Generate ingest and streaming URLs.

    • ApsaraVideo Live does not support ingesting multiple streams to the same URL simultaneously. The second ingest request will be rejected.
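
    Because ingest must not start before preview succeeds, one option is to start it from the preview callback that you registered earlier. The following is a minimal sketch; the pushURL value is a placeholder.

    String pushURL = "Test ingest URL (rtmp://......)";
    /// Start ingest only after preview has started successfully.
    livePusher.setOnPreviewStarted(() {
      livePusher.startPushWithURL(pushURL);
    });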

Ingest-related methods

/// Pause stream ingest from the camera. You can call setPauseImg to configure the image displayed during the pause. Then, call the pause method to switch from camera feeds to the specified image. The audio stream continues to be ingested.
livePusher.pause();
/// Resume stream ingest to switch from image to camera feeds. The audio stream continues to be ingested.
livePusher.resume();
/// Stop a stream that is being ingested.
livePusher.stopPush();
/// Stop preview. However, this operation does not take effect for a stream that is being ingested. When preview is stopped, the preview window is frozen at the last frame.
livePusher.stopPreview();
/* Restart stream ingest when the stream is being ingested or when an error callback is received. All resources in AlivcLivePusher are reinitialized, including preview and ingestion. If an error occurs, you can call this method or the reconnectPushAsync method to restart stream ingest. You can also call the destroy method to destroy the stream ingest instance. */
livePusher.restartPush();
/* Reconnect and resume ingest of the RTMP stream when the stream is being ingested or when a network error callback (registered by using setNetworkDelegate) is received. In the error state, you can also call the destroy method to dispose of the instance. */
livePusher.reconnectPushAsync();
/// Stop stream ingest and preview. After you call this method, all resources related to AlivcLivePusher are disposed.
livePusher.destroy();
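
A common pattern is to pause camera ingest when the app moves to the background and resume it in the foreground. The following is a minimal sketch that uses Flutter's WidgetsBindingObserver; the PusherLifecycleObserver class and its wiring are assumptions, not part of the SDK.

import 'package:flutter/widgets.dart';
// Plugin imports are omitted; use the import paths from your plugin integration.

/// Illustrative observer that pauses camera ingest in the background and
/// resumes it in the foreground. Audio continues to be ingested while paused.
class PusherLifecycleObserver extends WidgetsBindingObserver {
  PusherLifecycleObserver(this.livePusher);
  final AlivcLivePusher livePusher;

  @override
  void didChangeAppLifecycleState(AppLifecycleState state) {
    if (state == AppLifecycleState.paused) {
      livePusher.pause();   // Viewers see the image set by setPauseImg.
    } else if (state == AppLifecycleState.resumed) {
      livePusher.resume();  // Switch back to camera feeds.
    }
  }
}

/// Register the observer, for example in initState:
/// WidgetsBinding.instance.addObserver(PusherLifecycleObserver(livePusher));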

Camera-related methods

/// Switch between the front and rear cameras.
livePusher.switchCamera();
/// Enable or disable flash. You cannot enable flash for the front camera.
livePusher.setFlash(false);

/// Adjust the focal length to zoom in or out. If you set the input parameter to a positive number, the system increases the focal length. If you set the input parameter to a negative number, the system decreases the focal length.
double max = await livePusher.getMaxZoom();
livePusher.setZoom(min(1.0, max));

/// Configure manual focus.
/// The autoFocus parameter specifies whether to enable autofocus. This parameter takes effect only for this call. Whether autofocus is enabled depends on the setAutoFocus method.
double pointX = 50.0; // The custom value
double pointY = 50.0; // The custom value
bool autoFocus = true;
livePusher.focusCameraAtAdjustedPoint(pointX, pointY, autoFocus);

/// Disable autofocus.
livePusher.setAutoFocus(false);
/// Disable the mirroring mode for preview.
livePusher.setPreviewMirror(false);
/// Disable the mirroring mode for stream ingest.
livePusher.setPushMirror(false);
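
Because flash cannot be enabled for the front camera, it helps to track which camera is active when you switch. The following is a minimal sketch; the _isFrontCamera flag is your own state, not an SDK property, and its initial value depends on your default camera.

bool _isFrontCamera = false; // Illustrative flag that mirrors switchCamera calls.

void toggleCamera() {
  livePusher.switchCamera();
  _isFrontCamera = !_isFrontCamera;
  if (_isFrontCamera) {
    // Flash cannot be enabled for the front camera, so force it off.
    livePusher.setFlash(false);
  }
}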

Ingest an image

Push SDK for Flutter supports ingesting an image when the application is switched to the background or the bitrate is low. 

When the application is switched to the background, video stream ingest is paused by default, and only the audio stream is ingested. Streamers can display an image, informing viewers that the streamer is away and will be back soon.

/// Specify the image that is ingested during the pause.
String pauseImagePath = "xxxx"; // xxxx specifies the path of the image.
pusherConfig.setPauseImg(pauseImagePath);

You can also specify an image to be ingested in poor network conditions. When the bitrate is low, the image is displayed to prevent stuttering. Sample code:

/// Specify the image that is ingested in poor network conditions.
String networkPoorImagePath = "xxxx"; // xxxx specifies the path of the image.
pusherConfig.setNetworkPoorImg(networkPoorImagePath);

Configure preview display mode

Push SDK for Flutter supports the following preview modes. The preview mode does not affect stream ingest.

  • AlivcPusherPreviewDisplayMode.preview_scale_fill: The video fills the preview window. If the aspect ratios of the video and preview window are inconsistent, video deformation occurs.

  • AlivcPusherPreviewDisplayMode.preview_aspect_fit: The aspect ratio of the video is preserved. If aspect ratios differ, black bars appear on the preview window.

  • AlivcPusherPreviewDisplayMode.preview_aspect_fill: The video is cropped to fit the preview window when aspect ratios differ.

Sample code:

/// Set the preview display mode.
pusherConfig.setPreviewDisplayMode(AlivcPusherPreviewDisplayMode.preview_aspect_fit);

Configure video quality

Push SDK for Flutter supports the following video quality modes: Resolution Priority, Fluency Priority, and custom.

Important

To configure video quality, you must enable bitrate control: pusherConfig.setEnableAutoBitrate(true);

Resolution Priority (default)

In this mode, the SDK automatically configures bitrate parameters to ensure the video quality.

pusherConfig.setQualityMode(AlivcLivePushQualityMode.resolution_first);

Fluency Priority

In this mode, the SDK automatically configures bitrate parameters to ensure the smoothness of the ingested video stream.

pusherConfig.setQualityMode(AlivcLivePushQualityMode.fluency_first);

Custom mode

In custom mode, the SDK configures bitrate based on the values that you specify, including initial, minimum, and target bitrates.

  • TargetVideoBitrate: In good network conditions, the bitrate is gradually increased to the target bitrate to improve the video quality.

  • MinVideoBitrate: In poor network conditions, the bitrate is gradually reduced to the minimum to prevent stuttering.

  • InitialVideoBitrate: The initial bitrate when a live stream starts.

pusherConfig.setQualityMode(AlivcLivePushQualityMode.custom);
pusherConfig.setInitialVideoBitrate(1000);
pusherConfig.setMinVideoBitrate(600);
pusherConfig.setTargetVideoBitrate(1400);

When you configure bitrates, refer to the recommended settings provided by Alibaba Cloud: 

  • Recommended settings for Resolution Priority mode

    Resolution    Initial bitrate (Kbit/s)    Minimum bitrate (Kbit/s)    Target bitrate (Kbit/s)
    360P          600                         300                         1000
    480P          800                         300                         1200
    540P          1000                        600                         1400
    720P          1500                        600                         2000
    1080P         1800                        1200                        2500

  • Recommended settings for Fluency Priority mode

    Resolution    Initial bitrate (Kbit/s)    Minimum bitrate (Kbit/s)    Target bitrate (Kbit/s)
    360P          400                         200                         600
    480P          600                         300                         800
    540P          800                         300                         1000
    720P          1000                        300                         1200
    1080P         1500                        1200                        2200

Configure adaptive resolution

The SDK supports dynamically adjusting the resolution of an ingested stream. When the feature is enabled, the resolution is automatically reduced to ensure the smoothness and quality in poor network conditions. Sample code:

/// Enable adaptive resolution.
pusherConfig.setEnableAutoResolution(true);
Important
  • Adaptive resolution takes effect only when the video quality mode is set to Resolution Priority or Fluency Priority.

  • Some players may not support dynamic resolution. We recommend that you use ApsaraVideo Player.
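
For example, the following minimal configuration sketch satisfies both requirements above (bitrate control enabled and a supported quality mode) before enabling adaptive resolution. All calls are the configuration methods shown earlier in this topic.

/// Adaptive resolution requires bitrate control and the Resolution Priority
/// or Fluency Priority quality mode.
pusherConfig.setQualityMode(AlivcLivePushQualityMode.resolution_first);
pusherConfig.setEnableAutoBitrate(true);
pusherConfig.setEnableAutoResolution(true);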

Configure background music

/// Start the playback of background music.
String musicPath = "xxxx"; // xxxx specifies the path in which the music resources are stored.
livePusher.startBGMWithMusicPathAsync(musicPath);
/// Stop the playback of background music. If you want to change the background music, call the method that is used to start the playback of background music. You do not need to stop the playback of the current background music.
livePusher.stopBGMAsync();
/// Pause the playback of background music. You can call this method only after the playback of background music starts.
livePusher.pauseBGM();
/// Resume the playback of background music. You can call this method only after the playback of background music is paused.
livePusher.resumeBGM();
/// Enable looping.
livePusher.setBGMLoop(true);
/// Configure denoising. When enabled, the system filters out non-vocal parts from the collected audio. This feature may slightly reduce the volume of the human voice. We recommend that you allow your users to determine whether to enable this feature. By default, this feature is disabled.
livePusher.setAudioDenoise(true);
/// Configure in-ear monitoring. In-ear monitoring is suitable for karaoke scenarios. When enabled, headphone users can hear their voice. When disabled, they cannot hear their voice on headphones. This parameter does not take effect if no headphones are detected. 
livePusher.setBGMEarsBack(true);
/// Specify the volume of the background music in the mixed audio.
livePusher.setBGMVolume(50); // Valid values: 0 to 100. Default value: 50
/// Specify the volume of the human voice in the mixed audio.
livePusher.setCaptureVolume(50); // Valid values: 0 to 100. Default value: 50
/// Configure muting. If you enable this feature, the background music and human voice are muted. To separately mute the background music or human voice, call the method that is used to configure the volume.
livePusher.setMute(true);

Configure callbacks related to background music:

/// Configure the callback for end of playback of background music.
livePusher.setOnBGMCompleted(() {});
/// Configure the callback for timeout of the download of background music.
livePusher.setOnBGMDownloadTimeout(() {});
/// Configure the callback for failed playback of background music.
livePusher.setOnBGMOpenFailed(() {});
/// Configure the callback for paused playback of background music.
livePusher.setOnBGMPaused(() {});
/// Configure the callback for playback progress.
livePusher.setOnBGMProgress((progress, duration) {});
/// Configure the callback for resumed playback of background music.
livePusher.setOnBGMResumed(() {});
/// Configure the callback for start of playback of background music.
livePusher.setOnBGMStarted(() {});
/// Configure the callback for stop of playback of background music.
livePusher.setOnBGMStoped(() {});
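
For example, to chain tracks you can start the next track directly from the completion callback; as noted above, you do not need to stop the current track first. The following is a minimal sketch in which the playlist and index are your own state, not part of the SDK.

/// Illustrative playlist state.
final List<String> bgmPlaylist = ["path/to/track1", "path/to/track2"];
int currentTrack = 0;

void playNextTrackOnCompletion() {
  livePusher.setOnBGMCompleted(() {
    currentTrack = (currentTrack + 1) % bgmPlaylist.length;
    // Starting playback of a new path switches the background music directly.
    livePusher.startBGMWithMusicPathAsync(bgmPlaylist[currentTrack]);
  });
}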

Capture snapshots

/// Capture a snapshot.
String dir = "xxxx"; // xxxx specifies the path in which snapshots are stored.
if (Platform.isIOS) {
    /// dir parameter: On iOS, the path is a relative path. A custom directory is automatically generated in the system sandbox. If you set this parameter to "", snapshots are stored in the root directory of the system sandbox.
    /// dirTypeForIOS parameter: Optional. If you do not specify this parameter, snapshots are stored in the [document] directory of the system sandbox.
    livePusher.snapshot(1, 0, dir, dirTypeForIOS: AlivcLiveSnapshotDirType.document);
} else {
    livePusher.snapshot(1, 0, dir);
}
/// Set the listener for snapshot capture. You can call this method only after you call snapshot.
livePusher.setSnapshotDelegate();

/// Configure callbacks related to snapshot capture.
livePusher.setOnSnapshot((saveResult, savePath, {dirTypeForIOS}) {
  	// The callback that is triggered when a snapshot is stored.
    if (saveResult == true) {
      if (Platform.isIOS) {
        // Construct the full path of snapshots in the system sandbox. Format: dirTypeForIOS + savePath.
      } else {
        // Obtain the path of snapshots based on the value of savePath.
      }
    }
  });
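
On iOS, the savePath in the callback is relative to the sandbox directory that you selected. If you need an absolute path, you can resolve it yourself, for example with the path_provider package; this package is an assumption of the sketch and is not bundled with Push SDK. The sketch covers the document directory case (AlivcLiveSnapshotDirType.document).

import 'dart:io';
import 'package:path_provider/path_provider.dart'; // Assumed dependency.

/// Illustrative helper: resolve the absolute path of a snapshot saved under
/// the iOS document sandbox directory.
Future<String> resolveSnapshotPath(String savePath) async {
  final Directory docDir = await getApplicationDocumentsDirectory();
  return '${docDir.path}/$savePath';
}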

Configure watermarks

Push SDK for Flutter supports adding one or more watermarks in the PNG format. Sample code: 

String watermarkBundlePath = "xxxx"; //xxxx specifies the path in which the watermark image is stored.
double coordX = 0.1;
double coordY = 0.1;
double width = 0.3;
/// Add a watermark.
livePusher.addWatermark(watermarkBundlePath, coordX, coordY, width);

Where:

  • coordX and coordY are relative values that determine the watermark position. coordX=0.1 specifies that the left edge of the watermark is positioned at 10% of the stream width. When the resolution is 540 x 960, the x position is 540 x 0.1 = 54 pixels.

  • width specifies the watermark width relative to stream width. The height is proportionally scaled.

Note
  • To add a text watermark, convert the text into a PNG image, then call this method to add the image as a watermark.

  • To ensure the clarity and edge smoothness of the watermark, we recommend that you use a source image whose size matches your configuration. For example, if the resolution of the output video is 544 × 960 and the watermark width is set to 0.1, the recommended source image width is 544 × 0.1 = 54.4 pixels.
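
Based on the rules above, the following minimal sketch converts the relative watermark parameters into pixel values for a given output resolution. The helper name is illustrative, and coordY is assumed to be relative to the stream height.

/// Illustrative helper: convert relative watermark parameters to pixels.
void printWatermarkGeometry({
  required int streamWidth,   // For example, 540.
  required int streamHeight,  // For example, 960.
  required double coordX,     // Relative x of the left edge, for example 0.1.
  required double coordY,     // Assumed to be relative to the stream height.
  required double width,      // Relative watermark width, for example 0.3.
}) {
  final double xPixels = streamWidth * coordX;     // 540 * 0.1 = 54 pixels.
  final double yPixels = streamHeight * coordY;
  final double widthPixels = streamWidth * width;  // Height scales proportionally.
  print('x=$xPixels px, y=$yPixels px, width=$widthPixels px');
}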

Push external audio/video sources

Push SDK for Flutter supports ingesting external audio/video sources, such as a video file.

Before ingestion, enable custom audio and video input.

/// Enable custom audio and video input.
pusherConfig.setExternMainStream(true);
/// Specify the color format for video data. In this example, YUVNV21 is used. You can also use other formats based on your business requirements.
pusherConfig.setExternVideoFormat(AlivcLivePushVideoFormat.YUVNV21);
/// Specify the bit depth format for audio data. In this example, S16 is used. You can also use other formats based on your business requirements.
pusherConfig.setExternAudioFormat(AlivcLivePushAudioFormat.S16);

After you enable custom audio and video stream input, you can push external audio and video streams.

Push external video data

/// Only continuous buffer data in the YUV or RGB format can be sent by using the sendVideoData method. You can specify the video buffer, length, width, height, timestamp, and rotation angle.
Uint8List bufferData = xxxx; // xxxx indicates the continuous video buffer data in the Uint8List format.
int width = 720; // The video width.
int height = 1280; // The video height.
int dataSize = xxxx; // xxxx indicates the size of the data.
int pts = xxxx; // xxxx indicates the timestamp in microseconds.
int rotation = 0; // The rotation angle.
livePusher.sendVideoData(bufferData, width, height, dataSize, pts, rotation);

Push external audio data

/// Only continuous buffer data in the PCM format can be sent by using the sendPCMData method. You can specify the audio buffer, length, and timestamp.
Uint8List bufferData = xxxx; // xxxx indicates the continuous audio buffer data in the Uint8List format.
int dataSize = xxxx; // xxxx indicates the size of the data.
int sampleRate = xxxx; // xxxx indicates the audio sample rate.
int channel = 1; // The number of sound channels. For example, 1 for mono.
int pts = xxxx; // xxxx indicates the timestamp in microseconds.
livePusher.sendPCMData(bufferData, dataSize, sampleRate, channel, pts);
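
For example, the following minimal sketch splits a 16-bit mono PCM buffer into 40 ms frames and feeds them with timestamps derived from the frame position. The pcmBytes source, the frame duration, and the real-time pacing are assumptions for illustration only.

import 'dart:typed_data';

Future<void> feedPcm(Uint8List pcmBytes, int sampleRate) async {
  const int channel = 1;        // Mono, matching the S16 example above.
  const int bytesPerSample = 2; // 16-bit samples.
  final int frameSamples = sampleRate ~/ 25; // 40 ms of audio per frame.
  final int frameBytes = frameSamples * bytesPerSample * channel;
  int offset = 0;
  int pts = 0; // Timestamp in microseconds.
  while (offset + frameBytes <= pcmBytes.length) {
    final Uint8List frame =
        Uint8List.sublistView(pcmBytes, offset, offset + frameBytes);
    livePusher.sendPCMData(frame, frameBytes, sampleRate, channel, pts);
    offset += frameBytes;
    pts += 40 * 1000; // Advance by 40 ms in microseconds.
    await Future.delayed(const Duration(milliseconds: 40)); // Pace roughly in real time.
  }
}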

Obtain the version number of native Push SDK

/// Obtain the version number of the native Push SDK.
String sdkVersion = await AlivcLiveBase.getSdkVersion();

Configure logs

/// Enable log printing in the console.
AlivcLiveBase.setConsoleEnable(true);
/// Set the log level to Debug.
AlivcLiveBase.setLogLevel(AlivcLivePushLogLevel.debug);

/// Specify the maximum size of each shard. The total log size is five times the maximum shard size.
const int saveLogMaxPartFileSizeInKB = 100 * 1024 * 1024;
/// Log path.
String saveLogDir = "TODO";
/// Specify the log path and log shard size.
AlivcLiveBase.setLogPath(saveLogDir, saveLogMaxPartFileSizeInKB);

Reset the config object (iOS)

On iOS, you can call this method to clear the settings in the Config object when you no longer need AlivcLivePushConfig. The default settings are restored the next time you create an AlivcLivePusher instance.

We recommend that you call this method after you call the destroy method for AlivcLivePusher.

/// Reset the Config object on iOS.
livePusher.destroyConfigForIOS();

Add retouching effects

Push SDK for Flutter provides retouching effects by using plug-ins. To use the retouching feature, find the flutter_livepush_beauty_plugin plug-in in the example\plugins directory of the SDK package.

Note

The retouching plug-in is not separately released.

/// 1. Initialize the retouching object.
AlivcLiveBeautyManager beautyManager = AlivcLiveBeautyManager.init();
beautyManager.setupBeauty();
/// 2. Show the retouching panel.
beautyManager.showPanel();
/// 3. Close the retouching panel (for Android).
beautyManager.hidePanel();
/// 4. Dispose the retouching object.
beautyManager.destroyBeauty();

FAQ

How do I troubleshoot stream ingest failures?

You can use the troubleshooting tool to check whether the ingest URL is valid.

How do I obtain information about ingested streams?

Go to the Stream Management page and view the ingested audio and video streams in Active Streams.

How do I play a stream?

After you start stream ingest, you can use a player (such as ApsaraVideo Player, FFplay, or VLC) to test stream pulling. To obtain playback URLs, see Generate ingest and streaming URLs.