This topic describes the basic procedure for using the Push SDK for Flutter and provides examples of how to use its features.
Features
Supports stream ingest over Real-Time Messaging Protocol (RTMP).
Supports stream ingest over Alibaba Real-Time Communication (ARTC), which is based on User Datagram Protocol (UDP).
Uses H.264 for video encoding and Advanced Audio Coding (AAC) for audio encoding.
Supports custom configurations for bitrate control, resolution, and display mode.
Supports various camera operations.
Supports real-time retouching and custom retouching effects.
Allows you to add and remove animated stickers as watermarks.
Supports external audio and video inputs in different formats such as YUV and pulse-code modulation (PCM).
Supports audio-only and video-only stream ingest, and stream ingest in the background.
Supports background music.
Supports video snapshot capture.
Supports automatic reconnection and error handling.
Supports Automatic Gain Control (AGC), Automatic Noise Reduction (ANR), and Acoustic Echo Cancellation (AEC) algorithms.
Allows you to switch between software and hardware video encoding modes to improve the stability of the encoding module.
Limitations
Note the following limitations before you use the Push SDK for Flutter:
You must configure the screen orientation before stream ingest. You cannot rotate the screen during live streaming.
You must disable auto screen rotation for stream ingest in landscape mode.
In hardware encoding mode, the value of the output resolution must be a multiple of 16 to be compatible with the encoder. For example, if you set the resolution to 540p, the output resolution is 544 × 960. You must scale the screen size of the player based on the output resolution to prevent black bars.
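For reference, the following sketch estimates the encoder-aligned output size for a requested resolution by rounding each dimension up to the nearest multiple of 16. The alignTo16 helper is illustrative only; the SDK performs this alignment internally.
/// Illustrative only: approximate the encoder alignment by rounding a
/// dimension up to the nearest multiple of 16.
int alignTo16(int value) => ((value + 15) ~/ 16) * 16;

void main() {
  const requestedWidth = 540;
  const requestedHeight = 960;
  final outputWidth = alignTo16(requestedWidth); // 544
  final outputHeight = alignTo16(requestedHeight); // 960
  // Scale the player view to match outputWidth : outputHeight to prevent black bars.
  print('Output resolution: $outputWidth x $outputHeight');
}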
API reference
For more information about the API, see the Flutter Push SDK API reference.
Procedure
The following sections describe the basic procedure for stream ingest and how to use each feature.
Feature usage
Register the SDK
The Push SDK for Flutter uses a unified license. For information about how to apply for and configure a license, see the License integration guide.
You must register the SDK before you can use the stream ingest feature. Otherwise, the features of the Push SDK will be unavailable.
Register the SDK.
Important: Configure the license before you call the method to register the SDK.
AlivcLiveBase.registerSDK();
Set a listener for callbacks.
AlivcLiveBase.setListener(AlivcLiveBaseListener(
  onLicenceCheck: (AlivcLiveLicenseCheckResultCode result, String reason) {
    if (result == AlivcLiveLicenseCheckResultCode.success) {
      /// The SDK is registered.
    }
  },
));
Configure stream ingest parameters
All basic stream ingest parameters have default values. We recommend that you use these default values. You can perform a simple initialization without extra configurations. The following sample code shows an example:
/// Create an AlivcLivePusher instance.
AlivcLivePusher livePusher = AlivcLivePusher.init();
/// Create a Config object to associate AlivcLivePusherConfig with AlivcLivePusher.
livePusher.createConfig();
/// Create an AlivcLivePusherConfig instance.
AlivcLivePusherConfig pusherConfig = AlivcLivePusherConfig.init();
/// Configure stream ingest parameters.
/// Set the resolution to 540p.
pusherConfig.setResolution(AlivcLivePushResolution.resolution_540P);
/// Set the video capture frame rate to 20 fps, which is the recommended value.
pusherConfig.setFps(AlivcLivePushFPS.fps_20);
/// Enable adaptive bitrate. The default value is true.
pusherConfig.setEnableAutoBitrate(true);
/// Set the Group of Pictures (GOP) size. A larger GOP size results in higher latency. We recommend that you set this parameter to a value from 1 to 2.
pusherConfig.setVideoEncodeGop(AlivcLivePushVideoEncodeGOP.gop_2);
/// Set the reconnection interval to 2,000 ms. Unit: milliseconds. The value must be at least 1,000. We recommend that you use the default value.
pusherConfig.setConnectRetryInterval(2000);
/// Disable preview mirroring.
pusherConfig.setPreviewMirror(false);
/// Set the stream orientation to portrait.
pusherConfig.setOrientation(AlivcLivePushOrientation.portrait);
Start stream ingest
Create a stream ingest engine instance.
livePusher.initLivePusher();
Register listeners for stream ingest callbacks.
/// Set the listener for the stream ingest status.
livePusher.setInfoDelegate();
/// Set the listener for stream ingest errors.
livePusher.setErrorDelegate();
/// Set the listener for the network status during stream ingest.
livePusher.setNetworkDelegate();
Listen for stream ingest-related callbacks.
/// Listener for stream ingest errors
/// Callback for SDK errors
livePusher.setOnSDKError((errorCode, errorDescription) {});
/// Callback for system errors
livePusher.setOnSystemError((errorCode, errorDescription) {});
/// Listener for the stream ingest status
/// Callback for preview start
livePusher.setOnPreviewStarted(() {});
/// Callback for preview stop
livePusher.setOnPreviewStoped(() {});
/// Callback for first frame rendering
livePusher.setOnFirstFramePreviewed(() {});
/// Callback for stream ingest start
livePusher.setOnPushStarted(() {});
/// Callback for camera stream ingest pause
livePusher.setOnPushPaused(() {});
/// Callback for camera stream ingest resume
livePusher.setOnPushResumed(() {});
/// Callback for stream ingest restart
livePusher.setOnPushRestart(() {});
/// Callback for stream ingest stop
livePusher.setOnPushStoped(() {});
/// Listener for the network status during stream ingest
/// Callback for stream ingest connection failure
livePusher.setOnConnectFail((errorCode, errorDescription) {});
/// Callback for network recovery
livePusher.setOnConnectRecovery(() {});
/// Callback for connection loss
livePusher.setOnConnectionLost(() {});
/// Callback for poor network conditions
livePusher.setOnNetworkPoor(() {});
/// Callback for reconnection failure
livePusher.setOnReconnectError((errorCode, errorDescription) {});
/// Callback for reconnection start
livePusher.setOnReconnectStart(() {});
/// Callback for successful reconnection
livePusher.setOnReconnectSuccess(() {});
Create a preview view for stream ingest.
var x = 0.0; // Custom value
var y = 0.0; // Custom value
var width = MediaQuery.of(context).size.width; // Custom value
var height = MediaQuery.of(context).size.height; // Custom value
AlivcPusherPreview pusherPreviewView = AlivcPusherPreview(
    onCreated: _onPusherPreviewCreated,
    x: x,
    y: y,
    width: width,
    height: height);
return Container(
    color: Colors.black,
    width: width,
    height: height,
    child: pusherPreviewView);
Start the preview.
/// Callback for view creation
_onPusherPreviewCreated(id) {
  /// Start the preview.
  livePusher.startPreview();
}
Note: If the screen orientation of your Flutter project is portrait and you set the stream ingest orientation to landscape using setOrientation, the preview image may not fill the preview view. This can happen if you call startPreview in the view creation callback after you manually rotate the screen. To resolve this issue, we recommend that you add a delay before you call the startPreview method. For example, you can add a delay of 100 ms:
Future.delayed(const Duration(milliseconds: 100), () {
  livePusher.startPreview();
});
Start stream ingest. You can start stream ingest only after the preview starts successfully.
String pushURL = "Test ingest URL (rtmp://......)";
livePusher.startPushWithURL(pushURL);
Note: RTMP and Real-Time Streaming (RTS) (artc://) ingest URLs are supported. To generate these URLs, see Generate ingest and playback URLs.
ApsaraVideo Live does not support ingesting multiple streams to the same ingest URL at the same time. If you attempt to do so, the second ingest request is rejected.
Common stream ingest controls
/// Pause stream ingest from the camera. You can call the [setPauseImg] method and then the [pause] method to switch from camera stream ingest to static image ingest. The audio stream continues.
livePusher.pause();
/// Switch from static image ingest to camera stream ingest. The audio stream continues.
livePusher.resume();
/// Stop stream ingest. Call this method only when a stream is being ingested.
livePusher.stopPush();
/// You can stop the preview only in the preview state. If you stop the preview during stream ingest, the operation is invalid. After the preview stops, the preview window freezes on the last frame.
livePusher.stopPreview();
/// You can restart stream ingest when the stream is being ingested or when an error callback is received. If an error occurs, you can call this method or the [reconnectPushAsync] method to reconnect, or call the [destory] method to destroy the stream ingest instance. After the operation is complete, all resources in [AlivcLivePusher] are reinitialized, including preview and stream ingest.
livePusher.restartPush();
/// You can call this method during stream ingest or when an error callback related to [setNetworkDelegate] is received. If an error occurs, you can call this method or the [restartPush] method to restart stream ingest, or call the [destory] method to destroy the stream ingest instance. After the operation is complete, the stream is reconnected.
livePusher.reconnectPushAsync();
/// After the stream ingest instance is destroyed, stream ingest and preview stop, and the preview window is removed. All resources related to [AlivcLivePusher] are destroyed.
livePusher.destory();
Camera-related operations
/// Switch between the front and rear cameras.
livePusher.switchCamera();
/// Enable or disable the flash. You cannot enable the flash for the front camera.
livePusher.setFlash(false);
/// Adjust the focal length to zoom in or out of the captured video. If you set the input parameter to a positive number, the focal length is increased. If you set the input parameter to a negative number, the focal length is decreased.
double max = await livePusher.getMaxZoom();
livePusher.setZoom(min(1.0, max));
/// Manually focus the camera.
/// The [autoFocus] parameter specifies whether to enable autofocus. This parameter takes effect only for this focus operation. Whether autofocus is enabled for subsequent operations depends on the value set by the autofocus interface.
double pointX = 50.0; // Custom value
double pointY = 50.0; // Custom value
bool autoFocus = true;
livePusher.focusCameraAtAdjustedPoint(pointX, pointY, autoFocus);
/// Disable autofocus.
livePusher.setAutoFocus(false);
/// Disable preview mirroring.
livePusher.setPreviewMirror(false);
/// Disable stream ingest mirroring.
livePusher.setPushMirror(false);
Image stream ingest
To improve the user experience, the Push SDK lets you ingest a static image when the app is in the background or when the bitrate is low.
When the app is switched to the background, video stream ingest is paused by default and only the audio stream is ingested. In this scenario, you can ingest an image. For example, you can display an image that tells users that the streamer is away and will be back soon.
/// Set the image to display during a pause.
String pauseImagePath = "xxxx"; // xxxx is the path of the image on the mobile phone.
pusherConfig.setPauseImg(pauseImagePath);
When network conditions are poor, you can ingest a static image. After you set the image, the SDK ingests the image when it detects that the current bitrate is low. This prevents video stuttering. The following sample code shows an example:
/// Set the image to display in poor network conditions.
String networkPoorImagePath = "xxxx"; // xxxx is the path of the image on the mobile phone.
pusherConfig.setNetworkPoorImg(networkPoorImagePath);
Preview display mode
The Push SDK supports three preview modes. The preview display mode does not affect stream ingest.
AlivcPusherPreviewDisplayMode.preview_scale_fill: The preview fills the window. If the aspect ratio of the video is different from the aspect ratio of the window, the preview is stretched.
AlivcPusherPreviewDisplayMode.preview_aspect_fit: The aspect ratio of the video is preserved during preview. If the aspect ratio of the video is different from the aspect ratio of the window, black bars appear in the preview.
AlivcPusherPreviewDisplayMode.preview_aspect_fill: The video is cropped to fit the aspect ratio of the window during preview. If the aspect ratio of the video is different from the aspect ratio of the window, the preview is cropped.
The following sample code shows an example:
/// Set the preview display mode to preserve the video aspect ratio.
pusherConfig.setPreviewDisplayMode(AlivcPusherPreviewDisplayMode.preview_aspect_fit);
Configure video quality
Three video quality modes are supported: Resolution Priority, Fluency Priority, and Custom.
To configure video quality, you must enable bitrate control:
pusherConfig.setEnableAutoBitrate(true);
Resolution Priority mode (default)
In Resolution Priority mode, the SDK internally configures bitrate parameters to prioritize the definition of the ingested video.
pusherConfig.setQualityMode(AlivcLivePushQualityMode.resolution_first);
Fluency Priority mode
In Fluency Priority mode, the SDK internally configures bitrate parameters to prioritize the fluency of the ingested video.
pusherConfig.setQualityMode(AlivcLivePushQualityMode.fluency_first);
Custom mode
In Custom mode, the SDK configures the bitrate based on the values that you specify. When you set the mode to Custom, you must define the initial bitrate, minimum bitrate, and target bitrate.
Target bitrate: In good network conditions, the bitrate gradually increases to the target bitrate to improve video definition.
Minimum bitrate: In poor network conditions, the bitrate gradually decreases to the minimum bitrate to reduce video stuttering.
Initial bitrate: The bitrate at the beginning of a live stream.
pusherConfig.setQualityMode(AlivcLivePushQualityMode.custom);
pusherConfig.setInitialVideoBitrate(1000);
pusherConfig.setMinVideoBitrate(600);
pusherConfig.setTargetVideoBitrate(1400);
When you set a custom bitrate, refer to the recommended settings from Alibaba Cloud to configure the bitrate. For more information, see the following tables:
Recommended settings for custom bitrate control (Resolution Priority). Unit: Kbit/s.
Resolution | Initial bitrate (initialVideoBitrate) | Minimum bitrate (minVideoBitrate) | Target bitrate (targetVideoBitrate)
360p | 600 | 300 | 1000
480p | 800 | 300 | 1200
540p | 1000 | 600 | 1400
720p | 1500 | 600 | 2000
1080p | 1800 | 1200 | 2500
Recommended settings for custom bitrate control (Fluency Priority). Unit: Kbit/s.
Resolution | Initial bitrate (initialVideoBitrate) | Minimum bitrate (minVideoBitrate) | Target bitrate (targetVideoBitrate)
360p | 400 | 200 | 600
480p | 600 | 300 | 800
540p | 800 | 300 | 1000
720p | 1000 | 300 | 1200
1080p | 1500 | 1200 | 2200
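For example, the following minimal sketch applies the recommended Fluency Priority values for 720p in Custom mode, using the same setters shown above:
/// Apply the recommended Fluency Priority values for 720p from the table above.
pusherConfig.setQualityMode(AlivcLivePushQualityMode.custom);
pusherConfig.setInitialVideoBitrate(1000);
pusherConfig.setMinVideoBitrate(300);
pusherConfig.setTargetVideoBitrate(1200);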
Configure adaptive resolution
Adaptive resolution dynamically adjusts the resolution of an ingested stream. After you enable this feature, the resolution is automatically reduced in poor network conditions to improve video fluency and definition. The following sample code shows an example:
/// Enable adaptive resolution.
pusherConfig.setEnableAutoResolution(true);
Adaptive resolution takes effect only when the video quality mode is set to Resolution Priority or Fluency Priority. This feature is not available in Custom mode.
Some players may not support dynamic resolution. To use the adaptive resolution feature, we recommend that you use ApsaraVideo Player.
Configure background music
/// Start playing background music.
String musicPath = "xxxx"; // xxxx is the path of the music resource on the mobile phone.
livePusher.startBGMWithMusicPathAsync(musicPath);
/// Stop playing background music. To switch songs while background music is playing, call the start method again with the new music path. You do not need to stop the current music first.
livePusher.stopBGMAsync();
/// Pause playing background music. You can call this interface only after the background music starts playing.
livePusher.pauseBGM();
/// Resume playing background music. You can call this interface only when the background music is paused.
livePusher.resumeBGM();
/// Enable loop playback for music.
livePusher.setBGMLoop(true);
/// Enable or disable noise reduction. When noise reduction is enabled, non-vocal parts of the captured audio are filtered out, which may slightly suppress the human voice. We recommend that you let users choose whether to enable noise reduction. This feature is disabled by default.
livePusher.setAudioDenoise(true);
/// Enable or disable in-ear monitoring. In-ear monitoring is mainly used in karaoke scenarios. When it is enabled, the streamer's voice is played back in the headphones; when it is disabled, it is not. In-ear monitoring takes effect only when headphones are connected.
livePusher.setBGMEarsBack(true);
/// Configure audio mixing. Set the volume of the background music.
livePusher.setBGMVolume(50); // Value range: [0, 100]. Default value: 50.
/// Configure audio mixing. Set the volume of the collected human voice.
livePusher.setCaptureVolume(50); // Value range: [0, 100]. Default value: 50.
/// Mute the audio. After you mute the audio, both the music and human voice are muted. To mute only the music or human voice, use the audio mixing volume setting interface.
livePusher.setMute(true);
Listen for background music-related callbacks:
/// The background music finishes playing.
livePusher.setOnBGMCompleted(() {});
/// A timeout occurs when the background music is downloaded for playback.
livePusher.setOnBGMDownloadTimeout(() {});
/// Failed to start playing the background music.
livePusher.setOnBGMOpenFailed(() {});
/// The background music is paused.
livePusher.setOnBGMPaused(() {});
/// The current playback progress of the background music.
livePusher.setOnBGMProgress((progress, duration) {});
/// The background music is resumed.
livePusher.setOnBGMResumed(() {});
/// The background music starts playing.
livePusher.setOnBGMStarted(() {});
/// The background music stops playing.
livePusher.setOnBGMStoped(() {});
Capture snapshots
/// Call the snapshot interface.
String dir = "xxxx"; // xxxx indicates the set path.
if (Platform.isIOS) {
/// On iOS, the dir parameter specifies a relative storage path. A custom directory is automatically generated in the system sandbox path. If you set this parameter to "", the snapshot is saved in the system sandbox path.
/// dirTypeForIOS: Optional. If this parameter is not set, the snapshot is saved in the [document] path of the system sandbox by default.
livePusher.snapshot(1, 0, dir, dirTypeForIOS: AlivcLiveSnapshotDirType.document);
} else {
livePusher.snapshot(1, 0, dir);
}
/// Set the snapshot callback. You need to call this after you call [snapshot].
livePusher.setSnapshotDelegate();
/// Listen for the snapshot callback.
livePusher.setOnSnapshot((saveResult, savePath, {dirTypeForIOS}) {
// The snapshot is saved.
if (saveResult == true) {
if (Platform.isIOS) {
// Concatenate dirTypeForIOS and savePath to obtain the complete path where the snapshot is saved in the sandbox.
} else {
// Obtain the path where the snapshot is saved on the SD card based on savePath.
}
}
});
Configure watermarks
The Push SDK lets you add one or more watermarks. The watermark images must be in the PNG format. The following sample code shows an example:
String watermarkBundlePath = "xxxx"; //xxxx is the path of the watermark image resource on the mobile phone.
double coordX = 0.1;
double coordY = 0.1;
double width = 0.3;
/// Add a watermark.
livePusher.addWatermark(watermarkBundlePath, coordX, coordY, width);
coordX, coordY, and width are relative values. For example, coordX:0.1 indicates that the x-coordinate of the watermark is at 10% of the x-axis of the ingested video. If the ingest resolution is 540 × 960, the x-coordinate of the watermark is 54.
The height of the watermark image is scaled based on the aspect ratio of the source image and the specified width.
To implement a text watermark, you can convert the text into an image and then use this interface to add the watermark.
To ensure the definition and edge smoothness of the watermark, use a source watermark image that has the same size as the output watermark. For example, if the output video resolution is 544 × 960 and the displayed width of the watermark is 0.1, use a source watermark image that is about 544 × 0.1 ≈ 54 pixels wide.
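If it is easier to work in pixels, the following sketch converts pixel coordinates and a pixel width into the relative values that addWatermark expects. The 544 × 960 ingest resolution and the pixel values are only illustrative assumptions.
/// Illustrative only: convert pixel values to the relative coordinates used by addWatermark,
/// assuming a 544 x 960 ingest resolution.
const double ingestWidth = 544;
const double ingestHeight = 960;

double coordX = 54 / ingestWidth;   // about 0.1
double coordY = 96 / ingestHeight;  // 0.1
double width = 163 / ingestWidth;   // about 0.3

livePusher.addWatermark(watermarkBundlePath, coordX, coordY, width);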
Push external audio and video streams
The Push SDK supports ingesting external audio and video sources, such as an audio or video file.
Before you push an external audio or video stream, you must enable custom audio and video stream input.
/// Enable external stream input.
pusherConfig.setExternMainStream(true);
/// Set the color format of video data. In this example, YUVNV21 is used. You can set it to another format as needed.
pusherConfig.setExternVideoFormat(AlivcLivePushVideoFormat.YUVNV21);
/// Set the bit depth format of audio data. In this example, S16 is used. You can set it to another format as needed.
pusherConfig.setExternAudioFormat(AlivcLivePushAudioFormat.S16);
After you enable custom audio and video stream input, you can push external audio and video streams.
Insert external video data
/// Only continuous buffer data of external videos in YUV and RGB formats can be sent using the sendVideoData interface. You can send the video data buffer, length, width, height, timestamp, and rotation angle.
Uint8List bufferData = xxxx; // xxxx indicates the continuous buffer video data in the Uint8List format.
int width = 720; // Video width
int height = 1280; // Video height
int dataSize = xxxx; // xxxx indicates the data size.
int pts = xxxx; // xxxx indicates the timestamp in microseconds.
int rotation = 0; // Rotation angle
livePusher.sendVideoData(bufferData, width, height, dataSize, pts, rotation);
Insert external audio data
/// Only continuous buffer data in the PCM format can be sent using the sendPCMData interface. You can send the audio data buffer, length, and timestamp.
Uint8List bufferData = xxxx; // xxxx indicates the continuous buffer audio data in the Uint8List format.
int dataSize = xxxx; // xxxx indicates the data size.
int sampleRate = xxxx; // xxxx indicates the sample rate.
int channel = 1; // The number of sound channels. For example, 1 for mono.
int pts = xxxx; // xxxx indicates the timestamp in microseconds.
livePusher.sendPCMData(bufferData, dataSize, sampleRate, channel, pts);
Get the version number of the native Push SDK
/// Get the version number of the native live stream ingest SDK.
String sdkVersion = await AlivcLiveBase.getSdkVersion();
Configure logs
/// Enable log printing in the console.
AlivcLiveBase.setConsoleEnable(true);
/// Set the log level to Debug.
AlivcLiveBase.setLogLevel(AlivcLivePushLogLevel.debug);
/// The maximum size of each shard. The total log size is five times the maximum shard size.
const int saveLogMaxPartFileSizeInKB = 100 * 1024 * 1024;
/// The log path.
String saveLogDir = "TODO";
/// Set the log path and log shard size.
AlivcLiveBase.setLogPath(saveLogDir, saveLogMaxPartFileSizeInKB);
Reset the Config object (iOS)
On iOS, if you no longer use AlivcLivePusherConfig, we recommend that you call this method to reset the native config object. The next time you create the object, its settings are restored to the default state.
We recommend that you call this method after you call the destroy method for AlivcLivePusher.
/// Reset the config object on iOS.
livePusher.destroyConfigForIOS();
Add retouching effects
The Push SDK for Flutter provides plug-in-based retouching capabilities. To use the retouching feature, find and use the flutter_livepush_beauty_plugin plug-in in the example\plugins directory of the SDK package.
The retouching plug-in is not released separately.
/// 1. Initialize the retouching object.
AlivcLiveBeautyManager beautyManager = AlivcLiveBeautyManager.init();
beautyManager.setupBeauty();
/// 2. Open the retouching panel.
beautyManager.showPanel();
/// 3. Close the retouching panel (for Android).
beautyManager.hidePanel();
/// 4. Destroy the retouching object.
beautyManager.destroyBeauty();
FAQ
Stream ingest fails
You can use the troubleshooting tool to check whether the ingest URL is valid.
How do I obtain information about ingested audio and video streams?
You can go to Stream Management to view and manage ingested audio and video streams on the Active Streams tab.
How do I play a stream?
After you start stream ingest, you can use a player, such as ApsaraVideo Player, FFplay, or VLC, to test stream pulling. For more information about how to obtain a streaming URL, see Generate ingest and streaming URLs.