
ApsaraVideo Live:Use Push SDK for Android

Last Updated: Nov 26, 2025

This topic introduces how to use Push SDK for Android with examples, covering its core interfaces and basic workflow.

Features

  • Supports stream ingest over Real-Time Messaging Protocol (RTMP).

  • Supports RTS stream ingest and pulling based on Real-Time Communication (RTC).

  • Supports co-streaming and battles.

  • Adopts H.264 for video encoding and AAC for audio encoding.

  • Supports custom configurations for features such as bitrate control, resolution, and display mode.

  • Supports various camera operations.

  • Supports real-time retouching and custom retouching effects.

  • Allows you to add and remove animated stickers as watermarks.

  • Allows you to stream screen recordings.

  • Supports external audio and video inputs in different formats such as YUV and pulse-code modulation (PCM).

  • Supports mixing of multiple streams.

  • Supports ingest of audio-only and video-only streams and stream ingest in the background.

  • Supports background music.

  • Supports video snapshot capture.

  • Supports automatic reconnection and error handling.

  • Supports Automatic Gain Control (AGC), Automatic Noise Reduction (ANR), and Acoustic Echo Cancellation (AEC) algorithms.

  • Allows you to switch between software and hardware video encoding modes, which improves the stability of the encoding module.

Limitations

Take note of the following limits before you use Push SDK for Android:

  • You must configure screen orientation before stream ingest. You cannot rotate the screen during live streaming.

  • You must disable auto screen rotation for stream ingest in landscape mode.

  • In hardware encoding mode, the output resolution must be a multiple of 16 to be compatible with the encoder. For example, if you set the resolution to 540p, the output resolution is 544 × 960. Scale the player view based on the output resolution to prevent black bars, as shown in the sketch below.
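
The following helper is not part of Push SDK; it is only an illustration of how the hardware encoder aligns each dimension up to a multiple of 16, so that you can size the player view accordingly.

// Illustrative helper: align a dimension up to the next multiple of 16, as the hardware encoder does.
static int alignTo16(int size) {
    return (size + 15) / 16 * 16;
}

// Example: a 540 x 960 stream is encoded as 544 x 960.
int outputWidth = alignTo16(540);   // 544
int outputHeight = alignTo16(960);  // 960
float displayAspectRatio = (float) outputWidth / outputHeight; // Use this ratio to size the player view.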

API reference

API references for Basic Edition

Procedure

  1. Register the SDK

  2. Configure stream ingest parameters

  3. Ingest streams

Feature usage

Register the SDK

You must register the SDK before stream ingestion. Otherwise, you cannot use the SDK.

An SDK license is required for registration. To apply for and configure a license, see Integrate a Push SDK license.

Call the following methods before you use Push SDK for Android: 

AlivcLiveBase.setListener(new AlivcLiveBaseListener() {
  @Override
  public void onLicenceCheck(AlivcLivePushConstants.AlivcLiveLicenseCheckResultCode result, String reason) {
    Log.e(TAG, "onLicenceCheck: " + result + ", " + reason);
  }
});
AlivcLiveBase.registerSDK();
  • AlivcLiveBase allows you to configure log levels, set local log paths, and retrieve the SDK version.

  • Call the registerSDK method in AlivcLiveBase to register the SDK license.

  • By implementing the onLicenceCheck method through AlivcLiveBase#setListener, you can asynchronously verify license configuration.

    Note

    This callback is only triggered after initializing the pusher instance.

Configure stream ingest parameters

All basic parameters have default values. We recommend that you use the default values.

// Initialize the class for stream ingest settings.
AlivcLivePushConfig mAlivcLivePushConfig = new AlivcLivePushConfig();
// Specify the stream ingest mode. By default, the basic stream ingest mode is used.
mAlivcLivePushConfig.setLivePushMode(AlivcLiveMode.AlivcLiveBasicMode);
// Specify the resolution. The default resolution is 540p.
mAlivcLivePushConfig.setResolution(AlivcResolutionEnum.RESOLUTION_540P);
// Specify the frame rate. The default frame rate is 20 frames per second (FPS).
mAlivcLivePushConfig.setFps(AlivcFpsEnum.FPS_25);
// Specify the group of pictures (GOP) size. Unit: seconds. The default GOP size is 2 seconds.
mAlivcLivePushConfig.setVideoEncodeGop(AlivcVideoEncodeGopEnum.GOP_TWO);
// Specify whether to enable bitrate control. The default value is true.
mAlivcLivePushConfig.setEnableBitrateControl(true);
// Specify the screen orientation. The default orientation is portrait. In landscape mode, you can configure whether the home button is positioned on the left or right.
mAlivcLivePushConfig.setPreviewOrientation(AlivcPreviewOrientationEnum.ORIENTATION_PORTRAIT);
// Specify the audio encoding format. The default format is AAC-LC.
mAlivcLivePushConfig.setAudioProfile(AlivcAudioAACProfileEnum.AAC_LC);
// Specify the video encoding mode. By default, hardware encoding is used.
mAlivcLivePushConfig.setVideoEncodeMode(AlivcEncodeModeEnum.Encode_MODE_HARD);
// Specify the audio encoding mode. By default, software encoding is used.
mAlivcLivePushConfig.setAudioEncodeMode(AlivcEncodeModeEnum.Encode_MODE_SOFT);
// Specify whether to use the front camera or the rear camera. By default, the front camera is used.
mAlivcLivePushConfig.setCameraType(AlivcLivePushCameraTypeEnum.CAMERA_TYPE_FRONT);
// Specify the image that is ingested when your application is switched to the background or video stream ingest is paused.
mAlivcLivePushConfig.setPausePushImage("TODO: Image Path");
// Specify the image that is ingested in poor network conditions.
mAlivcLivePushConfig.setNetworkPoorPushImage("TODO: Image Path");
Important
  • Considering mobile device performance and network bandwidth requirements, we recommend that you set the resolution to 540p. Most mainstream live streaming apps use 540p.

  • If you disable adaptive bitrate, the bitrate is fixed at the initial bitrate and will not automatically adjust between the target and minimum bitrates. In poor network conditions, this may cause playback stuttering.

Ingest a camera stream

  1. Initialize the SDK.

    After you configure stream ingest parameters, call the init method to initialize the class. Sample code:

    AlivcLivePusher mAlivcLivePusher = new AlivcLivePusher();
    mAlivcLivePusher.init(mContext, mAlivcLivePushConfig);
    Note

    AlivcLivePusher does not support multiple instances. Therefore, each call to init must be paired with a call to destroy.

  2. Register preview callbacks.

    Call the setLivePushInfoListener method to register preview callbacks:

    /**
     * Configure callbacks for stream ingest events.
     *
     * @param infoListener The listener.
     */
    mAlivcLivePusher.setLivePushInfoListener(new AlivcLivePushInfoListener() {
        @Override
        public void onPreviewStarted(AlivcLivePusher pusher) {
            // Notifies that preview starts.
        }
        // Other Override methods
        //....
        //....
    });

  3. Start preview.

    To preview the camera feeds, specify the SurfaceView for the camera. Sample code:

    mAlivcLivePusher.startPreview(mSurfaceView); // Start preview. You can also call the asynchronous startPreviewAsync method.
  4. Start stream ingest.

    Add the following code to the onPreviewStarted callback method: 

    mAlivcLivePusher.startPush(mPushUrl);
    Note
    • RTMP and RTS (artc://) ingest URLs are supported. To generate ingest URLs, see Generate ingest and streaming URLs.

    • ApsaraVideo Live does not support ingesting multiple streams to the same URL simultaneously. The second ingest request will be rejected.

Ingest-related methods

Push SDK provides stream ingest controls, such as starting, stopping, pausing, and resuming ingest, stopping preview, and disposing of the stream ingest object. You can add buttons to perform these operations.

/* Pause stream ingest. For a stream that is being ingested, the video preview and video stream ingest are paused at the last frame, and the audio stream continues to be ingested. */
mAlivcLivePusher.pause();
/* Resume stream ingest. The preview and ingest of audio and video streams are resumed. */
mAlivcLivePusher.resume();
/* Stop a stream that is being ingested. */
mAlivcLivePusher.stopPush();
/* Stop preview. This operation does not take effect for a stream that is being ingested. When preview is stopped, the preview window is frozen at the last frame. */
mAlivcLivePusher.stopPreview();
/* Restart stream ingest when the stream is being ingested or when an error callback is received. All resources in AlivcLivePusher are reinitialized, including preview and ingest. If an error occurs, you can call this method or the reconnectPushAsync method to restart stream ingest. You can also call the destroy method to destroy the stream ingest instance. */
mAlivcLivePusher.restartPush();
/* Reconnect and repush the RTMP stream during streaming or network error state (errors related to AlivcLivePusherNetworkDelegate). In the error state, you can also call destroy to dispose the instance.*/
mAlivcLivePusher.reconnectPushAsync();
/* Dispose the stream ingest instance. After you call this method, stream ingest and preview are stopped, and the preview window is removed. All resources related to AlivcLivePusher are released. */
mAlivcLivePusher.destroy();

Camera-related methods

You can perform camera-related operations in the streaming, paused, or reconnecting state. For example, you can switch between the front and rear cameras and configure the flash, focal length, zooming, and mirroring mode.

/* Switch between the front camera and the rear camera. */
mAlivcLivePusher.switchCamera();
/* Enable or disable the flash. You cannot enable the flash for the front camera. */
mAlivcLivePusher.setFlash(true); 
/* Adjust the focal length to zoom in or out. Valid values: [0,getMaxZoom()]. */
mAlivcLivePusher.setZoom(5);
/* Configure manual focus. Set the following parameters: point and autoFocus. The point parameter specifies the coordinates of the focus point. The autoFocus parameter specifies whether to enable autofocus. The autoFocus parameter takes effect only for this call. Whether autofocus is enabled otherwise depends on the setAutoFocus method. */
mAlivcLivePusher.focusCameraAtAdjustedPoint(x, y, true);
/* Specify whether to enable autofocus. */
mAlivcLivePusher.setAutoFocus(true);
/* Configure mirroring. The methods for mirroring are PushMirror and PreviewMirror. PushMirror takes effect only for stream playback, and PreviewMirror takes effect only for preview. */
mAlivcLivePusher.setPreviewMirror(false);
mAlivcLivePusher.setPushMirror(false);
Important

You can call the camera-related methods only after preview starts.

Ingest a screen sharing stream

Push SDK supports ingesting screen sharing streams. You need to initialize the SDK and configure the preview before ingestion. After the stream is ingested, you can set advanced configurations. 

Android's MediaProjection API is used to capture screen content, which requires the user to grant screen capture permission. The data returned from the permission request must be passed to setMediaProjectionPermissionResultData in AlivcLivePushConfig to enable screen sharing. By default, the camera is disabled during screen sharing.

Configure screen sharing

// resultData: the system intent for screen sharing
mAlivcLivePushConfig.setMediaProjectionPermissionResultData(resultData);
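
The following is a minimal sketch of requesting the screen capture permission with the standard Android MediaProjection APIs and handing the result to the SDK. The request code and the surrounding Activity are assumptions for illustration.

private static final int REQUEST_SCREEN_CAPTURE = 1001; // Arbitrary request code.

private void requestScreenCapturePermission() {
    // Ask the user to grant screen capture permission.
    MediaProjectionManager manager =
            (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
    startActivityForResult(manager.createScreenCaptureIntent(), REQUEST_SCREEN_CAPTURE);
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == REQUEST_SCREEN_CAPTURE && resultCode == Activity.RESULT_OK) {
        // Pass the system intent to the SDK so that screen sharing can start.
        mAlivcLivePushConfig.setMediaProjectionPermissionResultData(data);
    }
}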

Configure screen rotation

In screen sharing mode, the SDK supports dynamic switch between portrait and landscape modes based on the detected device orientation.

To use this feature, register an OrientationEventListener at the application layer to detect the device orientation, and pass the corresponding rotation angle to the setScreenOrientation method. Sample code: 

mAlivcLivePusher.setScreenOrientation(0);
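
A sketch of such a listener is shown below. The mapping from the sensor angle to 0, 90, 180, and 270 degrees is an assumption for illustration; adapt it to your app's orientation handling.

OrientationEventListener orientationListener = new OrientationEventListener(context) {
    @Override
    public void onOrientationChanged(int orientation) {
        if (orientation == OrientationEventListener.ORIENTATION_UNKNOWN) {
            return;
        }
        int angle;
        if (orientation >= 315 || orientation < 45) {
            angle = 0;       // portrait
        } else if (orientation < 135) {
            angle = 90;      // landscape
        } else if (orientation < 225) {
            angle = 180;     // reverse portrait
        } else {
            angle = 270;     // reverse landscape
        }
        mAlivcLivePusher.setScreenOrientation(angle);
    }
};
orientationListener.enable(); // Start listening for orientation changes.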

Configure privacy protection

This feature prevents sensitive information exposure during screen recording. During secure operations such as password input, streamers can enable privacy protection.

When enabled, screen sharing is paused. If setPausePushImage is configured, the specified image is displayed during the pause. If not configured, the last frame is displayed.

Sample code: 

mAlivcLivePusher.pauseScreenCapture();// Enable privacy protection.
mAlivcLivePusher.resumeScreenCapture();// Disable privacy protection.

Ingest both camera and screen sharing streams

When a streamer is sharing the screen with viewers, they can also ingest a camera stream. There are two modes in this scenario: 

  1. The streamer can see the camera stream.

  2. The streamer cannot see the camera stream.

    For example, during game streaming, the streamer does not want the camera view to block the game screen. However, the viewers can view the camera stream.

Camera view available to streamer

After screen sharing is enabled, call the following methods to enable or disable camera preview.

mAlivcLivePusher.startCamera(surfaceView);// Enable camera preview.
mAlivcLivePusher.stopCamera();// Disable camera preview.
Note

We recommend that you set the aspect ratio of surfaceView to 1:1 in screen sharing mode. Otherwise, when the screen rotates, you must adjust the aspect ratio of the SurfaceView and then stop (stopCamera) and restart (startCamera) the camera preview.

Camera view unavailable to streamer

If the streamer does not need the camera view but wants to show it to viewers, call the startCamera method with null and enable stream mixing for the camera feed. Sample code: 

mAlivcLivePusher.startCameraMix(x, y, w, h);// Enable camera stream mixing.
mAlivcLivePusher.stopCameraMix();// Disable camera stream mixing.

Where:

  • x: the initial x-coordinate of the camera view in the mixed stream.

  • y: the initial y-coordinate of the camera view in the mixed stream.

  • w: the width of the camera view.

  • h: the height of the camera view.

The values are normalized. Valid values: 0-1.0f.
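
For example, the following call (values chosen only for illustration) places the camera view near the upper-right corner of the mixed stream, where it occupies a quarter of the width and height:

mAlivcLivePusher.startCameraMix(0.7f, 0.05f, 0.25f, 0.25f); // x, y, width, height as fractions of the mixed stream.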

Configure preview display mode

Push SDK for Android supports the following preview modes. The preview mode does not affect stream ingest.

  • ALIVC_LIVE_PUSHER_PREVIEW_SCALE_FILL: The video fills the preview window. If the aspect ratios of the video and preview window are inconsistent, video deformation occurs.

  • ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FIT: The aspect ratio of the video is preserved. If aspect ratios differ, black bars appear on the preview window.

  • ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FILL: The video is cropped to fit the preview window when aspect ratios differ.

Sample code: 

mAlivcLivePushConfig.setPreviewDisplayMode(AlivcPreviewDisplayMode.ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FIT);

Ingest an image

Push SDK for Android supports ingesting an image when the application is switched to the background or the bitrate is low. 

When the application is switched to the background, video stream ingest is paused by default, and only the audio stream is ingested. Streamers can display an image, informing viewers that the streamer is away and will be back soon.

mAlivcLivePushConfig.setPausePushImage("Image Path");// Specify the image that is ingested when the app is switched to the background.

You can also specify an image to be ingested in poor network conditions. When the bitrate is low, the image is displayed to prevent stuttering. Sample code:

mAlivcLivePushConfig.setNetworkPoorPushImage("Image Path");// Specify the image that is ingested in poor network conditions.

Ingest an audio-only stream

To transmit only audio to viewers, call setAudioOnly(true): 

mAlivcLivePushConfig.setAudioOnly(true);

Push external audio/video sources

Push SDK for Android supports ingesting external audio/video sources, such as a video file.

Before ingestion, enable custom audio and video input.

/**
* Other parameters, such as the output resolution, audio sample rate, and number of channels, are configured in config by using the setResolution, setAudioSampleRate, and setAudioChannels methods.
*/
mAlivcLivePushConfig.setExternMainStream(true, AlivcImageFormat.IMAGE_FORMAT_YUVNV12, AlivcSoundFormat.SOUND_FORMAT_S16);

Then, ingest external audio and video sources: 

External audio stream

/**
* This method does not configure the time sequence. You must manually configure the time sequence of input audio frames.
*/
mAlivcLivePusher.inputStreamAudioData(byte[] data, int size, int sampleRate, int channels, long pts);

External video stream

/**
* This method does not configure the time sequence. You must manually configure the time sequence of input video frames.
*/
mAlivcLivePusher.inputStreamVideoData(byte[] data, int width, int height, int stride, int size, long pts, int rotation);
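
The following sketch shows one way to assign timestamps when you push external NV12 frames. The frame source (readNextFrame) and the timestamp unit (microseconds here) are assumptions for illustration; check the API reference for the unit that your SDK version expects.

int width = 720, height = 1280;
int stride = width;                        // NV12: the luma stride equals the width in this sketch.
int frameSize = width * height * 3 / 2;    // NV12 frame size in bytes.
int fps = 25;
long frameIndex = 0;

byte[] frame;
while ((frame = readNextFrame()) != null) {            // Hypothetical frame source.
    long pts = frameIndex * 1_000_000L / fps;           // Monotonically increasing timestamp.
    mAlivcLivePusher.inputStreamVideoData(frame, width, height, stride, frameSize, pts, 0);
    frameIndex++;
}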

Configure watermarks

Push SDK for Android supports adding one or more watermarks in the PNG format. Sample code: 

mAlivcLivePushConfig.addWaterMark(waterPath, 0.1f, 0.2f, 0.3f); // Add a watermark.

Where:

  • x and y are relative values that determine the watermark position. x=0.1 specifies that the left edge of the watermark is positioned at 10% of the stream width. When the resolution is 540 x 960, the x position is 540 x 0.1 = 54 pixels.

  • width specifies the watermark width relative to stream width. The height is proportionally scaled.

Note

To add a text watermark, render the text into a PNG image (for example, by drawing it on a Bitmap as in the sketch below), and then call this method to add the image as a watermark.
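
A minimal sketch of rendering text into a PNG file that can then be passed to addWaterMark. The file location, text, size, and styling are illustrative only.

Bitmap bitmap = Bitmap.createBitmap(400, 120, Bitmap.Config.ARGB_8888);
Canvas canvas = new Canvas(bitmap);
Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
paint.setColor(Color.WHITE);
paint.setTextSize(48);
canvas.drawText("My Live Room", 20, 80, paint); // Draw the watermark text.

File waterMarkFile = new File(context.getExternalFilesDir(null), "text_watermark.png");
try (FileOutputStream out = new FileOutputStream(waterMarkFile)) {
    bitmap.compress(Bitmap.CompressFormat.PNG, 100, out); // Save the bitmap as a PNG file.
} catch (IOException e) {
    e.printStackTrace();
}
mAlivcLivePushConfig.addWaterMark(waterMarkFile.getAbsolutePath(), 0.1f, 0.1f, 0.3f);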

Configure video quality

Push SDK for Android supports the following video quality modes: Resolution Priority, Fluency Priority, and custom.

Important

To configure video quality, you must enable bitrate control: mAlivcLivePushConfig.setEnableBitrateControl(true);

Resolution Priority (default)

In this mode, the SDK automatically configures bitrate parameters to ensure the video quality.

mAlivcLivePushConfig.setQualityMode(AlivcQualityModeEnum.QM_RESOLUTION_FIRST);// Prioritize resolution

Fluency Priority

In this mode, the SDK automatically configures bitrate parameters to ensure the smoothness of the ingested video stream.

mAlivcLivePushConfig.setQualityMode(AlivcQualityModeEnum.QM_FLUENCY_FIRST);// Prioritize fluency

Custom

In custom mode, the SDK configures bitrate based on the values that you specify, including initial, minimum, and target bitrates.

  • TargetVideoBitrate: In good network conditions, the bitrate is gradually increased to the target bitrate to improve the video quality.

  • MinVideoBitrate: In poor network conditions, the bitrate is gradually reduced to the minimum to prevent stuttering.

  • InitialVideoBitrate: The initial bitrate when a live stream starts.

mAlivcLivePushConfig.setQualityMode(AlivcQualityModeEnum.QM_CUSTOM);// Custom mode
mAlivcLivePushConfig.setTargetVideoBitrate(1000); // The target bitrate is 1,000 Kbit/s.
mAlivcLivePushConfig.setMinVideoBitrate(300); // The minimum bitrate is 300 Kbit/s.
mAlivcLivePushConfig.setInitialVideoBitrate(800); // The initial bitrate is 800 Kbit/s.

When you configure bitrates, refer to the recommended settings provided by Alibaba Cloud: 

Table 1. Recommended settings for Resolution Priority mode (bitrates in Kbit/s)

Resolution | initialVideoBitrate | minVideoBitrate | targetVideoBitrate
360p       | 600                 | 300             | 1000
480p       | 800                 | 300             | 1200
540p       | 1000                | 600             | 1400
720p       | 1500                | 600             | 2000
1080p      | 1800                | 1200            | 2500

Table 2. Recommended settings for Fluency Priority mode (bitrates in Kbit/s)

Resolution | initialVideoBitrate | minVideoBitrate | targetVideoBitrate
360p       | 400                 | 200             | 600
480p       | 600                 | 300             | 800
540p       | 800                 | 300             | 1000
720p       | 1000                | 300             | 1200
1080p      | 1500                | 1200            | 2200

Configure adaptive resolution

The SDK supports dynamically adjusting the resolution of an ingested stream. When the feature is enabled, the resolution is automatically reduced to ensure the smoothness and quality in poor network conditions. Sample code:

mAlivcLivePushConfig.setEnableAutoResolution(true); // Enable adaptive resolution. The default value is false.
Important
  • Adaptive resolution takes effect only when the video quality mode is set to Resolution Priority or Fluency Priority.

  • Some players may not support dynamic resolution. We recommend that you use ApsaraVideo Player.

Configure background music

Push SDK for Android allows you to manage background music. You can control the background music playback and configure audio mixing, denoising, in-ear monitoring, and muting. Sample code: 

/* Start the playback of background music. */
mAlivcLivePusher.startBGMAsync(mPath);
/* Stop the playback of background music. If you want to change the background music, call the startBGMAsync method. You do not need to stop the current playback. */
mAlivcLivePusher.stopBGMAsync();
/* Pause the playback of background music. You can call this method only after the playback of background music starts. */
mAlivcLivePusher.pauseBGM();
/* Resume the playback of background music. You can call this method only after the playback of background music is paused. */
mAlivcLivePusher.resumeBGM();
/* Enable looping. */
mAlivcLivePusher.setBGMLoop(true);
/* Configure denoising. When enabled, the system filters out non-vocal parts from the collected audio. This feature may slightly reduce the volume of the human voice. We recommend that you allow your users to determine whether to enable this feature. By default, this feature is disabled. */
mAlivcLivePusher.setAudioDenoise(true);
/* Configure in-ear monitoring. In-ear monitoring is suitable for karaoke scenarios. When enabled, headphone users can hear their voice. When disabled, they cannot hear their voice on headphones. This parameter does not take effect if no headphones are detected. */
mAlivcLivePusher.setBGMEarsBack(true);
/* Configure audio mixing to adjust the volumes of the background music and human voice. */
mAlivcLivePusher.setBGMVolume(50);// Specify the volume of the background music.
mAlivcLivePusher.setCaptureVolume(50);// Specify the volume of the human voice.
/* Configure muting. When enabled, the background music and the human voice are both muted. To separately mute one of them, call the method that is used to configure audio mixing. */
mAlivcLivePusher.setMute(true);
Important

You can call these methods only after the preview starts.

Capture snapshots

Push SDK for Android supports capturing snapshots of local video streams. Sample code: 

// Capture snapshots of video streams. Parameters: the number of snapshots to capture, the capture interval, and the callback.
pusher.snapshot(1, 1, new AlivcSnapshotListener() {
    @Override
    public void onSnapshot(Bitmap bmp) {
            // You can save each snapshot as a local PNG file. Sample code:
        String dateFormat = new SimpleDateFormat("yyyy-MM-dd-hh-mm-ss-SS").format(new Date());
        File f = new File(context.getExternalFilesDir(Environment.DIRECTORY_PICTURES), "snapshot-" + dateFormat + ".png");
        if (f.exists()) {
            f.delete();
        }
        try {
            FileOutputStream out = new FileOutputStream(f);
            bmp.compress(Bitmap.CompressFormat.PNG, 90, out);
            out.flush();
            out.close();
        } catch (FileNotFoundException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
});

Handle background running and screen locking

  • When the app is switched to the background or the screen is locked, call the pause() or resume() method of the AlivcLivePusher class to pause or resume stream ingest, as shown in the sketch after this list.

  • Unless the device is in a system audio or video call, the SDK continues to capture and ingest the audio stream. You can call the mAlivcLivePusher.setMute() method to specify whether audio is captured when the app is switched to the background or the screen is locked.
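
A minimal sketch, assuming stream ingest is driven from an Activity: pause ingest when the app goes to the background and resume it when the app returns to the foreground.

@Override
protected void onPause() {
    super.onPause();
    mAlivcLivePusher.pause();   // Video pauses at the last frame; audio continues by default.
}

@Override
protected void onResume() {
    super.onResume();
    mAlivcLivePusher.resume();  // Resume audio and video ingest.
}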

Callbacks

Callback type              | Class                        | Method
Stream ingest callbacks    | AlivcLivePushInfoListener    | mAlivcLivePusher.setLivePushInfoListener()
Network callbacks          | AlivcLivePushNetworkListener | mAlivcLivePusher.setLivePushNetworkListener()
Error callbacks            | AlivcLivePushErrorListener   | mAlivcLivePusher.setLivePushErrorListener()
Background music callbacks | AlivcLivePushBGMListener     | mAlivcLivePusher.setLivePushBGMListener()

Stream ingest callbacks

Stream ingest callbacks are used to notify the app of the SDK status, including preview started, first video frame rendered, first frame of audio/video stream sent, ingest started, and ingest stopped.

  • onPushStarted: indicates that the connection to the ingest server is established.

  • onFirstFramePushed: indicates that the first frame of the audio or video stream is sent.

  • Receiving both onPushStarted and onFirstFramePushed indicates that the stream is being ingested, as shown in the sketch below.
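
A sketch following the listener pattern shown earlier in this topic; only the callbacks discussed above are overridden here, and the method signatures should be verified against the API reference.

mAlivcLivePusher.setLivePushInfoListener(new AlivcLivePushInfoListener() {
    @Override
    public void onPushStarted(AlivcLivePusher pusher) {
        // The connection to the ingest server is established.
    }

    @Override
    public void onFirstFramePushed(AlivcLivePusher pusher) {
        // The first audio/video frame has been sent; the stream is now being ingested.
    }
    // Other Override methods
    // ....
});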

Network callbacks

Network callbacks are used to notify the app of the network status and connection status. When a brief network drop or switch occurs, the SDK attempts to reconnect automatically, as long as the interruption stays within the timeout period and retry limit configured in AlivcLivePushConfig. If the reconnection succeeds, stream ingest resumes. A sketch of handling these callbacks follows the list.

  • onConnectFail: indicates that stream ingest fails. We recommend that you check whether the ingest URL is valid (for example, whether the URL contains invalid characters), whether there is an authentication issue, whether the upper limit on the number of concurrently ingested streams is exceeded, and whether the stream is in the blacklist. Make sure that the ingest URL is valid and available before you try to ingest the stream. The relevant error codes include 0x30020901 to 0x30020905 and 0x30010900 to 0x30010901.

  • onConnectionLost: indicates that the connection is lost. The SDK automatically reconnects to the network and returns onReconnectStart. If the connection is not recovered after the maximum number of reconnection attempts (config.connectRetryCount) is reached, onReconnectFail is returned.

  • onNetworkPoor: indicates that the network speed is slow. If you receive this callback, the current network may not be able to fully support your ingested stream, even though the stream is not interrupted. In this case, you can handle your own business logic, for example, notify the user of the poor network conditions.

  • onNetworkRecovery: indicates that the network is recovered.

  • onReconnectFail: indicates that the SDK fails to automatically reconnect to the network because the network is disconnected for a period that exceeds the reconnection timeout period and the maximum number of reconnection attempts specified in AlivcLivePushConfig. After the network recovers, call the mAlivcLivePusher.reconnectPushAsync method to reconnect.

  • onSendDataTimeout: indicates that a timeout occurred when sending data. We recommend that you check the current network and re-ingest the stream when the network recovers.

  • onPushURLAuthenticationOverdue: indicates that the authentication of the ingest URL expires. You must provide a new URL to the SDK.
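
A sketch of reacting to a few of the network callbacks; only a subset of AlivcLivePushNetworkListener is overridden here, the signatures should be verified against the API reference, and the connectivity check is a hypothetical app-side helper.

mAlivcLivePusher.setLivePushNetworkListener(new AlivcLivePushNetworkListener() {
    @Override
    public void onNetworkPoor(AlivcLivePusher pusher) {
        // The stream is still being ingested; notify the streamer of the poor network conditions.
    }

    @Override
    public void onConnectionLost(AlivcLivePusher pusher) {
        // The SDK starts automatic reconnection and reports onReconnectStart.
    }

    @Override
    public void onReconnectFail(AlivcLivePusher pusher) {
        // Automatic retries are exhausted; retry manually after the network recovers.
        if (isNetworkAvailable()) {          // Hypothetical connectivity check.
            pusher.reconnectPushAsync();
        }
    }
    // Other Override methods
    // ....
});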

Error callbacks

  • onSystemError: indicates a system error occurred. You must dispose the engine and try again.

  • onSDKError: indicates that an SDK error occurred. Handle it based on the error code (a handling sketch follows this list):

    • If the error code is 805438211, the device performance is poor and the frame rate for encoding and rendering is low. You need to prompt the streamer and stop time-consuming business logic, such as advanced retouching and animation, at the application layer.

    • Pay special attention to the callbacks related to microphone and camera permissions. Error code 268455940 indicates that the app requires microphone permission, and error code 268455939 indicates that the app requires camera permission.

    • For other error codes, no additional operations are required. All error codes are recorded in logs.
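
A handling sketch based on the error codes above; the exact listener method signatures and the error object should be verified against the API reference, and the app-side helpers are hypothetical.

mAlivcLivePusher.setLivePushErrorListener(new AlivcLivePushErrorListener() {
    @Override
    public void onSystemError(AlivcLivePusher pusher, AlivcLivePushError error) {
        // Unrecoverable system error: dispose the engine and start over.
        pusher.destroy();
    }

    @Override
    public void onSDKError(AlivcLivePusher pusher, AlivcLivePushError error) {
        switch (error.getCode()) {
            case 805438211:  // Low encoding/rendering frame rate on a weak device.
                disableHeavyEffects();           // Hypothetical: stop advanced retouching, animations, and so on.
                break;
            case 268455940:  // Microphone permission is missing.
            case 268455939:  // Camera permission is missing.
                requestAudioVideoPermissions();  // Hypothetical permission request.
                break;
            default:
                // Other error codes are recorded in logs; no extra handling is required.
                break;
        }
    }
});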

Background music callbacks

  • onOpenFailed: indicates that the background music fails to start playback. Check whether the music file is valid and whether its path is correctly specified. Call the startBGMAsync method to try again.

  • onDownloadTimeout: indicates a timeout during the playback of the background music. This usually occurs when the background music comes from a URL. In this case, check the network status and call the startBGMAsync method to play the music again.

Common methods and interfaces

Common methods

/* In custom mode, you can adjust the minimum and target bitrates in real time. */
mAlivcLivePusher.setTargetVideoBitrate(800);
mAlivcLivePusher.setMinVideoBitrate(400);
/* Query whether the camera supports autofocus. */
mAlivcLivePusher.isCameraSupportAutoFocus();
/* Query whether the camera supports flash. */
mAlivcLivePusher.isCameraSupportFlash();
/* Query whether the stream is being ingested. */
mAlivcLivePusher.isPushing(); 
/* Obtain the ingest URL. */
mAlivcLivePusher.getPushUrl();
/* Query the stream ingest performance debugging information. For more information about the parameters of stream ingest performance, see the API references or comments in the code. */
mAlivcLivePusher.getLivePushStatsInfo();
/* Obtain the SDK version number. */
mAlivcLivePusher.getSDKVersion();
/* Specify the log level to filter debugging information. */
mAlivcLivePusher.setLogLevel(AlivcLivePushLogLevelAll);
/* Query the status of Push SDK for Android. */
mAlivcLivePusher.getCurrentStatus();
/* Query the last error code. If no error occurs, ALIVC_COMMON_RETURN_SUCCESS is returned. */
mAlivcLivePusher.getLastError();

APIs

Class                        | Description
AlivcLivePushConfig          | Stream ingest configurations.
AlivcLivePusher              | Stream ingest features.
AlivcLivePushErrorListener   | Callback for system and SDK errors.
AlivcLivePushNetworkListener | Callback for network status.
AlivcLivePushInfoListener    | Callback for streaming statistics.
AlivcLivePushBGMListener     | Callback for background music events.
AlivcLivePushCustomFilter    | Callback for custom filters.
AlivcLivePushCustomDetect    | Callback for custom facial recognition.
AlivcSnapshotListener        | Callback that returns a captured snapshot.

Integrate the retouching feature

To use the retouching feature in Push SDK for Android, import a retouching library and configure callbacks.

Note

You must obtain a license for Queen SDK. For more information, see Obtain a license for Queen SDK.

  1. Import the retouching library and panel by using Maven. Add the following code to the build.gradle file of your project. For the latest version of Queen SDK, see Download Queen SDK.

    implementation "com.aliyun.maliang.android:queen:2.5.0-official-full"
    implementation("com.aliyun.maliang.android:queen_menu:2.5.0-official-full") {
        exclude group: 'com.aliyun.maliang.android', module: 'queen'
    }

    You can also integrate the LiveBeauty module provided in the demo:

    File or folder | Description
    live_beauty    | The abstract class for retouching.
    queen_beauty   | The user interface (UI) widgets for retouching.

  2. Obtain the LiveBeauty plug-in library.

    1. Use the clone command to download the plug-in library code from LiveBeauty to your local device.

      git clone https://github.com/MediaBox-Demos/amdemos-android-live.git
    2. Open the command line in the root directory of your Android Studio project. Run the following command, and then choose File > New > Import Module to import the LiveBeauty module to your Android project.

      git submodule add https://github.com/MediaBox-Demos/amdemos-android-live.git ***/***/***

      Where ***/***/*** specifies the path to save the LiveBeauty module.

    3. Add the path of the module to the settings.gradle file of your project:

      include ':app', ':LiveBeauty', ':LiveBeauty:live_queenbeauty'

      In the sample code, "app" is the main module.

    4. Add the dependency on the LiveBeauty module to the build.gradle file of the main module of your project:

      dependencies {
          implementation project(':LiveBeauty')
          implementation project(':LiveBeauty:live_queenbeauty')
      }
    5. Click File > Sync Project with Gradle Files and wait for Gradle to complete the synchronization. Then, you can use the LiveBeauty module.

  3. Configure the UI module of the retouching plug-in.

    1. Add the QueenBeautyMenu widget to the layout XML file of your project. Example:

      <com.aliyunsdk.queen.menu.QueenBeautyMenu
          android:id="@+id/beauty_beauty_menuPanel"
          android:layout_width="match_parent"
          android:layout_height="wrap_content"
          android:layout_alignParentBottom="true"
          android:layout_centerHorizontal="true" />
    2. Initialize QueenBeautyMenu in the Activity. Example:

      // Initialize the retouching panel.
      QueenMenuPanel beautyMenuPanel = QueenBeautyMenu.getPanel(context);
      beautyMenuPanel.onHideMenu(); 
      beautyMenuPanel.onHideValidFeatures(); 
      beautyMenuPanel.onHideCopyright(); 
      
      // Add the panel to the layout.
      QueenBeautyMenu beautyBeautyContainerView = findViewById(R.id.beauty_beauty_menuPanel);
      beautyBeautyContainerView.addView(beautyMenuPanel);
  4. Configure the callbacks for facial recognition and retouching.

    If you want to access a third-party retouching library, register the setCustomDetect and setCustomFilter callbacks.

    • The data parameter passed to the customDetectProcess method of AlivcLivePushCustomDetect is a pointer to the captured video frame data. The full parameter list is long data, int width, int height, int rotation, int format, and long extra, which gives a third-party retouching library the information it needs to identify, analyze, or process the raw frame data.

    • The inputTexture parameter passed to the customFilterProcess method of AlivcLivePushCustomFilter is the source image texture that a third-party retouching library can process. The full parameter list is int inputTexture, int textureWidth, int textureHeight, and long extra. If no custom processing is performed, return the original inputTexture; if the texture is processed, return the new texture ID.

    Sample code

    /**
     * The callback for facial recognition.
     */
    mAlivcLivePusher.setCustomDetect(new AlivcLivePushCustomDetect() {
        @Override
        public void customDetectCreate() {
    
        }
    
        @Override
        public long customDetectProcess(long dataPtr, int width, int height, int rotation, int format, long extra) {
            return 0;
        }
    
        @Override
        public void customDetectDestroy() {
    
        }
    });
    
    /**
     * The callback for retouching.
     */
     
     /**
     * Initialize BeautyManager
     */
    mAlivcLivePusher.setCustomFilter(new AlivcLivePushCustomFilter() {
        @Override
        public void customFilterCreate() {
            initBeautyManager();
        }
    
     /**
     * Process the video stream and add retouching effects
     */
        @Override
        public int customFilterProcess(int inputTexture, int textureWidth, int textureHeight, long extra) {
            if (mBeautyManager == null) {
                return inputTexture;
            }
    
            return mBeautyManager.onTextureInput(inputTexture, textureWidth, textureHeight);
        }
    
        @Override
        public void customFilterDestroy() {
            destroyBeautyManager();
        }
    });

Usage notes

Take note of the following items when you use Push SDK for Android:


Obfuscation rules

Check the obfuscation configurations. Make sure that SDK classes are not obfuscated.

-keep class com.alivc.** { *;}

Method call

  • You can call both synchronous and asynchronous methods. However, we recommend asynchronous methods because synchronous methods consume the resources of the main thread.

  • Push SDK for Android throws exceptions when you fail to call required methods or call methods in an invalid sequence. Add a try-catch statement to prevent unexpected crashes. A minimal example is shown after this table.

  • The following diagram shows how to call methods in proper sequence:

    image
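
A minimal example of guarding a call that may be made in an invalid state, for example calling startPush before preview has started. The exact exception types thrown by the SDK are not listed here, so a generic catch is used; check the API reference for the specific exceptions.

try {
    mAlivcLivePusher.startPush(mPushUrl);
} catch (Exception e) {
    // Log and surface the problem instead of letting the app crash.
    Log.e(TAG, "startPush failed: " + e.getMessage());
}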

FAQ

How do I troubleshoot stream ingest failures?

You can use the troubleshooting tool to check whether the ingest URL is valid.

How do I obtain information about ingested streams?

Go to Streams and view ingested audio and video streams in Active Streams.

How do I play a stream?

After you start stream ingest, you can use a player (such as ApsaraVideo Player, FFplay, or VLC) to test stream pulling. To obtain playback URLs, see Generate ingest and streaming URLs.