This topic describes how to use Push SDK for Android and the classes and methods in the SDK. This topic also provides examples on how to use Push SDK for Android to implement specific features.

Features of Push SDK for Android

  • Supports stream ingest over Real-Time Messaging Protocol (RTMP).
  • Uses H.264 for video encoding and Advanced Audio Coding (AAC) for audio encoding.
  • Supports custom configurations for features such as bitrate control, resolution, and display mode.
  • Supports a variety of camera operations.
  • Supports real-time retouching and allows you to customize retouching effects.
  • Supports using animated stickers as animated watermarks and allows you to add and remove animated watermarks.
  • Supports live stream recording.
  • Supports external audio and video input in different formats such as YUV and pulse-code modulation (PCM).
  • Supports mixing streams.
  • Supports the ingest of audio-only and video-only streams and stream ingest in the background.
  • Supports background music and allows you to manage background music.
  • Supports capturing snapshots from streams.
  • Supports automatic reconnection and exception handling.

Classes

Class Description
AlivcLivePushConfig The configuration class that is used to initialize the stream ingest configurations.
AlivcLivePusher The core class of Push SDK.
AlivcLivePusherErrorListener The callbacks that are used when errors occur.
AlivcLivePusherNetworkListener The callbacks that are used to manage network services.
AlivcLivePusherInfoListener The callbacks that are used for notifications and status detection.
AlivcLivePusherBGMListener The callbacks that are used for playing background music.
AlivcLivePushCustomFilter The callbacks that are used for custom filter services.
AlivcLivePushCustomDetect The callbacks that are used for custom face detection.
AlivcSnapshotListener The callbacks that are used for capturing snapshots.

Set the stream ingest parameters

To use Push SDK for Android, you must set the stream ingest parameters and create a preview view. After that, you can start to ingest streams. You can set stream ingest parameters in the AlivcLivePushConfig class. Each parameter has a default value.

For information about the default values and valid values, see Push SDK for Android V4.2.1 Reference or the comments in the code.

You can change the values of parameters as needed. To modify these parameters in real time, see the parameters and methods of the AlivcLivePusher class.

  1. Set basic stream ingest configurations. The following code provides an example:
    AlivcLivePushConfig mAlivcLivePushConfig = new AlivcLivePushConfig(); // Initialize the stream ingest configurations.
    mAlivcLivePushConfig.setResolution(AlivcResolutionEnum.RESOLUTION_540P); // Set the resolution to 540p. The maximum resolution is 720p.
    mAlivcLivePushConfig.setFps(AlivcFpsEnum.FPS_20); // We recommend that you set the frame rate to 20 frames per second (FPS).
    mAlivcLivePushConfig.setEnableBitrateControl(true); // Enable adaptive bitrate streaming. The default value is true.
    mAlivcLivePushConfig.setPreviewOrientation(AlivcPreviewOrientationEnum.ORIENTATION_PORTRAIT); // By default, the preview is in portrait mode. You can change the mode to landscape left or landscape right.
    mAlivcLivePushConfig.setAudioProfile(AlivcAudioAACProfileEnum.AAC_LC); // Set the audio encoding format.
    Note:
    • All these parameters have default values. We recommend that you use the default values.
    • Considering the performance of most mobile phones and network bandwidth requirements, we recommend that you set the resolution to 540p. Most mainstream live streaming apps use 540p.
    • If adaptive bitrate streaming is disabled, the bitrate is fixed at the initial value and is not automatically adjusted between the maximum bitrate and the minimum bitrate. In unstable network conditions, a fixed bitrate may cause stuttering. Proceed with caution if you disable this feature.
  2. Specify a bitrate control mode. Push SDK for Android provides three bitrate control modes for you to ingest streams. You can specify a mode by setting the AlivcQualityModeEnum parameter.
    • QM_RESOLUTION_FIRST: quality first. Push SDK for Android sets bitrate parameters to ensure the quality of video streams first.
    • QM_FLUENCY_FIRST: smoothness first. Push SDK for Android sets bitrate parameters to ensure the smoothness of video streams first.
    • QM_CUSTOM: custom mode. Push SDK for Android sets bitrate parameters based on your custom settings.
    The following code provides examples:
    • The following code provides an example on how to configure the quality-first or smoothness-first mode:
      mAlivcLivePushConfig.setQualityMode(AlivcQualityModeEnum.QM_RESOLUTION_FIRST); // In this example, the quality-first mode is used.
      If you use the quality-first or smoothness-first mode, you do not need to set the initialVideoBitrate, minVideoBitrate, and targetVideoBitrate parameters. Push SDK for Android automatically ensures the quality or smoothness of video streams when the network is unstable.
    • The following code provides an example on how to configure the custom mode:
      mAlivcLivePushConfig.setQualityMode(AlivcQualityModeEnum.QM_CUSTOM);
      mAlivcLivePushConfig.setTargetVideoBitrate(1000); // The maximum bitrate is 1,000 Kbit/s.
      mAlivcLivePushConfig.setMinVideoBitrate(300); // The minimum bitrate is 300 Kbit/s.
      mAlivcLivePushConfig.setInitialVideoBitrate(800); // The initial bitrate is 800 Kbit/s.
      If you use the custom mode, you must set the initialVideoBitrate, minVideoBitrate, and targetVideoBitrate parameters.
      • initialVideoBitrate: the initial bitrate when the live streaming starts.
      • minVideoBitrate: In poor network conditions, the bitrate is gradually reduced to the minimum value to avoid stuttering.
      • targetVideoBitrate: In good network conditions, the bitrate is gradually increased to the maximum value to improve the quality of the video stream.
      We recommend that you use the following parameter values for the custom mode:

      Quality first (unit: Kbit/s)

      Resolution initialVideoBitrate minVideoBitrate targetVideoBitrate
      360p 600 300 1000
      480p 800 300 1200
      540p 1000 600 1400
      720p 1500 600 2000

      Smoothness first (unit: Kbit/s)

      Resolution initialVideoBitrate minVideoBitrate targetVideoBitrate
      360p 400 200 600
      480p 600 300 800
      540p 800 300 1000
      720p 1000 300 1200
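      For example, the following snippet applies the recommended quality-first values for a 540p stream in custom mode. The values are taken from the preceding table; only their combination into one snippet is illustrative.
      mAlivcLivePushConfig.setQualityMode(AlivcQualityModeEnum.QM_CUSTOM); // Use the custom mode.
      mAlivcLivePushConfig.setResolution(AlivcResolutionEnum.RESOLUTION_540P); // 540p resolution.
      mAlivcLivePushConfig.setInitialVideoBitrate(1000); // The initial bitrate, in Kbit/s.
      mAlivcLivePushConfig.setMinVideoBitrate(600); // The minimum bitrate, in Kbit/s.
      mAlivcLivePushConfig.setTargetVideoBitrate(1400); // The target (maximum) bitrate, in Kbit/s.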
  3. Enable adaptive resolution streaming. After adaptive resolution streaming is enabled, the resolution is automatically reduced to ensure the smoothness and quality of video streams in poor network conditions. The adaptive resolution streaming feature is not supported by all players. If you need to use this feature, we recommend that you use ApsaraVideo Player. The following code provides an example:
    mAlivcLivePushConfig.setEnableAutoResolution(true); // Enable adaptive resolution streaming. The default value is false.
    Adaptive resolution streaming is supported only if the AlivcQualityModeEnum parameter is set to the quality-first or smoothness-first mode.
  4. Use the retouching feature.
    1. To integrate the retouching feature, you must import the retouching resources by using Android Archive (AAR) files. Add the following code to the build.gradle file in the root directory:
      implementation 'com.aliyun.maliang.android:queen:1.3.0.1_beta-aliyunlive-pro'
      The following table describes the modules provided in the demo that you can integrate.
      File or folder Description
      beauty The abstract class of retouching.
      beautyui The UI widgets of retouching.
      queenbeauty The retouching components of Queen SDK.
      Imageutil The public utility class that is used to manage images for retouching.
    2. In addition, you must set two callbacks in the code:
      /**
      * The callbacks that are used for facial recognition.
      **/
      mAlivcLivePusher.setCustomDetect(new AlivcLivePushCustomDetect() {
          @Override
          public void customDetectCreate() {
              Log.d(TAG, "customDetectCreate start");
              initBeautyManager();
              Log.d(TAG, "customDetectCreate end");
          }
      
          @Override
          public long customDetectProcess(long data, int width, int height, int rotation, int format, long extra) {
              Log.d(TAG, "customDetectProcess start: data ptr:" + data + ",width:" + width + ",height:" + height + "," + format + "," + rotation);
      
              if (mBeautyManager != null) {
                  mBeautyManager.onDrawFrame(data, BeautyImageFormat.kNV21, width, height, 0, mCameraId);
              }
              Log.d(TAG, "customDetectProcess end");
      
              return 0;
          }
      
          @Override
          public void customDetectDestroy() {
              Log.d(TAG, "customDetectDestroy start");
              destroyBeautyManager();
              Log.d(TAG, "customDetectDestroy end");
          }
      });
      
      /**
      * The callbacks that are used for retouching.
      **/
      
      mAlivcLivePusher.setCustomFilter(new AlivcLivePushCustomFilter() {
          @Override
          public void customFilterCreate() {
              Log.d(TAG, "customFilterCreate start");
      
              initBeautyManager();
      
              Log.d(TAG, "customFilterCreate end");
          }
      
          @Override
          public void customFilterUpdateParam(float fSkinSmooth, float fWhiten, float fWholeFacePink, float fThinFaceHorizontal, float fCheekPink, float fShortenFaceVertical, float fBigEye) {
      
          }
      
          @Override
          public void customFilterSwitch(boolean on) {
      
          }
      
          @Override
          public int customFilterProcess(int inputTexture, int textureWidth, int textureHeight, long extra) {
              Log.d(TAG, "customFilterProcess start: textureId" + inputTexture + ",width:" + textureWidth + ",height:" + textureHeight);
      
              int ret = mBeautyManager != null ? mBeautyManager.onTextureInput(inputTexture, textureWidth, textureHeight) : inputTexture;
      
              Log.d(TAG, "customFilterProcess end, textureId:" + ret);
              Log.d(TAG, "keria1: " + mCameraId);
      
              return ret;
          }
      
          @Override
          public void customFilterDestroy() {
              destroyBeautyManager();
          }
      });
      To integrate a third-party retouching library, implement the setCustomDetect and setCustomFilter callbacks as shown in the preceding code.
      • The data parameter of the customDetectProcess(long data, int width, int height, int rotation, int format, long extra) callback is a pointer to the raw frame data. The third-party retouching library can run face detection on or process the data to which the pointer points.
      • The inputTexture parameter of the customFilterProcess(int inputTexture, int textureWidth, int textureHeight, long extra) callback is the ID of the image texture that the third-party retouching library can process. If the library produces a processed texture, return the ID of the processed texture. Otherwise, return the ID of the original texture.
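      The initBeautyManager and destroyBeautyManager helpers in the preceding callbacks are not part of Push SDK for Android. The following is a minimal sketch of what they might look like, assuming a hypothetical BeautyManager wrapper around the third-party retouching library that you use. Replace the constructor and release call with the actual initialization and release APIs of your library.
      // A minimal sketch. BeautyManager is a hypothetical wrapper around a
      // third-party retouching library and is not part of Push SDK for Android.
      private BeautyManager mBeautyManager;

      private void initBeautyManager() {
          if (mBeautyManager == null) {
              // Create the retouching engine on the thread that invokes the callback.
              mBeautyManager = new BeautyManager(getApplicationContext());
          }
      }

      private void destroyBeautyManager() {
          if (mBeautyManager != null) {
              // Release the textures and native resources held by the retouching engine.
              mBeautyManager.release();
              mBeautyManager = null;
          }
      }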
    3. Specify the image for background stream ingest. Push SDK for Android supports the ingest of a static image when your app is switched to the background or the bitrate is low. This enhances the user experience. When your app is switched to the background, stream ingest is paused by default. In this case, only the specified image and audio streams are ingested. For example, you can ingest an image that reminds viewers that the streamer has temporarily left. The following code provides an example:
      mAlivcLivePushConfig.setPausePushImage("The path of the specified image for background stream ingest in the PNG format"); // Specify the image for background stream ingest.
      In addition, you can specify a static image for stream ingest in poor network conditions. After that, the specified image is ingested when the bitrate is low. This avoids stuttering. The following code provides an example:
      mAlivcLivePushConfig.setNetworkPoorPushImage("The path of the image that is ingested for poor network conditions"); // Specify the image for stream ingest in poor network conditions.
    4. Configure watermarks. Push SDK for Android allows you to add one or more watermarks in the PNG format. The following code provides an example:
      mAlivcLivePushConfig.addWaterMark(waterPath,0.1,0.2,0.3); // Add a watermark.
      Note:
      • The values of the x, y, and width parameters are relative. For example, a value of 0.1 for the x parameter indicates that the left edge of the watermark is displayed at the 10% position on the x-axis of the streaming image. For a stream ingest resolution of 540 × 960, the actual x coordinate of the watermark is 540 × 0.1 = 54 pixels.
      • The height of the watermark is calculated based on the input width in a proportional aspect ratio.
      • If you want to add a text watermark, you can transform the text into an image and call the addWaterMark method to add the image as a watermark.
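      The following is a minimal sketch of the text-to-image approach that is described in the preceding note. The watermark text, size, and output path are illustrative, and the imports of the Android graphics and I/O classes are omitted as in the other snippets in this topic.
      // Render the watermark text into a PNG file.
      Bitmap bitmap = Bitmap.createBitmap(256, 64, Bitmap.Config.ARGB_8888);
      Canvas canvas = new Canvas(bitmap);
      Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
      paint.setColor(Color.WHITE);
      paint.setTextSize(32);
      canvas.drawText("My live room", 0, 40, paint); // The watermark text is illustrative.
      File watermarkFile = new File(getCacheDir(), "text_watermark.png");
      try (FileOutputStream out = new FileOutputStream(watermarkFile)) {
          bitmap.compress(Bitmap.CompressFormat.PNG, 100, out);
      } catch (IOException e) {
          e.printStackTrace();
      }
      // Add the generated image as a watermark, as described above.
      mAlivcLivePushConfig.addWaterMark(watermarkFile.getAbsolutePath(), 0.1, 0.2, 0.3);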
    5. Specify a preview mode.
      mAlivcLivePushConfig.setPreviewDisplayMode(AlivcPreviewDisplayMode.ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FIT);
      Note:
      • AlivcPreviewDisplayMode.ALIVC_LIVE_PUSHER_PREVIEW_SCALE_FILL: In this mode, the video fills the entire preview view. If the aspect ratio of the video does not match the view, the preview image is deformed.
      • AlivcPreviewDisplayMode.ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FIT: In this mode, the initial aspect ratio of the video is used during the preview. If the aspect ratio of the video does not match the view, black edges appear on the preview view.
      • AlivcPreviewDisplayMode.ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FILL: In this mode, the aspect ratio of the video is changed to fit the preview view. If the aspect ratio of the video does not match the view, the video is cropped to fit the preview view.
      The preceding three preview modes do not affect stream ingest.

Use AlivcLivePusher

AlivcLivePusher is the core class of Push SDK for Android. This class provides parameters for video preview, stream ingest callback, and stream ingest control. You can also use this class to modify parameters during stream ingest.
Note
  • Execute the try-catch statement to manage exceptions if you use the methods in the class.
  • You must follow the specified sequence to call the methods. An incorrect sequence may cause an error.
This section describes how to use the key methods for stream ingest.
  1. Initialize Push SDK for Android. Use the init method to initialize the configured stream ingest parameters. The following code provides an example:
    AlivcLivePusher mAlivcLivePusher = new AlivcLivePusher();
    mAlivcLivePusher.init(mContext, mAlivcLivePushConfig);
    AlivcLivePusher does not support multiple instances. Therefore, you must call the destroy method once for each call to the init method.
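    A common pattern is to pair the init and destroy calls with the lifecycle of the hosting Activity. The following is a minimal sketch under that assumption; the Activity name is illustrative.
    public class LivePushActivity extends Activity {
        private AlivcLivePusher mAlivcLivePusher;
        private AlivcLivePushConfig mAlivcLivePushConfig;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            mAlivcLivePushConfig = new AlivcLivePushConfig();
            mAlivcLivePusher = new AlivcLivePusher();
            try {
                mAlivcLivePusher.init(getApplicationContext(), mAlivcLivePushConfig); // Initialize the single instance.
            } catch (Exception e) {
                e.printStackTrace(); // Handle initialization errors, for example invalid parameters.
            }
        }

        @Override
        protected void onDestroy() {
            if (mAlivcLivePusher != null) {
                mAlivcLivePusher.destroy(); // Release the instance before another init call.
                mAlivcLivePusher = null;
            }
            super.onDestroy();
        }
    }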
  2. Register stream ingest callbacks. Stream ingest callbacks are grouped into Info, Error, and Network callbacks.
    • Info: the callbacks that are used for notifications and status detection.
    • Error: the callbacks that are used when errors occur.
    • Network: the callbacks that are used to manage network services.
    When an event occurs, the corresponding callback is triggered to notify you. The following code provides an example:
    /**
      * Configure the listener for stream ingest errors.
      *
      * @param errorListener The listener for errors.
      */
    mAlivcLivePusher.setLivePushErrorListener(new AlivcLivePushErrorListener() {
      @Override
      public void onSystemError(AlivcLivePusher livePusher, AlivcLivePushError error) {
          if (error != null) {
              // Add UI notifications or custom error solutions.
          }
      }
      @Override
      public void onSDKError(AlivcLivePusher livePusher, AlivcLivePushError error) {
          if (error != null) {
              // Add UI notifications or custom error solutions.
          }
      }
    });
    /**
    * Configure the listener for the stream ingest status.
    *
    * @param infoListener The listener for notifications.
    */
    mAlivcLivePusher.setLivePushInfoListener(new AlivcLivePushInfoListener() {
      @Override
      public void onPreviewStarted(AlivcLivePusher pusher) {
          // Indicates the start of preview.
      }
      @Override
      public void onPreviewStoped(AlivcLivePusher pusher) {
          // Indicates the end of preview.
      }
      @Override
      public void onPushStarted(AlivcLivePusher pusher) {
          // Indicates that stream ingest starts.
      }
      @Override
      public void onPushPauesed(AlivcLivePusher pusher) {
          // Indicates that stream ingest is paused.
      }
      @Override
      public void onPushResumed(AlivcLivePusher pusher) {
          // Indicates that stream ingest is resumed.
      }
      @Override
      public void onPushStoped(AlivcLivePusher pusher) {
          // Indicates that stream ingest ends.
      }
      @Override
      public void onPushRestarted(AlivcLivePusher pusher) {
          // Indicates that stream ingest is restarted.
      }
      @Override
      public void onFirstFramePreviewed(AlivcLivePusher pusher) {
          // Indicates that the first frame appears.
      }
      @Override
      public void onDropFrame(AlivcLivePusher pusher, int countBef, int countAft) {
          // Indicates frame loss.
      }
      @Override
      public void onAdjustBitRate(AlivcLivePusher pusher, int curBr, int targetBr) {
          // Indicates that the bitrate is adjusted.
      }
      @Override
      public void onAdjustFps(AlivcLivePusher pusher, int curFps, int targetFps) {
          // Indicates that the frame rate is adjusted.
      }
    });
    /**
    * Configure the listener for the network status.
    *
    * @param infoListener The listener for notifications.
    */
    mAlivcLivePusher.setLivePushNetworkListener(new AlivcLivePushNetworkListener() {
      @Override
      public void onNetworkPoor(AlivcLivePusher pusher) {
          // Indicates poor network conditions.
      }
      @Override
      public void onNetworkRecovery(AlivcLivePusher pusher) {
          // Indicates that the network is recovered.
      }
      @Override
      public void onReconnectStart(AlivcLivePusher pusher) {
          // Indicates that the reconnection starts.
      }
      @Override
      public void onReconnectFail(AlivcLivePusher pusher) {
          // Indicates that the reconnection failed.
      }
      @Override
      public void onReconnectSucceed(AlivcLivePusher pusher) {
          // Indicates that the reconnection is successful.
      }
      @Override
      public void onSendDataTimeout(AlivcLivePusher pusher) {
          // Indicates that data transmission times out.
      }
      @Override
      public void onConnectFail(AlivcLivePusher pusher) {
          // Indicates that the connection failed.
      }
    });
    /**
    * Configure the listener for the status of the background music.
    *
    * @param pushBGMListener The listener for notifications.
    */
    mAlivcLivePusher.setLivePushBGMListener(new AlivcLivePushBGMListener() {
      @Override
      public void onStarted() {
          // Indicates that the playback of background music is started.
      }
      @Override
      public void onStoped() {
          // Indicates that the playback of background music is stopped.
      }
      @Override
      public void onPaused() {
          // Indicates that the playback of background music is paused.
      }
      @Override
      public void onResumed() {
          // Indicates that the playback of background music is resumed.
      }
      /**
       * The callback for the playback progress.
       *
       * @param progress The current playback progress.
       * @param duration The total duration of the background music.
       */
      @Override
      public void onProgress(final long progress, final long duration) {
          // Indicates the playback progress.
      }
      @Override
      public void onCompleted() {
          // Indicates that the playback ends.
      }
      @Override
      public void onDownloadTimeout() {
          // Indicates that the player times out. The player is reconnected and seeks to the previous playback position.
      }
      @Override
      public void onOpenFailed() {
          // Indicates an invalid stream. The stream is inaccessible.
      }
    });
  3. Start the preview. You can start the preview after you initialize the livePusher object. Use the mSurfaceView parameter for camera previews. The following code provides an example:
    mAlivcLivePusher.startPreview(mSurfaceView); // Start the preview. You can also use the asynchronous method startPreviewAsync to start the preview.
  4. Start stream ingest. You can start stream ingest only after the preview succeeds. Therefore, add the following code to the onPreviewStarted callback:
    mAlivcLivePusher.startPush(mPushUrl);
    Note:
    • Push SDK for Android provides the asynchronous method startPushAsync to start stream ingest.
    • Push SDK for Android supports the URLs of the streams that are ingested over RTMP. For more information about how to obtain ingest URLs, see Ingest and streaming URLs.
    • Start stream ingest with a valid ingest URL. Then, use a player, such as ApsaraVideo Player, FFplay, or VLC, to test stream pulling. For more information about how to obtain source URLs, see Ingest and streaming URLs.
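    For example, you can place the call in the onPreviewStarted callback that you register in step 2. The following is a minimal sketch, assuming that mPushUrl holds your ingest URL.
    @Override
    public void onPreviewStarted(AlivcLivePusher pusher) {
        // The preview has started. Start stream ingest here.
        try {
            mAlivcLivePusher.startPush(mPushUrl); // Or startPushAsync(mPushUrl) for the asynchronous variant.
        } catch (Exception e) {
            e.printStackTrace(); // Handle stream ingest errors, for example an invalid ingest URL.
        }
    }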
  5. Set other stream ingest configurations. Push SDK for Android allows you to control stream ingest. For example, you can start, stop, restart, pause, and resume stream ingest, stop preview, and destroy stream ingest objects. You can add buttons as needed to perform these operations. The following code provides an example:
    /* You can pause the ongoing stream ingest. If you pause the ongoing stream ingest, the system pauses the video preview and the video stream ingest at the last frame, and continues the ingest of audio-only streams. */
    mAlivcLivePusher.pause();
    /* You can resume the paused stream ingest. Then, the system resumes the audio and video preview and the stream ingest. */
    mAlivcLivePusher.resume();
    /* You can stop the ongoing stream ingest. */
    mAlivcLivePusher.stopPush();
    /* You can stop the ongoing preview. However, this operation does not take effect for the ongoing stream ingest. When the preview is stopped, the preview view is frozen at the last frame. */
    mAlivcLivePusher.stopPreview();
    /* Restart stream ingest. You can call this method in the streaming state or after an error callback is received. If an error occurs, you can use only this method or reconnectPushAsync to restart stream ingest, or use the destroy method to destroy the stream ingest object. This method rebuilds all AlivcLivePusher resources that are required for operations such as the preview and stream ingest. */
    mAlivcLivePusher.restartPush();
    /* Reconnect and resume stream ingest over RTMP. You can call this method in the streaming state or after an error callback is received from the network listener. If an error occurs, you can use only this method or restartPush to restart stream ingest, or use the destroy method to destroy the stream ingest object. */
    mAlivcLivePusher.reconnectPushAsync();
    /* After the stream ingest object is destroyed, the stream ingest and the preview are stopped, and preview views are deleted. All resources related to AlivcLivePusher are destroyed. */
    mAlivcLivePusher.destroy();
  6. Manage background music. Push SDK for Android allows you to manage background music. For example, you can configure the playback of background music, and the audio mixing, noise reduction, in-ear monitoring, and muting features. You can call background music methods only after the preview starts. The following code provides an example:
    /* Start the playback of background music. */
    mAlivcLivePusher.startBGMAsync(mPath);
    /* Stop the playback of background music. To change the background music, call the method that is used to start the playback of background music. You do not need to stop the playback of the current background music. */
    mAlivcLivePusher.stopBGMAsync();
    /* Pause the playback of the background music. You can call this method only after the background music starts. */
    mAlivcLivePusher.pauseBGM();
    /* Resume the playback of background music. You can call this method only after the background music is paused. */
    mAlivcLivePusher.resumeBGM();
    /* Enable looping.*/
    mAlivcLivePusher.setBGMLoop(true);
    /* Configure noise reduction. When noise reduction is enabled, the system filters out non-vocal parts from collected audio. This feature may slightly reduce the volume of the human voice. Therefore, we recommend that you allow your users to determine whether to enable this feature. This feature is disabled by default.*/
    mAlivcLivePusher.setAudioDenoise(true);
    /* Configure in-ear monitoring. In-ear monitoring applies to the KTV scenario. When in-ear monitoring is enabled, you can hear your voice on your earphones during streaming. When in-ear monitoring is disabled, you cannot hear your voice on your earphones during streaming. This feature does not take effect if no earphones are detected. */
    mAlivcLivePusher.setBGMEarsBack(true);
    /* Configure audio mixing by adjusting the volumes of the background music and the human voice. */
    mAlivcLivePusher.setBGMVolume(50); // Set the volume of the background music.
    mAlivcLivePusher.setCaptureVolume(50); // Set the volume of the human voice.
    /* Configure muting. If you enable this feature, the background music and the human voice are muted. To separately mute background music or the human voice, call the method that is used to configure audio mixing. */
    mAlivcLivePusher.setMute(true);
  7. Perform camera-related operations. You can perform camera-related operations, such as switching between cameras and configuring the flash, the focal length, zooming, and the mirroring mode, only after you start the preview and while stream ingest is in the streaming, paused, or reconnecting state. If you do not start the preview, the following methods are invalid. The following code provides an example:
    /* Switch between the front and the rear cameras.*/
    mAlivcLivePusher.switchCamera();
    /* Enable or disable the flash. You cannot enable the flash for the front camera.*/
    mAlivcLivePusher.setFlash(true); 
    /* Set the focal length to zoom in and out of images. The camera can zoom from 0 to the value that is returned from getMaxZoom. */
    mAlivcLivePusher.setZoom(5);
    /* Configure manual focus. To configure manual focus, you must set two parameters: point and autoFocus. The point parameter specifies the coordinates of the focus point. The autoFocus parameter specifies whether to enable autofocus and takes effect only for this call. Subsequent focus behavior still follows the global autofocus setting. */
    mAlivcLivePusher.focusCameraAtAdjustedPoint(x, y, true);
    /* Configure autofocus.*/
    mAlivcLivePusher.setAutoFocus(true);
    /* Configure the mirroring mode. Mirroring-related methods are setPushMirror and setPreviewMirror. The setPushMirror method configures the mirroring mode for the ingested stream that viewers watch. The setPreviewMirror method configures the mirroring mode for the local preview view. The two settings are independent of each other. */
    mAlivcLivePusher.setPreviewMirror(false);
    mAlivcLivePusher.setPushMirror(false);
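    For example, when you adjust the focal length, you can clamp the zoom level to the range that the camera supports, as described above. The following is a minimal sketch that assumes getMaxZoom returns an integer zoom level.
    int maxZoom = mAlivcLivePusher.getMaxZoom(); // The maximum zoom level of the current camera.
    int requestedZoom = 5; // The zoom level that you want to apply.
    mAlivcLivePusher.setZoom(Math.min(requestedZoom, maxZoom)); // Never exceed the supported range.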
  8. Ingest external audio and video sources. Push SDK for Android allows you to import external audio and video sources for stream ingest. For example, you can ingest an audio or video file.
    • Configure the input of external audio and video sources in stream ingest settings. The following code provides an example:
      /**
      * AlivcImageFormat: indicates the format of the input video images.
      * AlivcSoundFormat: indicates the format of the input audio frames.
      * Other parameters: To specify the resolution, the audio sample rate, and the number of channels,
      * call setResolution, setAudioSampleRate, and setAudioChannels in AlivcLivePushConfig.
      * Note: To import custom video and audio streams, use methods such as inputStreamVideoData and inputStreamAudioData. 
      */
      mAlivcLivePushConfig.setExternMainStream(true, AlivcImageFormat.IMAGE_FORMAT_YUVNV12, AlivcSoundFormat.SOUND_FORMAT_S16);
    • The following code provides an example on how to import external video data:
      /**
      * Import custom video streams.
      *
      * @param data The byte array of the video image.
      * @param width The width of the video image.
      * @param height The height of the video image.
      * @param stride The stride of the video image.
      * @param size The size of the video image.
      * @param pts The presentation timestamps (PTS) of the video image, in microseconds.
      * @param rotation The rotation angle of the video image.
      * This method does not control the time sequence. You must control the time sequence of input video frames by yourself.
      * Note: Set setExternMainStream(true,***) in AlivcLivePushConfig before you call this method.
      */
      mAlivcLivePusher.inputStreamVideoData(byte[] data, int width, int height, int stride, int size, long pts, int rotation);
      /**
      * Import custom video streams.
      *
      * @param dataptr The pointer of the native memory for the video image.
      * @param width The width of the video image.
      * @param height The height of the video image.
      * @param stride The stride of the video image.
      * @param size The size of the video image.
      * @param pts The PTS of the video image, in microseconds.
      * @param rotation The rotation angle of the video image.
      * This method does not control the time sequence. You must control the time sequence of input video frames by yourself.
      * Note: Set setExternMainStream(true,***) in AlivcLivePushConfig before you call this method.
      */
      mAlivcLivePusher.inputStreamVideoPtr(long dataptr, int width, int height, int stride, int size, long pts, int rotation);
    • The following code provides an example on how to import external audio data:
      /**
      * Import custom audio data.
      * @param data The byte array of audio data.
      * @param size The size of the audio data.
      * @param sampleRate The sample rate of the audio data.
      * @param channels The number of audio channels.
      * @param pts The PTS of the audio data, in microseconds.
      * This method does not control the time sequence. You must control the time sequence of input audio frames by yourself.
      */
      mAlivcLivePusher.inputStreamAudioData(byte[] data, int size, int sampleRate, int channels, long pts);
      /**
      * Import custom audio data.
      * @param dataptr The pointer of the native memory for audio data.
      * @param size The size of the audio data.
      * @param sampleRate The sample rate of the audio data.
      * @param channels The number of audio channels.
      * @param pts The PTS of the audio data, in microseconds.
      * This method does not control the time sequence. You must control the time sequence of input audio frames by yourself.
      */
      mAlivcLivePusher.inputStreamAudioPtr(long dataPtr, int size, int sampleRate, int channels, long pts);
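      The following is a minimal sketch of how external frames might be fed with increasing PTS values. It assumes NV12 video frames and 16-bit PCM (S16) audio frames, which match the formats set in setExternMainStream in the preceding example. The nextVideoFrame and nextAudioFrame helpers, the flags, and the frame parameters are illustrative and stand for your own capture or decode pipeline. In practice, the two loops run on separate threads.
      // Feed externally produced frames with monotonically increasing PTS values, in microseconds.
      int width = 540, height = 960; // Must match the resolution that is set in AlivcLivePushConfig.
      int fps = 20; // Must match the frame rate that is set in AlivcLivePushConfig.
      int sampleRate = 44100, channels = 1; // Must match setAudioSampleRate and setAudioChannels.
      int samplesPerFrame = 1024;

      long videoPts = 0;
      while (isPushingExternalVideo) {
          byte[] nv12 = nextVideoFrame(); // width * height * 3 / 2 bytes in NV12 format.
          mAlivcLivePusher.inputStreamVideoData(nv12, width, height, width, nv12.length, videoPts, 0);
          videoPts += 1000000L / fps; // Advance the PTS by one video frame interval.
      }

      long audioPts = 0;
      while (isPushingExternalAudio) {
          byte[] pcm = nextAudioFrame(); // samplesPerFrame * channels * 2 bytes of 16-bit PCM.
          mAlivcLivePusher.inputStreamAudioData(pcm, pcm.length, sampleRate, channels, audioPts);
          audioPts += 1000000L * samplesPerFrame / sampleRate; // Advance the PTS by one audio frame duration.
      }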
  9. Add animated stickers. Push SDK for Android allows you to add animated stickers as watermarks to live streams.
    • To make an animated sticker, you can modify the materials in the demo. Create a sequence of frame images for the animated sticker, open the config.json file, and set the following parameters:
      "du": 2.04, // The duration for playing the animated sticker once.
      "n": "qizi", // The name of the animated sticker. Make sure that the name of the folder for making the animated sticker is the same as the name of the sticker. The name of the sticker contains the name followed by the sequence number, such as qizi0.
      "c": 68.0, // The number of animation frames, which is the number of images included in an animated sticker.
      "kerneframe": 51, // The keyframe. Specify an image as the keyframe. For example, the 51st frame is specified as the keyframe in the demo. Make sure that the specified frame exists.
      "frameArry": [
          {"time":0,"pic":0},
          {"time":0.03,"pic":1},
          {"time":0.06,"pic":2},
          ],
      // The parameters of the animated sticker. The preceding settings indicate that the first frame (qizi0) is displayed 0 seconds after the start, and the second frame (qizi1) is displayed 0.03 seconds after the start. Specify all frames in the animation in the same way.
      Set other fields as described by the .json file in the demo.
    • The following code provides an example on how to add an animated sticker:
      /**
      * Add an animated sticker.
      * @param path The path of the animated sticker. The path must contain config.json.
      * @param x The starting position on the x-axis. Valid values: 0 to 1.0f.
      * @param y The starting position on the y-axis. Valid values: 0 to 1.0f.
      * @param w The screen width of the animated sticker. Valid values: 0 to 1.0f.
      * @param h The screen height of the animated sticker. Valid values: 0 to 1.0f.
      * @return id The ID of the sticker. You must specify the sticker ID if you want to remove a sticker.
      */
      mAlivcLivePusher.addDynamicsAddons("Path of the sticker", 0.2f, 0.2f, 0.2f, 0.2f);
    • The following code provides an example on how to remove an animated sticker:
      mAlivcLivePusher.removeDynamicsAddons(int id);
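      For example, you can store the ID that addDynamicsAddons returns and use it to remove the sticker later. The following is a minimal sketch; the sticker path is illustrative.
      String stickerPath = "The path of the sticker folder that contains config.json";
      int stickerId = mAlivcLivePusher.addDynamicsAddons(stickerPath, 0.2f, 0.2f, 0.2f, 0.2f); // Keep the returned ID.
      // ...
      mAlivcLivePusher.removeDynamicsAddons(stickerId); // Remove the sticker by its ID when it is no longer needed.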
  10. Call other methods.
    /* In custom mode, you can adjust the target bitrate and the minimum bitrate in real time. */
    mAlivcLivePusher.setTargetVideoBitrate(800);
    mAlivcLivePusher.setMinVideoBitrate(400);
    /* Check whether the camera supports autofocus. */
    mAlivcLivePusher.isCameraSupportAutoFocus();
    /* Check whether the camera supports the flash. */
    mAlivcLivePusher.isCameraSupportFlash();
    /* Check whether stream ingest is in progress. */
    mAlivcLivePusher.isPushing();
    /* Obtain the ingest URL. */
    mAlivcLivePusher.getPushUrl();
    /* Obtain the debugging information about stream ingest performance. For information about the parameters of stream ingest performance, see API references or comments in the code. */
    mAlivcLivePusher.getLivePushStatsInfo();
    /* Obtain the version number of Push SDK for Android. */
    mAlivcLivePusher.getSDKVersion();
    /* Set a log level to filter debugging information. */
    mAlivcLivePusher.setLogLevel(AlivcLivePushLogLevelAll);
    /* Obtain the status of Push SDK for Android. */
    mAlivcLivePusher.getCurrentStatus();
    /* Obtain the last error code. If no error occurs, ALIVC_COMMON_RETURN_SUCCESS is returned. */
    mAlivcLivePusher.getLastError();

Use the screen recording feature

  • Configure the screen recording mode
    Screen recording supports the following modes:
    • Record screen with the camera disabled: When you request the screen recording permission, call the setMediaProjectionPermissionResultData method in AlivcLivePushConfig and specify the Intent object returned by MediaProjectionManager.createScreenCaptureIntent as the parameter.
    • Record screen with the camera enabled and enable camera preview on the streamer side: Viewers can view the content that is recorded with the camera.
      1. When you request the screen recording permission, call the setMediaProjectionPermissionResultData method in AlivcLivePushConfig and specify the Intent object returned by MediaProjectionManager.createScreenCaptureIntent as the parameter.
      2. Call the startCamera method and pass surfaceView to it.
    • Record screen with the camera enabled and disable camera preview on the streamer side: Viewers can still view the content that is recorded from the camera.
      1. When you request the screen recording permission, call the setMediaProjectionPermissionResultData method in AlivcLivePushConfig and specify the Intent object returned by MediaProjectionManager.createScreenCaptureIntent as the parameter.
      2. Call the startCamera method without passing surfaceView.
      3. Call the startCameraMix method to set the position and size of the camera view that viewers see.
  • Enable screen recording
    Screen recording uses MediaProjection. When you request screen recording permission, call the setMediaProjectionPermissionResultData method in AlivcLivePushConfig and specify the Intent object returned by MediaProjectionManager.createScreenCaptureIntent as the parameter. By default, the camera is disabled during screen recording. The following code provides an example on how to configure this setting in the configuration:
    mAlivcLivePushConfig.setMediaProjectionPermissionResultData(resultData)
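    The resultData parameter in the preceding line is the Intent that the system returns after the user grants the screen capture permission. The following is a minimal sketch of how it can be obtained, assuming that the request is made from an Activity; REQUEST_SCREEN_CAPTURE is an arbitrary request code.
    private static final int REQUEST_SCREEN_CAPTURE = 1001;

    private void requestScreenCapture() {
        MediaProjectionManager manager = (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        startActivityForResult(manager.createScreenCaptureIntent(), REQUEST_SCREEN_CAPTURE);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent resultData) {
        super.onActivityResult(requestCode, resultCode, resultData);
        if (requestCode == REQUEST_SCREEN_CAPTURE && resultCode == Activity.RESULT_OK) {
            mAlivcLivePushConfig.setMediaProjectionPermissionResultData(resultData); // Pass the permission result to the configuration.
        }
    }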
  • Enable camera preview
    You can call the camera preview method after you have enabled screen recording. The following code provides an example:
    mAlivcLivePusher.startCamera(surfaceView); // Enable camera preview.
    mAlivcLivePusher.stopCamera(); // Disable camera preview.
    Note:
    • We recommend that you set the aspect ratio of the surfaceView to 1:1 for the camera preview in screen recording mode. In this way, you do not need to adjust the aspect ratio of surfaceView when you rotate your screen.
    • If you do not set the aspect ratio of the surfaceView to 1:1, you must adjust the aspect ratio and then disable and re-enable the camera preview when the screen is rotated.
    • If the streamer does not require the preview, set the surfaceView parameter to null.
  • Enable stream mixing
    You can enable this feature when the streamer does not require camera preview but viewers need to see the camera view. This feature is typically used in game live streaming. The streamer can overlay the camera view on top of the recorded game content in the ingested stream. The following code provides an example:
    /**
    * @param x The starting position on the x-axis. Valid values: 0 to 1.0f.
    * @param y The starting position on the y-axis. Valid values: 0 to 1.0f.
    * @param w The width of the camera view, relative to the screen width. Valid values: 0 to 1.0f.
    * @param h The height of the camera view, relative to the screen height. Valid values: 0 to 1.0f.
    * @return
    */
    mAlivcLivePusher.startCameraMix(x, y, w, h); // Enable camera and screen stream mixing.
    mAlivcLivePusher.stopCameraMix(); // Disable camera and screen stream mixing.
  • Set screen rotation
    In screen recording mode, you can configure the screen to rotate between portrait mode and landscape mode. The following code provides an example:
    mAlivcLivePusher.setScreenOrientation(0);
    When you change the orientation of the screen, you must enable OrientationEventListener at the application layer and pass the orientation setting to this method.
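    The following is a minimal sketch of such a listener, assuming that setScreenOrientation accepts the rotation in degrees as shown above; the rounding of raw sensor angles to 0, 90, 180, and 270 degrees is illustrative.
    OrientationEventListener orientationListener = new OrientationEventListener(this) { // "this" is the Activity context.
        @Override
        public void onOrientationChanged(int orientation) {
            if (orientation == OrientationEventListener.ORIENTATION_UNKNOWN) {
                return; // The device is lying flat. Keep the current orientation.
            }
            int rotation = ((orientation + 45) / 90 * 90) % 360; // Round to the nearest quadrant.
            mAlivcLivePusher.setScreenOrientation(rotation);
        }
    };
    orientationListener.enable(); // Start receiving orientation events.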
  • Enable or disable privacy protection
    This feature allows the streamer to protect privacy during screen recording. For example, the streamer can enable this feature before entering a password and disable it afterward. The following code provides an example:
    mAlivcLivePusher.pauseScreenCapture(); // Enable privacy protection.
    mAlivcLivePusher.resumeScreenCapture(); // Disable privacy protection.
    If you set the setPausePushImage parameter in AlivcLivePushConfig, the specified image is displayed during the pause of live stream recording. If you do not set this parameter, the last frame is displayed during the pause of live stream recording.

Usage notes

  • Obfuscation rules
    Check the obfuscation configurations. Make sure that the classes of Push SDK for Android are excluded from obfuscation by adding the following rule:
    -keep class com.alivc.** { *;}
  • Method call
    • You can call both synchronous and asynchronous methods. We recommend that you use asynchronous methods where they are available, because synchronous calls consume the resources of the main thread.
    • Push SDK for Android throws exceptions when you fail to call the required methods or call methods in an invalid order. You must catch these exceptions in try-catch statements to prevent unexpected exits.
    • The following figure shows the call procedure to be followed. (Figure: call procedure)
  • Limits
    • You must configure screen rotation before stream ingest. You cannot rotate the screen during stream ingest.
    • You must disable auto screen rotation for stream ingest in landscape mode.
    • In hardware encoding mode, the output resolution must be a multiple of 16 due to encoder compatibility requirements. For example, if you set the resolution to 540p, the output resolution is 544 × 960. You must scale the screen size of the player based on the output resolution to prevent black edges.
  • Version upgrade instruction

    Update Push SDK for Android to the latest version. For more information, see Update Push SDK for Android from V4.0.2 to V4.1.0 or later.

    Notice: Integrate the latest version of ApsaraVideo Player SDK for Android when you update Push SDK for Android.