This topic describes how to use Push SDK for iOS and the classes and methods in the SDK. This topic also provides examples on how to use Push SDK for iOS to implement specific features.

Features of Push SDK for iOS

  • Supports the stream ingest over Real-Time Messaging Protocol (RTMP).
  • Adopts H.264 for video coding and Advanced Audio Coding (AAC) for audio coding.
  • Supports custom configurations for features such as bitrate control, resolution, and display mode.
  • Supports a variety of camera operations.
  • Supports real-time retouching and allows you to customize retouching effects.
  • Supports using animated stickers as animated watermarks and allows you to add and remove animated watermarks.
  • Supports live stream recording.
  • Supports external audio and video input in different formats such as YUV and pulse-code modulation (PCM).
  • Supports mixing streams.
  • Supports the ingest of audio-only and video-only streams and stream ingest in the background.
  • Supports background music and allows you to manage background music.
  • Supports capturing snapshots from streams.
  • Supports automatic reconnection and exception handling.

Set the stream ingest parameters

To use Push SDK for iOS, you must set the stream ingest parameters and create a preview view. After that, you can start to ingest streams. You can set stream ingest parameters in the AlivcLivePushConfig class. Each parameter has a default value.

For more information about the default values and valid values, see Push SDK for iOS V4.2.1 Reference or the comments in the code.

You can change the values of parameters as needed. To modify these parameters in real time, see the parameters and methods of the AlivcLivePusher class.

  1. Set basic stream ingest configurations. Import the header file into the view controller that uses the AlivcLivePusher class: #import <AlivcLivePusher/AlivcLivePusherHeader.h>. The following code provides an example:
    AlivcLivePushConfig *config = [[AlivcLivePushConfig alloc] init]; // The configuration class used to initialize the stream ingest configurations. You can also use initWithResolution. 
    config.resolution = AlivcLivePushResolution540P; // The resolution is set to 540p by default. The maximum resolution is 720p.
    config.fps = AlivcLivePushFPS20; // We recommend that you set the frame rate to 20 frames per second (FPS).
    config.enableAutoBitrate = true; // Enable adaptive bitrate. The default value is true.
    config.videoEncodeGop = AlivcLivePushVideoEncodeGOP_2; // The default value is 2. The longer the interval between key frames, the higher the latency. We recommend that you set this value to a number from 1 to 2. 
    config.connectRetryInterval = 2000; // The reconnection interval in milliseconds. The default reconnection interval is 2 seconds. The reconnection interval cannot be shorter than 1 second. We recommend that you retain the default value. 
    config.previewMirror = false; // The default value is false. We recommend that you use the default value. 
    config.orientation = AlivcLivePushOrientationPortrait; // The default screen orientation is portrait. You can change the orientation to landscape left or landscape right. 
    Note:
    • All these parameters have default values. We recommend that you use the default values.
    • Considering the performance of most mobile phones and network bandwidth requirements, we recommend that you set the resolution to 540p. Most mainstream live streaming apps use 540p.
    • If adaptive bitrate streaming is disabled, the bitrate is fixed at the initial value and is not automatically adjusted between the maximum bitrate and the minimum bitrate. If the network is unstable, a fixed bitrate may cause stuttering. Proceed with caution when you disable this feature, as shown in the sketch after this note.
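    The following sketch shows how to disable adaptive bitrate streaming if your scenario requires a fixed bitrate. Only the enableAutoBitrate setting described above is involved:
    config.enableAutoBitrate = false; // Disable adaptive bitrate. The bitrate stays fixed at the initial value.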
  2. Specify a bitrate control mode. Push SDK for iOS provides three bitrate control modes. You can specify a mode by setting the qualityMode parameter.
    • AlivcLivePushQualityModeResolutionFirst: quality first. Push SDK for iOS sets bitrate parameters to ensure the quality of video streams first.
    • AlivcLivePushQualityModeFluencyFirst: smoothness first. Push SDK for iOS sets bitrate parameters to ensure the smoothness of video streams first.
    • AlivcLivePushQualityModeCustom: custom mode. Push SDK for iOS sets bitrate parameters based on your custom settings.
    The following code provides examples:
    • The following code provides an example on how to configure the quality-first or smoothness-first mode:
      config.qualityMode = AlivcLivePushQualityModeResolutionFirst; // The default mode is quality-first. You can change this mode to smoothness-first or custom mode. 
      If you use the quality-first or smoothness-first mode, you do not need to set the initialVideoBitrate, minVideoBitrate, and targetVideoBitrate parameters. Push SDK for iOS automatically ensures the quality or smoothness of video streams when the network is unstable.
    • The following code provides an example on how to configure the custom mode:
      config.qualityMode = AlivcLivePushQualityModeCustom; // Select the custom mode.
      config.targetVideoBitrate = 1400; // The maximum bitrate is 1,400 Kbit/s.
      config.minVideoBitrate = 600; // The minimum bitrate is 600 Kbit/s.
      config.initialVideoBitrate = 1000; // The initial bitrate is 1,000 Kbit/s.
      If you use the custom mode, you must set the initialVideoBitrate, minVideoBitrate, and targetVideoBitrate parameters.
      • initialVideoBitrate: the initial bitrate when the live streaming starts.
      • minVideoBitrate: In poor network conditions, the bitrate is gradually reduced to the minimum value to avoid stuttering.
      • targetVideoBitrate: In good network conditions, the bitrate is gradually increased to the maximum value to improve the quality of the video stream.
      We recommend that you use the following parameter values for the custom mode:

      Quality first

      Resolution | initialVideoBitrate (Kbit/s) | minVideoBitrate (Kbit/s) | targetVideoBitrate (Kbit/s)
      360p       | 600                          | 300                      | 1000
      480p       | 800                          | 300                      | 1200
      540p       | 1000                         | 600                      | 1400
      720p       | 1500                         | 600                      | 2000

      Smoothness first

      Resolution | initialVideoBitrate (Kbit/s) | minVideoBitrate (Kbit/s) | targetVideoBitrate (Kbit/s)
      360p       | 400                          | 200                      | 600
      480p       | 600                          | 300                      | 800
      540p       | 800                          | 300                      | 1000
      720p       | 1000                         | 300                      | 1200
  3. Enable adaptive resolution streaming. After adaptive resolution streaming is enabled, the resolution is automatically reduced to ensure the smoothness and quality of video streams in poor network conditions. The adaptive resolution streaming feature is not supported by all players. If you need to use this feature, we recommend that you use ApsaraVideo Player. The following code provides an example:
    config.enableAutoResolution = YES; // Enable adaptive resolution streaming. The default value is NO.
    Adaptive resolution streaming is supported only if the qualityMode parameter is set to the quality-first or smoothness-first mode.
  4. Use the retouching feature. Push SDK for iOS provides basic and advanced retouching effects. Basic retouching effects allow you to whiten and smooth the skin, and add a pink glow. Advanced retouching effects include skin whitening, smoothing, adding a pink glow, eye widening, and face resizing and slimming. These effects are based on facial recognition. The following code provides an example:
    # pragma mark - "Retouching effects and related methods"
    /**
     * @brief Specify whether to enable a retouching effect.
     * @param type A value of the QueenBeautyType parameter. The value indicates a retouching effect.
     * @param isOpen YES: enables the retouching effect. NO: disables the retouching effect.
     *
     */
    - (void)setQueenBeautyType:(kQueenBeautyType)type enable:(BOOL)isOpen;
    /**
     * @brief Set the retouching parameters.
     * @param param The retouching parameter to set. This parameter is a value of the kQueenBeautyParams type.
     * @param value Required. Valid values: 0 to 1. Values smaller than 0 are treated as 0, and values greater than 1 are treated as 1.
     */
    - (void)setQueenBeautyParams:(kQueenBeautyParams)param
    value:(float)value;
    # pragma mark - "Methods for using filters"
    /**
     * @brief Specify the filter material. Before you specify the filter material, enable the kQueenBeautyTypeLUT effect.
     * @param imagePath The path of the filter material.
     */
    - (void)setLutImagePath:(NSString *)imagePath;
    # pragma mark - "Methods for shaping"
    /**
     * @brief Specify the shaping effect. Before you specify the shaping effect, enable the kQueenBeautyTypeFaceShape effect.
     * @param faceShapeType Required. The shaping effect that you want to use. For more information about the configurations, see the QueenBeautyFaceShapeType parameter.
     * @param value Required. The property value of the shaping effect.
     */
    - (void)setFaceShape:(kQueenBeautyFaceShapeType)faceShapeType
    value:(float)value;
    # pragma mark - "Methods for makeup effects"
    /**
     * @brief Specify the makeup effect and the paths of makeup materials. Before you specify a makeup effect, enable the kQueenBeautyTypeMakeup effect.
     * @param makeupType The makeup effect.
     * @param imagePaths The paths of the makeup materials.
     * @param blend The blending mode for the makeup effect.
     */
    - (void)setMakeupWithType:(kQueenBeautyMakeupType)makeupType
    paths:(NSArray<NSString *> *)imagePaths
    blendType:(kQueenBeautyBlend)blend;
    /**
     * @brief Specify the makeup effect and the path of the makeup materials.
     * @param makeupType The makeup effect.
     * @param imagePaths The paths of the makeup materials.
     * @param blend The blending mode for the makeup effect.
     * @param fps The frame rate.
     */
    - (void)setMakeupWithType:(kQueenBeautyMakeupType)makeupType
    paths:(NSArray<NSString *> *)imagePaths
    blendType:(kQueenBeautyBlend)blend fps:(int)fps;
    /**
     * @brief Set the transparency property for a makeup effect. You can specify the gender.
     * @param makeupType The makeup effect.
     * @param isFeMale Specify whether the gender is female. A value of YES indicates female. A value of NO indicates male.
     * @param alpha The transparency for the makeup effect.
     */
    - (void)setMakeupAlphaWithType:(kQueenBeautyMakeupType)makeupType
    female:(BOOL)isFeMale alpha:(float)alpha;
    /**
     * @brief Specify the blending mode for a makeup effect.
     * @param makeupType The makeup effect.
     * @param blend The blending mode for the makeup effect.
     */
    - (void)setMakeupBlendWithType:(kQueenBeautyMakeupType)makeupType
    blendType:(kQueenBeautyBlend)blend;
    /**
     * @brief Clear all makeup effects.
     */
    - (void)resetAllMakeupType;
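    For example, the following sketch enables the filter effect and loads a LUT material by using the methods that are declared above. The sketch assumes a configured Queen engine instance named _queenEngine, which is also used in later steps; the material name is a placeholder.
    [_queenEngine setQueenBeautyType:kQueenBeautyTypeLUT enable:YES]; // Enable the filter effect first.
    NSString *lutPath = [[NSBundle mainBundle] pathForResource:@"lut_filter" ofType:@"png"]; // A placeholder material path.
    [_queenEngine setLutImagePath:lutPath]; // Apply the filter material.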
  5. Specify the image for background stream ingest. Push SDK for iOS can ingest an image when your app is switched to the background or when the bitrate is low, which improves the user experience. When your app is switched to the background, stream ingest is paused by default. In this case, only an image and audio streams can be ingested. For example, you can ingest an image that informs viewers that the streamer is temporarily away. The following code provides an example:
    config.pauseImg = [UIImage imageNamed:@"image.png"]; // Specify the image for background stream ingest.
    In addition, you can specify a static image for stream ingest in poor network conditions. After that, the specified image is ingested when the bitrate is low. This avoids stuttering. The following code provides an example:
    config.networkPoorImg = [UIImage imageNamed:@"picture.png"]; // Specify the image for stream ingest in poor network conditions.
  6. Configure watermarks. Push SDK for iOS allows you to add one or more watermarks in the PNG format. The following code provides an example:
    NSString *watermarkBundlePath = [[NSBundle mainBundle] pathForResource:
    [NSString stringWithFormat:@"watermark"] ofType:@"png"]; // Specify the path of the watermark.
    [config addWatermarkWithPath: watermarkBundlePath
          watermarkCoordX:0.1
          watermarkCoordY:0.1
          watermarkWidth:0.3]; // Add a watermark.
    Note:
    • The values of the coordX, coordY, and width parameters are relative to the streaming image. For example, a value of 0.1 for the coordX parameter places the left edge of the watermark at 10% of the width of the streaming image. If the streaming resolution is 540 × 960, the left edge of the watermark is at pixel 54 on the x-axis.
    • The height of the watermark is calculated based on the input width to preserve the aspect ratio of the source image.
    • If you want to add a text watermark, you can render the text into an image and call the addWatermark method to add the image as a watermark, as shown in the sketch after this note.
    • To ensure the clarity and smooth edges of the watermark, we recommend that you use a source image at the output size of the watermark. For example, if the resolution of the output video is 544 × 960 and the watermark width is 0.1f, the output watermark width is 544 × 0.1f = 54.4 pixels, so use a source image of approximately that width.
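    The following sketch illustrates the text-watermark approach: it renders a string into a PNG file and then adds the file as a regular watermark. The text, font, and file name are placeholders.
    NSString *text = @"Streamer name"; // The placeholder text to render.
    NSDictionary *attrs = @{NSFontAttributeName: [UIFont boldSystemFontOfSize:32],
                            NSForegroundColorAttributeName: [UIColor whiteColor]};
    CGSize textSize = [text sizeWithAttributes:attrs];
    UIGraphicsImageRenderer *renderer = [[UIGraphicsImageRenderer alloc] initWithSize:textSize];
    UIImage *textImage = [renderer imageWithActions:^(UIGraphicsImageRendererContext *ctx) {
        [text drawAtPoint:CGPointZero withAttributes:attrs]; // Draw the text into the image context.
    }];
    NSString *textWatermarkPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"text_watermark.png"];
    [UIImagePNGRepresentation(textImage) writeToFile:textWatermarkPath atomically:YES]; // Save the rendered text as a PNG file.
    [config addWatermarkWithPath:textWatermarkPath
          watermarkCoordX:0.1
          watermarkCoordY:0.1
          watermarkWidth:0.3]; // Add the rendered image as a watermark.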
  7. Specify a preview mode. The following code provides an example:
    config.previewDisplayMode = ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FIT; // Specify the preview mode.
    Note:
    • ALIVC_LIVE_PUSHER_PREVIEW_SCALE_FILL: In this mode, the video fills the entire preview view. If the aspect ratio of the video does not match the view, the preview image is deformed.
    • ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FIT: In this mode, the original aspect ratio of the video is retained during the preview. If the aspect ratio of the video does not match the view, black edges appear on the preview view. This is the default mode.
    • ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FILL: In this mode, the aspect ratio of the video is changed to fit the preview view. If the aspect ratio of the video does not match the view, the video is cropped to fit the preview view.
    You can specify these modes in AlivcLivePushConfig. You can also call the setPreviewDisplayMode method to change the preview mode during preview and stream ingest, as shown in the sketch after this step.

    This configuration takes effect only for preview. The actual resolution of the output video follows the default configurations in AlivcLivePushConfig. The preview mode does not affect the actual resolution. You can select a preview mode to adapt to different screen sizes of mobile phones.
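    For example, the following sketch switches the preview mode at run time by using the setPreviewDisplayMode method. The enum value is one of the three modes that are listed above:
    [self.livePusher setPreviewDisplayMode:ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FILL]; // Change the preview mode during preview or stream ingest.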

Use AlivcLivePusher

AlivcLivePusher is the core class of Push SDK for iOS. This class provides parameters for video preview, stream ingest callback, and stream ingest control. You can also use this class to modify parameters during stream ingest. This section describes how to use the key methods for stream ingest.

  1. Initialize Push SDK for iOS. Call the initWithConfig method to initialize the configured stream ingest parameters. The following code provides an example:
    self.livePusher = [[AlivcLivePusher alloc] initWithConfig:config];
    AlivcLivePusher does not support multiple instances. Therefore, Push SDK for iOS provides a destory method for each init method.
  2. Register stream ingest callbacks. Stream ingest callbacks are grouped into Info, Error, and Network callbacks.
    • Info: the callbacks that are used for notifications and status detection.
    • Error: the callbacks that are used when errors occur.
    • Network: the callbacks that are used to manage network services.
    You can use the delegate method to receive the specified callbacks.
    [self.livePusher setInfoDelegate:self];
    [self.livePusher setErrorDelegate:self];
    [self.livePusher setNetworkDelegate:self];
  3. Start the preview. You can start the preview after you initialize the livePusher object. To start the preview, create a view instance of the UIView class or of a class that inherits from UIView. The following code provides an example:
    [self.livePusher startPreview:self.view];
  4. Start to ingest streams. You can start stream ingest only after the preview succeeds. Therefore, you must implement the onPreviewStarted callback of AlivcLivePusherInfoDelegate and add the following code to the callback (a fuller sketch appears after the note below):
    [self.livePusher startPushWithURL:@"Ingest URL for test (rtmp://......)"];
    Note:
    • Push SDK for iOS provides the asynchronous method startPushWithURLAsync to start stream ingest.
    • Push SDK for iOS supports the URLs of the streams that are ingested over RTMP. For more information about how to obtain ingest URLs, see Ingest and streaming URLs.
    • Start stream ingest with a valid URL. Then, use a player, such as ApsaraVideo Player, FFplay, or VLC, to test stream pulling. For more information about how to obtain source URLs, see Ingest and streaming URLs.
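    The following is a minimal sketch of such a delegate implementation. The delegate method signature shown here is an assumption; check the AlivcLivePusherInfoDelegate header of your SDK version. The ingest URL is a placeholder.
    - (void)onPreviewStarted:(AlivcLivePusher *)pusher {
        // The preview succeeded, so it is now safe to start stream ingest.
        [self.livePusher startPushWithURL:@"rtmp://your-ingest-url"];
    }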
  5. Set other stream ingest configurations. Push SDK for iOS allows you to control stream ingest. For example, you can start, stop, restart, pause, and resume stream ingest, stop preview, and destroy stream ingest objects. You can add buttons as needed to perform these operations. The following code provides an example:
    /* You can pause the ongoing stream ingest. If you pause the ongoing stream ingest, the system pauses the video preview and the video stream ingest at the last frame, and continues the ingest of audio-only streams. */
    [self.livePusher pause];
    /* You can resume the paused stream ingest. Then, the system resumes the audio and video preview and the stream ingest. */
    [self.livePusher resume];
    /* You can stop the ongoing stream ingest. */
    [self.livePusher stopPush];
    /* You can stop the ongoing preview. However, this operation does not take effect for the ongoing stream ingest. When the preview is stopped, the preview view is frozen at the last frame. */
    [self.livePusher stopPreview];
    /* You can restart stream ingest when streaming is in progress or when an error callback is received. After an error occurs, you can restart stream ingest only by using this method or reconnectPushAsync. Alternatively, you can destroy the stream ingest object by using the destory method and then recreate all AlivcLivePusher resources that are required for operations such as preview and stream ingest. */
    [self.livePusher restartPush];
    /* You can call this method when streaming is in progress or when an error callback is received in AlivcLivePusherNetworkDelegate. After an error occurs, you can restart stream ingest only by using this method or restartPush. Alternatively, you can destroy the stream ingest object by using the destory method and then recreate it. This method reconnects the stream over RTMP. */
    [self.livePusher reconnectPushAsync];
    /* After the stream ingest object is destroyed, the stream ingest and the preview are stopped, and preview views are deleted. All resources related to AlivcLivePusher are destroyed. */
    [self.livePusher destory];
    self.livePusher = nil;
    /* Obtain the status of stream ingest. */
    AlivcLivePushStatus status = [self.livePusher getLiveStatus];
  6. Adjust retouching effects in real time. Push SDK for iOS allows you to adjust retouching effects in real time during stream ingest. You can enable the retouching feature and set retouching parameters as needed. The following code provides an example:
    [_queenEngine setQueenBeautyType:kQueenBeautyTypeSkinBuffing enable:YES];
    [_queenEngine setQueenBeautyType:kQueenBeautyTypeSkinWhiting enable:YES];
    [_queenEngine setQueenBeautyParams:kQueenBeautyParamsWhitening value:0.8f];
    [_queenEngine setQueenBeautyParams:kQueenBeautyParamsSharpen value:0.6f];
    [_queenEngine setQueenBeautyParams:kQueenBeautyParamsSkinBuffing value:0.6];
  7. Manage background music. Push SDK for iOS allows you to manage background music. For example, you can configure the playback of background music, and the audio mixing, noise reduction, in-ear monitoring, and muting features. You can call background music methods only after the preview starts. The following code provides an example:
    /* Start the playback of background music. */
    [self.livePusher startBGMWithMusicPathAsync:musicPath];
    /* Stop the playback of background music. To change the background music, call the method that is used to start the playback of background music. You do not need to stop the playback of the current background music. */
    [self.livePusher stopBGMAsync];
    /* Pause the playback of the background music. You can call this method only after the background music starts. */
    [self.livePusher pauseBGM];
    /* Resume the playback of background music. You can call this method only after the background music is paused. */
    [self.livePusher resumeBGM];
    /* Enable looping.*/
    [self.livePusher setBGMLoop:true];
    /* Configure noise reduction. When noise reduction is enabled, the system filters out non-vocal parts from collected audio. This feature may slightly reduce the volume of the human voice. Therefore, we recommend that you allow your users to determine whether to enable this feature. This feature is disabled by default.*/
    [self.livePusher setAudioDenoise:true];
    /* Configure in-ear monitoring. In-ear monitoring applies to the KTV scenario. When in-ear monitoring is enabled, you can hear your own voice on your earphones during streaming. When in-ear monitoring is disabled, you cannot. This feature does not take effect if no earphones are detected. */
    [self.livePusher setBGMEarsBack:true];
    /* Configure audio mixing by adjusting the volumes of the background music and the human voice. */
    [self.livePusher setBGMVolume:50]; // Set the volume of the background music.
    [self.livePusher setCaptureVolume:50]; // Set the volume of the human voice.
    /* Configure muting. If you enable this feature, the background music and the human voice are muted. To separately mute background music or the human voice, call the method that is used to configure audio mixing. */
    [self.livePusher setMute:isMute];
  8. Perform camera-related operations. You can perform camera-related operations only after the preview starts and while stream ingest is in the streaming, paused, or reconnecting state. For example, you can switch between cameras and configure the flash, focal length, zooming, and mirroring mode. If the preview has not started, the following methods do not take effect. The following code provides an example:
    /* Switch between the front and the rear cameras.*/
    [self.livePusher switchCamera];
    /* Enable or disable the flash. You cannot enable the flash for the front camera.*/
    [self.livePusher setFlash:false]; 
    /* Set the focal length to zoom in and out of images. If you set the input parameter to a positive number, the system increases the focal length. If you set the input parameter to a negative number, the system decreases the focal length. */
    CGFloat max = [self.livePusher getMaxZoom];
    [self.livePusher setZoom:MIN(1.0, max)]; 
    /* Configure manual focus. To configure manual focus, you must set two parameters: point and autoFocus. The point parameter specifies the coordinates of the focus point. The autoFocus parameter specifies whether to enable autofocus and takes effect only for this call. Afterward, the setting that is specified by setAutoFocus applies. */
    [self.livePusher focusCameraAtAdjustedPoint:CGPointMake(50, 50) autoFocus:true];
    /* Configure autofocus.*/
    [self.livePusher setAutoFocus:false];
    /* Configure the mirroring mode. Mirroring-related methods are PushMirror and PreviewMirror. The PushMirror method configures the mirroring mode for the ingested stream, which affects only what viewers see. The PreviewMirror method configures the mirroring mode for the local preview, which affects only the preview view. */
    [self.livePusher setPushMirror:false];
    [self.livePusher setPreviewMirror:false];
  9. Implement the live question and answer (Q&A) feature. To implement the Q&A feature, you must insert supplemental enhancement information (SEI) into live streams and parse SEI by using the player. Push SDK for iOS provides a method to insert SEI. This method can be called only during stream ingest. The following code provides an example:
    /*
    msg: The SEI messages to be inserted into the live stream. The SEI messages are in the JSON format. The ApsaraVideo Player SDK can receive and parse SEI messages. 
    repeatCount: the number of frames into which the SEI messages are inserted. To prevent SEI messages from being dropped together with frames, set a number of repetitions. A value of 100 indicates that the SEI messages are inserted into the subsequent 100 frames. ApsaraVideo Player deduplicates identical SEI messages. 
    delayTime: the period of time to wait before the frames are sent. Unit: milliseconds. 
    KeyFrameOnly: specifies whether to insert the SEI messages only into keyframes. 
    */
    [self.livePusher sendMessage:@"Information about questions" repeatCount:100 delayTime:0 KeyFrameOnly:false];
  10. Ingest external audio and video sources. Push SDK for iOS allows you to import external audio and video sources for stream ingest. For example, you can ingest an audio or video file.
    • Configure the input of external audio and video sources in stream ingest settings. The following code provides an example:
      config.externMainStream = true; // Enable the input of external streams.
      config.externVideoFormat = AlivcLivePushVideoFormatYUVNV21; // Specify the color format for video data. In this example, the color format is YUVNV21. You can also use other formats as needed. 
      config.externAudioFormat = AlivcLivePushAudioFormatS16; // Specify the bit depth format for audio data. In this example, the bit depth format is S16. You can also use other formats as needed.
    • The following code provides an example on how to import external video data:
      /* The sendVideoData method supports only native YUV and RGB buffer data. You can use the sendVideoData method to send the buffer, length, width, height, timestamp, and rotation angle of video data.*/
      [self.livePusher sendVideoData:yuvData width:720 height:1280 size:dataSize pts:nowTime rotation:0];
      /* For CMSampleBufferRef video data, you can call the sendVideoSampleBuffer method.*/
      [self.livePusher sendVideoSampleBuffer:sampleBuffer];
      /* You can also transform CMSampleBufferRef video data to continuous buffer data before you call the sendVideoData method. The following code provides an example.*/
      // Query the length of the sample buffer.
      - (int)getVideoSampleBufferSize:(CMSampleBufferRef)sampleBuffer {
          if (!sampleBuffer) {
              return 0;
          }
          int size = 0;
          CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
          CVPixelBufferLockBaseAddress(pixelBuffer, 0);
          if (CVPixelBufferIsPlanar(pixelBuffer)) {
              int count = (int)CVPixelBufferGetPlaneCount(pixelBuffer);
              for (int i = 0; i < count; i++) {
                  int height = (int)CVPixelBufferGetHeightOfPlane(pixelBuffer, i);
                  int stride = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, i);
                  size += stride * height;
              }
          } else {
              int height = (int)CVPixelBufferGetHeight(pixelBuffer);
              int stride = (int)CVPixelBufferGetBytesPerRow(pixelBuffer);
              size += stride * height;
          }
          CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
          return size;
      }
      // Transform the video sample buffer into a contiguous native buffer.
      - (int)convertVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer toNativeBuffer:(void *)nativeBuffer {
          if (!sampleBuffer || !nativeBuffer) {
              return -1;
          }
          CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
          CVPixelBufferLockBaseAddress(pixelBuffer, 0);
          int size = 0;
          if (CVPixelBufferIsPlanar(pixelBuffer)) {
              int count = (int)CVPixelBufferGetPlaneCount(pixelBuffer);
              for (int i = 0; i < count; i++) {
                  int height = (int)CVPixelBufferGetHeightOfPlane(pixelBuffer, i);
                  int stride = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, i);
                  void *buffer = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, i);
                  int8_t *dstPos = (int8_t *)nativeBuffer + size;
                  memcpy(dstPos, buffer, stride * height);
                  size += stride * height;
              }
          } else {
              int height = (int)CVPixelBufferGetHeight(pixelBuffer);
              int stride = (int)CVPixelBufferGetBytesPerRow(pixelBuffer);
              void *buffer = CVPixelBufferGetBaseAddress(pixelBuffer);
              size += stride * height;
              memcpy(nativeBuffer, buffer, size);
          }
          CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
          return 0;
      }
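      The following sketch ties the two helpers together before the sendVideoData call. The pts unit (milliseconds) and the buffer cast are assumptions; verify them against the sendVideoData signature in your SDK version.
      int size = [self getVideoSampleBufferSize:sampleBuffer];
      if (size > 0) {
          void *nativeBuffer = malloc(size);
          if ([self convertVideoSampleBuffer:sampleBuffer toNativeBuffer:nativeBuffer] == 0) {
              CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
              int width = (int)CVPixelBufferGetWidth(pixelBuffer);
              int height = (int)CVPixelBufferGetHeight(pixelBuffer);
              CMTime time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
              uint64_t pts = (uint64_t)(CMTimeGetSeconds(time) * 1000); // Convert the timestamp to milliseconds.
              [self.livePusher sendVideoData:(char *)nativeBuffer width:width height:height size:size pts:pts rotation:0];
          }
          free(nativeBuffer);
      }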
    • The following code provides an example on how to import external audio data:
      /* The sendPCMData method supports only native pulse-code modulation (PCM) buffer data. You can use sendPCMData to transmit the buffer, length, and timestamp of audio data.*/
      [self.livePusher sendPCMData:pcmData size:size pts:nowTime];
  11. Add animated stickers. Push SDK for iOS allows you to add animated stickers as watermarks to live streams.
    • To make an animated sticker, you can modify the materials in the demo. Make a sequence frame image for the animated sticker, open the config.json file, and set the following parameters:
      "du": 2.04, // The duration for playing the animated sticker once.
      "n": "qizi", // The name of the animated sticker. Make sure that the name of the folder for making the animated sticker is the same as the name of the sticker. The name of the sticker contains the name followed by the sequence number, such as qizi0.
      "c": 68.0, // The number of animation frames, which is the number of images included in an animated sticker.
      "kerneframe": 51, // The keyframe. Specify an image as the keyframe. For example, the 51st frame is specified as the keyframe in the demo. Make sure that the specified frame exists.
      "frameArry": [
          {"time":0,"pic":0},
          {"time":0.03,"pic":1},
          {"time":0.06,"pic":2},
          ],
      // The parameters of the animation frames. The preceding settings indicate that the first frame (qizi0) is displayed 0 seconds after the start, and the second frame (qizi1) is displayed 0.03 seconds after the start. Specify all frames in the animation in the same way.
      Set other fields as described by the .json file in the demo.
    • The following code provides an example on how to add an animated sticker:
      /**
      * Add an animated sticker.
      * @param path The path of the animated sticker. The path must contain config.json.
      * @param x The starting position on the x-axis. Valid values: 0 to 1.0f.
      * @param y The starting position on the y-axis. Valid values: 0 to 1.0f.
      * @param w The screen width of the animated sticker. Valid values: 0 to 1.0f.
      * @param h The screen height of the animated sticker. Valid values: 0 to 1.0f.
      * @return id The ID of the sticker. You must specify the sticker ID if you want to remove a sticker.
      */
      int stickerId = [self.livePusher addDynamicWaterMarkImageDataWithPath:@"Sticker Path" x:0.2f y:0.2f w:0.2f h:0.2f]; // Save the returned sticker ID so that you can remove the sticker later.
    • The following code provides an example on how to remove an animated sticker:
      [self.livePusher removeDynamicWaterMark:stickerId];
  12. Configure debugging tools.
    Note Do not call the method that invokes DebugView in release versions.
    DebugView is a UI debugging tool that allows you to diagnose issues. DebugView provides a draggable and always-on-top window for debugging. For example, you can query the logs of stream ingest, monitor the metrics for stream ingest performance in real time, and generate line charts of main performance metrics. The following code provides an example:
    [AlivcLivePusher showDebugView]; // Show DebugView.
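    To dismiss the window, you can call the matching hide method. The method name below is an assumption; verify it in the header of your SDK version.
    [AlivcLivePusher hideDebugView]; // Hide DebugView.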
  13. Call other methods.
    /* In custom mode, you can adjust the target bitrate and the minimum bitrate in real time. */
    [self.livePusher setTargetVideoBitrate:800];
    [self.livePusher setMinVideoBitrate:200];
    /* Obtain the status of stream ingest.*/
    BOOL isPushing = [self.livePusher isPushing]; 
    /* Obtain the ingest URL.*/
    NSString *pushURLString = [self.livePusher getPushURL];
    /* Obtain the debugging information about stream ingest performance. For more information about the parameters of stream ingest performance, see API references or the comments in the code. */
    AlivcLivePushStatsInfo *info = [self.livePusher getLivePushStatusInfo];
    /* Obtain the version number of Push SDK for iOS. */
    NSString *sdkVersion = [self.livePusher getSDKVersion];
    /* Set a log level to filter debugging information.*/
    [self.livePusher setLogLevel:(AlivcLivePushLogLevelDebug)];

Use ReplayKit for live stream recording

ReplayKit is a framework introduced in iOS 9 that allows you to record screen content. In iOS 10 and later, third-party app extensions can broadcast the screen. Push SDK for iOS can work with app extensions to support live stream recording in iOS 10 and later.
  • Create a live streaming extension

    You can use ReplayKit to create app extensions for live stream recording. An app extension cannot run as a standalone app and can be submitted only inside a containing app. At runtime, however, the app extension communicates directly with the host app through requests and responses, without support from the containing app. During the communication, the host app sends a request to launch the app extension.

    The Alibaba Cloud Live Streaming demo has provided the AlivcLiveBroadcast and AlivcLiveBroadcastSetupUI app extensions to record and broadcast screen content. To create a live streaming extension in an app, perform the following steps:
    1. In the current project, choose New > Target, and then select Broadcast Upload Extension.
    2. Modify the Product Name parameter, select Include UI Extension, and then click Finish. A live streaming extension and its UI are created.
    3. Configure the Info.plist file of the extension.
    4. Launch the app to install both the live streaming extension and the app on your phone. Then, launch an app that supports ReplayKit, such as TowerDash, and tap the live streaming button. The icon of the extension appears in the menu.
  • Integrate Push SDK for iOS
    1. Add the AlivcLivePusher.framework and AlivcLibRtmp.framework dependencies to the stream ingest extension.
    2. Configure the UI extension parameters, including the ingest URL, resolution, and screen orientation.
    3. Use AlivcLivePusher to configure live streaming settings. The SampleHandler class provides all the methods for live stream recording with ReplayKit. You can call the corresponding AlivcLivePusher methods in Push SDK for iOS to use the relevant features.
      • Start stream ingest
        You can call the broadcastStartedWithSetupInfo method to set the stream ingest parameters and start to ingest streams. The following code provides an example:
        - (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *,NSObject *> *)setupInfo {
            self.pushConfig = [[AlivcLivePushConfig alloc] init];
            self.pushConfig.externMainStream = true; // Screen data is fed in as an external stream.
            self.livePusher = [[AlivcLivePusher alloc] initWithConfig:self.pushConfig];
            [self.livePusher startPushWithURL:pushUrl];
        }
      • Pause stream ingest
        You can call the broadcastPaused method to pause stream ingest. The following code provides an example:
        - (void)broadcastPaused {
        [self.livePusher pause];
        }
      • Resume stream ingest
        You can call the broadcastResumed method to resume stream ingest. The following code provides an example:
        - (void)broadcastResumed {
        [self.livePusher resume];
        }
      • Stop stream ingest
        You can call the broadcastFinished method to stop stream ingest. The following code provides an example:
        - (void)broadcastFinished {
        [self.livePusher stopPush];
        [self.livePusher destory];
        self.livePusher = nil;
        }
      • Send audio and video data
        You can call the processSampleBuffer method to send the audio and video data that ReplayKit captures. The following code provides an example:
        - (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
        switch (sampleBufferType) {
         case RPSampleBufferTypeVideo:
             // Handle video sample buffer
             [self.livePusher processVideoSampleBuffer:sampleBuffer];
             break;
         case RPSampleBufferTypeAudioApp:
             // Handle audio sample buffer for app audio
              [self.livePusher processAudioSampleBuffer:sampleBuffer withType:sampleBufferType];
             break;
         case RPSampleBufferTypeAudioMic:
             // Handle audio sample buffer for mic audio
              [self.livePusher processAudioSampleBuffer:sampleBuffer withType:sampleBufferType];
             break;
         default:
             break;
         }
        }

Usage notes

  • Limits
    • You must configure screen rotation before stream ingest. You cannot rotate the screen during stream ingest.
    • In hardware encoding mode, the output resolution must be a multiple of 16 to ensure encoder compatibility. For example, if you set the resolution to 540p, the actual output resolution is 544 × 960. You must scale the player view based on the actual output resolution to prevent black edges.
  • Version upgrade instruction

    Delete the existing version of Push SDK for iOS before you upgrade the SDK to the latest version. For more information about how to upgrade Push SDK for iOS, see Update Push SDK for iOS from V4.0.2 to V4.1.0 or later.