This topic describes how to use AliLive SDK for iOS and the classes and methods in the SDK. This topic also provides examples on how to use the features provided by AliLive SDK for iOS.

Note For more information about how to ingest streams on mobile devices, see Stream ingest, stream pulling, and streaming.

Features

  • Supports stream ingest over Real-Time Messaging Protocol (RTMP).
  • Adopts H.264 for video encoding and Advanced Audio Coding (AAC) for audio encoding.
  • Supports custom configurations for features such as bitrate control, resolution, and display mode.
  • Supports various camera operations.
  • Supports real-time retouching and allows you to adjust retouching effects.
  • Allows you to add animated stickers as animated watermarks and remove animated stickers.
  • Supports live stream recording.
  • Supports external audio and video input in different formats such as YUV and pulse-code modulation (PCM).
  • Supports multi-channel mixed streams.
  • Supports ingest of audio-only and video-only streams and stream ingest in the background.
  • Supports background music and allows you to manage background music.
  • Supports the capture of snapshots from streams.
  • Supports automatic reconnection and error handling.
  • Supports the audio 3A algorithm.

Limits

Take note of the following limits before you use AliLive SDK for iOS:
  • You must configure screen orientation before stream ingest. You cannot rotate the screen during a live stream.
  • You must disable auto screen rotation for stream ingest in landscape mode.
  • In hardware encoding mode, each output dimension must be an integer multiple of 16 to be compatible with the encoder. For example, if you set the resolution to 540p, the width is rounded up from 540 to 544 and the output resolution is 544 × 960. You must scale the display size of the player based on the output resolution to prevent black bars. A sketch of the rounding logic follows this list.
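
    The following is a minimal sketch, not an SDK method, of the round-up-to-a-multiple-of-16 rule described above:

    // Hypothetical helper: round a dimension up to the nearest multiple of 16,
    // mirroring the hardware encoder constraint described above.
    static int AlignDimensionTo16(int dimension) {
        return ((dimension + 15) / 16) * 16;
    }
    // AlignDimensionTo16(540) returns 544, and AlignDimensionTo16(960) returns 960,
    // so a 540 × 960 ingest resolution produces a 544 × 960 output.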

Procedure

The following steps describe how to use AliLive SDK for iOS.

  1. Set stream ingest parameters. Complete stream ingest configurations such as the basic parameters, bitrate control mode, adaptive resolution, and the retouching feature. For more information, see Set stream ingest parameters.
  2. Use AliLive SDK for iOS to ingest streams. After you initialize AliLive SDK for iOS, register stream ingest callbacks, and create a preview view, you can start to ingest streams. You can manage streams, configure background music, perform camera operations, enable live quiz, ingest external audio sources, and add animated stickers based on your business requirements. For more information, see Use AliLive SDK for iOS to ingest streams.
  3. Optional. Configure stream ingest for screen recordings. If you want to ingest screen recordings, configure stream ingest for screen recordings. For more information, see Configure stream ingest for screen recordings.

Set stream ingest parameters

You can set stream ingest parameters by using the AlivcLivePushConfig class. Each parameter has a default value. For more information about the default value and valid values of each parameter, see Aliyun Live Pusher API Reference Manual for iOS Platforms V4.4.1 or the comments in the SDK header files.

Note To modify these parameters in real time during stream ingest, refer to the parameters and methods provided by the AlivcLivePusher class.
  1. Complete basic stream ingest configurations.

    Import the header file in the view controller that requires AlivcLivePusher: #import <AlivcLivePusher/AlivcLivePusher.h>. The following sample code provides an example:

    AlivcLivePushConfig *config = [[AlivcLivePushConfig alloc] init]; // The configuration class that is used to initialize stream ingest configurations. You can also use the initWithResolution method to initialize stream ingest configurations. 
    config.resolution = AlivcLivePushResolution540P; // By default, the resolution is set to 540p. The maximum resolution is 720p.
    config.fps = AlivcLivePushFPS20; // We recommend that you set the frame rate to 20 frames per second (FPS).
    config.enableAutoBitrate = true; // Enable adaptive bitrate. The default value is true.
    config.videoEncodeGop = AlivcLivePushVideoEncodeGOP_2; // The default value is 2. The longer the interval between key frames, the higher the latency. We recommend that you set this parameter to 1 or 2. 
    config.connectRetryInterval = 2000; // The reconnection interval in milliseconds. The default reconnection interval is 2 seconds. The reconnection interval cannot be shorter than 1 second. We recommend that you use the default value. 
    config.previewMirror = false; // The default value is false. We recommend that you use the default value. 
    config.orientation = AlivcLivePushOrientationPortrait; // The default screen orientation is portrait. You can change the orientation to landscape left or landscape right. 
    Note
    • We recommend that you set the resolution to 540p based on the performance of mobile phones and network bandwidth requirements. In most cases, mainstream apps for live streaming use 540p.
    • All parameters for basic stream ingest configurations have default values. We recommend that you use the default values.
    • If adaptive bitrate is disabled, the bitrate is fixed at the initial value and is not automatically adjusted between the target bitrate and the minimum bitrate. In this case, stuttering may occur when the network is unstable. Proceed with caution when you disable adaptive bitrate.
  2. Specify a bitrate control mode.

    AliLive SDK for iOS provides the following bitrate control modes. Specify the value of each parameter based on your business requirements.

    • AlivcLivePushQualityModeResolutionFirst: quality-first mode. AliLive SDK for iOS sets bitrate parameters to prioritize the quality of video streams. Sample code:
      config.qualityMode = AlivcLivePushQualityModeResolutionFirst; // The default mode is quality-first. You can change the mode to smoothness-first or custom.
    • AlivcLivePushQualityModeFluencyFirst: smoothness-first mode. AliLive SDK for iOS sets bitrate parameters to prioritize the smoothness of video streams. Sample code:
      config.qualityMode = AlivcLivePushQualityModeFluencyFirst; // Use the smoothness-first mode. You can change the mode to quality-first or custom.
    • AlivcLivePushQualityModeCustom: custom mode. AliLive SDK for iOS sets bitrate parameters based on your custom settings. If you use the custom mode, you must specify the initial, minimum, and maximum bitrates:
      • initialVideoBitrate: the initial bitrate when a live stream starts.
      • minVideoBitrate: the minimum bitrate. In poor network conditions, the bitrate is gradually reduced to this value to prevent stuttering.
      • targetVideoBitrate: the maximum bitrate. In good network conditions, the bitrate is gradually increased to this value to improve the quality of the video stream.
      Sample code:
      config.qualityMode = AlivcLivePushQualityModeCustom; // Use the custom mode.
      config.targetVideoBitrate = 1400; // The maximum bitrate is 1,400 Kbit/s.
      config.minVideoBitrate = 600; // The minimum bitrate is 600 Kbit/s.
      config.initialVideoBitrate = 1000; // The initial bitrate is 1,000 Kbit/s.
    Note
    • If you use the quality-first or smoothness-first mode, you do not need to configure the initialVideoBitrate, minVideoBitrate, and targetVideoBitrate parameters. AliLive SDK for iOS automatically ensures the quality or smoothness of video streams when the network is unstable.
    • If you use the custom mode, configure the bitrate parameters based on the recommended settings in the following tables. A sketch that applies one of the recommended rows follows the tables.
    Table 1. Recommended settings for custom bitrate (quality first)
    Resolution initialVideoBitrate minVideoBitrate targetVideoBitrate
    360P 600 300 1000
    480P 800 300 1200
    540P 1000 600 1400
    720P 1500 600 2000
    1080P 1800 1200 2500
    Table 2. Recommended settings for custom bitrate (smoothness first)
    Resolution initialVideoBitrate minVideoBitrate targetVideoBitrate
    360P 400 200 600
    480P 600 300 800
    540P 800 300 1000
    720P 1000 300 1200
    1080P 1500 1200 2200
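
    For example, a minimal sketch that applies the 720P row of Table 1 (this assumes an AlivcLivePushResolution720P constant analogous to the 540P constant used above):

    config.qualityMode = AlivcLivePushQualityModeCustom; // Use the custom mode.
    config.resolution = AlivcLivePushResolution720P; // 720p output.
    config.initialVideoBitrate = 1500; // Initial bitrate from Table 1 for 720P.
    config.minVideoBitrate = 600; // Minimum bitrate from Table 1 for 720P.
    config.targetVideoBitrate = 2000; // Maximum bitrate from Table 1 for 720P.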
  3. Configure adaptive resolution.

    If adaptive resolution is enabled, the resolution is automatically reduced to ensure the smoothness and quality of video streams in poor network conditions. The following sample code provides an example on how to enable adaptive resolution:

    config.enableAutoResolution = YES; // Enable adaptive resolution. The default value is NO.
    Notice
    • The adaptive resolution feature is not supported by all players. If you need to use this feature, we recommend that you use ApsaraVideo Player.
    • The adaptive resolution feature takes effect only when you use the quality-first or smoothness-first mode by configuring the AlivcQualityModeEnum parameter. This feature is unavailable if you use the custom mode.
  4. Configure the retouching feature.

    AliLive SDK for iOS provides basic and advanced retouching effects. Basic retouching effects include skin whitening, skin smoothing, and rosy cheeks. Advanced retouching effects, which are based on facial recognition, include skin whitening, skin smoothing, rosy cheeks, big eyes, face resizing, and face slimming. For more information, see Overview. The following methods are available for the retouching feature; a usage sketch follows the listing:

    #pragma mark - Retouching effects and related methods
    /**
     * @brief Enable or disable a retouching effect.
     * @param type Specify a value for the QueenBeautyType parameter.
     * @param isOpen YES: enables the retouching effect. NO: disables the retouching effect.
     *
     */
    - (void)setQueenBeautyType:(kQueenBeautyType)type enable:(BOOL)isOpen;
    /**
     * @brief Set retouching parameters.
     * @param param Specify a retouching parameter. This parameter is a value of the QueenBeautyParams parameter.
     * @param value Specify a value for the retouching parameter. Valid values: 0 to 1. If the original value is smaller than 0, set the value to 0. If the original value is greater than 1, set the value to 1.
     */
    - (void)setQueenBeautyParams:(kQueenBeautyParams)param
    value:(float)value;
    #pragma mark - Methods for using filters
    /**
     * @brief Specify a filter material. Before you specify the filter material, set the kQueenBeautyTypeLUT parameter.
     * @param imagePath Specify the path of the filter material.
     */
    - (void)setLutImagePath:(NSString *)imagePath;
    #pragma mark - Methods for face shaping
    /**
     * @brief Specify a face shaping effect. Before you specify the face shaping effect, set the kQueenBeautyTypeFaceShape parameter.
     * @param faceShapeType Specify the face shaping effect that you want to use. This parameter is similar to the QueenBeautyFaceShapeType parameter.
     * @param value Specify a value for the faceShapeType parameter.
     */
    - (void)setFaceShape:(kQueenBeautyFaceShapeType)faceShapeType
    value:(float)value;
    #pragma mark - Methods for makeup
    /**
     * @brief Specify a makeup type and the paths of makeup materials. Before you specify the makeup type, set the kQueenBeautyTypeMakeup parameter.
     * @param makeupType Specify a makeup type.
     * @param imagePaths Specify the paths of makeup materials.
     * @param blend Specify mixed makeup.
     */
    - (void)setMakeupWithType:(kQueenBeautyMakeupType)makeupType
    paths:(NSArray<NSString *> *)imagePaths
    blendType:(kQueenBeautyBlend)blend;
    /**
     * @brief Specify a makeup type and the paths of makeup materials.
     * @param makeupType Specify a makeup type.
     * @param imagePaths Specify the paths of makeup materials.
     * @param blend Specify mixed makeup.
     * @param fps Specify the frame rate.
     */
    - (void)setMakeupWithType:(kQueenBeautyMakeupType)makeupType
    paths:(NSArray<NSString *> *)imagePaths
    blendType:(kQueenBeautyBlend)blend fps:(int)fps;
    /**
     * @brief Configure the transparency for a makeup type. You can specify the gender.
     * @param makeupType Specify a makeup type.
     * @param isFeMale Specify whether the gender is female. A value of YES indicates female. A value of NO indicates male.
     * @param alpha Specify the transparency for the makeup.
     */
    - (void)setMakeupAlphaWithType:(kQueenBeautyMakeupType)makeupType
    female:(BOOL)isFeMale alpha:(float)alpha;
    /**
     * @brief Specify the type of mixed makeup.
     * @param makeupType Specify a makeup type.
     * @param blend Specify mixed makeup.
     */
    - (void)setMakeupBlendWithType:(kQueenBeautyMakeupType)makeupType
    blendType:(kQueenBeautyBlend)blend;
    /**
     * @brief Clear all makeup effects.
     */
    - (void)resetAllMakeupType;
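
    A minimal usage sketch based on the methods above (the _queenEngine instance is the Queen SDK engine that also appears in the later retouching step, and the filter path is a placeholder):

    // Enable the LUT filter type, then apply a filter material.
    [_queenEngine setQueenBeautyType:kQueenBeautyTypeLUT enable:YES];
    [_queenEngine setLutImagePath:lutImagePath]; // Placeholder path of a LUT image bundled with your app.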
  5. Specify an image for background stream ingest.

    AliLive SDK for iOS allows you to ingest images when your app is switched to the background or the bitrate is low. This improves user experience. When your app is switched to the background, video stream ingest is paused. In this case, you can ingest only images and audio streams. For example, you can ingest an image in which a message is displayed to notify the audience that the streamer is away. The following sample code provides an example on how to configure image ingest when your app is switched to the background:

    config.pauseImg = [UIImage imageNamed:@"image.png"]; // Specify the image for stream ingest when your app is switched to the background.

    You can specify a static image for stream ingest in poor network conditions. If the bitrate is low, the image that you specify is ingested to prevent stuttering. The following sample code provides an example on how to configure image ingest in poor network conditions:

    config.networkPoorImg = [UIImage imageNamed:@"image.png"]; // Specify the image for stream ingest in poor network conditions.
  6. Configure watermarks.
    AliLive SDK for iOS allows you to add one or more watermarks. The source image of a watermark must be in the PNG format. The following sample code provides an example on how to add watermarks:
    NSString *watermarkBundlePath = [[NSBundle mainBundle] pathForResource:@"watermark" ofType:@"png"]; // Specify the image to be added as a watermark.
    [config addWatermarkWithPath:watermarkBundlePath
          watermarkCoordX:0.1
          watermarkCoordY:0.1
          watermarkWidth:0.3]; // Add the watermark.
    Note
    • The values of the watermarkCoordX, watermarkCoordY, and watermarkWidth parameters are relative to the stream image. For example, a watermarkCoordX value of 0.1 means that the left edge of the watermark is displayed at the 10% position on the x-axis of the stream image. If the stream resolution is 540 × 960, the left edge of the watermark is at pixel 54 (540 × 0.1) on the x-axis.
    • The height of the watermark is scaled based on the width and height of the source image and the input width of the watermark.
    • If you want to add a text watermark, you can convert the text into an image and call the addWatermarkWithPath method to add the image as a watermark.
    • To ensure the clarity and smoothness of the edges of the watermark, we recommend that you use a source image in the watermark output size. For example, if the resolution of the output video is 544 × 960 and the width of the watermark is 0.1f, we recommend that you use a source image whose width is approximately 544 × 0.1f = 54.4 pixels. A conversion sketch follows this list.
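
    The following is a minimal sketch, not an SDK method, that converts the relative watermark coordinates into pixel values for a given output resolution:

    // Hypothetical helper: convert relative watermark coordinates to pixels.
    static CGPoint WatermarkOriginInPixels(CGFloat coordX, CGFloat coordY, CGSize outputResolution) {
        return CGPointMake(coordX * outputResolution.width,
                           coordY * outputResolution.height);
    }
    // WatermarkOriginInPixels(0.1, 0.1, CGSizeMake(540, 960))
    // returns (54, 96) for the sample watermark above.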
  7. Specify a preview mode.
    AliLive SDK for iOS supports the following preview modes. The preview mode does not affect stream ingest.
    • ALIVC_LIVE_PUSHER_PREVIEW_SCALE_FILL: In this mode, the video fills the entire preview view. If the aspect ratio of the video is not the same as the aspect ratio of the preview view, the preview image is deformed.
    • ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FIT: In this mode, the initial aspect ratio of the video is used during the preview. If the aspect ratio of the video is not the same as the aspect ratio of the preview view, black bars appear on the preview view. This is the default preview mode.
    • ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FILL: In this mode, the aspect ratio of the video is changed to fit the preview view. If the aspect ratio of the video is not the same as the aspect ratio of the preview view, the video is cropped to fit the preview view.

    The following sample code provides an example on how to specify a preview mode:

    config.previewDisplayMode = ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FIT; // Set the preview mode.
    Note
    • You can specify a preview mode in the AlivcLivePushConfig class. You can also call the setpreviewDisplayMode method to specify a preview mode during preview and stream ingest.
    • This configuration takes effect only for preview. The actual resolution of the output video follows the resolution that is configured in the AlivcLivePushConfig class. The preview mode does not affect the actual resolution. You can select a preview mode to adapt to different screen sizes of mobile phones.

Use AliLive SDK for iOS to ingest streams

AlivcLivePusher is the core class of AliLive SDK for iOS. This class provides parameters for video preview, stream ingest callback, and stream ingest management. You can also use this class to set parameters during stream ingest. This section describes how to use the key methods for stream ingest.

  1. Initialize the stream ingest parameters.

    Call the initWithConfig method to initialize the stream ingest parameters. The following sample code provides an example on how to initialize the stream ingest parameters:

    self.livePusher = [[AlivcLivePusher alloc] initWithConfig:config];
    Note The AlivcLivePusher class does not support multiple instances. Therefore, AliLive SDK for iOS provides a destroy method for each init method.
  2. Register stream ingest callbacks.
    Three types of stream ingest callbacks are provided:
    • Info: the callbacks that are used for notifications and status detection.
    • Error: the callbacks that are used when errors occur.
    • Network: the callbacks that are used to manage network services.
    You can use the delegate method to receive the specified callbacks. The following sample code provides an example on how to configure the delegate method to receive the specified callbacks:
    [self.livePusher setInfoDelegate:self];
    [self.livePusher setErrorDelegate:self];
    [self.livePusher setNetworkDelegate:self];
  3. Start preview.

    You can start preview after you initialize the livePusher object. Create a view instance in the UIView class or use a class that inherits from UIView to start preview. The following sample code provides an example on how to start preview:

    [self.livePusher startPreview:self.view];
  4. Start stream ingest.

    You can start stream ingest only after the preview succeeds. Therefore, add the following code to the onPreviewStarted callback of the AlivcLivePusherInfoDelegate protocol; a fuller sketch follows the notes below:

    [self.livePusher startPushWithURL:@"Ingest URL for test (rtmp://......)"];
    Note
    • AliLive SDK for iOS allows you to call the startPushWithURLAsync method to start stream ingest in an asynchronous manner.
    • AliLive SDK for iOS supports the URLs of streams that are ingested over RTMP. For more information about how to obtain ingest URLs, see Ingest and streaming URLs.
    • Start stream ingest by using a valid ingest URL. Then, use a player, such as ApsaraVideo Player, FFplay, or VLC, to test stream pulling. For more information about how to obtain source URLs, see Ingest and streaming URLs.
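
    A minimal sketch of this flow (this assumes the info delegate declares an onPreviewStarted: callback that passes the pusher instance; check AlivcLivePusher.h for the exact signature):

    - (void)onPreviewStarted:(AlivcLivePusher *)pusher {
        // The preview succeeded, so it is now safe to start stream ingest.
        [self.livePusher startPushWithURL:@"Ingest URL for test (rtmp://......)"];
    }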
  5. Complete other stream ingest configurations.

    AliLive SDK for iOS allows you to manage stream ingest. For example, you can start, stop, restart, pause, and resume stream ingest, stop preview, and destroy stream ingest objects. You can add buttons to perform these operations. The following sample code provides an example on how to manage stream ingest:

    /* You can pause a stream that is being ingested. If you pause a stream that is being ingested, the video preview and the video stream ingest are paused at the last frame, and the audio stream continues to be ingested. */
    [self.livePusher pause];
    /* You can resume stream ingest. After you resume stream ingest, the preview and ingest of audio and video streams are resumed. */
    [self.livePusher resume];
    /* You can stop a stream that is being ingested. */
    [self.livePusher stopPush];
    /* You can stop preview. However, this operation does not take effect for a stream that is being ingested. When the preview is stopped, the preview view is frozen at the last frame. */
    [self.livePusher stopPreview];
    /* You can restart stream ingest when the stream is being ingested or when all error callbacks are received. If an error occurs, you can use only this method or the reconnectPushAsync method to restart stream ingest. You can also use the destroy method to destroy the stream ingest object. Then, you can restart all AlivcLivePusher resources that are required for operations, such as the preview and stream ingest. */
    [self.livePusher restartPush];
    /* You can use this method when the stream is being ingested or when error callbacks related to the AlivcLivePusherNetworkDelegate class are received. If an error occurs, you can use only this method or the restartPush method to restart stream ingest. You can also use the destroy method to destroy the stream ingest object. Then, you can restart stream ingest over Real Time Messaging Protocol (RTMP). */
    [self.livePusher reconnectPushAsync];
    /* After the stream ingest object is destroyed, stream ingest and preview are stopped, and preview views are removed. All resources related to the AlivcLivePusher class are destroyed. */
    [self.livePusher destory];
    self.livePusher = nil;
    /* Query the status of stream ingest. */
    AlivcLivePushStatus status = [self.livePusher getLiveStatus];
  6. Adjust the level of retouching effects in real time.

    AliLive SDK for iOS allows you to adjust the level of retouching effects in real time during stream ingest. You can enable the retouching feature and set the parameters for the retouching feature based on your business requirements. This feature is provided by Queen SDK. The following sample code provides an example on how to set retouching parameters:

    [_queenEngine setQueenBeautyType:kQueenBeautyTypeSkinBuffing enable:YES];
    [_queenEngine setQueenBeautyType:kQueenBeautyTypeSkinWhiting enable:YES];
    [_queenEngine setQueenBeautyParams:kQueenBeautyParamsWhitening value:0.8f];
    [_queenEngine setQueenBeautyParams:kQueenBeautyParamsSharpen value:0.6f];
    [_queenEngine setQueenBeautyParams:kQueenBeautyParamsSkinBuffing value:0.6];
  7. Manage background music.

    AliLive SDK for iOS allows you to manage background music. For example, you can configure the playback of background music and the audio mixing, noise reduction, in-ear monitoring, and muting features. You can call background music methods only after the preview starts. The following sample code provides an example on how to manage background music:

    /* Start the playback of background music. */
    [self.livePusher startBGMWithMusicPathAsync:musicPath];
    /* Stop the playback of background music. To change the background music, call the method that is used to start the playback of background music. You do not need to stop the playback of the current background music. */
    [self.livePusher stopBGMAsync];
    /* Pause the playback of background music. You can call this method only after the playback of background music starts. */
    [self.livePusher pauseBGM];
    /* Resume the playback of background music. You can call this method only after the playback of background music is paused. */
    [self.livePusher resumeBGM];
    /* Enable looping. */
    [self.livePusher setBGMLoop:true];
    /* Configure noise reduction. If you enable noise reduction, the system filters out non-vocal parts from collected audio. This feature may slightly reduce the volume of the human voice. We recommend that you allow your users to determine whether to enable this feature. By default, this feature is disabled. */
    [self.livePusher setAudioDenoise:true];
    /* Configure in-ear monitoring. In-ear monitoring is suitable for scenarios that involve karaoke. If you enable in-ear monitoring, you can hear your voice on your earphones during streaming. If you disable in-ear monitoring, you cannot hear your voice on your earphones during streaming. This feature does not take effect if no earphones are detected. */
    [self.livePusher setBGMEarsBack:true];
    /* Adjust the volumes of the background music and the human voice to configure audio mixing. */
    [self.livePusher setBGMVolume:50]; // Adjust the volume of the background music.
    [self.livePusher setCaptureVolume:50]; // Adjust the volume of the human voice.
    /* Configure muting. If you enable this feature, the background music and the human voice are muted. To separately mute the background music or the human voice, call the method that is used to configure audio mixing. */
    [self.livePusher setMute:isMute];
  8. Perform operations on the camera.

    You can perform camera-related operations only after you start preview, in the streaming, paused, or reconnecting state. For example, you can switch between cameras and configure the flash, focal length, zooming, and mirroring mode. If you do not start preview, the following methods are invalid. The following sample code provides an example on how to perform operations on the camera:

    /* Switch between the front and rear cameras. */
    [self.livePusher switchCamera];
    /* Enable or disable the flash. You cannot enable the flash for the front camera. */
    [self.livePusher setFlash:false]; 
    /* Adjust the focal length to zoom in and out images. If you set the input parameter to a positive number, the system increases the focal length. If you set the input parameter to a negative number, the system decreases the focal length. */
    CGFloat max = [_livePusher getMaxZoom];
    [self.livePusher setZoom:MIN(1.0, max)]; 
    /* Configure manual focus. To configure manual focus, you must set the following parameters: point and autoFocus. The point parameter specifies the coordinates of the focus point. The autoFocus parameter specifies whether to enable autofocus. The autoFocus parameter takes effect only when you call this method. The value of the autoFocus parameter determines whether autofocus is enabled later. */
    [self.livePusher focusCameraAtAdjustedPoint:CGPointMake(50, 50) autoFocus:true];
    /* Configure autofocus. */
    [self.livePusher setAutoFocus:false];
    /* Configure the mirroring mode. The methods for mirroring are PushMirror and PreviewMirror. The PushMirror method is used to enable the mirroring mode for stream ingest. The PreviewMirror method is used to enable the mirroring mode for preview. The PushMirror method takes effect only for playback images. The PreviewMirror method takes effect only for preview views. */
    [self.livePusher setPushMirror:false];
    [self.livePusher setPreviewMirror:false];
  9. Use the live quiz feature.

    To use the live quiz feature, you must insert supplemental enhancement information (SEI) into live streams and parse SEI by using the player. AliLive SDK for iOS provides a method to insert SEI. You can call this method only during stream ingest. The following sample code provides an example on how to enable live quiz:

    /*
    sendMessage: Specify the SEI messages to be inserted into the live stream. The SEI messages are in the JSON format. ApsaraVideo Player SDK can receive and parse SEI messages. 
    repeatCount: Specify the number of frames into which the SEI messages are inserted. To ensure that no SEI messages are dropped for a frame, you must specify the number of repetitions. For example, a value of 100 indicates that SEI messages are inserted into the subsequent 100 frames. ApsaraVideo Player SDK removes duplicate SEI messages. 
    delayTime: Specify the period of time to wait before the frames are sent. Unit: milliseconds. 
    KeyFrameOnly: Specify whether to send only keyframes. 
    */
    [self.livePusher sendMessage:@"Information about questions" repeatCount:100 delayTime:0 KeyFrameOnly:false];
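
    For example, a minimal sketch that serializes a quiz payload to a JSON string before it is sent (the payload fields are hypothetical; use the schema that your player-side parser expects):

    // Build a hypothetical quiz payload and serialize it to JSON.
    NSDictionary *payload = @{@"question": @"Which team wins?", @"options": @[@"A", @"B"]};
    NSData *jsonData = [NSJSONSerialization dataWithJSONObject:payload options:0 error:nil];
    NSString *message = [[NSString alloc] initWithData:jsonData encoding:NSUTF8StringEncoding];
    [self.livePusher sendMessage:message repeatCount:100 delayTime:0 KeyFrameOnly:false];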
  10. Configure external audio and video sources.

    AliLive SDK for iOS allows you to import external audio and video sources for stream ingest. For example, you can ingest an audio or video file.

    1. Configure the input of external audio and video sources in stream ingest settings.
      The following sample code provides an example on how to configure external audio and video sources:
      config.externMainStream = true; // Enable the input of external streams.
      config.externVideoFormat = AlivcLivePushVideoFormatYUVNV21; // Specify the color format for video data. In this example, the color format is YUVNV21. You can also use other formats based on your business requirements. 
      config.externAudioFormat = AlivcLivePushAudioFormatS16; // Specify the bit depth format for audio data. In this example, the bit depth format is S16. You can also use other formats based on your business requirements.
    2. Import external video data.
      The following sample code provides an example on how to import external video data:
      /* The sendVideoData method supports only native YUV and RGB buffer data. You can use the sendVideoData method to transmit the buffer, length, width, height, timestamp, and rotation angle of video data. */
      [self.livePusher sendVideoData:yuvData width:720 height:1280 size:dataSize pts:nowTime rotation:0];
      /* For CMSampleBufferRef video data, you can use the sendVideoSampleBuffer method. */
      [self.livePusher sendVideoSampleBuffer:sampleBuffer];
      /* You can also convert CMSampleBufferRef video data to native buffer before you use the sendVideoData method. The following sample code provides an example on how to convert video data. */
      // Query the data length of the sample buffer.
      - (int)getVideoSampleBufferSize:(CMSampleBufferRef)sampleBuffer {
          if (!sampleBuffer) {
              return 0;
          }
          int size = 0;
          CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
          CVPixelBufferLockBaseAddress(pixelBuffer, 0);
          if (CVPixelBufferIsPlanar(pixelBuffer)) {
              int count = (int)CVPixelBufferGetPlaneCount(pixelBuffer);
              for (int i = 0; i < count; i++) {
                  int height = (int)CVPixelBufferGetHeightOfPlane(pixelBuffer, i);
                  int stride = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, i);
                  size += stride * height;
              }
          } else {
              int height = (int)CVPixelBufferGetHeight(pixelBuffer);
              int stride = (int)CVPixelBufferGetBytesPerRow(pixelBuffer);
              size += stride * height;
          }
          CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
          return size;
      }
      // Convert the video sample buffer to a native buffer.
      - (int)convertVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer toNativeBuffer:(void *)nativeBuffer {
          if (!sampleBuffer || !nativeBuffer) {
              return -1;
          }
          CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
          CVPixelBufferLockBaseAddress(pixelBuffer, 0);
          int size = 0;
          if (CVPixelBufferIsPlanar(pixelBuffer)) {
              int count = (int)CVPixelBufferGetPlaneCount(pixelBuffer);
              for (int i = 0; i < count; i++) {
                  int height = (int)CVPixelBufferGetHeightOfPlane(pixelBuffer, i);
                  int stride = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, i);
                  void *buffer = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, i);
                  int8_t *dstPos = (int8_t *)nativeBuffer + size;
                  memcpy(dstPos, buffer, stride * height);
                  size += stride * height;
              }
          } else {
              int height = (int)CVPixelBufferGetHeight(pixelBuffer);
              int stride = (int)CVPixelBufferGetBytesPerRow(pixelBuffer);
              void *buffer = CVPixelBufferGetBaseAddress(pixelBuffer);
              size += stride * height;
              memcpy(nativeBuffer, buffer, size);
          }
          CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
          return 0;
      }
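
      A minimal sketch that combines the helpers above with the sendVideoData method (the dimensions and the nowTime timestamp are placeholders carried over from the earlier example):

      // Convert a CMSampleBufferRef to a native buffer, then ingest it.
      int dataSize = [self getVideoSampleBufferSize:sampleBuffer];
      NSMutableData *yuvData = [NSMutableData dataWithLength:dataSize];
      [self convertVideoSampleBuffer:sampleBuffer toNativeBuffer:yuvData.mutableBytes];
      [self.livePusher sendVideoData:(char *)yuvData.bytes width:720 height:1280 size:dataSize pts:nowTime rotation:0];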
    3. Import external audio data.
      The following sample code provides an example on how to import external audio data:
      /* The sendPCMData method supports only native pulse-code modulation (PCM) buffer data. You can use this method to transmit the buffer, length, and timestamp of audio data. */
      [self.livePusher sendPCMData:pcmData size:size pts:nowTime];
  11. Add animated stickers.

    AliLive SDK for iOS allows you to add animated stickers as watermarks to live streams.

    1. To create an animated sticker, you can modify the materials in the demo. Create a sequence frame image for the animated sticker. Set the following parameters in the config.json file:
      "du": 2.04, // Specify the duration for each time the animated sticker is played.
      "n": "qizi", // Specify the name of the animated sticker. Make sure that the name of the folder in which the animated sticker is created is the same as the name of the sticker. The name of the sticker contains the name followed by the sequence number. Example: qizi0.
      "c": 68.0, // Specify the number of animation frames, which is the number of images included in an animated sticker.
      "kerneframe": 51, // Specify an image as the keyframe. For example, specify the 51st frame as the keyframe in the demo. Make sure that the specified frame exists.
      "frameArry": [
          {"time":0,"pic":0},
          {"time":0.03,"pic":1},
          {"time":0.06,"pic":2},
          ],
      // Set the parameters of the animated sticker. In the preceding settings, "time":0,"pic":0 indicates that the first frame qizi0 is displayed 0 seconds after the start. "time":0.03,"pic":1 indicates that the second frame qizi1 is displayed 0.03 seconds after the start. Specify all frames in the animation in the same manner.
      Note Configure other fields as described in the .json file in the demo.
    2. Add an animated sticker.
      The following sample code provides an example on how to add an animated sticker:
      /**
      * Add an animated sticker.
      * @param path Specify the path of the animated sticker. The path must contain config.json.
      * @param x Specify the starting position on the x-axis. Valid values: 0 to 1.0f.
      * @param y Specify the starting position on the y-axis. Valid values: 0 to 1.0f.
      * @param w Specify the width. Valid values: 0 to 1.0f.
      * @param h Specify the height. Valid values: 0 to 1.0f.
      * @return id Specify the ID of the sticker. You must specify the sticker ID if you want to remove a sticker.
      */
      int stickerId = [self.livePusher addDynamicWaterMarkImageDataWithPath:@"Sticker path" x:0.2f y:0.2f w:0.2f h:0.2f]; // Save the returned sticker ID so that you can remove the sticker later.
    3. Remove an animated sticker.
      The following sample code provides an example on how to remove an animated sticker:
      [self.livePusher removeDynamicWaterMark:stickerId];
  12. Configure debugging tools.

    DebugView is a UI debugging tool that allows you to diagnose issues. DebugView provides a window that is draggable and always displayed at the top of the view for debugging. You can use DebugView to query the logs of stream ingest, monitor the metrics for stream ingest performance in real time, and generate line charts for main performance metrics.

    Note Do not call the method that invokes DebugView in released versions of your app.
    The following sample code provides an example on how to invoke DebugView:
    [AlivcLivePusher showDebugView]; // Show DebugView.
  13. Call other methods.
    /* In custom mode, you can change the minimum bitrate and the maximum bitrate in real time. */
    [self.livePusher setTargetVideoBitrate:800];
    [self.livePusher setMinVideoBitrate:200];
    /* Query whether the stream is being ingested. */
    BOOL isPushing = [self.livePusher isPushing]; 
    /* Query the ingest URL. */
    NSString *pushURLString = [self.livePusher getPushURL];
    /* Query the debugging information about stream ingest performance. For information about the parameters of stream ingest performance, see the API references or comments in the code. */
    AlivcLivePushStatsInfo *info = [self.livePusher getLivePushStatusInfo];
    /* Query the version number. */
    NSString *sdkVersion = [self.livePusher getSDKVersion];
    /* Specify the log level to filter debugging information. */
    [self.livePusher setLogLevel:(AlivcLivePushLogLevelDebug)];

Configure stream ingest for screen recordings

ReplayKit is a framework introduced in iOS 9 that allows you to record the screen. In iOS 10, ReplayKit added support for third-party app extensions that stream screen recordings. In iOS 10 or later, you can use AliLive SDK for iOS together with app extensions to stream screen recordings.

To ensure system stability, iOS provides limited resources for the app extension that captures screen recordings. If the app extension occupies an excessive amount of memory, it is terminated. To work within these memory limits, AliLive SDK for iOS splits the work between the extension app and the host app. The extension app captures screen recordings and sends the captured content to the host app by using inter-process communication. The host app creates an AlivcLivePusher object, which ingests the screen recordings. The entire stream ingest process is completed in the host app. You can also configure audio collection and the transmission of collected audio in the host app. The extension app is used only to capture screen recordings.
Notice In the demo of AliLive SDK for iOS, the inter-process communication between the extension app and the host app is enabled by using an app group. This part of logic is encapsulated in AlivcLibReplayKitExt.framework.
To stream screen recordings in iOS, the system creates the extension app to capture screen recordings. The following procedure shows how to stream screen recordings in iOS:
  1. Create an app group.
    Log on to the Apple Developer console and perform the following operations:
    1. On the Certificates, Identifiers & Profiles page, register an app group. For more information, see Register an app group.
    2. On the Identifiers page, click App IDs and select your app ID. Click the app ID and enable the app group. You must perform the preceding operations on the app IDs of both the host app and the extension app. For more information, see Enable app capabilities.
    3. Download the regenerated provisioning profiles and reconfigure the provisioning profiles in Xcode.
    Then, the extension app can communicate with the host app.
    Note After you create an app group, you must save the value of the app group identifier. This value is used in subsequent steps.
  2. Create an extension app.
    In the demo, AliLive SDK for iOS provides the AlivcLiveBroadcast and AlivcLiveBroadcastSetupUI app extensions to stream screen recordings. To create an extension app in an app, perform the following steps:
    1. In the current project, choose New > Target. Then, select Broadcast Upload Extension.
    2. Set the Product Name parameter, select Include UI Extension, and then click Finish. A live streaming extension and a live streaming UI are created.
    3. Configure the extension target. By default, Xcode creates a header file and a source file named SampleHandler in the target that you created.
      Drag AlivcLibReplayKitExt.framework to the project so that the extension target depends on AlivcLibReplayKitExt.framework. Then, replace the code in SampleHandler.m with the following code. You must replace kAPPGROUP in the code with the app group identifier that you created in the preceding step:
      
      #import "SampleHandler.h"
      #import <AlivcLibReplayKitExt/AlivcLibReplayKitExt.h>
      
      @implementation SampleHandler
      
      - (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *, NSObject *> *)setupInfo {
          // The user has requested to start the broadcast. Setup info from the UI extension can be supplied but is optional.
          [[AlivcReplayKitExt sharedInstance] setAppGroup:kAPPGROUP];
      }
      
      - (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
          if (sampleBufferType != RPSampleBufferTypeAudioMic) {
              // The sound is collected and sent by the host app.
              [[AlivcReplayKitExt sharedInstance] sendSampleBuffer:sampleBuffer withType:sampleBufferType];
          }
      }
      
      - (void)broadcastFinished {
      
          [[AlivcReplayKitExt sharedInstance] finishBroadcast];
      }
      @end
      
    The Broadcast Upload Extension target is now created in your project, and the screen recording extension is integrated by using AlivcLibReplayKitExt.framework.
  3. Integrate AliLive SDK for iOS into the host app.
    Create the AlivcLivePushConfig and AlivcLivePusher objects in the host app. Set the externMainStream parameter to true. Set the audioFromExternal parameter to false, which specifies that audio collection continues to be processed in the SDK. Call the startScreenCapture method to receive the screen recording data from the extension app. Then, start or stop stream ingest. Perform the following operations to integrate AliLive SDK for iOS into the host app:
    1. Add the AlivcLivePusher.framework, AlivcLibRtmp.framework, RtsSDK.framework, and AlivcLibReplayKitExt.framework dependencies to the host app.
    2. Initialize AliLive SDK for iOS and configure an external video source for stream ingest.
      Set the externMainStream parameter to true, the externVideoFormat parameter to AlivcLivePushVideoFormatYUV420P, and the audioFromExternal parameter to false. Set other stream ingest parameters as needed. The following sample code provides an example:
      self.pushConfig.externMainStream = true;
      self.pushConfig.externVideoFormat = AlivcLivePushVideoFormatYUV420P;
      self.pushConfig.audioSampleRate = 44100;
      self.pushConfig.audioChannel = 2;
      self.pushConfig.audioFromExternal = false;
      self.pushConfig.videoEncoderMode = AlivcLivePushVideoEncoderModeSoft;
      self.pushConfig.qualityMode = AlivcLivePushQualityModeCustom;
      self.pushConfig.targetVideoBitrate = 2500;
      self.pushConfig.minVideoBitrate = 2000;
      self.pushConfig.initialVideoBitrate = 2000;
      self.livePusher = [[AlivcLivePusher alloc] initWithConfig:self.pushConfig];
    3. Call the following methods of the AlivcLivePusher class to use live streaming features:
      • Receive screen recording data.
        Replace kAPPGROUP in the code with the app group identifier that you created in the preceding step. The following sample code provides an example:
        [self.livePusher startScreenCapture:kAPPGROUP];
      • Start stream ingest.
        The following sample code provides an example on how to start stream ingest:
        [self.livePusher startPushWithURL:self.pushUrl];
      • Stop stream ingest.
        The following sample code provides an example on how to stop stream ingest:
        
        [self.livePusher stopPush];
        [self.livePusher destory];
        self.livePusher = nil;

Considerations

  • Package size
    • The size of AliLive SDK for iOS is 10.7 MB.
    • After you integrate AliLive SDK for iOS, the size of the iOS app package (.ipa file) increases by about 2.8 MB.
  • Compatible mobile phones

    iPhone 5s or later, with iOS 8.0 or later.

  • Version updates

    Remove the existing version of AliLive SDK for iOS before you update the SDK to the latest version. For more information about how to update AliLive SDK for iOS, see Update Push SDK for iOS from V4.0.2 to V4.1.0 or later.