
ApsaraVideo Live: Use Push SDK for iOS

Last Updated: Mar 05, 2024

This topic describes how to use Push SDK for iOS, including its classes and methods, and provides examples of how to use the features of Push SDK for iOS.

Note

For more information about how to ingest streams on mobile devices, see Stream ingest, stream pulling, and streaming.

Features

  • Supports stream ingest over Real-Time Messaging Protocol (RTMP).

  • Supports stream ingest and stream pulling over the Real-Time Streaming (RTS) protocol that is based on Real-Time Communication (RTC).

  • Supports co-streaming and battles.

  • Adopts H.264 for video encoding and Advanced Audio Coding (AAC) for audio encoding.

  • Supports custom configurations for features such as bitrate control, resolution, and display mode.

  • Supports various camera operations.

  • Supports real-time retouching and allows you to adjust retouching effects.

  • Allows you to add animated stickers as animated watermarks and remove animated stickers.

  • Allows you to stream screen recordings.

  • Supports external audio and video inputs in different formats such as YUV and pulse-code modulation (PCM).

  • Supports mixing of multiple streams.

  • Supports ingest of audio-only and video-only streams and stream ingest in the background.

  • Supports background music and allows you to manage background music.

  • Supports the capture of video snapshots.

  • Supports automatic reconnection and error handling.

  • Supports the audio 3A algorithms: acoustic echo cancellation (AEC), automatic noise suppression (ANS), and automatic gain control (AGC).

  • Allows you to switch between software and hardware video encoding modes, which improves the stability of the encoding module.

Limits

Take note of the following limits before you use Push SDK for iOS:

  • You must configure screen orientation before stream ingest. You cannot rotate the screen during live streaming.

  • You must disable auto screen rotation for stream ingest in landscape mode.

  • In hardware encoding mode, each dimension of the output resolution must be a multiple of 16 to be compatible with the encoder. For example, if you set the resolution to 540p, the actual output resolution is 544 × 960. You must scale the player view based on the output resolution to prevent black bars.
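For reference, the following sketch shows one way to compute the encoder-aligned output size on the player side. The helper name is hypothetical; the 16-pixel alignment rule is the one described above.

    #import <CoreGraphics/CoreGraphics.h>
    #include <math.h>

    // Round each dimension up to the nearest multiple of 16, which is the
    // alignment that the hardware encoder applies to the output resolution.
    static CGSize AlignedOutputSize(CGSize configured) {
        return CGSizeMake(ceil(configured.width / 16.0) * 16.0,
                          ceil(configured.height / 16.0) * 16.0);
    }

    // Example: AlignedOutputSize(CGSizeMake(540, 960)) returns 544 × 960.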

Procedure

The following table describes how to use the basic edition of Push SDK for iOS.

| Step | Description | References |
| --- | --- | --- |
| 1. Register the SDK. | Configure license-related parameters to register Push SDK for iOS. If you do not invoke the registration function, you cannot use the stream ingest feature. | Register the SDK |
| 2. Configure stream ingest parameters. | Complete the stream ingest configurations, such as the basic parameters, bitrate control mode, adaptive resolution feature, and retouching feature. | Configure stream ingest parameters (basic edition) |
| 3. Use Push SDK for iOS to ingest streams. | After you initialize Push SDK for iOS, register stream ingest callbacks, and create a preview view, you can start to ingest streams. You can manage streams, configure background music, configure camera settings, enable live quiz, ingest external audio sources, and add animated stickers based on your business requirements. Note: ApsaraVideo Live does not allow you to ingest multiple streams to the same URL at the same time. If you attempt to do so, only the first stream is ingested. | Use Push SDK for iOS to ingest streams (basic edition) |
| 4. (Optional) Configure stream ingest for screen recordings. | If you want to stream screen recordings, configure stream ingest for screen recordings. | Configure stream ingest for screen recordings (basic edition) |

The following table describes how to use the interactive edition of Push SDK for iOS.

| Step | Description | References |
| --- | --- | --- |
| 1. Register the SDK. | Configure license-related parameters to register Push SDK for iOS. If you do not invoke the registration function, you cannot use the stream ingest feature. | Register the SDK |
| 2. Configure co-streaming settings. | After you configure co-streaming settings, streamers and viewers can interact with each other at an ultra-low latency of less than 300 ms by using the interactive edition of Push SDK for iOS. | Configure co-streaming settings (interactive edition) |

Register the SDK

In Push SDK for iOS V4.4.2 and later, an all-in-one license is used. You must register Push SDK for iOS before you can use the stream ingest feature. For more information, see Integrate a Push SDK license.
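The following sketch shows the registration flow, assuming the license key and license file are already declared in Info.plist as described in Integrate a Push SDK license. The AlivcLiveBase class and the onLicenceCheck callback are taken from that topic; verify the exact names against your SDK version.

    #import <AlivcLivePusher/AlivcLivePusher.h>

    // Register once at startup, for example in application:didFinishLaunchingWithOptions:.
    - (void)registerPushSDK {
        [AlivcLiveBase setObserver:self]; // Receive the registration result through AlivcLiveBaseObserver.
        [AlivcLiveBase registerSDK];
    }

    // AlivcLiveBaseObserver callback that reports the license check result.
    - (void)onLicenceCheck:(AlivcLiveLicenseCheckResultCode)result message:(NSString *)msg {
        if (result == AlivcLiveLicenseCheckResultCodeSuccess) {
            // Registration succeeded. You can now use the stream ingest feature.
        }
    }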

Configure stream ingest parameters (basic edition)

You can configure stream ingest parameters by using the AlivcLivePushConfig class. Each parameter has a default value. For more information about the default and valid values of each parameter, see API reference for Push SDK for iOS V6.9.0 (basic edition) or API reference for Push SDK for iOS V6.9.0 (interactive edition), or view the comments in the code.

Note

To modify these parameters in real time during stream ingest, refer to the parameters and methods provided by the AlivcLivePusher class.

  1. Complete basic stream ingest configurations.

    Import the header file into the view controller that requires AlivcLivePusher: #import <AlivcLivePusher/AlivcLivePusher.h>. Sample code:

    AlivcLivePushConfig *config = [[AlivcLivePushConfig alloc] init]; // Initialize the AlivcLivePushConfig class. You can also use initWithResolution to perform initialization.
    config.resolution = AlivcLivePushResolution540P; // By default, the resolution is set to 540p. The maximum resolution is 720p.
    config.fps = AlivcLivePushFPS20; // We recommend that you set the frame rate to 20 frames per second (FPS).
    config.enableAutoBitrate = true; // Specify whether to enable adaptive bitrate streaming. The default value is true.
    config.videoEncodeGop = AlivcLivePushVideoEncodeGOP_2; // The default value is 2. The longer the interval between keyframes, the higher the latency. We recommend that you set this parameter to a value from 1 to 2.
    config.connectRetryInterval = 2000; // The reconnection interval in milliseconds. The default reconnection interval is 2 seconds. The reconnection interval cannot be shorter than 1 second. We recommend that you use the default value.
    config.previewMirror = false; // The default value is false. We recommend that you use the default value.
    config.orientation = AlivcLivePushOrientationPortrait; // The default screen orientation is portrait. You can change the orientation to landscape left or landscape right.

    Note
    • We recommend that you set the resolution to 540p based on the performance of mobile phones and network bandwidth requirements. In most cases, mainstream apps for live streaming use 540p.

    • All parameters used for basic stream ingest configurations have a default value. We recommend that you use the default values.

    • If adaptive bitrate streaming is disabled, the bitrate is fixed at the initial value and is not automatically adjusted between the specified target bitrate and the minimum bitrate. In this case, stuttering may occur when the network is unstable. Proceed with caution before you disable this feature.

  2. Specify the bitrate control mode.

    Push SDK for iOS provides the following bitrate control modes. Specify the value of each parameter based on your business requirements.

    • AlivcLivePushQualityModeResolutionFirst: the quality-first mode. Push SDK for iOS configures bitrate parameters to prioritize the quality of video streams. This is the default mode.

      config.qualityMode = AlivcLivePushQualityModeResolutionFirst; // Use the quality-first mode. You can change the mode to smoothness-first or custom mode.

    • AlivcLivePushQualityModeFluencyFirst: the smoothness-first mode. Push SDK for iOS configures bitrate parameters to prioritize the smoothness of video streams.

      config.qualityMode = AlivcLivePushQualityModeFluencyFirst; // Use the smoothness-first mode. You can change the mode to quality-first or custom mode.

    • AlivcLivePushQualityModeCustom: the custom mode. Push SDK for iOS configures bitrate parameters based on your custom settings. If you use the custom mode, you must specify the initial, minimum, and target bitrates:

      • initialVideoBitrate: the initial bitrate when a live stream starts.

      • minVideoBitrate: in poor network conditions, the bitrate is gradually reduced to the minimum bitrate to prevent stuttering.

      • targetVideoBitrate: in good network conditions, the bitrate is gradually increased to the target bitrate to improve the quality of the video stream.

      config.qualityMode = AlivcLivePushQualityModeCustom; // Use the custom mode.
      config.targetVideoBitrate = 1400; // The target bitrate is 1,400 Kbit/s.
      config.minVideoBitrate = 600; // The minimum bitrate is 600 Kbit/s.
      config.initialVideoBitrate = 1000; // The initial bitrate is 1,000 Kbit/s.

    Note
    • If you use the quality-first or smoothness-first mode, you do not need to configure the initialVideoBitrate, minVideoBitrate, and targetVideoBitrate parameters. Push SDK for iOS ensures the quality or smoothness of video streams when network jitter occurs.

    • If you use the custom mode, configure the bitrate parameters based on the recommended settings in the following tables.

    Table 1. Recommended settings for custom bitrate in quality-first mode (unit: Kbit/s)

    | Resolution | initialVideoBitrate | minVideoBitrate | targetVideoBitrate |
    | ---------- | ------------------- | --------------- | ------------------ |
    | 360p       | 600                 | 300             | 1000               |
    | 480p       | 800                 | 300             | 1200               |
    | 540p       | 1000                | 600             | 1400               |
    | 720p       | 1500                | 600             | 2000               |
    | 1080p      | 1800                | 1200            | 2500               |

    Table 2. Recommended settings for custom bitrate in smoothness-first mode (unit: Kbit/s)

    | Resolution | initialVideoBitrate | minVideoBitrate | targetVideoBitrate |
    | ---------- | ------------------- | --------------- | ------------------ |
    | 360p       | 400                 | 200             | 600                |
    | 480p       | 600                 | 300             | 800                |
    | 540p       | 800                 | 300             | 1000               |
    | 720p       | 1000                | 300             | 1200               |
    | 1080p      | 1500                | 1200            | 2200               |

  3. Configure the adaptive resolution feature.

    When you enable the adaptive resolution feature, the resolution is automatically reduced to ensure the smoothness and quality of video streams in poor network conditions. Sample code:

    config.enableAutoResolution = YES; // Specify whether to enable adaptive resolution. The default value is NO.
    Important
    • The adaptive resolution feature is not supported by all players. If you need to use this feature, we recommend that you use ApsaraVideo Player.

    • The adaptive resolution feature takes effect only when you use the quality-first or smoothness-first mode by configuring the AlivcQualityModeEnum parameter. This feature is unavailable if you use the custom mode.

  4. Configure the retouching feature.

    Push SDK for iOS provides basic and advanced retouching effects. Basic retouching effects include skin whitening, skin smoothing, and rosy cheeks. Advanced retouching effects based on facial recognition include skin whitening, skin smoothing, rosy cheeks, big eyes, face resizing, and face slimming. For more information, see Overview. Sample code:

    # pragma mark - "Retouching effects and related methods/**
    * @brief Enable or disable a retouching effect.
    * @param type Specify a value for the QueenBeautyType parameter.
    * @param isOpen YES: enables the retouching effect. NO: disables the retouching effect.
    *
    */
    - (void)setQueenBeautyType:(kQueenBeautyType)type enable:(BOOL)isOpen;
    /**
    * @brief Set retouching parameters.
    * @param param Specify a retouching parameter. This parameter is a field of the QueenBeautyParams parameter.
    * @param value Specify a value for the retouching parameter. Valid values: 0 to 1. Values smaller than 0 are treated as 0, and values greater than 1 are treated as 1.
    */
    - (void)setQueenBeautyParams:(kQueenBeautyParams)param
    value:(float)value;
    # pragma mark - "Methods for filters"
    /**
    * @brief Specify a filter material. Before you specify the filter material, set the kQueenBeautyTypeLUT parameter.
    * @param imagePath Specify the path of the filter material.
    */
    - (void)setLutImagePath:(NSString *)imagePath;
    # pragma mark - "Methods for face shaping"
    /**
    * @brief Specify a face shaping effect. Before you specify the face shaping effect, set the kQueenBeautyTypeFaceShape parameter.
    * @param faceShapeType Specify the face shaping effect that you want to use. This parameter is similar to the QueenBeautyFaceShapeType parameter.
    * @param value Specify a value for the faceShapeType parameter.
    */
    - (void)setFaceShape:(kQueenBeautyFaceShapeType)faceShapeType
    value:(float)value;
    # pragma mark - "Methods for makeup"
    /**
    * @brief Specify a makeup type and the paths of makeup materials. Before you specify the makeup type, set the kQueenBeautyTypeMakeup parameter.
    * @param makeupType Specify a makeup type.
    * @param imagePaths Specify the paths of makeup materials.
    * @param blend Specify mixed makeup.
    */
    - (void)setMakeupWithType:(kQueenBeautyMakeupType)makeupType
    paths:(NSArray<NSString *> *)imagePaths
    blendType:(kQueenBeautyBlend)blend;
    /**
    * @brief Specify a makeup type and the paths of makeup materials.
    * @param makeupType Specify a makeup type.
    * @param imagePaths Specify the paths of makeup materials.
    * @param blend Specify mixed makeup.
    * @param fps Specify the frame rate.
    */
    - (void)setMakeupWithType:(kQueenBeautyMakeupType)makeupType
    paths:(NSArray<NSString *> *)imagePaths
    blendType:(kQueenBeautyBlend)blend fps:(int)fps;
    /**
    * @brief Configure the transparency for a makeup type. You can specify the gender.
    * @param makeupType Specify a makeup type.
    * @param isFeMale Specify whether the gender is female. YES: female. NO: male.
    * @param alpha Specify the transparency for the makeup.
    */
    - (void)setMakeupAlphaWithType:(kQueenBeautyMakeupType)makeupType
    female:(BOOL)isFeMale alpha:(float)alpha;
    /**
    * @brief Specify the type of mixed makeup.
    * @param makeupType Specify a makeup type.
    * @param blend Specify mixed makeup.
    */
    - (void)setMakeupBlendWithType:(kQueenBeautyMakeupType)makeupType
    blendType:(kQueenBeautyBlend)blend;
    /**
    * @brief Clear all makeup effects.
    */
    - (void)resetAllMakeupType;
  5. Specify an image for background stream ingest.

    Push SDK for iOS allows you to ingest images when your app is switched to the background or when the bitrate is low. This improves user experience. When your app is switched to the background, video stream ingest is paused, and only audio streams are ingested. In this case, you can specify an image to ingest. For example, you can ingest an image that displays a message such as "The streamer will be back soon." to notify the viewers. Sample code:

    config.pauseImg = [UIImage imageNamed:@"image.png"]; // Specify the image for stream ingest when your app is switched to the background.

    You can specify a static image for stream ingest in poor network conditions. If the bitrate is low, the image that you specify is ingested to prevent stuttering. Sample code:

    config.networkPoorImg = [UIImage imageNamed:@"image.png"]; // Specify the image for stream ingest in poor network conditions.
  6. Configure watermarks.

    Push SDK for iOS allows you to add one or more watermarks. Watermarks must be in the PNG format. Sample code:

    NSString *watermarkBundlePath = [[NSBundle mainBundle] pathForResource:@"watermark" ofType:@"png"]; // Specify the path of the watermark.
    [config addWatermarkWithPath:watermarkBundlePath
                 watermarkCoordX:0.1
                 watermarkCoordY:0.1
                  watermarkWidth:0.3]; // Add the watermark.
    Note
    • The values of the watermarkCoordX, watermarkCoordY, and watermarkWidth parameters are relative. For example, a value of 0.1 for the watermarkCoordX parameter indicates that the left edge of the watermark is at the 10% position on the x-axis of the stream ingest screen. If the stream resolution is 540 × 960, the left edge of the watermark is at the 54-pixel position.

    • The height of the watermark is scaled based on the width and height of the source image and the input width value of the watermark.

    • If you want to add a text watermark, you can convert the text into an image and call the addWatermarkWithPath method to add the image as a watermark.

    • To ensure the clarity and smoothness of the edges of the watermark, we recommend that you use a source image that has the same size as the output watermark. For example, if the resolution of the output video is 544 × 960 and the width of the watermark is 0.1f, we recommend that you use a source image whose width is 544 × 0.1 ≈ 54 pixels.
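    To make the relative values concrete, the following sketch (helper name hypothetical) converts the relative watermark parameters into a pixel-space rectangle for a given ingest resolution and source image size:

    // Convert relative watermark parameters into pixels. The watermark height
    // is derived from the aspect ratio of the source image, as described above.
    static CGRect WatermarkRectInPixels(CGFloat coordX, CGFloat coordY, CGFloat width,
                                        CGSize streamSize, CGSize sourceImageSize) {
        CGFloat w = width * streamSize.width;                           // 0.3 × 540 = 162 px
        CGFloat h = w * sourceImageSize.height / sourceImageSize.width; // Keep the source aspect ratio.
        return CGRectMake(coordX * streamSize.width,                    // 0.1 × 540 = 54 px
                          coordY * streamSize.height,                   // 0.1 × 960 = 96 px
                          w, h);
    }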

  7. Specify the preview mode.

    Push SDK for iOS supports the following preview modes. The preview mode does not affect stream ingest.

    • ALIVC_LIVE_PUSHER_PREVIEW_SCALE_FILL: In this mode, the video fills the entire preview window. If the aspect ratio of the video is not the same as the aspect ratio of the preview window, deformation occurs during preview.

    • ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FIT: In this mode, the original aspect ratio of the video is used during preview. If the aspect ratio of the video is not the same as the aspect ratio of the preview window, black bars appear on the preview window. This is the default preview mode.

    • ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FILL: In this mode, the video is cropped to fit the preview window during preview. If the aspect ratio of the video is not the same as the aspect ratio of the preview window, the video is cropped.

    Sample code:

    config.previewDisplayMode = ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FIT; // Specify the preview mode.
    Note
    • You can specify the preview mode in the AlivcLivePushConfig class. You can also call the setPreviewDisplayMode method to change the preview mode during preview and stream ingest.

    • This configuration takes effect only for preview. The actual resolution of the output video follows the resolution that is configured in the AlivcLivePushConfig class. The preview mode does not affect the actual resolution. You can select a preview mode to adapt to different screen sizes of mobile phones.

Use Push SDK for iOS to ingest streams (basic edition)

AlivcLivePusher is the core class of Push SDK for iOS. This class provides parameters for camera preview, stream ingest callbacks, and stream ingest management. You can also use this class to modify parameters during stream ingest. This section describes how to use the key methods for stream ingest.

  1. Initialize the AlivcLivePusher class.

    After you configure stream ingest parameters, call the initWithConfig method to initialize the class. Sample code:

    self.livePusher = [[AlivcLivePusher alloc] initWithConfig:config];
    Note

    The AlivcLivePusher class does not support multiple instances. Therefore, you must call the destroy method once for each call of the initWithConfig method.

  2. Register stream ingest callbacks.

    The following stream ingest callbacks are supported:

    • Info: the callbacks that are used for notifications and status detection.

    • Error: the callbacks that are returned when errors occur.

    • Network: the callbacks that are related to network.

    You can use the delegate method to receive the specified callbacks. Sample code:

    [self.livePusher setInfoDelegate:self];
    [self.livePusher setErrorDelegate:self];
    [self.livePusher setNetworkDelegate:self];
  3. Start preview.

    You can start preview after you initialize the livePusher object. To start preview, pass a UIView instance or an instance of a class that inherits from UIView. Sample code:

    [self.livePusher startPreview:self.view];
  4. Start stream ingest.

    You can start stream ingest only after the preview succeeds. Therefore, call the following method in the onPreviewStarted callback of AlivcLivePusherInfoDelegate:

    [self.livePusher startPushWithURL:@"Ingest URL for testing (rtmp://......)"];
    Note
    • Push SDK for iOS allows you to call startPushWithURLAsync to start stream ingest in an asynchronous manner.

    • Push SDK for iOS supports ingest URLs in the RTMP format and RTS format. Compared with ingest URLs in the RTMP format, ingest URLs in the RTS format provide improved stability and perform better in poor network conditions. We recommend that you use ingest URLs in the RTS format. For more information about the comparison between ingest URLs in the RTMP format and RTS format and how to ingest streams over RTS, see Use Push SDK to ingest streams over RTS.

    • After you use a valid ingest URL to start stream ingest, you can use a player such as ApsaraVideo Player, FFplay, or VLC to test stream pulling. For information about how to obtain a streaming URL, see Generate ingest and streaming URLs.

    • The SDK provides built-in configurations for the background mode. When you switch the app to the background, the video is paused at the last frame, and the app continues to play audio in the background. To ensure that audio can be collected when the app is switched to the background, open your project in Xcode, and on the Signing & Capabilities tab, select Audio, AirPlay, and Picture in Picture in the Background Modes section. For more information, see Switch the app to the background or answer a phone call.
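    The following sketch shows one way to wire this up: implement the onPreviewStarted callback of AlivcLivePusherInfoDelegate and start ingest from there. The callback signature follows the API reference and should be verified against your SDK version; the URL is a placeholder.

    - (void)onPreviewStarted:(AlivcLivePusher *)pusher {
        // Preview is ready, so it is now safe to start stream ingest.
        [pusher startPushWithURLAsync:@"Ingest URL for testing (rtmp://......)"];
    }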

  5. Complete other stream ingest configurations.

    Push SDK for iOS allows you to manage stream ingest. For example, you can start, stop, restart, pause, and resume stream ingest, stop preview, and destroy stream ingest instances. You can add buttons to perform these operations. Sample code:

    /* Call the pause method to switch from camera ingest to image ingest after you configure the pauseImage parameter. The audio stream continues to be ingested. */
    [self.livePusher pause];
    /* Switch from image ingest to camera ingest. The audio stream continues to be ingested. */
    [self.livePusher resume];
    /* Stop a stream that is being ingested. */
    [self.livePusher stopPush];
    /* Stop preview. However, this operation does not take effect for a stream that is being ingested. When preview is stopped, the preview window is frozen at the last frame. */
    [self.livePusher stopPreview];
    /* Restart stream ingest when the stream is being ingested or when an error callback is received. If an error occurs, you can use only this method or the reconnectPushAsync method to restart stream ingest. You can also call the destroy method to destroy the stream ingest instance and then re-create all AlivcLivePusher resources that are required for operations such as preview and stream ingest. */
    [self.livePusher restartPush];
    /* Call this method when the stream is being ingested or when an error callback related to AlivcLivePusherNetworkDelegate is received. If an error occurs, you can use only this method or the restartPush method to restart stream ingest. You can also call the destroy method to destroy the stream ingest instance. Then, you can restart stream ingest over RTMP. */
    [self.livePusher reconnectPushAsync];
    /* Destroy the stream ingest instance. After you call this method, stream ingest and preview are stopped, and the preview window is removed. All resources related to AlivcLivePusher are destroyed. */
    [self.livePusher destory];
    self.livePusher = nil;
    /* Query the status of stream ingest. */
    AlivcLivePushStatus status = [self.livePusher getLiveStatus];
  6. Adjust the level of retouching effects in real time.

    Push SDK for iOS allows you to adjust the level of retouching effects in real time during stream ingest. You can enable the retouching feature and set the parameters for the retouching feature based on your business requirements. This feature is provided by Queen SDK. Sample code:

    [_queenEngine setQueenBeautyType:kQueenBeautyTypeSkinBuffing enable:YES];
    [_queenEngine setQueenBeautyType:kQueenBeautyTypeSkinWhiting enable:YES];
    [_queenEngine setQueenBeautyParams:kQueenBeautyParamsWhitening value:0.8f];
    [_queenEngine setQueenBeautyParams:kQueenBeautyParamsSharpen value:0.6f];
    [_queenEngine setQueenBeautyParams:kQueenBeautyParamsSkinBuffing value:0.6];
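    In the preceding sample, _queenEngine is a Queen SDK engine instance. The following minimal creation sketch is based on the Queen SDK documentation; the header path, the QueenEngineConfigInfo class, and the initWithConfigInfo: initializer should be verified against your SDK version.

    #import <queen/QueenEngine.h> // Queen SDK header. The path can differ by version.

    // Create the engine once and reuse it for all per-frame processing.
    QueenEngineConfigInfo *configInfo = [[QueenEngineConfigInfo alloc] init];
    _queenEngine = [[QueenEngine alloc] initWithConfigInfo:configInfo];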
  7. Manage background music.

    Push SDK for iOS allows you to manage background music. For example, you can configure features such as the playback of background music, audio mixing, noise reduction, in-ear monitoring, and muting. You can call relevant methods only after the preview starts. Sample code:

    /* Start the playback of background music. */
    [self.livePusher startBGMWithMusicPathAsync:musicPath];
    /* Stop the playback of background music. If you want to change the background music, call the method that is used to start the playback of background music. You do not need to stop the playback of the current background music. */
    [self.livePusher stopBGMAsync];
    /* Pause the playback of background music. You can call this method only after the playback of background music starts. */
    [self.livePusher pauseBGM];
    /* Resume the playback of background music. You can call this method only after the playback of background music is paused. */
    [self.livePusher resumeBGM];
    /* Enable looping. */
    [self.livePusher setBGMLoop:true];
    /* Configure noise reduction. If you enable noise reduction, the system filters out non-vocal parts from the collected audio. This feature may slightly reduce the volume of the human voice. We recommend that you allow your users to determine whether to enable this feature. By default, this feature is disabled. */
    [self.livePusher setAudioDenoise:true];
    /* Configure in-ear monitoring. In-ear monitoring is suitable for scenarios that involve karaoke. If you enable in-ear monitoring, you can hear your voice on your earphones during streaming. If you disable in-ear monitoring, you cannot hear your voice on your earphones during streaming. This feature does not take effect if no earphones are detected. */
    [self.livePusher setBGMEarsBack:true];
    /* Configure audio mixing. You can adjust the volumes of the background music and human voice. */
    [self.livePusher setBGMVolume:50]; // Specify the volume of the background music.
    [self.livePusher setCaptureVolume:50]; // Specify the volume of the human voice.
    /* Configure muting. If you enable this feature, the background music and human voice are muted. To separately mute the background music or human voice, call the method that is used to configure audio mixing. */
    [self.livePusher setMute:isMute]; // isMute is a BOOL that specifies whether to mute.
  8. Configure stream ingest snapshots.

    Push SDK for iOS allows you to capture snapshots of local video streams. Sample code:

    /* Configure the callback for snapshot capture. */
    [self.livePusher setSnapshotDelegate:self];
    /* Call the snapshot method. */
    [self.livePusher snapshot:1 interval:1];
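    The captured frames are delivered through the snapshot delegate. The following sketch assumes the onSnapshot:image: callback of AlivcLivePusherSnapshotDelegate as documented in the API reference; verify the signature for your SDK version.

    /* Receive a snapshot and persist it, for example to the temporary directory. */
    - (void)onSnapshot:(AlivcLivePusher *)pusher image:(UIImage *)image {
        NSData *pngData = UIImagePNGRepresentation(image);
        NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"snapshot.png"];
        [pngData writeToFile:path atomically:YES];
    }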
  9. Perform camera-related operations.

    You can perform camera-related operations only after you start preview, in the streaming, paused, or reconnecting state. For example, you can switch between the front and rear cameras and configure the flash, focal length, zooming, and mirroring mode. If preview has not started, the following methods are invalid. Sample code:

    /* Switch between the front and rear cameras. */
    [self.livePusher switchCamera];
    /* Enable or disable flash. You cannot enable flash for the front camera. */
    [self.livePusher setFlash:false]; 
    /* Adjust the focal length to zoom in or out. If you set the value to a positive number, the system increases the focal length. If you set the value to a negative number, the system decreases the focal length. */
    CGFloat max = [_livePusher getMaxZoom];
    [self.livePusher setZoom:MIN(1.0, max)]; 
    /* Configure manual focus. To configure manual focus, you must set the following parameters: point and autoFocus. The point parameter specifies the coordinates of the focus point. The autoFocus parameter specifies whether to enable autofocus. The autoFocus parameter takes effect only for this call. Whether autofocus is enabled otherwise depends on the setAutoFocus method. */
    [self.livePusher focusCameraAtAdjustedPoint:CGPointMake(50, 50) autoFocus:true];
    /* Configure autofocus. */
    [self.livePusher setAutoFocus:false];
    /* Configure the mirroring mode. The methods for mirroring are PushMirror and PreviewMirror. The PushMirror method is used to enable the mirroring mode for stream ingest. The PreviewMirror method is used to enable the mirroring mode for preview. PushMirror takes effect only for stream playback, and PreviewMirror takes effect only for preview. */
    [self.livePusher setPushMirror:false];
    [self.livePusher setPreviewMirror:false];
  10. Enable the live quiz feature.

    To enable the live quiz feature, you must insert supplemental enhancement information (SEI) into live streams and parse the SEI by using the player. Push SDK for iOS provides a method to insert SEI. You can call this method only during stream ingest. Sample code:

    /*
    sendMessage: Specify the SEI message to be inserted into the live stream. The SEI message is in the JSON format. ApsaraVideo Player SDK can receive and parse the SEI message. 
    repeatCount: Specify the number of frames into which the SEI message is inserted. To ensure that the SEI message is not dropped for a frame, you must specify the number of repetitions. For example, a value of 100 indicates that the SEI message is inserted into the subsequent 100 frames. ApsaraVideo Player SDK removes duplicate SEI messages. 
    delayTime: Specify the period of time to wait before the frames are sent. Unit: milliseconds. 
    KeyFrameOnly: Specify whether to send only keyframes. 
    */
    [self.livePusher sendMessage:@"Information about questions" repeatCount:100 delayTime:0 KeyFrameOnly:false];
  11. Configure external audio and video sources.

    Push SDK for iOS allows you to import external audio and video sources for stream ingest. For example, you can ingest an audio or video file.

    1. Configure the input of external audio and video sources in stream ingest settings.

      Sample code:

      config.externMainStream = true; // Enable the input of external streams.
      config.externVideoFormat = AlivcLivePushVideoFormatYUVNV21; // Specify the color format for video data. In this example, the color format is YUVNV21. You can also use other formats based on your business requirements.
      config.externAudioFormat = AlivcLivePushAudioFormatS16; // Specify the bit depth format for audio data. In this example, the bit depth format is S16. You can also use other formats based on your business requirements.
    2. Import external video data.

      Sample code:

      /* The sendVideoData method supports only native YUV and RGB buffer data. You can use the sendVideoData method to transmit video data such as the buffer, length, width, height, timestamp, and rotation angle. */
      [self.livePusher sendVideoData:yuvData width:720 height:1280 size:dataSize pts:nowTime rotation:0];
      /* For CMSampleBufferRef video data, you can use the sendVideoSampleBuffer method. */
      [self.livePusher sendVideoSampleBuffer:sampleBuffer];
      /* You can also convert CMSampleBufferRef video data to a native buffer before you use the sendVideoData method. The following sample code provides an example on how to convert video data. */
      // Query the length of the sample buffer.
      - (int)getVideoSampleBufferSize:(CMSampleBufferRef)sampleBuffer {
          if (!sampleBuffer) {
              return 0;
          }
          int size = 0;
          CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
          CVPixelBufferLockBaseAddress(pixelBuffer, 0);
          if (CVPixelBufferIsPlanar(pixelBuffer)) {
              int count = (int)CVPixelBufferGetPlaneCount(pixelBuffer);
              for (int i = 0; i < count; i++) {
                  int height = (int)CVPixelBufferGetHeightOfPlane(pixelBuffer, i);
                  int stride = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, i);
                  size += stride * height;
              }
          } else {
              int height = (int)CVPixelBufferGetHeight(pixelBuffer);
              int stride = (int)CVPixelBufferGetBytesPerRow(pixelBuffer);
              size += stride * height;
          }
          CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
          return size;
      }
      // Convert the video sample buffer to a native buffer.
      - (int)convertVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer toNativeBuffer:(void *)nativeBuffer {
          if (!sampleBuffer || !nativeBuffer) {
              return -1;
          }
          CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
          CVPixelBufferLockBaseAddress(pixelBuffer, 0);
          int size = 0;
          if (CVPixelBufferIsPlanar(pixelBuffer)) {
              int count = (int)CVPixelBufferGetPlaneCount(pixelBuffer);
              for (int i = 0; i < count; i++) {
                  int height = (int)CVPixelBufferGetHeightOfPlane(pixelBuffer, i);
                  int stride = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, i);
                  void *buffer = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, i);
                  int8_t *dstPos = (int8_t *)nativeBuffer + size;
                  memcpy(dstPos, buffer, stride * height);
                  size += stride * height;
              }
          } else {
              int height = (int)CVPixelBufferGetHeight(pixelBuffer);
              int stride = (int)CVPixelBufferGetBytesPerRow(pixelBuffer);
              void *buffer = CVPixelBufferGetBaseAddress(pixelBuffer);
              size += stride * height;
              memcpy(nativeBuffer, buffer, size);
          }
          CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
          return 0;
      }
    3. Import external audio data.

      Sample code:

      /* The sendPCMData method supports only native PCM buffer data. You can use this method to transmit audio data such as the buffer, length, and timestamp. */
      [self.livePusher sendPCMData:pcmData size:size pts:nowTime];
  12. Manage animated stickers.

    Push SDK for iOS allows you to add animated stickers to live streams. Animated stickers can be used as watermarks.

    1. To create an animated sticker, you can use and modify the materials provided in the demo. Create a sequence of frame images for the animated sticker, and configure the following parameters in the config.json file:

      "du": 2.04,// Specify the duration for which each time the animated sticker is played.
      "n": "qizi",// Specify the name of the animated sticker. Make sure that the name of the folder in which the animated sticker is stored is the same as the name of the sticker, and the name of each frame ends with a sequence number. Example: qizi0.
      "c": 68.0,// Specify the number of frames, which is the number of images included in an animated sticker.
      "kerneframe": 51,// Specify an image as the keyframe. For example, specify the 51st frame as the keyframe in the demo. Make sure that the specified frame exists.
      "frameArry": [
          {"time":0,"pic":0},
          {"time":0.03,"pic":1},
          {"time":0.06,"pic":2},
          ],
      // Configure the parameters of the animated sticker. In the preceding settings, "time":0,"pic":0 indicates that the first frame qizi0 is displayed 0 seconds after the animated sticker is played. "time":0.03,"pic":1 indicates that the second frame qizi1 is displayed 0.03 seconds after the animated sticker is played. Configure all frames in the animated sticker in the same manner.
      Note

      For other parameters, you can retain the values in the config.json file provided in the demo.

    2. Add an animated sticker.

      Sample code:

      /**
      * Add an animated sticker.
      * @param path Specify the path of the animated sticker. The path must include the config.json file.
      * @param x Specify the starting position on the x-axis. Valid values: 0 to 1.0f.
      * @param y Specify the starting position on the y-axis. Valid values: 0 to 1.0f.
      * @param w Specify the width. Valid values: 0 to 1.0f.
      * @param h Specify the height. Valid values: 0 to 1.0f.
      * @return The ID of the sticker. You must pass this ID to remove the sticker later.
      */
      int stickerId = [self.livePusher addDynamicWaterMarkImageDataWithPath:@"Path of the sticker" x:0.2f y:0.2f w:0.2f h:0.2f];
    3. Remove an animated sticker.

      Sample code:

      [self.livePusher removeDynamicWaterMark:stickerId]; // stickerId is the ID that was returned when you added the sticker.
  13. Configure the debugging tool.

    DebugView is a UI debugging tool that helps you diagnose issues. DebugView provides a draggable window that always stays on top of the view. You can use DebugView to query stream ingest logs, monitor stream ingest performance metrics in real time, and generate line charts for the main performance metrics.

    Note

    In released versions of your app, do not call the method that is used to invoke DebugView.

    Sample code:

    [AlivcLivePusher showDebugView]; // Open DebugView.
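    To dismiss the window, call the matching hide method (per the API reference; verify the name for your SDK version):

    [AlivcLivePusher hideDebugView]; // Close DebugView, for example in debug-only code paths.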
  14. Call other methods.

    /* In custom mode, you can change the target bitrate and minimum bitrate in real time. */
    [self.livePusher setTargetVideoBitrate:800];
    [self.livePusher setMinVideoBitrate:200];
    /* Query whether the stream is being ingested. */
    BOOL isPushing = [self.livePusher isPushing]; 
    /* Query the ingest URL. */
    NSString *pushURLString = [self.livePusher getPushURL];
    /* Query the stream ingest performance debugging information. For more information about the parameters of stream ingest performance, see the API references or comments in the code. */
    AlivcLivePushStatsInfo *info = [self.livePusher getLivePushStatusInfo];
    /* Query the SDK version number. */
    NSString *sdkVersion = [self.livePusher getSDKVersion];
    /* Specify the log level to filter debugging information. */
    [self.livePusher setLogLevel:(AlivcLivePushLogLevelDebug)];

Configure stream ingest for screen recordings (basic edition)

ReplayKit is a framework introduced in iOS 9 that allows you to record the screen. Starting from iOS 10, ReplayKit supports third-party app extensions for streaming screen recordings. In iOS 10 or later, you can use Push SDK for iOS together with an app extension to stream screen recordings.

To ensure system stability, iOS allocates limited resources to app extensions that capture screen recordings. If an app extension occupies an excessive amount of memory, the app extension is terminated. To work within these memory limits, Push SDK for iOS splits the recording process between the extension app and the host app. The extension app captures screen recordings and sends the captured content to the host app over inter-process communication. The host app creates an AlivcLivePusher object that ingests the screen recordings. The entire stream ingest process is completed in the host app, where you can also configure audio collection and the transmission of collected audio. The extension app is used only to capture screen recordings.

Important

In the demo of Push SDK for iOS, the inter-process communication between the extension app and the host app is enabled by using an app group. This part of logic is encapsulated in AlivcLibReplayKitExt.framework.

To stream screen recordings in iOS, the system runs the extension app to capture the screen. The following procedure shows how to stream screen recordings in iOS:

  1. Create an app group.

    Log on to the Apple Developer portal and perform the following operations:

    1. On the Certificates, Identifiers & Profiles page, register an app group. For more information, see Register an app group.

    2. On the Identifiers page, click App IDs, select your app ID, and then enable the app group capability. You must perform the preceding operations on the app IDs of both the host app and the extension app. For more information, see Enable app capabilities.

    3. Download the regenerated provisioning profiles and reconfigure the provisioning profiles in Xcode.

    Then, the extension app can communicate with the host app.

    Note

    After you create an app group, you must save the value of the app group identifier. This value is used in subsequent steps.

  2. Create an extension app.

    In the demo, Push SDK for iOS provides the AlivcLiveBroadcast and AlivcLiveBroadcastSetupUI app extensions to stream screen recordings. To create an extension app in an app, perform the following steps:

    1. In the current project, choose New > Target. Then, select Broadcast Upload Extension.

    2. Set the Product Name parameter, select Include UI Extension, and then click Finish. A live streaming extension and a live streaming UI are created.

    3. Configure the extension. By default, Xcode creates a header file and a source file named SampleHandler in the target that you created.

      Drag AlivcLibReplayKitExt.framework into the project so that the extension target depends on AlivcLibReplayKitExt.framework. Then, replace the code in SampleHandler.m with the following code. You must replace kAPPGROUP in the code with the app group identifier that you created in the preceding step. Sample code:

      
      #import "SampleHandler.h"
      #import <AlivcLibReplayKitExt/AlivcLibReplayKitExt.h>
      
      @implementation SampleHandler
      
      - (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *,NSObject *>
      *)setupInfo {
      
       //User has requested to start the broadcast. Setup info from the UI extension can
      be supplied but optional.
       [[AlivcReplayKitExt sharedInstance] setAppGroup:kAPPGROUP];
      }
      
      - (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
       if (sampleBufferType != RPSampleBufferTypeAudioMic) {
       // The audio is collected and sent by the host app.
       [[AlivcReplayKitExt sharedInstance] sendSampleBuffer:sampleBuffer withType:sampleBufferType];
       }
      }
      
      - (void)broadcastFinished {
      
       [[AlivcReplayKitExt sharedInstance] finishBroadcast];
      }
      @end
      
                                          

    The Broadcast Upload Extension target is now created in your project, and the screen recording extension is integrated by using AlivcLibReplayKitExt.framework.

  3. Integrate Push SDK for iOS into the host app.

    Create the AlivcLivePushConfig and AlivcLivePusher objects in the host app. Set the externMainStream parameter to true. Set the audioFromExternal parameter to false, which specifies that audio collection continues to be processed in the SDK. Call the startScreenCapture method to receive the screen recording data from the extension app. Then, start or stop stream ingest. Perform the following operations to integrate Push SDK for iOS into the host app:

    1. Add the AlivcLivePusher.framework, AlivcLibRtmp.framework, RtsSDK.framework, and AlivcLibReplayKitExt.framework dependencies to the host app.

    2. Initialize Push SDK for iOS and configure an external video source for stream ingest.

      Set the externMainStream parameter to true, the externVideoFormat parameter to AlivcLivePushVideoFormatYUV420P, and the audioFromExternal parameter to false. Set other stream ingest parameters as needed. Sample code:

       self.pushConfig.externMainStream = true;
       self.pushConfig.externVideoFormat = AlivcLivePushVideoFormatYUV420P;
       self.pushConfig.audioSampleRate = 44100;
       self.pushConfig.audioChannel = 2;
       self.pushConfig.audioFromExternal = false;
       self.pushConfig.videoEncoderMode = AlivcLivePushVideoEncoderModeSoft;
       self.pushConfig.qualityMode = AlivcLivePushQualityModeCustom;
       self.pushConfig.targetVideoBitrate = 2500;
       self.pushConfig.minVideoBitrate = 2000;
       self.pushConfig.initialVideoBitrate = 2000;
       self.livePusher = [[AlivcLivePusher alloc] initWithConfig:self.pushConfig];
    3. Call the following methods of the AlivcLivePusher class to use live streaming features:

      • Receive screen recording data.

        Replace kAPPGROUP in the code with the app group identifier that you created in the preceding step. Sample code:

        [self.livePusher startScreenCapture:kAPPGROUP];
      • Start stream ingest.

        Sample code:

        [self.livePusher startPushWithURL:self.pushUrl];
      • Stop stream ingest.

        Sample code:

        
        [self.livePusher stopPush];
        [self.livePusher destory];
        self.livePusher = nil;

Configure co-streaming settings (interactive edition)

The interactive edition of Push SDK V4.4.4 or later provides the RTC-based co-streaming feature. The interactive edition of Push SDK V4.4.5 or later provides the RTC-based battle feature. Streamers and viewers can interact with each other at an ultra-low latency of less than 300 ms by using the interactive edition of Push SDK. For more information about interactive streaming, see Developer guide to co-streaming and Developer guide to battles.

Usage notes

  • Package size: After the SDK is integrated, the size of the IPA package increases by about 3 MB.

  • Compatible mobile phones:

    iPhone 7 or later, with iOS 8.0 or a later version.