This topic introduces how to use Push SDK for iOS with examples, covering its core interfaces and basic workflow.
Features
Supports stream ingest over Real-Time Messaging Protocol (RTMP).
Supports RTS stream ingest and pulling based on Real-Time Communication (RTC).
Supports co-streaming and battles.
Adopts H.264 for video encoding and AAC for audio encoding.
Supports custom configurations for features such as bitrate control, resolution, and display mode.
Supports various camera operations.
Supports real-time retouching and custom retouching effects.
Allows you to add and remove animated stickers as watermarks.
Allows you to stream screen recordings.
Supports external audio and video inputs in different formats such as YUV and pulse-code modulation (PCM).
Supports mixing of multiple streams.
Supports ingest of audio-only and video-only streams and stream ingest in the background.
Supports background music.
Supports video snapshot capture.
Supports automatic reconnection and error handling.
Supports Automatic Gain Control (AGC), Automatic Noise Reduction (ANR), and Acoustic Echo Cancellation (AEC) algorithms.
Allows you to switch between software and hardware encoding modes for video, which improves the stability of the encoding module.
Limitations
Take note of the following limits before you use Push SDK for iOS:
You must configure screen orientation before stream ingest. You cannot rotate the screen during live streaming.
You must disable auto screen rotation for stream ingest in landscape mode.
In hardware encoding mode, the output resolution must be a multiple of 16 to be compatible with the encoder. For example, if you set the resolution to 540p, the actual output resolution is 544 × 960. You must scale the player view based on the output resolution to prevent black bars.
API Reference
Procedure
Feature usage
Register the SDK
You must register the SDK before stream ingestion. Otherwise, you cannot use the SDK.
An SDK license is required for registration. To apply for and configure a license, see Integrate a Push SDK license.
Call the following methods before you use Push SDK for iOS:
[AlivcLiveBase registerSDK];
AlivcLiveBase also allows you to configure log levels, set local log paths, and query the SDK version.
To verify the license configuration asynchronously, set an observer by calling [AlivcLiveBase setObserver:] and implement the onLicenceCheck callback.
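The following minimal sketch registers the SDK with a license check observer. The AlivcLiveBaseObserver protocol name and the exact onLicenceCheck signature are assumptions based on the description above; verify them against the API reference of your SDK version.
#import <AlivcLivePusher/AlivcLivePusher.h>

// Assumed observer protocol and callback signature; verify against your SDK version.
@interface PushSetupController : UIViewController <AlivcLiveBaseObserver>
@end

@implementation PushSetupController
- (void)registerPushSDK {
    [AlivcLiveBase setObserver:self];      // Receive license check callbacks.
    [AlivcLiveBase setLogLevel:AlivcLivePushLogLevelInfo]; // Optional: configure the log level.
    [AlivcLiveBase registerSDK];
}

- (void)onLicenceCheck:(AlivcLiveLicenseCheckResultCode)result reason:(NSString *)reason {
    // result reports whether the license is valid; reason describes a failure.
    NSLog(@"license check: %ld, reason: %@", (long)result, reason);
}
@end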
Configure stream ingest parameters
In the ViewController that hosts the pusher, import the header file: #import <AlivcLivePusher/AlivcLivePusher.h>.
All basic parameters have default values. We recommend that you use the default values.
AlivcLivePushConfig *config = [[AlivcLivePushConfig alloc] init];// Initialize the class for stream ingest settings. You can also call initWithResolution to initialize.
config.resolution = AlivcLivePushResolution540P;// The default value is 540P. The maximum resolution supported is 720P.
config.fps = AlivcLivePushFPS20; // We recommend that you set the frame rate to 20 fps.
config.enableAutoBitrate = true; // Enable bitrate control. The default value is true.
config.videoEncodeGop = AlivcLivePushVideoEncodeGOP_2;// The default value is 2. The longer the interval between key frames, the higher the latency. We recommend that you set this parameter to a value from 1 to 2.
config.connectRetryInterval = 2000; // The reconnection interval. Unit: milliseconds. The default interval is 2 seconds. It must be at least 1 second. We recommend that you use the default value.
config.previewMirror = false; // The default value is false. We recommend that you use the default value.
config.orientation = AlivcLivePushOrientationPortrait; // The default value is portrait. In landscape mode, you can configure whether the Home button is on the left or the right.
Considering mobile device performance and network bandwidth, we recommend that you set the resolution to 540p, which most mainstream live streaming apps use.
If you disable adaptive bitrate, the bitrate is fixed at the initial bitrate and will not automatically adjust between the target and minimum bitrates. In poor network conditions, this may cause playback stuttering.
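For example, the following illustrative snippet pins the stream to a fixed bitrate by disabling bitrate control (the values are arbitrary):
config.enableAutoBitrate = false; // Disable bitrate control.
config.initialVideoBitrate = 1000; // The stream stays at 1,000 kbit/s.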
Ingest a camera stream
Initialize the SDK.
After you configure stream ingest parameters, call the initWithConfig method to initialize the SDK. Sample code:
self.livePusher = [[AlivcLivePusher alloc] initWithConfig:config];
Note: AlivcLivePusher does not support multiple instances. Therefore, each init call must correspond to a destroy call.
Register stream ingest callbacks.
The following stream ingest callbacks are supported:
Info: callbacks used for notifications and status detection.
Error: callbacks returned when errors occur.
Network: network-related callbacks.
Assign the corresponding delegates to receive these callbacks:
[self.livePusher setInfoDelegate:self];
[self.livePusher setErrorDelegate:self];
[self.livePusher setNetworkDelegate:self];
Start preview.
To preview the camera feed, specify the view (a subclass of UIView) for the camera. Sample code:
[self.livePusher startPreview:self.view];
Start stream ingest.
You can start stream ingest only after the preview starts. Implement the AlivcLivePusherInfoDelegate method onPreviewStarted and, inside that callback, call startPushWithURL (see the sketch after the following note):
[self.livePusher startPushWithURL:@"Test ingest URL (rtmp://......)"];NoteRTMP and RTS (artc://) ingest URLs are supported. To generate ingest URLs, see Generate ingest and streaming URLs.
ApsaraVideo Live does not support ingesting multiple streams to the same URL simultaneously. The second ingest request will be rejected.
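The following sketch starts ingest from the preview callback. The onPreviewStarted signature is an assumption based on the AlivcLivePusherInfoDelegate description above, and the URL is a placeholder.
// Assumed AlivcLivePusherInfoDelegate method; verify the signature against your SDK version.
- (void)onPreviewStarted:(AlivcLivePusher *)pusher {
    dispatch_async(dispatch_get_main_queue(), ^{
        // Placeholder URL; generate a real ingest URL in the ApsaraVideo Live console.
        [self.livePusher startPushWithURL:@"rtmp://example.com/app/stream"];
    });
}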
Camera-related methods
You can perform camera-related operations in the streaming, paused, or reconnecting state. For example, you can switch between the front and rear cameras and configure the flash, focal length, zooming, and mirroring mode. These methods take effect only when the preview starts.
/* Switch between the front and rear cameras*/
[self.livePusher switchCamera];
/* Enable or disable flash. You cannot enable flash for the front camera. */
[self.livePusher setFlash:false];
/* Adjust the focal length to zoom in or out. If you set the value to a positive number, the system increases the focal length. If you set the value to a negative number, the system decreases the focal length.*/
CGFloat max = [_livePusher getMaxZoom];
[self.livePusher setZoom:MIN(1.0, max)];
/* Configure manual focus. To enable manual focus, you must set the following parameters: point and autoFocus. The point parameter specifies the coordinates of the focus point. The autoFocus parameter specifies whether to enable autofocus. The autoFocus parameter takes effect only for this call.*/
[self.livePusher focusCameraAtAdjustedPoint:CGPointMake(50, 50) autoFocus:true];
/* Specify whether to enable autofocus.*/
[self.livePusher setAutoFocus:false];
/* Configure mirroring. The methods for mirroring are PushMirror and PreviewMirror. PushMirror takes effect only for stream playback, and PreviewMirror takes effect only for preview. */
[self.livePusher setPushMirror:false];
[self.livePusher setPreviewMirror:false];
Ingest-related methods
Push SDK supports stream ingest control operations, such as starting, stopping, pausing, and resuming ingest, stopping the preview, and destroying the stream ingest object. You can add buttons to perform these operations.
/* Switch from camera stream ingest to static image stream ingest. To do this, you must first set the pauseImage parameter. Audio stream ingest continues.*/
[self.livePusher pause];
/* Switch from static image stream ingest to camera stream ingest. Audio stream ingest continues.*/
[self.livePusher resume];
/* Stop a stream that is being ingested.*/
[self.livePusher stopPush];
/* Stop preview. This operation does not take effect for a stream that is being ingested. When the preview is stopped, the preview window is frozen at the last frame. */
[self.livePusher stopPreview];
/* Restart stream ingest when the stream is being ingested or when an error callback is received. All resources in AlivcLivePusher are reinitialized, including preview and ingest. If an error occurs, you can call this method or the reconnectPushAsync method to restart stream ingest, or call the destroy method to destroy the stream ingest instance. */
[self.livePusher restartPush];
/* Reconnect and repush the RTMP stream during streaming or network error state (errors related to AlivcLivePusherNetworkDelegate). In the error state, you can also call destroy to dispose the instance.*/
[self.livePusher reconnectPushAsync];
/* Dispose the stream ingest instance. After you call this method, stream ingest and preview are stopped, and the preview window is removed. All resources related to AlivcLivePusher are released. */
[self.livePusher destory];
self.livePusher = nil;
/* Query the stream ingest status.*/
AlivcLivePushStatus status = [self.livePusher getLiveStatus];
Ingest a screen sharing stream
The ReplayKit framework, supported by iOS 9 and later, allows iOS users to record video from the screen, and audio from the app and microphone. On iOS 10 and later, ReplayKit supports calling third-party app extensions to live stream screen content. You can use Push SDK for iOS together with app extensions to live stream screen content.
iOS allocates limited resources to app extensions to ensure smooth system operation. If an app extension uses too much memory, the system forcibly terminates it. To work within these memory limits, Push SDK for iOS splits the recording work between the Extension App and the Host App. The Extension App captures screen content and sends it to the Host App through inter-process communication. The Host App creates the AlivcLivePusher engine and pushes the screen data to remote clients. The entire stream ingest process is completed in the Host App. You can also configure audio collection and transmission in the Host App, so that the Extension App is used only to capture screen content.
The demo of Push SDK for iOS uses App Group to implement inter-process communication between Extension App and Host App. This logic is encapsulated in AlivcLibReplayKitExt.framework.
To ingest a screen sharing stream on iOS, the system creates the Extension App when screen recording starts, and the extension receives the screen content that the system captures.
Perform the following steps:
Create an App Group.
Log on to Apple Developer and perform the following operations:
On the Certificates, IDs & Profiles page, register an App Group. For more information, see Register an App Group.
Go back to the Identifiers page, select App IDs, and then click your App ID. You must configure the App IDs of the Host App and Extension App in the same way. For more information, see Enable App Group.
Download the corresponding Provisioning Profile again and configure it in Xcode.
After you complete the preceding operations, Extension App can communicate with Host App.
NoteAfter you create an App Group, save the App Group Identifier. You will use this value in subsequent steps.
Create Extension App.
In the demo, Push SDK for iOS provides the AlivcLiveBroadcast and AlivcLiveBroadcastSetupUI extensions to live stream screen content. To create Extension App, perform the following steps:
In your project, choose File > New > Target and select Broadcast Upload Extension.
Modify Product Name, select Include UI Extension, and click Finish to create a broadcast extension and a broadcast UI.
Configure the Info.plist file of the broadcast extension. In the new target, Xcode creates a header file and a source file named SampleHandler by default.
Drag AlivcLibReplayKitExt.framework into your project so that the extension target depends on the framework.
Replace the code in SampleHandler.m with the following code. In the code, replace kAPPGROUP with the App Group Identifier that you created in the first step. Sample code:
#import "SampleHandler.h"
#import <AlivcLibReplayKitExt/AlivcLibReplayKitExt.h>

@implementation SampleHandler

- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *,NSObject *> *)setupInfo {
    // The user has requested to start the broadcast. Setup info from the UI extension is optional.
    [[AlivcReplayKitExt sharedInstance] setAppGroup:kAPPGROUP];
}

- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    if (sampleBufferType != RPSampleBufferTypeAudioMic) {
        // The audio is collected and sent by the Host App.
        [[AlivcReplayKitExt sharedInstance] sendSampleBuffer:sampleBuffer withType:sampleBufferType];
    }
}

- (void)broadcastFinished {
    [[AlivcReplayKitExt sharedInstance] finishBroadcast];
}

@end
In your project, create a Broadcast Upload Extension target and integrate AlivcLibReplayKitExt.framework, which is customized for the screen recording extension module, into the extension target.
Integrate the SDK into Host App.
Create the AlivcLivePushConfig and AlivcLivePusher objects in the Host App. Set externMainStream to true and audioFromExternal to false, which indicates that audio is still collected by the SDK. Call startScreenCapture to start receiving screen data from the Extension App. Then, you can start and stop stream ingest.
Add the AlivcLivePusher.framework, AlivcLibRtmp.framework, RtsSDK.framework, and AlivcLibReplayKitExt.framework dependencies to the Host App.
Initialize Push SDK for iOS and configure the SDK to use an external video source.
Set externMainStream to true, externVideoFormat to AlivcLivePushVideoFormatYUV420P, and audioFromExternal to false. Configure other stream ingest parameters. Sample code:
self.pushConfig.externMainStream = true;
self.pushConfig.externVideoFormat = AlivcLivePushVideoFormatYUV420P;
self.pushConfig.audioSampleRate = 44100;
self.pushConfig.audioChannel = 2;
self.pushConfig.audioFromExternal = false;
self.pushConfig.videoEncoderMode = AlivcLivePushVideoEncoderModeSoft;
self.pushConfig.qualityMode = AlivcLivePushQualityModeCustom;
self.pushConfig.targetVideoBitrate = 2500;
self.pushConfig.minVideoBitrate = 2000;
self.pushConfig.initialVideoBitrate = 2000;
self.livePusher = [[AlivcLivePusher alloc] initWithConfig:self.pushConfig];
Use AlivcLivePusher to implement live streaming features. Call the following functions:
Start receiving screen data.
Replace kAPPGROUP in the code with the App Group Identifier that you created. Sample code:
[self.livePusher startScreenCapture:kAPPGROUP];
Start stream ingest.
Sample code:
[self.livePusher startPushWithURL:self.pushUrl];
Stop stream ingest.
Sample code:
[self.livePusher stopPush];
[self.livePusher destory];
self.livePusher = nil;
Configure preview display mode
Push SDK for iOS supports the following preview modes. The preview mode does not affect stream ingest.
ALIVC_LIVE_PUSHER_PREVIEW_SCALE_FILL: The video fills the preview window. If the aspect ratios of the video and preview window are inconsistent, video deformation occurs.
ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FIT: The aspect ratio of the video is preserved. If aspect ratios differ, black bars appear on the preview window. This is the default mode.
ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FILL: The video is cropped to fit the preview window when aspect ratios differ.
Sample code:
config.previewDisplayMode = ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FIT;
You can specify a preview mode in the AlivcLivePushConfig class, as shown above. You can also call the setpreviewDisplayMode method to change the preview mode during preview and stream ingest, as shown in the sketch after the next paragraph.
The preview mode takes effect only for preview. The resolution of the ingested video stream is the same as the resolution that you specify in the AlivcLivePushConfig class. The resolution does not change when you change the preview mode. The preview mode is used to adapt to mobile phones of different sizes. You can select a preview mode based on your needs.
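For example, to switch the preview to crop-to-fill at run time, based on the setpreviewDisplayMode method mentioned above (the method spelling follows the text; verify it against your SDK version):
[self.livePusher setpreviewDisplayMode:ALIVC_LIVE_PUSHER_PREVIEW_ASPECT_FILL];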
Ingest an image
Push SDK for iOS supports ingesting an image when the application is switched to the background or the bitrate is low.
When the application is switched to the background, video stream ingest is paused by default, and only the audio stream is ingested. Streamers can display an image, informing viewers that the streamer is away and will be back soon.
config.pauseImg = [UIImage imageNamed:@"image.png"]; // Specify the image that is ingested when the app is switched to the background.
You can also specify an image to be ingested in poor network conditions. When the bitrate is low, the image is displayed to prevent stuttering. Sample code:
config.networkPoorImg = [UIImage imageNamed:@"image.png"]; // Specify the image that is ingested in poor network conditions.
Push external audio/video sources
Push SDK for iOS supports ingesting external audio/video sources, such as a video file.
Perform the following steps:
Configure external audio and video inputs in the stream ingest configurations.
Insert external video data.
Insert audio data.
Sample code for configuring external inputs:
config.externMainStream = true;// Enable external stream input
config.externVideoFormat = AlivcLivePushVideoFormatYUVNV21;// Set the color format of video data. In this example, the format is set to YUVNV21. You can set the format to another value based on your needs.
config.externAudioFormat = AlivcLivePushAudioFormatS16; // Set the bit depth format of audio data. In this example, the format is set to S16. You can set the format to another value based on your needs.
Sample code for inserting external video data:
/* Only continuous buffer data in the YUV or RGB format can be sent by using the sendVideoData method. You can use this method to send the video buffer, length, width, height, timestamp, and rotation angle.*/
[self.livePusher sendVideoData:yuvData width:720 height:1280 size:dataSize pts:nowTime rotation:0];
/* If the external video data is in the CMSampleBufferRef format, you can use the sendVideoSampleBuffer method.*/
[self.livePusher sendVideoSampleBuffer:sampleBuffer];
/* You can also convert the CMSampleBufferRef format to a continuous buffer and then pass the buffer to the sendVideoData method. The following code provides an example on how to perform the conversion:*/
// Query the length of a sample buffer
- (int) getVideoSampleBufferSize:(CMSampleBufferRef)sampleBuffer {
if(!sampleBuffer) {
return 0;
}
int size = 0;
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
if(CVPixelBufferIsPlanar(pixelBuffer)) {
int count = (int)CVPixelBufferGetPlaneCount(pixelBuffer);
for(int i=0; i<count; i++) {
int height = (int)CVPixelBufferGetHeightOfPlane(pixelBuffer,i);
int stride = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer,i);
size += stride*height;
}
}else {
int height = (int)CVPixelBufferGetHeight(pixelBuffer);
int stride = (int)CVPixelBufferGetBytesPerRow(pixelBuffer);
size += stride*height;
}
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
return size;
}
// Convert a sample buffer to a continuous buffer
- (int) convertVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer toNativeBuffer:(void*)nativeBuffer
{
if(!sampleBuffer || !nativeBuffer) {
return -1;
}
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
int size = 0;
if(CVPixelBufferIsPlanar(pixelBuffer)) {
int count = (int)CVPixelBufferGetPlaneCount(pixelBuffer);
for(int i=0; i<count; i++) {
int height = (int)CVPixelBufferGetHeightOfPlane(pixelBuffer,i);
int stride = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer,i);
void *buffer = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, i);
int8_t *dstPos = (int8_t*)nativeBuffer + size;
memcpy(dstPos, buffer, stride*height);
size += stride*height;
}
}else {
int height = (int)CVPixelBufferGetHeight(pixelBuffer);
int stride = (int)CVPixelBufferGetBytesPerRow(pixelBuffer);
void *buffer = CVPixelBufferGetBaseAddress(pixelBuffer);
size += stride*height;
memcpy(nativeBuffer, buffer, size);
}
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
return 0;
}
Sample code for inserting audio data:
/* Only continuous buffer data in the PCM format is supported. You can use the sendPCMData method to send the audio buffer, length, and timestamp.*/
[self.livePusher sendPCMData:pcmData size:size pts:nowTime];
Configure watermarks
Push SDK for iOS supports adding one or more watermarks in the PNG format.
Sample code:
NSString *watermarkBundlePath = [[NSBundle mainBundle] pathForResource:@"watermark" ofType:@"png"]; // Set the path of the watermark image.
[config addWatermarkWithPath:watermarkBundlePath
             watermarkCoordX:0.1
             watermarkCoordY:0.1
              watermarkWidth:0.3]; // Add a watermark.
Where:
watermarkCoordX and watermarkCoordY are relative values that determine the watermark position. For example, watermarkCoordX=0.1 places the left edge of the watermark at 10% of the stream width. At a resolution of 540 × 960, the x position is 540 × 0.1 = 54 pixels.
watermarkWidth specifies the watermark width relative to the stream width. The height is scaled proportionally.
To add a text watermark, convert the text into a PNG image, then call this method to add the image as a watermark.
To ensure the clarity and edge smoothness of the watermark, we recommend that you use a source image that matches the configured size. For example, if the resolution of the output video is 544 × 960 and the watermark width is set to 0.1f, the recommended source image width is 544 × 0.1f = 54.4 pixels.
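Because a text watermark must first be converted into a PNG image, the following sketch renders a string into a temporary PNG file whose path can then be passed to addWatermarkWithPath. The helper name, font, size, and file name are illustrative.
// Render a string into a PNG file and return its path (illustrative helper).
- (NSString *)createTextWatermark:(NSString *)text {
    NSDictionary *attributes = @{NSFontAttributeName: [UIFont boldSystemFontOfSize:48],
                                 NSForegroundColorAttributeName: [UIColor whiteColor]};
    CGSize size = [text sizeWithAttributes:attributes];
    UIGraphicsBeginImageContextWithOptions(size, NO, 0);
    [text drawAtPoint:CGPointZero withAttributes:attributes];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"text_watermark.png"];
    [UIImagePNGRepresentation(image) writeToFile:path atomically:YES];
    return path;
}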
Configure video quality
Push SDK for iOS supports the following video quality modes: Resolution Priority, Fluency Priority, and custom.
To set video quality, you must enable bitrate control: config.enableAutoBitrate = true;
Resolution Priority (default)
In this mode, the SDK automatically configures bitrate parameters to ensure the video quality.
config.qualityMode = AlivcLivePushQualityModeResolutionFirst; // Resolution Priority mode
Fluency Priority
In this mode, the SDK automatically configures bitrate parameters to ensure the smoothness of the ingested video stream.
config.qualityMode = AlivcLivePushQualityModeFluencyFirst; // Fluency Priority mode
Custom
In custom mode, the SDK configures bitrate based on the values that you specify, including initial, minimum, and target bitrates.
initialVideoBitrate: the initial bitrate when a live stream starts.
minVideoBitrate: In poor network conditions, the bitrate is gradually reduced to the minimum bitrate to prevent stuttering.
targetVideoBitrate: In good network conditions, the bitrate is gradually increased to the target bitrate to improve the quality of a video stream.
config.qualityMode = AlivcLivePushQualityModeCustom; // Custom mode
config.targetVideoBitrate = 1400; // The target bitrate is 1,400 kbit/s
config.minVideoBitrate = 600; // The minimum bitrate is 600 kbit/s
config.initialVideoBitrate = 1000; // The initial bitrate is 1,000 kbit/s.
When you configure bitrates, refer to the recommended settings provided by Alibaba Cloud:
Table 1. Recommended settings for Resolution Priority mode
Resolution | initialVideoBitrate (kbit/s) | minVideoBitrate (kbit/s) | targetVideoBitrate (kbit/s) |
360p | 600 | 300 | 1000 |
480p | 800 | 300 | 1200 |
540p | 1000 | 600 | 1400 |
720p | 1500 | 600 | 2000 |
1080p | 1800 | 1200 | 2500 |
Table 2. Recommended settings for Fluency Priority mode
Resolution | initialVideoBitrate (kbit/s) | minVideoBitrate (kbit/s) | targetVideoBitrate (kbit/s) |
360p | 400 | 200 | 600 |
480p | 600 | 300 | 800 |
540p | 800 | 300 | 1000 |
720p | 1000 | 300 | 1200 |
1080p | 1500 | 1200 | 2200 |
Configure adaptive resolution
The SDK supports dynamically adjusting the resolution of an ingested stream. When the feature is enabled, the resolution is automatically reduced to ensure the smoothness and quality in poor network conditions. Sample code:
config.enableAutoResolution = YES; // Enable adaptive resolution. The default value is NO.
Adaptive resolution takes effect only when the video quality mode is set to Resolution Priority or Fluency Priority.
Some players may not support dynamic resolution. We recommend that you use ApsaraVideo Player.
Configure background music
Push SDK for iOS allows you to manage background music. You can control the background music playback and configure audio mixing, denoising, in-ear monitoring, and muting. Sample code:
/* Start the playback of background music. */
[self.livePusher startBGMWithMusicPathAsync:musicPath];
/* Stop the playback of background music. If you want to change the background music, call the startBGMWithMusicPathAsync method. You do not need to stop the current playback. */
[self.livePusher stopBGMAsync];
/* Pause the playback of background music. You can call this method only after the playback of background music starts. */
[self.livePusher pauseBGM];
/* Resume the playback of background music. You can call this method only after the playback of background music is paused. */
[self.livePusher resumeBGM];
/* Enable looping. */
[self.livePusher setBGMLoop:true];
/* Configure denoising. When enabled, the system filters out non-vocal parts from the collected audio. This feature may slightly reduce the volume of the human voice. We recommend that you allow your users to determine whether to enable this feature. By default, this feature is disabled. */
[self.livePusher setAudioDenoise:true];
/* Configure in-ear monitoring. In-ear monitoring is suitable for karaoke scenarios. When enabled, headphone users can hear their voice. When disabled, they cannot hear their voice on headphones. This parameter does not take effect if no headphones are detected. */
[self.livePusher setBGMEarsBack:true];
/* Configure audio mixing to adjust the volumes of the background music and human voice. */
[self.livePusher setBGMVolume:50];// Adjust the volume of the background music.
[self.livePusher setCaptureVolume:50];// Adjust the volume of the human voice.
/* Configure muting. When enabled, the background music and the human voice are both muted. To separately mute one of them, call the method that is used to configure audio mixing. */
[self.livePusher setMute:isMute ? true : false];
Capture snapshots
Push SDK for iOS supports capturing snapshots of local video streams. Sample code:
/* Set the snapshot callback.*/
[self.livePusher setSnapshotDelegate:self];
/* Call the API for snapshot capture.*/
[self.livePusher snapshot:1 interval:1];
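A sketch of receiving the snapshot through the snapshot delegate set above; the method name and signature are assumptions and may differ across SDK versions:
// Assumed snapshot delegate method; verify the signature against your SDK version.
- (void)onSnapshot:(AlivcLivePusher *)pusher image:(UIImage *)image {
    // Persist the captured frame as a PNG file.
    NSData *png = UIImagePNGRepresentation(image);
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"snapshot.png"];
    [png writeToFile:path atomically:YES];
}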
Integrate the retouching feature
Push SDK for iOS provides basic and advanced retouching effects. Basic retouching supports skin whitening, skin smoothing, and rosy cheeks. Advanced retouching supports whitening, smoothing, rosy cheeks, eye enlargement, and face slimming based on facial recognition. This feature is provided by Queen SDK.
Sample code:
# pragma mark - "API operations for retouching types and parameters"/**
* @brief Enable or disable a retouching effect.
* @param type Specify a value for the QueenBeautyType parameter.
* @param isOpen YES: enables the retouching effect. NO: disables the retouching effect.
*
*/
- (void)setQueenBeautyType:(kQueenBeautyType)type enable:(BOOL)isOpen;
/**
* @brief Set retouching parameters.
* @param param Specify a retouching parameter. This parameter is a value of the QueenBeautyParams parameter.
* @param value Specify a value for the retouching parameter. Valid values: 0 to 1. If the value is smaller than 0, it is set to 0. If the value is greater than 1, it is set to 1.
*/
- (void)setQueenBeautyParams:(kQueenBeautyParams)param
value:(float)value;
# pragma mark - "API operations for filters"
/**
* @brief Specify a filter material. Before this operation, set the kQueenBeautyTypeLUT parameter.
* @param imagePath Specify the path of the filter material.
*/
- (void)setLutImagePath:(NSString *)imagePath;
# pragma mark - "API operations for face shaping"
/**
*@brief Specify a face shaping effect. Before you specify the face shaping effect, set the kQueenBeautyTypeFaceShape parameter.
*@param faceShapeType Specify the face shaping effect that you want to use. This parameter is a value of the QueenBeautyFaceShapeType parameter.
*@param value Specify a value for the faceShapeType parameter.
*/
- (void)setFaceShape:(kQueenBeautyFaceShapeType)faceShapeType
value:(float)value;
# pragma mark - "API operations for makeup"
/**
* @brief Specify a makeup type and the paths of makeup materials. Before you specify the makeup type, set the kQueenBeautyTypeMakeup parameter.
* @param makeupType Specify a makeup type.
* @param imagePaths Specify the paths of makeup materials.
* @param blend Specify mixed makeup.
*/
- (void)setMakeupWithType:(kQueenBeautyMakeupType)makeupType
paths:(NSArray<NSString *> *)imagePaths
blendType:(kQueenBeautyBlend)blend;
/**
* @brief Specify a makeup type and the paths of makeup materials.
* @param makeupType Specify a makeup type.
* @param imagePaths Specify the paths of makeup materials.
* @param blend Specify mixed makeup.
* @param fps Specify the frame rate.
*/
- (void)setMakeupWithType:(kQueenBeautyMakeupType)makeupType
paths:(NSArray<NSString *> *)imagePaths
blendType:(kQueenBeautyBlend)blend fps:(int)fps;
/**
* @brief Configure the transparency for a makeup type. You can specify the gender.
* @param makeupType Specify a makeup type.
* @param isFeMale Specify whether the gender is female. YES: female. NO: male.
* @param alpha Specify the transparency for the makeup.
*/
- (void)setMakeupAlphaWithType:(kQueenBeautyMakeupType)makeupType
female:(BOOL)isFeMale alpha:(float)alpha;
/**
* @brief Specify the mixed makeup type for a makeup type.
* @param makeupType The mixed makeup type.
* @param blend Specify the transparency.
*/
- (void)setMakeupBlendWithType:(kQueenBeautyMakeupType)makeupType
blendType:(kQueenBeautyBlend)blend;
/**
* @brief Remove all makeup.
*/
- (void)resetAllMakeupType;
Adjust retouching parameters in real time
Push SDK for iOS allows you to adjust retouching parameters in real time during stream ingest. Sample code:
[_queenEngine setQueenBeautyType:kQueenBeautyTypeSkinBuffing enable:YES];
[_queenEngine setQueenBeautyType:kQueenBeautyTypeSkinWhiting enable:YES];
[_queenEngine setQueenBeautyParams:kQueenBeautyParamsWhitening value:0.8f];
[_queenEngine setQueenBeautyParams:kQueenBeautyParamsSharpen value:0.6f];
[_queenEngine setQueenBeautyParams:kQueenBeautyParamsSkinBuffing value:0.6];
Configure the live quiz feature
To use the live quiz feature, insert supplemental enhancement information (SEI) into live streams and parse SEI by using the player. Push SDK for iOS provides a method to insert SEI. Sample code:
/*
sendMessage: Specify the SEI message to be inserted into the live stream. The SEI message is in the JSON format. ApsaraVideo Player SDK can receive and parse the SEI message.
repeatCount: Specify the number of frames into which the SEI message is inserted. Because frames may be dropped, repeat the message to ensure that it is delivered. For example, a value of 100 inserts the SEI message into the subsequent 100 frames. The player removes duplicate SEI messages.
delayTime: Specify the period of time to wait before the frames are sent. Unit: milliseconds.
KeyFrameOnly: Specify whether to send only key frames.
*/
[self.livePusher sendMessage:@"Question information" repeatCount:100 delayTime:0 KeyFrameOnly:false];IPhone X adaptation
In most cases, previews display properly in full screen mode on mobile phones. However, the screen of an iPhone X has a special aspect ratio, which distorts previews in full screen mode. We recommend that you do not use full screen mode on an iPhone X.
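As an alternative to full screen, you can letterbox the preview to the video aspect ratio. The following sketch centers a 9:16 preview on the screen; the layout values are illustrative.
// Size the preview view to match a 9:16 portrait stream instead of filling the screen.
CGFloat width = self.view.bounds.size.width;
CGFloat height = width * 16.0 / 9.0;
CGFloat offsetY = (self.view.bounds.size.height - height) / 2.0;
UIView *previewView = [[UIView alloc] initWithFrame:CGRectMake(0, offsetY, width, height)];
[self.view addSubview:previewView];
[self.livePusher startPreview:previewView];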
Change the view size during stream ingest
Check the UIView that you pass to startPreview or startPreviewAsync, and when its size changes, update the frame of all subviews in the preview. For example:
[self.livePusher startPreviewAsync:self.previewView];
for (UIView *subView in [self.previewView subviews]) {
    subView.frame = self.previewView.bounds; // Example: resize each subview to fill the new view size.
}
Play external audio
To play external audio on the stream ingest page, we recommend that you use AVAudioPlayer, because the SDK is incompatible with AudioServicesPlaySystemSound. Update the AVAudioSession and AVAudioPlayer settings when you play the audio, as shown in the following sample.
Sample code:
- (void)setupAudioPlayer {
    NSString *filePath = [[NSBundle mainBundle] pathForResource:@"sound" ofType:@"wav"];
    NSURL *fileUrl = [NSURL fileURLWithPath:filePath]; // Use fileURLWithPath: for local files.
    self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:fileUrl error:nil];
self.player.volume = 1.0;
[self.player prepareToPlay];
}
- (void)playAudio {
self.player.volume = 1.0;
[self.player play];
// Configure AVAudioSession.
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setMode:AVAudioSessionModeVideoChat error:nil];
[session overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:nil];
[session setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker|AVAudioSessionCategoryOptionAllowBluetooth
| AVAudioSessionCategoryOptionMixWithOthers error:nil];
[session setActive:YES error:nil];
}
Background mode and phone calls
The SDK provides built-in configurations for the background mode. When you switch the app to the background, the video is paused at the last frame, and the app continues to ingest audio. To ensure that audio can be collected in the background, enable the Background Modes option in the Capabilities section of your app and select Audio, AirPlay, and Picture in Picture.
You can also dispose the stream ingest engine when you switch the app to the background and re-create the stream ingest engine when you switch the app back to the foreground. This way, you can stop audio collection in the background.
In this case, you must listen to UIApplicationWillResignActiveNotification and UIApplicationDidBecomeActiveNotification to detect when the app is switched between the foreground and background. Otherwise, an error may occur.
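A minimal sketch of observing the two notifications; the handler method names are illustrative, and what you do inside them (for example, destroying and re-creating the engine) depends on your app:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleResignActive:)
                                             name:UIApplicationWillResignActiveNotification
                                           object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleBecomeActive:)
                                             name:UIApplicationDidBecomeActiveNotification
                                           object:nil];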
Callbacks
Push SDK for iOS provides the following types of callbacks:
Type | Class |
Stream ingest callbacks | AlivcLivePusherInfoDelegate |
Network callbacks | AlivcLivePusherNetworkDelegate |
Error callbacks | AlivcLivePusherErrorDelegate |
Background music callbacks | AlivcLivePusherBGMDelegate |
Retouching and filter callbacks | AlivcLivePusherCustomFilterDelegate |
Stream ingest callbacks
Stream ingest callbacks are used to notify the app of the SDK status, including preview started, first video frame rendered, first frame of audio/video stream sent, ingest started, and ingest stopped.
onPushStarted: indicates that the connection to the server is established.
onFirstFramePushed: indicates that the first audio or video frame is sent.
Together, onPushStarted and onFirstFramePushed indicate that the stream is being ingested.
Network callbacks
Network callbacks are used to notify the app of the network status and connection status. When a brief network drop or switch occurs, the SDK will attempt to reconnect automatically, as long as the interruption stays within the timeout period and retry limit configured in AlivcLivePushConfig. If the reconnection succeeds, stream ingest resumes.
onConnectFail: indicates that stream ingest fails. We recommend that you check whether the ingest URL is valid (for example, whether the URL contains invalid characters), whether there is an authentication issue, whether the upper limit on the number of concurrently ingested streams is exceeded, and whether the stream is in the blacklist. Make sure that the ingest URL is valid and available before you try to ingest the stream. The relevant error codes include 0x30020901 to 0x30020905 and 0x30010900 to 0x30010901.
onConnectionLost: indicates that the connection is lost. The SDK automatically reconnects to the network and returns onReconnectStart. If the connection is not recovered after the maximum number of reconnection attempts (config.connectRetryCount) is reached, onReconnectFail is returned.
onNetworkPoor: indicates that the network speed is slow. If you receive this callback, the current network may not be able to fully support your ingested stream, even though the stream is not interrupted. In this case, you can handle your own business logic, for example, notify the user of the poor network conditions.
onNetworkRecovery: indicates that the network is recovered.
onReconnectError: indicates that the network reconnection failed. Check the current network and re-ingest the stream when the network recovers.
onSendDataTimeout: indicates that a timeout occurred when the data is sent. Check the current network and re-ingest the stream when the network recovers.
onPushURLAuthenticationOverdue: indicates that the authentication of the ingest URL expires. You must provide a new URL to the SDK.
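A sketch of handling a few of these callbacks in AlivcLivePusherNetworkDelegate; the method signatures are assumptions based on the callback names above:
// Assumed AlivcLivePusherNetworkDelegate methods; verify the signatures against your SDK version.
- (void)onConnectionLost:(AlivcLivePusher *)pusher {
    // The SDK reconnects automatically; update the UI only.
}
- (void)onNetworkPoor:(AlivcLivePusher *)pusher {
    // Notify the streamer that the network is slow; the stream is not interrupted yet.
}
- (void)onReconnectError:(AlivcLivePusher *)pusher error:(AlivcLivePushError *)error {
    // Re-ingest after the network recovers.
}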
Error callbacks
onSystemError: indicates that a system error occurred. You must destroy the engine and try again.
onSDKError: indicates that an SDK error occurred. Perform operations based on the error code:
If the error code is 805438211, the device performance is poor and the frame rate for encoding and rendering is low. You need to prompt the streamer and stop time-consuming business logic, such as advanced retouching and animation, at the application layer.
Pay special attention to the callbacks that are related to microphone and camera permissions. Error code 268455940 indicates that the app requires the microphone permission, and error code 268455939 indicates that the app requires the camera permission.
For other error codes, no additional operations are required. All error codes are recorded in logs.
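A sketch of branching on the error codes above in AlivcLivePusherErrorDelegate; the method signature and the errorCode property are assumptions based on the delegate name:
// Assumed AlivcLivePusherErrorDelegate method; verify the signature against your SDK version.
- (void)onSDKError:(AlivcLivePusher *)pusher error:(AlivcLivePushError *)error {
    if (error.errorCode == 268455940) {
        // The app lacks the microphone permission; prompt the user to grant it.
    } else if (error.errorCode == 268455939) {
        // The app lacks the camera permission; prompt the user to grant it.
    } else if (error.errorCode == 805438211) {
        // Device performance is poor; disable time-consuming features such as advanced retouching.
    }
}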
Background music callbacks
onOpenFailed: indicates that the background music fails to play. Check whether the music file is valid and whether its path is correctly specified, and then call the startBGMWithMusicPathAsync method to try again.
onDownloadTimeout: indicates a timeout during the playback of the background music. This usually occurs when the background music comes from a URL. In this case, check the network status and call the startBGMWithMusicPathAsync method to play the music again.
Callbacks for retouching and filter processing
You can use AlivcLivePusherCustomFilterDelegate to interconnect with the Queen SDK to implement basic and advanced retouching features. AlivcLivePusherCustomFilterDelegate allows you to trigger texture or CVPixelBuffer callbacks in Push SDK. The Queen SDK can process the callbacks and return the processed texture or CVPixelBuffer data to Push SDK. This way, retouching effects are implemented.
When livePushMode in AlivcLivePushConfig is set to AlivcLivePushBasicMode, Push SDK uses AlivcLivePusherCustomFilterDelegate to provide the texture ID instead of CVPixelBuffer data:
onCreate: indicates that the OpenGL context is created. This callback can be used to initialize the retouching engine.
onProcess: indicates that the OpenGL texture is updated. The ID of the raw texture in the SDK is obtained. In this callback, the retouching methods can be called to return the ID of the processed texture.
onDestory: indicates that the OpenGL context is destroyed. This callback can be used to destroy the retouching engine.
Common methods and interfaces
/* In custom mode, you can adjust the minimum and target bitrates in real time.*/
[self.livePusher setTargetVideoBitrate:800];
[self.livePusher setMinVideoBitrate:200];
/* Query whether the stream is being ingested.*/
BOOL isPushing = [self.livePusher isPushing];
/* Query the ingest URL.*/
NSString *pushURLString = [self.livePusher getPushURL];
/* Query the stream ingest performance information. For information about the parameters of stream ingest performance, see the API references or comments in the code.*/
AlivcLivePushStatsInfo *info = [self.livePusher getLivePushStatusInfo];
/* Obtain the SDK version number. */
NSString *sdkVersion = [self.livePusher getSDKVersion];
/* Specify the log level to filter debugging information. */
[self.livePusher setLogLevel:(AlivcLivePushLogLevelDebug)];
Debugging tools
Push SDK for iOS provides an intuitive DebugView, a movable global floating window that is always displayed at the top of the view. DebugView provides debug features such as viewing stream ingest logs, real-time detection of stream ingest performance metrics, and line charts of metrics.
In the release version, do not call the method to open DebugView.
Sample code:
[AlivcLivePusher showDebugView]; // Open DebugView.
FAQ
How do I troubleshoot stream ingest failures?
You can use the troubleshooting tool to check whether the ingest URL is valid.
How do I obtain information about ingested streams?
Go to the Streams page and view the ingested audio and video streams under Active Streams.
How do I play a stream?
After you start stream ingest, you can use a player (such as ApsaraVideo Player, FFplay, or VLC) to test stream pulling. To obtain playback URLs, see Generate ingest and streaming URLs.
What do I do if the application fails App Store review?
The RtsSDK binary is a fat library that contains both device and simulator slices, and Apple rejects an IPA that includes simulator architectures. Before you archive the app, use the lipo -remove option to strip the x86_64 slice from the framework binary.