ApsaraVideo Live: Handle errors, exceptions, and special cases

Last Updated: Mar 05, 2024

This topic describes the exceptions and special cases that you may encounter when you use Push SDK for iOS. This topic also describes how to handle such exceptions and special cases.

Handle errors and exceptions

Push SDK provides the following types of callbacks. Each type is delivered through its own delegate class.

  • Stream ingest callbacks: AlivcLivePusherInfoDelegate

  • Network callbacks: AlivcLivePusherNetworkDelegate

  • Error callbacks: AlivcLivePusherErrorDelegate

  • Background music callbacks: AlivcLivePusherBGMDelegate

  • Callbacks related to retouching and filters: AlivcLivePusherCustomFilterDelegate

Stream ingest callbacks

Stream ingest callbacks notify the app of the SDK status. They include the callbacks for preview start, first-frame rendering, sending the first audio or video frame, stream ingest start, and stream ingest stop.

  • onPushStarted and onFirstFramePushed: indicate that stream ingest is successful. A minimal sketch of these callbacks follows this list.

  • onPushStarted: indicates that the server is connected.

  • onFirstFramePushed: indicates that the first frame of the audio or video stream is sent.
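
A minimal sketch of these two callbacks is shown below. The method signatures (receiving only the pusher instance) are assumptions to verify against the AlivcLivePusherInfoDelegate header, and updateStatusLabel: is a hypothetical UI helper.

// AlivcLivePusherInfoDelegate (sketch): exact signatures may differ, check the SDK header.
- (void)onPushStarted:(AlivcLivePusher *)pusher {
    // The connection to the server is established.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self updateStatusLabel:@"Connected, waiting for the first frame"]; // hypothetical UI helper
    });
}

- (void)onFirstFramePushed:(AlivcLivePusher *)pusher {
    // The first audio or video frame has been sent: stream ingest is successful.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self updateStatusLabel:@"Live streaming started"]; // hypothetical UI helper
    });
}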

Network callbacks

Network callbacks notify the app of the network and link status of the SDK. Some of these callbacks must be handled by the app. A minimal delegate sketch follows the basic-mode list below.

Basic mode: basic live streaming scenarios

  • onConnectFail: indicates that stream ingest fails. We recommend that you check whether the ingest URL is valid (for example, whether the URL contains invalid characters), whether there is an authentication issue, whether the upper limit on the number of concurrently ingested streams is exceeded, and whether the stream is in the blacklist. Make sure that the ingest URL is valid and available before you try to ingest the stream. The relevant error codes include 0x30020901 to 0x30020905 and 0x30010900 to 0x30010901.

  • onConnectionLost: indicates that the network is disconnected. After the network is disconnected, the SDK automatically attempts to reconnect to the network and fires the onReconnectStart callback. If the maximum allowed number of reconnection attempts (config.connectRetryCount) is reached but the network is still not recovered, the SDK fires the onReconnectError callback.

  • onNetworkPoor: indicates that the network speed is slow. If you receive this callback, the current network may not be able to fully support your ingested stream, even though the stream is not interrupted and can keep running. In this case, you can handle your own business logic, for example, notify the user on the user interface (UI) of the app.

  • onNetworkRecovery: indicates that the network is recovered.

  • onReconnectError: indicates that the network reconnection fails. We recommend that you check the current network and re-ingest the stream when the network recovers.

  • onSendDataTimeout: indicates that a timeout occurs when the data is sent. We recommend that you check the current network and re-ingest the stream when the network recovers.

  • onPushURLAuthenticationOverdue: indicates that the signed ingest URL is about to expire. If URL signing is enabled, the ingest URL contains the auth_key field, which Alibaba Cloud regularly verifies. This callback is fired 1 minute before the ingest URL expires. After the callback is returned, you must specify a new ingest URL to ensure that stream ingest is not interrupted when the original URL expires.
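
A minimal sketch of the basic-mode network callbacks follows. The method signatures and the AlivcLivePushError object with its errorCode property are assumptions to verify against the AlivcLivePusherNetworkDelegate header; showToast: is a hypothetical UI helper.

// AlivcLivePusherNetworkDelegate (sketch): signatures and the error object are assumptions.
- (void)onConnectFail:(AlivcLivePusher *)pusher error:(AlivcLivePushError *)error {
    // Stream ingest failed: check the ingest URL, authentication, and stream limits before retrying.
    NSLog(@"Connect failed, error code: %ld", (long)error.errorCode);
}

- (void)onConnectionLost:(AlivcLivePusher *)pusher {
    // The SDK now reconnects automatically and fires onReconnectStart.
    [self showToast:@"Network lost, reconnecting"]; // hypothetical UI helper
}

- (void)onNetworkPoor:(AlivcLivePusher *)pusher {
    // The stream keeps running; only notify the user.
    [self showToast:@"Weak network, video quality may drop"]; // hypothetical UI helper
}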

Interactive mode: co-streaming scenarios

  • onConnectFail: indicates that stream ingest fails. We recommend that you check whether the token in the ingest URL for co-streaming is valid and whether the network is normal. Make sure that the ingest URL is valid and the network is normal before you try to ingest the stream.

  • onConnectionStatusChange: indicates that the connection status changes. In the callback, the connection status is returned. For example, the network is disconnected, the network connection is being established, the network is connected, and the network connection fails. If AliLiveConnectionStatusFailed is returned in the callback, the connection cannot be recovered. In this case, check the current network and re-ingest the stream after the network recovers. We recommend that you configure this callback in interactive mode to obtain the connection status.

  • onPushURLTokenWillExpire: indicates that the token in the ingest URL for co-streaming is about to expire. This callback is fired 30 seconds before the token expires. After receiving the callback, you must promptly request an ingest URL for co-streaming that contains a new token from the business server and call refreshPushURLToken to pass in the new token to the SDK (see the sketch after this list).

  • onPushURLTokenExpired: indicates that the token in the ingest URL for co-streaming has expired. If this callback is fired, you need to use an ingest URL that contains a new token to re-ingest the stream.

  • onPusherNetworkQualityChanged: indicates that the upstream network quality changes. The callback can be used to rate the upstream network quality.

  • onConnectionLost: indicates that the network is disconnected. In interactive mode, if the onConnectionLost callback is fired, the connection cannot be recovered. We recommend that you check the current network and re-ingest the stream after the network recovers.
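
The token refresh flow can be sketched as follows. The delegate signature, the parameter shape of refreshPushURLToken, and the requestNewCoStreamingToken: helper (your own request to the business server) are assumptions.

// Fired 30 seconds before the co-streaming token expires (signature assumed).
- (void)onPushURLTokenWillExpire:(AlivcLivePusher *)pusher {
    // Request an ingest URL that contains a new token from your business server (hypothetical helper).
    [self requestNewCoStreamingToken:^(NSString *newToken) {
        // Pass the new token to Push SDK; confirm the exact refreshPushURLToken selector in the header.
        [pusher refreshPushURLToken:newToken];
    }];
}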

Error callbacks

  • onSystemError: indicates a system device exception. You must destroy the engine and try again.

  • onSDKError: indicates an SDK error. You need to perform operations based on the error code (see the sketch after this list).

    • If the error code is 805438211, the device performance is poor and the frame rate for encoding and rendering is low. You need to prompt the streamer and stop time-consuming business logic, such as advanced retouching and animation, at the app layer.

    • Pay special attention to the error codes that are related to microphone and camera permissions. Error code 268455940 indicates that the app requires microphone permissions, and error code 268455939 indicates that the app requires camera permissions.

    • For other error codes, no additional operations are required. All error codes are recorded in logs.
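
The error-code handling described above can be sketched as follows. The delegate signature and the AlivcLivePushError object with its errorCode property are assumptions; the prompt and disable helpers are hypothetical.

// AlivcLivePusherErrorDelegate (sketch): signature and the error object are assumptions.
- (void)onSDKError:(AlivcLivePusher *)pusher error:(AlivcLivePushError *)error {
    switch (error.errorCode) {
        case 805438211:
            // Poor device performance: prompt the streamer and stop time-consuming work such as
            // advanced retouching and animation at the app layer (hypothetical helpers).
            [self promptLowPerformance];
            [self disableAdvancedEffects];
            break;
        case 268455940:
            [self promptForMicrophonePermission]; // hypothetical helper
            break;
        case 268455939:
            [self promptForCameraPermission]; // hypothetical helper
            break;
        default:
            // Other codes are already recorded in logs; no extra handling is required.
            break;
    }
}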

Background music callbacks

  • onOpenFailed: indicates that the background music fails to start. Check whether the music file is valid and whether the path of the music file that is specified in the relevant method is correct. You can call the startBGMAsync method to play the music again, as in the retry sketch after this list.

  • onDownloadTimeout: indicates a timeout during the playback of the background music. This usually occurs when the background music comes from a URL. In this case, check the network status and call the startBGMAsync method to play the music again.
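
A retry sketch for these callbacks is shown below. The delegate signatures and the parameter shape of startBGMAsync are assumptions (this topic only names the method), and self.bgmPath is a hypothetical property that stores the path or URL of the background music.

// AlivcLivePusherBGMDelegate (sketch): signatures are assumptions, check the SDK header.
- (void)onOpenFailed:(AlivcLivePusher *)pusher {
    // Verify that the music file exists and that the path passed to the SDK is correct, then retry.
    if ([[NSFileManager defaultManager] fileExistsAtPath:self.bgmPath]) {
        [pusher startBGMAsync:self.bgmPath]; // method named as in this topic; the parameter shape is assumed
    }
}

- (void)onDownloadTimeout:(AlivcLivePusher *)pusher {
    // Usually occurs when the music comes from a URL: check the network and retry.
    [pusher startBGMAsync:self.bgmPath]; // method named as in this topic; the parameter shape is assumed
}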

Interconnection with a retouching SDK

You can use callbacks of AlivcLivePusherCustomFilterDelegate to interconnect with a third-party retouching SDK to implement basic and advanced retouching features. AlivcLivePusherCustomFilterDelegate allows you to trigger texture or CVPixelBuffer callbacks in Push SDK. The retouching SDK can process the callbacks and return the processed texture or CVPixelBuffer data to Push SDK. This way, retouching effects are implemented.

Basic mode: basic live streaming scenarios

In basic mode, livePushMode in AlivcLivePushConfig is set to AlivcLivePushBasicMode. Push SDK uses AlivcLivePusherCustomFilterDelegate to obtain the texture ID instead of CVPixelBuffer data. The following callbacks are included; a usage sketch follows the list:

  • onCreate: indicates that the Open Graphics Library (OpenGL) context is created. This callback can be used to initialize the retouching engine.

  • onProcess: indicates that the OpenGL texture is updated. The ID of the raw texture in the SDK is obtained. In this callback, the retouching methods can be called to return the ID of the processed texture.

  • onDestory: indicates that the OpenGL context is destroyed. This callback can be used to destroy the retouching engine.
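
A sketch of this texture-based integration follows. The delegate method signatures shown here are assumptions (verify them against the AlivcLivePusherCustomFilterDelegate header), and self.beautyEngine is a hypothetical wrapper around your retouching SDK.

// AlivcLivePusherCustomFilterDelegate, texture path (sketch): signatures are assumptions.
- (void)onCreate:(AlivcLivePusher *)pusher {
    // The OpenGL context is created: initialize the retouching engine.
    [self.beautyEngine setUp]; // hypothetical wrapper around your retouching SDK
}

- (int)onProcess:(AlivcLivePusher *)pusher texture:(int)texture width:(int)width height:(int)height {
    // Hand the raw texture to the retouching engine and return the processed texture ID.
    return [self.beautyEngine processTexture:texture width:width height:height]; // hypothetical wrapper
}

- (void)onDestory:(AlivcLivePusher *)pusher {
    // The OpenGL context is destroyed: release the retouching engine.
    [self.beautyEngine tearDown]; // hypothetical wrapper
}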

Interactive mode: co-streaming scenarios

In interactive mode, livePushMode in AlivcLivePushConfig is set to AlivcLivePushInteractiveMode. Push SDK uses AlivcLivePusherCustomFilterDelegate to obtain the CVPixelBuffer data or texture ID. By default, the CVPixelBuffer data is obtained.

  • CVPixelBuffer data

    In interactive mode, Push SDK calls the onProcessVideoSampleBuffer method to obtain the CVPixelBuffer data by default. You need only to send the CVPixelBuffer data to the retouching SDK from this callback, and then write the processed data back to Push SDK. At the same time, set the return value of this method to YES to implement retouching effects. If the return value is set to NO, the data is not written back to Push SDK and the retouching effects are not implemented. When the CVPixelBuffer data is obtained, methods other than onProcessVideoSampleBuffer in AlivcLivePusherCustomFilterDelegate are not called.

    Sample code

    - (BOOL)onProcessVideoSampleBuffer:(AlivcLivePusher *)pusher sampleBuffer:(AlivcLiveVideoDataSample *)sampleBuffer
    {
        BOOL result = NO;
        if (self.beautyOn)
        {
            result = [[AlivcBeautyController sharedInstance] processPixelBuffer:sampleBuffer.pixelBuffer withPushOrientation:self.pushConfig.orientation];
        }
        return result;
    }
  • Texture ID

    In interactive mode, Push SDK obtains the CVPixelBuffer data by default. If you set the enableLocalVideoTexture variable in AlivcLivePushConfig to YES, the texture ID is obtained instead of the CVPixelBuffer data. When the texture ID is obtained, methods other than onProcessVideoSampleBuffer are called.

    • onCreate: indicates that the OpenGL context is created. This callback can be used to initialize the retouching engine.

    • onProcess: indicates that the OpenGL texture is updated. The ID of the raw texture in the SDK is obtained. In this callback, the retouching methods can be called to return the ID of the processed texture.

    • onDestory: indicates that the OpenGL context is destroyed. This callback can be used to destroy the retouching engine.

    Callbacks of AlivcLivePusherCustomDetectorDelegate are required when the retouching SDK needs to use buffers to process facial recognition algorithms. Not all retouching operations require these callbacks. Only specific retouching SDKs require buffers to perform facial recognition. For these SDKs, AlivcLivePusherCustomFilterDelegate is used to obtain the texture ID to implement retouching effects, and AlivcLivePusherCustomDetectorDelegate is used to obtain buffers to perform facial recognition. If you pass the CVPixelBuffer data to the retouching SDK, the preceding handling steps are not required.

    For a retouching SDK to use buffers to process facial recognition algorithms, use the onDetectorProcess callback function. The callback function returns the buffer data. In interactive mode, if you obtain the texture ID and also require buffers, you need to enable enableLocalVideoRawBuffer in AlivcLivePushConfig to trigger the onDetectorProcess callback function, as in the configuration sketch below.
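
    The following configuration sketch combines the switches mentioned above. The names livePushMode, enableLocalVideoTexture, and enableLocalVideoRawBuffer come from this topic; the exact types and default values should be verified against the AlivcLivePushConfig header.

    AlivcLivePushConfig *config = [[AlivcLivePushConfig alloc] init];
    // Interactive mode (co-streaming scenarios).
    config.livePushMode = AlivcLivePushInteractiveMode;
    // Obtain the texture ID instead of CVPixelBuffer data in AlivcLivePusherCustomFilterDelegate.
    config.enableLocalVideoTexture = YES;
    // Also trigger onDetectorProcess with buffer data for retouching SDKs that run facial recognition on buffers.
    config.enableLocalVideoRawBuffer = YES;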

Handle special cases

Handle network disconnection

  • Short-period network disconnection and network switchover: The SDK attempts to reconnect to the network when a network disconnection or a network switchover occurs. You can use the AlivcLivePushConfig class to set the reconnection timeout period and the maximum number of reconnection attempts allowed. After the SDK reconnects to the network, stream ingest is resumed. If you use ApsaraVideo Player, we recommend that you perform a reconnection operation 5 seconds after you receive an AliVcMediaPlayerPlaybackDidFinishNotification timeout notification.

  • Long-period network disconnection: The SDK fails to reconnect to the network if a reconnection request times out or the number of reconnection attempts exceeds the upper limit. In this case, the onReconnectError:error: callback is fired. After the network connection is recovered, call reconnectAsyn to reconnect to the network. You must also reconnect the SDK to ApsaraVideo Player. A streamer-side reconnection sketch appears after the note at the end of this section.

    • We recommend that you externally monitor the network connection.

    • Use the server to handle communication failures between the streamer side and player. For example, when the streamer is disconnected from the server, the server receives a callback of stream ingest interruption from Alibaba Cloud CDN. The server pushes the callback message to the player. Then, the player handles the stream ingest interruption. The server uses the same procedure to resume stream ingest.

    • Stop the playback and then restart ApsaraVideo Player to reconnect the player to the server. To achieve this, call stop, prepareToPlay, and play in sequence.

      [self.mediaPlayer stop];
      AliVcMovieErrorCode err = [self.mediaPlayer prepareToPlay:[NSURL URLWithString:@"Streaming URL"]];
      if (err != ALIVC_SUCCESS) {
          NSLog(@"play failed, error code is %d", (int)err);
          return;
      }
      [self.mediaPlayer play];
      Note

      For more information about ApsaraVideo Player, see Advanced features.
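
For the streamer side, the following sketch outlines the manual reconnection flow after a long-period disconnection. The delegate signature and the reconnectAsyn selector (named as in this topic) are assumptions, and networkDidRecover represents your own external network monitoring.

// Fired when automatic reconnection has failed (signature assumed).
- (void)onReconnectError:(AlivcLivePusher *)pusher error:(AlivcLivePushError *)error {
    self.needManualReconnect = YES; // hypothetical flag on your controller
}

// Called by your own network monitoring once connectivity is back (hypothetical hook).
- (void)networkDidRecover {
    if (self.needManualReconnect) {
        self.needManualReconnect = NO;
        [self.livePusher reconnectAsyn]; // method named as in this topic; confirm the exact selector in the header
    }
}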

Switch the app to the background or answer a phone call

The SDK provides built-in configurations for the background mode. When you switch the app to the background, the video is paused at the last frame, and the app continues playing the audio in the background. Open your project in Xcode. On the Signing & Capabilities tab, select Audio, AirPlay, and Picture in Picture in the Background Modes section. This ensures that audio can be collected when the app is switched to the background.

You can destroy the stream ingest engine when you switch the app to the background and re-create the stream ingest engine when you switch the app back to the foreground. This way, you can stop audio collection when the app is switched to the background. A sketch of this approach follows the note below.

Note

If you use this method, you must make configurations to listen to UIApplicationWillResignActiveNotification and UIApplicationDidBecomeActiveNotification when the app is switched to the background. Otherwise, an error may occur.
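
If you choose to destroy and re-create the stream ingest engine, the following sketch shows the notification handling. The destroyPusher and recreatePusher helpers are hypothetical wrappers around your own engine teardown and setup.

// Register observers (for example, in viewDidLoad) so that the engine can be destroyed and re-created.
- (void)registerAppStateObservers {
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(appWillResignActive:)
                                                 name:UIApplicationWillResignActiveNotification
                                               object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(appDidBecomeActive:)
                                                 name:UIApplicationDidBecomeActiveNotification
                                               object:nil];
}

- (void)appWillResignActive:(NSNotification *)notification {
    [self destroyPusher]; // hypothetical helper: stop ingest and destroy the stream ingest engine
}

- (void)appDidBecomeActive:(NSNotification *)notification {
    [self recreatePusher]; // hypothetical helper: re-create the engine and resume ingest
}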

Play external audio

To play external audio on the stream ingest page, we recommend that you use AVAudioPlayer instead of AudioServicesPlaySystemSound, because the SDK is incompatible with AudioServicesPlaySystemSound. After you configure external audio playback, you need to update the AVAudioSession settings. Sample code:

- (void)setupAudioPlayer {
    NSString *filePath = [[NSBundle mainBundle] pathForResource:@"sound" ofType:@"wav"];
    // Use a file URL for a local path so that AVAudioPlayer can load the file.
    NSURL *fileUrl = [NSURL fileURLWithPath:filePath];
    self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:fileUrl error:nil];
    self.player.volume = 1.0;
    [self.player prepareToPlay];
}

- (void)playAudio {
    self.player.volume = 1.0;
    [self.player play];
    // Configure AVAudioSession settings.
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setMode:AVAudioSessionModeVideoChat error:nil];
    [session overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:nil];
    [session setCategory:AVAudioSessionCategoryPlayAndRecord
             withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker |
                         AVAudioSessionCategoryOptionAllowBluetooth |
                         AVAudioSessionCategoryOptionMixWithOthers
                   error:nil];
    [session setActive:YES error:nil];
}

Change the size of the view during stream ingest

Check the UIView that you pass in when you call startPreview or startPreviewAsync, and update the frame of all subviews in the preview view. Sample code:

[self.livePusher startPreviewAsync:self.previewView];
for (UIView *subView in [self.previewView subviews]) {
    // Update the frame of each subview to match the new preview size, for example:
    subView.frame = self.previewView.bounds;
}

Adapt previews to an iPhone X

In most cases, previews are properly displayed in full-screen mode on mobile phones. However, the screen of an iPhone X has a special aspect ratio, so previews are distorted when they are displayed in full-screen mode on an iPhone X. We recommend that you do not use full-screen preview on an iPhone X.

Set the bitrate

The SDK supports the dynamic bitrate feature. You can use the AlivcLivePushConfig class to change the default bitrate. Different services require different video quality, and the resolution, smoothness, and quality of an output video vary based on the bitrate of the ingested stream. A configuration sketch follows the list below.

  • Video quality: A higher bitrate of the ingested stream produces higher video quality. To ensure higher video quality, set a higher bitrate.

  • Video smoothness: A higher bitrate of the ingested stream requires more network bandwidth. In poor network conditions, a higher bitrate may affect the smoothness of video playback.
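
A minimal configuration sketch is shown below. The property names targetVideoBitrate and minVideoBitrate and the Kbps unit are assumptions based on common Push SDK configuration fields; check the AlivcLivePushConfig header for the exact names before using them.

AlivcLivePushConfig *config = [[AlivcLivePushConfig alloc] init];
// Property names and units are assumptions; check the AlivcLivePushConfig header before use.
config.targetVideoBitrate = 1500; // target bitrate in Kbps (assumed name and unit)
config.minVideoBitrate = 600;     // lower bound used in poor network conditions (assumed name and unit)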

Compilation error

When you receive the "Building for iOS, but the linked and embedded framework 'XXX.framework' was built for iOS + iOS Simulator." compilation error, perform the following operations:

  1. Click the Xcode menu.

  2. Choose File > Workspace Settings to open the Workspace Settings dialog box.

  3. Select Legacy Build System for Build System.

Queen dependencies not found

If you manually integrate the SDK, see the Queen_SDK_iOS documentation and add the required dependencies.

Failed to submit the app integrated with the SDK to App Store for review

RtsSDK provides libraries for all platforms. If you want to submit your app to the App Store, you must remove the simulator architecture. For example, you can run lipo -remove to remove the x86_64 architecture.