This topic describes how to use Supplemental Enhancement Information (SEI) messages to enable your client application to detect and smoothly transition between different video layouts, such as from a solo streamer to a co-streaming arrangement.
When the layout switches from a solo streamer to a multi-source composite (co-streaming), ApsaraVideo Live automatically injects an SEI message into the video stream. This SEI message contains the layout information for each participant. By enabling SEI listening on your player client, your application can detect these layout changes in real time. This keeps your UI updates synchronized with the video even when the stream is delivered with a delay over a Content Delivery Network (CDN). For more information about SEI configuration, see StartLiveMPUTask or UpdateLiveMPUTask.
The following table describes the structure of an SEI frame.
| Data type | Parameter | Description |
| --- | --- | --- |
| Video stream information | stream | The information about the streamer. uid is the user ID of the streamer. The remaining fields (paneid, zorder, x, y, w, h, type, ms, vol, and vad) carry the layout and state information of the area where the streamer is streaming, as shown in the sample SEI frames below. |
If a single streamer is streaming, the SEI message that the viewer's client receives contains information about only one participant.
If co-streaming or battle mode is active, the SEI message that the viewer's client receives contains information about multiple participants.
For example:
When streamer A (UID: 111) is streaming alone, an SEI frame in the following format is sent to the viewer side:
{"stream":[{"uid":"111","paneid":-1,"zorder":0,"x":0,"y":0,"w":0,"h":0,"type":0,"ms":0,"vol":0,"vad":0}]}When the streamer A (UID: 111) is co-streaming with a co-streamer B (UID: 222), an SEI frame in the following format is sent to the viewer side:
{"stream":[{"uid":"111","paneid":0,"zorder":1,"x":0,"y":0.25,"w":0.5,"h":0.5,"type":0,"ms":0,"vol":1,"vad":119},{"uid":"222","paneid":1,"zorder":1,"x":0.5018382,"y":0.25,"w":0.5,"h":0.5,"type":0,"ms":0,"vol":60,"vad":123}]}
Your application can determine the current layout by checking the number of objects in the stream array. If the array contains a single object, the stream is in a solo-streamer layout. If it contains multiple objects, the stream is in a co-streaming or multi-source layout. The properties of each object specify the position and state of each participant in the composite stream.
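The parsing itself is up to your application. The following is a minimal sketch of how this check might look on Android, assuming the SEI payload is the UTF-8 JSON shown above. It uses the org.json classes bundled with the Android framework; the SeiLayoutParser and Pane names are illustrative and are not part of the ApsaraVideo Player SDK.
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;

import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

// Illustrative helper that interprets the SEI payload shown in the examples above.
public class SeiLayoutParser {

    // Minimal description of one pane in the composite layout.
    public static class Pane {
        public String uid;
        public int zorder;
        public double x, y, w, h; // Normalized layout values taken from the SEI frame.
    }

    // Returns one Pane per entry in the "stream" array of the SEI payload.
    public static List<Pane> parse(byte[] seiBytes) throws JSONException {
        String json = new String(seiBytes, StandardCharsets.UTF_8).trim();
        JSONArray stream = new JSONObject(json).getJSONArray("stream");
        List<Pane> panes = new ArrayList<>();
        for (int i = 0; i < stream.length(); i++) {
            JSONObject entry = stream.getJSONObject(i);
            Pane pane = new Pane();
            pane.uid = entry.getString("uid");
            pane.zorder = entry.optInt("zorder", 0);
            pane.x = entry.optDouble("x", 0);
            pane.y = entry.optDouble("y", 0);
            pane.w = entry.optDouble("w", 0);
            pane.h = entry.optDouble("h", 0);
            panes.add(pane);
        }
        return panes;
    }

    // A single entry means a solo-streamer layout; multiple entries mean co-streaming.
    public static boolean isCoStreaming(List<Pane> panes) {
        return panes.size() > 1;
    }
}
In the Android sample below, you could call SeiLayoutParser.parse(bytes) from the onSeiData callback and switch your UI whenever isCoStreaming() changes.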
The following sample code provides an example of how to use ApsaraVideo Player to parse an SEI frame:
Sample code for Android:
mAliPlayer = AliPlayerFactory.createAliPlayer(mContext);
PlayerConfig playerConfig = mAliPlayer.getConfig();
// For audio-only or video-only FLV streams, you can adjust the buffer settings to improve startup time.
// The startup cache. A larger value results in more stable playback but may increase startup time. Unit: milliseconds.
playerConfig.mStartBufferDuration = 1000;
// The cache used for stutter recovery. You can increase this value for users with poor network conditions. A value of 500 ms is recommended for audio-only streams, while the default of 3000 ms is advised for video streams. Unit: milliseconds.
playerConfig.mHighBufferDuration = 500;
// Enable SEI listening.
playerConfig.mEnableSEI = true;
mAliPlayer.setConfig(playerConfig);
mAliPlayer.setAutoPlay(true);
mAliPlayer.setOnErrorListener(errorInfo -> {
mAliPlayer.prepare();
});
mAliPlayer.setOnSeiDataListener(new IPlayer.OnSeiDataListener() {
@Override
public void onSeiData(int i, byte[] bytes) {
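// The bytes parameter carries the raw SEI payload, for example the JSON shown above.
// Parse it here and update the co-streaming UI accordingly.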
}
});
Sample code for iOS:
self.cdnPlayer = [[AliPlayer alloc] init];
self.cdnPlayer.delegate = self;
AVPConfig *config = [self.cdnPlayer getConfig];
config.enableSEI = YES;
[self.cdnPlayer setConfig:config];
// Listen to SEI-related callbacks.
- (void)onSEIData:(AliPlayer*)player type:(int)type data:(NSData *)data {
if (data.length > 0) {
NSString *str = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
// Process the SEI message.
}
}
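The string obtained in this callback contains the same JSON payload as on Android. You can parse it with a JSON parser such as NSJSONSerialization and apply the same layout check described above.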