This topic describes how to implement custom video rendering.
Feature description
ApsaraVideo Real-time Communication (ARTC) includes a market-proven video rendering module that provides a stable and efficient video playback experience.
If you have a mature, self-developed rendering module, or specific requirements for color accuracy or frame rate control, the ARTC software development kit (SDK) provides flexible interfaces that allow you to integrate a custom video rendering module to meet your business needs.
Prerequisites
Before you implement custom video rendering, make sure that the following prerequisites are met:
You have created an ApsaraVideo Real-time Communication application with a valid Alibaba Cloud account. For more information, see Create an application. You must also obtain an App ID and an App Key from the Application Management console.
You have integrated the ARTC SDK into your project and implemented basic audio and video communication features. For more information about SDK integration, see Download and integrate the SDK. To learn how to implement audio and video calls, see Implement an audio and video call.
Technical principles
Remote video rendering
The SDK decodes the remote video stream and delivers each frame to your application through the video sample observer callback. Your custom module is responsible for drawing the frames.
Buffer-based local video rendering
Locally captured frames are delivered as raw buffers, such as I420 or CVPixelBuffer data, through the video sample observer callback, and your custom module draws them.
Texture ID-based local video rendering
Locally captured frames are uploaded to OpenGL textures inside the SDK. The SDK passes the texture ID to your application through the texture observer callbacks, so you can process or draw the texture within the SDK's OpenGL context.
Implementation
iOS implementation
1. Enable external rendering using the extra field
You can enable external video rendering by using the extra field. After this feature is enabled, ARTC uses a virtual renderer internally and delivers video frames to your application for drawing.
NSString *extras = @"{\"user_specified_use_external_video_render\":\"TRUE\"}";
/* If callbacks for cvPixelBuffer are needed, add this option. */
NSString *extras = @"{\"user_specified_use_external_video_render\":\"TRUE\",\"user_specified_native_buffer_observer\":\"TRUE\"}";
_engine = [AliRtcEngine sharedInstance:nil extras:extras];
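Hand-escaping the JSON string is easy to get wrong. As an optional sketch, you can build the same extras value with NSJSONSerialization; the keys are the ones shown above, and renderOptions is just an illustrative variable name.
/* Optional sketch: build the extras JSON programmatically instead of hand-escaping it. */
NSDictionary *renderOptions = @{
    @"user_specified_use_external_video_render": @"TRUE",
    /* Add this key only if cvPixelBuffer callbacks are needed. */
    @"user_specified_native_buffer_observer": @"TRUE"
};
NSData *jsonData = [NSJSONSerialization dataWithJSONObject:renderOptions options:0 error:nil];
NSString *extras = [[NSString alloc] initWithData:jsonData encoding:NSUTF8StringEncoding];
_engine = [AliRtcEngine sharedInstance:nil extras:extras];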
2. Set callbacks
cvPixelBuffer/I420:
[_engine registerVideoSampleObserver];
textureID:
[_engine registerLocalVideoTexture];
3. Handle callbacks
I420/cvPixelBuffer
/*
 * Specifies the format in which the SDK delivers video frames.
 * Triggered after AliRtcEngine::registerVideoSampleObserver is called.
 */
- (AliRtcVideoFormat)onGetVideoFormatPreference {
    return AliRtcVideoFormat_I420;
}
/**
* @brief Callback for the subscribed video data captured locally.
* @param videoSource Video stream type.
* @param videoSample Raw video data.
* @return
* - YES: The sample needs to be written back to the SDK. This is valid only for the I420 and CVPixelBuffer formats on iOS and macOS.
* - NO: The sample does not need to be written back to the SDK.
*/
- (BOOL)onCaptureVideoSample:(AliRtcVideoSource)videoSource videoSample:(AliRtcVideoDataSample *_Nonnull)videoSample {
    /*
     * Process the locally captured frame here, for example by handing it to your renderer (a sketch follows below).
     */
    return NO;
}
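The placeholder above is where you hand the frame to your own renderer. The following is a minimal sketch of that hand-off for I420 frames. The accessor names (width, height, dataYPtr, dataUPtr, dataVPtr) and the customRenderer object with its enqueueI420Frame method are illustrative assumptions; check AliRtcVideoDataSample in your SDK headers for the actual fields and honor the per-plane strides that the sample reports.
/* Sketch only. The accessors below are assumptions for illustration and
 * must be replaced with the fields actually exposed by AliRtcVideoDataSample. */
- (BOOL)onCaptureVideoSample:(AliRtcVideoSource)videoSource videoSample:(AliRtcVideoDataSample *_Nonnull)videoSample {
    size_t width = (size_t)videoSample.width;    /* assumed accessor */
    size_t height = (size_t)videoSample.height;  /* assumed accessor */
    size_t ySize = width * height;               /* assumes tightly packed planes */
    size_t uvSize = ySize / 4;

    /* Copy the planes before returning: the SDK may recycle its buffers
     * as soon as this callback completes. */
    NSMutableData *frame = [NSMutableData dataWithLength:ySize + 2 * uvSize];
    uint8_t *dst = frame.mutableBytes;
    memcpy(dst, (const void *)videoSample.dataYPtr, ySize);
    memcpy(dst + ySize, (const void *)videoSample.dataUPtr, uvSize);
    memcpy(dst + ySize + uvSize, (const void *)videoSample.dataVPtr, uvSize);

    /* Hand the copy to your own renderer, for example on a serial queue. */
    [self.customRenderer enqueueI420Frame:frame width:width height:height];
    return NO; /* No write-back to the SDK. */
}
Copying before the hand-off keeps the SDK's capture pipeline unblocked and avoids holding on to buffers that the SDK may reuse after the callback returns.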
textureID
Callback after the OpenGL context is created
/**
 * @brief Callback for OpenGL context creation.
 * @param context OpenGL context.
 * @note This callback is triggered when the OpenGL context is created within the SDK.
 */
- (void)onTextureCreate:(void *_Nullable)context {
    [[renderEngine_ shared] create:context];
}
Callback for OpenGL texture update
/**
 * @brief Callback for OpenGL texture update.
 * @param textureId OpenGL texture ID.
 * @param width OpenGL texture width.
 * @param height OpenGL texture height.
 * @param videoSample Video frame data. For more information, see {@link AliRtcVideoDataSample}.
 * @return OpenGL texture ID.
 * @note
 * - This callback is triggered after each video frame is uploaded to the OpenGL texture. If an external observer for OpenGL texture data is registered, you can process the texture in this callback and return the processed texture ID.
 * - The return value must be a valid texture ID. If no processing is performed, you must return the original textureId parameter.
 * - The format of the returned texture ID is AliRtcVideoFormat_Texture2D.
 */
- (int)onTextureUpdate:(int)textureId width:(int)width height:(int)height videoSample:(AliRtcVideoDataSample *_Nonnull)videoSample {
    if ([[renderEngine_ shared] enabled] == NO) {
        return textureId;
    }
    int texId = [[renderEngine_ shared] processTextureToTexture:textureId Width:width Height:height];
    if (texId < 0) {
        texId = textureId;
    }
    return texId;
}
Callback for OpenGL context deletion
- (void)onTextureDestory {
    if (self.settingModel.chnnelType == ChannelTypePrimary) {
        [[renderEngine_ shared] destroy];
    }
}
4. Exit
/* I420/cvPixelBuffer */
[_engine unregisterVideoSampleObserver];
/* textureID */
[_engine unregisterLocalVideoTexture];
Android implementation
1. Enable external rendering using the extra field
You can enable external video rendering by using the extra field. After this feature is enabled, ARTC uses a virtual renderer internally and delivers video frames to your application for drawing.
String extras = "{\"user_specified_use_external_video_render\":\"TRUE\"}";
/*
If callbacks for textureID are needed, add this option.
We recommend that you also enable hardware encoding for textures.
*/
String extras = "{\"user_specified_use_external_video_render\":\"TRUE\",\"user_specified_camera_texture_capture\":\"TRUE\",\"user_specified_texture_encode\":\"TRUE\"}";
_engine = AliRtcEngine.getInstance(getApplicationContext(), extras);
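As an optional sketch, you can build the same extras value with org.json.JSONObject instead of hand-escaping the string; the keys are the ones shown above, and renderOptions is just an illustrative variable name.
/* Optional sketch: build the extras JSON programmatically instead of hand-escaping it. */
org.json.JSONObject renderOptions = new org.json.JSONObject();
try {
    renderOptions.put("user_specified_use_external_video_render", "TRUE");
    /* Add the next two keys only if textureID callbacks are needed. */
    renderOptions.put("user_specified_camera_texture_capture", "TRUE");
    renderOptions.put("user_specified_texture_encode", "TRUE");
} catch (org.json.JSONException e) {
    /* Not expected for plain string keys and values. */
}
_engine = AliRtcEngine.getInstance(getApplicationContext(), renderOptions.toString());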
2. Set callbacks
I420 (buffer-based):
_engine.registerVideoSampleObserver();
textureID:
_engine.registerLocalVideoTextureObserver();
3. Handle callbacks
I420 (buffer-based)
/**
 * @brief Specifies the memory alignment of the video data that the SDK delivers.
 */
public int onGetVideoAlignment() {
    return AliRtcVideoObserAlignment.AliRtcAlignmentDefault.getValue();
}
/**
 * @brief Callback for the subscribed video data captured locally.
 * @param sourceType Video stream type.
 * @param videoSample Raw video data.
 * @return
 * - true: The sample needs to be written back to the SDK.
 * - false: The sample does not need to be written back to the SDK.
 */
@Override
public boolean onLocalVideoSample(AliRtcEngine.AliRtcVideoSourceType sourceType, AliRtcEngine.AliRtcVideoSample videoSample) {
    /*
     * Process the locally captured video data for rendering.
     */
    local_renderEngine.draw(videoSample);
    return false;
}
@Override
public boolean onRemoteVideoSample(String userId, AliRtcEngine.AliRtcVideoSourceType sourceType, AliRtcEngine.AliRtcVideoSample videoSample) {
/*
* Process the remote data for rendering.
*/
remote_renderEngine.draw(userId, videoSample);
return false;
}
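The local_renderEngine and remote_renderEngine objects above stand in for your own rendering code. A common pattern is to copy the frame out of the callback and post it to a dedicated render thread, because the SDK may reuse its buffers once the callback returns. The following is a minimal sketch of such a hand-off: the data, width, and height members of AliRtcVideoSample are illustrative assumptions (check the class in your SDK version), and SampleRenderBridge and FrameConsumer are hypothetical names for your own code.
import android.os.Handler;
import android.os.HandlerThread;

import com.alivc.rtc.AliRtcEngine; /* Adjust the package to your SDK version if needed. */

/* Sketch only: bridges SDK sample callbacks to a custom render thread. */
public class SampleRenderBridge {

    /* Hypothetical interface implemented by your own renderer. */
    public interface FrameConsumer {
        void onFrame(byte[] i420, int width, int height);
    }

    private final HandlerThread renderThread = new HandlerThread("CustomVideoRender");
    private final Handler renderHandler;
    private final FrameConsumer consumer;

    public SampleRenderBridge(FrameConsumer consumer) {
        this.consumer = consumer;
        renderThread.start();
        renderHandler = new Handler(renderThread.getLooper());
    }

    /* Call this from onLocalVideoSample or onRemoteVideoSample, then return false. */
    public void submit(AliRtcEngine.AliRtcVideoSample sample) {
        /* Copy before leaving the callback: the SDK may recycle its buffer. */
        final byte[] copy = sample.data.clone();  /* assumed byte[] member */
        final int width = sample.width;           /* assumed member */
        final int height = sample.height;         /* assumed member */
        renderHandler.post(() -> consumer.onFrame(copy, width, height));
    }

    public void release() {
        renderThread.quitSafely();
    }
}
A separate render thread keeps the SDK callback thread responsive. Call release() when you stop custom rendering, for example when you unregister the observer.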
textureID
Callback after the OpenGL context is created
@Override
public void onTextureCreate(long context) {
    context_ = context;
    render.bind(context);
    Log.d(TAG, "texture context: " + context_ + " create!");
}
Callback for OpenGL texture update
@Override
public int onTextureUpdate(int textureId, int width, int height, AliRtcEngine.AliRtcVideoSample videoSample) {
    /*
     * Process the texture ID.
     */
    render.drawTexture(textureId);
    return textureId;
}
Callback for OpenGL context deletion
@Override
public void onTextureDestroy() {
    render.free(context_);
    Log.d(TAG, "texture context: " + context_ + " destroy!");
}
4. Exit
/* I420 (buffer-based) */
_engine.unregisterVideoSampleObserver();
/* textureID */
_engine.unRegisterLocalVideoTextureObserver();