
Apsara Video SDK: FAQ about Queen SDK

Last Updated: Mar 21, 2024

This topic provides answers to frequently asked questions (FAQ) about Queen SDK.

Table of contents

General FAQ

Technical FAQ

What is Queen SDK?

Queen SDK is developed by Alibaba Cloud to enable various real-time effects such as retouching, shaping, filters, stickers, makeup, gesture recognition, and auto chroma key in various video shooting scenarios.

Can Queen SDK be integrated with third-party SDKs?

Yes, Queen SDK is a standalone SDK that can be used in combination with many popular commercial SDKs, such as ApsaraVideo Live SDK, Tencent Cloud SDK, Qiniu Cloud Live Video Cloud SDK, Agora SDK, and Librestreaming SDK.

The combination is easy to implement. You can integrate a third-party SDK with Queen SDK if the third-party SDK can return texture or image data buffers to the application side for processing. To facilitate quick and convenient use in various business scenarios, Queen SDK integrates with common SDKs of Alibaba Cloud, such as the short video SDK, Push SDK, interactive streaming SDK, and Real-Time Communication (RTC) SDK. For more information, see Queen_SDK_Android.
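For illustration, the following Java sketch shows the general callback pattern: the third-party SDK hands each camera texture to the application, the application processes it with Queen SDK, and the processed texture is handed back. The ThirdPartyVideoSource.TextureCallback interface and the getOutputTexture() call are hypothetical placeholders that only illustrate the data flow; setInputTexture and updateInputTextureBufferAndRunAlg are the Queen SDK operations referenced later in this topic, and the angle parameters are simplified to 0.

// Sketch only. ThirdPartyVideoSource.TextureCallback and getOutputTexture()
// are hypothetical; check your SDK versions for the real callback and the
// real way to obtain the processed texture from Queen SDK.
public class BeautyFrameProcessor implements ThirdPartyVideoSource.TextureCallback {
    private final QueenEngine engine; // created and used on the SDK's GL thread

    public BeautyFrameProcessor(QueenEngine engine) {
        this.engine = engine;
    }

    // Called by the third-party SDK for every camera frame on its GL thread.
    @Override
    public int onTextureFrame(int textureId, int width, int height, boolean isOES) {
        engine.setInputTexture(textureId, width, height, isOES);   // hand the texture to Queen SDK
        engine.updateInputTextureBufferAndRunAlg(0, 0, 0, false);  // run the effect algorithms
        return engine.getOutputTexture();                          // hypothetical: return the processed texture
    }
}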

How do I select an appropriate edition of Queen SDK?

To meet the fine-grained business requirements of customers, Alibaba Cloud provides the following five editions of Queen SDK tailored for common business scenarios:

  • Basic Edition

  • Pro Edition

  • Ultimate Edition

  • Auto Chroma Key Edition

  • Gesture Recognition Edition

You can select any single edition suited to your requirements, or combine multiple editions for a more comprehensive solution. For more information about the features of the first three editions, see Download SDKs.

Auto Chroma Key Edition supports green/blue screen keying and keying in real-time scenes. Specifically, it provides a solution for solid color background keying and supports a variety of common solid color backdrops, including green, blue, teal, and purple screens.

Gesture Recognition Edition is used to recognize common gestures and body postures.

Auto Chroma Key Edition and Gesture Recognition Edition of Queen SDK are not included in the demo package due to their large package sizes and low use frequency. If you want to use the two editions, submit a request on the Create Application and Bind License page or join the DingTalk group (ID 34197869) to obtain a trial.

How do I apply for different editions of Queen SDK?

You can select an edition based on your business requirements. Then, submit the required information on the Create Application and Bind License page. After your request is received, Alibaba Cloud handles the request within two business days. For more information, see Obtain a license of Queen SDK.

How do I obtain Auto Chroma Key Edition and Gesture Recognition Edition of Queen SDK?

Auto Chroma Key Edition and Gesture Recognition Edition of Queen SDK provide specialized features for unique scenarios and require customization to fit individual customer needs. The two editions are not available for direct download. You are welcome to download and try out the demo to check whether your expectations are met. For more information, see Use Queen SDK. If you want to know more about how to integrate and use the two editions, submit a request on the Create Application and Bind License page or join the DingTalk group (ID 34197869).

How do I download necessary resources to use Queen SDK?

If you do not use the low-code integration solution that provides a built-in user interface (UI) when you use ApsaraVideo MediaBox SDK, you must manually download resource files related to algorithm models. Otherwise, some retouching features that depend on the algorithm models do not take effect.

After you download the necessary resources, you can use the features of Queen SDK. For more information about how to download resources, see the relevant resource download documentation.

What are the requirements for calling the API operations of Queen SDK?

Make sure that the following requirements are met when you call the API operations of Queen SDK, including the operations for creation, configuration, usage, and destruction:

  • All API operations must be called by using the same thread.

  • The thread must have a Graphics Library (GL) context. If no GL context exists, you can call an API operation of Queen SDK to create one.
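As a minimal sketch, the following Java class routes every Queen SDK call through one dedicated thread so that creation, configuration, usage, and destruction all happen on the same thread. The engine creation call is a placeholder; use the actual creation operation described in the integration documentation, and make sure the thread owns a GL context (or create one through the corresponding Queen SDK operation).

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// All Queen SDK operations must run on one thread that owns a GL context.
// A single-threaded executor is a simple way to enforce this.
public class QueenThreadOwner {
    private final ExecutorService glExecutor = Executors.newSingleThreadExecutor();
    private QueenEngine engine; // accessed only from tasks submitted to glExecutor

    public void start() {
        glExecutor.execute(() -> {
            // Placeholder: create the engine with the creation operation of your
            // Queen SDK version. If this thread has no GL context, create one
            // through the corresponding Queen SDK operation first.
            engine = createEngine();
        });
    }

    public void runOnEngineThread(Runnable task) {
        // Configuration, rendering, and destruction calls are funneled through the same thread.
        glExecutor.execute(task);
    }

    private QueenEngine createEngine() {
        // Hypothetical helper; replace with the documented creation call.
        return null;
    }
}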

How do I troubleshoot and debug errors?

Android:

After an engine is created, enable debugging.

engine.enableDebugLog();

You can enter the keyword Queen to search for the output logs that record the operational status of Queen SDK.

You can obtain detailed information from DEBUG-level logs and more concise information from INFO-level logs.

How do I specify the input and output parameters?

Queen SDK provides a built-in algorithm that automatically calculates the optimal input parameters for the current image. To view the expected input parameter values, perform the following steps:

  1. When you create an engine, set the input mode of the algorithm to automatic.

    config.algInputMode = AlgInputMode.kModeAutomatic;

  2. Enable debugging after the engine is created.

    engine.enableDebugLog();

    You can filter logs to view the Logcat output.

    In the log examples, the values of input_angle, out_angle, and out_flip are algorithm-generated values, which are the expected input parameter values for the current image.

Important

Take note that using the intelligent algorithm incurs additional performance costs. After the test is complete, revert the changes made in the preceding two steps in a timely manner.
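Putting the two steps together, a test-only setup might look like the following sketch. The config.algInputMode assignment and the engine.enableDebugLog() call are taken from the steps above; the QueenConfig type name and the createEngine call are placeholders for your actual creation code. Remove both test settings after you have read the expected values from the logs.

// Test-only setup: discover the expected input parameter values, then revert.
QueenConfig config = new QueenConfig();            // placeholder for your actual config type
config.algInputMode = AlgInputMode.kModeAutomatic; // step 1: let the algorithm compute the input parameters

QueenEngine engine = createEngine(config);         // placeholder for your actual creation call
engine.enableDebugLog();                           // step 2: log input_angle, out_angle, and out_flip

// Filter Logcat by the keyword "Queen" and read the logged input_angle,
// out_angle, and out_flip values. Revert both settings after testing because
// the automatic input mode incurs additional performance costs.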

How does Queen SDK perform?

After years of iteration, Queen SDK aims to lead the industry in performance, retouching quality, and efficiency. Performance optimization is a top priority: Alibaba Cloud performs targeted optimizations and iterations for Queen SDK, especially for the low-end devices and earlier operating system versions that are prevalent in Southeast Asia. Queen SDK also offers multiple versions of each feature to cater to different business requirements. For example, the retouching module supports advanced, power-saving, and smart adjustment modes. During regular iterations, each version is covered by a dedicated multidimensional performance testing report and is verified at scale by a large number of internal and external users.

How do I resolve the black screen issue?

Causes: In most cases, a black screen appears when an invalid texture ID is used or an error occurs in the texture generation process. Occasionally, a black screen issue may also occur if you improperly use the returned texture ID or use an invalid texture ID to render your business layer.

Solution: The rendering layer of Queen SDK is built on Open Graphics Library (OpenGL). To resolve the black screen issue, make sure that the texture ID passed to Queen SDK is valid. If the input texture is valid, Queen SDK ensures that the output texture can be displayed: it returns the original texture even if no effect is applied, parameters are invalid, or you have not obtained a license to use Queen SDK.
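On Android, one way to rule out an invalid texture ID is to verify it on the GL thread before it is passed to Queen SDK, for example with GLES20.glIsTexture. This is a diagnostic sketch only; the engine variable and the width, height, and isOES values are assumed to come from your own pipeline.

import android.opengl.GLES20;
import android.util.Log;

// Diagnostic check: must run on the thread that owns the GL context.
// Note that glIsTexture returns true only for texture names that have been
// generated and bound at least once.
void feedFrame(int textureId, int width, int height, boolean isOES) {
    if (!GLES20.glIsTexture(textureId)) {
        Log.w("Queen", "textureId " + textureId + " is not a valid GL texture");
        return; // an invalid texture ID is the most common cause of a black screen
    }
    engine.setInputTexture(textureId, width, height, isOES);
}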

After my application is integrated with Queen SDK, why does my screen flicker with a solid color or change color when I move the mobile phone, without displaying any image?

Causes: The issue may occur if the construction parameter toScreen of the engine is set to true, the input texture type is oes, and the render() method is used without a parameter.

Solution: Replace render() with renderTexture(matrix). If the input texture type is oes, you must use renderTexture. The value of the matrix parameter is obtained from surfaceTexture.
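In practice, this means fetching the transform matrix from the SurfaceTexture for every frame and passing it to renderTexture, roughly as in the following sketch. The engine and the SurfaceTexture are assumed to already exist in your camera pipeline.

import android.graphics.SurfaceTexture;

// Per-frame rendering for an oes input texture with toScreen set to true.
private final float[] transformMatrix = new float[16];

void onDrawFrame(SurfaceTexture surfaceTexture) {
    surfaceTexture.updateTexImage();                     // latch the latest camera frame
    surfaceTexture.getTransformMatrix(transformMatrix);  // matrix required by renderTexture
    engine.renderTexture(transformMatrix);               // use this instead of engine.render()
}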

What do I do if the application exits unexpectedly after startup?

Make sure that textureId in engine.setInputTexture is correct, and then restart the application.

How do I resolve the issue where the sticker or makeup feature cannot recognize faces when the image is rotated by 90 degrees and displayed in landscape mode?

Causes: The issue may occur if the construction parameter toScreen of the engine is set to true, the input texture type is oes, the render() method is used without a parameter, and the algorithm uses the frame fetch method instead of the bytebuffer method.

Solution: Switch to using the bytebuffer method. However, switching methods may result in the face image being incorrectly scaled up or down due to changes in width and height.
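A minimal sketch of the byte-buffer path is shown below, based on the updateInputDataAndRunAlg signature described in the next answer. The QUEEN_FORMAT_NV21 constant, the stride value, and the angle and flip values are assumptions that depend on your camera setup; the QueenCameraHelper class mentioned in the next answer can supply the angle values.

// Byte-buffer path: pass the raw camera frame so that the algorithm receives
// correctly oriented data even when the preview is rotated by 90 degrees.
void onPreviewFrame(byte[] nv21Data, int width, int height) {
    int inputAngle = 90;  // example value for a portrait back camera; adjust per device
    int outAngle = 0;
    int flipAxis = 0;
    engine.updateInputDataAndRunAlg(nv21Data, QUEEN_FORMAT_NV21,
            width, height, width /* stride, assumed equal to width */,
            inputAngle, outAngle, flipAxis, false);
}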

Why are the advanced retouching, makeup, and sticker features ineffective while the basic retouching feature is effective?

If the basic retouching feature works, the initialization, parameter settings, and rendering of QueenEngine are valid. The ineffectiveness of the advanced retouching, makeup, and sticker features is usually caused by invalid parameter settings: if the parameters are invalid, faces cannot be recognized and all effects that require facial keypoints become unavailable. The following items describe common invalid parameter settings; a code sketch follows the list:

  • You specify invalid values for the width and height parameters of the input texture in the setInputTexture(int texture, int width, int height, boolean isOES) operation. The values of the width and height parameters must be the width and height of the texture that is specified by the texture parameter. The isOES parameter specifies whether the texture is an oes texture, which is specific to Android, and determines whether QueenEngine needs to perform texture display conversion. If QueenEngine needs to perform texture display conversion, the correct matrix of the current camera must be passed when rendering, for example by using the renderTexture(matrix) method. The width and height that you specify determine the aspect ratio of the displayed rendering view, which may affect the scaling ratio of the rendering view to which the effects of advanced features are applied. A common symptom is that the effects of advanced features are applied, but the image is enlarged and deformed.

  • You specify invalid values for the input width and height parameters for the updateInputDataAndRunAlg(byte[] imageData, int format, int width, int height, int stride, int inputAngle, int outAngle, int flipAxis, boolean reuseData) operation. The values of the width and height parameters specify the actual width and height of the current input data ImageData. The width and height can be the same as the width and height specified in the preceding scenario, or can be reversed. For example, the value of buffer directly obtained from the camera on Android devices is the buffer when the image is rotated by 90 degrees, and the width and height are reversed. The width and height that you specify determine whether faces can be recognized and whether the features that are related to facial recognition are available.

  • You specify invalid values for input parameters such as inputAngle, outAngle, and flipAxis for the updateInputDataAndRunAlg(byte[] imageData, int format, int width, int height, int stride, int inputAngle, int outAngle, int flipAxis, boolean reuseData) operation, or you specify invalid values for input parameters for the updateInputTextureBufferAndRunAlg(int inputAngle, int outAngle, int flipAxis, boolean usePreviousFrame) operation. The inputAngle, outAngle, and flipAxis parameters are required for the facial recognition feature. The inputAngle parameter determines how the algorithm uses the input data or texture, whether images need to be rotated, and how many degrees images are rotated. The outAngle parameter determines how the algorithm renders and displays recognized results, whether images need to be rotated, and how many degrees images are rotated. The flipAxis parameter is an enumeration value, which is defined in QueenEngine and determines whether final rendered images are flipped symmetrically, and whether the flip is along the X-axis or the Y-axis. The preceding parameters are critical for the algorithm that is used to implement facial recognition. The parameters are closely related to the current camera angle and whether the front or rear camera is used. The inputAngle, outAngle, and flipAxis parameters are encapsulated into the tool class QueenCameraHelper.java. You can directly use or adjust the values of these parameters for different applications based on your business requirements.
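As a summary of the three items above, a per-frame call sequence with consistent parameters might look like the following sketch. The field names on QueenCameraHelper (inputAngle, outAngle, flipAxis) are assumptions based on the description above; QUEEN_FORMAT_NV21 and cameraTransformMatrix are placeholders; textureWidth and textureHeight describe the GL texture, while bufferWidth and bufferHeight describe the raw camera buffer and may be swapped relative to the texture.

// Per-frame sequence with consistent parameters (sketch; helper field names
// and the format constant are assumed, not taken from the SDK headers).
void processFrame(int oesTextureId, int textureWidth, int textureHeight,
                  byte[] cameraBuffer, int bufferWidth, int bufferHeight) {
    // 1. Texture width and height must match the texture itself; isOES is true for camera textures.
    engine.setInputTexture(oesTextureId, textureWidth, textureHeight, true);

    // 2. Buffer width and height must match the raw camera data, which may be
    //    rotated by 90 degrees (and therefore swapped) relative to the texture.
    engine.updateInputDataAndRunAlg(cameraBuffer, QUEEN_FORMAT_NV21,
            bufferWidth, bufferHeight, bufferWidth,
            cameraHelper.inputAngle, cameraHelper.outAngle, cameraHelper.flipAxis, false);

    // 3. Render with the camera transform matrix because the input is an oes texture.
    engine.renderTexture(cameraTransformMatrix);
}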