This topic describes how to integrate Queen SDK for Android and use it to implement face retouching effects. This topic also provides sample code and references for Queen SDK for Android.

Prerequisites

The required development environments are prepared. The following table describes the requirements.

Environment       Requirement
Android           Android 4.3 or later.
Java              Java 1.7 or later.
API level         Android API level 18 or later.
Android Studio    Android Studio 2.3 or later. To download it, visit the Android Studio website.

References

Item                      Reference
SDK references            SDK references
Sample project            Sample project

In the sample project, the assets directory contains all the image resources of the demo.

Demo project on GitHub    Demo project

The demo project provides integration examples of the following SDKs: Push SDK, Qiniu Cloud Live Video Cloud SDK, and Tencent Real-Time Communication SDK.

Integrate the SDK by using Maven dependencies

  1. Add the Alibaba Cloud Maven repository to the project-level build.gradle file.
    allprojects {
        repositories {
            google()
            jcenter()
            maven { url "https://maven.aliyun.com/repository/releases" }
        }
    }
  2. Add one of the following Queen SDK dependencies to the app-level build.gradle file, based on the edition of Queen SDK for Android that you use.
    • Pro Edition:
      implementation "com.aliyun.maliang.android:queen:2.3.0-official-pro"
    • Ultimate Edition:
      implementation "com.aliyun.maliang.android:queen:2.3.0-official-ultimate"
    • Full Edition:
      implementation "com.aliyun.maliang.android:queen:2.3.0-official-full"
Note

The libraries of face retouching effects and the libraries of SDKs such as Alibaba Cloud Real-Time Communication (RTC) SDK may contain duplicate files, which can cause your application to fail to be packaged. In this case, add the packagingOptions block to the build.gradle file to avoid file conflicts, as shown in the following code. If no duplicate files exist, you do not need to add this block.

android {
    ...
    compileSdkVersion ...
    ...
    packagingOptions {
        pickFirst '**/libMNN.so'           // If duplicate libMNN.so files exist, add this line.
        pickFirst '**/libMNN_CL.so'        // If duplicate libMNN_CL.so files exist, add this line.
        pickFirst '**/libc++_shared.so'    // If duplicate libc++_shared.so files exist, add this line.
    }
    ...
}

Integrate the SDK by importing .aar files

  1. Download the SDK package and decompress the package.
  2. Copy the .aar files in the decompressed SDK package to the libs directory of your project.
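If your project does not already pick up local .aar files, declare the libs directory as a dependency source in the app-level build.gradle file. The following is a minimal sketch; verify the actual .aar file names in the package that you downloaded:

```groovy
dependencies {
    // Reference all .aar files that were copied into app/libs.
    implementation fileTree(dir: 'libs', include: ['*.aar'])
}
```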

Configure the license

Obtain a license in advance. For information about how to obtain a license, see Obtain a license of Queen SDK. After you obtain a license, perform the following steps to configure the LicenseKey and LicenseFile in the project.

Note
  • If you integrate Queen SDK of ApsaraVideo Live and the short video SDK of ApsaraVideo VOD at the same time, the two SDKs share the same LicenseKey and LicenseFile, so you need to configure them only once. Make sure that the latest license file is used.

  • If the SDK that you purchase is updated or needs to be renewed, you must also update the license file. Perform the following steps to update the license file:
    1. Send an email to obtain the latest license file. For more information, see Request a license.
    2. After you obtain the latest license file, perform the steps to configure the license.
Import the license file to the assets directory of the project, and then add two meta-data entries under the application node in the AndroidManifest.xml file. In the following sample code, com.aliyun.alivc_license.licensekey and com.aliyun.alivc_license.licensefile are fixed entry names; only the values vary.
<application
   android:icon="@drawable/icon"
   android:label="@string/app_name" >
    <!-- Enter the LicenseKey that you obtain from the email. -->
    <meta-data
         android:name="com.aliyun.alivc_license.licensekey"
         android:value="Your LicenseKey"/>
    <!-- Enter the path of the license file relative to the assets directory. Example: alivc_license/AliVideoCert.crt. -->
    <meta-data
       android:name="com.aliyun.alivc_license.licensefile"
       android:value="Your LicenseFile Path"/>
  ...
</application>

Sample code

Note
If you upgrade Queen SDK from V1.X.X to V2.0.0 or later, take note of the following requirements:
  • The package name of the Java API is changed from com.taobao.android.libqueen to com.aliyun.android.libqueen.
  • In V2.0.0 and later versions, the authentication method for Queen SDK is changed. You must send an email to videosdk@service.aliyun.com again to request a LicenseKey and LicenseFile, and specify Request License File in the title of the email. After your email is received, Alibaba Cloud sends you the LicenseKey and LicenseFile within 48 hours on business days. After you receive the LicenseKey and LicenseFile, refer to Configure the license to complete the configuration.
  • Create a QueenEngine instance. Configure the texture and view parameters for initialization.
    QueenEngine engine;
    try {
        com.aliyun.android.libqueen.QueenConfig config = new com.aliyun.android.libqueen.QueenConfig();
        // The value true indicates that the output is returned to the display area of the OpenGL interface. Default value: false.
        config.toScreen = false;
        // The value true indicates that log debugging is enabled. Default value: false. We recommend that you enable log debugging only in the Debug package to prevent performance impact.
        config.enableDebugLog = false;
        // The value true indicates that the GL context must be created for the QueenEngine instance. Default value: false.
        config.withContext = false;
        // The value true indicates that a separate thread must be created. Default value: false.
        config.withNewGlThread = false;
        if (config.withContext || config.withNewGlThread) {
            // If you want to create the GL context in a separate thread for the QueenEngine instance and share the GL context of the current thread, configure the current GL context.
            if (Build.VERSION.SDK_INT >= 21) {
              config.shareGlContext = EGL14.eglGetCurrentContext().getNativeHandle();
            } else {
              config.shareGlContext = EGL14.eglGetCurrentContext().getHandle();
            }
        }
        // Initialize QueenEngine by passing the android.content.Context object.
        // The second parameter is used to configure the created instance.
        engine = new QueenEngine(mContext, config);
    } catch (InitializationException e) {
        e.printStackTrace();
    }
    
    // Set the input texture that is used for rendering during retouching.
    // The fourth parameter specifies whether the type of the input texture is OES.
    engine.setInputTexture(textureId, textureWidth, textureHeight, true);
    
    // (Optional) Obtain the output texture of retouching, which can be used for other services. If you want to return the output texture in the same direction as the input texture, set the keepInputDirection parameter to true for engine.autoGenOutTexture and engine.updateOutTexture.
    Texture2D outTexture = engine.autoGenOutTexture(true);
    
    // Set the size of the view.
    engine.setScreenViewport(0, 0, viewWidth, viewHeight);
    // Enable the log printing and debugging mode. We recommend that you enable log debugging only in the Debug package to prevent performance impact.
    engine.enableDebugLog();
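setScreenViewport takes the view size in pixels. If the input texture and the view have different aspect ratios, you typically compute a centered, aspect-preserving viewport first. The following helper is a sketch of our own (ViewportUtil is not part of Queen SDK):

```java
// Hypothetical helper: compute a centered, aspect-preserving viewport
// (x, y, width, height) that can be passed to engine.setScreenViewport(...).
final class ViewportUtil {
    static int[] letterbox(int texW, int texH, int viewW, int viewH) {
        float texAspect = (float) texW / texH;
        float viewAspect = (float) viewW / viewH;
        int w, h;
        if (texAspect > viewAspect) {
            w = viewW;                          // texture is wider: fit width
            h = Math.round(viewW / texAspect);
        } else {
            h = viewH;                          // texture is taller: fit height
            w = Math.round(viewH * texAspect);
        }
        int x = (viewW - w) / 2;                // center horizontally
        int y = (viewH - h) / 2;                // center vertically
        return new int[]{x, y, w, h};
    }
}
```

For example, a 1280x720 texture shown in a 1080x1920 portrait view yields a 1080x608 viewport centered vertically; the four values map directly onto the setScreenViewport arguments.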
  • Set the parameters for skin whitening and basic retouching.
    • Skin whitening
      // Specify whether to enable skin whitening.
      engine.enableBeautyType(BeautyFilterType.kSkinWhiting, true);
      // Set the level of skin whitening. Valid values: [0,1].
      engine.setBeautyParam(
          BeautyParams.kBPSkinWhitening, 
          0.3f
      );
    • Basic retouching
      /**
      * Specify whether to enable skin smoothing and image sharpening. 
      * The third parameter specifies the basic retouching mode. If you set this parameter to kBMSkinBuffing_Natural, the retouching effect is more natural and more details are retained. If you set this parameter to kBMSkinBuffing_Strong, the effect is more exaggerated and more details are removed. 
       */
      engine.enableBeautyType(BeautyFilterType.kSkinBuffing, true, BeautyFilterMode.kBMSkinBuffing_Natural);
      // Set the level of skin smoothing. Valid values: [0,1].
      engine.setBeautyParam(BeautyParams.kBPSkinBuffing, 0.6f);
      // Set the level of image sharpening. Valid values: [0,1].
      engine.setBeautyParam(BeautyParams.kBPSkinSharpen, 0.2f);
  • Set the parameters for advanced retouching.
    If you use advanced retouching effects, such as advanced face retouching, face shaping, body shaping, makeup, filters, and stickers, you must call updateInputTextureBufferAndRunAlg.
    if (mUseTextureBuffer) { // Run the algorithm on texture data.
        engine.updateInputTextureBufferAndRunAlg(
                mCamera.inputAngle, mCamera.outAngle,
                mCamera.flipAxis, false);
    } else {
        // Set the parameters for the input frame image stream.
        engine.updateInputDataAndRunAlg(
                imageData,          // The data of the frame image stream.
                ImageFormat.NV21,   // The format of the frame image stream.
                imageWidth,         // The width of the frame images.
                imageHeight,        // The height of the frame images.
                0,                  // The image stride, which indicates the number of bytes in each row. The value 0 indicates that the stride is detected automatically.
                mCamera.inputAngle, // The rotation angle of the input frame images. For the formula, see the sample project that is provided in the "References" section of this topic.
                mCamera.outAngle,   // The rotation angle of the output frame images. For the formula, see the sample project that is provided in the "References" section of this topic.
                mCamera.flipAxis    // How to flip the input frame images. The value 0 indicates no flip. The value 1 indicates a flip along the x-axis. The value 2 indicates a flip along the y-axis.
        );
    }
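mCamera.inputAngle and mCamera.outAngle above are derived from the camera sensor orientation and the display rotation; the exact formula is in the sample project. For reference, the standard Android orientation calculation (see android.hardware.Camera.setDisplayOrientation) looks like the following sketch. CameraAngles and its mapping onto inputAngle/outAngle are our own illustration, not part of Queen SDK:

```java
// Sketch of the standard Android camera-orientation math. How the demo maps
// this result onto inputAngle/outAngle may differ; treat names as illustrative.
final class CameraAngles {
    /**
     * @param sensorOrientation Camera.CameraInfo.orientation (0/90/180/270)
     * @param displayRotation   current display rotation in degrees (0/90/180/270)
     * @param frontFacing       true for the front-facing camera
     */
    static int displayOrientation(int sensorOrientation,
                                  int displayRotation,
                                  boolean frontFacing) {
        if (frontFacing) {
            int result = (sensorOrientation + displayRotation) % 360;
            return (360 - result) % 360;   // compensate for the mirror
        }
        return (sensorOrientation - displayRotation + 360) % 360;
    }
}
```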
    • Advanced retouching
      // Specify whether to enable advanced retouching.
      engine.enableBeautyType(BeautyFilterType.kFaceBuffing, true);
      // Set the level of nasolabial fold removal. Valid values: [0,1].
      engine.setBeautyParam(BeautyParams.kBPNasolabialFolds, 0.3f); 
      // Set the level of eye-bag removal. Valid values: [0,1].
      engine.setBeautyParam(BeautyParams.kBPPouch, 0.3f); 
      // Set the level of teeth whitening. Valid values: [0,1].
      engine.setBeautyParam(BeautyParams.kBPWhiteTeeth, 0.2f); 
      // Set the level of lipstick effects. Valid values: [0,1].
      engine.setBeautyParam(BeautyParams.kBPLipstick, 0.2f); 
      // Set the level of blush effects. Valid values: [0,1].
      engine.setBeautyParam(BeautyParams.kBPBlush, 0.2f);
      // Set the level of eye brightening. Valid values: [0,1].
      engine.setBeautyParam(BeautyParams.kBPBrightenEye, 1.0f); 
      // Set the level of rosy cheeks. Valid values: [0,1].
      engine.setBeautyParam(BeautyParams.kBPBlush, 1.0f);
      // Set the color of lipstick. Valid values: [-0.5,0.5]. You must set the color, saturation, and brightness together. You can set this parameter to -0.125 for ochre, -0.1 for pink, 0.0 for vintage red and orange red, -0.2 for magenta, -0.08 for true red, -0.42 for purple, 0.125 for orange, and 0.25 for yellow.
      engine.setBeautyParam(BeautyParams.kBPLipstickColorParam, 0.0f);
      // Set the saturation of lipstick. Valid values: [0,1]. You must set the color, saturation, and brightness together. You can set this parameter to 0.25 for ochre and orange, 0.125 for pink, 1.0 for vintage red and true red, 0.35 for magenta, orange red, and purple, and 0.45 for yellow.
      engine.setBeautyParam(BeautyParams.kBPLipstickGlossParam, 0.0f);
      // Set the brightness of lipstick. Valid values: [0,1]. You must set the color, saturation, and brightness together. You can set this parameter to 0.4 for ochre, 0.0 for pink, magenta, true red, orange red, purple, orange, and yellow, and 0.2 for vintage red.
      engine.setBeautyParam(BeautyParams.kBPLipstickBrightnessParam, 1.0f);
      // Set the level of wrinkle removal. Valid values: [0,1].
      engine.setBeautyParam(BeautyParams.kBPWrinkles, 0.2f);
      // Set the level of skin brightening. Valid values: [0,1].
      engine.setBeautyParam(BeautyParams.kBPBrightenFace, 0.2f);
      // Enable the HSV color model.
      engine.enableBeautyType(BeautyFilterType.kHSV, true);
      // Set the level of saturation. Valid values: [-1,1].
      engine.setBeautyParam(BeautyParams.kBPHSV_SATURATION, 0.2f);
      // Set the level of contrast. Valid values: [-1,1].
      engine.setBeautyParam(BeautyParams.kBPHSV_CONTRAST, 0.2f);
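The lipstick color, saturation, and brightness values listed in the comments above must be set together per look. The following sketch of our own collects them into presets (LipstickPresets is not part of Queen SDK; the values are copied from the comments above and each entry holds {kBPLipstickColorParam, kBPLipstickGlossParam, kBPLipstickBrightnessParam}):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical preset table built from the documented lipstick values:
// {color, saturation, brightness} per look.
final class LipstickPresets {
    static final Map<String, float[]> PRESETS = new HashMap<>();
    static {
        PRESETS.put("ochre",       new float[]{-0.125f, 0.25f,  0.4f});
        PRESETS.put("pink",        new float[]{-0.1f,   0.125f, 0.0f});
        PRESETS.put("vintage red", new float[]{ 0.0f,   1.0f,   0.2f});
        PRESETS.put("orange red",  new float[]{ 0.0f,   0.35f,  0.0f});
        PRESETS.put("magenta",     new float[]{-0.2f,   0.35f,  0.0f});
        PRESETS.put("true red",    new float[]{-0.08f,  1.0f,   0.0f});
        PRESETS.put("purple",      new float[]{-0.42f,  0.35f,  0.0f});
        PRESETS.put("orange",      new float[]{ 0.125f, 0.25f,  0.0f});
        PRESETS.put("yellow",      new float[]{ 0.25f,  0.45f,  0.0f});
    }
}
```

To apply a look, pass the three components of one entry to the three setBeautyParam calls shown above.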
    • Face play
      // Configure the pixelation effect.
      engine.enableBeautyType(BeautyFilterType.kBTEffectMosaicing, true);
      engine.setBeautyParam(BeautyParams.kBPEffects_Mosaicing, 0.45f);
    • Face shaping
      /**
       * Specify whether to enable face shaping. The second parameter specifies whether to enable face shaping. The third parameter specifies whether to enable debugging.
       * The fourth parameter specifies the face shaping mode. Valid values: kBMFaceShape_Baseline, kBMFaceShape_Main, kBMFaceShape_High, and kBMFaceShape_Max, with the deformation level increased in sequence.
       */
      engine.enableBeautyType(BeautyFilterType.kFaceShape, true, false, BeautyFilterMode.kBMFaceShape_Main);
      /**
       * Face shaping: cheekbones<br />
       * Valid values: [0,1].
       */
      engine.updateFaceShape(FaceShapeType.typeCutCheek, 0.0f);
      /**
       * Face shaping: cheekbone narrowing<br />
       * Valid values: [0,1].
       */
      engine.updateFaceShape(FaceShapeType.typeCutFace, 0.0f);
      /**
       * Face shaping: face slimming<br />
       * Valid values: [0,1].
       */
      engine.updateFaceShape(FaceShapeType.typeThinFace, 0.0f);
      /**
       * Face shaping: face length<br />
       * Valid values: [0,1].
       */
      engine.updateFaceShape(FaceShapeType.typeLongFace, 0.0f);
      /**
       * Face shaping: chin shortening<br />
       * Valid values: [-1,1].
       */
      engine.updateFaceShape(FaceShapeType.typeLowerJaw, 0.0f);
      /**
       * Face shaping: chin lengthening<br />
       * Valid values: [-1,1].
       */
      engine.updateFaceShape(FaceShapeType.typeHigherJaw, 0.0f);
      /**
       * Face shaping: chin slimming<br />
       * Valid values: [0,1].
       */
      engine.updateFaceShape(FaceShapeType.typeThinJaw, 0.0f);
      /**
       * Face shaping: jaw slimming<br />
       * Valid values: [0,1].
       */
      engine.updateFaceShape(FaceShapeType.typeThinMandible, 0.0f);
      /**
       * Face shaping: big eyes<br />
       * Valid values: [0,1].
       */
      engine.updateFaceShape(FaceShapeType.typeBigEye, 0.0f);
      /**
       * Face shaping: canthus shaping 1<br />
       * Valid values: [0,1].
       */
      engine.updateFaceShape(FaceShapeType.typeEyeAngle1, 0.0f);
      /**
       * Face shaping: eye distance<br />
       * Valid values: [-1,1].
       */
      engine.updateFaceShape(FaceShapeType.typeCanthus, 0.0f);
      /**
       * Face shaping: eye distance increase<br />
       * Valid values: [-1,1].
       */
      engine.updateFaceShape(FaceShapeType.typeCanthus1, 0.0f);
      /**
       * Face shaping: canthus shaping 2<br />
       * Valid values: [-1,1].
       */
      engine.updateFaceShape(FaceShapeType.typeEyeAngle2, 0.0f);
      /**
       * Face shaping: eye height<br />
       * Valid values: [-1,1].
       */
      engine.updateFaceShape(FaceShapeType.typeEyeTDAngle, 0.0f);
      /**
       * Face shaping: nose slimming<br />
       * Valid values: [0,1].
       */
      engine.updateFaceShape(FaceShapeType.typeThinNose, 0.0f);
      /**
       * Face shaping: nasal alar slimming<br />
       * Valid values: [0,1].
       */
      engine.updateFaceShape(FaceShapeType.typeNosewing, 0.0f);
      /**
       * Face shaping: nose length<br />
       * Valid values: [-1,1].
       */
      engine.updateFaceShape(FaceShapeType.typeNasalHeight, 0.0f);
      /**
       * Face shaping: length of the nose tip<br />
       * Valid values: [-1,1].
       */
      engine.updateFaceShape(FaceShapeType.typeNoseTipHeight, 0.0f);
      /**
       * Face shaping: lip width<br />
       * Valid values: [-1,1].
       */
      engine.updateFaceShape(FaceShapeType.typeMouthWidth, 0.0f);
      /**
       * Face shaping: lip size<br />
       * Valid values: [-1,1].
       */
      engine.updateFaceShape(FaceShapeType.typeMouthSize, 0.0f);
      /**
       * Face shaping: lip height<br />
       * Valid values: [-1,1].
       */
      engine.updateFaceShape(FaceShapeType.typeMouthHigh, 0.0f);
      /**
       * Face shaping: philtrum<br />
       * Valid values: [-1,1].
       */
      engine.updateFaceShape(FaceShapeType.typePhiltrum, 0.0f);
      /**
       * Face shaping: hairline
       * Valid values: [-1,1].
       */
      engine.updateFaceShape(FaceShapeType.typeHairLine, 0.0f);
      /**
       * Face shaping: smiling
       * Valid values: [-1,1].
       */
      engine.updateFaceShape(FaceShapeType.typeSmile, 0.0f);
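Because the valid range differs per face shaping type ([0,1] for some, [-1,1] for others), a small guard that clamps a value before calling engine.updateFaceShape can catch out-of-range input. The following is a sketch of our own; the range flags are taken from the comments above and the table is intentionally partial:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical guard: clamp face-shape values to the documented ranges
// before passing them to engine.updateFaceShape(type, value).
final class FaceShapeRanges {
    // true  -> valid range is [-1, 1]
    // false -> valid range is [0, 1]
    private static final Map<String, Boolean> SIGNED = new HashMap<>();
    static {
        SIGNED.put("typeThinFace", false);
        SIGNED.put("typeBigEye", false);
        SIGNED.put("typeLowerJaw", true);
        SIGNED.put("typeMouthWidth", true);
        // ... extend with the remaining types from the list above
    }

    static float clamp(String type, float value) {
        boolean signed = Boolean.TRUE.equals(SIGNED.get(type));
        float lo = signed ? -1f : 0f;
        return Math.max(lo, Math.min(1f, value));
    }
}
```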
    • Body shaping
      /**
      * Specify whether to enable body shaping. The second parameter specifies whether to enable body shaping. The third parameter specifies whether to enable debugging.
       */
      engine.enableBeautyType(BeautyFilterType.kBodyShape, true, false);
      /**
      * Body shaping: whole body
      * Valid values: [-1,1].
       */
      engine.updateBodyShape(BodyShapeType.kFullBody, 1.0f);
      /**
      * Body shaping: small head
      * Valid values: [-1,1].
       */
      engine.updateBodyShape(BodyShapeType.kSmallHead, 1.0f);
      /**
      * Body shaping: thin legs
      * Valid values: [-1,1].
       */
      engine.updateBodyShape(BodyShapeType.kThinLag, 1.0f);
      /**
      * Body shaping: long legs
      * Valid values: [-1,1].
       */
      engine.updateBodyShape(BodyShapeType.kLongLag, 1.0f);
      /**
      * Body shaping: neck
      * Valid values: [-1,1].
       */
      engine.updateBodyShape(BodyShapeType.kLongNeck, 1.0f);
      /**
      * Body shaping: slim waist
      * Valid values: [-1,1].
       */
      engine.updateBodyShape(BodyShapeType.kThinWaist, 1.0f);
      /**
      * Body shaping: breast enlarging
      * Valid values: [-1,1].
       */
      engine.updateBodyShape(BodyShapeType.kEnhanceBreast, 1.0f);
      /**
      * Body shaping: arms
      * Valid values: [-1,1].
       */
      engine.updateBodyShape(BodyShapeType.kThinArm, 1.0f);
    • Makeup

      Enable makeup

      // The second parameter specifies whether to enable makeup. The third parameter specifies whether to enable debugging.
      // The fourth parameter specifies the makeup mode. This parameter takes effect only on eyebrows. If you set this parameter to BeautyFilterMode.kBMFaceMakeup_High, the deformation level of eyebrows is high. If you set this parameter to BeautyFilterMode.kBMFaceMakeup_Baseline, the deformation level of eyebrows is low.
      engine.enableBeautyType(BeautyFilterType.kMakeup, true, false, BeautyFilterMode.kBMFaceMakeup_Baseline);
      
      // Set the parameters for makeup materials.
      // The first parameter specifies the makeup type.
      // The second parameter specifies the storage path of the makeup material. The value of this parameter can be a path relative to the assets directory, such as /makeup/peach makeup.png, or an absolute path on which you have access permissions.
      // The third parameter specifies how the makeup material is applied to faces. The fourth parameter is reserved.
      engine.setMakeupImage(MakeupType.kMakeupBlush,
                            new String[]{""},     
                            BlendType.kBlendCurve, 15);
      /** 
      * The full face makeup type is kMakeupWhole. You can set a single makeup material to apply makeup to the entire face but cannot adjust the details of each part of the face. 
      * If your SDK version is later than V1.4.0, we recommend that you use mixed makeup. Mixed makeup allows you to adjust the details of each part of the face. The following types of mixed makeup are provided:
      *  1. Tipsy makeup:
      *     For eye shadow, the path is makeup/eyeshadow/naichazong.2.31.png, and the transparency is 0.7. 
      *     For eyelash, the path is makeup/eyelash/yesheng.2.31.png, and the transparency is 0.5. 
      *     For blush, the path is makeup/blush/weixun.2.31.png, and the transparency is 0.8. 
      *     For eyeliner, the path is makeup/eyeliner_292929/wenrou.2.31.png, and the transparency is 0.5. 
      *     For lipstick, the path is makeup/mouth_wumian/standout.2.31.png, and the transparency is 0.5. 
      *     For highlight, the path is makeup/highlight/highlight.2.12.png, and the transparency is 0.4.
      *  2. Freckle makeup:
      *     For eye shadow, the path is makeup/eyeshadow/taohuafen.2.31.png, and the transparency is 0.7. 
      *     For eyelash, the path is makeup/eyelash/yesheng.2.31.png, and the transparency is 0.5. 
      *     For blush, the path is makeup/blush/cool.2.31.png, and the transparency is 0.8. 
      *     For eyeliner, the path is makeup/eyeliner_292929/guima.2.31.png, and the transparency is 0.5. 
      *     For lipstick, the path is makeup/mouth_yaochun/nanguase.2.31.png, and the transparency is 0.5. 
      *     For highlight, the path is makeup/highlight/highlight.2.12.png, and the transparency is 0.4.
      *  3. Lively makeup:
      *     For eye shadow, the path is makeup/eyeshadow/tianchengse.2.31.png, and the transparency is 0.7. 
      *     For eyelash, the path is makeup/eyelash/lingdong.2.31.png, and the transparency is 0.5. 
      *     For blush, the path is makeup/blush/luori.2.31.png, and the transparency is 0.8. 
      *     For eyeliner, the path is makeup/eyeliner_292929/qizhi.2.31.png, and the transparency is 0.5. 
      *     For lipstick, the path is makeup/mouth_yaochun/nanguase.2.31.png, and the transparency is 0.5. 
      *     For highlight, the path is makeup/highlight/highlight.2.12.png, and the transparency is 0.4.
      *  4. Nightclub makeup:
      *     For eye shadow, the path is makeup/eyeshadow/yeqiangwei.2.31.png, and the transparency is 0.7. 
      *     For eyelash, the path is makeup/eyelash/zhixing.2.31.png, and the transparency is 0.5. 
      *     For blush, the path is makeup/blush/shaonv.2.31.png, and the transparency is 0.8. 
      *     For eyeliner, the path is makeup/eyeliner_292929/wenrou.2.31.png, and the transparency is 0.5. 
      *     For lipstick, the path is makeup/mouth_zirun/zhenggongse.2.31.png, and the transparency is 0.5. 
      *     For highlight, the path is makeup/highlight/highlight.2.12.png, and the transparency is 0.4.
       **/
      
      // Set the transparency for the makeup material.
      // The second parameter specifies the transparency of the material. The third parameter is reserved.
      engine.setMakeupAlpha(MakeupType.kMakeupBlush, 0.6f, 0.3f);

      Disable makeup

      engine.setMakeupImage(MakeupType.kMakeupBlush, new String[]{}, BlendType.kBlendNormal, 15);

      Enable the aegyo-sal effect

       // The second parameter specifies whether to enable the aegyo-sal effect. The third parameter specifies whether to enable debugging.
      engine.enableBeautyType(BeautyFilterType.kMakeup, true, false);
      
      engine.setMakeupImage(MakeupType.kMakeupWocan,
                            new String[]{""},   // Only built-in materials for the aegyo-sal effect are used.  
                            BlendType.kBlendCurve, 15);
      
      engine.setMakeupAlpha(MakeupType.kMakeupWocan, 0.6f, 0.3f);
      Disable the aegyo-sal effect
      engine.setMakeupImage(MakeupType.kMakeupWocan, new String[]{}, BlendType.kBlendCurve, 15);
    • Change the hair color
      engine.enableBeautyType(BeautyFilterType.kHairColor, true);
      /**
       * Hair colors are represented by RGB values (floating point numbers).
       * Sample colors:
       *   Blue: [0.3137f, 0.3137f, 0.6275f]
       *   Purple: [0.6078f, 0.3529f, 0.6275f]
       *   Sky blue: [0.3333f, 0.5492f, 0.5491f]
       *   Yellow: [0.6471f, 0.5294f, 0.3529f]
       *   Green: [0.3725f, 0.5882f, 0.3137f]
       *   Brown: [0.3922f, 0.3333f, 0.3137f]
       *   Red: [0.5098f, 0.2745f, 0.2745f]
       */
      engine.setHairColor(
        getQueenParam().hairRecord.colorRed,
        getQueenParam().hairRecord.colorGreen,
        getQueenParam().hairRecord.colorBlue);
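engine.setHairColor expects normalized float components. If you start from an 8-bit RGB color, a simple conversion helper (our own sketch, not part of the SDK) is:

```java
// Hypothetical helper: convert an 8-bit RGB color to the normalized
// float components expected by engine.setHairColor(r, g, b).
final class HairColor {
    static float[] normalize(int r, int g, int b) {
        return new float[]{ r / 255f, g / 255f, b / 255f };
    }
}
```

For example, the blue sample above, [0.3137f, 0.3137f, 0.6275f], corresponds approximately to RGB (80, 80, 160).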
    • Filters
      // Specify whether to enable filters.
      engine.enableBeautyType(BeautyFilterType.kLUT, true);
      // Specify the filter that you want to use. The path of the filter can be a path relative to the assets directory, such as /lookups/lookup_1.png, or an absolute path on which you have access permissions.
      engine.setFilter(lutResPath); 
      // Specify the level of the filter.
      engine.setBeautyParam(BeautyParams.kBPLUT, 1.0f); 
    • Stickers
      // Remove a sticker in the specified path.
      engine.removeMaterial(oldStickerResPath);
      // Add a sticker in the specified path. After a sticker is added, you cannot add it again.
      // The path of the sticker can be a path relative to the assets directory, such as /sticker/baiyang, or an absolute path on which you have access permissions.
      engine.addMaterial(stickerResPath);
    • Chroma key
      /**
       * Set the parameters for the green screen.
       * @param backgroundPath: the path of the background image that you want to replace. If the value is "", chroma key is disabled.
       * @param blueScreenEnabled: specifies whether to use the blue screen. The value true indicates that the blue screen is used. The value false indicates that the green screen is used.
       * @param threshold: Valid values: 20 to 60. Default value: 30. We recommend that you use the default value.    
       */
      engine.setGreenScreen(backgroundPath, blueScreenEnabled, threshold);
    • Background replacement
      // Before you enable background replacement, you can specify the performance mode based on your business requirements. The supported performance modes include Auto Mode, Best Quality Mode, Balance Mode, and Best Performance Mode. If you do not configure this parameter, Auto Mode is used by default.
      engine.setSegmentPerformanceMode(SegmentPerformanceMode.Auto);
      // Remove a background image in the specified path.
      engine.removeMaterial(oldBackgroundResPath);
      // Add a background image in the specified path. After a background image is added, you cannot add it again.
      // The path of the background image can be a path relative to the assets directory, such as /static/xiaomanyao, or an absolute path on which you have access permissions.
      engine.addMaterial(backgroundResPath);
      // In addition to background replacement, you can also enable background bokeh.
      engine.enableBeautyType(BeautyFilterType.kBTBackgroundProcess, true);
      // After you enable background bokeh, the background is blurred by default. You can use the following API operation to set a transparent background. This is suitable for scenarios in which the output is used as a foreground and the background is synthesized.
      engine.setSegmentBackgroundProcessType(BackgroundProcessType.kBackgroundTransparent);
    • AR writing
      /**
       * The first parameter specifies whether to enable AR writing.
       * The second parameter specifies the mode. The value 1 indicates the writing mode, and the value 2 indicates the drawing mode.
       */
      engine.setArWriting(true, getQueenParam().sArWritingRecord.mode);
    • Intelligent dynamic optimization
      // Enable intelligent dynamic optimization.
      engine.enableBeautyType(BeautyFilterType.kBTAutoFilter, true);
      // Disable intelligent dynamic optimization.
      engine.enableBeautyType(BeautyFilterType.kBTAutoFilter, false);
  • Set the parameters for gesture recognition.
    Register and deregister the algorithm callback.
    /** Register the algorithm callback.**/
    // Use the handle of QueenEngine to create an algorithm instance. After you register the algorithm callback, the algorithm is executed.
    // The third parameter specifies the algorithm type. For more information, see the details about the AlgType class.
    // The following example shows a gesture recognition algorithm.
    Algorithm algorithm = new Algorithm(engine.getEngineHandler(), "", com.taobao.android.libqueen.models.AlgType.kAAiGestureDetect);
    algorithm.registerAlgCallBack(new Algorithm.OnAlgDetectListener() {
        @Override
        public int onAlgDetectFinish(int algId, Object algResult) {
            if (algResult instanceof com.taobao.android.libqueen.algorithm.GestureData) {
    // GestureData contains data about static gestures and dynamic gestures.
                com.taobao.android.libqueen.algorithm.GestureData gestureData = (com.taobao.android.libqueen.algorithm.GestureData) algResult;
            }
            return 0;
        }
    });
    
    /** Deregister the algorithm callback.**/
    // If QueenEngine associated with the algorithm is not destroyed, you must manually deregister the algorithm callback. If QueenEngine is destroyed, the algorithm instance automatically becomes invalid. 
    algorithm.unRegisterAlgCallBack();
  • Set the parameters for movement detection.
    Register and deregister the algorithm callback.
    /** Register the algorithm callback.**/
    // Use the handle of QueenEngine to create an algorithm instance. After you register the algorithm callback, the algorithm is executed.
    // The third parameter specifies the algorithm type. For more information, see the details about the AlgType class.
    // The following example shows a movement detection algorithm.
    Algorithm algorithm = new Algorithm(engine.getEngineHandler(), "", com.taobao.android.libqueen.models.AlgType.kQueenAIBodySportDetect);
    algorithm.registerAlgCallBack(new Algorithm.OnAlgDetectListener() {
        @Override
        public int onAlgDetectFinish(int algId, Object algResult) {
            if (algResult instanceof BodyDetectData) {
              BodyDetectData resultData = (BodyDetectData)algResult;
              int sportType = resultData.getBodySportType();
              if (sportType <= 0) {
                // Posture recognition
                int poseType = resultData.getBodyPoseType();
                sb.append("[Posture recognition]:").append(getBodyPoseName(poseType));
              } else {
                // Count the number of specific movements.
                int sportCount = resultData.getBodySportCount();
                sb.append("[Movement]:").append(getBodySportPoseName(sportType)).append("\r\n")
                  .append("[Count]:").append(sportCount);
              }
            }
            return 0;
        }
    });
    
    /** Deregister the algorithm callback.**/
    // If QueenEngine associated with the algorithm is not destroyed, you must manually deregister the algorithm callback. If QueenEngine is destroyed, the algorithm instance automatically becomes invalid. 
    algorithm.unRegisterAlgCallBack();
  • Perform rendering.
    // The transformation matrix of the OES texture. This value can be obtained by using SurfaceTexture.
    float[] transformMatrix = new float[16];
    // Use SurfaceTexture to update the transformation matrix.
    mSurfaceTexture.updateTexImage();
    mSurfaceTexture.getTransformMatrix(transformMatrix);
    
    // Render the texture to the current view. If certificate authentication fails or all effects are disabled, Queen SDK does not render the texture.
    int retCode = engine.renderTexture(transformMatrix);
    
    // For more information, see SDK references.
    // QUEEN_INVALID_LICENSE(-9) indicates that certificate authentication fails.
    // QUEEN_NO_EFFECT(-10) indicates that all effects are disabled.
    // In this case, you must manually render the texture. For more information, see the sample project that is provided in the "References" section of this topic.
    if (retCode == -9 || retCode == -10) {
        mFrameGlDrawer.draw(transformMatrix, mOESTextureId, true);
    }
  • Release the engine.
    // Release the engine.
    engine.release();
  • Configure the resource download feature.
    // Initialize the resource download feature.
    QueenMaterial.getInstance().init(context);
    
    // Configure the resource download listener.
    QueenMaterial.getInstance().setCallback(new QueenMaterial.OnMaterialCallback() {
    
        @Override
        public void onReady(QueenMaterial.MaterialType materialType) {
        // This callback is invoked in a non-UI thread when the resource download is complete. In most cases, close the loading dialog and trigger the business logic.
    
        // Stickers
        String stickerName = "1"; // The relative path of the sticker.
        String stickerPath = QueenMaterial.getInstance().getMaterialPath(QueenMaterial.MaterialType.STICKER) + File.separator + stickerName;
        // Pass stickerPath to QueenEngine.
        engine.addMaterial(stickerPath);
        }
    
        @Override
        public void onProgress(QueenMaterial.MaterialType materialType, int currentSize, int totalSize, float progress) {
        // This callback is invoked in a non-UI thread to report the download progress. In most cases, update the progress UI.
        }
    
        @Override
        public void onError(QueenMaterial.MaterialType materialType) {
        // This callback is invoked in a non-UI thread when a resource download error occurs. In most cases, close the loading dialog and prompt the user to check the network connection and disk space.
        }
    
    });
    
    // Trigger resource download. You can download algorithm models, stickers, makeup resources, or filters based on your business requirements.
    // Note 1: You need to download an algorithm model only if your SDK edition does not contain one. Download the model before you initialize QueenEngine; otherwise, the retouching effects do not take effect.
    // Note 2: Materials such as stickers, makeup resources, and filters can be packaged into the APK for release, or downloaded and imported at run time.
    if (QueenMaterial.getInstance().requestMaterial(QueenMaterial.MaterialType.MODEL)) {
        showProgressDialog();
    }
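To make the string-building logic in the movement detection callback easier to follow outside an Android project, it can be isolated into plain Java. The following is an illustrative sketch, not an SDK API: the demo project resolves human-readable names through its own getBodyPoseName and getBodySportPoseName helpers, which are replaced here by placeholder labels.

```java
// Illustrative sketch (plain Java, no SDK types) of the result formatting in the
// movement detection callback above. The pose and sport labels are placeholders;
// the demo project resolves real names through its own helper methods.
public class SportResultFormatter {

    public static String format(int sportType, int poseType, int sportCount) {
        StringBuilder sb = new StringBuilder();
        if (sportType <= 0) {
            // Posture recognition branch, mirroring the callback above.
            sb.append("[Posture recognition]:").append("pose-").append(poseType);
        } else {
            // Movement counting branch.
            sb.append("[Movement]:").append("sport-").append(sportType).append("\r\n")
              .append("[Count]:").append(sportCount);
        }
        return sb.toString();
    }
}
```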
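The return-code check in the rendering step can likewise be captured in a small helper. The values -9 (QUEEN_INVALID_LICENSE) and -10 (QUEEN_NO_EFFECT) are the ones documented above; the helper class and method names are illustrative.

```java
// Interprets the return code of engine.renderTexture(...). The numeric values
// come from this topic; this helper class itself is illustrative.
public class RenderResultCodes {
    public static final int QUEEN_INVALID_LICENSE = -9; // Certificate authentication failed.
    public static final int QUEEN_NO_EFFECT = -10;      // All effects are disabled.

    /** Returns true if the caller must render the original texture itself. */
    public static boolean needsFallbackDraw(int retCode) {
        return retCode == QUEEN_INVALID_LICENSE || retCode == QUEEN_NO_EFFECT;
    }
}
```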
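The sticker path in the onReady callback is composed by joining the material root directory and the sticker name. A minimal, platform-independent sketch of that composition follows; the material root value itself still comes from QueenMaterial.getMaterialPath at run time, and the class name here is hypothetical.

```java
import java.io.File;

// Minimal sketch of the path composition used in the onReady callback above.
// The material root is supplied by QueenMaterial at run time; here it is a parameter.
public class StickerPaths {
    public static String stickerPath(String materialRoot, String stickerName) {
        return materialRoot + File.separator + stickerName;
    }
}
```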

Example of Push SDK integration

Push SDK is used in this example to describe how to integrate Queen SDK for Android.

  1. Configure live streaming and enable the preprocessing mode.
    mAliLiveConfig.customPreProcessMode = CUSTOM_MODE_VIDEO_PREPROCESS;
  2. Initialize QueenEngine.
    engine = new QueenEngine(this, false);
  3. Register a video preprocessing callback to obtain the processed texture.
    public int onTextureInput(int inputTexture, int textureWidth, int textureHeight) {
        // mMediaChainEngine is the QueenEngine instance that is created in the preceding step.
        glThreadId = Thread.currentThread().getId();

        if (mMediaChainEngine == null || !isBeautyEnable) {
            return inputTexture;
        }
        updateSettings();

        // Save the currently bound framebuffer so that it can be restored after rendering.
        int[] oldFboId = new int[1];
        GLES20.glGetIntegerv(GLES20.GL_FRAMEBUFFER_BINDING, IntBuffer.wrap(oldFboId));

        mMediaChainEngine.setInputTexture(inputTexture, textureWidth, textureHeight, false);

        // If the screen rotates, you must recreate the texture and set the texture size.
        if (lastTextureWidth != textureWidth || lastTextureHeight != textureHeight) {
            if (mOutTexture2D != null) {
                mOutTexture2D.release();
                mOutTexture2D = null;
            }
            lastTextureWidth = textureWidth;
            lastTextureHeight = textureHeight;
            mMediaChainEngine.setScreenViewport(0, 0, textureWidth, textureHeight);
        }

        if (mOutTexture2D == null) {
            mOutTexture2D = mMediaChainEngine.autoGenOutTexture();
        }

        if (mOutTexture2D == null) {
            return inputTexture;
        }

        long now = SystemClock.uptimeMillis();

        boolean hasRunAlg = false;
        if (USE_FRAME_SYNCHRONIZED) {
            mMediaChainEngine.setInputFlip(Flip.kNone);
            if (outAngle == 90 || outAngle == 270) { // right out = 90 / left out = 270
                mMediaChainEngine.setRenderAndFaceFlip(Flip.kFlipY, Flip.kNone);
                mMediaChainEngine.updateInputTextureBufferAndRunAlg(360 - outAngle, outAngle, Flip.kFlipY, false);
            } else { // right-side up out = 180 / upside down out = 0
                mMediaChainEngine.setRenderAndFaceFlip(Flip.kNone, Flip.kFlipY);
                mMediaChainEngine.updateInputTextureBufferAndRunAlg(180 - outAngle, 180 - outAngle, Flip.kNone, false);
            }
            hasRunAlg = true;
        } else if (mAlgNativeBufferPtr != 0) {
            mMediaChainEngine.updateInputNativeBufferAndRunAlg(mAlgNativeBufferPtr,
                    mAlgDataFormat, mAlgDataWidth, mAlgDataHeight, nAlgDataStride,
                    inputAngle, outAngle, flipAxis);
            hasRunAlg = true;
        }

        int retCode = mMediaChainEngine.render();
        isAlgDataRendered = true;

        Log.i(TAG, Thread.currentThread().getId() + " - render : "
                + (SystemClock.uptimeMillis() - now) + "ms, hasRunAlg: " + hasRunAlg
                + ", textureW: " + textureWidth + ", textureH: " + textureHeight
                + ", outAngle: " + outAngle);

        if (retCode == -9 || retCode == -10) {
            Log.d(TAG, "queen error code:" + retCode + ", please ensure that the license is valid");
            return inputTexture;
        }

        // Restore the previously bound framebuffer.
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, oldFboId[0]);

        return mOutTexture2D.getTextureId();
    }
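The angle arithmetic inside onTextureInput can be hard to read inline. The following plain-Java sketch isolates the rotation that is applied to the algorithm input: for landscape output angles (90 or 270) it is 360 - outAngle, and for portrait angles (0 or 180) it is 180 - outAngle. The class name is illustrative.

```java
// Illustrative extraction of the rotation arithmetic from onTextureInput above.
public class RotationMath {

    /** The rotation passed to the algorithm for a given output angle. */
    public static int algInputAngle(int outAngle) {
        if (outAngle == 90 || outAngle == 270) { // right out = 90 / left out = 270
            return 360 - outAngle;
        }
        return 180 - outAngle; // right-side up out = 180 / upside down out = 0
    }
}
```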