This topic provides answers to some commonly asked questions about the short video SDK for Android.

SDK integration

What Android instruction sets does the short video SDK support?

The SDK package provides .so libraries on a per-instruction-set basis. If your project needs to support only the armeabi instruction set, copy the .so files from the armeabi-v7a folder of the SDK package to the armeabi folder of your project.

What do I do if the debug version of an application integrated with the short video SDK runs properly, but the release version crashes during startup?

Symptom: The release version of an application integrated with the short video SDK crashes during startup, while the debug version runs properly.

Possible cause: obfuscation configuration error.

Solution:
  1. Check whether the log that records the crash reports a message indicating that JNI cannot find the correct Java class.
    • If such a message is reported, the issue is usually caused by obfuscation. JNI locates Java classes by reflection, so if the JNI-related classes inside the SDK are obfuscated, JNI cannot find the correct Java classes when it loads them. As a result, the loading fails.
    • If no such message is reported, submit a ticket to contact technical support.
  2. Copy the obfuscation configuration in the demo of the short video SDK to your project.
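If you cannot copy the demo's file directly, the following is a minimal sketch of the keep rules it typically contains; the package names (com.aliyun, com.duanqu) are assumptions, so verify them against the proguard-rules.pro file in the demo:
# Keep the SDK classes that JNI loads by reflection, and suppress related warnings.
-keep class com.aliyun.** { *; }
-keep class com.duanqu.** { *; }
-dontwarn com.aliyun.**
-dontwarn com.duanqu.**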

How do I add a blacklist for hardware encoding and a whitelist for hardware decoding?

  • Run the following code to add a blacklist for hardware encoding:
    /**
     * Add a blacklist for hardware encoding. The models and versions correspond to each other in the order that they are added to the lists.
     * If the blacklist for hardware encoding is enabled, models in the blacklist use software encoding, whereas other models use hardware encoding.
     * @param models List of models {@link Build#MODEL}
     * @param versions List of system versions {@link Build.VERSION#SDK_INT}. If you do not need to specify versions, set the value to 0.
     */
    NativeAdaptiveUtil.encoderAdaptiveList(String[] models, int[] versions);
  • Run the following code to add a whitelist for hardware decoding:
    /**
     * Add a whitelist for hardware decoding. The models and versions correspond to each other in the order that they are added to the lists.
     * If the whitelist for hardware decoding is enabled, models in the whitelist use hardware decoding, whereas other models use software decoding.
     * @param models List of models {@link Build#MODEL}
     * @param versions List of system versions {@link Build.VERSION#SDK_INT}. If you do not need to specify versions, set the value to 0.
     * @see #setHWDecoderEnable(boolean)
     */
    NativeAdaptiveUtil.decoderAdaptiveList(String[] models, int[] versions);
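For example, the following sketch blacklists two placeholder models for hardware encoding and whitelists the same models for hardware decoding (the model strings are hypothetical, not a recommended list):
// Models are matched against Build.MODEL; versions against Build.VERSION.SDK_INT.
String[] models = {"MODEL-A", "MODEL-B"}; // placeholder model names
int[] versions = {0, 0};                  // 0 = apply to all system versions
NativeAdaptiveUtil.encoderAdaptiveList(models, versions); // these models use software encoding
NativeAdaptiveUtil.decoderAdaptiveList(models, versions); // these models use hardware decoding
NativeAdaptiveUtil.setHWDecoderEnable(true);              // enable the hardware decoding whitelist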

What do I do if the following error is returned when I use the short video SDK Basic Edition for Android: java.lang.NoSuchFieldError: No field height of type I in class Lcom/aliyun/snap/snap_core/R$id; or its superclasses (declaration of 'com.aliyun.snap.snap_core.R$id' appears in /data/app/com.rablive.jwrablive-2/base.apk:classes2.dex)?

Symptom: The following error is returned when the short video SDK Basic Edition for Android is used: java.lang.NoSuchFieldError: No field height of type I in class Lcom/aliyun/snap/snap_core/R$id; or its superclasses (declaration of 'com.aliyun.snap.snap_core.R$id' appears in /data/app/com.rablive.jwrablive-2/base.apk:classes2.dex).

Possible cause: The name of an XML file in your project is the same as the name of an XML file in the AAR package of the SDK.

Solution: Add a prefix to the conflicting XML file in the project or AAR package so that the names are unique. The following XML files are prone to name conflicts: activity_setting.xml and activity_video_play.xml.

What do I do if the following error is returned when I use the short video SDK Basic Edition for Android: java.lang.NoSuchFieldError: No static field notification_template_lines of type I in class Lcom/aliyun/snap/snap_core/R$layout; or its superclasses (declaration of 'com.aliyun.snap.snap_core.R$layout' appears in /data/app/com.Aliyun.AliyunVideoSDK.VodSaaSDemo_android-1/base.apk)?

This error occurs because the user interface (UI) of the short video SDK Basic Edition is not open source. The short video SDK uses the Android support library, but the support library is not included in the compiled AAR package. As a result, a version mismatch can occur between the support library that the SDK expects and the one in your project. To resolve this issue, import the support library of the matching version, as shown in the following code:
// Note: If the third-party library package that you import to your project uses the support library package, make sure that the version of the support library package is correct. We recommend that you import the source code of the third-party library package to your project.
compile 'com.android.support:appcompat-v7:24.2.1'
compile 'com.android.support:design:24.2.1'
In some cases, the version of the support library used by a third-party library is difficult to change because the third-party library is imported as a prebuilt package rather than as source code. In this case, we recommend that you force the version of the support library in the .gradle file of your application, as shown in the following code:
configurations.all {
    resolutionStrategy {
        force 'com.android.support:appcompat-v7:24.2.1'
        force 'com.android.support:design:24.2.1'
    }
}

What do I do if the "Please invoke the FileDownloader#init in Application#onCreate first" message appears when I use the demo?

Call DownloaderManager.getInstance().init(Context context) in the onCreate() method of your Application class.
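A minimal sketch (DownloaderManager as used in the demo):
import android.app.Application;

public class MyApplication extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        // Initialize the downloader before any page uses FileDownloader.
        DownloaderManager.getInstance().init(this);
    }
}
Remember to register the Application subclass in the android:name attribute of the <application> element in AndroidManifest.xml.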

Does the short video SDK provide an API operation to obtain the thumbnail of a video?

The short video SDK Professional Edition for Android provides AliyunIThumbnailFetcher, which allows you to obtain non-keyframe thumbnails. If you use other editions of the short video SDK for Android, we recommend that you use a system function to obtain the thumbnail of a video.
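For example, the system class MediaMetadataRetriever can extract a frame to use as a thumbnail. A minimal sketch (the file path is a placeholder):
import android.graphics.Bitmap;
import android.media.MediaMetadataRetriever;

MediaMetadataRetriever retriever = new MediaMetadataRetriever();
retriever.setDataSource("/path/to/video.mp4");  // placeholder path
// Retrieve the frame closest to the 1-second mark (the time unit is microseconds).
Bitmap thumbnail = retriever.getFrameAtTime(1000000, MediaMetadataRetriever.OPTION_CLOSEST);
retriever.release();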

Video recording

How do I add a common animated sticker when I record a video?

You can call the AliyunIRecorder#addPaster(EffectPaster effectPaster, float sx, float sy, float sw, float sh, float rotation, boolean flip) operation and configure the EffectPaster parameter to add a common animated sticker. Take note that you must set the isTrack parameter to false. Otherwise, the sticker is added as a face sticker, which can be applied only to human faces and is unavailable if no human face is detected. Call this operation only after the RecordCallback#onInitReady() callback is returned. Otherwise, the added sticker is unavailable.
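A minimal sketch (the path-based EffectPaster constructor and the public isTrack field are assumptions; verify them against your SDK version):
// Build a common (non-face-tracking) animated sticker from a local resource.
EffectPaster paster = new EffectPaster("/path/to/sticker");  // placeholder path
paster.isTrack = false;  // false = common sticker; true would require a detected face
// Center the sticker (sx = sy = 0.5f) and size it to half the screen width and height.
recorder.addPaster(paster, 0.5f, 0.5f, 0.5f, 0.5f, 0f, false);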

How do I set the angle for recording?

You can call the setRotation or setRecordRotation operation to set the angle for recording. The setRotation operation is adaptive: after the angle detected by the orientation sensor of the phone is passed to this operation, the recording angle and face angle are derived from it. Alternatively, you can call the setRecordRotation operation to set a fixed recording angle.

How do I record a video in landscape mode?

  • To set the default recording orientation to landscape, set the application view to portrait and rotate the UI elements to landscape. This guides users to rotate their devices during recording.
    android:screenOrientation="portrait"
  • The application view is used as the benchmark to determine the rotation angle of a recorded video. If a recorded video contains multiple clips, the rotation angle of the first clip in the recorded video is used for all clips.
  • If you use the short video SDK Professional Edition, your application calls the video production operation to produce recorded videos, and the produced videos are rotated to portrait mode to adapt to the application view. For example, assume that a recorded video is 640 pixels wide and 360 pixels high with a rotation angle of 270°. The produced video is then 360 pixels wide and 640 pixels high with a rotation angle of 0°. If you use the short video SDK Basic Edition or Standard Edition, your application produces recorded videos without rotating them to portrait mode. If a recorded video contains multiple clips, the rotation angle of the first clip is used for all clips. Apart from this difference, the recording process is similar across the three editions.
Key operation: setRotation.
/**
* Set the rotation angle of the video.
* @param rotation
*/
void setRotation(int rotation);

Condition: Call the setRotation operation only after a recording instance is initialized and before the first video clip is recorded.

Procedure: Set your application view to portrait mode and set the rotation angle for recording.
  1. Set the application view to portrait mode and rotate UI elements to guide users to rotate their devices during recording.
  2. Initialize a recording instance.
  3. Call the setRotation operation to set the rotation angle. You can call the OrientationDetector operation to obtain the rotation angle. For more information, see the demo.
    mRecorder.setRotation(rotation);
  4. Proceed with recording. Take note that you must set the rotation angle each time before you call the startRecording operation.

Why does the recorded video not contain the background music added during recording after I call the finishRecordForEditor operation?

After you add background music, you must call the finishRecording operation to include the music in the recorded video. Then, you can edit the recorded video with background music in the editing UI. The finishRecording operation and the finishRecordForEditor operation have the following differences:
  • The finishRecording operation merges multiple recorded clips to an MP4 file. After you add one or more pieces of background music, you can also call this operation to include the background music in the MP4 file.
  • In contrast, the finishRecordForEditor operation only adds the video clip that is recorded between the startRecording and stopRecording calls to the project.json file in the specified format. The URI that you specify when you create the editor instance is the URI of this project.json file. The finishRecordForEditor operation does not merge multiple clips or add background music to the recorded video.
To edit multiple recorded video clips with background music, perform the following operations:

Call the finishRecording operation to merge the video clips to an MP4 file and call the AliyunIImport operation to import the MP4 file to the editing UI.
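A hedged sketch of this flow (the onFinish callback that delivers the merged file path is an assumption based on the demo; verify it against your SDK version):
// finishRecording merges the recorded clips and the background music into one MP4
// file. The demo receives the output path in the RecordCallback#onFinish callback
// and then imports that file with AliyunIImport before opening the editing UI.
recorder.finishRecording();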

What do I do if the recording UI displays only a part of a common animated sticker?

Obtain the width and height of the material of the common animated sticker, and call the following operation:
AliyunIRecorder#addPaster(EffectPaster effectPaster,float sx,float sy,float sw,float sh,float rotation,boolean flip)
Set the sw parameter to the ratio of the sticker width to the screen width. Set the sh parameter to the ratio of the sticker height to the screen height.

Assume that the sticker width is 200 pixels, the sticker height is 300 pixels, the screen width is 540 pixels, and the screen height is 720 pixels. In this case, you must write the following code: sw = (float)200/540, sh = (float)300/720. The sx and sy parameters are normalized values that specify the position of the sticker on the screen. The center of the sticker is used as the anchor.
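Putting it together (paster is a hypothetical EffectPaster instance; 0.5f, 0.5f places the sticker's center at the screen center):
float sw = (float) 200 / 540;  // sticker width as a fraction of the screen width
float sh = (float) 300 / 720;  // sticker height as a fraction of the screen height
// sx and sy position the center of the sticker in normalized screen coordinates.
recorder.addPaster(paster, 0.5f, 0.5f, sw, sh, 0f, false);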

Video editing

I added effects to a video during editing. However, the video that is produced by calling the AliyunICompose.compose operation does not contain the effects. Why does this happen?

In the short video SDK for Android V3.5.0 or earlier, you must call the AliyunIEditor.onPause operation to persist the added effects to a local configuration file. If you do not call this operation, the effects are not written to the local configuration file. As a result, the Project object that the AliyunICompose operation deserializes from that file does not contain the effects, and neither does the produced video.
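A minimal sketch for V3.5.0 and earlier (mAliyunIEditor and mAliyunICompose are hypothetical instances; the compose arguments are omitted):
// Persist the added effects to the local project configuration file first.
mAliyunIEditor.onPause();
// Then start composing. AliyunICompose now deserializes a Project that contains the effects.
// mAliyunICompose.compose(...);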

When I edit a video, I call the applyMusic operation to add a music stream to the video. Then, I set the startTime parameter to specify the point in time when the music starts to play. However, the music always starts from the beginning. Why does this happen?

In the EffectBean class, the startTime parameter specifies a point in time in the main stream, not in the effect material stream. The short video SDK V3.6.0 and later adds the streamStartTime parameter, which allows you to specify a point in time in the effect material stream. After you specify this parameter, the effect material starts playing from that point. If you use the short video SDK V3.6.0 or later, specify the streamStartTime parameter to meet your business requirements. If you use an earlier version, you must crop the effect material stream before you add it as background music.

For example, assume that the music file is 10 seconds long, the startTime parameter is set to 10 seconds, the streamStartTime parameter is set to 3 seconds, and the effect duration is set to 10 seconds. When the main stream reaches the 10-second mark, the background music starts to play from the 3-second mark of the effect material stream. When the effect material stream reaches the 10-second mark, the background music starts to play again from the 3-second mark. When the effect material stream reaches the 6-second mark for the second time, the main stream reaches the 20-second mark. At this point in time, the background music stops playing.
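A minimal sketch for V3.6.0 and later that matches this example (the setter names and the millisecond time unit are assumptions; verify them against your SDK version):
EffectBean music = new EffectBean();
music.setPath("/path/to/music.mp3");  // placeholder path of the music file
music.setStartTime(10 * 1000);        // start when the main stream reaches 10 s
music.setStreamStartTime(3 * 1000);   // start from the 3-second mark of the music stream
music.setDuration(10 * 1000);         // apply the music for 10 s of the main stream
mAliyunIEditor.applyMusic(music);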

How do I add GIF files to a main stream?

In versions earlier than V3.7.0, the short video SDK allows you to add GIF files to a main stream only as images. The short video SDK V3.7.0 and later allows you to add GIF files to a main stream as videos or images. If a GIF file is added as a video, all frames of the GIF file are played. If a GIF file is added as an image, only the first frame of the GIF file is played.

What do I do if a video is stuck after I configure a transition effect or call the applySourceChange operation?

After you configure a transition effect or call the applySourceChange operation, you must call the mAliyunIEditor.play() operation.
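For example (mAliyunIEditor as in the question; the transition or source change is applied immediately before):
// Resume playback so that the transition or source change takes effect in the preview.
mAliyunIEditor.applySourceChange();
mAliyunIEditor.play();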