This topic answers frequently asked questions about using the short video SDK for Android.
SDK integration
What Android instruction sets are compatible?
If your project supports only the armeabi instruction set, copy the .so files from the armeabi-v7a folder in the SDK package to your project's armeabi folder.
What do I do if the debug version runs correctly, but the release version crashes on startup?
Symptom: After you integrate the SDK, the debug version runs correctly, but the release version crashes on startup.
Possible cause: Incorrect obfuscation configuration.
Solutions:
1. Check the crash log for a message indicating that the Java Native Interface (JNI) cannot find the corresponding Java class.
2. If the message exists, the crash is typically caused by obfuscation. JNI uses reflection to call Java classes. If obfuscation renames the SDK's internal classes that JNI depends on, the call fails because the corresponding Java classes cannot be found. To fix this, copy the obfuscation configuration from the demo into the obfuscation configuration file of your project.
3. If the message does not exist, submit a ticket for assistance from Alibaba Cloud technical support.
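The demo's ProGuard file contains the authoritative rules. As an illustration only, keep rules for JNI-called classes generally take the following shape; the com.aliyun package prefix here is an assumption, so copy the exact rules from the demo rather than writing your own:

```
# Illustrative only: keep SDK classes that JNI resolves by reflection,
# so that obfuscation does not rename them.
# Copy the real rules from the demo's proguard-rules.pro.
-keep class com.aliyun.** { *; }
-dontwarn com.aliyun.**
```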
How do I add a hardware encoding blacklist and a hardware decoding whitelist?
Add a hardware encoding blacklist:

```java
/**
 * Add a blacklist for hardware encoding. The models and versions must correspond to each other.
 * Models in the blacklist use software encoding. Models outside the blacklist use hardware encoding.
 * @param models A list of device models. {@link Build#MODEL}
 * @param versions A list of system version numbers. {@link Build.VERSION#SDK_INT}. If you do not need to specify a version, enter 0.
 */
NativeAdaptiveUtil.encoderAdaptiveList(String[] models, int[] versions);
```

Add a hardware decoding whitelist:

```java
/**
 * Add a whitelist for hardware decoding. The models and versions must correspond to each other.
 * If hardware decoding is enabled, models in the whitelist use hardware decoding. Models outside the whitelist use software decoding.
 * @param models A list of device models. {@link Build#MODEL}
 * @param versions A list of system version numbers. {@link Build.VERSION#SDK_INT}. If you do not need to specify a version, enter 0.
 * @see #setHWDecoderEnable(boolean)
 */
NativeAdaptiveUtil.decoderAdaptiveList(String[] models, int[] versions);
```
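A minimal sketch of how the two index-aligned arrays are built. The device model strings below are illustrative, not an official blacklist, and the NativeAdaptiveUtil calls are commented out because they run only on an Android device:

```java
public class AdaptiveListExample {
    // Returns a human-readable description of one blacklist/whitelist entry.
    static String describe(String model, int version) {
        return model + " -> " + (version == 0 ? "all versions" : "SDK " + version);
    }

    public static void main(String[] args) {
        // The two arrays must be index-aligned: versions[i] belongs to models[i].
        String[] models = {"SM-G9300", "MI 5"}; // illustrative device models
        int[] versions = {23, 0};               // 0 = match every system version
        if (models.length != versions.length) {
            throw new IllegalArgumentException("models and versions must have the same length");
        }
        for (int i = 0; i < models.length; i++) {
            System.out.println(describe(models[i], versions[i]));
        }
        // On Android, pass the same arrays to the SDK:
        // NativeAdaptiveUtil.encoderAdaptiveList(models, versions);
        // NativeAdaptiveUtil.decoderAdaptiveList(models, versions);
    }
}
```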
What do I do if the Basic Edition of the short video SDK for Android returns the following error: java.lang.NoSuchFieldError: No field height of type I in class Lcom/aliyun/snap/snap_core/R$id; or its superclasses (declaration of ‘com.aliyun.snap.snap_core.R$id’ appears in /data/app/com.rablive.jwrablive-2/base.apk:classes2.dex)?
Symptom: The Basic Edition of the short video SDK for Android returns the following error: java.lang.NoSuchFieldError: No field height of type I in class Lcom/aliyun/snap/snap_core/R$id; or its superclasses (declaration of ‘com.aliyun.snap.snap_core.R$id’ appears in /data/app/com.rablive.jwrablive-2/base.apk:classes2.dex).
Possible cause: This error occurs because a name conflict exists between an XML file in your project and an XML file in the Android Archive (AAR) package of the SDK.
Solution: Find the conflicting XML file and add a prefix to its name. XML files that often cause conflicts include activity_setting.xml and activity_video_play.xml.
What do I do if the Basic Edition of the short video SDK for Android returns the following error: java.lang.NoSuchFieldError: No static field notification_template_lines of type I in class Lcom/aliyun/snap/snap_core/R$layout; or its superclasses (declaration of 'com.aliyun.snap.snap_core.R$layout' appears in /data/app/com.Aliyun.AliyunVideoSDK.VodSaaSDemo_android-1/base.apk)?
This error occurs because the user interface (UI) of the short video SDK Basic Edition is not open source. The SDK references the Android support package, but this package is not included when the SDK is compiled into an AAR file. This leads to an ID mismatch. To resolve this issue, ensure the support package versions match, as shown in the following code:
```groovy
// Important: If a third-party library that you import into your project also imports a support package, you must ensure that the package versions match. We recommend that you import the third-party library from the source code.
compile 'com.android.support:appcompat-v7:24.2.1'
compile 'com.android.support:design:24.2.1'
```

Sometimes, modifying a third-party library that includes a support package is difficult, especially if the library was not imported from source code. In this case, configure the Gradle file in your application as follows:
```groovy
configurations.all {
    resolutionStrategy {
        force 'com.android.support:appcompat-v7:24.2.1'
        force 'com.android.support:design:24.2.1'
    }
}
```

What do I do if the "Please invoke the FileDownloader#init in Application#onCreate first" message appears when I use the demo?
Call DownloaderManager.getInstance().init(Context context); in the onCreate method of your Application class.
Does the SDK provide an API to retrieve a video thumbnail?
The Professional Edition of the short video SDK for Android provides the AliyunIThumbnailFetcher interface, which you can use to retrieve thumbnails from non-keyframes. For other editions, use a system function to retrieve video frames.
Video recording
How do I add a regular animated sticker?
To add a regular animated sticker, use the AliyunIRecorder#addPaster(EffectPaster effectPaster,float sx,float sy,float sw,float sh,float rotation,boolean flip) interface and configure the EffectPaster object. Set the isTrack parameter in EffectPaster to false. Otherwise, the sticker is treated as a face sticker that tracks a human face. The sticker does not appear if no face is detected. Also, call this interface after the RecordCallback#OnInitReady() callback. Otherwise, the sticker is not displayed.
How do I set the recording angle?
You can use the setRotation and setRecordRotation interfaces to set the recording angle. The setRotation interface is adaptive. Pass the angle value from the phone's angle sensor to this interface to obtain the correct recording and face angles. The setRecordRotation interface lets you set a custom video angle.
How do I record in landscape mode?
To set the default recording orientation to landscape, keep the screen orientation in portrait mode and rotate only the UI elements to guide users to record in landscape mode.
Set the following attribute on the recording activity in AndroidManifest.xml:

```xml
android:screenOrientation="portrait"
```

A video recorded in landscape mode has a rotation angle. The angle is determined by the first recorded segment.
With the Professional Edition, the composed video does not have a rotation angle after editing. For example, an original 360 × 640 video with a 270-degree angle becomes a 640 × 360 video. The Basic and Standard editions only support recording. If you use these editions to record in landscape mode, the output video retains a rotation angle based on the first segment. The recording behavior is the same across all editions, but only the Professional Edition removes the rotation angle during composition, swapping the video's width and height.
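The width and height swap described above can be sketched as a small helper. This is illustrative only; the SDK performs this internally during composition:

```java
public class RotationDimens {
    // Returns the {width, height} of the composed output after a rotation
    // tag of 0, 90, 180, or 270 degrees is baked into the video.
    // A 90- or 270-degree rotation swaps width and height.
    static int[] composedSize(int width, int height, int rotation) {
        return (rotation % 180 != 0) ? new int[]{height, width}
                                     : new int[]{width, height};
    }

    public static void main(String[] args) {
        // Example from the text: a 360 x 640 video with a 270-degree angle
        // becomes a 640 x 360 video after composition.
        int[] out = composedSize(360, 640, 270);
        System.out.println(out[0] + " x " + out[1]);
    }
}
```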
Key interface function:

```java
/**
 * Set the rotation angle of the video.
 * @param rotation
 */
void setRotation(int rotation);
```

Prerequisites: Set the rotation angle after initialization is complete and before you record the first segment.
To call this operation, lock the interface to portrait orientation, and then set the rotation angle:
1. Set the screen to portrait mode and rotate the UI elements to guide users to record in landscape mode.
2. Initialize the recording as you would for a normal recording.
3. Call this operation before you start recording. You must obtain the rotation angle yourself. Refer to the demo and use OrientationDetector to obtain the orientation.

```java
mRecorder.setRotation(rotation);
```

4. Continue the recording steps. Before each call to startRecording, set the rotation angle to determine the angle for each video segment.
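A minimal sketch of turning a raw orientation-sensor angle into a value for setRotation. The quantize-to-90-degrees logic mirrors what an OrientationDetector-style helper computes, but the exact behavior of the demo's class may differ:

```java
public class RotationHelper {
    // Quantize a raw orientation angle (0-359 degrees, as reported by an
    // orientation listener) to the nearest multiple of 90 degrees, which is
    // the granularity that video rotation tags use.
    static int toRotation(int orientationDegrees) {
        return ((orientationDegrees + 45) / 90 * 90) % 360;
    }

    public static void main(String[] args) {
        int[] samples = {0, 80, 95, 175, 269, 350};
        for (int deg : samples) {
            System.out.println(deg + " -> " + toRotation(deg));
        }
        // On Android, pass the result to the recorder before each segment:
        // mRecorder.setRotation(toRotation(currentOrientation));
    }
}
```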
Why is the background music not included in the final video after I call finishRecordForEditor?
After adding background music, you must call the finishRecording interface to merge the music into the video. Otherwise, the video does not have music when you open it in the editor. The differences between finishRecording and finishRecordForEditor are as follows:
finishRecording: This interface has two functions. First, when you record multiple segments, it merges them into a single MP4 file, which is the specified output file. Second, after you add background music, it also merges the music into the output MP4 file for both single and multiple segments.
finishRecordForEditor: This interface does not merge video segments or background music. It only adds the recorded segment (created between startRecording and stopRecording) to the project.json configuration file. The URI passed when you create the Editor is the URI of this file.
How do I edit multiple recorded segments that have background music?
First, call finishRecording to merge the video segments into an MP4 file at the output path. Then, use the AliyunIImport interface to import the MP4 file into the editor.
What do I do if an animated sticker is not fully displayed on the recording screen?
You need to retrieve the width and height of the animated image asset.
```java
AliyunIRecorder#addPaster(EffectPaster effectPaster, float sx, float sy, float sw, float sh, float rotation, boolean flip)
```

The sw and sh parameters must match the ratio of the asset's dimensions to the screen's dimensions. For example, if the asset is 200 pixels wide and 300 pixels high, and the screen is 540 pixels wide and 720 pixels high, set sw = (float)200/540 and sh = (float)300/720. The sx and sy parameters are normalized screen coordinates, with the anchor point at the center of the resource.
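The arithmetic above can be sketched as follows. The asset and screen sizes come from the example in the text; the addPaster call is commented out because it runs only on Android:

```java
public class StickerScale {
    // Normalize an asset dimension against the preview dimension, as required
    // by the sw/sh parameters of addPaster.
    static float normalize(float assetPx, float screenPx) {
        return assetPx / screenPx;
    }

    public static void main(String[] args) {
        // Example from the text: a 200 x 300 asset on a 540 x 720 preview.
        float sw = normalize(200f, 540f);
        float sh = normalize(300f, 720f);
        // sx/sy are normalized screen coordinates; 0.5f/0.5f centers the
        // sticker, because the anchor point is the center of the resource.
        float sx = 0.5f, sy = 0.5f;
        System.out.println("sw=" + sw + ", sh=" + sh);
        // On Android:
        // recorder.addPaster(effectPaster, sx, sy, sw, sh, 0f, false);
    }
}
```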
Video editing
Why are the effects I added during editing missing from the composed video?
In versions 3.5.0 and earlier, you must call the AliyunIEditor.onPause interface to save the effects from the editor preview to the local configuration file. If you do not call this interface, the effect settings are not saved. As a result, the project does not include the effects when it is deserialized by the AliyunICompose interface, and the composed video will not contain them.
When editing, I call applyMusic to add music and set startTime. Why does the music stream always play from the beginning instead of from startTime?
In the EffectBean class, the startTime parameter specifies the time on the main stream when the effect starts, not the start time within the material stream. Version 3.6.0 and later include the streamStartTime parameter, which specifies the start time within the material stream. If you use version 3.6.0 or later, you can use this parameter. In earlier versions, you must first trim the music stream and then use the trimmed music as the background music.
How do I add a GIF file as a main stream?
In versions earlier than 3.7.0, GIF files are imported as images. In versions 3.7.0 and later, if a GIF file is imported as a video, all its frames are played. If it is imported as an image, only the first frame is played.
Why does the video freeze after I set a transition or call applySourceChange?
After you perform these operations, you must call the mAliyunIEditor.play() interface.