The WUYING Android SDK is an open API that allows Android clients to connect to WUYING Workspaces, cloud applications, and Cloud Phones. You can integrate the Android SDK to quickly customize and build Android clients.
1. Getting started
1.1 Get the SDK and demo
How to obtain
You can download AndroidSDK_aar.
You can download AndroidSDK_demo.
All documents, SDKs, and client programs on this platform are for personal or enterprise use only. Do not forward them to any third party without the consent of Alibaba Cloud.
Integration requirements
Requires Android 5.1 or later.
Integrate the SDK using an AAR package
Copy the downloaded aspengine-third-release.aar, aspengine-player-release.aar, and wytrace-release.aar files to the app/libs directory.
Add the following dependencies to the build.gradle file of the application module:
dependencies {
implementation fileTree(dir: 'libs', include: ['*.jar', '*.aar'])
// The INI configuration parsing library that the aspengine SDK depends on.
implementation 'org.ini4j:ini4j:0.5.4'
// The libraries that wytrace depends on.
implementation 'com.squareup.okhttp3:okhttp:5.0.0-alpha.8'
implementation 'com.google.code.gson:gson:2.10.1'
implementation 'io.github.aliyun-sls:aliyun-log-android-sdk:2.7.0@aar'
}
Declare the required permissions in the AndroidManifest.xml file:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.VIBRATE"/>
1.2 Integration process
The following figure describes the cloud phone integration process.
1.3 Best practices
For more information, see Cloud Phone quick integration best practices.
Multiple logon methods are available to obtain the ticket required by the SDK to connect to a Cloud Phone. The following flowchart shows the process.
For specific integration code, see the sample code for the lifecycle APIs.
2. Lifecycle APIs
2.1 Initialize and create a StreamView
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
requestWindowFeature(Window.FEATURE_NO_TITLE);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.activity_stream_view_demo);
mStreamView = findViewById(R.id.stream_view);
mStreamView.enableDesktopMode(false);
mStreamView.scaleStreamVideo(pref.getBoolean("fit_video_content", true) ?
        StreamView.ScaleType.FIT_STREAM_CONTENT : StreamView.ScaleType.FILL_STREAM_VIEW);
mStreamView.getASPEngineDelegate().setAlignStreamResolutionWithSurfaceSize(false);
}
The layout file referenced by the preceding code is as follows:
<?xml version="1.0" encoding="utf-8"?>
<android.widget.RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".StreamViewDemoActivity">
<com.aliyun.wuying.aspsdk.aspengine.ui.StreamView
android:id="@+id/stream_view"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:focusableInTouchMode="true"
android:focusable="true"
android:focusedByDefault="true" />
</android.widget.RelativeLayout>
2.2 Establish a connection
For more information about the values of mConfigs, see 4.1 Config.
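Before calling start, the configuration map is assembled from the values described in 4.1 Config. The following is a minimal sketch; the string keys are illustrative stand-ins for the StreamView.CONFIG_* constants, which you should use directly in real code.

```java
import java.util.HashMap;

public class ConnectionConfigSketch {
    // Assembles the key-value pairs passed to StreamView.start(). The literal
    // keys below are stand-ins for the StreamView.CONFIG_* constants in 4.1.
    public static HashMap<String, Object> buildConfigs(String desktopId, String ticket) {
        HashMap<String, Object> configs = new HashMap<>();
        configs.put("CONFIG_DESKTOP_ID", desktopId);       // ResourceId from DescribeUserResources
        configs.put("CONFIG_CONNECTION_TICKET", ticket);   // ticket from GetConnectionTicket
        configs.put("CONFIG_PREFER_RTC_TRANSPORT", true);  // recommended true for Cloud Phones
        configs.put("CONFIG_ENABLE_VDAGENT_CHECK", true);
        configs.put("CONFIG_ENABLE_STATISTICS", true);
        configs.put("OSType", "android");                  // Cloud Phone OS type
        return configs;
    }
}
```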
mStreamView.start(mConfigs);
2.3 Disconnect
mStreamView.stop();
2.4 Destroy the StreamView instance
mStreamView.dispose();
mStreamView = null;
2.5 Multi-StreamView mode
This mode allows a stream to be seamlessly switched between multiple views. Perform the following steps:
Define and initialize the StreamView. For more information, see section 2.1.
Use IASPEngine to establish a connection. The parameters are the same as those defined in 4.1 Config.
IASPEngine engine = mBuilder.enableRTC(true).build(context);
// Enable data statistics.
engine.enableStatistics(true, true);
ConnectionConfig cc = new ConnectionConfig();
cc.id = CONFIG_DESKTOP_ID;
cc.connectionTicket = CONFIG_CONNECTION_TICKET;
cc.useVPC = CONFIG_USE_VPC;
cc.type = OS_TYPE;
cc.user = CONFIG_USER;
cc.uuid = CONFIG_UUID;
engine.start(cc);
Bind the StreamView to the IASPEngine. This API renders the stream in the current view.
mStreamView.bindASPEngine(engine);
Resume the binding between the StreamView and the IASPEngine. This API allows the view to resume rendering the stream.
mStreamView.resumeASPEngine();
For more details, see the implementation in the demo.
2.6 Callback description
Connection callback listener: IASPEngineListener
API description:
API | Description |
onConnectionSuccess(int connectionId) | Callback for a successful connection to the Cloud Phone. The connection ID is returned. |
onConnectionFailure(int errorCode, String errorMsg) | Callback for a failed connection to the Cloud Phone. The error code and error message are returned. |
onEngineError(int errorCode, String errorMsg) | Callback for an internal SDK exception. The error code and error message are returned. |
onDisconnected(int reason) | The connection to the Cloud Phone is disconnected. The reason for the disconnection is returned. |
onReconnect(int errorCode) | The connection to the Cloud Phone is being re-established. The error code that caused the reconnection is returned. |
onFirstFrameRendered(long timeCostMS) | Callback for when the first frame is displayed on the Cloud Phone. The time consumed is returned. |
onPolicyUpdate(String policy) | Callback for a Cloud Phone policy. The policy configuration is returned. |
onSessionSuccess() | Callback for the successful creation of a Cloud Phone connection session. |
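The connection listener above might be implemented as in the following sketch. The EngineListenerSketch interface is a stand-in that mirrors a subset of the IASPEngineListener callbacks so the snippet is self-contained; in a real client, implement the SDK interface itself.

```java
// Stand-in for a subset of the IASPEngineListener callbacks described above.
interface EngineListenerSketch {
    void onConnectionSuccess(int connectionId);
    void onConnectionFailure(int errorCode, String errorMsg);
    void onDisconnected(int reason);
    void onReconnect(int errorCode);
    void onFirstFrameRendered(long timeCostMS);
}

public class DemoEngineListener implements EngineListenerSketch {
    public String lastEvent = "";

    @Override public void onConnectionSuccess(int connectionId) {
        lastEvent = "connected:" + connectionId;
    }
    @Override public void onConnectionFailure(int errorCode, String errorMsg) {
        lastEvent = "failed:" + errorCode;       // e.g. surface the error to the UI
    }
    @Override public void onDisconnected(int reason) {
        // Reason code 2200 typically calls for obtaining a new connection token
        // and calling reconnect(), as described in the service APIs.
        lastEvent = "disconnected:" + reason;
    }
    @Override public void onReconnect(int errorCode) {
        lastEvent = "reconnecting:" + errorCode;
    }
    @Override public void onFirstFrameRendered(long timeCostMS) {
        lastEvent = "firstFrame:" + timeCostMS;  // useful for startup-latency metrics
    }
}
```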
Performance data callback listener: IStatisticsListener
API description:
API | Description |
onStatisticsInfoUpdate(StatisticsInfo statisticsInfo) | Callback for Cloud Phone performance data. A performance data object is returned. |
System permission request callback listener: IRequestSystemPermissionListener
API description:
API | Description |
bool OnRequestSystemPermission(SystemPermission permission) | Callback for a system permission request. The type of system permission requested is returned. |
Sample code for registering and unregistering callbacks:
// Listen for connection callbacks.
mStreamView.getASPEngineDelegate().registerASPEngineListener(IASPEngineListener var1);
mStreamView.getASPEngineDelegate().unregisterASPEngineListener(IASPEngineListener var1);
// Listen for performance data callbacks.
mStreamView.getASPEngineDelegate().registerStatisticsListener(IStatisticsListener var1);
mStreamView.getASPEngineDelegate().unregisterStatisticsListener(IStatisticsListener var1);
// Listen for system permission requests.
mStreamView.registerSystemPermissionListener(IRequestSystemPermissionListener listener);
mStreamView.unregisterSystemPermissionListener(IRequestSystemPermissionListener listener);
3. Service APIs
API | Description |
enableVDAgentCheck(boolean enabled) | Specifies whether to check the availability of VDAgent when establishing a connection. This check is enabled by default. If this parameter is set to true and VDAgent is unavailable during connection establishment, an error is reported and the connection is disconnected. We recommend that you do not set this parameter to false. This setting is intended for internal testing only. |
enableRTC(boolean enabled) | Specifies whether to use Real-Time Communication (RTC) to transmit streaming data. This feature is enabled by default. |
enableDesktopMode(boolean enabled) | Specifies whether to enable desktop mode. When enabled, all touch messages are converted to mouse events and sent to the server. We recommend that you set this parameter to false for Cloud Phones. |
scaleStreamVideo(ScaleType scaleType) | Specifies the scaling method for the video stream. For more information about the values of ScaleType, see section 5.1 ScaleType. |
setVideoProfile(int width, int height, int fps, IRemoteResult result) | Sets the resolution and frame rate of the video stream. The frame rate setting is not currently supported. |
boolean sendKeyEvent(KeyEvent event) and sendKeyboardEvent(KeyEvent event, IRemoteResult result) | Sends keyboard events to the cloud. |
boolean simulateMouseClick(boolean leftButton) | Simulates a mouse click event. Set the parameter to true for a left-click or false for a right-click. |
boolean enableMouseMode(boolean enabled) | Enables or disables mouse mode. |
boolean sendMouseEvent(MotionEvent motionEvent) and sendMouseEvent(MotionEvent motionEvent, IRemoteResult result) | Sends mouse events to the cloud. |
reconnect(String connectionToken) | Call this method to reconnect after an unexpected disconnection. This action is typically performed when the disconnection reason code is 2200. The application must call the OpenAPI to obtain a new connection token for the Cloud Phone and pass it to this method. |
boolean setMediaStreamPlayer(MediaStreamPlayer player) | Replaces the default media engine in the SDK with a custom one provided by the application. This method can be called only before the stream starts or after the stream stops. |
void setAlignStreamResolutionWithSurfaceSize(boolean aligned) | Specifies whether to automatically align the stream resolution with the size of the client-side SurfaceView used for rendering when the stream starts. This feature is enabled by default. We recommend that you disable this feature for Cloud Phones. |
void mute(boolean muted) | Specifies whether to mute the audio stream. |
void enableStatistics(boolean enabled) | Specifies whether to enable performance statistics. |
mStreamView.getASPEngineDelegate().requestIFrame() | Requests a keyframe. |
registerFileTransferListener(IFileTransferListener var1) and unregisterFileTransferListener(IFileTransferListener var1) on mStreamView.getASPEngineDelegate(), together with mStreamView.getASPEngineDelegate().uploadFiles(pathList, "/sdcard/Download/") | Provides methods for file upload and download. For a sample implementation, see the demo. |
addDataChannel(DataChannel var1) and removeDataChannel(DataChannel var1) on mStreamView.getASPEngineDelegate() | Sends and receives data through a custom data channel alongside the video stream. For a sample implementation, see the demo. |
addLyncChannel(LyncChannel var1) and removeLyncChannel(LyncChannel var1) on mStreamView.getASPEngineDelegate() | Provides a channel for sending adb commands. For the implementation, see AspAdbUtil in the demo. |
void setToQualityFirst() | Sets the stream to quality-first mode. In this mode, the maximum frame rate is 30 FPS and the picture quality is set to excellent. This mode is not currently supported for Cloud Phones. |
void setToFpsFirst() | Sets the stream to FPS-first mode. In this mode, the maximum frame rate is 60 FPS and the picture quality is set to good. This mode is not currently supported for Cloud Phones. |
void setToCustomPicture(int fps, int quality); | Sets a custom mode where you can define the frame rate and picture quality. The fps parameter specifies the frame rate and accepts values from 0 to 60. A higher value results in a smoother stream. The quality parameter specifies the picture quality and accepts values from 0 to 4. The values represent the following quality levels: 0 for lossless, 1 for excellent, 2 for good, 3 for fair, and 4 for auto. This mode is not currently supported for Cloud Phones. |
registerIMEListener, unregisterIMEListener, setImeType, enableRelativeMouse, and similar methods on mStreamView.getASPEngineDelegate() | Not currently supported for Cloud Phones. |
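The setToCustomPicture description above gives explicit parameter ranges (fps 0 to 60, quality 0 to 4). A small sketch of clamping caller input to those documented ranges before the call; the class and method names here are hypothetical helpers, not SDK APIs:

```java
public class CustomPictureParams {
    // Ranges documented for setToCustomPicture: fps 0-60, quality 0-4
    // (0 lossless, 1 excellent, 2 good, 3 fair, 4 auto).
    public static int clampFps(int fps) {
        return Math.max(0, Math.min(60, fps));
    }

    public static int clampQuality(int quality) {
        return Math.max(0, Math.min(4, quality));
    }
}
```

For example, setToCustomPicture(CustomPictureParams.clampFps(userFps), CustomPictureParams.clampQuality(userQuality)) keeps out-of-range input from reaching the SDK.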
4. Parameter details
4.1 Config
The following parameters are used to establish a connection.
Configuration key | Value type | Description |
StreamView.CONFIG_DESKTOP_ID | string | The instance ID. This is the ResourceId returned by the DescribeUserResources API. |
StreamView.CONFIG_CONNECTION_TICKET | string | The connection authentication ticket. Get this by calling the GetConnectionTicket API. |
StreamView.CONFIG_PREFER_RTC_TRANSPORT | boolean | Use the RTC channel. We recommend setting this to true for Cloud Phones. |
StreamView.CONFIG_ENABLE_VDAGENT_CHECK | boolean | Specifies whether to check the availability of VDAgent during connection establishment. We recommend setting this to true for Cloud Phones. |
StreamView.CONFIG_ENABLE_STATISTICS | boolean | Specifies whether to enable performance statistics. If enabled, performance data is displayed over the video stream. We recommend setting this to true for Cloud Phones. |
OSType | string | For Cloud Phones, the value is android. |
4.2 StatisticsInfo
Contains performance data.
API | Type | Description |
mReceiveFps | int | Received frame rate |
mRenderFps | int | Rendering frame rate |
mDownstreamBandwithMBPerSecond | double | Downstream bandwidth |
mUpstreamBandwithMBPerSecond | double | Upstream bandwidth |
mP2pFullLinkageLatencyMS | long | End-to-end full-link latency. Deprecated. |
mNetworkLatencyMS | long | Network RTT latency |
mPingGatewayRttMS | long | Ping RTT latency |
mLostRate | double | Packet loss rate |
mServerRenderLatencyMS | long | Server-side rendering latency |
mServerEncoderLatencyMS | long | Server-side encoding latency |
mServerTotalLatencyMS | long | Total server-side latency |
mTotalDownstreamBandwidth | long | Total bandwidth |
mGuestCpuUsage | long | CPU utilization of the cloud instance (guest)
mStreamType | String | Stream protocol type |
5. Enumeration types
5.1 ScaleType
Specifies the scaling type for streamed image content.
Name | Meaning |
FILL_STREAM_VIEW | Always stretches the streamed image to the same size as the StreamView. If the aspect ratio of the StreamView is different from that of the streamed image, this policy may cause noticeable image distortion. |
FIT_STREAM_CONTENT | Adjusts the rendering area of the StreamView so that it always renders the streamed image content with the same aspect ratio. With this policy, the streamed image may not fill the entire StreamView. |
5.2 SystemPermission
Specifies the type of system permission that can be requested.
Name | Meaning |
RECORDAUDIO | Requests permission to record audio. |
6. Custom MediaStreamPlayer
You can skip this section if you do not need to customize multimedia. The SDK provides a default multimedia implementation.
6.1 Use a custom media engine to process media data
By implementing `com.aliyun.wuying.aspsdk.aspengine.MediaStreamPlayer`, your application can use a custom media engine to process streaming media data. This data includes the following:
Video stream data: Raw video streams that primarily consist of H.264 or H.265 compressed frames.
Adaptive image stream data: Image streams that primarily consist of bitmaps.
Downstream audio data: Downstream audio data streams that primarily consist of Opus or PCM data.
Cursor data: When virtual mouse mode is enabled, the application receives cursor image and position data. The application can use this data to render a virtual cursor.
You can provide a custom media engine implementation to the WUYING SDK by calling the `IASPEngine.setMediaStreamPlayer` API.
6.2 MediaStreamPlayer
`MediaStreamPlayer` is an abstract class. It requires your application to implement global initialization and destruction methods and provide custom implementations for processing different types of media data.

Where:
The `IVideoStreamHandler` interface defines methods for processing video stream data.
The `IAdaptiveGraphicStreamHandler` interface defines methods for processing adaptive image stream data.
The `IAudioPlaybackStreamHandler` interface defines methods for processing downstream audio data.
The `ICursorBitmap` interface defines methods for processing cursor data.
You can choose to implement one or more of the preceding interfaces. The WUYING SDK sets the type of stream sent from the cloud based on the interface implementations that you provide. The rules are as follows:
If you provide implementations for both `IVideoStreamHandler` and `IAdaptiveGraphicStreamHandler`, the stream is set to mixed mode. WUYING automatically switches between the adaptive image stream and the video stream based on the current scenario.
If you provide only an implementation for `IVideoStreamHandler`, the stream is set to video stream only. In this case, the server provides only video stream data.
If you provide only an implementation for `IAdaptiveGraphicStreamHandler`, the stream is set to image stream only. In this case, the server sends only image data.
You can provide custom implementations for different media data to the SDK by implementing the `onCreateXXXHandler` methods of `MediaStreamPlayer`:
@Override
protected IVideoStreamHandler onCreateVideoStreamHandler() {
return new VideoStreamHandler();
}
@Override
protected IAdaptiveGraphicStreamHandler onCreateAdaptiveGraphicStreamHandler() {
return null;
}
@Override
protected IAudioPlaybackStreamHandler onCreateAudioPlaybackStreamHandler() {
return new AudioPlaybackStreamHandler();
}
@Override
protected ICursorBitmapHandler onCreateCursorBitmapHandler() {
return null;
}
In the preceding example, the custom media engine provides implementations for `IVideoStreamHandler` and `IAudioPlaybackStreamHandler`. Each `onCreateXXXHandler` method is executed only once during a streaming process.
The main method call flow is as follows:
6.2.1 initialize
You can implement this method to perform global initialization actions for the custom media engine.
This method is executed once during each streaming process.
public ErrorCode initialize()
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates that the initialization is successful. Otherwise, the initialization failed. |
6.2.2 release
You can implement this method to perform global release actions for the custom media engine.
This method is executed once during each streaming process.
public ErrorCode release()
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates that the release is successful. Otherwise, the release failed. |
6.2.3 enableStatistics
You can implement this method to enable or disable performance statistics collection.
public void enableStatistics(boolean enabled)
Parameters:
Parameter | Type | Description |
enabled | boolean | True indicates that performance data collection is enabled. False indicates that performance data collection is disabled. |
6.2.4 onCreateVideoStreamHandler
You can implement this method to provide the SDK with the media engine implementation for processing video stream data.
This method is executed once during each streaming process.
public IVideoStreamHandler onCreateVideoStreamHandler()
Return value:
Type | Description |
IVideoStreamHandler | The media engine implementation provided by the application for processing video stream data. If the application does not provide a video stream processing implementation, this method returns null. In this case, the video stream data is not processed. |
6.2.5 onCreateAdaptiveGraphicStreamHandler
You can implement this method to provide the SDK with the media engine implementation for processing adaptive image stream data.
This method is executed once during each streaming process.
public IAdaptiveGraphicStreamHandler onCreateAdaptiveGraphicStreamHandler()
Return value:
Type | Description |
IAdaptiveGraphicStreamHandler | The media engine implementation provided by the application for processing adaptive image stream data. If the application does not provide an adaptive image stream processing implementation, this method returns null. In this case, the image stream data is not processed. |
6.2.6 onCreateAudioPlaybackStreamHandler
You can implement this method to provide the SDK with the media engine implementation for processing downstream audio data.
This method is executed once per streaming process.
public IAudioPlaybackStreamHandler onCreateAudioPlaybackStreamHandler()
Return value:
Type | Description |
IAudioPlaybackStreamHandler | The media engine implementation provided by the application for processing downstream audio data. If the application does not provide a downstream audio data processing implementation, this method returns null. In this case, the downstream audio data is not processed. |
6.2.7 onCreateCursorBitmapHandler
You can implement this method to provide the SDK with the media engine implementation for processing cursor data.
This method is executed once during each streaming process.
The interface implementation provided by this method is used only when virtual mouse mode is enabled.
public ICursorBitmapHandler onCreateCursorBitmapHandler()
Return value:
Type | Description |
ICursorBitmapHandler | The media engine implementation provided by the application for processing cursor data. If the application does not provide a cursor data processing implementation, this method returns null. In this case, even if virtual mouse mode is enabled, the cursor position data is not processed. |
6.3 IVideoStreamHandler
This interface defines the main methods for processing video stream data. The main workflow is as follows:
When the application switches between the foreground and the background, the `Surface` used for rendering is destroyed or rebuilt. In this case, `IVideoStreamHandler.setVideoSurface` is called multiple times. When the `Surface` is destroyed, the surface object passed in through `setVideoSurface` is null. Your application must handle the fault tolerance of the decoder and renderer.
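One way to provide the fault tolerance described above is to queue incoming frames while the Surface is null and drain the queue once a valid Surface is supplied again. The following is a sketch only: the surface is typed as Object so it runs off-device, and renderFrame stands in for real decode-and-draw logic.

```java
import java.util.ArrayDeque;

// Sketch of Surface fault tolerance for a custom video handler: frames that
// arrive while the Surface is null (app backgrounded, screen locked) are
// buffered and flushed when setVideoSurface supplies a valid Surface again.
public class SurfaceAwareRenderer {
    private Object surface;                    // stands in for android.view.Surface
    private final ArrayDeque<byte[]> pending = new ArrayDeque<>();
    public int renderedFrames = 0;

    public synchronized void setVideoSurface(Object newSurface) {
        surface = newSurface;
        if (surface != null) {
            while (!pending.isEmpty()) {
                renderFrame(pending.poll());   // drain frames buffered while hidden
            }
        }
    }

    public synchronized void pushVideoFrame(byte[] frame) {
        if (surface == null) {
            pending.add(frame);                // tolerate a destroyed Surface
            return;
        }
        renderFrame(frame);
    }

    private void renderFrame(byte[] frame) {
        renderedFrames++;                      // real code: decode and draw the frame
    }
}
```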
By implementing the `IVideoStreamHandler.setEventHandler` method, your application can obtain the event handling interface provided by the WUYING SDK. Through this interface, your application can notify the WUYING SDK of video processing events that occur within the custom media engine. This is currently used primarily for performance data statistics:
@Override
public void setEventHandler(EventHandler handler) {
Log.i(TAG, "setEventHandler handler " + handler);
VideoStreamEventHandler.getInstance().reset(handler);
}
...
public synchronized void onVideoFrameRendered() {
VFrame frame = mVideoFrame.remove();
if (mEnabled && mHandler != null) {
Event event = new Event();
event.type = EventType.RENDER_PERF_INFO;
event.decodePerfInfo = new VDecodePerfInfo();
event.renderPerfInfo = new VRenderPerfInfo();
event.renderPerfInfo.frameId = frame.frameId;
event.renderPerfInfo.sessionId = frame.sessionId;
// Notify the WUYING SDK that a video frame has been rendered. The SDK calculates the client-side end-to-end latency based on the frameId.
mHandler.callback(event);
}
}
6.3.1 setEventHandler
You can implement this method. When the custom media engine is loaded by the WUYING SDK, the SDK provides an `EventHandler` to your application through this method. You can use this handler to send video stream processing events.
public void setEventHandler(EventHandler handler)
Parameters:
Parameter | Type | Description |
handler | EventHandler | The handler object provided by the WUYING SDK. The application uses this object to send video stream processing events to the SDK. |
6.3.2 addVideoTrack
You can implement this method. When a video stream is created, your application is notified through this method.
Currently, only one video stream can exist in a single streaming process.
ErrorCode addVideoTrack(int trackId, VProfile profile);
Parameters:
Parameter | Type | Description |
trackId | int | The video stream ID. |
profile | VProfile | The video stream information. |
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
6.3.3 setVideoSurface
You can implement this method. When the status of the `Surface` used for video rendering changes, your application is notified through this method.
ErrorCode setVideoSurface(int trackId, Surface surface);
Parameters:
Parameter | Type | Description |
trackId | int | The video stream ID. |
surface | android.view.Surface | The Surface object used for video rendering. This object may be null when the application is switched to the background or the screen is locked. |
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
6.3.4 playVideo
You can implement this method. When the video stream is ready, your application is notified through this method.
ErrorCode playVideo(int trackId);
Parameters:
Parameter | Type | Description |
trackId | int | The video stream ID. |
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
6.3.5 removeVideoTrack
You can implement this method. When the video stream is destroyed, your application is notified through this method.
ErrorCode removeVideoTrack(int trackId);
Parameters:
Parameter | Type | Description |
trackId | int | The video stream ID. |
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
6.3.6 pushVideoFrame
You can implement this method. When a new video frame is received, your application is notified through this method.
ErrorCode pushVideoFrame(int trackId, VFrame frame);
Parameters:
Parameter | Type | Description |
trackId | int | The video stream ID. |
frame | VFrame | Information about the newly received video frame. |
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
6.3.7 getVideoTracks
You can implement this method. The SDK calls this method to retrieve information about all video streams that are currently being processed from your application.
HashMap<Integer, VProfile> getVideoTracks();
Return value:
Type | Description |
HashMap<Integer, VProfile> | Information about all video streams that are currently being processed by the application. |
6.3.8 release
You can implement this method. When all video streams are destroyed, your application is notified through this method to perform cleanup actions.
ErrorCode release();
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
6.4 IAudioPlaybackStreamHandler
This interface defines the main methods for processing audio stream data. The main workflow is as follows:
6.4.1 initAudio
You can implement this method. When an audio channel is created in the SDK, your application is notified through this method.
ErrorCode initAudio();
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
6.4.2 deInitAudio
You can implement this method. When the audio channel in the SDK is destroyed, your application is notified through this method.
ErrorCode deInitAudio();
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
6.4.3 startAudioPlayback
You can implement this method. When the Cloud Phone is about to send an audio stream, your application is notified through this method.
ErrorCode startAudioPlayback();
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
6.4.4 stopAudioPlayback
You can implement this method. When the Cloud Phone stops sending the audio stream, your application is notified through this method.
ErrorCode stopAudioPlayback();
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
6.4.5 pushAudioPlaybackFrame
You can implement this method. When a new downstream audio frame is received, your application is notified through this method.
ErrorCode pushAudioPlaybackFrame(AFrame pbData);
Parameters:
Parameter | Type | Description |
pbData | AFrame | Information about the newly received audio frame. |
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
6.4.6 updateAudioPlaybackVol
You can implement this method. When the system volume on the Cloud Phone changes, your application is notified through this method.
ErrorCode updateAudioPlaybackVol(int volume);
Parameters:
Parameter | Type | Description |
volume | int | The system volume value in the Cloud Phone. The maximum value is USHRT_MAX. A value of 0 indicates mute. |
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
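Since the cloud-side volume is reported on a 0 to USHRT_MAX scale, a custom audio handler typically needs to map it to the 0.0 to 1.0 gain range used by local players such as android.media.AudioTrack. A small sketch of that mapping (VolumeMapper is a hypothetical helper, not an SDK class):

```java
public class VolumeMapper {
    private static final int USHRT_MAX = 65535; // maximum documented for updateAudioPlaybackVol

    // Maps the cloud-side volume (0..USHRT_MAX, 0 = mute) to a 0.0..1.0 gain
    // suitable for players such as android.media.AudioTrack.setVolume.
    public static float toGain(int volume) {
        if (volume <= 0) {
            return 0f;                          // 0 indicates mute
        }
        return Math.min(1f, volume / (float) USHRT_MAX);
    }
}
```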
6.4.7 updateAudioPlaybackMute
You can implement this method. When the Cloud Phone is muted or unmuted, your application is notified through this method.
ErrorCode updateAudioPlaybackMute(int mute);
Parameters:
Parameter | Type | Description |
mute | int | A value of 1 indicates that the Cloud Phone is muted. A value of 0 indicates that it is unmuted. |
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
6.4.8 release
You can implement this method. When the audio channel is destroyed, your application is notified through this method to perform cleanup actions.
ErrorCode release();
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
6.5 IAdaptiveGraphicStreamHandler
This interface defines the main methods for processing image stream data. The main workflow is as follows:
Currently, the image frame format that the application obtains is bitmap ARGB8888.
Only one image stream can exist in a single streaming process.
6.5.1 setAdaptiveGraphicSurface
You can implement this method. When the status of the `Surface` used for image rendering changes, your application is notified through this method.
ErrorCode setAdaptiveGraphicSurface(Surface surface);
Parameters:
Parameter | Type | Description |
surface | android.view.Surface | The Surface object used for image rendering. This object may be null when the application is switched to the background or the screen is locked. |
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
6.5.2 invalidateAdaptiveGraphicSurface
You can implement this method. When new image frame data is received, your application is notified through this method.
ErrorCode invalidateAdaptiveGraphicSurface(Region region, byte[] buffer, BitmapFormat format);Parameters:
Parameter | Type | Description |
region | Region | Information about the drawing area of the image frame. |
buffer | byte[] | The image frame data. |
format | BitmapFormat | The image frame format information. The default is ARGB8888. |
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
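Because the frame format is ARGB8888 (4 bytes per pixel), an implementation can sanity-check that the delivered buffer covers the dirty region before drawing it. A sketch under that assumption; the helper class and its methods are hypothetical, not SDK APIs:

```java
public class AdaptiveGraphicBuffer {
    // ARGB8888 stores 4 bytes per pixel, so a dirty region of width x height
    // pixels needs width * height * 4 bytes in the buffer passed to
    // invalidateAdaptiveGraphicSurface before it is drawn to the Surface.
    public static int expectedBytes(int width, int height) {
        return width * height * 4;
    }

    public static boolean isComplete(byte[] buffer, int width, int height) {
        return buffer != null && buffer.length >= expectedBytes(width, height);
    }
}
```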
6.5.3 release
You can implement this method. When the image stream is destroyed, your application is notified through this method to perform cleanup actions.
ErrorCode release();
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
6.6 ICursorBitmapHandler
This interface defines the main methods for processing cursor data. When virtual mouse mode is enabled, you can provide an implementation of this interface to the SDK to render the cursor. The main workflow is as follows:
6.6.1 setCursorBitmap
You can implement this method. When the cursor shape on the Cloud Phone changes, your application is notified through this method.
ErrorCode setCursorBitmap(CursorBitmap bitmap);
Parameters:
Parameter | Type | Description |
bitmap | CursorBitmap | The Cloud Phone cursor shape data. |
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
6.6.2 unsetCursorBitmap
You can implement this method. When the cursor on the Cloud Phone is hidden, your application is notified through this method.
ErrorCode unsetCursorBitmap();
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
6.6.3 setCursorPosition
You can implement this method. When the cursor position on the Cloud Phone changes, your application is notified through this method.
ErrorCode setCursorPosition(float x, float y);
Parameters:
Parameter | Type | Description |
x | float | The X coordinate of the cursor. |
y | float | The Y coordinate of the cursor. |
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
6.6.4 release
You can implement this method. When the connection is disconnected and the cursor display is canceled, your application is notified through this method to perform cleanup actions.
ErrorCode release();
Return value:
Type | Description |
ErrorCode | ErrorCode.OK indicates success. Otherwise, the operation failed. |
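The four methods above can be sketched as a minimal handler implementation. Note that `ErrorCode`, `CursorBitmap`, and the `ICursorBitmapHandler` interface are stubbed here for illustration only; in a real integration they come from the aspengine SDK packages, and the actual field layout of `CursorBitmap` may differ.

```java
// Sketch of an ICursorBitmapHandler implementation. ErrorCode, CursorBitmap,
// and the interface itself are STUBS for illustration; in a real integration
// they are provided by the aspengine SDK.
enum ErrorCode { OK, FAILED }

class CursorBitmap {
    final int width, height;
    final byte[] pixels; // assumed ARGB8888 cursor pixel data
    CursorBitmap(int width, int height, byte[] pixels) {
        this.width = width; this.height = height; this.pixels = pixels;
    }
}

interface ICursorBitmapHandler {
    ErrorCode setCursorBitmap(CursorBitmap bitmap);
    ErrorCode unsetCursorBitmap();
    ErrorCode setCursorPosition(float x, float y);
    ErrorCode release();
}

class SimpleCursorHandler implements ICursorBitmapHandler {
    private CursorBitmap current;   // last cursor shape pushed by the SDK
    private float x, y;             // last reported cursor position
    private boolean visible;

    @Override
    public ErrorCode setCursorBitmap(CursorBitmap bitmap) {
        if (bitmap == null) return ErrorCode.FAILED;
        current = bitmap;
        visible = true;             // a new shape implies the cursor is shown
        return ErrorCode.OK;
    }

    @Override
    public ErrorCode unsetCursorBitmap() {
        visible = false;            // hide the cursor but keep the shape cached
        return ErrorCode.OK;
    }

    @Override
    public ErrorCode setCursorPosition(float newX, float newY) {
        x = newX; y = newY;         // a real handler would trigger a redraw here
        return ErrorCode.OK;
    }

    @Override
    public ErrorCode release() {
        current = null;             // drop references when the connection ends
        visible = false;
        return ErrorCode.OK;
    }

    boolean isVisible() { return visible; }
}
```

In a real client, `setCursorBitmap` and `setCursorPosition` would drive a small overlay view drawn on top of the StreamView.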
7. Error codes
Error code | Error message | Defining module | Cause |
Error codes 2-26 are mainly network-related issues. |||
2 | Failed to connect to %s. | ASP SDK | Invalid MAGIC. |
3 | Failed to connect to %s. | ASP SDK | Incorrect data. |
4 | The client and server versions do not match. | ASP SDK | Version mismatch. |
5 | The connection requires TLS. | ASP SDK | TLS is required. |
6 | The connection does not require TLS. | ASP SDK | TLS was used when it was not required. |
7 | You do not have permission to connect to the current %s. | ASP SDK | Permission issue. |
8 | | ASP SDK | Invalid client ID during migration. |
9 | Failed to connect to %s. | ASP SDK | The channel does not exist. |
20 | Failed to connect to the ASP server. | ASP SDK | Channel connection error. |
21 | TLS authentication error. | ASP SDK | TLS authentication error. |
22 | Failed to connect to %s. | ASP SDK | Channel link error. |
23 | Failed to connect to %s. | ASP SDK | Connection authentication error. |
24 | Failed to connect to %s. | ASP SDK | Connection I/O error. |
25 | Failed to connect to %s. | ASP SDK | Ticket verification failed. This error also occurs if you try to establish a connection again with the same ticket after the user has been disconnected. |
26 | | ASP SDK | xquic handshake failed. |
Disconnections or errors | |||
2000 | Timed out while getting data from %s. Disconnected from the server. | ASP SDK | Normal disconnection. |
2001 | %s has been disconnected from the server. This may be because the %s process was forcibly terminated. | ASP SDK | Generally, the client-side application process was terminated, for example, when an Android user presses the Home button. |
2002 | Another user has connected to the current %s from a different terminal. Please try again later. | ASP SDK | Another user has preempted the connection. |
2003 | %s is shutting down or restarting. This is usually an administrator operation. Please try again later. | ASP SDK | The Cloud Phone was shut down or restarted, usually by an administrator. |
2004 | The current user connection was disconnected. | ASP SDK | The client initiated a disconnection, or the server initiated a kick-out or disconnection. |
2005 | %s has timed out and disconnected because the usage time limit set by the administrator has been reached. | ASP SDK | Disconnected due to the usage time limit set by the administrator. |
2010 | Failed to connect to %s. | ASP SDK | VDAgent connection failed. |
2011 | Incorrect connection parameters were passed. | ASP SDK | Incorrect parameters were passed for the server connection. |
2027 | The stream pulling mode has been switched. | ASP SDK | The stream pulling mode was switched from preemption to collaboration, or from collaboration to preemption. |
2100 | Clipboard permission denied from %s to local. | ASP SDK | Clipboard permission denied from VM to local. |
2101 | Clipboard permission denied from local to %s. | ASP SDK | Clipboard permission denied from local to VM. |
2200 | %s is trying to reconnect... | ASP SDK | The connection was disconnected due to network issues. The ASP SDK is reconnecting. |
2201 | A network exception occurred on your device, and %s has been disconnected. | ASP SDK | The connection was disconnected due to network issues. The ASP SDK does not support reconnection due to an image issue. The application side starts reconnecting. |
2202 | %s reconnection timed out. Check your device's network and try again. | ASP SDK | ASP SDK reconnection timed out. |
Client-side logic errors | |||
5100 | Connection to ASP Server for %s timed out. Please try again later. | Application side | The client did not receive a connected event within a certain period. |
5102 | Timed out while getting data from %s. Please try again later. | Application side | The client received a connected event but did not receive a display event within a certain period. |
5004 | An error occurred on the client. Please reopen it. | Application side | Incorrect startup parameters were passed to the client. This generally occurs during the development phase. |
5200 | Client reconnection timed out. Please try again later. | Application side | |
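The code ranges in the table above suggest a simple client-side recovery policy: retry connection errors, wait out administrator-driven disconnections, and surrender on terminal ones. The following sketch illustrates such a policy; the range boundaries are taken from the table, but the `Action` names and the decision logic are illustrative and not part of the SDK.

```java
// Sketch of client-side handling keyed on the error-code ranges in the table.
// The range boundaries come from the table; the Action names and the policy
// itself are illustrative, not part of the SDK.
enum Action { RETRY_CONNECT, WAIT_AND_RETRY, RECONNECT, GIVE_UP }

class ErrorPolicy {
    static Action classify(int code) {
        if (code >= 2 && code <= 26) {
            // Network/handshake problems; note that code 25 also fires when
            // reusing a ticket after disconnection, so fetch a fresh ticket.
            return Action.RETRY_CONNECT;
        }
        if (code == 2002 || code == 2003) {
            // Preempted by another user, or shut down by an administrator.
            return Action.WAIT_AND_RETRY;
        }
        if (code == 2200 || code == 2201) {
            // SDK-driven reconnection, or application-driven reconnection.
            return Action.RECONNECT;
        }
        if (code >= 2000 && code < 3000) {
            // Other disconnections: treat as terminal for this session.
            return Action.GIVE_UP;
        }
        if (code >= 5000 && code < 6000) {
            // Client-side logic errors (timeouts, bad startup parameters).
            return Action.RETRY_CONNECT;
        }
        return Action.GIVE_UP;
    }
}
```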
8. FAQ
How do I restart a Cloud Phone?
Call the RebootInstances management API to restart the instance. This disconnects the client connection to the Cloud Phone. After the instance restarts, you must reconnect to the Cloud Phone from the client.
External Maven repository
The SDK is not currently published to an external Maven repository. However, you can upload the SDK's AAR files to your own Maven repository and depend on them from there.
Common ADB commands
Function | Command |
Back key | input keyevent KEYCODE_BACK |
Home key | input keyevent KEYCODE_HOME |
Switch key | input keyevent KEYCODE_APP_SWITCH |
Mute | input keyevent 164 |
Volume up | input keyevent KEYCODE_VOLUME_UP |
Volume down | input keyevent KEYCODE_VOLUME_DOWN |
Hide navigation bar | setprop persist.wy.hasnavibar false; killall com.android.systemui |
Show navigation bar | setprop persist.wy.hasnavibar true; killall com.android.systemui |
Take a screenshot | screencap -p /sdcard/Download/abc.png |
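When automating an instance from a host machine, the commands in the table are typically run through `adb -s <serial> shell`. The following Java sketch only builds the argument lists for such invocations; the serial number is a placeholder, and actually executing them (for example via `ProcessBuilder`) requires adb connectivity to the instance.

```java
// Sketch: building adb invocations for the shell commands in the table above.
// Only constructs argument lists; running them requires adb connectivity.
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

class AdbCommand {
    // Prefix any shell command with "adb -s <serial> shell".
    static List<String> shell(String serial, String... cmd) {
        List<String> full = new ArrayList<>(Arrays.asList("adb", "-s", serial, "shell"));
        full.addAll(Arrays.asList(cmd));
        return full;
    }

    static List<String> pressBack(String serial) {
        return shell(serial, "input", "keyevent", "KEYCODE_BACK");
    }

    static List<String> screenshot(String serial, String remotePath) {
        return shell(serial, "screencap", "-p", remotePath);
    }
}
```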