Alibaba Cloud Workspace SDK for Android provides open interfaces for connecting to cloud computers, cloud apps, and cloud phones from Android clients. By integrating the SDK, you can efficiently customize and deploy an Android client tailored to your business needs.
1. Quick start
1.1 Obtain the SDK and demo
Methods
Download AndroidSDK_aar.
Download AndroidSDK_demo.
All platform resources, including documents, SDKs, and client programs, are strictly for your or your organization's internal use. Unauthorized sharing with third parties or other organizations is prohibited without prior written consent from Alibaba Cloud.
Environment requirements
Android version: Android 5.1 or later
Integrate the SDK by using an AAR package
Copy the downloaded aspengine-third-release.aar, aspengine-player-release.aar, and wytrace-release.aar to the app/libs directory.
Add the following dependencies to the build.gradle file of the application module.
dependencies {
implementation fileTree(include: ['*.jar', '*.aar'], dir: 'libs')
// INI configuration parsing library for aspengine-sdk
implementation 'org.ini4j:ini4j:0.5.4'
// Dependency for wytrace
implementation 'com.squareup.okhttp3:okhttp:5.0.0-alpha.8'
implementation 'com.google.code.gson:gson:2.10.1'
implementation 'io.github.aliyun-sls:aliyun-log-android-sdk:2.7.0@aar'
}
Declare required permissions in the AndroidManifest file.
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.VIBRATE"/>1.2 Integration process
1.3 Best practices
Refer to Best practices for integrating Cloud Phone for implementation details. The following figure outlines the cloud phone integration process.
Multiple logon methods are supported to retrieve the ticket credential for cloud phone access, which is required for SDK integration. Refer to the following flowcharts:
See the "Lifecycle interfaces" section for sample integration code.
2. Lifecycle interfaces
2.1 Initialize a StreamView instance
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
requestWindowFeature(Window.FEATURE_NO_TITLE);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.activity_stream_view_demo);
mStreamView = findViewById(R.id.stream_view);
mStreamView.enableDesktopMode(false);
mStreamView.scaleStreamVideo(pref.getBoolean("fit_video_content", true) ?
        StreamView.ScaleType.FIT_STREAM_CONTENT : StreamView.ScaleType.FILL_STREAM_VIEW);
mStreamView.getASPEngineDelegate().setAlignStreamResolutionWithSurfaceSize(false);
}
<?xml version="1.0" encoding="utf-8"?>
<android.widget.RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".StreamViewDemoActivity">
<com.aliyun.wuying.aspsdk.aspengine.ui.StreamView
android:id="@+id/stream_view"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:focusableInTouchMode="true"
android:focusable="true"
android:focusedByDefault="true" />
</android.widget.RelativeLayout>
2.2 Establish a session
For information about valid values for mConfigs, see 4.1 Config.
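The following sketch shows one way to assemble mConfigs, assuming a string-keyed map; the exact container type expected by StreamView.start may differ, so check the demo for the authoritative form. The CONFIG_* keys and recommended values come from 4.1 Config, and resourceId/connectionTicket are placeholders for values obtained from the management APIs.
// Sketch only: populate the connection configuration before calling start().
// The container type and the handling of boolean values may differ in the demo.
Map<String, String> mConfigs = new HashMap<>();
mConfigs.put(StreamView.CONFIG_DESKTOP_ID, resourceId);               // ResourceId from DescribeUserResources
mConfigs.put(StreamView.CONFIG_CONNECTION_TICKET, connectionTicket);  // ticket from GetConnectionTicket
mConfigs.put(StreamView.CONFIG_PREFER_RTC_TRANSPORT, "true");         // recommended for cloud phones
mConfigs.put(StreamView.CONFIG_ENABLE_VDAGENT_CHECK, "true");
mConfigs.put(StreamView.CONFIG_ENABLE_STATISTICS, "true");
// Also set the OS type to "android" for cloud phones (see 4.1 Config).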
mStreamView.start(mConfigs);
2.3 Disconnect from a session
mStreamView.stop();
2.4 Destroy a StreamView instance
mStreamView.dispose();
mStreamView = null;
2.5 Multi-StreamView mode
This mode supports seamlessly switching the display of a single stream between multiple View instances. To implement this feature, follow these steps:
Define and initialize the StreamView objects as described in 2.1.
Use IASPEngine to establish a connection, using parameters consistent with the Config definition in 4.1.
IASPEngine engine = mBuilder.enableRTC(true).build(context);
// Enable statistics collection
engine.enableStatistics(true, true);
ConnectionConfig cc = new ConnectionConfig();
cc.id = CONFIG_DESKTOP_ID;
cc.connectionTicket = CONFIG_CONNECTION_TICKET;
cc.useVPC = CONFIG_USE_VPC;
cc.type = OS_TYPE;
cc.user = CONFIG_USER;
cc.uuid = CONFIG_UUID;
engine.start(cc);
Bind the IASPEngine to a StreamView instance. This call begins rendering the stream on the specified View.
mStreamView.bindASPEngine(engine);
To switch the display, bind the IASPEngine to a different StreamView instance so that the new View takes over rendering the stream.
mStreamView.resumeASPEngine();
For a complete implementation, please refer to the provided demo application.
2.6 Callback instructions
Callback proxy for connections: IASPEngineListener
Interface description:
Interface | Description |
onConnectionSuccess(int connectionId) | Callback invoked when connection to the cloud phone succeeds, with a connection identifier returned. |
onConnectionFailure(int errorCode, String errorMsg) | Callback invoked when connection to the cloud phone fails, with an error code and error message returned. |
onEngineError(int errorCode, String errorMsg) | Callback invoked when an SDK internal exception occurs, with an error code and error message returned. |
onDisconnected(int reason) | Callback invoked when cloud phone connection is terminated, with a disconnect reason returned. |
onReconnect(int errorCode) | Callback invoked when the cloud phone reconnects automatically, with an error code that triggered reconnection returned. |
onFirstFrameRendered(long timeCostMS) | Callback invoked when the first frame from the cloud phone is rendered, with the consumed rendering time returned. |
onPolicyUpdate(String policy) | Callback invoked when the cloud phone policy is updated, with the policy configurations returned. |
onSessionSuccess() | Callback invoked when a session is successfully established with the cloud phone. |
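A minimal listener sketch that logs these callbacks is shown below. Method names follow the table above; if the SDK interface declares additional callbacks, those must be implemented as well, and TAG is a local logging tag.
// Sketch of an IASPEngineListener implementation that logs the lifecycle events above.
IASPEngineListener mEngineListener = new IASPEngineListener() {
    @Override
    public void onConnectionSuccess(int connectionId) { Log.i(TAG, "connected: " + connectionId); }
    @Override
    public void onConnectionFailure(int errorCode, String errorMsg) { Log.e(TAG, "connect failed: " + errorCode + " " + errorMsg); }
    @Override
    public void onEngineError(int errorCode, String errorMsg) { Log.e(TAG, "engine error: " + errorCode + " " + errorMsg); }
    @Override
    public void onDisconnected(int reason) {
        // Reason 2200 corresponds to a network-triggered reconnection; see sections 3 (reconnect) and 7 (error codes).
        Log.w(TAG, "disconnected: " + reason);
    }
    @Override
    public void onReconnect(int errorCode) { Log.w(TAG, "reconnecting, trigger: " + errorCode); }
    @Override
    public void onFirstFrameRendered(long timeCostMS) { Log.i(TAG, "first frame rendered in " + timeCostMS + " ms"); }
    @Override
    public void onPolicyUpdate(String policy) { Log.i(TAG, "policy updated: " + policy); }
    @Override
    public void onSessionSuccess() { Log.i(TAG, "session established"); }
};
// Register the listener as shown in the registration sample later in this section.
mStreamView.getASPEngineDelegate().registerASPEngineListener(mEngineListener);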
Callback proxy for performance data: IStatisticsListener
Interface description:
Interface | Description |
onStatisticsInfoUpdate(StatisticsInfo statisticsInfo) | Callback invoked for phone performance data, with performance data objects returned. |
Callback proxy for requesting system permissions: IRequestSystemPermissionListener
Interface description:
Interface | Description |
boolean onRequestSystemPermission(SystemPermission permission) | Callback invoked when a system permission request is initiated, with the requested permission type returned. |
Sample code for callback registration and deregistration:
// Listen for connection callbacks.
mStreamView.getASPEngineDelegate().registerASPEngineListener(IASPEngineListener var1);
mStreamView.getASPEngineDelegate().unregisterASPEngineListener(IASPEngineListener var1);
// Listen for performance data callbacks.
mStreamView.getASPEngineDelegate().registerStatisticsListener(IStatisticsListener var1);
mStreamView.getASPEngineDelegate().unregisterStatisticsListener(IStatisticsListener var1);
// Listen for system permission requests.
mStreamView.registerSystemPermissionListener(IRequestSystemPermissionListener listener);
mStreamView.unregisterSystemPermissionListener(IRequestSystemPermissionListener listener);
3. Business interfaces
Interface | Description |
enableVDAgentCheck(boolean enabled) | Specifies whether to enforce VDAgent availability check during connection establishment. Default value: true. When the value is set to true, if VDAgent is unavailable during connection, an error will be reported and the connection will be terminated. Do not set the value to false unless for internal debugging purposes. |
enableRTC(boolean enabled) | Specifies whether to use real-time communication (RTC) for streaming data transmission. By default, RTC is used. |
enableDesktopMode(boolean enabled) | Specifies whether to run in desktop mode. When desktop mode is enabled, all Touch events are converted to Mouse events before they are sent to the server. We recommend that you set the value to false for cloud phones. |
scaleStreamVideo(ScaleType scaleType) | Specifies scaling behavior for streamed video content. Refer to "5.1 Enumeration types" for valid values. |
setVideoProfile(int width, int height, int fps, IRemoteResult result) | Specifies the video stream resolution and frame rate (frame rate is currently unsupported). |
boolean sendKeyEvent(KeyEvent event) and sendKeyboardEvent(KeyEvent event, IRemoteResult result) | Sends keyboard input events to the cloud. |
boolean simulateMouseClick(boolean leftButton) | Simulates mouse click events on the cloud (true for left-click and false for right-click). |
boolean enableMouseMode(boolean enabled) | Activates or deactivates mouse mode. |
boolean sendMouseEvent(MotionEvent motionEvent) and sendMouseEvent(MotionEvent motionEvent, IRemoteResult result) | Sends mouse input events to the cloud. |
reconnect(String connectionToken) | Reconnects after an abnormal disconnection. In most cases, you can call this method to handle the "disconnect reason=2200" error. Applications must call API operations to obtain the connection token to reconnect to the cloud phone. |
boolean setMediaStreamPlayer(MediaStreamPlayer player) | Replaces the SDK's default media engine with a custom media engine. You can call this interface only before streaming starts or after streaming ends. |
void setAlignStreamResolutionWithSurfaceSize(boolean aligned) | Specifies whether to auto-sync stream resolution with the client-side SurfaceView size upon streaming start. By default, this feature is enabled. We recommend that you disable this feature for cloud phones. |
void mute(boolean muted) | Enables or disables mute mode. |
void enableStatistics(boolean enabled) | Enables or disables performance statistics collection. |
mStreamView.getASPEngineDelegate().requestIFrame() | Requests a keyframe. |
Under mStreamView.getASPEngineDelegate(): registerFileTransferListener(IFileTransferListener var1), unregisterFileTransferListener(IFileTransferListener var1), and mStreamView.getASPEngineDelegate().uploadFiles(pathList, "/sdcard/Download/"); | Uploads or downloads files. See the demo for an implementation. |
Under mStreamView.getASPEngineDelegate(): addDataChannel(DataChannel var1) and removeDataChannel(DataChannel var1) | Adds or removes a data channel for receiving custom data from the cloud instance. See the demo for an implementation. |
Under mStreamView.getASPEngineDelegate(): addLyncChannel(LyncChannel var1) and removeLyncChannel(LyncChannel var1) | Adds or removes the channel for sending ADB commands. Refer to AspAdbUtil in the demo for an implementation. |
void setToQualityFirst() | Configures the quality-first mode, which can provide high image quality and a frame rate of up to 30 fps. This feature is currently not supported by cloud phones. |
void setToFpsFirst() | Configures the smoothness-first mode, which can provide medium image quality and a frame rate of up to 60 fps. This feature is currently not supported by cloud phones. |
void setToCustomPicture(int fps, int quality); | Configures the custom mode, which allows custom frame rates and image quality. Valid values of quality: 0 to 4. 0: lossless. 1: high. 2: medium. 3: normal. 4: auto. This feature is currently not supported by cloud phones. |
Under mStreamView.getASPEngineDelegate(): registerIMEListener, unregisterIMEListener, setImeType, enableRelativeMouse, and so on | These features are currently not supported by cloud phones. |
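The snippet below strings together a few of these calls with the settings recommended for cloud phones; the placement of each call (on the StreamView instance or on its ASP engine delegate) follows the samples in section 2.1 and the rows above.
// Recommended cloud-phone settings (see the rows above for details).
mStreamView.enableDesktopMode(false);                                   // keep touch input as Touch events
mStreamView.scaleStreamVideo(StreamView.ScaleType.FIT_STREAM_CONTENT);  // keep the stream's aspect ratio
mStreamView.getASPEngineDelegate().setAlignStreamResolutionWithSurfaceSize(false);
mStreamView.getASPEngineDelegate().requestIFrame();                     // request a keyframe when needed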
4. Parameters
4.1 Config
The following table describes the parameters required for establishing a connection.
Parameter | Type | Description |
StreamView.CONFIG_DESKTOP_ID | string | The instance ID, which is the value of ResourceId returned by DescribeUserResources. |
StreamView.CONFIG_CONNECTION_TICKET | string | The connection authentication ticket, which is obtained through GetConnectionTicket. |
StreamView.CONFIG_PREFER_RTC_TRANSPORT | boolean | Enables the RTC channel. We recommend that you set the value to true for cloud phones. |
StreamView.CONFIG_ENABLE_VDAGENT_CHECK | boolean | Specifies whether to check VDAgent availability when you connect to a cloud phone. We recommend that you set the value to true for cloud phones. |
StreamView.CONFIG_ENABLE_STATISTICS | boolean | Specifies whether to report performance statistics. If performance statistics are reported, additional performance data is provided in video streams. We recommend that you set the value to true for cloud phones. |
OSType | string | Specifies the OS type. You must set the value to android for cloud phones. |
4.2 StatisticsInfo
The following table describes the parameters for collecting performance data.
Interface | Type | Description |
mReceiveFps | int | The received frame rate. |
mRenderFps | int | The rendered frame rate. |
mDownstreamBandwithMBPerSecond | double | The downstream bandwidth, in MB/s. |
mUpstreamBandwithMBPerSecond | double | The upstream bandwidth, in MB/s. |
mP2pFullLinkageLatencyMS | long | (Deprecated) The end-to-end full-link latency. |
mNetworkLatencyMS | long | The network RTT latency. |
mPingGatewayRttMS | long | The ping RTT latency. |
mLostRate | double | The packet loss rate. |
mServerRenderLatencyMS | long | The cloud-side rendering latency. |
mServerEncoderLatencyMS | long | The cloud-side encoding latency. |
mServerTotalLatencyMS | long | The cloud-side total latency. |
mTotalDownstreamBandwidth | long | The total bandwidth. |
mGuestCpuUsage | long | The guest CPU usage. |
mStreamType | String | The stream protocol type. |
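A sketch of consuming this data through IStatisticsListener (registration as shown in 2.6); reading the fields as public members is an assumption based on the table above.
// Sketch: log a few performance fields whenever statistics are reported.
IStatisticsListener mStatsListener = new IStatisticsListener() {
    @Override
    public void onStatisticsInfoUpdate(StatisticsInfo info) {
        Log.d(TAG, "receive fps=" + info.mReceiveFps
                + ", render fps=" + info.mRenderFps
                + ", rtt(ms)=" + info.mNetworkLatencyMS
                + ", loss=" + info.mLostRate);
    }
};
mStreamView.getASPEngineDelegate().registerStatisticsListener(mStatsListener);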
5. Enumeration types
5.1 ScaleType
The following table describes the scaling processing types for streamed image content.
Parameter | Description |
FILL_STREAM_VIEW | Always stretches the streamed image to match the size of the StreamView instance. If the aspect ratio of the StreamView instance does not match that of the streamed image, this strategy may result in noticeable distortion. |
FIT_STREAM_CONTENT | Adjusts the rendering area of the StreamView instance to ensure that the streamed image content is always rendered with the same aspect ratio. If you use this strategy, the streamed image may not fill the entire display area of the StreamView instance. |
5.2 SystemPermission
The following table describes the system permission type.
Parameter | Description |
RECORDAUDIO | The audio recording permission. |
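A sketch of responding to this request through IRequestSystemPermissionListener; the callback name and casing follow the table in 2.6 and should be verified against the SDK.
// Sketch: grant the cloud-side request only for audio recording.
mStreamView.registerSystemPermissionListener(new IRequestSystemPermissionListener() {
    @Override
    public boolean onRequestSystemPermission(SystemPermission permission) {
        if (permission == SystemPermission.RECORDAUDIO) {
            // Return true only if the local RECORD_AUDIO runtime permission has been granted.
            return true;
        }
        return false;
    }
});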
6. Customize MediaStreamPlayer
If you have no custom multimedia requirements, you can skip this section; the SDK includes a default media implementation.
6.1 Process media data with a custom media engine
By implementing com.aliyun.wuying.aspsdk.aspengine.MediaStreamPlayer, apps can use a custom media engine to process streaming media data, which mainly includes:
Video stream data: raw video streams primarily composed of H.264/H.265 compressed frames.
Adaptive graphic stream data: graphic streams mainly consisting of bitmaps.
Audio downstream data: audio downstream streams primarily in Opus/PCM format.
Cursor data: When virtual mouse mode is enabled, the app can receive cursor images and position data, allowing it to render a virtual cursor independently.
Apps can implement a custom media engine in the Alibaba Cloud Workspace SDK by using the IASPEngine.setMediaStreamPlayer interface.
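For example, assuming MyMediaStreamPlayer is a custom MediaStreamPlayer subclass (see 6.2) and that the engine is reached through the delegate as in the other samples, the wiring looks roughly like this; remember that the call is only valid before streaming starts (see section 3).
// MyMediaStreamPlayer is a hypothetical MediaStreamPlayer subclass (see 6.2).
// setMediaStreamPlayer must be called before streaming starts.
mStreamView.getASPEngineDelegate().setMediaStreamPlayer(new MyMediaStreamPlayer());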
6.2 MediaStreamPlayer
MediaStreamPlayer is an abstract class that requires apps to implement global initialization and cleanup methods. It also provides custom implementations for handling various types of media data.

Among them:
IVideoStreamHandler provides methods for handling video stream data.
IAdaptiveGraphicStreamHandler provides methods for handling adaptive graphic stream data.
IAudioPlaybackStreamHandler provides methods for handling audio playback data.
ICursorBitmapHandler provides methods for handling cursor data.
Your app can choose to implement one or more of the interfaces mentioned above. Based on the interfaces it implements, the Alibaba Cloud Workspace SDK will decide the type of cloud streaming to use, according to the following rules:
If the app implements both IVideoStreamHandler and IAdaptiveGraphicStreamHandler, the streaming mode will be set to Mixed. In this case, Alibaba Cloud Workspace will automatically switch between adaptive graphic streaming and video streaming based on the current scenario.
If the app only implements IVideoStreamHandler, the streaming mode will be set to Video stream only, meaning the server will only send video stream data.
If the app only implements IAdaptiveGraphicStreamHandler, the streaming mode will be set to Image stream only, meaning the server will only send image stream data.
Your app can provide custom implementations for different types of media data to the SDK by using the onCreateXXXHandler methods of MediaStreamPlayer.
@Override
protected IVideoStreamHandler onCreateVideoStreamHandler() {
return new VideoStreamHandler();
}
@Override
protected IAdaptiveGraphicStreamHandler onCreateAdaptiveGraphicStreamHandler() {
return null;
}
@Override
protected IAudioPlaybackStreamHandler onCreateAudioPlaybackStreamHandler() {
return new AudioPlaybackStreamHandler();
}
@Override
protected ICursorBitmapHandler onCreateCursorBitmapHandler() {
return null;
}
In the example above, the custom media engine implements both IVideoStreamHandler and IAudioPlaybackStreamHandler. The onCreateXXXHandler methods are executed only once during a streaming session.
Process of calling main methods:
6.2.1 initialize
This method is implemented by apps and can be used to perform global initialization tasks related to the custom media engine.
This method is executed once in each streaming process.
public ErrorCode initialize()
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if initialization is successful; otherwise, the initialization failed. |
6.2.2 release
This method is implemented by apps and can be used to perform global release tasks related to the custom media engine.
This method is executed once in each streaming process.
public ErrorCode release()
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if release is successful; otherwise, the release failed. |
6.2.3 enableStatistics
This method is implemented by apps and can be used to enable or disable performance statistics collection.
public void enableStatistics(boolean enabled)
Parameters:
Parameter | Type | Description |
enabled | boolean | True: enables performance data collection. False: disables performance data collection. |
6.2.4 onCreateVideoStreamHandler
This method is implemented by apps and used to provide the SDK with a media engine implementation for handling video stream data.
This method is executed once in each streaming process.
public IVideoStreamHandler onCreateVideoStreamHandler()
Return value:
Type | Description |
IVideoStreamHandler | Implemented by the media engine provided by the app for video stream data processing. If the app does not provide any implementation for video stream processing, null is returned. In this case, video stream data is not processed. |
6.2.5 onCreateAdaptiveGraphicStreamHandler
This method is implemented by apps and used to provide the SDK with a media engine implementation for handling adaptive graphic stream data.
This method is executed once in each streaming process.
public IAdaptiveGraphicStreamHandler onCreateAdaptiveGraphicStreamHandler()
Return value:
Type | Description |
IAdaptiveGraphicStreamHandler | Implemented by the media engine provided by the app for adaptive graphic stream processing. If the app does not provide any implementation for adaptive graphic stream processing, null is returned. In this case, graphic stream data is not processed. |
6.2.6 onCreateAudioPlaybackStreamHandler
This method is implemented by apps and used to provide the SDK with a media engine implementation for handling audio downstream data.
This method is executed once in each streaming process.
public IAudioPlaybackStreamHandler onCreateAudioPlaybackStreamHandler()
Return value:
Type | Description |
IAudioPlaybackStreamHandler | Implemented by the media engine provided by the app for audio downstream data processing. If the app does not provide any implementation for audio downstream data processing, null is returned. In this case, audio downstream data is not processed. |
6.2.7 onCreateCursorBitmapHandler
This method is implemented by apps and used to provide the SDK with a media engine implementation for handling cursor data.
This method is executed once in each streaming process.
The interface provided by this method is used only when the virtual mouse mode is activated.
public ICursorBitmapHandler onCreateCursorBitmapHandler()
Return value:
Type | Description |
ICursorBitmapHandler | Implemented by the media engine provided by the app for cursor data processing. If the app does not provide any implementation for cursor data processing, null is returned. In this case, cursor data is not processed. |
6.3 IVideoStreamHandler
This interface defines the primary methods for handling video stream data, and its main workflow is as follows:
When the app switches between the foreground and background, the Surface instance used for rendering will be destroyed or recreated. In this case, the IVideoStreamHandler.setVideoSurface method will be invoked multiple times. When the Surface instance is destroyed, the surface object passed to setVideoSurface will be null. The app needs to handle fault tolerance for the decoder and rendering process appropriately.
The app can obtain the event handling interface provided by the Alibaba Cloud Workspace SDK by implementing the IVideoStreamHandler.setEventHandler method. Through this interface, the app can notify the SDK of certain video processing events from its custom media engine. This is mainly used for performance data tracking.
@Override
public void setEventHandler(EventHandler handler) {
Log.i(TAG, "setEventHandler handler " + handler);
VideoStreamEventHandler.getInstance().reset(handler);
}
...
public synchronized void onVideoFrameRendered() {
VFrame frame = mVideoFrame.remove();
if (mEnabled && mHandler != null) {
Event event = new Event();
event.type = EventType.RENDER_PERF_INFO;
event.decodePerfInfo = new VDecodePerfInfo();
event.renderPerfInfo = new VRenderPerfInfo();
event.renderPerfInfo.frameId = frame.frameId;
event.renderPerfInfo.sessionId = frame.sessionId;
// Notify the SDK that a frame of video image has been rendered. The SDK will then calculate the end-to-end latency on the client side based on the frame ID.
mHandler.callback(event);
}
}
6.3.1 setEventHandler
This method is implemented by apps. When the custom media engine is loaded by the Alibaba Cloud Workspace SDK, the SDK uses this method to provide an event handler to your app. The app can use this handler to send video stream processing events.
public void setEventHandler(EventHandler handler)
Parameters:
Parameter | Type | Description |
handler | EventHandler | The handler provided by the SDK. The app can use this handler to send video stream processing events to the SDK. |
6.3.2 addVideoTrack
This method is implemented by apps. It notifies your app when a video stream is created.
Currently, only one video stream can exist at a time during a single streaming session.
ErrorCode addVideoTrack(int trackId, VProfile profile);
Parameters:
Parameter | Type | Description |
trackId | int | The ID of the video stream. |
profile | VProfile | The information about the video stream. |
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
6.3.3 setVideoSurface
This method is implemented by apps. It notifies your app when the state of the Surface instance used for video rendering changes.
ErrorCode setVideoSurface(int trackId, Surface surface);
Parameters:
Parameter | Type | Description |
trackId | int | The ID of the video stream. |
surface | android.view.Surface | The Surface object used for video rendering. The object may be null when the app is switched to the background or when the screen is locked. |
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
6.3.4 playVideo
This method is implemented by apps. It notifies your app when the video stream is ready.
ErrorCode playVideo(int trackId);
Parameters:
Parameter | Type | Description |
trackId | int | The ID of the video stream. |
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
6.3.5 removeVideoTrack
This method is implemented by apps. It notifies your app when a video stream is destroyed.
ErrorCode removeVideoTrack(int trackId);
Parameters:
Parameter | Type | Description |
trackId | int | The ID of the video stream. |
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
6.3.6 pushVideoFrame
This method is implemented by apps. It notifies your app when a new video frame is received.
ErrorCode pushVideoFrame(int trackId, VFrame frame);
Parameters:
Parameter | Type | Description |
trackId | int | The ID of the video stream. |
frame | VFrame | The information about the newly received video frame. |
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
6.3.7 getVideoTracks
This method is implemented by apps. The SDK uses it to retrieve information about video streams currently being processed by your app.
HashMap<Integer, VProfile> getVideoTracks();
Return value:
Type | Description |
HashMap<Integer, VProfile> | The information about all video streams being processed obtained from your app. |
6.3.8 release
This method is implemented by apps. The SDK uses it to notify your app to perform cleanup actions when all video streams are destroyed.
ErrorCode release();
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
6.4 IAudioPlaybackStreamHandler
This interface defines the primary methods for handling audio stream data, and its main workflow is as follows:
6.4.1 initAudio
This method is implemented by apps. When an audio channel is created within the SDK, your app is notified through this method.
ErrorCode initAudio();
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
6.4.2 deInitAudio
This method is implemented by apps. When an audio channel is destroyed within the SDK, your app is notified through this method.
ErrorCode deInitAudio();
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
6.4.3 startAudioPlayback
This method is implemented by apps. When the cloud phone is about to start streaming audio, your app is notified through this method.
ErrorCode startAudioPlayback();
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
6.4.4 stopAudioPlayback
This method is implemented by apps. When the cloud phone stops streaming audio, your app is notified through this method.
ErrorCode stopAudioPlayback();
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
6.4.5 pushAudioPlaybackFrame
This method is implemented by apps. When a new downstream audio frame is received, your app is notified through this method.
ErrorCode pushAudioPlaybackFrame(AFrame pbData);
Parameters:
Parameter | Type | Description |
pbData | AFrame | The information about the newly received audio stream. |
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
6.4.6 updateAudioPlaybackVol
This method is implemented by apps. When the system volume in the cloud phone changes, your app is notified through this method.
ErrorCode updateAudioPlaybackVol(int volume);
Parameters:
Parameter | Type | Description |
volume | int | The system volume of the cloud phone. Valid values: 0 to USHRT_MAX. 0 indicates the cloud phone is muted. |
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
6.4.7 updateAudioPlaybackMute
This method is implemented by apps. When the cloud phone is muted or unmuted, your app is notified through this method.
ErrorCode updateAudioPlaybackMute(int mute);
Parameters:
Parameter | Type | Description |
mute | int | Specifies whether to mute the cloud phone. Valid values: 1 and 0. 1 indicates the cloud phone is muted, and 0 indicates that the cloud phone is unmuted. |
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
6.4.8 release
This method is implemented by apps. The SDK uses it to notify your app to perform cleanup actions when an audio channel is destroyed.
ErrorCode release();
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
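As a rough sketch of how the 6.4.x callbacks fit together, the skeleton below plays decoded PCM through android.media.AudioTrack. The sample rate, channel layout, and the AFrame payload accessor (written here as pbData.data) are assumptions, not SDK guarantees; Opus frames would also need to be decoded before being written.
// Hedged skeleton of an IAudioPlaybackStreamHandler backed by AudioTrack.
public class SimpleAudioPlaybackStreamHandler implements IAudioPlaybackStreamHandler {
    private AudioTrack mTrack;

    @Override
    public ErrorCode initAudio() { return ErrorCode.OK; }

    @Override
    public ErrorCode startAudioPlayback() {
        // Assumed format: 48 kHz stereo 16-bit PCM.
        int bufferSize = AudioTrack.getMinBufferSize(48000,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
        mTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 48000,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
                bufferSize, AudioTrack.MODE_STREAM);
        mTrack.play();
        return ErrorCode.OK;
    }

    @Override
    public ErrorCode pushAudioPlaybackFrame(AFrame pbData) {
        byte[] pcm = pbData.data;   // hypothetical accessor for the decoded PCM payload
        if (mTrack != null && pcm != null) {
            mTrack.write(pcm, 0, pcm.length);
        }
        return ErrorCode.OK;
    }

    @Override
    public ErrorCode stopAudioPlayback() {
        if (mTrack != null) { mTrack.stop(); }
        return ErrorCode.OK;
    }

    @Override
    public ErrorCode updateAudioPlaybackVol(int volume) { return ErrorCode.OK; }

    @Override
    public ErrorCode updateAudioPlaybackMute(int mute) { return ErrorCode.OK; }

    @Override
    public ErrorCode deInitAudio() { return ErrorCode.OK; }

    @Override
    public ErrorCode release() {
        if (mTrack != null) { mTrack.release(); mTrack = null; }
        return ErrorCode.OK;
    }
}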
6.5 IAdaptiveGraphicStreamHandler
This interface defines the primary methods for handling image stream data, and its main workflow is as follows:
The current format of image frames obtained by apps is Bitmap ARGB8888.
Only one image stream exists at a time during a streaming process.
6.5.1 setAdaptiveGraphicSurface
This method is implemented by apps. It notifies your app when the state of the Surface instance used for image rendering changes.
ErrorCode setAdaptiveGraphicSurface(Surface surface);
Parameters:
Parameter | Type | Description |
surface | android.view.Surface | The Surface object used for image rendering. The object may be null when the app is switched to the background or when the screen is locked. |
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
6.5.2 invalidateAdaptiveGraphicSurface
This method is implemented by apps. It notifies your app when a new adaptive graphic frame is received.
ErrorCode invalidateAdaptiveGraphicSurface(Region region, byte[] buffer, BitmapFormat format);
Parameters:
Parameter | Type | Description |
region | Region | The drawing area information of the adaptive graphic frame. |
buffer | byte[] | The adaptive graphic frame. |
format | BitmapFormat | The format information of the adaptive graphic frame, with ARGB8888 as the default. |
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
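A minimal sketch of consuming one adaptive graphic frame inside invalidateAdaptiveGraphicSurface: the buffer is wrapped into an ARGB_8888 Bitmap and drawn onto the Surface previously supplied through setAdaptiveGraphicSurface. The region accessors used for width and height are hypothetical; substitute the real Region fields.
// Sketch: wrap the ARGB8888 frame buffer into a Bitmap and draw it on the Surface.
// region.getWidth()/getHeight() are hypothetical accessors; use the actual Region fields.
private void drawAdaptiveGraphicFrame(Surface surface, Region region, byte[] buffer) {
    Bitmap bitmap = Bitmap.createBitmap(region.getWidth(), region.getHeight(),
            Bitmap.Config.ARGB_8888);
    bitmap.copyPixelsFromBuffer(ByteBuffer.wrap(buffer));
    Canvas canvas = surface.lockCanvas(null);
    try {
        canvas.drawBitmap(bitmap, 0, 0, null);
    } finally {
        surface.unlockCanvasAndPost(canvas);
    }
}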
6.5.3 release
This method is implemented by apps. The SDK uses it to notify your app to perform cleanup actions when all adaptive graphic streams are destroyed.
ErrorCode release();
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
6.6 ICursorBitmapHandler
This interface defines the primary methods for handling cursor data. When virtual mouse mode is activated, the app provides an implementation of this interface to the SDK for cursor image rendering. The main workflow is as follows:
6.6.1 setCursorBitmap
This method is implemented by apps. When the cursor graphic in the cloud phone changes, your app is notified through this method.
ErrorCode setCursorBitmap(CursorBitmap bitmap);
Parameters:
Parameter | Type | Description |
bitmap | CursorBitmap | The cursor graphic data of the cloud phone. |
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
6.6.2 unsetCursorBitmap
This method is implemented by apps. When the cursor in the cloud phone is hidden, your app is notified through this method.
ErrorCode unsetCursorBitmap();
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
6.6.3 setCursorPosition
This method is implemented by apps. When the cursor position in the cloud phone changes, your app is notified through this method.
ErrorCode setCursorPosition(float x, float y);
Parameters:
Parameter | Type | Description |
x | float | The X-coordinate of the cursor. |
y | float | The Y-coordinate of the cursor. |
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
6.6.4 release
This method is implemented by apps. The SDK uses it to notify your app to perform cleanup actions when the connection is discontinued and the cursor display is canceled.
ErrorCode release();
Return value:
Type | Description |
ErrorCode | Returns ErrorCode.OK if execution is successful; otherwise, the execution failed. |
7. Error codes
Error code | Error message | Defined module | Cause |
Codes 2 to 26 are primarily related to network issues. | |||
2 | Failed to connect to %s. | ASP SDK | Invalid MAGIC. |
3 | Failed to connect to %s. | ASP SDK | Incorrect data. |
4 | The client version does not match the server version. | ASP SDK | Mismatched version. |
5 | Transport Layer Security (TLS) is required for connection. | ASP SDK | TLS is required. |
6 | TLS is not required for connection. | ASP SDK | TLS is applied. |
7 | You do not have permissions to connect to %s. | ASP SDK | Permission issue. |
8 | | ASP SDK | Invalid client ID during migration. |
9 | Failed to connect to %s. | ASP SDK | No channel is found. |
20 | Failed to connect to the ASP server. | ASP SDK | Channel connection error. |
21 | TLS authentication error. | ASP SDK | TLS authentication failed. |
22 | Failed to connect to %s. | ASP SDK | Channel link error. |
23 | Failed to connect to %s. | ASP SDK | Connection authentication error. |
24 | Failed to connect to %s. | ASP SDK | Connection I/O error. |
25 | Failed to connect to %s. | ASP SDK | Ticket validation failed. This error will also be triggered if the same ticket is used to request a new connection after the previous session was disconnected. |
26 | | ASP SDK | Xquic handshake failure. |
Cases where the connection is interrupted or encounters errors | |||
2000 | Failed to connect to the server because the data retrieval of %s times out. | ASP SDK | Normal disconnection. |
2001 | Failed to connect %s to the server because the %s process is forcibly suspended. | ASP SDK | This issue typically occurs when the client app is terminated (for example, an Android app closed by the user via the Home button). |
2002 | Another user has connected to the current %s from a different terminal. Please try again later. | ASP SDK | The instance was taken over by another user. |
2003 | %s is shutting down or restarting (usually initiated by an administrator). Please try again later. | ASP SDK | The cloud phone is shut down or restarted typically by an administrator. |
2004 | User connection terminated. | ASP SDK | A client-initiated disconnect, or a server-initiated action (standard disconnect or forced termination/kick). |
2005 | %s is disconnected due to timeout, as the usage duration limit set by the administrator has been reached. | ASP SDK | %s is disconnected due to administrator-set usage duration limit. |
2010 | Failed to connect to %s. | ASP SDK | Vdagent connection failure. |
2011 | Incorrect connection parameter passed. | ASP SDK | An error occurs when passing connection parameters. |
2027 | Stream pull mode switched. | ASP SDK | The mode was changed from preemptive to collaborative, or vice versa. |
2100 | Clipboard permission denied (from %s to local). | ASP SDK | Clipboard permission restricted (from VM to local). |
2101 | Clipboard permission denied (from local to %s). | ASP SDK | Clipboard permission restricted (from local to VM). |
2200 | %s is attempting to reconnect... | ASP SDK | Disconnected due to network issues; the ASP SDK is reconnecting. |
2201 | Your device encountered a network anomaly, causing %s to disconnect. | ASP SDK | Disconnected due to network issues; the ASP SDK cannot reconnect because of image constraints, so the app must initiate reconnection. |
2202 | %s reconnection timed out. Please check your device's network and try again. | ASP SDK | ASP SDK reconnection timeout. |
Client-side logic errors | |||
5100 | Connection from %s to ASP Server timed out. Please try again later. | App-side | The client did not receive a connected event within the specified timeframe. |
5102 | Failed to obtain the data of %s due to a timeout error. Please try again later. | App-side | The client received a connected event but did not receive a display event within the specified timeframe. |
5004 | The client encountered an error. Please reopen the client. | App-side | Incorrect startup parameters passed to the client (usually occurs during development). |
5200 | Client reconnection timed out. Please try again later. | App-side | |
8. FAQ
How do I restart a cloud phone?
To restart a cloud phone, call the RebootAndroidInstancesInGroup API operation. Note that the active client connection to the cloud phone will be temporarily disconnected during the restart. Once the restart is complete, simply reconnect from the client side.
Maven support
Public Maven hosting is not currently available. However, customers can manually integrate the SDK by uploading the AAR library to their own Maven repository.
Common ADB commands
Feature | Command |
Back button | input keyevent KEYCODE_BACK |
Home button | input keyevent KEYCODE_HOME |
App switch | input keyevent KEYCODE_APP_SWITCH |
Mute | input keyevent 164 |
Volume up | input keyevent KEYCODE_VOLUME_UP |
Volume down | input keyevent KEYCODE_VOLUME_DOWN |
Hide navigation bar | setprop persist.wy.hasnavibar false; killall com.android.systemui |
Show navigation bar | setprop persist.wy.hasnavibar true; killall com.android.systemui |
Take screenshot | screencap -p /sdcard/Download/abc.png |