This topic explains how to implement screen sharing on Android.
Feature introduction
Screen sharing enables a user to present their screen content to other users in a channel during a video call or live stream. It facilitates instant information sharing and enhances visual communication.
Sample code
Alibaba Real-Time Communication (ARTC) provides open-source sample code: Implement screen sharing on Android.
Prerequisites
Before you implement screen sharing, make sure you meet the following requirements:
Create an ARTC application and obtain the AppID and AppKey from the ApsaraVideo Live console.
Integrate the ARTC SDK into your project and implement basic audio and video call features.
Due to system limitations, screen capture is supported only on Android API level 21 (Android 5.0) or later. If you call startScreenShare on an unsupported version, the call fails. (See the version-check sketch after this list.)
Due to system limitations, system audio capture is supported only on Android API level 29 (Android 10.0) or later.
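If you need to check support at runtime before calling startScreenShare, a simple version gate using the standard android.os.Build APIs is sufficient. The following sketch mirrors the requirements above; the helper names are illustrative:
import android.os.Build;
public class ScreenShareSupport {
    // Screen capture requires API level 21 (Android 5.0) or later.
    public static boolean isScreenShareSupported() {
        return Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP;
    }
    // System audio capture requires API level 29 (Android 10.0) or later.
    public static boolean isSystemAudioCaptureSupported() {
        return Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q;
    }
}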
Limitations
On certain devices from manufacturers like Xiaomi, sharing system audio may not be supported. Some devices that can otherwise record audio normally may produce noise when capturing system audio from specific apps like QQ Music.
On certain devices from manufacturers like Huawei, changing the resolution during screen sharing can cause the application to crash.
Test the screen sharing feature on a wide range of devices to ensure stability.
Implementation
The following diagram shows the call sequence for implementing screen sharing:
1. Set permissions
To implement stable and compatible screen sharing on Android, you must add the following permissions and service declaration to your AndroidManifest.xml file.
FOREGROUND_SERVICE: Allows the app to run a foreground service, which prevents the screen sharing service from being terminated by the system when the app is in the background.
FOREGROUND_SERVICE_MEDIA_PROJECTION: A granular permission that identifies the foreground service as being used for screen capture (MediaProjection). The corresponding mediaProjection foreground service type was introduced in Android 10 (API level 29), and this permission is mandatory for apps targeting Android 14 (API level 34) and later.
POST_NOTIFICATIONS: Starting from Android 13 (API level 33), an app must declare this permission and obtain user authorization to post notifications, such as the one shown by the foreground service. Without user authorization, the notification cannot be shown; see the runtime request sketch after the manifest example below.
Service declaration: To run a foreground service, you must declare the service component in AndroidManifest.xml so that the system can recognize and launch it. If the service is not declared, starting it fails with an error.
The system requests permission for screen sharing from the user through a standard authorization dialog. User approval is required to proceed. This permission is granted only for the current session, and your application must request permission each time screen sharing begins.
<manifest ...>
<!-- Declare foreground service permissions -->
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
<uses-permission android:name="android.permission.FOREGROUND_SERVICE_MEDIA_PROJECTION" />
<!-- Required for Android 13 (API level 33) and later -->
<uses-permission android:name="android.permission.POST_NOTIFICATIONS"/>
<application ...>
<!-- Declare the foreground service for screen sharing -->
<service android:name="com.alivc.rtc.share.ForegroundService"
android:enabled="true" />
</application>
</manifest>
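On Android 13 (API level 33) and later, POST_NOTIFICATIONS is also a runtime permission, so request it from the user before the foreground service posts its notification. The following is a minimal sketch that assumes an Activity context and AndroidX; the request code is arbitrary:
import android.Manifest;
import android.content.pm.PackageManager;
import android.os.Build;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;
// Arbitrary request code used to identify the permission result.
private static final int REQUEST_POST_NOTIFICATIONS = 1001;
private void ensureNotificationPermission() {
    // Only Android 13 (API level 33) and later require the runtime notification permission.
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.TIRAMISU
            && ContextCompat.checkSelfPermission(this, Manifest.permission.POST_NOTIFICATIONS)
                    != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this,
                new String[]{Manifest.permission.POST_NOTIFICATIONS},
                REQUEST_POST_NOTIFICATIONS);
    }
}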
2. Configure camera and screen sharing streams
ARTC supports ingesting camera and screen sharing streams. Configure them based on your use case.
2.1. Push only the screen sharing stream
To ingest only the screen sharing stream, you must explicitly disable the camera stream, as it is enabled by default.
Before joining a channel, call publishLocalVideoStream(false) to disable camera stream ingest.
After joining a channel, call startScreenShare to start screen capture and ingest the screen sharing stream.
mAliRtcEngine.publishLocalVideoStream(false);
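Put together, the ordering looks like the following sketch. The channel join step itself is omitted, and screenShareMode is the sharing mode you select (see step 4):
// Before joining the channel: disable camera stream ingest.
mAliRtcEngine.publishLocalVideoStream(false);
// ... join the channel with your existing join logic ...
// After the join succeeds: start screen capture and ingest the screen sharing stream.
mAliRtcEngine.startScreenShare(null, screenShareMode);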
2.2. Push both the camera and screen sharing streams
Camera stream ingest is enabled by default. If it is disabled, call publishLocalVideoStream(true) to enable it. After joining a channel, call startScreenShare to start screen capture and ingest the screen sharing stream.
mAliRtcEngine.publishLocalVideoStream(true);
3. Configure the screen sharing stream encoder
To customize the encoding properties of the screen sharing stream, call the setScreenShareEncoderConfiguration method. You can configure properties such as resolution, frame rate, bitrate, keyframe interval (GOP), and video orientation.
You can call this method before or after joining a channel. To set the encoding properties only once per session, call it before joining. To update the configuration during a session, you can call it again as needed.
The following table describes the configuration parameters:
Parameter | Description | Value
dimensions | The video resolution. | Default value: 0×0, which indicates that the ingested stream's resolution matches the screen capture resolution. Maximum value: 3840×2160.
frameRate | The video frame rate, in fps. | Default value: 5. Maximum value: 30.
bitrate | The video encoding bitrate, in Kbps. Note: The bitrate must be within a reasonable range for the configured resolution and frame rate. Otherwise, the SDK automatically adjusts it to a valid value. | Default value: 512.
keyFrameInterval | The keyframe interval (GOP), in milliseconds. | Default value: 0, which indicates that the SDK internally controls the keyframe interval.
forceStrictKeyFrameInterval | Specifies whether to force the encoder to generate keyframes strictly at the set interval. | Default value: false. Valid values: true and false.
rotationMode | The orientation of the ingested stream. | Default value: AliRtcRotationMode_0. You can choose 0, 90, 180, or 270 degrees.
Sample code:
AliRtcEngine.AliRtcScreenShareEncoderConfiguration screenShareEncoderConfiguration = new AliRtcEngine.AliRtcScreenShareEncoderConfiguration();
// Encoding bitrate, in Kbps.
screenShareEncoderConfiguration.bitrate = bitrate;
// Output resolution; 0 x 0 follows the screen capture resolution.
screenShareEncoderConfiguration.dimensions = new AliRtcEngine.AliRtcVideoDimensions(width, height);
// Frame rate, in fps.
screenShareEncoderConfiguration.frameRate = fps;
// Keyframe interval (GOP), in milliseconds; 0 lets the SDK decide.
screenShareEncoderConfiguration.keyFrameInterval = gop;
// Whether to force keyframes strictly at the configured interval.
screenShareEncoderConfiguration.forceStrictKeyFrameInterval = mForceKeyFrameSwitch.isChecked();
mAliRtcEngine.setScreenShareEncoderConfiguration(screenShareEncoderConfiguration);
4. Start screen capture
Call the startScreenShare method to start the screen capture process, which captures the device's screen and ingests the stream into the channel. Configure the parameters based on your use case:
intent: The intent used to start screen sharing. If you do not provide a custom intent, passing null is recommended.
screenShareMode: The screen sharing mode. You can choose to share only audio, only video, or both.
When your app calls this method, the system displays a dialog prompting the user to grant permission for the app to capture the screen. This dialog is managed by the system. User approval is required to proceed.
Button mStartScreenShareBtn = findViewById(R.id.start_screen_share_btn);
mStartScreenShareBtn.setOnClickListener(v -> {
if(mAliRtcEngine != null) {
// Configure the screen sharing stream encoder.
getScreenShareEncoderConfiguration();
mAliRtcEngine.setScreenShareEncoderConfiguration(screenShareEncoderConfiguration);
// Start screen capture.
mAliRtcEngine.startScreenShare(null, screenShareMode);
}
});
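The click listener above uses a field and a helper method that are not shown in this snippet. A minimal sketch of what they might look like follows; the names mirror the sample code, and the specific encoder values are illustrative only:
// Encoder configuration applied before screen capture starts (see step 3).
private AliRtcEngine.AliRtcScreenShareEncoderConfiguration screenShareEncoderConfiguration;
private void getScreenShareEncoderConfiguration() {
    screenShareEncoderConfiguration = new AliRtcEngine.AliRtcScreenShareEncoderConfiguration();
    // 0 x 0 keeps the ingested resolution equal to the screen capture resolution.
    screenShareEncoderConfiguration.dimensions = new AliRtcEngine.AliRtcVideoDimensions(0, 0);
    screenShareEncoderConfiguration.frameRate = 15;   // frames per second
    screenShareEncoderConfiguration.bitrate = 1500;   // Kbps
}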
5. Stop screen capture
Call the stopScreenShare method to stop screen capture. This terminates the ingestion of the screen sharing stream and releases the associated resources.
Button mStopScreenShareBtn = findViewById(R.id.stop_screen_share_btn);
mStopScreenShareBtn.setOnClickListener(v -> {
if(mAliRtcEngine != null && mAliRtcEngine.isScreenSharePublished()) {
mAliRtcEngine.stopScreenShare();
}
});
6. View the shared screen
When a user starts screen sharing, other users in the channel receive the onRemoteTrackAvailableNotify callback. This event notifies the client when a remote user publishes or unpublishes their camera or screen sharing stream. Your application can use the parameters in this callback to dynamically create or destroy the render view associated with the stream.
The AliRtcVideoTrack enum in the onRemoteTrackAvailableNotify callback identifies the state of the remote video stream:
AliRtcVideoTrackNo (0): The remote user is not publishing any video stream.
AliRtcVideoTrackCamera (1): Only the camera stream is published.
AliRtcVideoTrackScreen (2): Only the screen sharing stream is published.
AliRtcVideoTrackBoth (3): Both the camera and screen sharing streams are published.
Based on this value, your application can adjust its UI logic accordingly, such as rendering camera and screen sharing streams in separate views or removing views for streams that are no longer available.
private AliRtcEngineNotify mRtcEngineNotify = new AliRtcEngineNotify() {
// Set the render view for the remote video stream in the onRemoteTrackAvailableNotify callback.
@Override
public void onRemoteTrackAvailableNotify(String uid, AliRtcEngine.AliRtcAudioTrack audioTrack, AliRtcEngine.AliRtcVideoTrack videoTrack){
handler.post(new Runnable() {
@Override
public void run() {
if(videoTrack == AliRtcVideoTrackCamera || videoTrack == AliRtcVideoTrackScreen) {
viewRemoteVideo(uid, videoTrack);
} else if (videoTrack == AliRtcVideoTrackBoth) {
viewRemoteVideo(uid, AliRtcVideoTrackCamera);
viewRemoteVideo(uid, AliRtcVideoTrackScreen);
} else if(videoTrack == AliRtcVideoTrackNo) {
removeAllRemoteVideo(uid);
}
}
});
}
};
/**
* Display the remote video stream (camera or screen sharing).
* @param uid The remote user ID.
* @param videoTrack The video stream type.
*/
private void viewRemoteVideo(String uid, AliRtcEngine.AliRtcVideoTrack videoTrack) {
String streamKey = getStreamKey(uid, videoTrack);
FrameLayout view;
if (remoteViews.containsKey(streamKey)) {
view = remoteViews.get(streamKey);
if (view != null) {
view.removeAllViews();
} else {
view = createVideoView(streamKey);
gridVideoContainer.addView(view);
remoteViews.put(streamKey, view);
}
} else {
view = createVideoView(streamKey);
gridVideoContainer.addView(view);
remoteViews.put(streamKey, view);
}
// Create a SurfaceView and set it for rendering.
SurfaceView surfaceView = mAliRtcEngine.createRenderSurfaceView(this);
surfaceView.setZOrderMediaOverlay(true);
view.addView(surfaceView, new FrameLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT));
// Configure the canvas.
AliRtcEngine.AliRtcVideoCanvas videoCanvas = new AliRtcEngine.AliRtcVideoCanvas();
videoCanvas.view = surfaceView;
mAliRtcEngine.setRemoteViewConfig(videoCanvas, uid, videoTrack);
}
/**
* Removes the view for a specified user's video stream.
* @param uid The remote user ID.
* @param videoTrack The video stream type.
*/
private void removeRemoteVideo(String uid, AliRtcEngine.AliRtcVideoTrack videoTrack) {
String streamKey = getStreamKey(uid, videoTrack);
// Find the corresponding FrameLayout container and remove the view.
FrameLayout frameLayout = remoteViews.remove(streamKey);
if(frameLayout != null) {
frameLayout.removeAllViews();
gridVideoContainer.removeView(frameLayout);
Log.d("RemoveRemoteVideo", "Removed video stream for: " + streamKey);
}
}
/**
* Removes all video views for a specified user.
* @param uid The remote user ID.
*/
private void removeAllRemoteVideo(String uid) {
removeRemoteVideo(uid, AliRtcVideoTrackCamera);
removeRemoteVideo(uid, AliRtcVideoTrackScreen);
}
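The code above references a remoteViews map, a gridVideoContainer layout, and two helper methods, getStreamKey and createVideoView, that are not shown in this snippet. A minimal sketch of these members, assuming the code runs inside an Activity, might look like this:
import java.util.HashMap;
import java.util.Map;
// Maps uid + video track to the container that hosts the corresponding render view.
private final Map<String, FrameLayout> remoteViews = new HashMap<>();
// One key per user and video track, so camera and screen sharing views can coexist.
private String getStreamKey(String uid, AliRtcEngine.AliRtcVideoTrack videoTrack) {
    return uid + "_" + videoTrack;
}
// Creates an empty container; gridVideoContainer is the parent layout defined in your layout XML.
private FrameLayout createVideoView(String streamKey) {
    FrameLayout container = new FrameLayout(this);
    container.setTag(streamKey);
    return container;
}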
7. (Optional) Configure the screen sharing audio volume
When you share system audio, call the setAudioShareVolume method to control the volume of the screen sharing audio stream.
mAliRtcEngine.setAudioShareVolume(60);