
ApsaraVideo Live: Implement audio/video communications on Android

Last Updated: Dec 17, 2025

This guide shows how to integrate the ARTC SDK into your Android project to build a real-time audio and video application, suitable for use cases such as interactive live streaming and video calls.

Feature description

Before you begin, understand the following key concepts:

  • ARTC SDK: An SDK provided by Alibaba Cloud that helps developers quickly implement real-time audio and video interaction.

  • Global Realtime Transport Network (GRTN): A globally distributed network engineered for real-time media, ensuring ultra-low latency, high-quality, and secure communication.

  • Channel: A virtual room that users join to communicate with each other. All users in the same channel can interact in real time.

  • Host: A user who can publish audio and video streams in a channel and subscribe to streams published by other hosts.

  • Viewer: A user who can subscribe to audio and video streams in a channel but cannot publish their own.

Basic process for implementing real-time audio and video interaction:

  1. Call setChannelProfile to set the scenario, and call joinChannel to join a channel:

    • Video call scenario: All users are hosts and can both publish and subscribe to streams.

    • Interactive streaming scenario: Set each user's role with setClientRole before joining the channel: host for users who will publish streams, viewer for users who only subscribe.

  2. After joining the channel, users have different publishing and subscribing behaviors based on their roles:

    • All users can receive audio and video streams within that channel.

    • A host can publish audio and video streams in the channel.

    • If a viewer wants to publish streams, call the setClientRole method to switch the role to host.
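The steps above map onto the SDK calls used in the full example later in this guide. The following Java-style sketch shows the call order for the interactive streaming scenario; it is not runnable on its own, because it assumes an Android context and a valid authentication token:

```java
// Sketch only: requires the ARTC SDK, an Android Context, and a token.
AliRtcEngine engine = AliRtcEngine.getInstance(context);
// 1. Set the scenario before joining.
engine.setChannelProfile(AliRtcEngine.AliRTCSdkChannelProfile.AliRTCSdkInteractiveLive);
// 2. Set the role: AliRTCSdkInteractive to publish and subscribe (host),
//    AliRTCSdkLive to only subscribe (viewer).
engine.setClientRole(AliRtcEngine.AliRTCSdkClientRole.AliRTCSdkInteractive);
// 3. Join the channel with an authentication token.
engine.joinChannel(token, null, null, null);
// A viewer who later wants to publish switches roles:
// engine.setClientRole(AliRtcEngine.AliRTCSdkClientRole.AliRTCSdkInteractive);
```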

Sample project

ARTC SDK provides an open-source sample project for real-time audio and video apps.

Environment requirements

Before you run the sample project, make sure your development environment meets the following requirements:

  • Development tool: Android Studio 2020.3.1 or later.

  • Test device: A device running Android 5.0 (API level 21) or later.

    Note

    We recommend using a physical device for testing, as some emulators may lack the required functionality.

  • Network: A stable internet connection.

  • Application: Obtain the AppID and AppKey for your ARTC application. For details, see Create an ARTC application.

Create a project (optional)

This section shows how to create a project and add the necessary permissions for audio and video interaction. You can skip this section if you already have a project.

  1. Open Android Studio and select New Project.

  2. Select Phone and Tablet and choose a starter template. This example uses Empty Views Activity.


  3. Configure your project information, including the project name, package name, save location, language (Java in this example), and build configuration language (Groovy DSL in this example).


  4. Click Finish and wait for the project to sync.

Configure the project

Step 1: Import SDK

Automatic integration through Maven (recommended)

  1. Open the settings.gradle file in the root directory of your project and add the Maven repository for the ARTC SDK to the dependencyResolutionManagement/repositories block, as shown in the following example:

dependencyResolutionManagement {
    repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
    repositories {
        google()
        mavenCentral()
        // Add the Maven repository for the ARTC SDK
        maven { url 'https://maven.aliyun.com/repository/google' }
        maven { url 'https://maven.aliyun.com/repository/public' }
    }
}

Note: If you use a version of the Android Gradle Plugin earlier than 7.1.0, the settings.gradle file may not contain this block. For more information, see Android Gradle Plugin 7.1. In this case, use the following alternative solution:

Alternative solution

Open the build.gradle file in the root directory of your project and add the Maven repository to allprojects/repositories, as shown in the following example:

allprojects {
    repositories {
        ...
        // Add the Maven repository for the ARTC SDK
        maven { url 'https://maven.aliyun.com/repository/google' }
        maven { url 'https://maven.aliyun.com/repository/public' }
    }
}
  2. Open the app/build.gradle file and add the ARTC SDK dependency to dependencies. Obtain the latest version from Download SDK and replace ${latest_version} with the specific version number. The latest available version is 7.9.1.

dependencies {
    // Add the dependencies for the ARTC SDK
    // Replace ${latest_version} with the specific version number.
    implementation 'com.aliyun.aio:AliVCSDK_ARTC:${latest_version}'
    // For versions 7.4.0 and below, add the keep dependency
    // implementation 'com.aliyun.aio.keep:keep:1.0.1'
}

If you are using Android Gradle Plugin version 8.1 or later, Android Studio recommends migrating dependency library information to the version catalog. For more information, see Migrate dependencies.
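If you migrate to a version catalog, the dependency above could be declared as follows. The alias names are illustrative, not prescribed by the SDK:

```toml
# gradle/libs.versions.toml (alias names are illustrative)
[versions]
alivcArtc = "7.9.1"

[libraries]
alivc-artc = { module = "com.aliyun.aio:AliVCSDK_ARTC", version.ref = "alivcArtc" }
```

In app/build.gradle, the dependency then becomes `implementation libs.alivc.artc`.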

Manual integration

  1. Download the ARTC SDK AAR file for the version you need (AliVCSDK_ARTC-x.y.z.aar). The latest available version is 7.9.1.

  2. Copy the downloaded AAR file to your project directory, such as app/libs. Create this folder if it is not available.

  3. Open the settings.gradle file in the root directory of your project and add the folder containing the AAR file to the dependencyResolutionManagement/repositories block, as shown in the following example:

dependencyResolutionManagement {
    repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
    repositories {
        google()
        mavenCentral()
        // Add the relative directory where the ARTC SDK is located
        flatDir {
            dir 'app/libs'
        }
    }
}

Note: If you use a version of the Android Gradle Plugin earlier than 7.1.0, the settings.gradle file may not contain this block. For more information, see Android Gradle Plugin 7.1. In this case, use the following alternative solution:

Open the build.gradle file in the root directory of your project and add the following field to allprojects/repositories:

allprojects {
    repositories {
        ...
        // Add the relative directory where the ARTC SDK is located
        flatDir {
            dir 'app/libs'
        }
    }
}
  4. Open the app/build.gradle file and add the dependency for the AAR file to dependencies. Sample code:

// Replace x.y.z with the specific version number.
implementation(name:'AliVCSDK_ARTC', version: 'x.y.z', ext:'aar')
  5. Verify that the corresponding dependency appears under External Libraries.


Step 2: Configure supported CPU architectures

Open the app/build.gradle file and specify the CPU architectures supported by your project in defaultConfig, as shown in the following example. Available architectures include armeabi-v7a, arm64-v8a, x86, and x86_64.

android {
    defaultConfig {
        // ...other default configurations
        // Support armeabi-v7a and arm64-v8a architectures
        ndk {
             abiFilters "armeabi-v7a", "arm64-v8a"
        }
    }
}

Step 3: Configure permissions

Configure the permissions required by your application:

Go to the app/src/main directory, open the AndroidManifest.xml file, and add the required permissions.

<uses-feature android:name="android.hardware.camera" android:required="false" /> 
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<!-- Request legacy Bluetooth permissions on older devices. -->
<uses-permission
  android:name="android.permission.BLUETOOTH"
  android:maxSdkVersion="30" />
<uses-permission
  android:name="android.permission.BLUETOOTH_ADMIN"
  android:maxSdkVersion="30" />

<!-- Needed only if your app communicates with already-paired Bluetooth devices. -->
<uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.READ_PHONE_STATE" />
<uses-permission android:name="android.permission.WRITE_SETTINGS"
  tools:ignore="ProtectedPermissions" />
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
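The `tools:ignore` attribute used above requires the `tools` namespace to be declared on the manifest root element. A minimal sketch of the root element:

```xml
<!-- Declare the tools namespace on the root <manifest> element -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools">
    <!-- permission declarations go here -->
</manifest>
```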

Note: On Android 6.0 (API level 23) and later, dangerous permissions must also be requested at runtime, in addition to being declared statically in the AndroidManifest.xml file.

Runtime permissions include:

  • Manifest.permission.CAMERA

  • Manifest.permission.WRITE_EXTERNAL_STORAGE

  • Manifest.permission.RECORD_AUDIO

  • Manifest.permission.READ_EXTERNAL_STORAGE

  • Manifest.permission.READ_PHONE_STATE

For Android 12 (API level 31) and later, the following permission also needs to be requested dynamically:

  • Manifest.permission.BLUETOOTH_CONNECT

Permission descriptions:

| Permission name | Description | Purpose | Required | Dynamic request |
| --- | --- | --- | --- | --- |
| CAMERA | Camera permission. | Access the device camera to capture video streams. | Yes | Android 6 and later |
| RECORD_AUDIO | Microphone permission. | Access the device microphone to capture audio streams. | Yes | Android 6 and later |
| INTERNET | Network permission. | Transmit audio and video data over the network (for example, over WebRTC). | Yes | No |
| ACCESS_NETWORK_STATE | Allows the application to obtain the network status. | Monitor the network connection status to optimize audio and video transmission quality, such as reconnecting after a disconnection. | No | No |
| ACCESS_WIFI_STATE | Allows the application to obtain the Wi-Fi status. | Obtain information about the current Wi-Fi connection to optimize network performance. | No | No |
| MODIFY_AUDIO_SETTINGS | Allows the application to modify audio configurations. | Adjust the system volume, switch audio output devices, and so on. | No | No |
| BLUETOOTH | Bluetooth permission (basic functionality). | Connect to Bluetooth devices, such as Bluetooth headsets. | No | No |
| BLUETOOTH_CONNECT | Bluetooth connection permission. | Communicate with paired Bluetooth devices, such as transmitting audio streams. | No | Android 12 and later |
| READ_PHONE_STATE | Allows the application to access information related to the device's phone state. | Start or stop audio based on the phone state. | No | Android 6 and later |
| READ_EXTERNAL_STORAGE | Allows the application to read files from external storage. | Play local music, and so on. | No | Android 6 and later |
| WRITE_EXTERNAL_STORAGE | Allows the application to write to external storage. | Save audio and video files, logs, and so on. | No | Android 6 and later |

Step 4: Prevent code obfuscation (optional)

In the app/proguard-rules.pro file, configure keep rules for the SDK to prevent its interfaces from being obfuscated, which would cause function calls to fail.

-keep class com.aliyun.allinone.** { *; }

-keep class com.aliyun.rts.network.AliHttpTool { *; }

-keep class com.aliyun.common.AlivcBase { *; }

-keep class com.huawei.multimedia.alivc.** { *; }

-keep class com.alivc.rtc.** { *; }

-keep class com.alivc.component.** { *; }

-keep class org.webrtc.** { *; }

Step 5: Create a user interface

Create a user interface that suits your scenario. The following sample code for a video call scenario creates two views to display the local and remote video streams. You can use it as a reference during development.

User interface code example

<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:id="@+id/video_chat_main"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".VideoCall.VideoCallActivity"
    >
    <LinearLayout
        android:id="@+id/ll_channel_layout"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        app:layout_constraintTop_toTopOf="parent"
        app:layout_constraintLeft_toLeftOf="parent"
        app:layout_constraintRight_toRightOf="parent"
        app:layout_constraintBottom_toTopOf="@id/ll_video_layout"
        android:orientation="vertical">

        <LinearLayout
            android:id="@+id/ll_channel_desc"
            android:layout_width="match_parent"
            android:layout_height="wrap_content"
            android:orientation="horizontal"
            android:layout_marginTop="12dp"
            android:layout_marginLeft="8dp"
            android:layout_marginRight="12dp"
        >
            <TextView
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:text="@string/video_chat_channel_desc"
                />

        </LinearLayout>
        <LinearLayout
            android:id="@+id/ll_channel_id"
            android:layout_width="match_parent"
            android:layout_height="wrap_content"
            android:orientation="horizontal"
            android:layout_marginTop="12dp"
            android:layout_marginLeft="8dp"
            android:layout_marginRight="12dp"
            app:layout_constraintTop_toTopOf="parent"
            app:layout_constraintLeft_toLeftOf="parent"
            app:layout_constraintRight_toRightOf="parent"
            android:visibility="visible">
            <TextView
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:layout_weight="0"
                android:text="ChannelID:"
                android:layout_marginTop="5dp"
                />
            <EditText
                android:id="@+id/channel_id_input"
                android:layout_width="0dp"
                android:layout_height="wrap_content"
                android:layout_weight="1"
                android:text=""
                android:padding="5dp"
                android:textSize="15sp"
                android:layout_marginLeft="10dp"
                android:layout_marginTop="5dp"
                android:layout_marginRight="10dp"
                android:background="@drawable/edittext_border"
                />
        </LinearLayout>
        <LinearLayout
            android:id="@+id/ll_bottom_bar"
            android:layout_width="match_parent"
            android:layout_height="48dp"
            android:layout_marginTop="20dp"
            android:orientation="horizontal"
            android:gravity="center_vertical"
            app:layout_constraintLeft_toLeftOf="parent"
            app:layout_constraintRight_toRightOf="parent"
            app:layout_constraintTop_toBottomOf="@id/ll_channel_desc"
            app:layout_constraintBottom_toBottomOf="parent">
            <TextView
                android:id="@+id/join_room_btn"
                android:layout_width="0dp"
                android:layout_height="wrap_content"
                android:layout_weight="1"
                android:text="@string/video_chat_join_room"
                android:layout_marginStart="20dp"
                android:layout_marginEnd="20dp"
                android:gravity="center"
                android:padding="10dp"
                android:background="@color/layout_base_blue"
                />

        </LinearLayout>
    </LinearLayout>
    <LinearLayout
        android:id="@+id/ll_video_layout"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:orientation="vertical"
        app:layout_constraintTop_toBottomOf="@id/ll_channel_layout"
        app:layout_constraintLeft_toLeftOf="parent"
        app:layout_constraintRight_toRightOf="parent"
        android:layout_marginTop="10dp"
        android:layout_marginBottom="10dp"
        android:layout_marginLeft="10dp"
        android:layout_marginRight="10dp"
        >

        <LinearLayout
            android:id="@+id/video_layout_1"
            android:layout_width="match_parent"
            android:layout_height="0dp"
            android:layout_weight="0.5"
            android:orientation="horizontal">

            <FrameLayout
                android:id="@+id/fl_local"
                android:layout_width="108dp"
                android:layout_weight="0.5"
                android:layout_height="192dp"
                />
            <FrameLayout
                android:id="@+id/fl_remote"
                android:layout_marginLeft="5dp"
                android:layout_width="108dp"
                android:layout_weight="0.5"
                android:layout_height="192dp"
                />

        </LinearLayout>

        <LinearLayout
            android:id="@+id/video_layout_2"
            android:layout_width="match_parent"
            android:layout_height="0dp"
            android:layout_weight="0.5"
            android:layout_marginTop="10dp"
            android:orientation="horizontal">

            <FrameLayout
                android:id="@+id/fl_remote2"
                android:layout_width="108dp"
                android:layout_weight="0.5"
                android:layout_height="192dp"
                />
            <FrameLayout
                android:id="@+id/fl_remote3"
                android:layout_marginLeft="5dp"
                android:layout_width="108dp"
                android:layout_weight="0.5"
                android:layout_height="192dp"
                />

        </LinearLayout>
    </LinearLayout>
</androidx.constraintlayout.widget.ConstraintLayout>

Implementation

This section explains how to use the ARTC SDK to build a basic real-time audio and video application. You can copy the complete code sample into your project to test the functionality. The steps below explain the core API calls.

The following diagram shows the basic workflow for implementing a video call:


Code example for a video call scenario

Code example

/**
 * API call example for a video call scenario.
 */
public class VideoCallActivity extends AppCompatActivity {

    private Handler handler;
    private EditText mChannelEditText;
    private TextView mJoinChannelTextView;
    private boolean hasJoined = false;
    private FrameLayout fl_local, fl_remote, fl_remote_2, fl_remote_3;

    private AliRtcEngine mAliRtcEngine = null;
    private AliRtcEngine.AliRtcVideoCanvas mLocalVideoCanvas = null;
    private Map<String, ViewGroup> remoteViews = new ConcurrentHashMap<String, ViewGroup>();

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        handler = new Handler(Looper.getMainLooper());
        EdgeToEdge.enable(this);
        setContentView(R.layout.activity_video_chat);
        ViewCompat.setOnApplyWindowInsetsListener(findViewById(R.id.video_chat_main), (v, insets) -> {
            Insets systemBars = insets.getInsets(WindowInsetsCompat.Type.systemBars());
            v.setPadding(systemBars.left, systemBars.top, systemBars.right, systemBars.bottom);
            return insets;
        });
        setTitle(getString(R.string.video_chat));
        getSupportActionBar().setDisplayHomeAsUpEnabled(true);

        fl_local = findViewById(R.id.fl_local);
        fl_remote = findViewById(R.id.fl_remote);
        fl_remote_2 = findViewById(R.id.fl_remote2);
        fl_remote_3 = findViewById(R.id.fl_remote3);

        mChannelEditText = findViewById(R.id.channel_id_input);
        mChannelEditText.setText(GlobalConfig.getInstance().gerRandomChannelId());
        mJoinChannelTextView = findViewById(R.id.join_room_btn);
        mJoinChannelTextView.setOnClickListener(v -> {
            if(hasJoined) {
                destroyRtcEngine();
                mJoinChannelTextView.setText(R.string.video_chat_join_room);
            } else {
                startRTCCall();
            }
        });
    }

    public static void startActionActivity(Activity activity) {
        Intent intent = new Intent(activity, VideoCallActivity.class);
        activity.startActivity(intent);
    }

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        if (item.getItemId() == android.R.id.home) {
            // Action to take when the back button is clicked
            destroyRtcEngine();
            finish();
            return true;
        }
        return super.onOptionsItemSelected(item);
    }

    private FrameLayout getAvailableView() {
        if (fl_remote.getChildCount() == 0) {
            return fl_remote;
        } else if (fl_remote_2.getChildCount() == 0) {
            return fl_remote_2;
        } else if (fl_remote_3.getChildCount() == 0) {
            return fl_remote_3;
        } else {
            return null;
        }
    }

    private void handleJoinResult(int result, String channel, String userId) {
        handler.post(() -> {
            String str;
            if(result == 0) {
                str = "User " + userId + " Join " + channel + " Success";
            } else {
                str = "User " + userId + " Join " + channel + " Failed!, error: " + result;
            }
            ToastHelper.showToast(this, str, Toast.LENGTH_SHORT);
            ((TextView)findViewById(R.id.join_room_btn)).setText(R.string.leave_channel);
        });
    }

    private void startRTCCall() {
        if(hasJoined) {
            return;
        }
        initAndSetupRtcEngine();
        startPreview();
        joinChannel();
    }

    private void initAndSetupRtcEngine() {

        // Create and initialize the engine
        if(mAliRtcEngine == null) {
            mAliRtcEngine = AliRtcEngine.getInstance(this);
        }
        mAliRtcEngine.setRtcEngineEventListener(mRtcEngineEventListener);
        mAliRtcEngine.setRtcEngineNotify(mRtcEngineNotify);


        // Set the channel profile to Interactive Mode. For RTC, always use AliRTCSdkInteractiveLive.
        mAliRtcEngine.setChannelProfile(AliRtcEngine.AliRTCSdkChannelProfile.AliRTCSdkInteractiveLive);
        // Set the user role. To both publish and subscribe, use AliRTCSdkInteractive. To only subscribe, use AliRTCSdkLive.
        mAliRtcEngine.setClientRole(AliRtcEngine.AliRTCSdkClientRole.AliRTCSdkInteractive);
        // Set the audio profile. The default is high-quality mode (AliRtcEngineHighQualityMode) and music scenario (AliRtcSceneMusicMode).
        mAliRtcEngine.setAudioProfile(AliRtcEngine.AliRtcAudioProfile.AliRtcEngineHighQualityMode, AliRtcEngine.AliRtcAudioScenario.AliRtcSceneMusicMode);
        mAliRtcEngine.setCapturePipelineScaleMode(AliRtcEngine.AliRtcCapturePipelineScaleMode.AliRtcCapturePipelineScaleModePost);

        // Set video encoding parameters
        AliRtcEngine.AliRtcVideoEncoderConfiguration aliRtcVideoEncoderConfiguration = new AliRtcEngine.AliRtcVideoEncoderConfiguration();
        aliRtcVideoEncoderConfiguration.dimensions = new AliRtcEngine.AliRtcVideoDimensions(
                720, 1280);
        aliRtcVideoEncoderConfiguration.frameRate = 20;
        aliRtcVideoEncoderConfiguration.bitrate = 1200;
        aliRtcVideoEncoderConfiguration.keyFrameInterval = 2000;
        aliRtcVideoEncoderConfiguration.orientationMode = AliRtcVideoEncoderOrientationModeAdaptive;
        mAliRtcEngine.setVideoEncoderConfiguration(aliRtcVideoEncoderConfiguration);

        // The SDK publishes audio by default; calling publishLocalAudioStream(true) explicitly is optional and shown here for clarity.
        mAliRtcEngine.publishLocalAudioStream(true);
        // For video calls, you don't need to call publishLocalVideoStream(true) as the SDK publishes video by default.
        // For audio-only calls, you need to call publishLocalVideoStream(false) to disable video publishing.
        mAliRtcEngine.publishLocalVideoStream(true);

        // Set default subscription to remote audio and video streams.
        mAliRtcEngine.setDefaultSubscribeAllRemoteAudioStreams(true);
        mAliRtcEngine.subscribeAllRemoteAudioStreams(true);
        mAliRtcEngine.setDefaultSubscribeAllRemoteVideoStreams(true);
        mAliRtcEngine.subscribeAllRemoteVideoStreams(true);

    }

    private void startPreview(){
        if (mAliRtcEngine != null) {

            if (fl_local.getChildCount() > 0) {
                fl_local.removeAllViews();
            }

            findViewById(R.id.ll_video_layout).setVisibility(VISIBLE);
            ViewGroup.LayoutParams layoutParams = new FrameLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT);
            if(mLocalVideoCanvas == null) {
                mLocalVideoCanvas = new AliRtcEngine.AliRtcVideoCanvas();
                SurfaceView localSurfaceView = mAliRtcEngine.createRenderSurfaceView(VideoCallActivity.this);
                localSurfaceView.setZOrderOnTop(true);
                localSurfaceView.setZOrderMediaOverlay(true);
                fl_local.addView(localSurfaceView, layoutParams);
                mLocalVideoCanvas.view = localSurfaceView;
                mAliRtcEngine.setLocalViewConfig(mLocalVideoCanvas, AliRtcVideoTrackCamera);
                mAliRtcEngine.startPreview();
            }
        }
    }

    private void joinChannel() {
        String channelId = mChannelEditText.getText().toString();
        if(!TextUtils.isEmpty(channelId)) {
            String userId = GlobalConfig.getInstance().getUserId();
            String appId = ARTCTokenHelper.AppId;
            String appKey = ARTCTokenHelper.AppKey;
            long timestamp = ARTCTokenHelper.getTimesTamp();
            String token = ARTCTokenHelper.generateSingleParameterToken(appId, appKey, channelId, userId, timestamp);
            mAliRtcEngine.joinChannel(token, null, null, null);
            hasJoined = true;
        } else {
            Log.e("VideoCallActivity", "channelId is empty");
        }
    }


    private AliRtcEngineEventListener mRtcEngineEventListener = new AliRtcEngineEventListener() {
        @Override
        public void onJoinChannelResult(int result, String channel, String userId, int elapsed) {
            super.onJoinChannelResult(result, channel, userId, elapsed);
            handleJoinResult(result, channel, userId);
        }

        @Override
        public void onLeaveChannelResult(int result, AliRtcEngine.AliRtcStats stats){
            super.onLeaveChannelResult(result, stats);
        }

        @Override
        public void onConnectionStatusChange(AliRtcEngine.AliRtcConnectionStatus status, AliRtcEngine.AliRtcConnectionStatusChangeReason reason){
            super.onConnectionStatusChange(status, reason);

            handler.post(new Runnable() {
                @Override
                public void run() {
                    if(status == AliRtcEngine.AliRtcConnectionStatus.AliRtcConnectionStatusFailed) {
                        /* TODO: Must handle. We recommend notifying the user. This is reported only after the SDK's internal recovery strategies have failed. */
                        ToastHelper.showToast(VideoCallActivity.this, R.string.video_chat_connection_failed, Toast.LENGTH_SHORT);
                    } else {
                        /* TODO: Optional. Add business logic here, typically for data analytics or UI changes. */
                    }
                }
            });
        }
        @Override
        public void OnLocalDeviceException(AliRtcEngine.AliRtcEngineLocalDeviceType deviceType, AliRtcEngine.AliRtcEngineLocalDeviceExceptionType exceptionType, String msg){
            super.OnLocalDeviceException(deviceType, exceptionType, msg);
            /* TODO: Must handle. We recommend notifying the user of the device error. This is reported only after the SDK's internal recovery strategies have failed. */
            handler.post(new Runnable() {
                @Override
                public void run() {
                    String str = "OnLocalDeviceException deviceType: " + deviceType + " exceptionType: " + exceptionType + " msg: " + msg;
                    ToastHelper.showToast(VideoCallActivity.this, str, Toast.LENGTH_SHORT);
                }
            });
        }

    };

    private AliRtcEngineNotify mRtcEngineNotify = new AliRtcEngineNotify() {
        @Override
        public void onAuthInfoWillExpire() {
            super.onAuthInfoWillExpire();
            /* TODO: Must handle. When this callback is triggered, retrieve a new token for the current user and channel, then call refreshAuthInfo to update it. */
        }

        @Override
        public void onRemoteUserOnLineNotify(String uid, int elapsed){
            super.onRemoteUserOnLineNotify(uid, elapsed);
        }

        // Unset the remote video stream renderer in the onRemoteUserOffLineNotify callback.
        @Override
        public void onRemoteUserOffLineNotify(String uid, AliRtcEngine.AliRtcUserOfflineReason reason){
            super.onRemoteUserOffLineNotify(uid, reason);
        }

        // Set the remote video stream renderer in the onRemoteTrackAvailableNotify callback.
        @Override
        public void onRemoteTrackAvailableNotify(String uid, AliRtcEngine.AliRtcAudioTrack audioTrack, AliRtcEngine.AliRtcVideoTrack videoTrack){
            handler.post(new Runnable() {
                @Override
                public void run() {
                    if(videoTrack == AliRtcVideoTrackCamera) {
                        SurfaceView surfaceView = mAliRtcEngine.createRenderSurfaceView(VideoCallActivity.this);
                        surfaceView.setZOrderMediaOverlay(true);
                        FrameLayout view = getAvailableView();
                        if (view == null) {
                            return;
                        }
                        remoteViews.put(uid, view);
                        view.addView(surfaceView, new FrameLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT));
                        AliRtcEngine.AliRtcVideoCanvas remoteVideoCanvas = new AliRtcEngine.AliRtcVideoCanvas();
                        remoteVideoCanvas.view = surfaceView;
                        mAliRtcEngine.setRemoteViewConfig(remoteVideoCanvas, uid, AliRtcVideoTrackCamera);
                    } else if(videoTrack == AliRtcVideoTrackNo) {
                        if(remoteViews.containsKey(uid)) {
                            ViewGroup view = remoteViews.get(uid);
                            if(view != null) {
                                view.removeAllViews();
                                remoteViews.remove(uid);
                                mAliRtcEngine.setRemoteViewConfig(null, uid, AliRtcVideoTrackCamera);
                            }
                        }
                    }
                }
            });
        }

        /*  Your app must also handle cases where multiple devices attempt to join with the same UserID. */
        @Override
        public void onBye(int code){
            handler.post(new Runnable() {
                @Override
                public void run() {
                    String msg = "onBye code:" + code;
                    ToastHelper.showToast(VideoCallActivity.this, msg, Toast.LENGTH_SHORT);
                }
            });
        }

    };

    private void destroyRtcEngine() {
        if( mAliRtcEngine != null) {
            mAliRtcEngine.stopPreview();
            mAliRtcEngine.setLocalViewConfig(null, AliRtcVideoTrackCamera);
            mAliRtcEngine.leaveChannel();
            mAliRtcEngine.destroy();
            mAliRtcEngine = null;

            handler.post(() -> {
                ToastHelper.showToast(this, "Leave Channel", Toast.LENGTH_SHORT);
            });
        }
        hasJoined = false;
        for (ViewGroup value : remoteViews.values()) {
            value.removeAllViews();
        }
        remoteViews.clear();
        findViewById(R.id.ll_video_layout).setVisibility(View.GONE);
        fl_local.removeAllViews();
        mLocalVideoCanvas = null;
    }
}

For details on the complete sample code, see Run ARTC demo project for Android.

1. Request permissions

When starting a video call, check if the required permissions have been granted in the app:

private static final int REQUEST_PERMISSION_CODE = 101;

private static final String[] PERMISSION_MANIFEST = {
    Manifest.permission.RECORD_AUDIO,
    Manifest.permission.READ_PHONE_STATE,
    Manifest.permission.WRITE_EXTERNAL_STORAGE,
    Manifest.permission.READ_EXTERNAL_STORAGE,
    Manifest.permission.CAMERA
};

private static final String[] PERMISSION_MANIFEST33 = {
    Manifest.permission.RECORD_AUDIO,
    Manifest.permission.READ_PHONE_STATE,
    Manifest.permission.CAMERA
};

private static String[] getPermissions() {
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.TIRAMISU) {
        return PERMISSION_MANIFEST;
    }
    return PERMISSION_MANIFEST33;
}

public boolean checkOrRequestPermission() {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED
                || ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
            requestPermissions(getPermissions(), REQUEST_PERMISSION_CODE);
            return false;
        }
    }
    return true;
}
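The check above only requests the permissions; the grant results arrive asynchronously in your activity's onRequestPermissionsResult callback. The core of that handler is verifying that every entry in grantResults equals PackageManager.PERMISSION_GRANTED. A minimal, framework-free sketch of that check (the class and method names are illustrative; the inlined constant 0 matches the value of PackageManager.PERMISSION_GRANTED):

```java
// Sketch of the grant-result check performed in onRequestPermissionsResult.
// The constant is inlined so the logic can be shown without Android framework
// dependencies; in app code use PackageManager.PERMISSION_GRANTED directly.
public final class PermissionResultCheck {
    private static final int PERMISSION_GRANTED = 0; // PackageManager.PERMISSION_GRANTED

    // Returns true only if every requested permission was granted.
    // An empty or null array means the request was interrupted or cancelled.
    public static boolean allPermissionsGranted(int[] grantResults) {
        if (grantResults == null || grantResults.length == 0) {
            return false;
        }
        for (int result : grantResults) {
            if (result != PERMISSION_GRANTED) {
                return false;
            }
        }
        return true;
    }
}
```

In onRequestPermissionsResult, when the request code matches REQUEST_PERMISSION_CODE and this check fails, prompt the user to grant camera and microphone access before retrying the call flow.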

2. Get an authentication token

Joining an ARTC channel requires an authentication token to verify the user's identity. For details on how the token is generated, see Implement token-based authentication. A token can be generated using a single-parameter method or a multi-parameter method. The method you use determines which joinChannel API you need to call.

For production environments:

Because generating a token requires your AppKey, hardcoding it on the client side poses a security risk. In a production environment, we strongly recommend generating the token on your server and sending it to the client.

For development and debugging:

During development, if your business server does not yet have the logic to generate tokens, you can temporarily use the token generation logic from the APIExample to create a temporary token. The reference code is as follows:

public final class ARTCTokenHelper {
    /**
     * RTC AppId
     */
    public static String AppId = "";

    /**
     * RTC AppKey
     */
    public static String AppKey = "";

    /**
     * Generate a single-parameter token for joining a meeting based on channelId, userId, timestamp, and nonce.
     */
    public static String generateSingleParameterToken(String appId, String appKey, String channelId, String userId, long timestamp,  String nonce) {

        StringBuilder stringBuilder = new StringBuilder()
                .append(appId)
                .append(appKey)
                .append(channelId)
                .append(userId)
                .append(timestamp);
        String token =  getSHA256(stringBuilder.toString());
        try{
            JSONObject tokenJson = new JSONObject();
            tokenJson.put("appid", appId);
            tokenJson.put("channelid", channelId);
            tokenJson.put("userid", userId);
            tokenJson.put("nonce", nonce);
            tokenJson.put("timestamp", timestamp);
            tokenJson.put("token", token);
            String base64Token = Base64.encodeToString(tokenJson.toString().getBytes(StandardCharsets.UTF_8), Base64.NO_WRAP);
            return base64Token;
        }catch (Exception e) {
            e.printStackTrace();
        }
        return null;
    }

    /**
     * Generate a single-parameter token for joining a meeting based on channelId, userId, and timestamp.
     */
    public static String generateSingleParameterToken(String appId, String appKey, String channelId, String userId, long timestamp) {
        return generateSingleParameterToken(appId, appKey, channelId, userId, timestamp, "");
    }

    public static String getSHA256(String str) {
        try {
            MessageDigest messageDigest = MessageDigest.getInstance("SHA-256");
            byte[] hash = messageDigest.digest(str.getBytes(StandardCharsets.UTF_8));
            return byte2Hex(hash);
        } catch (NoSuchAlgorithmException e) {
            // Consider logging the exception and/or re-throwing as a RuntimeException
            e.printStackTrace();
        }
        return "";
    }

    private static String byte2Hex(byte[] bytes) {
        StringBuilder stringBuilder = new StringBuilder();
        for (byte b : bytes) {
            String hex = Integer.toHexString(0xff & b);
            if (hex.length() == 1) {
                // Use single quote for char
                stringBuilder.append('0');
            }
            stringBuilder.append(hex);
        }
        return stringBuilder.toString();
    }

    // Returns an expiration timestamp 24 hours from now, in seconds since the epoch.
    public static long getTimesTamp() {
        return System.currentTimeMillis() / 1000 + 60 * 60 * 24;
    }
}
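As noted above, production tokens should be generated on your server. The following is a minimal server-side Java sketch of the same single-parameter algorithm: SHA-256 over the concatenation of appId, appKey, channelId, userId, and timestamp, wrapped in a JSON envelope and Base64-encoded. It uses java.util.Base64 instead of android.util.Base64 and builds the JSON string by hand to stay dependency-free; the class and method names are illustrative, not part of the SDK.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Base64;

// Illustrative server-side counterpart of ARTCTokenHelper (single-parameter token).
public final class ServerTokenSketch {

    // SHA-256(appId + appKey + channelId + userId + timestamp) as lowercase hex.
    public static String signature(String appId, String appKey, String channelId,
                                   String userId, long timestamp) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            byte[] hash = md.digest((appId + appKey + channelId + userId + timestamp)
                    .getBytes(StandardCharsets.UTF_8));
            StringBuilder sb = new StringBuilder();
            for (byte b : hash) {
                sb.append(String.format("%02x", b));
            }
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException("SHA-256 not available", e);
        }
    }

    // Wrap the signature and its inputs in the JSON envelope, then Base64-encode.
    // Field order mirrors the client-side helper: appid, channelid, userid,
    // nonce (empty here), timestamp, token.
    public static String generateToken(String appId, String appKey, String channelId,
                                       String userId, long timestamp) {
        String json = "{\"appid\":\"" + appId + "\",\"channelid\":\"" + channelId
                + "\",\"userid\":\"" + userId + "\",\"nonce\":\"\",\"timestamp\":" + timestamp
                + ",\"token\":\"" + signature(appId, appKey, channelId, userId, timestamp) + "\"}";
        return Base64.getEncoder().encodeToString(json.getBytes(StandardCharsets.UTF_8));
    }
}
```

During debugging you can Base64-decode a token and inspect the JSON fields to confirm that the appid, channelid, and userid the client joined with match what the server signed.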

3. Import ARTC SDK classes

Import the relevant classes and interfaces from the ARTC SDK:

// Import ARTC classes
import com.alivc.rtc.AliRtcEngine;
import com.alivc.rtc.AliRtcEngineEventListener;
import com.alivc.rtc.AliRtcEngineNotify;

4. Create and initialize the engine

  • Create the RTC engine

    Call the getInstance[1/2] method to create an AliRtcEngine instance.

    private AliRtcEngine mAliRtcEngine = null;
    if(mAliRtcEngine == null) {
        mAliRtcEngine = AliRtcEngine.getInstance(this);
    }
  • Initialize the engine

    • Call setChannelProfile to set the channel to AliRTCSdkInteractiveLive (interactive mode).

      Depending on your business needs, you can choose interactive mode, which is suitable for interactive entertainment scenarios, or communication mode, which is suitable for one-to-one or one-to-many broadcasting. Choosing the right mode ensures a smooth user experience and efficient use of network resources.

      Interactive mode

      • Publishing: Limited by role. Only users with the host role can publish streams. Participants can flexibly switch roles throughout the session.

      • Subscribing: No role restrictions. All participants have permission to subscribe to streams.

      • Description: In interactive mode, events such as a host joining or leaving the channel, or starting to publish a stream, are notified to viewers in real time. Conversely, viewers' activities are not notified to hosts, so streaming is uninterrupted. Hosts are responsible for interaction, while viewers only consume content. If your business needs might change, consider using interactive mode by default; its flexibility lets you adapt to different interaction requirements by adjusting user roles.

      Communication mode

      • Publishing: No role restrictions. All participants have permission to publish streams.

      • Subscribing: No role restrictions. All participants have permission to subscribe to streams.

      • Description: In communication mode, participants are aware of each other's presence. Although this mode does not differentiate user roles, every participant is functionally equivalent to a host in interactive mode. The goal is to simplify operations so users can achieve the desired functionality with fewer API calls.

    • Call setClientRole to set the user role to AliRTCSdkInteractive (host) or AliRTCSdkLive (viewer). Note: The host role publishes and subscribes by default. The viewer role only subscribes by default, with preview and publishing disabled.

      Note

      When a user switches roles in a channel, the system adjusts the audio and video publishing status accordingly:

      • Switching from host to viewer: The system stops publishing the local audio and video streams. Subscribed remote streams are not affected, and the user can continue to watch others.

      • Switching from viewer to host: The system starts publishing the local audio and video streams. Subscribed remote streams remain unchanged, and the user can continue to watch other participants.

      // Set the channel mode to interactive mode, use AliRTCSdkInteractiveLive for RTC
      mAliRtcEngine.setChannelProfile(AliRtcEngine.AliRTCSdkChannelProfile.AliRTCSdkInteractiveLive);
      // Set the user role, use AliRTCSdkInteractive to both ingest and pull streams, use AliRTCSdkLive if you only pull streams without ingesting
      mAliRtcEngine.setClientRole(AliRtcEngine.AliRTCSdkClientRole.AliRTCSdkInteractive);
  • Set common callbacks

    If the SDK encounters an issue during operation, it will first attempt to recover automatically using its internal retry mechanisms. For errors that it cannot resolve on its own, the SDK will notify your application through predefined callback interfaces.

    The following are key callbacks for issues the SDK cannot handle, which your application must listen for and respond to:

    Authentication failed

    • Callback and parameters: The result parameter of the onJoinChannelResult callback returns AliRtcErrJoinBadToken.

    • Solution: Check whether the token is correct.

    • Description: When a user calls an API and authentication fails, the API's callback returns an authentication failure error.

    Token about to expire

    • Callback and parameters: onAuthInfoWillExpire

    • Solution: Retrieve a new token and call refreshAuthInfo to update the authentication information.

    • Description: A token expiration error can occur either when an API is called or during runtime. The error is reported through API callbacks or a separate error callback.

    Token expired

    • Callback and parameters: onAuthInfoExpired

    • Solution: The app must rejoin the channel.

    • Description: A token expiration error can occur either when an API is called or during runtime. The error is reported through API callbacks or a separate error callback.

    Network connection issue

    • Callback and parameters: The onConnectionStatusChange callback returns AliRtcConnectionStatusFailed.

    • Solution: The app must rejoin the channel.

    • Description: The SDK automatically recovers from brief network disconnections. If the disconnection exceeds a threshold, the connection times out; the app must check the network status and guide the user to rejoin.

    Kicked from channel

    • Callback and parameters: onBye

    • Solution:

      • AliRtcOnByeUserReplaced: Check whether another user has joined with the same userId.

      • AliRtcOnByeBeKickedOut: The user was kicked out of the channel and needs to rejoin.

      • AliRtcOnByeChannelTerminated: The channel was terminated, and the user needs to rejoin.

    • Description: The RTC service allows an administrator to remove participants.

    Local device exception

    • Callback and parameters: onLocalDeviceException

    • Solution: Check app permissions and whether the hardware is working correctly.

    • Description: When a local device exception occurs that the SDK cannot resolve, it notifies the app via a callback. The app should then intervene to check the device status.

    private AliRtcEngineEventListener mRtcEngineEventListener = new AliRtcEngineEventListener() {
        @Override
        public void onJoinChannelResult(int result, String channel, String userId, int elapsed) {
            super.onJoinChannelResult(result, channel, userId, elapsed);
            handleJoinResult(result, channel, userId);
        }
    
        @Override
        public void onLeaveChannelResult(int result, AliRtcEngine.AliRtcStats stats){
            super.onLeaveChannelResult(result, stats);
        }
    
        @Override
        public void onConnectionStatusChange(AliRtcEngine.AliRtcConnectionStatus status, AliRtcEngine.AliRtcConnectionStatusChangeReason reason){
            super.onConnectionStatusChange(status, reason);
    
            handler.post(new Runnable() {
                @Override
                public void run() {
                    if(status == AliRtcEngine.AliRtcConnectionStatus.AliRtcConnectionStatusFailed) {
                        /* TODO: Must handle. We recommend notifying the user. This is reported only after the SDK's internal recovery strategies have failed. */
                        ToastHelper.showToast(VideoChatActivity.this, R.string.video_chat_connection_failed, Toast.LENGTH_SHORT);
                    } else {
                        /* TODO: Optional. Add business logic here, typically for data analytics or UI changes. */
                    }
                }
            });
        }
        @Override
        public void OnLocalDeviceException(AliRtcEngine.AliRtcEngineLocalDeviceType deviceType, AliRtcEngine.AliRtcEngineLocalDeviceExceptionType exceptionType, String msg){
            super.OnLocalDeviceException(deviceType, exceptionType, msg);
            /* TODO: Must handle. We recommend notifying the user of the device error. This is reported only after the SDK's internal recovery strategies have failed. */
            handler.post(new Runnable() {
                @Override
                public void run() {
                    String str = "OnLocalDeviceException deviceType: " + deviceType + " exceptionType: " + exceptionType + " msg: " + msg;
                    ToastHelper.showToast(VideoChatActivity.this, str, Toast.LENGTH_SHORT);
                }
            });
        }
    
    };
    
    private AliRtcEngineNotify mRtcEngineNotify = new AliRtcEngineNotify() {
        @Override
        public void onAuthInfoWillExpire() {
            super.onAuthInfoWillExpire();
            /* TODO: Must handle. When this callback is triggered, retrieve a new token for the current user and channel, then call refreshAuthInfo to update it. */
        }
    
        @Override
        public void onRemoteUserOnLineNotify(String uid, int elapsed){
            super.onRemoteUserOnLineNotify(uid, elapsed);
        }
    
        // Unset the remote video stream renderer in the onRemoteUserOffLineNotify callback.
        @Override
        public void onRemoteUserOffLineNotify(String uid, AliRtcEngine.AliRtcUserOfflineReason reason){
            super.onRemoteUserOffLineNotify(uid, reason);
        }
    
        // Set the remote video stream renderer in the onRemoteTrackAvailableNotify callback.
        @Override
        public void onRemoteTrackAvailableNotify(String uid, AliRtcEngine.AliRtcAudioTrack audioTrack, AliRtcEngine.AliRtcVideoTrack videoTrack){
            handler.post(new Runnable() {
                @Override
                public void run() {
                    if(videoTrack == AliRtcVideoTrackCamera) {
                        SurfaceView surfaceView = mAliRtcEngine.createRenderSurfaceView(VideoChatActivity.this);
                        surfaceView.setZOrderMediaOverlay(true);
                        FrameLayout view = getAvailableView();
                        if (view == null) {
                            return;
                        }
                        remoteViews.put(uid, view);
                        view.addView(surfaceView, new FrameLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT));
                        AliRtcEngine.AliRtcVideoCanvas remoteVideoCanvas = new AliRtcEngine.AliRtcVideoCanvas();
                        remoteVideoCanvas.view = surfaceView;
                        mAliRtcEngine.setRemoteViewConfig(remoteVideoCanvas, uid, AliRtcVideoTrackCamera);
                    } else if(videoTrack == AliRtcVideoTrackNo) {
                        if(remoteViews.containsKey(uid)) {
                            ViewGroup view = remoteViews.get(uid);
                            if(view != null) {
                                view.removeAllViews();
                                remoteViews.remove(uid);
                                mAliRtcEngine.setRemoteViewConfig(null, uid, AliRtcVideoTrackCamera);
                            }
                        }
                    }
                }
            });
        }
    
        /* Your app must also handle cases where multiple devices attempt to join with the same UserID. */
        @Override
        public void onBye(int code){
            handler.post(new Runnable() {
                @Override
                public void run() {
                    String msg = "onBye code:" + code;
                    ToastHelper.showToast(VideoChatActivity.this, msg, Toast.LENGTH_SHORT);
                }
            });
        }
    };
    
    mAliRtcEngine.setRtcEngineEventListener(mRtcEngineEventListener);
    mAliRtcEngine.setRtcEngineNotify(mRtcEngineNotify);
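onAuthInfoWillExpire fires shortly before the token expires, at which point you fetch a fresh token and call refreshAuthInfo. If your token service also returns the token's expiry timestamp, you can additionally schedule a proactive refresh on your side. A minimal sketch of the delay computation (the class and method names are illustrative; it assumes timestamps in seconds since the epoch, as used by the token helper above):

```java
// Illustrative helper for scheduling a proactive token refresh.
public final class TokenRefreshSketch {
    // Seconds to wait before refreshing, given the token's expiry timestamp
    // and the current time. Refresh `marginSeconds` early; never negative.
    public static long refreshDelaySeconds(long expiryTimestamp, long nowSeconds,
                                           long marginSeconds) {
        long delay = expiryTimestamp - nowSeconds - marginSeconds;
        return Math.max(0L, delay);
    }
}
```

A delay of 0 means the token is already inside the refresh margin (or expired), so the app should fetch a new token immediately rather than wait for the callback.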

5. Set audio and video properties

  • Set audio properties

    Call setAudioProfile to set the audio encoding mode and scenario.

    mAliRtcEngine.setAudioProfile(AliRtcEngine.AliRtcAudioProfile.AliRtcEngineHighQualityMode, AliRtcEngine.AliRtcAudioScenario.AliRtcSceneMusicMode);
  • Set video properties

    Set properties for the published video stream, such as resolution, bitrate, and frame rate.

    // Set video encoding parameters.
    AliRtcEngine.AliRtcVideoEncoderConfiguration aliRtcVideoEncoderConfiguration = new AliRtcEngine.AliRtcVideoEncoderConfiguration();
    aliRtcVideoEncoderConfiguration.dimensions = new AliRtcEngine.AliRtcVideoDimensions(720, 1280);
    aliRtcVideoEncoderConfiguration.frameRate = 20;
    aliRtcVideoEncoderConfiguration.bitrate = 1200;
    aliRtcVideoEncoderConfiguration.keyFrameInterval = 2000;
    aliRtcVideoEncoderConfiguration.orientationMode = AliRtcVideoEncoderOrientationModeAdaptive;
    mAliRtcEngine.setVideoEncoderConfiguration(aliRtcVideoEncoderConfiguration);

6. Set publishing and subscribing properties

Configure the publishing of audio/video streams and set the default to subscribe to all users' streams:

  • Call publishLocalAudioStream to publish an audio stream.

  • Call publishLocalVideoStream to publish a video stream. For an audio-only call, you can set this to false.

// The SDK publishes audio by default, so you don't need to call publishLocalAudioStream.
mAliRtcEngine.publishLocalAudioStream(true);
// For video calls, you don't need to call publishLocalVideoStream(true) as the SDK publishes video by default.
// For audio-only calls, you need to call publishLocalVideoStream(false) to disable video publishing.
mAliRtcEngine.publishLocalVideoStream(true);

// Set default subscription to all remote audio and video streams.
mAliRtcEngine.setDefaultSubscribeAllRemoteAudioStreams(true);
mAliRtcEngine.subscribeAllRemoteAudioStreams(true);
mAliRtcEngine.setDefaultSubscribeAllRemoteVideoStreams(true);
mAliRtcEngine.subscribeAllRemoteVideoStreams(true);
Note

By default, the SDK automatically publishes local audio and video streams and subscribes to the audio and video streams of all other users in the channel. You can call the methods above to override this default behavior.

7. Enable local preview

  • Call setLocalViewConfig to set up the local render view and configure local video display properties.

  • Call startPreview to start the local video preview.

mLocalVideoCanvas = new AliRtcEngine.AliRtcVideoCanvas();
SurfaceView localSurfaceView = mAliRtcEngine.createRenderSurfaceView(VideoChatActivity.this);
localSurfaceView.setZOrderOnTop(true);
localSurfaceView.setZOrderMediaOverlay(true);
FrameLayout fl_local = findViewById(R.id.fl_local);
fl_local.addView(localSurfaceView, new FrameLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT));
mLocalVideoCanvas.view = localSurfaceView;
mAliRtcEngine.setLocalViewConfig(mLocalVideoCanvas, AliRtcVideoTrackCamera);
mAliRtcEngine.startPreview();

8. Join a channel

Call joinChannel to join the channel. If the token was generated using the single-parameter method, call the joinChannel[1/3] method. If it was generated using the multi-parameter method, call the joinChannel[2/3] method. The result is returned in the onJoinChannelResult callback. A result of 0 indicates a successful join; a non-zero result may indicate an invalid token.

mAliRtcEngine.joinChannel(token, null, null, null);
Note
  • After joining the channel, the SDK will publish and subscribe to streams according to the parameters set before joining.

  • The SDK automatically publishes and subscribes by default to reduce the number of API calls the client needs to make.

9. Set the remote view

When you initialize the engine, set the corresponding callback with mAliRtcEngine.setRtcEngineNotify. In the onRemoteTrackAvailableNotify callback, you can set up the remote view for the remote user. Example code:

@Override
public void onRemoteTrackAvailableNotify(String uid, AliRtcEngine.AliRtcAudioTrack audioTrack, AliRtcEngine.AliRtcVideoTrack videoTrack){
    handler.post(new Runnable() {
        @Override
        public void run() {
            if(videoTrack == AliRtcVideoTrackCamera) {
                SurfaceView surfaceView = mAliRtcEngine.createRenderSurfaceView(VideoChatActivity.this);
                surfaceView.setZOrderMediaOverlay(true);
                FrameLayout fl_remote = findViewById(R.id.fl_remote);
                if (fl_remote == null) {
                    return;
                }
                fl_remote.addView(surfaceView, new FrameLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT));
                AliRtcEngine.AliRtcVideoCanvas remoteVideoCanvas = new AliRtcEngine.AliRtcVideoCanvas();
                remoteVideoCanvas.view = surfaceView;
                mAliRtcEngine.setRemoteViewConfig(remoteVideoCanvas, uid, AliRtcVideoTrackCamera);
            } else if(videoTrack == AliRtcVideoTrackNo) {
                FrameLayout fl_remote = findViewById(R.id.fl_remote);
                if (fl_remote != null) {
                    fl_remote.removeAllViews();
                }
                mAliRtcEngine.setRemoteViewConfig(null, uid, AliRtcVideoTrackCamera);
            }
        }
    });
}

10. Leave the channel and destroy the engine

When the session is over, leave the channel and destroy the engine:

  1. Call stopPreview to stop the video preview.

  2. Call leaveChannel to leave the channel.

  3. Call destroy to destroy the engine and release its resources.

private void destroyRtcEngine() {
    if (mAliRtcEngine != null) {
        mAliRtcEngine.stopPreview();
        mAliRtcEngine.setLocalViewConfig(null, AliRtcVideoTrackCamera);
        mAliRtcEngine.leaveChannel();
        mAliRtcEngine.destroy();
        mAliRtcEngine = null;
    }
}

11. Effect demonstration

image

References

Data structures

AliRtcEngine class