This topic describes the methods of the Web RTS SDK.
Overview
| API | Description |
| --- | --- |
| createClient | Instantiates a client object. |
| isSupport | Checks the stream pulling environment. |
| checkPublishSupport | Checks the stream ingest environment. |
| subscribe | Starts pulling an RTS stream. |
| unsubscribe | Stops RTS playback. |
| muted | Mutes the stream. |
| createStream | Gets a local camera stream, a local screen-sharing stream, or a custom stream. |
| publish | Starts stream ingest. |
| unpublish | Stops stream ingest. |
| on | Sets up a callback listener. |
| off | Cancels a callback listener. |
| once | Listens for an event once. |
Details
createClient: Creates a client object.
```javascript
var aliRts = AliRTS.createClient();
```
isSupport: Checks whether the current environment supports stream pulling.
```javascript
/**
 * isSupport checks if the environment is available.
 * @param {Object} supportInfo Check information.
 * @param {boolean} supportInfo.isReceiveVideo Specifies whether to pull the video stream.
 * @return {Promise}
 */
aliRts.isSupport({
  isReceiveVideo: true,
}).then((re) => {
  // Available.
}).catch((err) => {
  // Not available.
  console.log(`not support errorCode: ${err.errorCode}`);
  console.log(`not support message: ${err.message}`);
});
```
checkPublishSupport: Checks whether the current environment supports stream ingest.
```javascript
/**
 * checkPublishSupport checks if the stream ingest environment is available.
 * @return {Promise}
 */
aliRts.checkPublishSupport().then((re) => {
  console.log('support info', re);
  // re.isAudioMixSupported: boolean; Specifies whether local audio stream mixing is supported.
  // re.isH264EncodeSupported: boolean; Specifies whether H.264 encoding is supported.
  // re.isMediaDevicesSupported: boolean; Specifies whether cameras, microphones, and speakers can be queried.
  // re.isScreenCaptureSupported: boolean; Specifies whether screen sharing is supported.
  // re.isWebRTCSupported: boolean; Specifies whether Web Real-Time Communication (WebRTC) is supported.
  // re.cameraList: MediaDeviceInfo[]; List of video input devices.
  // re.micList: MediaDeviceInfo[]; List of audio input devices.
  // re.speakerList: MediaDeviceInfo[]; List of audio output devices.
}).catch((err) => {
  console.log(err);
});
```
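For example, a page that both plays and ingests streams can run both checks up front and enable each feature only if the environment supports it. The following is a minimal sketch under that assumption; startPlayback and startIngest are hypothetical placeholders for your own logic.
```javascript
// A minimal preflight sketch. startPlayback and startIngest are
// hypothetical placeholders for your own application logic.
aliRts.isSupport({ isReceiveVideo: true }).then(() => {
  startPlayback(); // The environment supports stream pulling.
}).catch((err) => {
  console.log(`playback not supported: ${err.message}`);
});

aliRts.checkPublishSupport().then((re) => {
  if (re.isWebRTCSupported && re.isH264EncodeSupported) {
    startIngest(); // The environment supports stream ingest.
  }
}).catch((err) => {
  console.log('ingest check failed', err);
});
```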
subscribe: Subscribes to an RTS stream.
```javascript
/**
 * Starts pulling the RTS stream.
 * @param {string} pullStreamUrl The stream pulling URL. Add @subaudio=no or @subvideo=no to the end of the URL to not subscribe to the audio or video stream.
 * @param {Object} [config] (Optional) Custom configuration.
 * @param {string} [config.signalUrl] (Optional) The signaling URL.
 * @param {number} [config.retryTimes] (Optional) The maximum number of reconnection attempts. Default value: 5.
 * @param {number} [config.retryInterval] (Optional) The reconnection interval in milliseconds. Default value: 2000.
 * @return {Promise}
 */
aliRts.subscribe(pullStreamUrl).then((remoteStream) => {
  // mediaElement is the audio or video media tag.
  remoteStream.play(mediaElement);
  // Calling remoteStream.play binds the media stream to the media tag and attempts to autoplay.
  // If you do not want to autoplay, pass {autoplay: false} as the second parameter. This is supported from version 2.2.4.
  // remoteStream.play(mediaElement, {autoplay: false});
}).catch((err) => {
  // Subscription failed.
});
```
Important: During audio and video decoding, RTS does not support videos that contain B-frames or audio in the AAC format. If a video contains B-frames, frame skipping may occur. If the audio is in the AAC format, noise may occur. You can transcode the stream to remove B-frames from the video and convert the audio from AAC to another format. For more information, see RTS Transcoding.
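For example, to subscribe to only the video track and tune the reconnection behavior, you can combine the URL suffix with the optional config parameter. The following is a minimal sketch that uses only the parameters documented above; the stream URL is a placeholder.
```javascript
// A minimal sketch. The URL is a placeholder; @subaudio=no skips the
// audio track, and retryTimes/retryInterval override the defaults
// (5 attempts, 2000 ms) documented above.
const pullStreamUrl = 'artc://example.com/app/stream@subaudio=no';
aliRts.subscribe(pullStreamUrl, {
  retryTimes: 3,
  retryInterval: 1000,
}).then((remoteStream) => {
  remoteStream.play(mediaElement); // mediaElement: your media tag.
}).catch((err) => {
  console.log(`subscribe failed: ${err.message}`);
});
```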
If you import the Web RTS SDK into a uni-app project, pass an actual HTMLVideoElement to the remoteStream.play() method, because uni-app encapsulates the <video> tag. To obtain the actual HTMLVideoElement, refer to the method in the Demo. For example, in pages/index/index.vue, use remoteStream.play(this.$refs.myVideo.$refs.video).
If you use the subscribe method to pull a stream, the returned remoteStream object contains the original audio and video data, which you can retrieve using the standard WebRTC MediaStream API, as shown in the sketch below.
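The following is a minimal sketch of inspecting those tracks, assuming remoteStream.play(videoEl) has already bound the stream to a <video> element so that the element's standard srcObject property holds the underlying MediaStream.
```javascript
// A minimal sketch, assuming remoteStream.play(videoEl) has already
// bound the stream, so videoEl.srcObject is the underlying MediaStream.
const videoEl = document.querySelector('video'); // Your playback element.
const mediaStream = videoEl.srcObject;
if (mediaStream instanceof MediaStream) {
  mediaStream.getVideoTracks().forEach((track) => {
    console.log('video track:', track.label, track.getSettings());
  });
  mediaStream.getAudioTracks().forEach((track) => {
    console.log('audio track:', track.label, track.getSettings());
  });
}
```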
unsubscribe: Stops RTS playback.
```javascript
aliRts.unsubscribe();
```
muted: Mutes the stream.
```javascript
remoteStream.muted = true;
```
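Browser autoplay policies may force playback to start muted (see error code 10201 in the on section below); unmuting must then happen inside a user gesture. A minimal sketch, assuming a hypothetical unmuteButton element on your page:
```javascript
// A minimal sketch: unmute inside a user gesture, as required when the
// browser's autoplay policy mutes playback (error code 10201 in onError).
// unmuteButton is a hypothetical button element on your page.
unmuteButton.addEventListener('click', () => {
  remoteStream.muted = false;
});
```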
createStream: Gets a local camera stream, a local screen-sharing stream, or a custom stream.
Creates a stream from the local camera.
```javascript
/**
 * Get the local stream localStream.
 * @param {Object} config Configuration.
 * @param {boolean} config.audio Specifies whether to use an audio device.
 * @param {boolean} config.video Specifies whether to use a video device.
 * @param {boolean} config.skipProfile Specifies whether to skip the profile. Set this to true if the camera shows a black screen.
 * @returns {Promise}
 */
AliRTS.createStream({
  audio: true,
  video: true,
}).then((localStream) => {
  // Preview the ingest stream content. mediaElement is the audio or video media tag.
  localStream.play(mediaElement);
}).catch((err) => {
  // Failed to create the local stream.
});
```
Creates a stream from a local screen share.
```javascript
/**
 * Share only the screen video.
 */
AliRTS.createStream({ screen: true });

/**
 * Share the screen video and capture the screen audio.
 * Chrome on macOS supports capturing tab audio. Chrome on Windows supports capturing tab and system audio.
 */
AliRTS.createStream({ screen: { audio: true } });

/**
 * Share the screen video, capture the screen audio, and capture the microphone audio.
 */
AliRTS.createStream({ screen: { audio: true }, audio: true });

/**
 * Custom capture parameters:
 * - Disable audio echo cancellation.
 * - Prefer the current tab in Chrome.
 * This is an example. You can pass any parameters that conform to getDisplayMedia. The actual effect depends on browser support.
 */
AliRTS.createStream({
  screen: {
    audio: { echoCancellation: false },
    preferCurrentTab: true,
  },
});
```
Creates a stream from a custom source.
```javascript
/**
 * Get the local stream localStream.
 * @param {Object} config Configuration.
 * @param {boolean} config.custom Specifies whether to pass in a custom stream.
 * @param {MediaStream} config.mediaStream A valid custom stream.
 * @returns {Promise}
 */
AliRTS.createStream({
  // Custom stream.
  custom: true,
  mediaStream: myStream, // Pass a valid MediaStream (https://developer.mozilla.org/en-US/docs/Web/API/MediaStream).
}).then((localStream) => {
  // Preview the ingest stream content. mediaElement is the audio or video media tag.
  localStream.play(mediaElement);
}).catch((err) => {
  // Failed to create the local stream.
});
```
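A custom stream can come from any source that yields a MediaStream, such as a <canvas>. The following is a minimal sketch using the standard canvas.captureStream() API; the element ID and frame rate are illustrative assumptions.
```javascript
// A minimal sketch: build a custom MediaStream from a <canvas> with the
// standard captureStream() API. The element ID and frame rate are
// illustrative assumptions.
const canvas = document.getElementById('source');
const myStream = canvas.captureStream(30); // Capture at 30 fps.
AliRTS.createStream({
  custom: true,
  mediaStream: myStream,
}).then((localStream) => {
  localStream.play(mediaElement); // Preview, as in the snippets above.
}).catch((err) => {
  console.log('failed to create the custom stream', err);
});
```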
publish: Starts stream ingest.
```javascript
/**
 * Start stream ingest.
 * @param {string} pushUrl The ingest URL.
 * @param {Object} localStream The local stream created by createStream.
 * @param {Object} [config] (Optional) Custom configuration.
 * @param {string} [config.signalUrl] (Optional) The signaling URL.
 * @param {number} [config.retryTimes] (Optional) The maximum number of reconnection attempts. Default value: 5.
 * @param {number} [config.retryInterval] (Optional) The reconnection interval in milliseconds. Default value: 2000.
 * @return {Promise}
 */
aliRts.publish(pushUrl, localStream).then(() => {
  // Stream ingest successful.
}).catch((err) => {
  // Stream ingest failed.
});
```
unpublish: Stops stream ingest.
```javascript
aliRts.unpublish();
```
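Putting the ingest APIs together, a typical publish lifecycle looks like the following minimal sketch; pushUrl and previewElement are placeholders for your ingest URL and preview media tag.
```javascript
// A minimal end-to-end ingest sketch. pushUrl and previewElement are
// placeholders for your ingest URL and preview media tag.
AliRTS.createStream({ audio: true, video: true })
  .then((localStream) => {
    localStream.play(previewElement);            // Local preview.
    return aliRts.publish(pushUrl, localStream); // Start ingest.
  })
  .then(() => {
    console.log('publishing');
    // Later, stop ingest:
    // aliRts.unpublish();
  })
  .catch((err) => {
    console.log('publish failed', err);
  });
```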
on: Registers a callback listener for an event.
```javascript
/*
 * If error code 10201 is returned in onError, the audio on the webpage is muted.
 * A user must manually trigger an event on the webpage. This cannot be controlled by code.
 * Call remoteStream.muted = false to unmute.
 */
aliRts.on("onError", (err) => {
  console.log(`errorCode: ${err.errorCode}`);
  console.log(`message: ${err.message}`);
});

aliRts.on('reconnect', function (evt) {
  console.log('reconnect', evt); // Listen for the reconnect event. evt is the reason for the reconnection.
});

const PLAY_EVENT = {
  CANPLAY: "canplay", // Playback ready.
  WAITING: "waiting", // Stuttering.
  PLAYING: "playing", // Stuttering recovered.
  MEDIA: "media",     // Reports the real-time media status every second.
};

aliRts.on('onPlayEvent', function (evt) {
  /* evt data structure:
   * {
   *   event: string, // PLAY_EVENT
   *   data: any,     // Data
   * }
   */
  if (evt.event === PLAY_EVENT.CANPLAY) {
    console.log("Playback is ready.");
  } else if (evt.event === PLAY_EVENT.WAITING) {
    console.log("Stuttering occurs.");
  } else if (evt.event === PLAY_EVENT.PLAYING) {
    console.log("Playback resumes after stuttering.");
  } else if (evt.event === PLAY_EVENT.MEDIA) {
    console.log("Real-time media data per second: ", evt.data);
    /* evt.data data structure:
     * {
     *   url: string,   // Playback URL.
     *   aMsid: string, // Audio ID (default: 'rts audio').
     *   audio: {       // (Not supported by some browsers)
     *     bytesReceivedPerSecond: number, // Audio bitrate.
     *     lossRate: number,               // Audio packet loss rate.
     *     rtt: number,                    // RTT (shared by audio and video).
     *   },
     *   vMsid: string, // Video ID (default: 'rts video').
     *   video: {       // (Not supported by some browsers)
     *     bytesReceivedPerSecond: number, // Video bitrate.
     *     framesDecodedPerSecond: number, // Decoded frame rate.
     *     fps: number,                    // Rendering frame rate.
     *     height: number,                 // Resolution height.
     *     width: number,                  // Resolution width.
     *     lossRate: number,               // Video packet loss rate.
     *     rtt: number,                    // RTT (shared by audio and video).
     *   },
     *   networkQuality: number, // Network quality score.
     * }
     * networkQuality score values:
     * 0: Unknown, 1: Excellent, 2: Good, 3: Fair, 4: Poor, 5: Very poor, 6: No network.
     */
  }
});
```
off: Removes a callback listener.
```javascript
function handle() {}
aliRts.on('onPlayEvent', handle);
// Cancel the listener.
aliRts.off('onPlayEvent', handle);
```
once: Registers a one-time listener for an event.
```javascript
aliRts.once('onPlayEvent', handle);
```
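For example, to react only to the first play event emitted, whichever it is:
```javascript
// A minimal sketch: the callback fires for the first onPlayEvent only.
aliRts.once('onPlayEvent', (evt) => {
  console.log('first play event:', evt.event);
});
```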