
ApsaraVideo Live: Methods

Last Updated: Dec 18, 2023

This topic describes the methods provided by Web RTS SDK.

Overview

Method               Description
createClient         Creates an instance on which Web RTS SDK is used.
isSupport            Checks the stream pulling environment.
checkPublishSupport  Checks the stream ingest environment.
subscribe            Starts to pull the stream over Real-Time Streaming (RTS).
unsubscribe          Stops playing the stream over RTS.
muted                Mutes the stream.
createStream         Obtains a local camera stream, a local screen-sharing stream, or a custom stream.
publish              Starts to ingest the stream.
unpublish            Stops ingesting the stream.
on                   Registers a callback for SDK events.
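
The methods are typically combined as follows: check the environment, create a client, and then either subscribe to play a stream or create and publish a local stream for ingest. A minimal playback sketch under these assumptions (the streaming URL and the player element are hypothetical placeholders):

  // Minimal playback flow: create a client, check support, subscribe, and play.
  // pullStreamUrl and the <video id="player"> element are hypothetical placeholders.
  const pullStreamUrl = "artc://example.com/app/stream";
  const aliRts = AliRTS.createClient();
  aliRts.isSupport({ isReceiveVideo: true })
    .then(() => aliRts.subscribe(pullStreamUrl))
    .then((remoteStream) => {
      remoteStream.play(document.getElementById("player"));
    })
    .catch((err) => console.log("playback is not available", err));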

Sample code

  • createClient: creates an instance on which Web RTS SDK is used.

    var aliRts = AliRTS.createClient();
  • isSupport: checks the stream pulling environment.

    /**
     * isSupport Check whether the stream pulling environment is available.
     * @param {Object} supportInfo The options for the check.
     * @param {Boolean} supportInfo.isReceiveVideo Specify whether to pull the video stream.
     * @return {Promise}
     */
    aliRts.isSupport({isReceiveVideo: true}).then(re=> {
      // The stream pulling environment is available.
    }).catch(err=> {
      // The stream pulling environment is not available.
      console.log(`not support errorCode: ${err.errorCode}`);
      console.log(`not support message: ${err.message}`);
    })
  • checkPublishSupport: checks the stream ingest environment.

    /**
      * checkPublishSupport Check whether the stream ingest environment is available.
      * @return {Promise}
      */
    aliRts.checkPublishSupport().then(re => {
      console.log('support info',re);
       // re.isAudioMixSupported: boolean; Indicates whether local audio stream mixing is supported.
       // re.isH264EncodeSupported: boolean; Indicates whether H.264 is supported.
       // re.isMediaDevicesSupported: boolean; Indicates whether cameras, microphones, and speakers can be queried.
       // re.isScreenCaptureSupported: boolean; Indicates whether screen sharing is supported.
       // re.isWebRTCSupported: boolean; Indicates whether WebRTC is supported.
       // re.cameraList: MediaDeviceInfo[]; The list of video input devices.
       // re.micList: MediaDeviceInfo[]; The list of audio input devices.
       // re.speakerList: MediaDeviceInfo[]; The list of audio output devices.
    }).catch(err=> {
      console.log(err);
    })
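    The device lists in the result are standard MediaDeviceInfo arrays, so they can be used to build a device picker. A minimal sketch, assuming a hypothetical <select id="cameraSelect"> element on the page:

    aliRts.checkPublishSupport().then(re => {
      // Populate a hypothetical <select id="cameraSelect"> element with the cameras in re.cameraList.
      const select = document.getElementById("cameraSelect");
      re.cameraList.forEach(device => {
        const option = document.createElement("option");
        option.value = device.deviceId;              // standard MediaDeviceInfo field
        option.text = device.label || device.deviceId;
        select.appendChild(option);
      });
    }).catch(err => {
      console.log(err);
    })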
  • subscribe: starts to pull the stream over RTS.

    /**
     * rts Start to pull the stream.
     * @param {string} pullStreamUrl The source URL. Add @subaudio=no or @subvideo=no after the URL to specify that the audio stream or video stream is not subscribed to.
     * @param {Object} [config] Perform custom configuration. This step is optional.
     * @param {string} [config.signalUrl] Specify the signaling URL. This step is optional.
     * @param {number} [config.retryTimes] Specify the maximum allowed number of reconnection attempts. Default value: 5.
     * @param {number} [config.retryInterval] Specify the interval between reconnection attempts. Unit: milliseconds. Default value: 2000.
     * @return {Promise}
     */
     aliRts.subscribe(pullStreamUrl).then((remoteStream) => {
      // mediaElement is the <audio> or <video> element that is used to play the stream.
      remoteStream.play(mediaElement);
      // Call remoteStream.play to bind a media tag to the media stream. The stream automatically plays. 
      // If you do not want the stream to automatically play, specify the second parameter {autoplay:false}. This parameter is supported beginning from Web RTS SDK V2.2.4.
      // remoteStream.play(mediaElement, {autoplay:false});
    }).catch((err) => {
      // The subscription failed.
    })
    Important
    • During audio and video decoding, RTS does not support videos that contain B-frames and audio that is in the AAC format. If the video contains B-frames, frame skips may occur. If the audio is in the AAC format, noise may occur. You can transcode the stream to remove B-frames from the video and convert the audio from AAC to another format. For more information, see Configure RTS transcoding.

    • If you integrate Web RTS SDK in a uni-app project, you must pass an actual HTMLVideoElement to the remoteStream.play() method. Because uni-app wraps the video component, refer to the demo to learn how to obtain the underlying HTMLVideoElement. For example, in pages/index/index.vue, call remoteStream.play(this.$refs.myVideo.$refs.video).
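
    A brief sketch of the @subaudio=no and @subvideo=no suffixes described for the pullStreamUrl parameter above (the base streaming URL is a hypothetical placeholder):

    // Pull only the video track; the audio track is not subscribed to.
    // The base URL below is a hypothetical placeholder.
    const videoOnlyUrl = "artc://example.com/app/stream@subaudio=no";
    aliRts.subscribe(videoOnlyUrl).then((remoteStream) => {
      remoteStream.play(mediaElement);
    }).catch((err) => {
      // The subscription failed.
    })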

  • unsubscribe: stops playing the stream over RTS.

    aliRts.unsubscribe();
  • muted: mutes the stream.

    remoteStream.muted = true;
  • createStream

    • Obtain a local camera stream.

      /**
       * Obtain a local stream.
       * @param {Object} config Perform configuration.
       * @param {boolean} config.audio Specify whether to use the audio device.
       * @param {boolean} config.video Specify whether to use the video device.
       * @param {boolean} config.skipProfile Specify whether to skip the profile. We recommend that you set the value to true if the camera shows a black screen.
       * @returns {Promise}
       */
      AliRTS.createStream({
        audio: true,
        video: true,
      }).then((localStream) => {
        // Preview the content of the ingested stream. mediaElement is the <audio> or <video> element used for the preview.
        localStream.play(mediaElement);
      }).catch((err) => {
        // The local stream failed to be created.
      })
    • Obtain a local screen-sharing stream.

      /**
       * Share only the video captured from the screen.
       */
      AliRTS.createStream({ screen: true });
      
      /**
       * Share the video captured from the screen and capture the screen audio at the same time. Google Chrome on macOS supports audio capture from tab pages, whereas Google Chrome on Windows supports audio capture from tab pages and from the system.
       */
      AliRTS.createStream({ screen: { audio: true } });
      
      /**
       * Share the video captured from the screen, and capture both the screen audio and the microphone audio at the same time.
       */
      AliRTS.createStream({ screen: { audio: true }, audio: true });
      
      /**
       * Configure custom capture parameters:
       * disable echo cancellation for the audio, and preferentially select the current tab in Google Chrome.
       * This example is for reference only. You can pass in any parameters that getDisplayMedia accepts.
       * The effect of these parameters is determined by the browser.
       */
      AliRTS.createStream({ screen: { audio: { echoCancellation: false }, preferCurrentTab: true } });
    • Obtain a custom stream.

      /**
       * Obtain a local stream.
       * @param {Object} config Perform configuration.
       * @param {boolean} config.custom Specify whether to pass in a custom stream.
       * @param {MediaStream} config.mediaStream The valid custom stream to pass in.
       * @returns {Promise}
       */
      AliRTS.createStream({
        // Create the custom stream.
        custom: true,
        mediaStream: myStream // Specify a valid MediaStream. For more information, visit https://developer.mozilla.org/en-US/docs/Web/API/MediaStream.
      }).then((localStream) => {
        // Preview the content of the ingested stream. mediaElement is the <audio> or <video> element used for the preview.
        localStream.play(mediaElement);
      }).catch((err) => {
        // The local stream failed to be created.
      })
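
      A minimal sketch of producing the MediaStream that is passed as mediaStream, here captured from a <canvas> element by using the standard captureStream API (the canvas id is a hypothetical placeholder):

      // Capture a 30 fps MediaStream from an existing <canvas> element.
      // The canvas id "myCanvas" is a hypothetical placeholder.
      const canvas = document.getElementById("myCanvas");
      const myStream = canvas.captureStream(30);
      AliRTS.createStream({
        custom: true,
        mediaStream: myStream
      }).then((localStream) => {
        // Preview the captured content locally.
        localStream.play(mediaElement);
      })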
  • publish: starts to ingest the stream.

    /**
     * Start to ingest the stream.
     * @param {string} pushUrl The ingest URL.
     * @param {Object} localStream The local stream that you create by calling createStream.
     * @param {Object} [config] Perform custom configuration. This step is optional.
     * @param {string} [config.signalUrl] Specify the signaling URL. This step is optional.
     * @param {number} [config.retryTimes] Specify the maximum allowed number of reconnection attempts. Default value: 5.
     * @param {number} [config.retryInterval] Specify the interval between reconnection attempts. Unit: milliseconds. Default value: 2000.
     * @return {Promise}
     */
    aliRts.publish(pushUrl, localStream).then(() => {
      // The stream is ingested.
    }).catch((err) => {
      // The stream failed to be ingested.
    })
  • unpublish: stops ingesting the stream.

    aliRts.unpublish();
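
    Taken together, a minimal ingest flow looks like the following sketch (the ingest URL is a hypothetical placeholder):

    // End-to-end ingest sketch: capture a local stream, publish it, and stop later.
    // The artc:// URL below is a hypothetical placeholder for your ingest URL.
    const pushUrl = "artc://example.com/app/stream";
    AliRTS.createStream({ audio: true, video: true })
      .then((localStream) => {
        localStream.play(mediaElement); // local preview
        return aliRts.publish(pushUrl, localStream);
      })
      .then(() => {
        // The stream is being ingested. Call aliRts.unpublish() to stop.
      })
      .catch((err) => {
        // Capturing or ingesting the stream failed.
      })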
  • on: registers a callback for SDK events.

    /*
     * If error code 10201 is returned, the video is playing muted.
     * Unmuting must be triggered by the user on the web page; it cannot be triggered automatically by code.
     * Set remoteStream.muted = false to unmute the stream.
     */
    aliRts.on("onError", (err)=> {
      console.log(`errorCode: ${err.errorCode}`);
      console.log(`message: ${err.message}`);
    })
    
    aliRts.on('reconnect', function(evt) {
      console.log('reconnect', evt); // Listen to the reconnection event. evt indicates the reason for reconnection.
    })
    
    const PLAY_EVENT = {
      CANPLAY: "canplay", // Ready for playback.
      WAITING: "waiting", // Stuttering occurs.
      PLAYING: "playing", // Stuttering is recovered.
      MEDIA: "media",     // Report the real-time status of the media every second.
    }
    aliRts.on('onPlayEvent', function(evt) {
      /* evt Data structure: {
            event: string, // PLAY_EVENT
            data: any, // The data. 
          } 
        */
      if (evt.event === PLAY_EVENT.CANPLAY) {
        console.log("Ready for playback.");
      } else if (evt.event === PLAY_EVENT.WAITING) {
        console.log("Stuttering occurs.");
      } else if (evt.event === PLAY_EVENT.PLAYING) {
        console.log("Stuttering is recovered, and the playback continues.");
      } else if (evt.event === PLAY_EVENT.MEDIA) {
        console.log("Real-time media data per second: ", evt.data);
        /* evt.data Data structure: {
              url: string, // The streaming URL.
              aMsid: string, // The ID of the audio. Default value: rts audio.
              audio: {                          // (Not supported by some browsers)
                bytesReceivedPerSecond: number, // The bitrate of the audio.
                lossRate: number, // The packet loss rate of the audio.
                rtt: number, // The round trip time (RTT) of the audio and video.
              },
              vMsid: string, // The ID of the video. Default value: rts video.
              video: {                          // (Not supported by some browsers)
                bytesReceivedPerSecond: number, // The bitrate of the video.
                framesDecodedPerSecond: number, // The decoded frame rate.
                fps: number, // The frame rate for rendering.
                height: number, // The height of the resolution.
                width: number, // The width of the resolution.
                lossRate: number, // The packet loss rate of the video.
                rtt: number, // The RTT of the audio and video.
              },
              networkQuality: number, // The score on the network quality.
            }
            // networkQuality Valid values: 
            // 0 (unknown), 1 (excellent), 2 (good), 3 (moderate), 4 (poor), 5 (very poor), and 6 (no network).
          */
      }
    });
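
    As noted above, error code 10201 must be resolved by a user action on the page. A minimal sketch, assuming a hypothetical unmute button:

    // Handle error code 10201 (the video is playing muted): unmute when the user clicks a button.
    // "unmuteBtn" is a hypothetical button element on your page.
    aliRts.on("onError", (err) => {
      if (err.errorCode === 10201) {
        document.getElementById("unmuteBtn").onclick = () => {
          remoteStream.muted = false; // must be triggered by a user gesture
        };
      }
    });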