
javascript - Changing a MediaStream of RTCPeerConnection

I want to change from an audio/video stream to a "screensharing" stream:

peerConnection.removeStream(streamA) // __o_j_sep... in Screenshots below
peerConnection.addStream(streamB)  // SSTREAM in Screenshots below
  • streamA is a video/audio stream coming from my camera and microphone.
  • streamB is the screencapture I get from my extension.
  • They are both MediaStream objects that look like this:

[Screenshot: streamA]

[Screenshot: streamB] (see Remark 1 below)

But if I remove streamA from the peerConnection and add streamB as shown above, nothing seems to happen.

The following works as expected (the stream on both ends is removed and re-added)

peerConnection.removeStream(streamA) // __o_j_sep...
peerConnection.addStream(streamA) // __o_j_sep...

More Details

I have found this example, which does "the reverse" (switch from screen capture to audio/video with the camera), but I can't spot a significant difference.

The peerConnection is an RTCPeerConnection object that is actually created by the SIPML library (source code available here), and I access it like this:

var peerConnection = stack.o_stack.o_layer_dialog.ao_dialogs[1].o_msession_mgr.ao_sessions[0].o_pc

(Yes, this does not look right, but there is no official way to get access to the peer connection; see the discussion here and here.)

Originally I tried to just exchange the videoTracks of streamA with the videoTrack of streamB (see question here). It was suggested that I should try to renegotiate the peer connection (by removing/adding streams to it), because addTrack does not trigger a renegotiation.
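For context, the track-level swap I originally attempted looked roughly like this (a sketch only, using the streamA/streamB variables above; as noted, this does not trigger a renegotiation):

// Swap the video track on the local MediaStream itself.
var cameraTrack = streamA.getVideoTracks()[0];
var screenTrack = streamB.getVideoTracks()[0];

streamA.removeTrack(cameraTrack); // drop the camera video track
streamA.addTrack(screenTrack);    // attach the screen-capture track

// No renegotiation happens for this, so the SDP is never re-exchanged
// and the remote side never learns about the new track.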

I've also asked for help here, but the maintainer seems very busy and hasn't had a chance to respond yet.


Remark 1: Why does streamB not have a videoTracks property? The stream plays in an HTML <video> element and seems to "work". Here is how I get it:

navigator.webkitGetUserMedia({
  audio: false,
  video: {
    mandatory: {
      chromeMediaSource:  'desktop',
      chromeMediaSourceId: streamId,
      maxWidth: window.screen.width,
      maxHeight: window.screen.height
      //,   maxFrameRate: 3
    }
  }
}, function(localMediaStream) { // success callback
  SSTREAM = localMediaStream;   // streamB
}, function(error) {            // fail callback
  console.log(error);
});

It also seems to have a video track:

[Screenshot: videoTrack of streamB]
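For reference, the streamId used in the chromeMediaSourceId constraint above comes from my extension; here is a minimal sketch of how such an id is typically obtained, assuming the extension uses the chrome.desktopCapture API:

// In the extension: let the user pick a screen or window, then pass the
// resulting id to the page so it can be used as chromeMediaSourceId.
chrome.desktopCapture.chooseDesktopMedia(['screen', 'window'], function(streamId) {
  if (!streamId) {
    console.log('User cancelled the screen picker');
    return;
  }
  // Hand streamId to the page (e.g. via chrome.runtime messaging) and call
  // webkitGetUserMedia with it, as in the snippet above.
});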

I'm running:

  • OS X 10.9.3
  • Chrome Version 35.0.1916.153


1 Answer


To answer your first question: when modifying the MediaStream in an active peer connection, the peerConnection object fires an onnegotiationneeded event. You need to handle that event and re-exchange your SDPs. The main reason behind this is so that both parties know what streams are being sent between them. When the SDPs are exchanged, the MediaStream ID is included, and if there is a new stream with a new ID (even with all other things being equal), a renegotiation must take place.
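A minimal sketch of that flow, using the Chrome callback-style API and a hypothetical sendOfferToRemotePeer() signaling helper (in your case the SIPML5 stack normally drives this signaling itself):

peerConnection.onnegotiationneeded = function() {
  // Re-create the offer so the remote side learns about the new stream ID.
  peerConnection.createOffer(function(offer) {
    peerConnection.setLocalDescription(offer, function() {
      sendOfferToRemotePeer(offer); // hypothetical: send via your signaling channel
    }, logError);
  }, logError);
};

function logError(error) {
  console.log(error);
}

peerConnection.removeStream(streamA); // camera/mic stream
peerConnection.addStream(streamB);    // screen capture -> triggers onnegotiationneeded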

For your second question (about SSTREAM): it does indeed contain video tracks, but there is no videoTrack attribute on webkitMediaStreams. You can grab tracks via their ID, however.

Since a stream can contain numerous tracks of each media type, there is no single attribute for a video track or audio track, but instead an array of them. The .getVideoTracks() call returns an array of the current video tracks, so you could grab a particular video track by its index, e.g. .getVideoTracks()[0].
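For example, a quick sketch using the SSTREAM variable from the question:

var videoTracks = SSTREAM.getVideoTracks(); // array of MediaStreamTracks
console.log(videoTracks.length);            // e.g. 1 for the screen capture
var screenTrack = videoTracks[0];           // grab the first video track
console.log(screenTrack.id, screenTrack.label);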

