Here is my use case: Alice has a cool new media track that she wants Bob to listen in to. She selects the media file in her browser and the media file starts playing instantly in Bob's browser.
I'm not even sure if this is possible to build using the WebRTC API right now. All the examples I can find use streams obtained via getUserMedia(), but this is what I have:
var context = new AudioContext();
var pc = new RTCPeerConnection(pc_config);

function handleFileSelect(event) {
    var file = event.target.files[0];
    if (file) {
        if (file.type.match('audio*')) {
            console.log(file.name);
            var reader = new FileReader();
            reader.onload = function(readEvent) {
                // Decode the file, route it into a MediaStreamDestination,
                // and add the resulting stream to the peer connection.
                context.decodeAudioData(readEvent.target.result, function(buffer) {
                    var source = context.createBufferSource();
                    var destination = context.createMediaStreamDestination();
                    source.buffer = buffer;
                    source.start(0);
                    source.connect(destination);
                    pc.addStream(destination.stream);
                    pc.createOffer(setLocalAndSendMessage);
                });
            };
            reader.readAsArrayBuffer(file);
        }
    }
}
On the receiving side I have the following:
function gotRemoteStream(event) {
    // Wrap the remote stream in a MediaStreamSource and wire it to the speakers.
    var mediaStreamSource = context.createMediaStreamSource(event.stream);
    mediaStreamSource.connect(context.destination);
}
This code does not make the media (music) play on the receiving side. The gotRemoteStream function does get called right after the WebRTC handshake completes, but I immediately receive an ended event and the media never starts playing.
On Alice's side the magic is supposed to happen in the line source.connect(destination). When I replace that line with source.connect(context.destination), the media plays correctly through Alice's speakers.
On Bob's side a media stream source is created from Alice's stream. However, when the local speakers are connected via mediaStreamSource.connect(context.destination), the music does not start playing.
Of course I could always send the media file through a DataChannel, but where is the fun in that...
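For reference, a minimal sketch of that fallback is below; the channel label and chunk size are arbitrary choices of mine, and the receiver would still need to reassemble and decode the bytes itself:
// Sketch only: push the raw file bytes over a DataChannel in fixed-size
// chunks. Real code would also need flow control and an end-of-file marker.
var channel = pc.createDataChannel('file-transfer');
var CHUNK_SIZE = 16384; // 16 KiB per message (arbitrary)

function sendFileOverDataChannel(arrayBuffer) {
    for (var offset = 0; offset < arrayBuffer.byteLength; offset += CHUNK_SIZE) {
        channel.send(arrayBuffer.slice(offset, offset + CHUNK_SIZE));
    }
}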
Any clues on what is wrong with my code or some ideas on how to achieve my use case would be greatly appreciated!
I'm using the latest and greatest Chrome Canary.
Thanks.
- Due to an error in my code the received stream on Bob's side was ended, because the SDP answer on Alice's side was not handled correctly. After fixing the issue the media still does not play, but the example behaves differently. I updated the question accordingly. – Eelco, Jul 10, 2013
- It might be unrelated (I have no experience with WebRTC), but could github.com/wearefractal/holla help you? – rickyduck, Jul 16, 2013
3 Answers
It is possible to play the audio using the Audio element like this:
function gotRemoteStream(event) {
    var player = new Audio();
    // attachMediaStream is the cross-browser helper from the adapter.js shim.
    attachMediaStream(player, event.stream);
    player.play();
}
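If you are not using adapter.js, a rough stand-in for attachMediaStream using the prefixed APIs of that era would look like this (a sketch; exact property support varied by browser version):
// Approximation of adapter.js's attachMediaStream (sketch only).
function attachMediaStream(element, stream) {
    if (typeof element.mozSrcObject !== 'undefined') {
        element.mozSrcObject = stream;             // Firefox
    } else {
        element.src = URL.createObjectURL(stream); // Chrome
    }
}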
Playing back the audio via the WebAudio API is not working (yet) for me.
Not sure about Chrome; sounds like a bug.
Try it on Firefox (Nightly, I suggest); we have WebAudio support there, though I don't know all the details about what's supported currently.
Also, on Firefox at least we have stream = media_element.captureStreamUntilEnded(); we use it in some of our tests in dom/media/tests/mochitests, I believe. This lets you take any audio or video element and capture its output as a MediaStream.
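A sketch of that approach, assuming the prefixed spelling mozCaptureStreamUntilEnded that Firefox actually exposed at the time (the element and renegotiation wiring here are placeholders):
// Sketch: play the selected file in an <audio> element and capture its
// output as a MediaStream to feed the PeerConnection.
function streamFileViaElement(file, pc) {
    var element = new Audio();
    element.src = URL.createObjectURL(file); // play the local file
    element.play();
    var stream = element.mozCaptureStreamUntilEnded();
    pc.addStream(stream); // then create and send a new offer as usual
}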
Edit: see below; both Chrome and Firefox have misses in combining WebAudio with WebRTC PeerConnections, but in different places. Mozilla hopes to fix the last bug there very soon.
Check out the page MediaStream Integration. It illustrates WebRTC integration with the Web Audio API. In particular, this example is relevant to your question:
- Capture microphone input, visualize it, mix in another audio track and stream the result to a peer
<canvas id="c"></canvas>
<audio src="back.webm" id="back"></audio>
<script>
navigator.getUserMedia('audio', gotAudio);
var streamRecorder;
function gotAudio(stream) {
    var microphone = context.createMediaStreamSource(stream);
    var backgroundMusic = context.createMediaElementSource(document.getElementById("back"));
    var analyser = context.createAnalyser();
    var mixedOutput = context.createMediaStreamDestination();
    microphone.connect(analyser);
    analyser.connect(mixedOutput);
    backgroundMusic.connect(mixedOutput);
    requestAnimationFrame(drawAnimation);
    peerConnection.addStream(mixedOutput.stream);
}
</script>
I fear, however, that this is only a proposal currently.