I have a web application of my own, which is based on the peerjs library (it is a video conference).

I'm trying to make a recording with MediaRecorder, but I'm facing a very unpleasant problem.

The code for capturing my desktop stream is the following:

let desktopStream;
const chooseScreen = document.querySelector('.chooseScreenBtn');
chooseScreen.onclick = async () => {
    desktopStream = await navigator.mediaDevices.getDisplayMedia({ video: true, audio: true });
};

I then successfully attach the received desktopStream to a videoElement in the DOM:

const videoElement = document.querySelector('.videoElement');
videoElement.srcObject = desktopStream;
videoElement.muted = false;
videoElement.onloadedmetadata = () => { videoElement.play(); };

I capture desktopStream on a page with an active conference, where everyone can hear and see each other.

To check the video and audio in desktopStream, I play a video in a player on the desktop. I can hear the audio from my desktop, but the audio from the other participants cannot be heard. As a result, when I pass desktopStream to MediaRecorder, I get a video file with no sound from anyone except my desktop. Any ideas on how to solve this?

asked Feb 21, 2021 at 10:49 by Andrew Medvedev; edited Feb 21, 2021 at 13:14
  • Looks like restrictOwnAudio defaults to true. Try setting it to false on your constraints object as an audio property. Reference: w3c.github.io/mediacapture-screen-share/#dfn-restrictownaudio. – lnogueir Commented Feb 21, 2021 at 23:14
  • I did it as you advised: desktopStream = await navigator.mediaDevices.getDisplayMedia({ video: true, audio: { restrictOwnAudio: false } }), but nothing changed, unfortunately. – Andrew Medvedev Commented Feb 22, 2021 at 8:30
  • It happens when I get desktopStream on the same page as the conference. When I move this functionality to another page, it works fine. – Andrew Medvedev Commented Feb 22, 2021 at 8:38
  • @AndrewMedvedev were you able to resolve it? I am facing a similar issue. – Hemant Kumar Commented Jun 2, 2021 at 8:48
  • Briefly, I decided to merge all audio tracks from remote users with AudioContext, including my own audio track. I also take the audio track from every user who connects and bind it into the MediaStream as well, so I always have the current audio from all users (a sketch of this approach follows these comments). This topic covers how to do it: stackoverflow.com/questions/66317945/… and there is a function to merge several audio tracks: jsfiddle.net/onadmin/qvtd5cwo/21 – Andrew Medvedev Commented Jun 3, 2021 at 11:22
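
A minimal sketch of that approach, assuming a connected PeerJS peer object and a micStream obtained from getUserMedia (both names are illustrative, not from the question's code): every remote stream delivered by a call's 'stream' event is routed into one shared MediaStreamAudioDestinationNode, and its output track is recorded together with the captured desktop video.

// Sketch only: mix the local mic plus every remote peer's audio into one track,
// then record that track together with the captured desktop video.
const audioCtx = new AudioContext();
const mixedOutput = audioCtx.createMediaStreamDestination();

// The local microphone goes into the mix.
audioCtx.createMediaStreamSource(micStream).connect(mixedOutput);

// Every remote participant's audio is added to the same mix as they connect.
peer.on('call', (call) => {
    call.answer(micStream);
    call.on('stream', (remoteStream) => {
        if (remoteStream.getAudioTracks().length > 0) {
            audioCtx.createMediaStreamSource(remoteStream).connect(mixedOutput);
        }
    });
});

async function startRecording() {
    const desktopStream = await navigator.mediaDevices.getDisplayMedia({ video: true });
    const recorder = new MediaRecorder(
        new MediaStream([
            ...desktopStream.getVideoTracks(),
            ...mixedOutput.stream.getAudioTracks(),
        ]),
        { mimeType: 'video/webm; codecs=vp8,opus' }
    );
    recorder.start();
    return recorder;
}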

3 Answers


Chrome's MediaRecorder API can only output one track. createMediaStreamSource can take the streams from the desktop audio and the microphone; by connecting both to a single node created with createMediaStreamDestination, you get one merged stream that can be piped into the MediaRecorder API.

const mergeAudioStreams = (desktopStream, voiceStream) => {
    const context = new AudioContext();

    // Create a couple of sources
    const source1 = context.createMediaStreamSource(desktopStream);
    const source2 = context.createMediaStreamSource(voiceStream);
    const destination = context.createMediaStreamDestination();

    const desktopGain = context.createGain();
    const voiceGain = context.createGain();

    desktopGain.gain.value = 0.7;
    voiceGain.gain.value = 0.7;

    source1.connect(desktopGain).connect(destination);
    // Connect source2
    source2.connect(voiceGain).connect(destination);

    return destination.stream.getAudioTracks();
}; 

It is also possible to use two or more audio inputs + video input.

window.onload = () => {
    const warningEl = document.getElementById('warning');
    const videoElement = document.getElementById('videoElement');
    const captureBtn = document.getElementById('captureBtn');
    const startBtn = document.getElementById('startBtn');
    const stopBtn = document.getElementById('stopBtn');
    const download = document.getElementById('download');
    const audioToggle = document.getElementById('audioToggle');
    const micAudioToggle = document.getElementById('micAudioToggle');
    
    if('getDisplayMedia' in navigator.mediaDevices) warningEl.style.display = 'none';

    let blobs;
    let blob;
    let rec;
    let stream;
    let voiceStream;
    let desktopStream;
    
    const mergeAudioStreams = (desktopStream, voiceStream) => {
        const context = new AudioContext();
        const destination = context.createMediaStreamDestination();
        let hasDesktop = false;
        let hasVoice = false;

        if (desktopStream && desktopStream.getAudioTracks().length > 0) {
            // If you don't want to share audio from the desktop, it should still work with just the voice.
            const source1 = context.createMediaStreamSource(desktopStream);
            const desktopGain = context.createGain();
            desktopGain.gain.value = 0.7;
            source1.connect(desktopGain).connect(destination);
            hasDesktop = true;
        }

        if (voiceStream && voiceStream.getAudioTracks().length > 0) {
            const source2 = context.createMediaStreamSource(voiceStream);
            const voiceGain = context.createGain();
            voiceGain.gain.value = 0.7;
            source2.connect(voiceGain).connect(destination);
            hasVoice = true;
        }

        return (hasDesktop || hasVoice) ? destination.stream.getAudioTracks() : [];
    };

    captureBtn.onclick = async () => {
        download.style.display = 'none';
        const audio = audioToggle.checked || false;
        const mic = micAudioToggle.checked || false;

        desktopStream = await navigator.mediaDevices.getDisplayMedia({ video: true, audio: audio });

        if (mic === true) {
            voiceStream = await navigator.mediaDevices.getUserMedia({ video: false, audio: mic });
        }

        const tracks = [
            ...desktopStream.getVideoTracks(),
            ...mergeAudioStreams(desktopStream, voiceStream)
        ];

        console.log('Tracks to add to stream', tracks);
        stream = new MediaStream(tracks);
        console.log('Stream', stream);
        videoElement.srcObject = stream;
        videoElement.muted = true;

        blobs = [];

        rec = new MediaRecorder(stream, { mimeType: 'video/webm; codecs=vp8,opus' });
        rec.ondataavailable = (e) => blobs.push(e.data);
        rec.onstop = async () => {
            blob = new Blob(blobs, { type: 'video/webm' });
            let url = window.URL.createObjectURL(blob);
            download.href = url;
            download.download = 'test.webm';
            download.style.display = 'block';
        };

        startBtn.disabled = false;
        captureBtn.disabled = true;
        audioToggle.disabled = true;
        micAudioToggle.disabled = true;
    };

    startBtn.onclick = () => {
        startBtn.disabled = true;
        stopBtn.disabled = false;
        rec.start();
    };

    stopBtn.onclick = () => {
        captureBtn.disabled = false;
        audioToggle.disabled = false;
        micAudioToggle.disabled = false;
        startBtn.disabled = true;
        stopBtn.disabled = true;
        
        rec.stop();
        
        stream.getTracks().forEach(s => s.stop());
        videoElement.srcObject = null;
        stream = null;
    };
};

Audio capture with getDisplayMedia is only fully supported with Chrome for Windows. Other platforms have a number of limitations:

  • there is no support for audio capture at all under Firefox or Safari;
  • on Chrome/Chromium for Linux and Mac OS, only the audio of a Chrome/Chromium tab can be captured, not the audio of a non-browser application window.
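
Because of these differences, it can be worth checking at runtime whether the stream returned by getDisplayMedia actually contains an audio track before relying on it. A minimal sketch (the warning text is only illustrative):

async function captureWithAudioCheck() {
    const stream = await navigator.mediaDevices.getDisplayMedia({ video: true, audio: true });

    // Even when audio is requested, some browser/OS/source combinations return video only.
    if (stream.getAudioTracks().length === 0) {
        console.warn('No desktop audio track was captured on this platform/source.');
    }
    return stream;
}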

const displayMediaOptions = {audio: { echoCancellation: false }};

This will capture your conference tab too.
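
A minimal sketch of how that constraints object might be used end to end; the explicit video: true and the recordTab name are assumptions added for illustration, not part of the original answer:

async function recordTab() {
    // The user is assumed to pick the conference tab in the share dialog.
    const stream = await navigator.mediaDevices.getDisplayMedia({ video: true, ...displayMediaOptions });
    const recorder = new MediaRecorder(stream, { mimeType: 'video/webm; codecs=vp8,opus' });
    recorder.start();
    return recorder;
}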
