How to edit webm Blobs captured by Chrome MediaRecorder

In Chrome, I'm using MediaRecorder and canvas.captureStream() to create a webm file of the canvas.

let recorder = new MediaRecorder(document.querySelector('canvas').captureStream(), {mimeType: 'video/webm'});
let chunks = [];
let blob;

// collect each chunk of encoded data as the recorder produces it
recorder.ondataavailable = function(event) {
  if (event.data.size > 0) {
    chunks.push(event.data);
  }
};
// assemble the chunks into a single webm Blob and trigger a download
recorder.onstop = function() {
  blob = new Blob(chunks, {type: 'video/webm'});
  let url = URL.createObjectURL(blob);
  let a = document.createElement('a');
  document.body.appendChild(a);
  a.href = url;
  a.download = Date.now() + '.webm';
  a.click();
  window.URL.revokeObjectURL(url);
  a.parentNode.removeChild(a);
};
recorder.onstart = function() {
  chunks = [];
};

This is the basic recording and download code, along with calling recorder.start() to begin recording and recorder.stop() to end it.

The output webm file is fine. The issue I'm having is that, because of a slow computer and other overhead, I can't always draw to the canvas fast enough to sustain a full 60 fps. On the canvas itself I don't mind a lower framerate, but the lag in drawing to the canvas gets translated into the webm, and I'm left with a 0.9x-speed video.

I've tried to remedy this by using canvas.captureStream(0) to capture only a single frame at a time and matching that up with each canvas render. But this fails because I can't specify how long each frame should last, and the file size becomes enormous since every single frame carries all the header information.
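For reference, this is roughly what the captureStream(0) experiment looks like. With a frame rate of 0, the stream only emits a frame when requestFrame() is called on the canvas capture track (Chrome's CanvasCaptureMediaStreamTrack); the helper name `captureOnDraw` and the `draw` callback here are my own, just to sketch the idea:

```javascript
// With captureStream(0), a frame is only pushed into the stream when
// requestFrame() is explicitly called, so capture can be pinned to the
// render loop -- but this gives no control over how long each frame lasts.
function captureOnDraw(stream, draw) {
  const track = stream.getVideoTracks()[0];
  return function renderFrame() {
    draw();               // draw one frame to the canvas
    track.requestFrame(); // push exactly that frame into the stream
  };
}

// Usage sketch:
// const stream = document.querySelector('canvas').captureStream(0);
// const renderFrame = captureOnDraw(stream, draw);
// renderFrame(); // call once per frame you want captured
```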

I can see in my blob array that the first 131 blobs are constant, and blob 132 contains a very large amount of data. After that, there are typically ~7 spacer blobs of 1 byte each, followed by a single blob containing a larger amount of data. I take the first 132 blobs to be header information plus my first frame, and I imagine the blobs with larger amounts of data are each a frame. I'm also assuming the 1-byte spacer blobs have something to do with frame duration, or pause the video for a set amount of time.
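A quick way to see this pattern is to log the chunk sizes after recording stops (`chunks` is the array from the snippet above; the helper name `chunkSizeReport` is my own):

```javascript
// Summarize the recorded chunks: runs of 1-byte "spacer" entries show up
// between the larger frame-sized entries.
function chunkSizeReport(chunks) {
  const sizes = chunks.map(c => c.size);
  const spacers = sizes.filter(s => s === 1).length;
  return {sizes, spacers, total: sizes.reduce((a, b) => a + b, 0)};
}

// e.g. in recorder.onstop: console.log(chunkSizeReport(chunks));
```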

What I would like is to be able to modify those spacer blobs to specify the exact duration of each frame. I tried doing this manually by copying the 7 spacer blobs from between 2 frames where I knew the framerate was ideal, removing all the other spacers, and pasting these ideal spacer blobs between every frame, but the output file did not play.
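One thing worth knowing here: the ondataavailable chunks are arbitrary slices of one continuous byte stream (webm is an EBML container), not self-contained units, which is why cut-and-paste splicing corrupts the file rather than editing frame timing. A small check with the plain Blob API illustrates this -- concatenating the chunks in order is byte-identical to the assembled file, while any reordering changes the bytes (the helper name `sameBytes` is my own):

```javascript
// True iff concatenating `chunks` in order reproduces `blob` byte-for-byte.
// Chunk boundaries carry no container-level meaning, so moving or removing
// chunks breaks the EBML structure instead of adjusting durations.
async function sameBytes(chunks, blob) {
  const a = new Uint8Array(await new Blob(chunks).arrayBuffer());
  const b = new Uint8Array(await blob.arrayBuffer());
  return a.length === b.length && a.every((v, i) => v === b[i]);
}
```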

Am I misunderstanding the blob data? Is there any way to manually specify the duration a frame lasts by modifying the blob data or am I stuck with whatever framerate I can draw to the canvas?


asked Dec 12, 2016 at 2:26 by Bobby
  • Not sure if what I'm about to say is 100% true, but I think you can't really control the recorded video's framerate. The fps parameter is only for captureStream; it's not a setting for the MediaRecorder. The only option you've got is videoBitsPerSecond in the MediaRecorder constructor. You could try playing with it. – Kaiido Commented Dec 12, 2016 at 7:56
  • @Kaiido Unfortunately that won't do the trick. MediaRecorder actually does have a timeslice I can place in MediaRecorder.start(timeslice) which, if set to 1000/fps, will lead to frame captures at the fps I want. I can also manually .requestData() after drawing a frame. The issue is that the bottleneck is the video source (ie canvas refresh). Imagine if I drew to my canvas once per second, so my animation was 1 fps. I want to be able to record that canvas, then generate a .webm where each frame of that canvas animation lasts a time of my choosing, independent of the 1 fps canvas. – Bobby Commented Dec 12, 2016 at 22:18
  • As I said, I wasn't sure about my previous comment. I didn't have a lot of time yesterday to do tests, and my VLC never returns FPS metadata from browser-recorded videos... But my FF seems to report the same number of `mozDecodedFrames` as the fps set in captureStream, when set to <30 fps (10 frames for a 10 sec video at 1 fps, and 297 frames for a 10 sec video at 60 fps). Not sure if that's reliable either, though. – Kaiido Commented Dec 12, 2016 at 23:18
  • 1 But the timeslice argument for start is just to tell at which frequency dataavailable will fire. On FF, if you don't set it, chunks will be very large and contain all the frames of a 10 sec video. So I don't think what you are trying to do is possible either. For the independent frame rate, in this exact example (1 fps drawing to higher fps recording), you can probably achieve it with requestFrame. For the other way, that might be trickier. May be of some interest: stackoverflow.com/questions/40687010/… – Kaiido Commented Dec 12, 2016 at 23:20
  • I'll check out that link, thanks. I might be able to sort of cheat it with requestData. If I set my animation to a purposely higher framerate than I want, maybe I can pause and resume the MediaRecorder along with requestData at the framerate I actually want. Idk, maybe that's the wrong idea. I'll have to keep messing around with it. – Bobby Commented Dec 13, 2016 at 1:16

2 Answers


I was able to define a framerate separate from the canvas refresh rate by pausing and resuming the recorder on timeout, and requesting a frame before pausing again:

let recorder = new MediaRecorder(canvas.captureStream(), {mimeType: 'video/webm'});
recorder.start();
recorder.pause(); // keep the recorder paused except while a frame is being captured

function draw() {
  context.drawImage(...);
  recorder.resume(); // capture the frame that was just drawn
  setTimeout(function() {
    recorder.requestData();
    recorder.pause(); // stop capturing until the next frame is ready

    //update progress bars or any laggy overhead stuff at this point

    requestAnimationFrame(draw);
  }, 1000/fps);
}
requestAnimationFrame(draw);

This way, any lag in the actual canvas drawing, or in updating progress bars etc., will not affect the recorder's frame collection. recorder.requestData() doesn't seem to be necessary, but it also doesn't seem to have any downsides; it's included here for clarity.

I haven't checked in detail, but there may be a double frame at the beginning, depending on whether or not recorder.start() collects an initial frame while your canvas isn't blank.
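To make the pattern above self-contained, here's a sketch that records an exact number of frames with the same pause/resume trick and then stops the recorder so the download fires. The names `recordN`, `drawFrame`, `fps`, and `totalFrames` are my own assumptions, not part of the original answer:

```javascript
// Record exactly `totalFrames` frames at a nominal `fps`, independent of
// how long drawFrame() actually takes, then stop (which triggers onstop).
function recordN(recorder, drawFrame, fps, totalFrames) {
  recorder.start();
  recorder.pause(); // stay paused except while a frame is captured
  let frame = 0;
  function draw() {
    drawFrame(frame);      // render one frame, however long it takes
    recorder.resume();     // let the recorder see that frame
    setTimeout(function () {
      recorder.requestData();
      recorder.pause();
      if (++frame < totalFrames) {
        requestAnimationFrame(draw);
      } else {
        recorder.stop();   // fires ondataavailable then onstop
      }
    }, 1000 / fps);
  }
  requestAnimationFrame(draw);
}
```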

I have struggled a lot trying to make videos frame by frame using canvas.captureStream() and MediaRecorder, including using your pause/resume solution; this simply doesn't seem to be the intended usage. I have, however, found a library built exactly for this purpose: CCapture.js. It worked for me. It works by sampling image dumps of the canvas and joining them afterwards. In the process it overrides some internal functions, so it may not be safe for everything(?). But it makes the job very easy. It also allows multiple output formats, and in my limited experience the webm output has much better quality than what I achieved with MediaRecorder.
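A minimal sketch of the CCapture.js approach, assuming CCapture is loaded on the page; the wrapper name `recordFrames` and the `draw(frame)` callback are my own, and the options shown are just one plausible configuration. Each capture() call grabs exactly one canvas image, so playback speed is fixed by the framerate option rather than by how fast you can draw:

```javascript
// Drive frame-by-frame capture with CCapture.js: render, grab the canvas,
// repeat, then assemble and download the result.
function recordFrames(capturer, canvas, draw, totalFrames) {
  capturer.start();
  let frame = 0;
  function step() {
    draw(frame);               // render one frame, however long it takes
    capturer.capture(canvas);  // grab the canvas as one output frame
    if (++frame < totalFrames) {
      requestAnimationFrame(step);
    } else {
      capturer.stop();
      capturer.save();         // downloads the assembled webm
    }
  }
  requestAnimationFrame(step);
}

// Usage sketch (framerate sets playback speed, independent of draw time):
// const capturer = new CCapture({format: 'webm', framerate: 30});
// recordFrames(capturer, document.querySelector('canvas'), draw, 300);
```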
