javascript - Cannot get audio blob recorded in chrome to work in safari - Stack Overflow
I am attempting to play an audio blob in Safari. It plays for a fraction of a second and I never hear any audio. The media element fires a "pause" event a tiny fraction of a second into playback (for example, at 0.038s).
The blob is recorded in Chrome, and playback works just fine in Chrome and Firefox.
The duration Safari reports for the media is also much shorter than it should be. For example, one recording is 7.739 seconds long and Chrome recognizes the correct duration, but Safari shows 1.584. Another had a duration of 9.96, but Safari reported 6.552.
I have ruled out Safari blocking playback that is not user-initiated: playback starts on a tap. I have also tried different MIME types: mpeg, and webm with the h264 and vp8 codecs.
I have made sure the downloaded blob is the same size in Safari as it is in Chrome.
I have looked through a number of similar posts, including the one with the answer by @lastmjs, "Loading audio via a Blob URL fails in Safari", where a demo is provided. The demo does work, and I am doing more or less what it shows, so I suspect the problem is on the recording side.
Recorder:
self.mediaRecorder = new MediaRecorder(stream,{'audio' : {'sampleRate' : 22000}});
...assemble the chunks...
self.audioBlob = new Blob(self.audioChunks, {type: 'audio/webm; codecs=vp8'});
...upload the blob to cloud (S3)...
Player:
...in the success handler that downloads blob...
self.audioBlob = new Blob([data],{type: 'audio/webm'});
...I later prepare the element for playback...
let audioUrl = window.URL.createObjectURL(self.audioBlob);
let audioElement = document.createElement('audio');
let sourceElement = document.createElement('source');
audioElement.muted = true;
audioElement.appendChild(sourceElement);
sourceElement.src = audioUrl;
sourceElement.type = 'audio/webm';
document.body.appendChild(audioElement);
audioElement.load()
... when the user taps on a button...
self.audioElement.muted = false;
let playPromise = self.audioElement.play();
playPromise.then(()=>{
console.log("playing should have started: " + self.audioElement.muted + " - " + self.audioElement.paused);
});
...shortly after this - the paused event handler gets fired.
There are no error messages. I am trying this in Safari on Mac and on iOS, and no errors appear. I also listen for the error event on the media element, and nothing fires. It just doesn't play for very long. I am clearly missing something. Again, capture and playback work great in Chrome, and playback works in Firefox, but playback in Safari just won't work. What should I try?
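One way to get more signal than "nothing fires" is to dump the element's state the moment the unexpected pause happens. The helper below is hypothetical (not part of the original code) and takes a plain object so it can be exercised outside a browser; in the page you would pass the audio element itself.

```javascript
// Hypothetical diagnostic helper: summarize the media element's state when
// the unexpected "pause" fires, so Safari's view of the file (duration,
// readyState, error) can be compared side by side with Chrome's.
function describeMediaState(el) {
  const err = el.error ? el.error.code : "none";
  return `paused=${el.paused} t=${el.currentTime} dur=${el.duration}` +
         ` ready=${el.readyState} err=${err}`;
}

// In the page (sketch):
// audioElement.addEventListener("pause", () => {
//   console.log(describeMediaState(audioElement));
// });
```

Logging `duration` and `readyState` at the pause makes the truncated-duration symptom visible in the console rather than only audible.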
Asked Jul 28, 2019 by Pineapple Joe
Comment: webm is not supported in Safari. You can try running the command below in the browser console to test whether a MIME type is supported in your browser: MediaRecorder.isTypeSupported('audio/webm;codecs=opus'). I am struggling with a similar problem; let me know if you find any solution. I will update this thread if I am able to fix it before that. – Shubham Gautam, Apr 14, 2021
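The probe the comment suggests can be generalized into a small helper that walks a candidate list and returns the first type the current browser can record. This is a sketch, not part of the original code; the predicate is injected so the helper can also run outside a browser, and the candidate list shown is an assumption, not an exhaustive one.

```javascript
// Sketch: return the first MIME type the injected predicate accepts,
// or "" if none are supported. In a real page the predicate would be
// (t) => MediaRecorder.isTypeSupported(t).
function pickSupportedMimeType(candidates, isSupported) {
  return candidates.find((t) => isSupported(t)) || "";
}

// Hypothetical browser usage:
// const mimeType = pickSupportedMimeType(
//   ["audio/webm;codecs=opus", "audio/mp4", "audio/mpeg"],
//   (t) => MediaRecorder.isTypeSupported(t)
// );
// const recorder = new MediaRecorder(stream, { mimeType });
```

Recording in a container Safari actually supports (e.g. audio/mp4 in recent versions) sidesteps the problem at the source rather than at playback.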
5 Answers
You can try converting the blob to WAV so that it is compatible with Safari, either using a library or the AudioContext API. Here is how I converted the blob to WAV in my Angular application:
Add a function convertBlobToWav which takes the blob recorded by the MediaRecorder API and converts it to WAV format:
async convertBlobToWav(blob: Blob): Promise<Blob> {
  const arrayBuffer = await new Promise<ArrayBuffer>((resolve, reject) => {
    const fileReader = new FileReader();
    fileReader.onload = () => resolve(fileReader.result as ArrayBuffer);
    fileReader.onerror = reject;
    fileReader.readAsArrayBuffer(blob);
  });
  const audioContext = new AudioContext();
  const audioBuffer = await audioContext.decodeAudioData(arrayBuffer);
  const wavBuffer = this.audioBufferToWav(audioBuffer);
  return new Blob([wavBuffer], { type: 'audio/wav' });
}
Add the audioBufferToWav function, which serializes the AudioBuffer produced by decoding the blob:
audioBufferToWav(buffer: AudioBuffer): ArrayBuffer {
  const numChannels = buffer.numberOfChannels;
  const sampleRate = buffer.sampleRate;
  const numFrames = buffer.length;
  const bytesPerSample = 2; // 16-bit PCM
  const blockAlign = numChannels * bytesPerSample;
  const byteRate = sampleRate * blockAlign;
  const wavBuffer = new ArrayBuffer(44 + numFrames * blockAlign);
  const dataView = new DataView(wavBuffer);
  // RIFF/WAVE header
  dataView.setUint32(0, 0x52494646);                        // "RIFF"
  dataView.setUint32(4, 36 + numFrames * blockAlign, true); // file size - 8
  dataView.setUint32(8, 0x57415645);                        // "WAVE"
  dataView.setUint32(12, 0x666d7420);                       // "fmt "
  dataView.setUint32(16, 16, true);                         // fmt chunk size
  dataView.setUint16(20, 1, true);                          // PCM format code
  dataView.setUint16(22, numChannels, true);
  dataView.setUint32(24, sampleRate, true);
  dataView.setUint32(28, byteRate, true);
  dataView.setUint16(32, blockAlign, true);
  dataView.setUint16(34, 8 * bytesPerSample, true);         // bits per sample
  dataView.setUint32(36, 0x64617461);                       // "data"
  dataView.setUint32(40, numFrames * blockAlign, true);     // data chunk size
  const channelData = new Array(numChannels);
  for (let i = 0; i < numChannels; i++) {
    channelData[i] = buffer.getChannelData(i);
  }
  // Interleave samples and clamp to signed 16-bit integers
  let offset = 44;
  for (let i = 0; i < numFrames; i++) {
    for (let j = 0; j < numChannels; j++) {
      const sample = Math.max(-1, Math.min(1, channelData[j][i]));
      const int16Value = sample < 0 ? sample * 0x8000 : sample * 0x7fff;
      dataView.setInt16(offset, int16Value, true);
      offset += bytesPerSample;
    }
  }
  return wavBuffer;
}
Finally, call the async function and store the result in a new variable:
this.newBlob = await this.convertBlobToWav(this.recordedBlob);
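The 44-byte header layout in the answer above can be sanity-checked outside the browser. The snippet below is a minimal plain-JS sketch that writes only the header with the same offsets and endianness (assuming mono 16-bit PCM; the sample rate and frame count shown are arbitrary), so the magic strings and field values can be verified in Node.

```javascript
// Sketch: build just the 44-byte WAV header, mirroring the offsets used in
// audioBufferToWav above, to confirm the RIFF/WAVE magic bytes land where
// a decoder expects them. Assumes 16-bit PCM.
function wavHeader(numChannels, sampleRate, numFrames) {
  const bytesPerSample = 2;
  const blockAlign = numChannels * bytesPerSample;
  const buf = new ArrayBuffer(44);
  const dv = new DataView(buf);
  dv.setUint32(0, 0x52494646);                         // "RIFF" (big-endian ASCII)
  dv.setUint32(4, 36 + numFrames * blockAlign, true);  // file size - 8
  dv.setUint32(8, 0x57415645);                         // "WAVE"
  dv.setUint32(12, 0x666d7420);                        // "fmt "
  dv.setUint32(16, 16, true);                          // fmt chunk size
  dv.setUint16(20, 1, true);                           // PCM format code
  dv.setUint16(22, numChannels, true);
  dv.setUint32(24, sampleRate, true);
  dv.setUint32(28, sampleRate * blockAlign, true);     // byte rate
  dv.setUint16(32, blockAlign, true);
  dv.setUint16(34, 8 * bytesPerSample, true);          // bits per sample
  dv.setUint32(36, 0x64617461);                        // "data"
  dv.setUint32(40, numFrames * blockAlign, true);      // data chunk size
  return buf;
}

const bytes = new Uint8Array(wavHeader(1, 22050, 100));
const magic = String.fromCharCode(bytes[0], bytes[1], bytes[2], bytes[3]);
console.log(magic); // "RIFF"
```

A malformed header is one of the few things Safari rejects silently while Chrome tolerates it, which makes this kind of check worth running once.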
Changing the blob to new Blob([data], { type: 'audio/mpeg' }) worked for me.
'audio/wav' and 'audio/webm' didn't work. I don't know why, but in Safari 15.4 only 'audio/mpeg' works. Without this setting you can play/pause, but there is no sound.
For everyone having the same problem: try changing 'audio/webm' to 'audio/wav'.
I had the same problem and I solved it with this library: https://github.com/muaz-khan/RecordRTC.
It is very similar to using the MediaRecorder API.
As of 2023/06/17 this works on Safari/iOS (iPadOS 16.3.1).
The key part is
let audio = document.createElement("audio");
let blob = new Blob(data, { type: "audio/mpeg" });
audio.src = window.URL.createObjectURL(blob);
res.appendChild(audio);
A full example is here:
https://dk-minimal-mediarecorder.glitch.me/