
I know this is doable with MediaSource, but MediaSource doesn't support all video formats (fragmented MP4, for example). That's a problem because my application is client-side only and has no server that could fix up the file.

const blob = await ipfs.getBlobFromStream(hash)
const url = URL.createObjectURL(blob)
this.setState({...this.state, videoSrc: url})

const getBlobFromStream = async (hash) => {
  return new Promise(async resolve => {
    let entireBuffer
    const s = await stream(hash)
    s.on('data', buffer => {
      console.log(buffer)
      // accumulate every chunk into one growing typed array
      if (!entireBuffer) {
        entireBuffer = buffer
      } else {
        entireBuffer = concatTypedArrays(entireBuffer, buffer)
      }
    })
    s.on('end', () => {
      const arrayBuffer = typedArrayToArrayBuffer(entireBuffer)
      // Blob expects an array of parts, not a bare ArrayBuffer
      const blob = new Blob([arrayBuffer])
      resolve(blob)
    })
  })
}

This is the code I'm using right now: it waits for the entire file, collects it into a single array, wraps that in a Blob, and passes it to URL.createObjectURL.


  • 1 What do you want to achieve? To start the video when the first buffer(s) arrive? And to create a blob url for each incoming buffer and pass that url into a video element? – Serkan Sipahi Commented Sep 12, 2018 at 11:46
  • 3 What does the hash variable contain? Is the stream function from a library? Could you please give more details, thanks. – Serkan Sipahi Commented Sep 12, 2018 at 11:53
  • @Bitcollage I want the video to start playing before the entire buffer is downloaded. I want it to "buffer" or "stream" like on YouTube. The hash is the IPFS infoHash. stream is a wrapper function I made around this method: github.com/ipfs/interface-ipfs-core/blob/master/SPEC/… – cooldude101 Commented Sep 12, 2018 at 16:41
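For reference, a wrapper like the one described in the last comment might look roughly like this. This is a hypothetical reconstruction, assuming the older js-ipfs API in which catReadableStream returns a Node-style readable stream of the file's bytes:

// Hypothetical reconstruction of the stream(hash) wrapper from the comments,
// assuming `ipfs` is an initialized js-ipfs instance and its (since-deprecated)
// catReadableStream method is available.
const stream = async (hash) => {
  // returns a readable stream that emits 'data' chunks and 'end'
  return ipfs.catReadableStream(hash)
}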

2 Answers


You can do it if you restructure your code like this:

await ipfs.startBlobStreaming(hash);
this.setState({...this.state, videoComplete: true});

// assumes this function is defined where this.setState is available
const startBlobStreaming = async (hash) => {
  return new Promise(async (resolve) => {
    let entireBuffer;
    const s = await stream(hash);
    s.on('data', buffer => {
      if (!entireBuffer) {
        entireBuffer = buffer;
      } else {
        entireBuffer = concatTypedArrays(entireBuffer, buffer);
      }
      // rebuild the blob and the object URL on every chunk;
      // Blob takes an array of parts, so wrap the buffer in []
      const arrayBuffer = typedArrayToArrayBuffer(entireBuffer);
      const blob = new Blob([arrayBuffer]);
      const url = URL.createObjectURL(blob);
      this.setState({...this.state, videoSrc: url});
    });
    s.on('end', _ => resolve());
  });
}

I don't know how frequently the buffers come into s.on, but you could also collect the buffers over a certain time window (e.g. 1000ms) and only then create the blob url; a sketch of that follows below.
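A minimal sketch of that batching idea, assuming the same stream, concatTypedArrays and setState pieces as above, and that this code lives where this.setState is available (the 1000ms interval is an arbitrary choice):

const startBlobStreamingBatched = async (hash) => {
  return new Promise(async (resolve) => {
    let entireBuffer;
    let lastUpdate = 0;

    const updateVideoSrc = () => {
      // revoke the previous object URL so the old blob can be collected
      if (this.state.videoSrc) URL.revokeObjectURL(this.state.videoSrc);
      const url = URL.createObjectURL(new Blob([entireBuffer]));
      this.setState({...this.state, videoSrc: url});
    };

    const s = await stream(hash);
    s.on('data', buffer => {
      entireBuffer = entireBuffer ? concatTypedArrays(entireBuffer, buffer) : buffer;
      // rebuild the blob url at most once per second
      if (Date.now() - lastUpdate > 1000) {
        lastUpdate = Date.now();
        updateVideoSrc();
      }
    });
    s.on('end', () => {
      updateVideoSrc(); // final, complete blob
      resolve();
    });
  });
}

Note that replacing videoSrc swaps the video element's source, which typically resets the playback position; that is the fundamental limitation of the blob-url approach compared to MediaSource.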

Unfortunately it is not currently possible to create a generally readable blob url with content that will be determined asynchronously.

If the goal is specifically media playback, then there is the MediaSource API, which you mention you know about. You imply that it requires server-side processing, but that is not always true: you can generate fragmented MP4 from a normal MP4 file with client-side code, for example with something like mux.js (the last time I used it, it generated a wrong/buggy fMP4 header, so I needed some custom code to fix its output), or an Emscripten build of ffmpeg, or something else.
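For reference, the MediaSource side of that route looks roughly like this. It assumes you already have fragmented MP4 bytes (e.g. out of a transmuxer) and that the codec string matches the actual file; both are assumptions you would need to verify against your input:

// Sketch: feeding fragmented MP4 segments to a <video> via MediaSource.
const playFragmentedMp4 = (video, mimeCodec) => {
  const mediaSource = new MediaSource();
  video.src = URL.createObjectURL(mediaSource);
  const queue = [];
  let sourceBuffer;

  const appendNext = () => {
    // appendBuffer is asynchronous; only one append may be in flight at a time
    if (sourceBuffer && !sourceBuffer.updating && queue.length) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  };

  mediaSource.addEventListener('sourceopen', () => {
    sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
    sourceBuffer.addEventListener('updateend', appendNext);
    appendNext();
  });

  // caller pushes the fMP4 init segment, then media segments, as they arrive
  return (uint8Array) => { queue.push(uint8Array); appendNext(); };
};

// usage - the codec string is an example and must match the real file;
// transmuxToFmp4 stands in for whatever client-side transmuxing you use:
const push = playFragmentedMp4(videoEl, 'video/mp4; codecs="avc1.64001f, mp4a.40.2"');
s.on('data', buffer => push(transmuxToFmp4(buffer)));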

I agree with you that the MediaSource API has many drawbacks/differences from a generic stream concept:

  • the data cannot be in arbitrary formats or arbitrarily split into chunks; it must be in one of a few specific formats (i.e. fragmented MP4 or WebM), and its fragmentation must follow the format's specific requirements;
  • it cannot be read by generic url-reading methods like XHR or fetch; it is only usable by audio/video elements;
  • it can only be assigned to a single media element, and only once;
  • it can be read non-sequentially by the corresponding media element (i.e. seeking);
  • you cannot control the data flow with stream-like mechanisms such as backpressure or pull events; instead you need to manually monitor the media element's current position in seconds and figure out the corresponding data segments (see the sketch after this list);
  • it buffers a copy of the data added to it, doubling memory usage in some use-cases (you can manually remove data from its buffer to try to mitigate this).
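A rough sketch of that manual flow control, assuming a video element and sourceBuffer set up as in the earlier sketch (the 30-second retention window is an arbitrary choice):

// Sketch: evict data far behind the playhead so the SourceBuffer's
// internal copy does not grow unbounded.
const KEEP_BEHIND_SECONDS = 30;

video.addEventListener('timeupdate', () => {
  const cutoff = video.currentTime - KEEP_BEHIND_SECONDS;
  // remove() is asynchronous too, so skip this round if an update is in flight
  if (cutoff > 0 && !sourceBuffer.updating) {
    sourceBuffer.remove(0, cutoff);
  }
});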

Unfortunately for now that is the only option.
