I'm accessing a file from my device storage using an HTML file input, then I read that file as a stream.

  • Is there any built-in way to limit the length of each read to a specific number of bytes?
const stream = myFile.stream();
const reader = stream.getReader();

// I call this each time I want to read more data from the stream
reader.read().then(function ({ done, value }) {
  const bufferSizeInBytes = value.length; // I want to cap this at 1000 bytes
});

Another thing that confuses me: why do we get a different buffer size on each read? Does it depend on the available memory, the CPU, or how does it actually work? If it depended on memory alone, we should be able to read the whole stream in a single read, since the file is around 100 MB and my available memory is around 6 GB. In practice it took many reads, which makes me think memory is not the only factor behind this operation.

Any help would be much appreciated.


asked Dec 8, 2021 at 6:49 by OULAHTAK, edited Dec 8, 2021 at 6:56

2 Answers


One way is to create an intermediate ReadableStream that keeps its own buffer: only enqueue once the buffer reaches the desired chunkSize (or, when the source stream ends, enqueue whatever is left, which may be smaller than chunkSize).

Like this:

  // readable is the file's stream, e.g. const readable = myFile.stream();
  const reader = readable.getReader();
  const chunkSize = 1 * 1024 * 1024; // 1 MB

  let buffer = new Uint8Array(0);

  const readableWithDefinedChunks = new ReadableStream({
    async pull(controller) {
      let fulfilledChunkQuota = false;

      while (!fulfilledChunkQuota) {
        const status = await reader.read();

        if (!status.done) {
          // Append the new chunk to the leftover bytes from previous reads
          const chunk = status.value;
          buffer = new Uint8Array([...buffer, ...chunk]);

          // Emit as many full-sized chunks as the buffer now holds
          while (buffer.byteLength >= chunkSize) {
            const chunkToSend = buffer.slice(0, chunkSize);
            controller.enqueue(chunkToSend);
            buffer = buffer.slice(chunkSize);
            fulfilledChunkQuota = true;
          }
        } else {
          fulfilledChunkQuota = true;
          // Flush whatever is left (smaller than chunkSize)
          if (buffer.byteLength > 0) {
            controller.enqueue(buffer);
          }
          controller.close();
        }
      }
    },
  });
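To see the approach in isolation, the same buffering logic can be wrapped in a helper and fed an in-memory stream. This is a sketch: the `rechunk` name and the synthetic source below are mine, for illustration; in the real case the source would be `myFile.stream()`.

```javascript
// Re-chunk any ReadableStream of Uint8Array chunks into chunks of exactly
// `chunkSize` bytes (the final chunk may be smaller). Same idea as above,
// using set()-based concatenation instead of spreading byte-by-byte.
function rechunk(readable, chunkSize) {
  const reader = readable.getReader();
  let buffer = new Uint8Array(0);
  return new ReadableStream({
    async pull(controller) {
      // Fill the buffer until it holds at least one full chunk
      while (buffer.byteLength < chunkSize) {
        const { done, value } = await reader.read();
        if (done) {
          if (buffer.byteLength > 0) controller.enqueue(buffer);
          controller.close();
          return;
        }
        const merged = new Uint8Array(buffer.byteLength + value.byteLength);
        merged.set(buffer);
        merged.set(value, buffer.byteLength);
        buffer = merged;
      }
      controller.enqueue(buffer.slice(0, chunkSize));
      buffer = buffer.slice(chunkSize);
    },
  });
}

// Usage with a synthetic source standing in for myFile.stream():
async function demo() {
  const source = new ReadableStream({
    start(c) {
      c.enqueue(new Uint8Array(2500)); // odd-sized chunks in...
      c.enqueue(new Uint8Array(700));
      c.close();
    },
  });
  const sizes = [];
  const reader = rechunk(source, 1000).getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    sizes.push(value.byteLength); // ...capped 1000-byte chunks out
  }
  return sizes;
}
```

With the 2500-byte and 700-byte inputs above, the consumer sees three 1000-byte chunks followed by a 200-byte tail.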

No, you can't currently control the chunk size of the default file stream's reader. You can try to transform it into a byte stream, then use stream.getReader({ mode: 'byob' }) to get a BYOB ("bring your own buffer") reader that lets you cap the size of each read.
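As a sketch of what a BYOB read looks like: since `File.stream()` is not guaranteed to be a byte stream in every browser, the example below constructs a byte stream by hand (`makeByteStream` and `readAtMost` are illustrative names, not standard APIs). The key point is that the reader fills a buffer *you* supply, so a read can never return more bytes than that buffer holds.

```javascript
// Hypothetical byte-stream source over an in-memory Uint8Array, standing in
// for a file's data. With type: 'bytes', the controller exposes byobRequest,
// which gives direct access to the consumer-supplied buffer.
function makeByteStream(data) {
  let offset = 0;
  return new ReadableStream({
    type: 'bytes',
    pull(controller) {
      if (offset >= data.byteLength) {
        controller.close();
        controller.byobRequest?.respond(0); // settle any pending BYOB request
        return;
      }
      const view = controller.byobRequest?.view;
      if (view) {
        // Copy at most view.byteLength bytes into the consumer's buffer
        const n = Math.min(view.byteLength, data.byteLength - offset);
        new Uint8Array(view.buffer, view.byteOffset, n)
          .set(data.subarray(offset, offset + n));
        offset += n;
        controller.byobRequest.respond(n);
      } else {
        // Non-BYOB consumer: enqueue a copy of the rest
        controller.enqueue(data.slice(offset));
        offset = data.byteLength;
      }
    },
  });
}

// Read at most maxBytes from the stream in a single read.
async function readAtMost(stream, maxBytes) {
  const reader = stream.getReader({ mode: 'byob' });
  const { done, value } = await reader.read(new Uint8Array(maxBytes));
  reader.releaseLock();
  return done ? new Uint8Array(0) : value;
}
```

Calling `readAtMost(makeByteStream(data), 1000)` on a 2500-byte source returns a chunk of exactly 1000 bytes, since the read is bounded by the supplied buffer.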

Further information: https://web.dev/streams/

Tags: javascript, How to limit the size of each read from a ReadableStream, Stack Overflow