How to limit the size of each read from a ReadableStream
I'm accessing a file from my device storage through an HTML file input, and then I read that file as a stream.
Is there any built-in way to limit the length of each read to a specific number of bytes?
const stream = myFile.stream();
const reader = stream.getReader();

// I call this each time I want to read more data from the stream
reader.read().then(function ({ done, value }) {
  const bufferSizeInBytes = value.length; // I want to cap this at 1000 bytes
});
Another thing that confuses me: why do we get a different buffer size on each read? Does it depend on the available memory, on the CPU, or how does it actually work? If it depended only on memory, the whole stream could be read in a single read, since the file is around 100 MB and my available memory is around 6 GB. In practice it took many reads, which makes me think memory is not the only factor behind this operation.
Any help would be much appreciated.
2 Answers
One way is to create an intermediate ReadableStream that keeps its own buffer: whenever the buffer reaches the desired chunkSize, enqueue a chunk of exactly that size (and once the source is done, enqueue whatever remains, which will be smaller than chunkSize).
Like this:
const readable = myFile.stream(); // the stream of the File picked via the <input type="file">
const reader = readable.getReader();
const chunkSize = 1 * 1024 * 1024; // 1 MB

let buffer = new Uint8Array(0); // bytes read from the source but not yet enqueued

const readableWithDefinedChunks = new ReadableStream({
  async pull(controller) {
    let fulfilledChunkQuota = false;

    while (!fulfilledChunkQuota) {
      const status = await reader.read();

      if (!status.done) {
        const chunk = status.value;
        // Append the newly read bytes to the buffer
        buffer = new Uint8Array([...buffer, ...chunk]);

        // Emit as many full-size chunks as the buffer currently holds
        while (buffer.byteLength >= chunkSize) {
          const chunkToSend = buffer.slice(0, chunkSize);
          controller.enqueue(chunkToSend);
          buffer = buffer.slice(chunkSize);
          fulfilledChunkQuota = true;
        }
      }

      if (status.done) {
        fulfilledChunkQuota = true;
        // Flush whatever is left over (smaller than chunkSize), then close
        if (buffer.byteLength > 0) {
          controller.enqueue(buffer);
        }
        controller.close();
      }
    }
  },
});
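To tie this back to the question, the wrapped stream can then be consumed exactly like the original one, except that every chunk except possibly the last should now be chunkSize bytes long. A quick consumption sketch (not part of the original answer):

const chunkedReader = readableWithDefinedChunks.getReader();

(async () => {
  while (true) {
    const { done, value } = await chunkedReader.read();
    if (done) break;
    // value.byteLength === chunkSize for every chunk except possibly the last
    console.log('chunk of', value.byteLength, 'bytes');
  }
})();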
No, you can't currently control the chunk size of the reader you get from the default file stream. What you can do is get the data as a readable byte stream and then call stream.getReader({ mode: 'byob' }) to obtain a BYOB ("bring your own buffer") reader, which lets you cap how many bytes each read returns.
Further information: https://web.dev/streams/
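For completeness, here is roughly what the BYOB approach looks like once you do have a byte stream. This is only a sketch: the function name and the 1000-byte limit are illustrative, and whether myFile.stream() is actually a byte stream depends on the browser (if it isn't, getReader({ mode: 'byob' }) throws a TypeError).

async function readInChunksOfAtMost(myFile, maxBytes = 1000) {
  const reader = myFile.stream().getReader({ mode: 'byob' });
  let buffer = new ArrayBuffer(maxBytes); // we hand the stream our own buffer

  while (true) {
    // Each read fills at most maxBytes into the view we pass in
    const { done, value } = await reader.read(new Uint8Array(buffer));
    if (done) break;
    console.log('read', value.byteLength, 'bytes'); // never more than maxBytes
    buffer = value.buffer; // the buffer was transferred, so reuse the returned one
  }
}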