I have a 6 GB JSON file. When I read it with the following code,
var fs = require('fs');
var contents = fs.readFileSync('large_file.txt').toString();
it fails with the following error:
buffer.js:182
throw err;
^
RangeError: "size" argument must not be larger than 2147483647
at Function.Buffer.allocUnsafe (buffer.js:209:3)
at tryCreateBuffer (fs.js:530:21)
at Object.fs.readFileSync (fs.js:569:14)
at Object.<anonymous> (/home/readHugeFile.js:4:19)
at Module._compile (module.js:569:30)
at Object.Module._extensions..js (module.js:580:10)
at Module.load (module.js:503:32)
at tryModuleLoad (module.js:466:12)
at Function.Module._load (module.js:458:3)
at Function.Module.runMain (module.js:605:10)
Could somebody help, please?
asked Jul 9, 2017 at 9:08 by superuserDoHaveStupidQ; edited Jul 9, 2017 at 12:41 by Alessio Cantarella

Comments:
- Possible duplicate of "Node.js read big file with fs.readFileSync" – Kukic Vladimir, Jul 9, 2017 at 9:20
- Possible duplicate of "What's the maximum size of a Node.js Buffer" – Evan Carroll, Feb 24, 2019 at 23:27
2 Answers
The maximum size for a Buffer, which is what readFileSync() uses internally to hold the file data, is about 2 GB (source: https://nodejs.org/api/buffer.html#buffer_buffer_kmaxlength).
You probably need a streaming JSON parser, like JSONStream, to process your file:
const JSONStream = require('JSONStream'); // npm install JSONStream
const fs = require('fs');

fs.createReadStream('large_file.json')
  .pipe(JSONStream.parse('*')) // emit each top-level entry as it is parsed
  .on('data', entry => {
    console.log('entry', entry);
  });
If your task is to read data from each line of a very large file, you can read it with the line-reader npm package, which uses streams under the hood. For example, every 50,000 lines you can write the lines out to smaller files sequentially, process those files, and then clear them out. Note that line-reader does not wait for you if you read and process each line asynchronously (e.g. updates in MongoDB), so handle that accordingly. I did this and it worked even for a 10 GB file.