Assume we have such a program:
// imagine string1 to string1000 are very long strings, which will take a while to be written to the file system
var arr = ["string1", ..., "string1000"];
for (let i = 0; i < arr.length; i++) {
  fs.write("./same/path/file.txt", arr[i], { flag: "a" });
}
My question is: will string1 through string1000 be guaranteed to append to the same file in order?
Since fs.write is an async function, I am not sure how each call to fs.write() is really executed. I assume each call is queued somewhere (like a call stack?) and once the previous call is done, the next call can be executed. I'm not really sure if my understanding is accurate.
Edit 1
As noted in the comments and answers, I see that fs.write is not safe for multiple writes to the same file without waiting for the callback. But what about a write stream?
If I use the following code, would it guarantee the order of writing?
// imagine string1 to string1000 are very long strings, which will take a while to be written to the file system
var arr = ["string1", ..., "string1000"];
var fileStream = fs.createWriteStream("./same/path/file.txt", { flags: "a+" });
for (let i = 0; i < arr.length; i++) {
  fileStream.write(arr[i]);
}
fileStream.on("error", () => { /* do something */ });
fileStream.on("finish", () => { /* do something */ });
fileStream.end();
Any comments or corrections will be helpful! Thanks!
asked Oct 27, 2016 at 19:33 by Lubor, edited Oct 27, 2016 at 19:53

4 Answers
The docs say that:
Note that it is unsafe to use fs.write multiple times on the same file without waiting for the callback. For this scenario, fs.createWriteStream is strongly recommended.
Using a stream works because streams inherently guarantee that the order of strings being written to them is the same order that is read out of them.
var stream = fs.createWriteStream("./same/path/file.txt");
stream.on('error', console.error);
arr.forEach((str) => {
  stream.write(str + '\n');
});
stream.end();
Another way to still use a per-call write function but make sure things happen in order is to use promises to maintain the sequential logic.
function writeToFilePromise(str) {
  return new Promise((resolve, reject) => {
    fs.writeFile("./same/path/file.txt", str, { flag: "a" }, (err) => {
      if (err) return reject(err);
      resolve();
    });
  });
}
// for every string,
// write it to the file,
// then write the next one once that one is finished, and so on
arr.reduce((chain, str) => {
  return chain.then(() => writeToFilePromise(str));
}, Promise.resolve());
You can synchronize access to the file using a read/write lock for Node; see the following example, and you can read the rwlock documentation for details:
var ReadWriteLock = require('rwlock');
var lock = new ReadWriteLock();

lock.writeLock(function (release) {
  fs.appendFile(fileName, addToFile, function (err) {
    if (err) {
      console.log("write error"); // logging error message
    } else {
      console.log("write ok");
    }
    release(); // unlock
  });
});
I had the same problem and wrote an NPM package to solve it for my project. It works by buffering the data in an array and waiting until the event loop turns over to concatenate and write the data in a single call to fs.appendFile:
const SeqAppend = require('seqappend');
const writeLog = SeqAppend('log1.txt');
writeLog('Several...');
writeLog('...logged...');
writeLog('.......events');
You can use the json-stream package to achieve this.
If you meant fs.writeFile(), then the documentation states that "...it is unsafe to use fs.writeFile multiple times on the same file without waiting for the callback". – robertklep Commented Oct 27, 2016 at 19:41