Assume we have such a program:

// imagine the string1 to string1000 are very long strings, which will take a while to be written to file system
var arr = ["string1",...,"string1000"]; 
for (let i = 0; i < arr.length; i++) {
  fs.write("./same/path/file.txt", arr[i], { flag: "a" });
}

My question is: will string1 through string1000 be guaranteed to be appended to the file in order?

Since fs.write is an async function, I am not sure how each call to fs.write() is actually executed. I assume each call is queued somewhere on another thread (like a call stack?), and once the previous call is done the next one can be executed.

I'm not really sure if my understanding is accurate.

Edit 1

As noted in the comments and answers, fs.write is not safe for multiple writes to the same file without waiting for the callback. But what about a write stream?

If I use the following code, would it guarantee the order of writing?

// imagine the string1 to string1000 are very long strings, which will take a while to be written to file system
var arr = ["string1",...,"string1000"]; 
var fileStream = fs.createWriteStream("./same/path/file.txt", { flags: "a+" });
for (let i = 0; i < arr.length; i++) {
  fileStream.write(arr[i]);
}
fileStream.on("error", () => { /* do something */ });
fileStream.on("finish", () => { /* do something */ });
fileStream.end();

Any comments or corrections will be helpful! Thanks!


asked Oct 27, 2016 at 19:33 by Lubor; edited Oct 27, 2016 at 19:53
  • 1 If you mean fs.writeFile(), then the documentation states that "...it is unsafe to use fs.writeFile multiple times on the same file without waiting for the callback". – robertklep Commented Oct 27, 2016 at 19:41
  • Note that due to how buffering works in node.js, not waiting for the callback does not affect the order of writes. What's unsafe is the possibility of overflowing the buffer. So the risk is missing writes, not unordered writes. – slebetman Commented Oct 27, 2016 at 19:55
  • @slebetman Thanks for replying. But what does the buffer here really mean? – Lubor Commented Oct 27, 2016 at 19:57
  • 2 @Lubor: The core node.js manages file/disk I/O by spawning one thread per open file. When you write, you don't actually write to the file. What you do instead is send a message to this I/O thread. So this I/O thread needs to store this message somewhere in RAM. This is the I/O buffer. I believe its size is fixed at compile time. The thread then does a proper async I/O loop, writing data from this buffer to disk whenever the file is writable (when the OS write buffer for the file is empty; your OS again does not write to disk directly but buffers in a similar way). – slebetman Commented Oct 27, 2016 at 20:02
  • 1 @Lubor: From what I've read of the code when answering a similar question I can see that the ordering of strings entering this I/O buffer and being taken from this buffer to be written to disk is guaranteed. The buffer can't be rearranged thus the order you write is the order that will be written to disk. But if the buffer is full your writes will be ignored. – slebetman Commented Oct 27, 2016 at 20:05

4 Answers


The docs say that

Note that it is unsafe to use fs.write multiple times on the same file without waiting for the callback. For this scenario, fs.createWriteStream is strongly recommended.

Using a stream works because streams inherently guarantee that the order of strings being written to them is the same order that is read out of them.

var stream = fs.createWriteStream("./same/path/file.txt", { flags: "a" });
stream.on('error', console.error);
arr.forEach((str) => { 
  stream.write(str + '\n'); 
});
stream.end();

Another way to still use fs.write but also make sure things happen in order is to use promises to maintain the sequential logic.

function writeToFilePromise(str) {
  return new Promise((resolve, reject) => {
    fs.writeFile("./same/path/file.txt", str, { flag: "a" }, (err) => {
      if (err) return reject(err);
      resolve();
    });
  });
}

// for every string, 
// write it to the file, 
// then write the next one once that one is finished and so on
arr.reduce((chain, str) => {
  return chain
   .then(() => writeToFilePromise(str));
}, Promise.resolve());

You can synchronize access to the file using read/write locking for Node. See the following example; the rwlock documentation has more details.

var ReadWriteLock = require('rwlock');

var lock = new ReadWriteLock();

lock.writeLock(function (release) {
  fs.appendFile(fileName, addToFile, function (err) {
    if (err)
      console.log("write error"); // log the error
    else
      console.log("write ok");

    release(); // unlock
  });
});

I had the same problem and wrote an NPM package to solve it for my project. It works by buffering the data in an array and waiting until the event loop turns over, then concatenating and writing the data in a single call to fs.appendFile:

const SeqAppend = require('seqappend');

const writeLog = SeqAppend('log1.txt');

writeLog('Several...');
writeLog('...logged...');
writeLog('.......events');

You can use the json-stream package to achieve this.

Tags: javascript | Can multiple fs.write calls appending to the same file guarantee the order of execution? - Stack Overflow