
If I have a web server that runs Node.js, then will I be able to serve multiple requests at the same time? From my preliminary tests I can see that Node, being primarily single threaded, can process only one HTTP request at a time. But if one request takes too long to complete (for example, the upload of a large file), then all the other requests have to wait.

Is there a workaround for this situation? Can we write the code such that it can serve multiple HTTP requests at the same time?

asked Mar 3, 2013 at 21:41 by Sankha Narayan Guria

  • If your app can only service one HTTP request at a time, then you've done something wrong. Specifically, make sure you aren't using any of the *Sync methods. – josh3736, Mar 3, 2013 at 23:34

3 Answers


The fact that Node is single threaded does not necessarily mean it can only process 1 request at a time.

A lot of things in Node are purposely asynchronous, such as many file system operations, DB queries, etc. The mindset is:

I know you're going to take some time to do this, so call me when you're done and in the meantime I'm going to put my single thread of operation to some better use, rather than waiting for you to complete.

It is at that point that other work (which can be other requests) is processed. Of course, if that work then has to perform an asynchronous operation of its own, the flow of operation might return to where we suspended ourselves earlier, because the earlier operation has completed.

In your situation where a large upload is taking place, the incoming stream is processed asynchronously and is managed through the data event. If you want to write this stream to disk, again, the writing can be performed asynchronously; i.e. there are points during the file upload process where other requests could be processed.
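
As a rough sketch of that idea (the destination file name and port here are made up for illustration, and a real server would need unique file names and error handling):

var http = require("http");
var fs = require("fs");

http.createServer(function(req, res){
    // The request body arrives as a stream; each data chunk is handled
    // asynchronously, so the event loop stays free between chunks.
    var destination = fs.createWriteStream("upload.tmp"); // illustrative path
    req.pipe(destination);
    req.on("end", function(){
        res.end("Upload received");
    });
}).listen(8080);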

Having multiple threads of operation is only beneficial if you have multiple CPU cores; then each thread of operation can run on a different core. To do this, Node has the cluster module, which you should look at.
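
A minimal sketch of the cluster module, assuming one worker per CPU core (the port number is arbitrary):

var cluster = require("cluster");
var http = require("http");
var os = require("os");

if (cluster.isMaster) {
    // Fork one worker per CPU core; each worker runs its own event loop.
    os.cpus().forEach(function(){
        cluster.fork();
    });
} else {
    // Workers share the same port; incoming connections are distributed
    // among them by the master.
    http.createServer(function(req, res){
        res.end("Handled by worker " + process.pid);
    }).listen(8000);
}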

As we can see, node.js gives us a non-blocking I/O server platform, so we can serve multiple requests at the same time. For example:

Consider file handling as a case:

var fs=require("fs");
console.log("starting");
fs.readFile("path-to-file", function(error, data){
    console.log("Content:"+data);
});
console.log("Carry on executing");

OUTPUT:-

starting
Carry on executing
Content:<contents of the file>

So we can see that while we are waiting for the contents of the file, our code carries on executing.
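
For comparison, here is a sketch of the synchronous version (the kind of *Sync call the comment above warns against); the whole process stops until the read finishes, so no other request can be serviced in the meantime:

var fs = require("fs");
console.log("starting");
// readFileSync keeps the single thread busy; nothing else runs until it returns
var data = fs.readFileSync("path-to-file");
console.log("Content:" + data);
console.log("Carry on executing");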

When a request takes a long time, that's not because the processor is taking a long time to process the request. It's because it spends a long time waiting for each chunk of data to come in. It's in this waiting time that node.js can process other requests, which makes it very scalable, and far more efficient than the threaded model that most other platforms use. For a detailed discussion, see the C10k problem by Dan Kegel.
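
A small sketch of this (the 5 second timer stands in for a slow upload or query, and the URL paths and port are invented for illustration):

var http = require("http");

http.createServer(function(req, res){
    if (req.url === "/slow") {
        // Stand-in for a slow I/O-bound operation; while the timer waits,
        // the thread is free to handle other requests.
        setTimeout(function(){
            res.end("slow response");
        }, 5000);
    } else {
        res.end("fast response");
    }
}).listen(8080);

While a request to /slow is "waiting", a request to any other path is answered immediately.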

JavaScript events work by having an event queue. This queue gets added to every time an event (such as a file read operation or a data chunk from a server request) fires. As long as the resulting event handling code isn't too processor intensive, the queue doesn't typically get very long, and code is executed almost immediately. That is why almost everything in node.js is async.
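
The flip side is that a processor-intensive handler does hold up the queue. In this sketch, a busy loop (an artificial stand-in for heavy computation) delays every other callback:

setTimeout(function(){
    // Due immediately, but it can only run once the busy loop has finished
    console.log("timer fired");
}, 0);

var start = Date.now();
while (Date.now() - start < 2000) {
    // Synchronous, CPU-bound busy-wait: the event queue cannot be serviced
}
console.log("busy loop done");

Even though the timer was due at 0 ms, "busy loop done" prints first, because the callback has to wait for the synchronous work to finish.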
