... is there only a very technical reason behind this, perhaps simply that V8 wasn't developed with server-side JavaScript in mind? Or is there a reason more related to the architecture and nature of Node.js? Is there a reason why I should consider using Node.js for streaming and piping rather than for loading and processing larger amounts of data, besides "because then you have to increase the memory limit"?

asked Nov 19, 2017 at 19:48 by vuza
  • 2 If the memory limit is too low, GC would trigger too often; if the memory is too large, GC would stop the world for too long. But for sure you can increase the memory limit easily. This is only a default. – Jonas Wilms Commented Nov 19, 2017 at 19:54

1 Answer


The primary reason why there is a limit at all is to protect the rest of the system from out-of-control memory consumers (i.e. when you have several processes running, and one of them consumes all the memory it can get, you want that one to run into its own limit rather than affecting everything else in the system). This is particularly true for the browser scenario (you don't want one tab to consume all your memory at the expense of all other tabs), but applies in many server-side scenarios as well.

The default value of the limit is more or less arbitrary; it seems to be a reasonable compromise for most purposes. If you need more, feel free to raise it. 64-bit builds of modern V8 versions ("modern" = from the last few years) support extremely large heaps if your app needs them.

Regarding your last question, "streaming and piping" are generally useful techniques for working with large amounts of data, regardless of whether you're using Node.js or any other technology. Code that reads all of its input data in one go tends to be easier to write, but is always prone to running into limits: for example, physical memory size of the machine it's running on. You can "fix" that one by buying more memory, but if your code uses streaming and reasonably sized buffers, it'll be able to process much bigger input data with much lower hardware requirements, and it might even run faster.

Tags: javascript | Why does Node.js have a 1.76 GB memory limit? (Stack Overflow)