I'm looking at using pooled connections from NodeJS to PostgreSQL. I'll be using the Pool class in the pg library, along with async / await.
I've read that PostgreSQL by default has a limit of 100 concurrent connections and that Pool has a default of 10 pooled connections.
My app will scale up new instances as it comes under heavy load, so I could theoretically end up with more than 10 instances, which would then exceed the 100 PostgreSQL max connections.
What I'd like to know is, what will happen when I execute await pool.query(....) under the following circumstances:
- All 10 pooled connections are currently in use - will it await one to become available or throw an exception?
- All 100 connections to the DB server are in use and NodeJS tries to create a new connection from a pool.
Also, how can I write some NodeJS code that will attempt to make 101 pooled connections in order to prove this behaviour?
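For reference, the kind of usage I have in mind is roughly the following sketch; the connection string and table name are just placeholders, not anything specific to my app:

    const { Pool } = require('pg');

    // Placeholder connection string.
    const pool = new Pool({
      connectionString: 'postgres://user:pass@localhost:5432/mydb',
      max: 10, // pg's default pool size, shown here for clarity
    });

    async function rowCount() {
      // pool.query() checks a client out of the pool, runs the query,
      // and returns the client to the pool once the promise settles.
      const result = await pool.query('SELECT count(*) AS n FROM some_table');
      return result.rows[0].n;
    }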
Comment – Bergi (Jan 19, 2022): for (let i=0; i<11; i++) { const p = new Pool(); for (let j=0; j<11; j++) { p.query('SELECT true') } }?
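For anyone wanting to reproduce this, a runnable expansion of that comment might look like the sketch below. It assumes a local Postgres with the default max_connections = 100, and the connection string is a placeholder. Each of the 11 pools opens up to its default max of 10 connections, and pg_sleep keeps them busy long enough that they all actually get opened:

    const { Pool } = require('pg');

    // Placeholder connection string; point it at a disposable test database.
    const connectionString = 'postgres://user:pass@localhost:5432/mydb';

    async function main() {
      const pools = [];
      const work = [];
      // 11 pools x 11 queries: each pool opens up to its default max of 10
      // connections, so up to 110 server connections are requested in total,
      // exceeding the default max_connections of 100.
      for (let i = 0; i < 11; i++) {
        const pool = new Pool({ connectionString });
        pools.push(pool);
        for (let j = 0; j < 11; j++) {
          work.push(
            // pg_sleep keeps each connection busy so the pools really do
            // open new connections instead of reusing idle ones.
            pool.query('SELECT pg_sleep(5)')
              .then(() => console.log(`pool ${i} query ${j}: ok`))
              .catch(err => console.error(`pool ${i} query ${j}:`, err.message))
          );
        }
      }
      await Promise.all(work);
      await Promise.all(pools.map(p => p.end()));
    }

    main();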
1 Answer
When all connections in a pool are in use, a new requestor will just block until another query finishes and releases a connection, unless connectionTimeoutMillis is set, in which case it will get a synthetic error after the specified timeout. This is documented.
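A minimal sketch of that behaviour, assuming a reachable Postgres instance (connection details are placeholders). With max: 1, the second query has to wait for the only connection to free up, and setting connectionTimeoutMillis turns that wait into a rejection:

    const { Pool } = require('pg');

    const pool = new Pool({
      connectionString: 'postgres://user:pass@localhost:5432/mydb', // placeholder
      max: 1,
      connectionTimeoutMillis: 2000, // omit this and the second query simply waits
    });

    async function demo() {
      const slow = pool.query('SELECT pg_sleep(5)'); // occupies the only connection
      try {
        // Rejects after ~2 seconds because no connection frees up in time;
        // without connectionTimeoutMillis it would resolve once pg_sleep finishes.
        await pool.query('SELECT 1');
      } catch (err) {
        console.error(err.message); // e.g. "timeout exceeded when trying to connect"
      }
      await slow;
      await pool.end();
    }

    demo();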
When all of PostgreSQL's max_connections are exhausted (by multiple pools, for example), then attempts to get a connection will fail and the failure will propagate back to the requestor. This does not seem to be documented, and one could imagine the Pool being more clever, for example by intercepting the error and making the client wait as if the max for that pool had been reached (but in that case, what if it were the first connection that that pool tried to make? What would it be waiting for?), or by periodically retrying the connection until one becomes available.
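So the error has to be handled by the caller. A hedged sketch of what that can look like, assuming pool is an existing pg Pool; Postgres reports this condition as "sorry, too many clients already" with SQLSTATE 53300, which pg surfaces as err.code:

    // pool is assumed to be an already-constructed pg Pool instance.
    async function queryOrFail(pool, text, params) {
      try {
        return await pool.query(text, params);
      } catch (err) {
        // SQLSTATE 53300 = too_many_connections; the pool does not retry
        // for you, so the caller has to decide what to do with it.
        if (err.code === '53300') {
          console.error('Server connection limit reached:', err.message);
        }
        throw err;
      }
    }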
So you would be well advised not to let this happen, by limiting how far the app server can scale, increasing max_connections in Postgres, or lowering max in each pool.
Which of these makes sense depends on the circumstances. There is no point scaling the app server at all if the bottleneck is entirely in the database.
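One way to apply the "lower max in each pool" option is to size each instance's pool from the expected number of instances. This is only a sketch; MAX_APP_INSTANCES and DATABASE_URL are assumed environment variables, not anything from the question:

    const { Pool } = require('pg');

    // Assumed environment variables.
    const maxInstances = Number(process.env.MAX_APP_INSTANCES || 10);
    const serverMaxConnections = 100; // the Postgres default

    const pool = new Pool({
      connectionString: process.env.DATABASE_URL,
      // Leave headroom for superuser/maintenance sessions, then divide the
      // remainder across the expected number of app instances.
      max: Math.max(1, Math.floor((serverMaxConnections - 10) / maxInstances)),
      connectionTimeoutMillis: 5000, // fail fast rather than queueing forever
    });

    module.exports = pool;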