I spun up a cloud server and deployed my Flask web app code plus a frontend there. When a user runs an operation on the frontend, it is sent to a backend endpoint as JSON, processed, and another JSON response is sent back to the frontend. I want to say up front that I am new to this and am seeking advice on how best to handle situations like 100+ simultaneous users.
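The endpoint looks roughly like this (the route name, payload fields, and the process() helper are simplified placeholders rather than my actual code):

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

def process(payload):
    # stand-in for the real per-request work
    return {"result": payload.get("value", 0) * 2}

@app.route("/api/operation", methods=["POST"])
def operation():
    data = request.get_json()   # JSON sent by the frontend
    result = process(data)      # work happens here
    return jsonify(result)      # JSON sent back to the frontend
```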
I can picture one user making a call to the Flask endpoint, which gets processed and the result sent back to the frontend. But what if 100+ users are hitting that same endpoint? Would it be a first-come, first-served situation, like a queue? And what about the possibility of two people hitting the endpoint at exactly the same time? I am using Gunicorn instead of the default Flask development server (via flask run).
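I currently launch it with a configuration along these lines (the worker and thread counts are just guesses on my part, which is part of what I'm asking about):

```python
# gunicorn.conf.py
bind = "0.0.0.0:8000"
workers = 4    # separate processes; requests in different workers run in parallel
threads = 2    # threads per worker; requests within a worker can interleave
timeout = 30   # seconds before a hung request is killed
```

started with `gunicorn -c gunicorn.conf.py app:app`.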
I am new to these concepts, but I am wondering how they might apply (if at all) to my situation, and if so, how I would go about integrating them into my code/infrastructure:
- Queuing
- Caching
- Asynchronous processing
- Multi-processing
- Multi-threading
- Concurrency
- Parallelism
- Distributed systems
If anyone could please add some clarity, it would be greatly appreciated. If there are any monitoring tools I could attach to this application, that would also be great to know about.
Thank you