
I spun up a cloud server and deployed my Flask web app code plus a frontend there. When a user runs an operation on the frontend, it is sent to a backend endpoint as JSON, processed, and another JSON response is sent back to the frontend. I am new to this and am seeking advice on how best to handle situations like 100+ simultaneous users.
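
The flow I mean looks roughly like this (a minimal sketch; the route name and JSON fields are placeholders, not my actual code):

```python
# Minimal sketch of the kind of endpoint described above; "/process" and
# the "value"/"echo" fields are placeholders, not the real app's names.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/process", methods=["POST"])
def process():
    payload = request.get_json()             # JSON sent by the frontend
    result = {"echo": payload.get("value")}  # stand-in for real processing
    return jsonify(result)                   # JSON sent back to the frontend
```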

I can imagine one user making a call to the Flask endpoint, which gets processed and the result sent back to the frontend - but what if 100+ users are hitting that same endpoint? Would it be a first-come-first-served situation, like a queue? And what about the possibility of two people hitting the endpoint at the exact same time? I am using Gunicorn instead of Flask's built-in development server (`flask run`).
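
From what I've read, Gunicorn can serve requests in parallel through worker processes and threads, configured for example in a `gunicorn.conf.py` like this (the values are the commonly cited starting point, not numbers tuned for my app):

```python
# gunicorn.conf.py -- a sketch of a typical configuration; values are
# illustrative, not a recommendation tuned for any particular app.
import multiprocessing

# A common rule of thumb: (2 * CPU cores) + 1 worker processes
workers = multiprocessing.cpu_count() * 2 + 1
threads = 2            # threads per worker; each can handle a request
bind = "0.0.0.0:8000"  # address and port Gunicorn listens on
```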

I am new to these concepts, but I am wondering how these might apply (if at all) to my situation, and if so, how I'd go about integrating them into my code/infrastructure:

  1. Queuing
  2. Caching
  3. Asynchronous processing
  4. Multi-processing
  5. Multi-threading
  6. Concurrency
  7. Parallelism
  8. Distributed systems
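
From what I understand so far, a few of these terms can be illustrated with just the standard library - for example, a thread pool overlapping several I/O-bound "requests", which is roughly what a threaded Gunicorn worker does (a toy demonstration, not my app code):

```python
# Toy demonstration of concurrency with the standard library: a thread
# pool handling several simulated requests at once. handle_request is a
# stand-in for the work an endpoint would do.
from concurrent.futures import ThreadPoolExecutor
import time

def handle_request(user_id: int) -> str:
    time.sleep(0.1)  # simulate I/O (database call, external API, ...)
    return f"response for user {user_id}"

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(handle_request, range(10)))
elapsed = time.perf_counter() - start
# The 10 sleeping tasks overlap, so this takes roughly 0.1 s, not ~1 s.
```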

If anyone could please add clarity, it would be greatly appreciated. If there are any monitoring tools I could attach to this application, that would also be great.

Thank you

Tags: python | How to make flask asynchronous to handle multiple users at the same time | Stack Overflow