I’m building a dropshipping e-commerce site and using Python with Flask as the backend framework. The site uses APIs to fetch product details from third-party providers and display them on the frontend.

While testing, I noticed that the API response time is significantly slowing down the page load. Here’s an outline of my current setup:

I use requests to fetch product data from the provider’s API. Responses are processed in the same Flask route that serves the frontend page. The backend is deployed on a shared hosting server.

I've already tried adding caching with Flask-Caching using a 10-minute timeout, optimizing database queries with indexing, and enabling gzip compression for API responses. Despite these efforts, the overall page speed is still suboptimal, especially during peak hours.
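Roughly, the route looks like this (a simplified sketch with a placeholder provider URL and template name):

import requests
from flask import Flask, render_template
from flask_caching import Cache

app = Flask(__name__)
cache = Cache(app, config={"CACHE_TYPE": "SimpleCache"})

@app.route("/product/<product_id>")
@cache.cached(timeout=600)  # the 10-minute cache mentioned above
def product_page(product_id):
    # The provider API call happens inside the route that serves the page,
    # so the page load blocks until the API responds
    response = requests.get(
        f"https://provider.example/api/products/{product_id}", timeout=10
    )
    product = response.json()
    return render_template("product.html", product=product)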

asked Nov 21, 2024 at 5:37 by Deepak Rao
  • With this information we can't do much more than guess. You will have to find the bottleneck first. A tracing tool like ZipKin might come in handy. If you find the bottleneck to be your code, it might qualify as a Stack Overflow or Code Review question when presented with the relevant code. – Klaus D. Commented Nov 21, 2024 at 5:49

1 Answer


Based on what you've described, I have 2 options for you to try:

  1. Make sure that your database queries do not take place inside any loops, as that will greatly hurt performance. Instead, run a few larger queries up front, build a dictionary from each result set, and then combine them into a final result dictionary. It could look something like this:
def some_route_function():
    # Run both queries once, up front -- never inside a loop
    query1 = Table.query.filter(some_filter).all()
    query2 = OtherTable.query.filter(some_other_filter).all()
    query1_dict = {}
    query2_dict = {}

    # Index the first result set by its reference column
    for row in query1:
        query1_dict[row.some_ref_col] = {
            'some_key': row.some_col,
            'some_f_key': row.some_f_col
        }

    # Index the second result set the same way
    for row in query2:
        query2_dict[row.some_ref_col] = {
            'some_key': row.some_col,
            'some_f_key': row.some_f_col
        }

    # Both result sets are now accessible by the shared reference column,
    # so the combined result can be built with dictionary lookups instead
    # of additional queries
    result_dict = {}

    for row in query1:
        ref_data_point = row.some_ref_col
        if ref_data_point in query2_dict:
            result_dict[ref_data_point] = {
                'some_key': row.some_col,
                'some_other_key': query2_dict[ref_data_point]['some_key']
            }

    # result_dict can now be used as needed, e.g. passed to a template
Please note that this is not a fully optimized example; it's meant to show how you can speed up result building by avoiding queries inside loops. You still need to adapt it to your use case. For instance, given how the results are built above, the loop that populates the query1 dictionary could be removed, since it's never referenced in the result-building loop. The concept will still help reduce response time if you are currently querying data inside loops (there isn't enough information to tell whether you are, since you only mentioned optimizing the DB with indexing, which won't help if you query inside a loop).

  2. You mentioned fetching data from a third-party provider. Those requests could be run in threads so they fetch data simultaneously, and your route can wait for all of them to finish. That alone should speed up your response time if a single route makes multiple requests; see the sketch below.
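Here is a minimal sketch of that idea using Python's concurrent.futures; the provider URLs, the fetch_products helper, and the template name are assumptions, not part of your setup:

import requests
from concurrent.futures import ThreadPoolExecutor
from flask import Flask, render_template

app = Flask(__name__)

# Hypothetical provider endpoints -- replace with your real API URLs
PROVIDER_URLS = [
    "https://provider-a.example/api/products",
    "https://provider-b.example/api/products",
]

def fetch_products(url):
    # Each call runs in its own thread, so slow providers overlap
    # instead of adding up one after the other
    response = requests.get(url, timeout=5)
    response.raise_for_status()
    return response.json()

@app.route("/products")
def products():
    # Fire all provider requests at once and wait for every result
    with ThreadPoolExecutor(max_workers=len(PROVIDER_URLS)) as executor:
        results = list(executor.map(fetch_products, PROVIDER_URLS))
    return render_template("products.html", products=results)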

Otherwise, look at what processing can be offloaded to the client side by performing the product requests in the browser with some JavaScript (I know you mention Python and Flask, but you likely have some JS on your frontend pages anyway) and making your Flask templates smart enough to accommodate data arriving on the fly. Whether this works depends on how the data needs to be displayed on your front end; a Flask-side sketch of the idea follows.
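In this sketch the page shell renders immediately and the browser fetches product data afterwards; the /api/product route and the fetch_product_from_provider helper are assumptions for illustration:

from flask import Flask, jsonify, render_template

app = Flask(__name__)

@app.route("/product/<product_id>")
def product_page(product_id):
    # Render the page shell right away, without waiting on the provider API
    return render_template("product.html", product_id=product_id)

@app.route("/api/product/<product_id>")
def product_data(product_id):
    # The frontend calls this endpoint (e.g. with fetch()) after the page
    # has loaded, so the slow provider request no longer blocks rendering
    data = fetch_product_from_provider(product_id)  # hypothetical helper
    return jsonify(data)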

BONUS:

  • Look at using something like nginx in combination with gunicorn to serve your Flask app. gunicorn handles requests by distributing them across worker processes, which is much faster than running your Flask app's built-in server directly, and nginx acts as a reverse proxy and load balancer in front of it. A rough configuration sketch follows.
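As a rough sketch (the values are assumptions to adjust for your server), gunicorn can be configured with a small Python file and pointed at your app:

# gunicorn.conf.py -- example settings, adjust for your hosting environment
# start with: gunicorn -c gunicorn.conf.py app:app
bind = "127.0.0.1:8000"   # nginx proxies requests to this address
workers = 4               # common rule of thumb: (2 x CPU cores) + 1
timeout = 30              # restart workers that hang longer than this (seconds)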

Please do let me know if you've tried all of these and I'll see if I can help with some additional suggestions.
