Is it correct to modify `django.db.connections.databases` dynamically to handle multiple databases?
This is my first time developing a multi-tenant SaaS application in Django. In this SaaS, each company has its own PostgreSQL database, and these databases are created dynamically when a company registers. I cannot predefine all the databases in settings.DATABASES, because companies can register at any time and the server must not require a restart.

My current solution uses a middleware that detects the company from the subdomain or a JWT token and then modifies connections.databases at runtime to configure the connection to the company's database:
import redis
from django.db import connections
from django.core.exceptions import ImproperlyConfigured
from django.utils.connection import ConnectionDoesNotExist
from rest_framework_simplejwt.authentication import JWTAuthentication
from myapp.models import Company  # Company model stored in the global database


class CompanyDBMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response
        self.jwt_authenticator = JWTAuthentication()
        self.cache = redis.Redis(host='localhost', port=6379, db=0)

    def __call__(self, request):
        company_db = self.get_database_for_company(request)
        if not company_db:
            raise ImproperlyConfigured("Could not determine the company's database.")
        # Register the connection only if it does not exist in `connections.databases`
        if company_db not in connections.databases:
            connections.databases[company_db] = {
                'ENGINE': 'django.db.backends.postgresql',
                'NAME': company_db,
                'USER': 'postgres',
                'PASSWORD': 'your_password',
                'HOST': 'localhost',
                'PORT': '5432',
                'CONN_MAX_AGE': 60,  # To avoid opening and closing connections on each request
            }
        request.company_db = company_db
        response = self.get_response(request)
        # Close the connection after the response
        try:
            connections[company_db].close()
        except ConnectionDoesNotExist:
            pass
        return response

    def get_database_for_company(self, request):
        subdomain = request.get_host().split('.')[0]
        cache_key = f"company_db_{subdomain}"
        company_db = self.cache.get(cache_key)
        if company_db:
            return company_db.decode("utf-8")
        try:
            company = Company.objects.using('default').get(subdomain=subdomain, active=True)
            company_db = company.db_name
            self.cache.setex(cache_key, 300, company_db)  # Cache the database name for 5 minutes
            return company_db
        except Company.DoesNotExist:
            return None
My questions are:
- Is it correct to modify connections.databases dynamically on each request to handle multiple databases?
- Is there a better way to do this in Django without restarting the application when registering new databases?
- How does this practice affect performance in environments with load balancing and multiple Django instances?
- Would it be better to deploy a separate API per client in its own Django container?
- I am considering giving each client their own portal on a separate domain and only deploying their frontend in a container while keeping a centralized API. Is this approach more efficient?
I appreciate any recommendations on best practices or potential issues with this approach.
Comment from Alexandr Zayets (Feb 21 at 9:52): I would only consider a separate database and Django container per client when tenants can differ greatly in load. For example, one tenant may be very small while another is very high-load, or even sees rapid spikes of activity into the tens of thousands of users; in that case one tenant can negatively affect the availability of the others. I don't know the details of your business, but 90% or more of multi-tenant projects do not require separate infrastructure and can be handled by adding a way to separate tenants within one database and one project; you can even serve multiple domains from one project.
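For the single-database separation the comment describes, the usual pattern is a tenant foreign key on every tenant-owned model plus a filter on each query. A minimal sketch, with illustrative model names (Tenant and Invoice are not from the question):

from django.db import models

class Tenant(models.Model):
    subdomain = models.CharField(max_length=63, unique=True)

class Invoice(models.Model):
    # Every tenant-owned row carries its tenant; every query filters on it.
    tenant = models.ForeignKey(Tenant, on_delete=models.CASCADE)
    amount = models.DecimalField(max_digits=10, decimal_places=2)

# In a view, after a middleware has attached the tenant to the request:
# Invoice.objects.filter(tenant=request.tenant)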
2 Answers
You can use the django-tenants library. It's designed for this purpose.
https://django-tenants.readthedocs.io/en/latest/
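Worth noting: django-tenants isolates tenants with one PostgreSQL schema per tenant inside a single database, rather than one database per company, so it avoids touching connections.databases entirely. A minimal sketch of the setup, assuming the tenant models live in an app called myapp (all names and values illustrative):

# settings.py (excerpt)
DATABASES = {
    'default': {
        'ENGINE': 'django_tenants.postgresql_backend',  # django-tenants' PostgreSQL backend
        'NAME': 'saas',
        'USER': 'postgres',
        'PASSWORD': 'your_password',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}
DATABASE_ROUTERS = ('django_tenants.routers.TenantSyncRouter',)

MIDDLEWARE = [
    'django_tenants.middleware.main.TenantMainMiddleware',  # resolves the tenant from the hostname
    # ... the rest of your middleware ...
]

TENANT_MODEL = 'myapp.Company'        # model inheriting TenantMixin
TENANT_DOMAIN_MODEL = 'myapp.Domain'  # model inheriting DomainMixin

SHARED_APPS = ['django_tenants', 'myapp']      # tables live in the shared public schema
TENANT_APPS = ['django.contrib.contenttypes']  # tables are duplicated in every tenant schema
INSTALLED_APPS = list(SHARED_APPS) + [a for a in TENANT_APPS if a not in SHARED_APPS]

# myapp/models.py
from django.db import models
from django_tenants.models import TenantMixin, DomainMixin

class Company(TenantMixin):
    name = models.CharField(max_length=100)
    auto_create_schema = True  # create the tenant's schema automatically on save()

class Domain(DomainMixin):
    pass

Registering a new company then amounts to creating a Company row plus a Domain row pointing at its subdomain; new tenants need no restart and no settings changes.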
The Django documentation explicitly states that you shouldn't be doing this:
You shouldn’t alter settings in your applications at runtime. The only place you should assign to settings is in a settings file.
Consider using a router, or writing your own database backend by extending django.db.backends.postgresql.
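A minimal sketch of the router idea, under the assumption that a middleware (like yours) still registers the alias and then records it in a contextvars.ContextVar; TenantRouter and set_current_tenant are illustrative names, not Django API:

# tenancy.py
from contextvars import ContextVar

# The middleware sets this once it has resolved the tenant's database alias.
_current_tenant = ContextVar('current_tenant', default=None)

def set_current_tenant(alias):
    _current_tenant.set(alias)

class TenantRouter:
    """Send every ORM query to the database alias resolved for the current request."""

    def db_for_read(self, model, **hints):
        return _current_tenant.get() or 'default'

    def db_for_write(self, model, **hints):
        return _current_tenant.get() or 'default'

    def allow_relation(self, obj1, obj2, **hints):
        # Only allow relations between objects in the same database.
        return obj1._state.db == obj2._state.db

    def allow_migrate(self, db, app_label, **hints):
        # Run migrations explicitly per alias: manage.py migrate --database=<alias>
        return None

# settings.py
# DATABASE_ROUTERS = ['tenancy.TenantRouter']

The middleware would call set_current_tenant(company_db) before get_response(request), which removes the need to sprinkle .using(...) through views. Note that a router only chooses among aliases that already exist, so the concern from the question about mutating connections.databases at runtime remains.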