As the title says, I have RabbitMQ on a server and six machines connected to it, each running a Celery worker. Occasionally the same task (note the identical task id) is received by more than one machine:
machine 1: [2024-11-21 19:15:12,181: INFO/MainProcess] Task tasks.task_name[ef00cc1f-1be5-44ba-8911-90c0746196ba] received
machine 2: [2024-11-21 19:04:29,949: INFO/MainProcess] Task tasks.task_name[ef00cc1f-1be5-44ba-8911-90c0746196ba] received
celery config:

import os
import sys

from celery import Celery

import settings

sys.path.insert(0, f"{settings.BASE_DIR.parent}/")

broker_url = "amqp://username:password@machine_ip:rabbitmq_port/"

app = Celery(broker=broker_url)
app.conf.update(
    task_acks_late=True,
    broker_transport_options={'visibility_timeout': 3600},  # Adjust timeout as needed
    broker_connection_retry_on_startup=True,
    imports=['tasks'],
)
app.autodiscover_tasks()

# workers are started with:
# celery -A celery_app worker --pool=solo -Q queue_name -l info --logfile logs/celery.logs
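To figure out whether the second machine is getting a broker *redelivery* (as opposed to the task genuinely being published twice), I can check the AMQP `redelivered` flag, which Celery exposes on a bound task via `self.request.delivery_info`. A minimal self-contained sketch of that check (the dict shape mirrors what Celery provides for the AMQP transport; the helper name `is_redelivery` is mine):

```python
def is_redelivery(delivery_info):
    """Return True if the broker marked this message as redelivered.

    `delivery_info` mirrors Celery's `self.request.delivery_info` for the
    AMQP transport, e.g.:
        {'exchange': '', 'routing_key': 'queue_name', 'redelivered': True}
    """
    return bool(delivery_info.get('redelivered'))

# Inside a bound task (@app.task(bind=True, queue="queue_name")) this
# would be called as: is_redelivery(self.request.delivery_info)
```

If the flag is True on the second machine, RabbitMQ requeued an unacked message (e.g. after a consumer connection drop, which `task_acks_late=True` makes more likely for long tasks) rather than the task being enqueued twice.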
I'm also running RabbitMQ via Docker Compose. The service config:
rabbitmq:
  image: "rabbitmq:3-management"
  ports:
    - "5677:5672"
    - "15677:15672"
  environment:
    RABBITMQ_DEFAULT_USER: "username"
    RABBITMQ_DEFAULT_PASS: "password"
    RABBITMQ_DEFAULT_VHOST: "/"
  volumes:
    - ./rabbitmq_data:/var/lib/rabbitmq
celery task:
@app.task(queue="queue_name")
def task_name(batch_s3_path, task_id, api_data, index):
    pass
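As a stopgap while I track this down, I could make the task idempotent by claiming each task id exactly once before doing any work. A minimal self-contained sketch (`claim_task` is a name I made up, and the in-process `set` is only a stand-in so the sketch runs on its own; with six machines the real guard would need shared state, e.g. Redis `SET NX` with an expiry):

```python
# Stand-in for shared state; a multi-machine setup needs an external store.
_seen_task_ids = set()

def claim_task(task_id):
    """Return True the first time a task id is seen, False for duplicates.

    Called at the top of the task body; duplicates return early instead
    of re-running the work.
    """
    if task_id in _seen_task_ids:
        return False
    _seen_task_ids.add(task_id)
    return True
```

This doesn't explain the duplicate deliveries, but it would at least stop the same batch from being processed twice.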
What am I missing? Why is the same task being delivered to multiple consumers?