
As the title says, multiple consumers are consuming the same task. I have RabbitMQ on a server and six machines connected to it, each running a Celery worker. The same task (note the identical task id) is received on two different machines:

machine 1: [2024-11-21 19:15:12,181: INFO/MainProcess] Task tasks.task_name[ef00cc1f-1be5-44ba-8911-90c0746196ba] received

machine 2: [2024-11-21 19:04:29,949: INFO/MainProcess] Task tasks.task_name[ef00cc1f-1be5-44ba-8911-90c0746196ba] received
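A quick way to confirm that both log lines refer to a single task rather than two separate publishes is to compare the task ids. A throwaway sketch (the regex assumes Celery's default log format shown above):

```python
import re

LOG_LINES = [
    "[2024-11-21 19:15:12,181: INFO/MainProcess] Task tasks.task_name[ef00cc1f-1be5-44ba-8911-90c0746196ba] received",
    "[2024-11-21 19:04:29,949: INFO/MainProcess] Task tasks.task_name[ef00cc1f-1be5-44ba-8911-90c0746196ba] received",
]

# Celery logs the task id in square brackets right after the task name.
TASK_ID_RE = re.compile(r"Task \S+\[([0-9a-f-]{36})\] received")

ids = [TASK_ID_RE.search(line).group(1) for line in LOG_LINES]
print(ids[0] == ids[1])  # True -> one task, delivered twice
```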

celery config:

import os
from celery import Celery
import settings
import sys
sys.path.insert(0, f"{settings.BASE_DIR.parent}/")


broker_url = "amqp://username:password@machine_ip:rabbitmq_port/"

app = Celery(broker=broker_url)

app.conf.update(
    task_acks_late=True,
    broker_transport_options={'visibility_timeout': 3600},  # Adjust timeout as needed
    broker_connection_retry_on_startup=True,
    imports=['tasks'],
)

app.autodiscover_tasks()


# celery -A celery_app worker --pool=solo -Q queue_name -l info --logfile logs/celery.logs

I'm also running RabbitMQ via Docker Compose.

config:

rabbitmq:
  image: "rabbitmq:3-management"
  ports:
    - "5677:5672"
    - "15677:15672"
  environment:
    RABBITMQ_DEFAULT_USER: "username"
    RABBITMQ_DEFAULT_PASS: "password"
    RABBITMQ_DEFAULT_VHOST: "/"
  volumes:
    - ./rabbitmq_data:/var/lib/rabbitmq

celery task:

@app.task(queue="queue_name")
def task_name(batch_s3_path, task_id, api_data, index):
    pass
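One mitigation while debugging the root cause: with `task_acks_late=True`, RabbitMQ redelivers any unacknowledged message when a worker's connection drops, so deliveries are at-least-once and the task body should tolerate duplicates. A minimal in-memory sketch of an idempotency guard (`process_once` and `_processed` are hypothetical names; a real multi-machine deployment would need a shared store such as Redis, since each worker has its own memory):

```python
_processed = set()  # hypothetical in-process dedup store

def process_once(task_id, handler, *args):
    """Run handler only for the first delivery of a given task_id."""
    if task_id in _processed:
        return None  # duplicate delivery: skip
    _processed.add(task_id)
    return handler(*args)

first = process_once("ef00cc1f", lambda x: x * 2, 21)   # first delivery: runs
second = process_once("ef00cc1f", lambda x: x * 2, 21)  # duplicate: skipped
```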

What am I missing, and why is this happening?
