
I'm trying to set up an ELK stack (Kibana, Logstash and Elasticsearch) in Portainer that should receive logs from PCs around the world.

What I'm not sure about is what a proper setup should look like so that performance for the clients is good.

Let's say Portainer is running in a Docker container in Europe and there are users in America, Europe, Australia and Asia.

What would a proper setup look like? I guess I need a server on each continent in my stack, but how do I redirect the logs to the "fastest" endpoint?

It would be great if anyone could point me to some keywords or articles where I can find a solution.
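
To illustrate what I mean by the "fastest" endpoint: on the client side I imagine something roughly like the sketch below, where every client times a TCP connection to a Logstash endpoint in each region and then ships its JSON logs over UDP to whichever host answered fastest. The hostnames are made up and this is only an idea, not something I have running; the ports are the ones from my compose file further down.

import json
import socket
import time

# Hypothetical per-region Logstash endpoints (hostnames are made up).
ENDPOINTS = [
    "logstash.eu.example.com",
    "logstash.us.example.com",
    "logstash.au.example.com",
    "logstash.asia.example.com",
]

TCP_PROBE_PORT = 50000  # Logstash TCP input (see compose file below)
UDP_LOG_PORT = 5044     # Logstash UDP input (see compose file below)


def fastest_endpoint(hosts):
    """Return the host with the lowest TCP connect time, or None."""
    best_host, best_latency = None, float("inf")
    for host in hosts:
        start = time.monotonic()
        try:
            with socket.create_connection((host, TCP_PROBE_PORT), timeout=2):
                latency = time.monotonic() - start
        except OSError:
            continue  # endpoint unreachable, skip it
        if latency < best_latency:
            best_host, best_latency = host, latency
    return best_host


def send_log(host, event):
    """Ship one JSON log event over UDP to the chosen Logstash."""
    payload = json.dumps(event).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, UDP_LOG_PORT))


if __name__ == "__main__":
    target = fastest_endpoint(ENDPOINTS)
    if target:
        send_log(target, {"message": "hello from a client", "level": "info"})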

Right now I have the ELK stack running locally with Docker and send UDP messages (JSON content) to it. Here is my docker-compose.yml:

services:

  setup:
    profiles:
      - setup
    build:
      context: setup/
      args:
        ELASTIC_VERSION: ${ELASTIC_VERSION}
    init: true
    volumes:
      - ./setup/entrypoint.sh:/entrypoint.sh:ro,Z
      - ./setup/lib.sh:/lib.sh:ro,Z
      - ./setup/roles:/roles:ro,Z
    environment:
      ELASTIC_PASSWORD: ${ELASTIC_PASSWORD:-}
      LOGSTASH_INTERNAL_PASSWORD: ${LOGSTASH_INTERNAL_PASSWORD:-}
      KIBANA_SYSTEM_PASSWORD: ${KIBANA_SYSTEM_PASSWORD:-}
      METRICBEAT_INTERNAL_PASSWORD: ${METRICBEAT_INTERNAL_PASSWORD:-}
      FILEBEAT_INTERNAL_PASSWORD: ${FILEBEAT_INTERNAL_PASSWORD:-}
      HEARTBEAT_INTERNAL_PASSWORD: ${HEARTBEAT_INTERNAL_PASSWORD:-}
      MONITORING_INTERNAL_PASSWORD: ${MONITORING_INTERNAL_PASSWORD:-}
      BEATS_SYSTEM_PASSWORD: ${BEATS_SYSTEM_PASSWORD:-}
    networks:
      - elk
    depends_on:
      - elasticsearch

  elasticsearch:
    build:
      context: elasticsearch/
      args:
        ELASTIC_VERSION: ${ELASTIC_VERSION}
    volumes:
      - ./elasticsearch/config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml:ro,Z
      - elasticsearch:/usr/share/elasticsearch/data:Z
    ports:
      - 9200:9200
      - 9300:9300
    environment:
      node.name: elasticsearch
      ES_JAVA_OPTS: -Xms512m -Xmx512m
      ELASTIC_PASSWORD: ${ELASTIC_PASSWORD:-}
      discovery.type: single-node
    networks:
      - elk
    restart: unless-stopped

  logstash:
    build:
      context: logstash/
      args:
        ELASTIC_VERSION: ${ELASTIC_VERSION}
    volumes:
      - ./logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml:ro,Z
      - ./logstash/pipeline:/usr/share/logstash/pipeline:ro,Z
    ports:
      - 5044:5044/udp
      - 50000:50000/tcp
      - 9600:9600
    environment:
      LS_JAVA_OPTS: -Xms256m -Xmx256m
      LOGSTASH_INTERNAL_PASSWORD: ${LOGSTASH_INTERNAL_PASSWORD:-}
    networks:
      - elk
    depends_on:
      - elasticsearch
    restart: unless-stopped

  kibana:
    build:
      context: kibana/
      args:
        ELASTIC_VERSION: ${ELASTIC_VERSION}
    volumes:
      - ./kibana/config/kibana.yml:/usr/share/kibana/config/kibana.yml:ro,Z
    ports:
      - 5601:5601
    environment:
      KIBANA_SYSTEM_PASSWORD: ${KIBANA_SYSTEM_PASSWORD:-}
    networks:
      - elk
    depends_on:
      - elasticsearch
    restart: unless-stopped

networks:
  elk:
    driver: bridge

volumes:
  elasticsearch:
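
The Logstash pipeline mounted from ./logstash/pipeline is not shown above. A minimal pipeline matching the exposed ports, i.e. a UDP input with the JSON codec (plus the TCP input) feeding Elasticsearch, looks roughly like this; the index name is just an example and I'm assuming the logstash_internal user created by the setup service:

input {
  udp {
    port => 5044
    codec => json
  }
  tcp {
    port => 50000
    codec => json_lines
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    user => "logstash_internal"
    password => "${LOGSTASH_INTERNAL_PASSWORD}"
    index => "logs-%{+YYYY.MM.dd}"
  }
}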
