I am very new to Docker and I am trying to dockerize my Python server so that it automatically pulls llama3.1 on container creation, following this post:
How to pull a model automatically with container creation?
Here is my Dockerfile:
# Use the official Python image as the base image
FROM python:3.11
# Set environment variables
ENV PYTHONUNBUFFERED=1
# Install system dependencies
RUN apt-get update && apt-get install -y \
    build-essential \
    libgl1-mesa-glx \
    libglib2.0-0 \
    libsm6 \
    libxext6 \
    libxrender-dev \
    curl \
    && apt-get clean
# Set the working directory
WORKDIR /app
# Copy the project files
COPY . .
# Install Python dependencies
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
# Expose the application port
EXPOSE 8000
# Run the app using uvicorn
CMD ["uvicorn", "PythonServer:app", "--host", "0.0.0.0", "--port", "8000", "--reload"]
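Because the CMD above starts uvicorn immediately, the app can come up before the Ollama container is ready to answer requests. As a hedged sketch (the function names and the `http://ollama:11434` URL are my assumptions based on the compose service name, not from the post), the app could retry a cheap probe at startup instead of relying on a fixed sleep:

```python
import time
import urllib.error
import urllib.request


def wait_for(probe, timeout=60.0, interval=2.0):
    """Call probe() until it returns True or until timeout seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if probe():
            return True
        time.sleep(interval)
    return False


def ollama_is_up(base_url="http://ollama:11434"):
    """Probe Ollama's root endpoint, which responds once the server is listening."""
    try:
        with urllib.request.urlopen(base_url, timeout=3) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False


# At app startup, before the first model call:
# wait_for(ollama_is_up, timeout=120)
```

The probe is passed in as a callable so the waiting logic stays independent of Ollama itself.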
Here is my compose.yaml:
version: "3.8"
services:
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    ports:
      - "11434:11434"
    pull_policy: always
    tty: true
    restart: always
    entrypoint: ["/usr/bin/bash", "/entrypoint.sh"]
    volumes:
      - ./ollama/ollama:/root/.ollama
      - ./entrypoint.sh:/entrypoint.sh
  chroma:
    image: chromadb/chroma:latest
    container_name: chromadb
    volumes:
      - chromadb-data:/data # Persistent storage for ChromaDB
  python-app:
    build:
      context: .
      dockerfile: Dockerfile
    container_name: python-app
    ports:
      - "8000:8000"
    volumes:
      - .:/app # Mount the current directory for live updates
      - /app/__pycache__ # Ignore cache files
    depends_on:
      - chroma
      - ollama
volumes:
  chromadb-data:
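One way to avoid startup races between the services is a Compose healthcheck instead of a fixed sleep. A sketch (the `ollama list` probe and the interval values are my choices, not from the post) that gates python-app on Ollama being healthy:

```yaml
services:
  ollama:
    healthcheck:
      test: ["CMD-SHELL", "ollama list || exit 1"]  # succeeds once the server answers
      interval: 10s
      timeout: 5s
      retries: 5
  python-app:
    depends_on:
      ollama:
        condition: service_healthy
      chroma:
        condition: service_started
```

Note that the long-form `depends_on` mapping replaces the short list form shown above; the two cannot be mixed in one service.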
And my entrypoint.sh:
#!/bin/bash
# Start Ollama in the background.
/bin/ollama serve &
# Record Process ID.
pid=$!
# Pause for Ollama to start.
sleep 5
echo "Pulling llama3.1 model..."
ollama pull llama3.1
echo "Done."
# Wait for the Ollama server process.
wait $pid
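An alternative to pulling inside entrypoint.sh is to have the Python app ask Ollama to pull the model through its REST API (`GET /api/tags` and `POST /api/pull` are documented Ollama endpoints; the helper names and the `http://ollama:11434` base URL are my assumptions based on the compose service name):

```python
import json
import urllib.request

OLLAMA_URL = "http://ollama:11434"  # service name from compose.yaml


def model_present(tags_response: dict, name: str) -> bool:
    """Check a GET /api/tags response for a model, allowing the implicit :latest tag."""
    models = [m["name"] for m in tags_response.get("models", [])]
    return name in models or f"{name}:latest" in models


def ensure_model(name: str = "llama3.1") -> None:
    """Pull the model via Ollama's API if it is not already present."""
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
        tags = json.load(resp)
    if model_present(tags, name):
        return
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/pull",
        data=json.dumps({"model": name, "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)  # blocks until the pull finishes
```

With this approach the ollama service can keep its stock entrypoint, and the bash script and its bind mount are no longer needed.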
Source: docker - Pulling llama3.1 for Python application - Stack Overflow