tommytracx committed on
Commit d2c5505 · verified · 1 Parent(s): f361dc7

Upload 4 files

Files changed (2):

1. Dockerfile +11 -3
2. README.md +10 -6
Dockerfile CHANGED

```diff
@@ -2,11 +2,15 @@ FROM python:3.11-slim
 
 WORKDIR /app
 
-# Install system dependencies
+# Install system dependencies including Ollama
 RUN apt-get update && apt-get install -y \
     curl \
+    wget \
     && rm -rf /var/lib/apt/lists/*
 
+# Install Ollama
+RUN curl -fsSL https://ollama.ai/install.sh | sh
+
 # Copy requirements and install Python dependencies
 COPY requirements.txt .
 RUN pip install --no-cache-dir -r requirements.txt
@@ -14,6 +18,10 @@ RUN pip install --no-cache-dir -r requirements.txt
 # Copy application code
 COPY . .
 
+# Create a startup script
+RUN echo '#!/bin/bash\nollama serve &\nsleep 5\npython3 -m gunicorn --bind 0.0.0.0:7860 --workers 1 --timeout 120 app:app' > /app/start.sh && \
+    chmod +x /app/start.sh
+
 # Expose port
 EXPOSE 7860
 
@@ -21,5 +29,5 @@ EXPOSE 7860
 HEALTHCHECK --interval=30s --timeout=10s --start-period=5s --retries=3 \
   CMD curl -f http://localhost:7860/health || exit 1
 
-# Run the application
-CMD ["gunicorn", "--bind", "0.0.0.0:7860", "--workers", "1", "--timeout", "120", "app:app"]
+# Run the startup script
+CMD ["/app/start.sh"]
```
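The startup script in this commit is generated with `echo '...\n...'`; whether those `\n` escapes become real newlines depends on which `echo` runs them (dash, the `/bin/sh` in `python:3.11-slim`, expands them by default, while bash's builtin `echo` does not without `-e`). A sketch of an escape-free way to generate the same script with `printf`, with the gunicorn arguments copied from the diff (`/tmp/start.sh` is used here only so the snippet runs outside Docker; in the Dockerfile the target would be `/app/start.sh`):

```shell
# Write the same start.sh line by line; printf treats each argument as a
# separate line, so no backslash-escape interpretation is involved.
printf '%s\n' \
  '#!/bin/bash' \
  'ollama serve &' \
  'sleep 5' \
  'python3 -m gunicorn --bind 0.0.0.0:7860 --workers 1 --timeout 120 app:app' \
  > /tmp/start.sh
chmod +x /tmp/start.sh
```

This form behaves identically under any POSIX shell, which matters if the base image's `/bin/sh` ever changes.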
README.md CHANGED

```diff
@@ -25,9 +25,8 @@ A Hugging Face Space that provides a REST API interface for Ollama models, allow
 
 1. Fork this repository or create a new Space
 2. Upload these files to your Space
-3. Set the following environment variables in your Space settings:
-   - `OLLAMA_BASE_URL`: URL to your Ollama instance (e.g., `http://localhost:11434`)
-   - `ALLOWED_MODELS`: Comma-separated list of allowed models (optional)
+3. **No environment variables needed** - Ollama runs inside the Space!
+4. Wait for the build to complete (may take 10-15 minutes due to Ollama installation)
 
 ### 2. Local Development
 
@@ -39,8 +38,11 @@ cd ollama-space
 # Install dependencies
 pip install -r requirements.txt
 
-# Set environment variables
-export OLLAMA_BASE_URL=http://localhost:11434
+# Install Ollama locally
+curl -fsSL https://ollama.ai/install.sh | sh
+
+# Start Ollama in another terminal
+ollama serve
 
 # Run the application
 python app.py
@@ -121,10 +123,12 @@ Health check endpoint.
 
 ### Environment Variables
 
-- `OLLAMA_BASE_URL`: URL to your Ollama instance (default: `http://localhost:11434`)
+- `OLLAMA_BASE_URL`: URL to your Ollama instance (default: `http://localhost:11434` - **Ollama runs inside this Space!**)
 - `MODELS_DIR`: Directory for storing models (default: `/models`)
 - `ALLOWED_MODELS`: Comma-separated list of allowed models (default: all models)
 
+**Note**: This Space now includes Ollama installed directly inside it, so you don't need an external Ollama instance!
+
 ### Supported Models
 
 By default, the following models are allowed:
```
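Since `ALLOWED_MODELS` is a plain comma-separated list, a membership check against it is easy to express in shell. A minimal illustration (the variable name comes from the README; the model names below are placeholders, not the Space's actual defaults):

```shell
# Illustration only: check whether a model name appears in a
# comma-separated ALLOWED_MODELS value. Model names are placeholders.
ALLOWED_MODELS="llama2,mistral"

is_allowed() {
  # Wrap both the list and the candidate in commas so that partial
  # names (e.g. "llama" vs "llama2") never match.
  case ",$ALLOWED_MODELS," in
    *",$1,"*) echo "allowed" ;;
    *)        echo "denied"  ;;
  esac
}

is_allowed mistral   # -> allowed
is_allowed gpt4all   # -> denied
```

The surrounding commas are the usual trick for exact-token matching in delimiter-separated lists; without them, a substring test would wrongly accept prefixes of allowed names.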