When starting, I noticed this text:

```
Namespace(share=False, server='0.0.0.0', port=None, inbrowser=False)
Free VRAM 11.4376220703125 GB
High-VRAM Mode: False
```
I would interpret this as it using a single GPU (a 30xx series card with 12 GB). However, I started this in Docker (with the NVIDIA container runtime working) and passed `--gpus`.
Since I have 2 GPUs, both used by Ollama in the same configuration, I was expecting different output.
So, will it support multiple GPUs in this configuration?
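To confirm whether both cards are even visible inside the container (the startup line above only reports free VRAM on one device), a quick check with plain PyTorch should be enough; this is not part of the project, just a sketch you could run inside the container:

```python
import torch

# Number of CUDA devices the container can see; with --gpus all and a
# working NVIDIA container runtime this should print 2 on my setup.
print("visible GPUs:", torch.cuda.device_count())

# Name plus free/total memory of each visible device.
for i in range(torch.cuda.device_count()):
    free, total = torch.cuda.mem_get_info(i)
    print(i, torch.cuda.get_device_name(i),
          f"free {free / 1024**3:.1f} GiB / total {total / 1024**3:.1f} GiB")
```

If this reports both GPUs, then the container setup is fine and the question is only whether the app itself will spread work across them.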
In any case, once everything started downloading I stopped, since it looked like tens of GB, so I will wait for an official docker compose so I don't have to redo it.
[edit]: my setup is Docker on Linux.
If anyone is interested, here is my patched version, just to try it out (based on a pull request I've seen here):
```dockerfile
# Use the Python 3.10 slim image as the base
FROM python:3.10-slim

# Set the working directory to /app
WORKDIR /app

# Install necessary system dependencies (for OpenCV and other libraries)
RUN apt-get update && apt-get install -y \
        libsm6 \
        libxext6 \
        libgl1 \
        libglib2.0-0 \
    && apt-get clean \
    && rm -rf /var/lib/apt/lists/*

# Install PyTorch with CUDA support (choose the appropriate CUDA version as needed)
RUN pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu124

# Copy the requirements.txt file and install Python dependencies
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the rest of the application code into the container
COPY . .

# Expose the port for the Gradio UI
EXPOSE 7860

# Set the default command to run the demo script, listening on all interfaces
CMD ["python", "demo_gradio.py", "--server", "0.0.0.0"]
```
```bash
docker build -t framepack-container .
docker run --gpus all -it -p 7860:7860 --name framepack-dev2 framepack-container
```
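Since the official compose file is not out yet, here is a minimal sketch of the equivalent of the `docker run` above; the service name, volume name, and cache path are my own assumptions, and the mount may need adjusting if the app points HF_HOME somewhere else:

```yaml
services:
  framepack:
    build: .
    ports:
      - "7860:7860"
    volumes:
      # Persist downloaded model weights across container rebuilds
      # (assumes the default Hugging Face cache location; adjust if needed).
      - hf-cache:/root/.cache/huggingface
    deploy:
      resources:
        reservations:
          devices:
            # Expose all GPUs, same as "docker run --gpus all".
            - driver: nvidia
              count: all
              capabilities: [gpu]

volumes:
  hf-cache:
```

The named volume is mainly there so the tens of GB of model downloads survive rebuilding the image.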