Document vllm base image version and update process #15
Open
Labels: P2: Medium (Medium priority - fix when possible), documentation (Improvements or additions to documentation)
Description
Problem
In Dockerfile:23, the vllm base image is pinned to a specific SHA:
```dockerfile
FROM vllm/vllm-openai@sha256:014a95f21c9edf6abe0aea6b07353f96baa4ec291c427bb1176dc7c93a85845c
```

However:
- There's no comment indicating what vllm version this corresponds to
- No documentation on when it was last updated
- No process documented for updating it
- No way to verify the SHA hasn't been compromised
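A pinned digest also can't be eyeballed for typos. A minimal sketch (hypothetical helper script, not part of the repo) that at least validates the digest's format before any registry lookup:

```shell
#!/bin/sh
# Validate that a pinned image digest has the expected OCI form:
# "sha256:" followed by exactly 64 lowercase hex characters.
DIGEST="sha256:014a95f21c9edf6abe0aea6b07353f96baa4ec291c427bb1176dc7c93a85845c"

if printf '%s\n' "$DIGEST" | grep -Eq '^sha256:[0-9a-f]{64}$'; then
  echo "digest format OK"
else
  echo "malformed digest: $DIGEST" >&2
  exit 1
fi
```

This catches truncated or hand-edited digests early; it says nothing about whether the digest matches a trusted image, which is what the documentation below addresses.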
Impact
Medium - Makes it difficult to:
- Understand what vllm features are available
- Coordinate updates with vllm releases
- Debug version-specific issues
- Audit the supply chain
Solution
### 1. Add version documentation to Dockerfile

```dockerfile
# Stage 2: Runtime image
# GPU attestation requires pynvml (needs CUDA), so use the vllm base image
# vllm version: v0.6.3 (2024-11-15)
# Image digest verified: 2024-12-01
# See: https://hub.docker.com/r/vllm/vllm-openai/tags
FROM vllm/vllm-openai@sha256:014a95f21c9edf6abe0aea6b07353f96baa4ec291c427bb1176dc7c93a85845c
```

### 2. Create update documentation
Create docs/DOCKER_UPDATES.md:
# Docker Image Updates
## vllm Base Image
The proxy uses the official vllm-openai image for GPU support and CUDA libraries.
### Current Version
- **Image**: `vllm/vllm-openai`
- **Tag**: `latest` (pinned to SHA)
- **SHA256**: `014a95f21c9edf6abe0aea6b07353f96baa4ec291c427bb1176dc7c93a85845c`
- **vllm version**: v0.6.3
- **Last updated**: 2024-12-01
- **Updated by**: @username
### Update Process

1. Check for new vllm releases: https://github.com/vllm-project/vllm/releases
2. Find the corresponding Docker image:

   ```bash
   # List recent tags
   crane ls vllm/vllm-openai

   # Get the SHA for a tag
   crane digest vllm/vllm-openai:v0.6.3
   ```

3. Verify the image locally:

   ```bash
   docker pull vllm/vllm-openai@sha256:NEW_SHA
   docker run --rm vllm/vllm-openai@sha256:NEW_SHA --version
   ```

4. Test with the new image:

   ```bash
   # Update the Dockerfile with the new SHA
   docker build -t inference-proxy:test .
   docker run --rm inference-proxy:test --help

   # Run integration tests
   cargo test
   ```

5. Update the Dockerfile and this file with:
   - The new SHA
   - The vllm version
   - The update date
   - Your GitHub handle

6. Create a PR with a changelog entry.
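Step 5 can be scripted. A sketch of a hypothetical helper (not part of the repo) that swaps the pinned digest in a Dockerfile; the `NEW_SHA` value below is a made-up placeholder, not a real vllm digest:

```shell
#!/bin/sh
# Hypothetical helper: replace the pinned base-image digest in a Dockerfile.
# NEW_SHA is a placeholder digest for demonstration only.
NEW_SHA="sha256:1111111111111111111111111111111111111111111111111111111111111111"

# Work on a throwaway copy so the demo is self-contained.
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
FROM vllm/vllm-openai@sha256:014a95f21c9edf6abe0aea6b07353f96baa4ec291c427bb1176dc7c93a85845c
EOF

# Replace whatever 64-hex-char digest is currently pinned with the new one.
sed -i "s|@sha256:[0-9a-f]\{64\}|@${NEW_SHA}|" "$tmp"
cat "$tmp"
rm -f "$tmp"
```

Matching on the `@sha256:` pattern rather than the literal old digest means the same script works for every future update.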
### Compatibility Notes
- vllm >= v0.6.0 required for tool calling support
- GPU attestation requires CUDA 12.1+ (provided by vllm image)
- Python 3.10+ required for nv-attestation-sdk
### Security
The SHA pin ensures:
- Reproducible builds
- Protection against tag mutation
- Ability to audit exactly what's deployed
To verify image integrity:

```bash
docker pull vllm/vllm-openai@sha256:014a95f2...
cosign verify vllm/vllm-openai@sha256:014a95f2...  # If signed
```
### 3. Add automated check for outdated image
Add to CI (`.github/workflows/docker-check.yml`):
```yaml
name: Check Docker Image Updates
on:
  schedule:
    # Run monthly on the 1st at midnight UTC
    - cron: '0 0 1 * *'
  workflow_dispatch:
jobs:
  check_base_image:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # crane is not preinstalled on GitHub runners
      - uses: imjasonh/setup-crane@v0.3
      - name: Check for vllm updates
        run: |
          CURRENT_SHA="014a95f21c9edf6abe0aea6b07353f96baa4ec291c427bb1176dc7c93a85845c"
          # crane prints "sha256:<hex>"; strip the prefix so the comparison is like-for-like
          LATEST_SHA=$(crane digest vllm/vllm-openai:latest | sed 's/^sha256://')
          if [ "$CURRENT_SHA" != "$LATEST_SHA" ]; then
            echo "::warning::vllm base image is outdated. Current: $CURRENT_SHA, Latest: $LATEST_SHA"
          fi
```
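The comparison the workflow performs can be dry-run locally without touching the registry. A sketch with made-up digests standing in for crane output:

```shell
#!/bin/sh
# Dry-run of the workflow's digest comparison with placeholder values.
# In CI, LATEST_SHA would come from `crane digest vllm/vllm-openai:latest`.
CURRENT_SHA="014a95f21c9edf6abe0aea6b07353f96baa4ec291c427bb1176dc7c93a85845c"
LATEST_SHA="ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff"

if [ "$CURRENT_SHA" != "$LATEST_SHA" ]; then
  echo "vllm base image is outdated"
else
  echo "vllm base image is up to date"
fi
```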
File Locations
- `Dockerfile:23` - Add version comment
- `docs/DOCKER_UPDATES.md` - New file with update process
- `.github/workflows/docker-check.yml` - Automated update check