Commit ba33e88

Reapply "Install pre-built xformers-0.0.32.post2 built with pt-2.9.0" (#27768)
Signed-off-by: Huy Do <huydhn@gmail.com>
1 parent 33a0ea5 commit ba33e88

2 files changed: +2, -9 lines changed

docker/Dockerfile (0 additions, 7 deletions)

@@ -361,13 +361,6 @@ RUN --mount=type=bind,from=build,src=/workspace/dist,target=/vllm-workspace/dist
     && uv pip install --system dist/*.whl --verbose \
         --extra-index-url ${PYTORCH_CUDA_INDEX_BASE_URL}/cu$(echo $CUDA_VERSION | cut -d. -f1,2 | tr -d '.')
 
-# TODO (huydhn): Remove this once xformers is released for 2.9.0
-RUN --mount=type=cache,target=/root/.cache/uv bash - <<'BASH'
-    . /etc/environment
-    export TORCH_CUDA_ARCH_LIST='7.5 8.0+PTX 9.0a'
-    uv pip install --system --no-build-isolation "git+https://github.com/facebookresearch/xformers@v0.0.32.post2"
-BASH
-
 # Install FlashInfer pre-compiled kernel cache and binaries
 # https://docs.flashinfer.ai/installation.html
 RUN --mount=type=cache,target=/root/.cache/uv \
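The deleted `RUN` step compiled xformers from source against `TORCH_CUDA_ARCH_LIST`; after this change the wheel comes pre-built from the requirements pin instead. A hypothetical Dockerfile fragment (not part of this commit; the `RUN` line and its message are assumptions) that could catch a torch/xformers mismatch at image-build time rather than at runtime:

```dockerfile
# Hypothetical sanity check (not in this commit): fail the image build early
# if the pre-built xformers wheel does not import against the installed torch.
RUN python3 -c "import torch, xformers; \
    print('torch', torch.__version__, 'xformers', xformers.__version__)"
```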

requirements/cuda.txt (2 additions, 2 deletions)

@@ -9,7 +9,7 @@ torch==2.9.0
 torchaudio==2.9.0
 # These must be updated alongside torch
 torchvision==0.24.0 # Required for phi3v processor. See https://github.com/pytorch/vision?tab=readme-ov-file#installation for corresponding version
-# https://github.com/facebookresearch/xformers/releases/tag/v0.0.32.post1
-# xformers==0.0.32.post1; platform_system == 'Linux' and platform_machine == 'x86_64' # Requires PyTorch >= 2.8
+# Build from https://github.com/facebookresearch/xformers/releases/tag/v0.0.32.post1
+xformers==0.0.33+5d4b92a5.d20251029; platform_system == 'Linux' and platform_machine == 'x86_64' # Requires PyTorch >= 2.9
 # FlashInfer should be updated together with the Dockerfile
 flashinfer-python==0.4.1
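The new xformers pin is gated behind a PEP 508 environment marker, `platform_system == 'Linux' and platform_machine == 'x86_64'`, so pip only installs it on x86_64 Linux. A minimal sketch of how that marker evaluates, using only the standard library (`xformers_marker_matches` is a hypothetical helper, not part of the commit):

```python
import platform

def xformers_marker_matches() -> bool:
    # Mirrors the PEP 508 marker on the xformers requirement line:
    #   platform_system == 'Linux' and platform_machine == 'x86_64'
    # platform.system() / platform.machine() back the corresponding
    # marker variables, so the pin is skipped on macOS, Windows, and arm64.
    return platform.system() == "Linux" and platform.machine() == "x86_64"

print(xformers_marker_matches())
```

On any other platform the marker is false and the requirement is silently skipped, which is why the Dockerfile no longer needs its own install step.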
