
Conversation

@ehsk (Collaborator) commented on Dec 23, 2025

This PR upgrades vLLM from 0.8.5.post1 to 0.11.2. Other notable upgrades that come with this change: torch to 2.9.0, transformers to 4.57.x, and flash-attention to 2.8.3.

The vLLM upgrade is needed for Apriel multi-modal training (#111), for the new tool parsers, and for supporting newer models.

For weight updates in vLLM v1, I followed https://github.com/vllm-project/vllm/blob/v0.11.2/examples/offline_inference/rlhf_utils.py.
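
For context, the pattern in that example works roughly as sketched below: a worker extension class is mixed into each vLLM worker, a stateless NCCL process group is set up between the trainer (rank 0) and the inference workers, and each updated tensor is broadcast from the trainer and loaded into the model in place. The class and method names follow the linked rlhf_utils.py for v0.11.2; treat this as an illustrative sketch of that example, not the exact code in this PR.

```python
# Sketch of the vLLM v1 weight-update pattern from
# examples/offline_inference/rlhf_utils.py (vLLM v0.11.2).
# Illustrative only; see the linked file for the authoritative version.
import torch


class WorkerExtension:
    """Mixed into vLLM workers via the `worker_extension_cls` engine arg."""

    def init_weight_update_group(self, master_address, master_port,
                                 rank_offset, world_size):
        from vllm.distributed.device_communicators.pynccl import (
            PyNcclCommunicator)
        from vllm.distributed.parallel_state import get_world_group
        from vllm.distributed.utils import StatelessProcessGroup

        # The trainer holds rank 0; each vLLM worker takes its own rank
        # shifted by rank_offset so the two sides share one group.
        rank = get_world_group().rank + rank_offset
        pg = StatelessProcessGroup.create(host=master_address,
                                          port=master_port,
                                          rank=rank,
                                          world_size=world_size)
        self.model_update_group = PyNcclCommunicator(pg, device=self.device)

    def update_weight(self, name, dtype, shape):
        # Receive one tensor broadcast by the trainer (src=0) and load it
        # into the running model under the given parameter name.
        weight = torch.empty(shape, dtype=dtype, device="cuda")
        self.model_update_group.broadcast(weight, src=0,
                                          stream=torch.cuda.current_stream())
        self.model_runner.model.load_weights(weights=[(name, weight)])
        del weight
```

In the linked example, the trainer side constructs the engine with `LLM(..., worker_extension_cls="rlhf_utils.WorkerExtension")`, calls `llm.collective_rpc("init_weight_update_group", args=...)` once, and then drives each parameter update with `llm.collective_rpc("update_weight", args=(name, dtype, shape))` while broadcasting the tensor from rank 0 into the same group.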

@ehsk requested a review from rafapi on December 23, 2025 at 14:57.
