
[LTX2] Add LTX-2 Attention and Spatio-Temporal RoPE#342

Open
Perseus14 wants to merge 2 commits into main from ltx2-attention

Conversation

@Perseus14 (Collaborator) commented Mar 3, 2026

This PR implements the core attention mechanism and the specialized RoPE for the LTX-2.0 (video/audio) model. The implementation ensures parity with the reference PyTorch/Diffusers logic, specifically the multi-modal coordinate generation and the interleaved/split rotary application.

Test: https://screenshot.googleplex.com/6KZP7yTFne8eszv
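The spatio-temporal coordinate generation and split-style rotary application described above can be sketched roughly as follows. This is an illustrative sketch, not the PR's actual code: the function names, the even three-way split of rotary pairs across the (t, h, w) axes, and the base frequency of 10000 are all assumptions.

```python
import math

def spatio_temporal_coords(frames, height, width):
    """Flatten a (frames, height, width) latent grid into per-token
    (t, h, w) coordinate triples, row-major, matching the token order
    of a patchified video latent."""
    return [(t, h, w)
            for t in range(frames)
            for h in range(height)
            for w in range(width)]

def apply_rope_split(x, coord, head_dim, theta=10000.0):
    """Rotate one head vector `x` (length `head_dim`) with axis-wise RoPE.

    Rotary pairs are split evenly across the three axes (t, h, w), and
    the rotation uses the 'split' layout: element i pairs with element
    half + i. The even split and `theta` base are illustrative
    assumptions, not the exact LTX-2 configuration.
    """
    half = head_dim // 2
    pairs_per_axis = half // 3
    # One rotation angle per rotary pair: each axis contributes
    # `pairs_per_axis` frequencies scaled by that axis's position.
    angles = []
    for pos in coord:
        for i in range(pairs_per_axis):
            freq = theta ** (-i / pairs_per_axis)
            angles.append(pos * freq)
    out = list(x)
    for i, a in enumerate(angles):
        c, s = math.cos(a), math.sin(a)
        lo, hi = x[i], x[half + i]
        out[i] = lo * c - hi * s
        out[half + i] = lo * s + hi * c
    return out
```

Because each pair undergoes a pure 2D rotation, the transform preserves the vector norm, and a token at the origin (0, 0, 0) is left unrotated; the interleaved variant differs only in pairing adjacent elements (2i, 2i+1) instead of (i, half + i).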

@Perseus14 Perseus14 requested a review from entrpn as a code owner March 3, 2026 12:47

@Perseus14 Perseus14 changed the title [LTX2] Add ltx2 attention block [LTX2] Add LTX-2 Attention and Spatio-Temporal RoPE Mar 3, 2026
@Perseus14 Perseus14 requested a review from mbohlool March 3, 2026 13:32
@Perseus14 Perseus14 force-pushed the ltx2-attention branch 5 times, most recently from 40dd0b7 to d7ce12c on March 3, 2026 19:26

2 participants