Fix flaky test_gradient_flow in JambaEHR (#855)#857

Merged
jhnwu3 merged 1 commit into sunlabuiuc:master from joshuasteier:fix/jamba-gradient-test
Feb 16, 2026
Conversation

@joshuasteier (Collaborator)

Fixes #855

Problem: test_gradient_flow in TestJambaLayer fails intermittently because unseeded random initialization can produce all-zero gradients through Mamba's selective scan path, so the gradient-presence assertion occasionally trips.

Fix: Added torch.manual_seed(42) before layer construction and input generation so the test is deterministic. Verified stable across multiple consecutive runs.

Files changed: tests/core/test_jamba_ehr.py (1 line added)
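For reference, the shape of the fixed test looks roughly like the sketch below. This is a hedged illustration, not the actual test file: `JambaLayer` is replaced with a stand-in `nn.Linear`, and the tensor shapes are invented; only the `torch.manual_seed(42)` placement before both construction and input generation reflects the PR.

```python
import torch
import torch.nn as nn

def test_gradient_flow():
    # Seed BEFORE layer construction and input generation, so both the
    # parameter init and the random input are deterministic across runs.
    torch.manual_seed(42)
    layer = nn.Linear(16, 16)  # stand-in for the real JambaLayer under test
    x = torch.randn(4, 16, requires_grad=True)

    out = layer(x)
    out.sum().backward()

    # The flaky failure mode: gradients that are present but all zero.
    assert x.grad is not None
    assert x.grad.abs().sum().item() > 0
```

Seeding fixes the flake because PyTorch's global RNG replays the same stream after `torch.manual_seed`, so a known-good initialization is reproduced on every run instead of occasionally sampling one that zeroes out the scan path's gradients.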

@joshuasteier force-pushed the fix/jamba-gradient-test branch from e51e62d to 3628272 on February 16, 2026 18:09
@jhnwu3 (Collaborator) left a comment

lgtm

@jhnwu3 jhnwu3 merged commit 3d11e3b into sunlabuiuc:master Feb 16, 2026
1 check passed

Development

Successfully merging this pull request may close these issues.

The core.test_jamba_ehr.TestJambaLayer.test_gradient_flow unit test is unstable

3 participants