Record: 0.3212 BPB — Complementary N-gram 65K + Int5 GPTQ + LoRA TTT #850
Closed
callithyia wants to merge 1 commit into openai:main from
Conversation
3-seed mean of 0.3212 BPB (std 0.0003). Complementary training + order-9 n-gram eval cache with 65K-token chunks + full-Hessian GPTQ int5 + LoRA TTT with Polyak averaging.
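For the "LoRA TTT with Polyak averaging" part, here is a minimal sketch of the general idea, assuming a PyTorch-style setup: keep an exponential (Polyak) average of the LoRA adapter weights while test-time training, and score with the averaged copy. Class and method names are illustrative, not the PR's actual implementation.

```python
import torch

class PolyakAveragedLoRA:
    """Running Polyak average of LoRA adapter tensors for test-time training."""

    def __init__(self, lora_params, decay=0.999):
        self.decay = decay
        # Shadow copy that accumulates the running average.
        self.avg = [p.detach().clone() for p in lora_params]

    @torch.no_grad()
    def update(self, lora_params):
        # avg <- decay * avg + (1 - decay) * current, after each TTT step.
        for a, p in zip(self.avg, lora_params):
            a.mul_(self.decay).add_(p.detach(), alpha=1.0 - self.decay)

    @torch.no_grad()
    def copy_into(self, lora_params):
        # Load the averaged weights into the adapters before scoring a chunk.
        for a, p in zip(self.avg, lora_params):
            p.copy_(a)
```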
Contributor
Thanks for your submission! Unfortunately, it's disallowed due to the use of hashed n-gram caches, which do not correctly renormalize or reweight the LM's token distribution, and which look ahead to the target token when mixing probabilities and therefore leak eval tokens. Please refer to the long discussion about this under the Issues tab for more details, and please submit more runs in the future!
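For anyone new to the thread, a toy illustration of the look-ahead part of that objection, using a plain count-based bigram cache (hypothetical; the rejected run used hashed higher-order n-grams). The leaky variant counts the target token before scoring it, so the cache has effectively already seen the eval token it is asked to predict; the compliant variant updates counts only after scoring.

```python
import math
from collections import defaultdict

def score_leaky(tokens, neural_prob, lam=0.5):
    """Disallowed pattern: counts are updated with the target BEFORE scoring."""
    counts, ctx = defaultdict(int), defaultdict(int)
    nll = 0.0
    for prev, tgt in zip(tokens, tokens[1:]):
        counts[(prev, tgt)] += 1          # target token already in the cache
        ctx[prev] += 1
        cache_p = counts[(prev, tgt)] / ctx[prev]
        nll -= math.log(lam * cache_p + (1 - lam) * neural_prob(prev, tgt))
    return nll

def score_compliant(tokens, neural_prob, lam=0.5):
    """Score-first: the cache is updated only AFTER each token has been scored."""
    counts, ctx = defaultdict(int), defaultdict(int)
    nll = 0.0
    for prev, tgt in zip(tokens, tokens[1:]):
        cache_p = counts[(prev, tgt)] / ctx[prev] if ctx[prev] else 0.0
        nll -= math.log(lam * cache_p + (1 - lam) * neural_prob(prev, tgt))
        counts[(prev, tgt)] += 1
        ctx[prev] += 1
    return nll
```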
newjordan pushed a commit to newjordan/parameter-golf-1 that referenced this pull request on Mar 28, 2026
Phrase cache (PR openai#880 / PR openai#900 — proven +0.1 BPB, legal):
- Variable-length suffix matching at 48/36/28/20/16 token probe lengths
- One ctx+full count table pair per probe length (4M buckets each)
- 48-prime XOR hash — unique prime per context position up to length 48
- Dirichlet smoothing: p = (min(fc, cc) + c*neural) / (ctx + c), c = 2.0
- Applied inline after n-gram mixing, before NLL conversion
- Score-first: tables updated with chunk tokens AFTER all scoring done

RegimeTracker (PR openai#880):
- Tracks match rate + token diversity over rolling 4096-token window
- Adapts effective phrase concentration: repetitive/boilerplate content → lower c (more cache trust); novel prose → higher c (more neural trust)
- Multiplier range [0.7, 1.5], effective_c = base_c / mult

Config improvements:
- WARMDOWN_ITERS=2000 (confirmed best from A/B sweep)
- NGRAM_CHUNK_TOKENS=65536 (PR openai#850, 15x more cache refreshes vs 1M)
- MATRIX_LR=0.03 (PR openai#859)

ARTIFACT_NGRAM=0 remains disabled (legally gray).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
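For readers following along, a rough sketch of the probe and Dirichlet mix that commit message describes. The bucket count, probe lengths, and concentration are taken from the message; the prime values, the target-token fold, and all names are illustrative assumptions, not the referenced code.

```python
import numpy as np

NUM_BUCKETS = 4 * 1024 * 1024           # "4M buckets each"
PROBE_LENGTHS = (48, 36, 28, 20, 16)     # longest suffix probed first
BASE_C = 2.0                             # Dirichlet concentration

def _first_primes(n, start=1_000_003):
    # Trial-division prime generator; purely illustrative choice of primes.
    primes, x = [], start
    while len(primes) < n:
        if all(x % d for d in range(2, int(x ** 0.5) + 1)):
            primes.append(x)
        x += 2
    return primes

POSITION_PRIMES = _first_primes(48)      # one prime per context position

# One (context, context+token) count-table pair per probe length (~160 MB total).
ctx_tables  = {L: np.zeros(NUM_BUCKETS, dtype=np.uint32) for L in PROBE_LENGTHS}
full_tables = {L: np.zeros(NUM_BUCKETS, dtype=np.uint32) for L in PROBE_LENGTHS}

def suffix_hash(history, length):
    # XOR-fold the last `length` tokens, one prime multiplier per position.
    h = 0
    for pos, tok in enumerate(history[-length:]):
        h ^= (tok * POSITION_PRIMES[pos]) & 0xFFFFFFFF
    return h

def phrase_cache_prob(history, target, neural_prob, c=BASE_C):
    """p = (min(fc, cc) + c * neural) / (cc + c); fall back to neural if no match."""
    for L in PROBE_LENGTHS:
        if len(history) < L:
            continue
        h = suffix_hash(history, L)
        cc = int(ctx_tables[L][h % NUM_BUCKETS])
        fc = int(full_tables[L][(h ^ (target * 0x9E3779B1 & 0xFFFFFFFF)) % NUM_BUCKETS])
        if cc > 0:
            return (min(fc, cc) + c * neural_prob) / (cc + c)
    return neural_prob

def update_tables(history, target):
    # Score-first discipline: call this only after the whole chunk has been scored.
    for L in PROBE_LENGTHS:
        if len(history) < L:
            continue
        h = suffix_hash(history, L)
        ctx_tables[L][h % NUM_BUCKETS] += 1
        full_tables[L][(h ^ (target * 0x9E3779B1 & 0xFFFFFFFF)) % NUM_BUCKETS] += 1
```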
Summary
Results (8xH100 SXM)
Key Techniques
Compliance
Credits
Built on: PR #809 (n-gram cache), PR #803 (complementary training), PR #798 (entropy centers, Polyak TTT), PR #840 (65K chunks), PR #779 (integrated eval), PR #414 (GPTQ baseline).