Integrate field and type lookups in compare and simplify command flow #110
Merged
Conversation
This change is part of the following stack:
Change managed by git-spice.
This was referenced Feb 27, 2026
Force-pushed from a7a6ad1 to df3887b
Force-pushed from 5149087 to 475820b
Force-pushed from df3887b to faede88
Force-pushed from 475820b to 97c231d
Zaid-Ajaj approved these changes on Mar 2, 2026
Force-pushed from faede88 to 4dbfe54
Force-pushed from 97c231d to 7975311
corymhall added a commit that referenced this pull request on Mar 2, 2026
## Summary

This PR adds field-path and type-equivalence lookup helpers for normalization decisions. These helpers provide conservative, metadata-backed matching used to classify rename and maxItems-style transitions without schema rewriting.

## What Changed

- Added field history flattening and lookup utilities.
- Added resolver helpers for equivalent type transitions.
- Added targeted fixtures for nested/coexistence maxItems scenarios.
- Added extensive tests for path resolution, ambiguity handling, and equivalence checks.

## Why

Token lookup alone is not enough for field/type transition classification. Compare-time normalization needs explicit field/type lookup primitives before integration.

## Context

- Builds on token lookup primitives from #107.
- Used by integration work in #110 and #111.

## Testing

- `go test ./internal/normalize -count=1`
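The "conservative, metadata-backed matching" above could look roughly like the sketch below. The `renameEntry` shape and `lookupFieldPath` name are assumptions for illustration, not the repository's actual API; the key property shown is that an ambiguous old path resolves to nothing rather than to a guess.

```go
package main

import "fmt"

// renameEntry is a hypothetical stand-in for one flattened field-history
// record: an old dotted path mapped to its current path.
type renameEntry struct {
	oldPath, newPath string
}

// lookupFieldPath resolves an old field path against rename metadata.
// It is conservative: zero or multiple candidate targets both return
// ok == false, so the caller falls back to a plain diff instead of guessing.
func lookupFieldPath(history []renameEntry, oldPath string) (string, bool) {
	var matches []string
	for _, e := range history {
		if e.oldPath == oldPath {
			matches = append(matches, e.newPath)
		}
	}
	if len(matches) != 1 {
		return "", false
	}
	return matches[0], true
}

func main() {
	history := []renameEntry{
		{oldPath: "spec.maxItems", newPath: "spec.limits.maxItems"},
		{oldPath: "spec.name", newPath: "metadata.name"},
		{oldPath: "spec.name", newPath: "spec.displayName"}, // ambiguous entry
	}
	if p, ok := lookupFieldPath(history, "spec.maxItems"); ok {
		fmt.Println("resolved to", p) // resolved to spec.limits.maxItems
	}
	if _, ok := lookupFieldPath(history, "spec.name"); !ok {
		fmt.Println("ambiguous: refusing to match")
	}
}
```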
corymhall added a commit that referenced this pull request on Mar 2, 2026
## Summary

This PR wires token lookup outcomes into compare change generation. Missing/add/remap decisions now use metadata-backed token resolution in the engine path rather than post-processing.

## What Changed

- Integrated token resolution into compare engine matching decisions.
- Added typed normalization attribution for token-driven outcomes.
- Added handling for retained aliases with canonical add/remap behavior.
- Updated compare/json/text/summary tests to assert token-resolution behavior.
- Threaded metadata through compare options where needed for lookup decisions.

## Why

Token mapping decisions belong where canonical changes are generated. This reduces downstream synthetic behavior and keeps classification deterministic.

## Context

- Uses lookup primitives from #107 and #108.
- Prepares command/field-type integration cleanup in #110.

## Testing

- `go test ./internal/compare ./compare -count=1`
Squash details:

- Branch: st-ac3.17.3.6-field-type-cli-cleanup
- Base: st-ac3.17.3.5-integrate-token-lookups
- Squashed commits: 2

Original commits:

- 55153e0 Integrate field/type lookup in compare engine and drop cmd glue assumptions
- dd2d83a Harden type equivalence suppression for map value shapes
Force-pushed from 7975311 to f14979c
corymhall added a commit that referenced this pull request on Mar 2, 2026
…111)

## Summary

This PR improves rename-aware and maxItems-related transition handling using real AWS fixture coverage. It focuses on turning noisy or split diagnostics into clearer single-cause type/rename messages where metadata evidence supports that outcome.

## What Changed

- Added real AWS normalization fixture tests and fixture data.
- Added/updated normalization helpers for rename-aware field and type transitions.
- Refined compare engine handling for ref/array boundary transitions.
- Added coverage for type-ref token parsing and related normalization edge cases.
- Updated supporting normalize/compare tests and helper wiring needed by these scenarios.

## Why

Real fixture coverage is required to validate that normalization behavior matches expected user-facing diagnostics. This hardens behavior before final golden/harness consolidation.

## Context

- Builds on engine integration from #110.
- Feeds the final golden/harness pass in #112.

> Note: I had to upgrade to Go 1.25 in this PR since I brought in new dependencies.

## Testing

- `go test ./internal/normalize ./internal/compare ./compare -count=1`
Summary
This PR integrates field/type lookup decisions into compare generation and simplifies command-layer behavior accordingly.
It removes reliance on downstream shaping assumptions by making the engine classification path authoritative.
What Changed
Why
Field/type normalization decisions should happen during change generation, not after rendering.
This keeps output deterministic and reduces fragile command-layer logic.
Context
Testing
`go test ./internal/compare ./compare ./internal/cmd -count=1`