fix: Preserve numRows for zero-column RecordBatch in IPC #402
rustyconover wants to merge 1 commit into apache:main from
Conversation
@raulcd can you please trigger the CI?
Force-pushed e6e5d97 to a829ad9
When a zero-column RecordBatch is deserialized from IPC, ensureSameLengthData in the RecordBatch constructor recomputes length from children via chunks.reduce((max, col) => Math.max(max, col.length), 0). With zero children, this always returns 0, discarding the original length from the IPC message header.

Pass this.data.length to ensureSameLengthData as the explicit maxLength parameter, which the function already accepts as an optional third argument. For batches with columns, this.data.length already matches the max column length, so there is no behavior change.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
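The failure mode can be sketched in a few lines. This is a simplified, self-contained illustration of the length computation described above, not the real apache-arrow internals; the `Column` interface and both function names are stand-ins:

```typescript
// Illustrative stand-in for a child column; only `length` matters here.
interface Column { length: number; }

// Before the fix: length is recomputed purely from the children. With zero
// columns the reduce falls back to its initial value 0, discarding the row
// count carried by the IPC message header.
function recomputedLength(chunks: Column[]): number {
    return chunks.reduce((max, col) => Math.max(max, col.length), 0);
}

// After the fix: the parent Data's length is threaded through as the
// explicit maxLength, so a zero-column batch keeps its row count. For
// batches with columns the result is unchanged, because the parent length
// already equals the max column length.
function preservedLength(chunks: Column[], maxLength: number): number {
    return chunks.reduce((max, col) => Math.max(max, col.length), maxLength);
}
```

Seeding the reduce with the parent length rather than 0 is the whole fix; no other code path needs to change.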
Force-pushed a829ad9 to bc6f6eb
Can we create this file with our writer in the tests instead of adding it to the repository?
Agreed, we should generate this file on-demand instead of checking it in. We'll need to rebase it out of the commit history before merging this.
Pull request overview
Fixes Arrow JS RecordBatch construction so that zero-column batches preserve numRows when deserialized from IPC, matching other Arrow implementations and the IPC message header.
Changes:
- Pass `this.data.length` as the explicit `maxLength` to `ensureSameLengthData()` in the `[Schema, Data]` `RecordBatch` constructor path.
- Add unit tests covering PyArrow interop, JS IPC round-trip, and direct constructor behavior for zero-column batches.
- Add a PyArrow-generated IPC stream fixture containing a zero-column `RecordBatch` with 100 rows.
Reviewed changes
Copilot reviewed 2 out of 3 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| src/recordbatch.ts | Preserves parent Data.length when normalizing children, fixing zero-column length loss. |
| test/unit/ipc/reader/zero-column-batch-tests.ts | Adds regression tests to ensure numRows stays correct across read/write and direct construction. |
| test/data/zero_column_batch.arrow | Adds a PyArrow-generated IPC fixture for interop/regression testing. |
trxcllnt left a comment
Wow, I didn't realize zero-column batches were valid in the IPC format. Like, what does recordBatch.length even mean without columns? "Here's 20 rows of nothing, good luck!"
Was this an explicit decision by Arrow C++, or is this just an oversight in the implementation?
I'm not explicitly against this change, just confused by the behavior and would like to learn more.
So take this comment as a tentative approval, but marking as needs changes until the binary file is rebased out.
I'm writing an RPC layer using Arrow IPC (https://vgi-rpc.query.farm). It calls a function that takes no arguments 200 times, and serializing the arguments to those calls yields a zero-column record batch with 200 rows.
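That use case can be modeled in a few lines. This is a toy model, not the Arrow wire format or API; all names here are hypothetical. The point is that the IPC batch header carries the row count separately from the column buffers, so with zero columns the header is the only place the count lives:

```typescript
// Toy model of an IPC record batch: the header's row count is stored
// independently of the column data.
interface Batch { headerNumRows: number; columns: { length: number }[]; }

// 200 calls to a zero-argument function serialize to a batch with no
// columns; only the header records how many calls were made.
function encodeCalls(callCount: number): Batch {
    return { headerNumRows: callCount, columns: [] };
}

// A reader that recomputes numRows from the columns (the bug this PR
// fixes) sees 0; a reader that trusts the header sees the real count.
function numRowsFromColumns(batch: Batch): number {
    return batch.columns.reduce((max, col) => Math.max(max, col.length), 0);
}
function numRowsFromHeader(batch: Batch): number {
    return batch.headerNumRows;
}
```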
And that's not representable by a
Summary
When a zero-column `RecordBatch` is deserialized from IPC, `ensureSameLengthData` in the `RecordBatch` constructor recomputes length from children via `chunks.reduce((max, col) => Math.max(max, col.length), 0)`. With zero children, this always returns 0, discarding the original length from the IPC message header.
Other Arrow implementations (PyArrow, Arrow Go, arrow-rs) correctly preserve numRows for zero-column batches.
Fix
Pass `this.data.length` to `ensureSameLengthData` as the explicit `maxLength` parameter, which the function already accepts as an optional third argument.
For batches with columns, `this.data.length` already matches the max column length, so there is no behavior change.
Tests
Closes #401