
fix: Preserve numRows for zero-column RecordBatch in IPC #402

Open

rustyconover wants to merge 1 commit into apache:main from
Query-farm:fix/zero-column-recordbatch-numrows

Conversation

@rustyconover
Contributor

Summary

When a zero-column RecordBatch is deserialized from IPC, ensureSameLengthData
in the RecordBatch constructor recomputes length from children via
chunks.reduce((max, col) => Math.max(max, col.length), 0). With zero children,
this always returns 0 — discarding the original length from the IPC message header.
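The reduce-over-children behavior can be sketched as follows (a simplified model of the computation described above, not the actual Arrow JS source):

```typescript
// Simplified model of the length recomputation: a reduce over an empty
// array returns its seed value, so zero children always yields 0.
type DataLike = { length: number };

function recomputeLength(chunks: DataLike[]): number {
    return chunks.reduce((max, col) => Math.max(max, col.length), 0);
}

console.log(recomputeLength([{ length: 100 }, { length: 100 }])); // 100
console.log(recomputeLength([])); // 0 -- the header's numRows is lost
```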

Other Arrow implementations (PyArrow, Arrow Go, arrow-rs) correctly preserve
numRows for zero-column batches.

Fix

Pass this.data.length to ensureSameLengthData as the explicit maxLength
parameter, which the function already accepts as an optional third argument.
For batches with columns, this.data.length already matches the max column
length, so there is no behavior change.
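A sketch of the fix, using a simplified stand-in for ensureSameLengthData (the real function's signature differs; this only models the length computation):

```typescript
// Simplified stand-in: when maxLength is supplied, it wins; otherwise the
// length is recomputed from the children (0 for a zero-column batch).
type DataLike = { length: number };

function ensureSameLengthData(chunks: DataLike[], maxLength?: number): number {
    return maxLength ?? chunks.reduce((max, col) => Math.max(max, col.length), 0);
}

const headerLength = 100; // numRows from the IPC message header

// Before the fix: length recomputed from (no) children.
console.log(ensureSameLengthData([]));               // 0
// After the fix: the parent Data's length is passed explicitly.
console.log(ensureSameLengthData([], headerLength)); // 100
// With columns, the explicit value matches the max column length,
// so behavior is unchanged.
console.log(ensureSameLengthData([{ length: 100 }], headerLength)); // 100
```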

Tests

  • Read a PyArrow-generated zero-column IPC stream (100 rows) and verify numRows
  • JS round-trip: write + read zero-column batch, verify numRows preserved
  • Direct constructor: verify zero-column RecordBatch preserves length

Closes #401

@rustyconover
Contributor Author

@raulcd can you please trigger the CI?

@rustyconover rustyconover force-pushed the fix/zero-column-recordbatch-numrows branch from e6e5d97 to a829ad9 on March 4, 2026 15:30
When a zero-column RecordBatch is deserialized from IPC,
ensureSameLengthData in the RecordBatch constructor recomputes length
from children via chunks.reduce((max, col) => Math.max(max, col.length), 0).
With zero children, this always returns 0 — discarding the original
length from the IPC message header.

Pass this.data.length to ensureSameLengthData as the explicit maxLength
parameter, which the function already accepts as an optional third
argument. For batches with columns, this.data.length already matches
the max column length, so there is no behavior change.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@rustyconover rustyconover force-pushed the fix/zero-column-recordbatch-numrows branch from a829ad9 to bc6f6eb on March 4, 2026 15:31
@kou kou requested review from Copilot, domoritz and trxcllnt March 5, 2026 02:35
Member


Can we create this file by our writer in test instead of adding this file to the repository?

Contributor


Agreed, we should generate this file on-demand instead of checking it in. We'll need to rebase it out of the commit history before merging this.


Copilot AI left a comment


Pull request overview

Fixes Arrow JS RecordBatch construction so that zero-column batches preserve numRows when deserialized from IPC, matching other Arrow implementations and the IPC message header.

Changes:

  • Pass this.data.length as the explicit maxLength to ensureSameLengthData() in the [Schema, Data] RecordBatch constructor path.
  • Add unit tests covering PyArrow interop, JS IPC round-trip, and direct constructor behavior for zero-column batches.
  • Add a PyArrow-generated IPC stream fixture containing a zero-column RecordBatch with 100 rows.

Reviewed changes

Copilot reviewed 2 out of 3 changed files in this pull request and generated no comments.

  • src/recordbatch.ts: Preserves parent Data.length when normalizing children, fixing zero-column length loss.
  • test/unit/ipc/reader/zero-column-batch-tests.ts: Adds regression tests to ensure numRows stays correct across read/write and direct construction.
  • test/data/zero_column_batch.arrow: Adds a PyArrow-generated IPC fixture for interop/regression testing.


Contributor

@trxcllnt trxcllnt left a comment


Wow, I didn't realize zero-column batches were valid in the IPC format. Like, what does recordBatch.length even mean without columns? "Here's 20 rows of nothing, good luck!"

Was this an explicit decision by Arrow C++, or is this just an oversight in the implementation?

I'm not explicitly against this change, just confused by the behavior and would like to learn more.

So take this comment as a tentative approval, but marking as needs changes until the binary file is rebased out.

@rustyconover
Contributor Author

I'm writing an RPC layer using Arrow IPC (https://vgi-rpc.query.farm) that calls a function taking no arguments 200 times. Serializing the arguments for those calls produces a zero-column record batch with 200 rows.

@trxcllnt
Contributor

trxcllnt commented Mar 5, 2026

And that's not representable by a Table<{ args: FixedSizeList(0) }>?


Development

Successfully merging this pull request may close these issues.

Zero-column RecordBatch loses numRows after IPC deserialization
