Conversation
…ng in iframe adapter
… bootstrap processes
… in iframe communication
…curity architecture, and React integration
> **Note: Reviews paused.** It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the repository settings, or use CodeRabbit's commands to resume the review.
📝 **Walkthrough**

Adds OpenAPI source polling/loading and a catalog endpoint to the broker; extends the session runtime with session-wide and per-tool deadlines, partial results, tool progress events, cancel-on-first-error behavior, and richer error payloads; bumps the wire protocol to v2.
**Sequence Diagram**

```mermaid
sequenceDiagram
    participant Broker
    participant OpenApiSource
    participant OpenApiSpecPoller
    participant OpenApiToolLoader
    participant ToolRegistry
    Broker->>OpenApiSource: openapi(config) / start()
    OpenApiSource->>OpenApiSpecPoller: start polling
    loop periodic poll
        OpenApiSpecPoller->>OpenApiSpecPoller: fetch (with retry/timeout)
        alt changed
            OpenApiSpecPoller-->>OpenApiSource: changed(spec, hash)
            OpenApiSource->>OpenApiToolLoader: fromSpec(spec)
            OpenApiToolLoader-->>OpenApiSource: tools[], specHash
            OpenApiSource->>ToolRegistry: register/replace/unregister(tools)
            OpenApiSource-->>Broker: emit toolsUpdated / catalogChanged
        else unchanged
            OpenApiSpecPoller-->>OpenApiSource: unchanged
        end
    end
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 inconclusive
Actionable comments posted: 11
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@libs/broker/src/broker-session.ts`:
- Around line 157-169: The deadline logic uses two different epochs and emits
terminal events from multiple paths; unify to a single epoch (e.g., use
this.createdAt or a new this.deadlineEpoch set once) and compute remaining
budget from that single timestamp everywhere (including where you check
remaining budget before runExecution and where you create the timeout from
limits.deadlineMs). Ensure only runExecution() is responsible for emitting the
final terminal event from enclave.run(); modify the timeout handler
(deadlineTimer) and cancel() to stop work and mark the session as
canceled/terminal without emitting the final EventType.DeadlineExceeded event
itself, and make runExecution() check the canceled/terminal flag before emitting
to avoid duplicate final emissions. Reference symbols: this.deadlineTimer,
this.limits.deadlineMs, this.createdAt (or new this.deadlineEpoch),
this.emitter.emit(this.makeCustomEvent(EventType.DeadlineExceeded,...)),
cancel(), and runExecution()/enclave.run().
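A minimal sketch of the single-epoch idea this comment describes — one timestamp captured once, from which every budget check and timer derives (the `SessionDeadline` name is illustrative, not from the codebase):

```typescript
// Illustrative sketch: one epoch, captured once, drives every deadline check.
class SessionDeadline {
  private readonly epoch: number;

  constructor(private readonly deadlineMs: number, now: number = Date.now()) {
    this.epoch = now; // single source of truth, analogous to this.createdAt
  }

  /** Remaining budget in ms, clamped at 0. */
  remaining(now: number = Date.now()): number {
    return Math.max(0, this.deadlineMs - (now - this.epoch));
  }

  exceeded(now: number = Date.now()): boolean {
    return this.remaining(now) === 0;
  }
}
```

Both the pre-run budget check and the timeout duration would then derive from `remaining()`, so the two paths can never disagree about when the deadline falls.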
In `@libs/broker/src/index.ts`:
- Around line 48-65: The new OpenAPI public exports (OpenApiSpecPoller,
OpenApiToolLoader, OpenApiSource, CatalogHandler and their types like
OpenApiPollerConfig, ChangeDetectionMode, PollerRetryConfig,
OpenApiSourceConfig, etc.) are missing documentation; add documentation pages
under docs/enclave/** (either docs/enclave/api-reference/enclavejs-broker.mdx or
docs/enclave/enclavejs/broker.mdx) that describe each exported class and the key
types, include concise usage examples showing how to instantiate
OpenApiSpecPoller and OpenApiToolLoader, how to configure OpenApiPollerConfig
and UpstreamAuth, and document events/types such as OpenApiPollerEvents,
ToolsUpdatedEvent and SourceHealthStatus so consumers can use the new OpenAPI
integration.
In `@libs/broker/src/openapi/catalog-handler.ts`:
- Around line 107-110: The catalog currently attributes every tool to the wrong
service because findServiceForTool(name) ignores the specific toolName; create a
proper mapping from tool name to service when you enumerate OpenAPI sources
(e.g., build a Map<string,string> toolToService or a per-source membership
snapshot) and use that map when you construct actions (replace calls to
this.findServiceForTool(name) with toolToService.get(tool.name) or equivalent);
update both places where actions are pushed (around the actions.push block and
the 124-131 section) so non-OpenAPI tools and multiple OpenAPI sources are
correctly resolved.
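A sketch of the tool-name → service map the comment asks for; the `SourceSnapshot` shape here is hypothetical and stands in for whatever per-source data the catalog handler enumerates:

```typescript
interface SourceSnapshot {
  service: string;
  toolNames: string[];
}

// Build the map once per catalog render, while enumerating OpenAPI sources.
function buildToolToService(sources: SourceSnapshot[]): Map<string, string> {
  const toolToService = new Map<string, string>();
  for (const src of sources) {
    for (const name of src.toolNames) {
      toolToService.set(name, src.service); // last registration wins
    }
  }
  return toolToService;
}
```

Actions would then resolve their service via `toolToService.get(tool.name)`, and tools absent from the map can be attributed to a generic non-OpenAPI bucket.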
In `@libs/broker/src/openapi/openapi-source.ts`:
- Around line 149-165: In refresh(), you overwrite this.previousToolNames before
computing the diff so emit('toolsUpdated', ...) always reports empty changes;
fix by capturing the old set into a local (e.g., const oldToolNames =
this.previousToolNames) before changing it, compute added/removed/updated using
the same logic as syncTools() (compare oldToolNames and newToolNames), call
this.toolRegistry.unregister for removed names as currently done, then assign
this.previousToolNames = newToolNames and update specHash/lastUpdate/health and
finally emit('toolsUpdated', { added, removed, updated }) so the manual refresh
emits correct diffs and triggers catalogChanged for additions/removals.
- Around line 86-94: The changed handler can run concurrently and interleave
registry updates; make reconciliations for a given source run serially by
queuing or locking around this.syncTools. Implement a per-source mutex/queue
(e.g., a Map<string, Promise> or a boolean lock like this.reconcilingPerSource)
and in the this.poller.on('changed', ...) callback enqueue the JSON-parse + call
to this.syncTools(parsed) onto that per-source queue so the next change awaits
the previous promise before starting; update health and emit error only after
the awaited syncTools finishes. Reference symbols: this.poller.on('changed',
this.syncTools, previousToolNames, this.health, this.emit).
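The two fixes above can be sketched together: capture the old name set before overwriting it so the diff is real, and funnel each source's reconciliations through a per-key promise chain so they never interleave. All names below are illustrative, not the actual broker API:

```typescript
// Diff computed against a snapshot taken BEFORE previousToolNames is reassigned.
function diffToolSets(oldNames: Set<string>, newNames: Set<string>) {
  return {
    added: [...newNames].filter((n) => !oldNames.has(n)),
    removed: [...oldNames].filter((n) => !newNames.has(n)),
  };
}

// Per-key promise chain: the next task for a key awaits the previous one,
// so reconciliations for one source run strictly serially.
class KeyedQueue {
  private tails = new Map<string, Promise<void>>();

  run<T>(key: string, task: () => Promise<T>): Promise<T> {
    const tail = this.tails.get(key) ?? Promise.resolve();
    const result = tail.then(task, task); // run even if the prior task failed
    this.tails.set(key, result.then(() => undefined, () => undefined));
    return result;
  }
}
```

A reconciliation would then run as `queue.run(sourceId, async () => { ... })`, taking the old-name snapshot inside the task before swapping in the new set and emitting the diff.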
In `@libs/broker/src/openapi/openapi-spec-poller.ts`:
- Around line 117-122: The stop() method currently clears intervalTimer but
doesn't cancel any in-flight fetch or backoff, allowing late changed/error
events; add an instance-level AbortController and a boolean stopped flag on the
OpenApiSpecPoller class, have stop() set stopped=true, abort the controller,
clear intervalTimer and create a new controller for future runs, then update
doFetch() and any backoff/wait logic (the retry sleep paths referenced around
doFetch and backoff) to pass and/or check the controller.signal and
short-circuit immediately if stopped is true (throw or return) so no callbacks
fire after stop; ensure any retry timers are also skipped when stopped.
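A minimal sketch of the stop-safe shape described above — a `stopped` flag, an instance-level `AbortController` that is aborted and replaced on `stop()`, and an abortable backoff sleep (class and method names are illustrative, not the real poller API):

```typescript
class StoppablePoller {
  private controller = new AbortController();
  private stopped = false;
  private timer?: ReturnType<typeof setInterval>;

  start(intervalMs: number, tick: (signal: AbortSignal) => void): void {
    this.stopped = false;
    this.timer = setInterval(() => {
      if (this.stopped) return; // short-circuit late callbacks
      tick(this.controller.signal);
    }, intervalMs);
  }

  stop(): void {
    this.stopped = true;
    this.controller.abort(); // cancels the in-flight fetch and pending backoff
    if (this.timer !== undefined) clearInterval(this.timer);
    this.controller = new AbortController(); // fresh controller for future runs
  }

  /** Abortable backoff sleep: rejects immediately once stopped. */
  sleep(ms: number): Promise<void> {
    if (this.stopped) return Promise.reject(new Error("poller stopped"));
    const { signal } = this.controller;
    return new Promise((resolve, reject) => {
      const t = setTimeout(resolve, ms);
      signal.addEventListener(
        "abort",
        () => { clearTimeout(t); reject(new Error("poller stopped")); },
        { once: true },
      );
    });
  }
}
```

The key property is that both retry sleeps and fetches observe the same signal, so no `changed`/`error` callback can fire after `stop()` returns.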
In `@libs/broker/src/openapi/openapi-tool-loader.ts`:
- Around line 83-89: The fromURL factory (OpenApiToolLoader.fromURL) currently
calls bare fetch(url) and must accept/specify the same request controls used by
OpenApiSource.refresh; update the method signature to accept fetch configuration
(e.g., an additional fetchOptions or specFetch field on LoaderOptions that can
include headers/timeout/signal or a custom fetch function) and use that
configuration when performing the request instead of a plain fetch(url); ensure
callers (like OpenApiSource.refresh) pass their poller fetch config into fromURL
so endpoints requiring custom headers/timeouts/signals behave the same on
startup/manual refresh as in the poller.
- Around line 143-145: The generated default toolName currently uses only
this.options.namingStrategy or op.operationId/method_path which can collide
across sources; update the default naming branch so when no namingStrategy is
provided it prefixes the computed name with the source/service identifier (e.g.
use a sourceName from this.options.sourceName or this.options.serviceName or
this.source) so the fallback becomes "<sourceName>_<operationId|method_path>";
adjust the logic around toolName assignment (referencing
this.options.namingStrategy, toolName, op.operationId, op.method, op.path) to
construct and use this namespaced default.
- Around line 292-299: The fetch wrapper in openapi-tool-loader.ts currently
returns response.json()/text() for all responses; change it to check response.ok
and throw an Error (so the registry treats it as a tool failure and
cancelOnFirstError can act) before parsing the body. Specifically, after
obtaining response and contentType, if (!response.ok) read the response body
(text or json depending on contentType) to include in the thrown Error and throw
an error containing response.status, response.statusText and the body; otherwise
continue to parse and return the body as before. Ensure you reference the
existing response and contentType variables so the behavior integrates with the
surrounding code paths.
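The suggested wrapper shape, sketched with a pure helper for the error message so the failure path is easy to unit-test (function names are illustrative; the real loader integrates this into its existing fetch path):

```typescript
function formatHttpError(status: number, statusText: string, body: string): string {
  // Truncate the body so huge HTML error pages don't flood logs.
  return `HTTP ${status} ${statusText}: ${body.slice(0, 500)}`;
}

async function fetchToolResult(url: string, init?: RequestInit): Promise<unknown> {
  const response = await fetch(url, init);
  const contentType = response.headers.get("content-type") ?? "";
  if (!response.ok) {
    // Throw instead of returning the error body, so the registry treats this
    // as a tool failure and cancelOnFirstError can act on it.
    const body = await response.text();
    throw new Error(formatHttpError(response.status, response.statusText, body));
  }
  return contentType.includes("application/json") ? response.json() : response.text();
}
```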
In `@libs/types/src/events.ts`:
- Around line 218-231: The client currently ignores FinalPayload.errors; update
the code that builds SessionResult in handleFinalEvent (the function in
client.ts that reads FinalPayload and sets SessionResult) to preserve and
populate the errors array (e.g., set sessionResult.errors = finalPayload.errors)
and/or merge errors into sessionResult.error when appropriate; also update
useEnclaveSession (the hook that checks sessionResult.error?.code) to also
inspect sessionResult.errors (or normalize errors into sessionResult.error) so
per-path GraphQL-style errors are surfaced to consumers and not silently
dropped.
- Around line 71-78: SessionEventHandlers currently exposes specific callbacks
for many events but omits dedicated handlers for the new event types
(isPartialResultEvent, isToolProgressEvent, isDeadlineExceededEvent,
isCatalogChangedEvent), forcing consumers to use the generic onEvent and
manually discriminate; add four new optional callback properties to
SessionEventHandlers—onPartialResult, onToolProgress, onDeadlineExceeded, and
onCatalogChanged—typed to receive the appropriate event payloads (matching the
types used by the corresponding type guards) and ensure the event dispatch logic
that calls onEvent also invokes these new handlers when those events are
detected, or alternatively add documentation to SessionEventHandlers explaining
that these events must be handled via onEvent and reference the type guards for
discrimination.
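A sketch of the dispatch shape the first option implies. The event-type strings and payload types below are placeholders; the real code would reuse the existing type guards (`isPartialResultEvent`, etc.) for discrimination:

```typescript
type StreamEvent = { type: string; payload: unknown };

interface SessionEventHandlers {
  onEvent?: (event: StreamEvent) => void;
  onPartialResult?: (payload: unknown) => void;
  onToolProgress?: (payload: unknown) => void;
  onDeadlineExceeded?: (payload: unknown) => void;
  onCatalogChanged?: (payload: unknown) => void;
}

function dispatch(handlers: SessionEventHandlers, event: StreamEvent): void {
  handlers.onEvent?.(event); // generic hook still fires for every event
  switch (event.type) {
    case "partial_result": handlers.onPartialResult?.(event.payload); break;
    case "tool_progress": handlers.onToolProgress?.(event.payload); break;
    case "deadline_exceeded": handlers.onDeadlineExceeded?.(event.payload); break;
    case "catalog_changed": handlers.onCatalogChanged?.(event.payload); break;
  }
}
```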
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: bbd29736-c9c8-4ec5-a177-4e8cf238b5f9
📒 Files selected for processing (15)
- libs/broker/src/broker-session.ts
- libs/broker/src/broker.ts
- libs/broker/src/index.ts
- libs/broker/src/openapi/catalog-handler.ts
- libs/broker/src/openapi/index.ts
- libs/broker/src/openapi/openapi-source.ts
- libs/broker/src/openapi/openapi-spec-poller.ts
- libs/broker/src/openapi/openapi-tool-loader.ts
- libs/broker/src/tool-registry.ts
- libs/types/src/events.ts
- libs/types/src/index.ts
- libs/types/src/protocol.spec.ts
- libs/types/src/protocol.ts
- libs/types/src/schemas.spec.ts
- libs/types/src/schemas.ts
…and error handling
Actionable comments posted: 4
🧹 Nitpick comments (6)
libs/broker/src/openapi/catalog-handler.ts (1)
120-121: Make catalog `version` deterministic across source ordering.

If `this.sources` order can vary between runs, `join(':')` may produce different versions for identical source sets. Sorting hashes before joining avoids unnecessary version churn/cache misses.

Suggested diff:

```diff
- const versionParts = this.sources.map((s) => s.getStats().specHash).join(':');
+ const versionParts = this.sources
+   .map((s) => s.getStats().specHash)
+   .sort()
+   .join(':');
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@libs/broker/src/openapi/catalog-handler.ts` around lines 120 - 121, The computed catalog version uses this.sources.map(...).join(':') which is non-deterministic if this.sources ordering changes; instead extract each source's specHash via getStats().specHash, sort the array of hashes, then join with ':' to produce versionParts (falling back to 'empty') so identical source sets always yield the same version; update the logic around versionParts/version to use the sorted hashes before joining.

libs/broker/src/openapi/openapi-tool-loader.ts (2)
217-235: All parameters typed as strings regardless of the OpenAPI schema.

Line 223 maps every parameter to `z.string()`, ignoring the actual OpenAPI type (integer, boolean, array, enum). This works because the handler coerces values via `String(value)`, but loses type-specific validation (e.g., numeric range checks, enum constraints).

Consider enhancing to map OpenAPI types to appropriate Zod types in a future iteration.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@libs/broker/src/openapi/openapi-tool-loader.ts` around lines 217 - 235, The buildArgsSchema function currently maps every parameter to z.string(), losing OpenAPI-specific types; update buildArgsSchema (used with ParsedOperation and param.name) to inspect param.schema (and op.requestBody.schema) and map OpenAPI types to appropriate Zod types (e.g., "integer" -> z.number().int(), "number" -> z.number(), "boolean" -> z.boolean(), "string" with enum -> z.enum([...]) or z.string().refine, "array" -> z.array(mapped item schema)), applying .optional() when param.required or requestBody.required is false and falling back to z.string() or z.unknown() only for unsupported cases; ensure enum values and item schemas are handled so type-specific validation is preserved.
271-274: `Content-Type` header set unconditionally.

`Content-Type: application/json` is added for all requests, including GET/DELETE/HEAD which have no body. While typically harmless, some strict servers may reject requests with `Content-Type` but no body.

♻️ Conditional Content-Type:

```diff
  const requestHeaders: Record<string, string> = {
-   'Content-Type': 'application/json',
    ...headers,
  };
+
+ // Only set Content-Type for methods that send a body
+ if (['post', 'put', 'patch'].includes(op.method) && params['body']) {
+   requestHeaders['Content-Type'] = 'application/json';
+ }
```

Then remove the duplicate condition check at lines 308-310, since body serialization can rely on the same logic.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@libs/broker/src/openapi/openapi-tool-loader.ts` around lines 271 - 274, The current code unconditionally adds 'Content-Type': 'application/json' to requestHeaders; change this to add Content-Type only when the HTTP method permits a body and a body is present (e.g., method not in ['GET','HEAD','DELETE'] and body != null/undefined), so requestHeaders is built conditionally from headers plus Content-Type only when needed (refer to the requestHeaders variable). Then remove the redundant body-presence conditional used later in the body serialization logic (the duplicate check that serializes JSON) so serialization relies on the same method+body condition you use to set Content-Type.

libs/client/src/types.ts (2)
124-127: Consider using the `ErrorPayload` type instead of `unknown` for the error parameter.

The `error` parameter is typed as `unknown`, but the broker emits `ErrorPayload` objects with a defined structure (`code`, `message`, `path`, `details`). Using the actual type would provide better type safety for consumers.

♻️ Proposed improvement — import `ErrorPayload` and use it:

```diff
-import type { SessionId, StreamEvent, SessionLimits } from '@enclave-vm/types';
+import type { SessionId, StreamEvent, SessionLimits, ErrorPayload } from '@enclave-vm/types';

 // ...

 /**
  * Called when a partial result arrives
  */
-onPartialResult?: (path: string[], data?: unknown, error?: unknown, hasNext?: boolean) => void;
+onPartialResult?: (path: string[], data?: unknown, error?: ErrorPayload, hasNext?: boolean) => void;
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@libs/client/src/types.ts` around lines 124 - 127, The onPartialResult callback currently types the error parameter as unknown; update it to use the concrete ErrorPayload type emitted by the broker. Import the ErrorPayload type and change the signature onPartialResult?: (path: string[], data?: unknown, error?: ErrorPayload, hasNext?: boolean) => void so consumers get proper typing for code/message/path/details; ensure the import of ErrorPayload is added near other type imports in the file and update any related usages or exports referencing onPartialResult.
129-132: Consider using a stricter type for the `phase` parameter.

The `phase` parameter is typed as `string`, but the broker only emits specific phases. Using a union type would provide better type safety and IDE autocompletion.

♻️ Proposed improvement:

```diff
 /**
  * Called when tool progress is reported
  */
-onToolProgress?: (callId: string, phase: string, elapsedMs: number) => void;
+onToolProgress?: (callId: string, phase: 'connecting' | 'sending' | 'receiving' | 'processing', elapsedMs: number) => void;
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@libs/client/src/types.ts` around lines 129 - 132, Replace the loose string type for the onToolProgress phase parameter with a stricter union/type that reflects the broker's actual emitted phases: define/export a ToolPhase type (or import the broker's enum/type if it exists) and change the signature onToolProgress?: (callId: string, phase: ToolPhase, elapsedMs: number) => void; then update usages to use the new ToolPhase so IDE/typechecking enforces only the valid phases.

libs/broker/src/broker-session.ts (1)
419-431: Custom events reuse the same sequence number.

`makeCustomEvent` uses `this.seq` (which returns `emitter.getSeq()`) without incrementing. Multiple consecutive custom events (e.g., `ToolProgress` followed by `DeadlineExceeded`) will share the same sequence number, potentially causing ordering ambiguity or deduplication issues on the client during reconnection replay.

Consider incrementing the sequence for each custom event, or document that custom events intentionally share sequence numbers with the preceding standard event.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@libs/broker/src/broker-session.ts` around lines 419 - 431, makeCustomEvent currently reuses this.seq (which reads emitter.getSeq()) so consecutive custom events share the same sequence; update makeCustomEvent to advance the session/emitter sequence for each custom event (e.g., call the emitter/Session sequence increment method or use a nextSeq() helper instead of this.seq) so each returned StreamEvent has a unique incremented seq, and ensure you still set protocolVersion, sessionId, type and payload as before (refer to makeCustomEvent, this.seq / emitter.getSeq(), StreamEvent, PROTOCOL_VERSION).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@libs/broker/src/broker-session.ts`:
- Around line 234-241: The cancellation branch inside broker-session where
isTerminal() is true builds a result but never emits a final event; update that
branch (the code after the isTerminal() check within the enclave.run() handling)
to call the appropriate emitter to terminate the stream — e.g., call
emitter.emitFinalError(...) with the created error payload, stats and finalState
'cancelled' (or emitter.emitFinalSuccess(...) if your contract expects success),
so the client receives a FinalEvent; ensure you use the same result object
created there and follow existing emitter method signatures
(emitter.emitFinalError / emitFinalSuccess) to preserve logging and downstream
handling.
- Around line 359-364: The code accumulates partial errors via
getPartialErrors() and emitPartialResult but never includes them when calling
emitFinalError, so update emitFinalError (in session-emitter.ts) to accept an
errors parameter (or an options object) and include the accumulated
partialErrors into the FinalPayload.errors field when emitting the final event;
update all call sites that invoke emitFinalError to pass this.getPartialErrors()
(or merge with an existing error) so handleFinalEvent can read
event.payload.errors, or alternatively remove getPartialErrors() and
emitPartialResult if you prefer to drop partial-error accumulation—ensure
consistency between emitPartialResult, getPartialErrors, emitFinalError, and
FinalPayload.
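A sketch of the emitter-side change described above — `emitFinalError` defaulting to carry the accumulated partial errors. The `FinalPayload`/`ErrorPayload` shapes here are simplified stand-ins for the real types in `libs/types`:

```typescript
interface ErrorPayload { code: string; message: string; path?: string[] }

interface FinalPayload {
  finalState: "success" | "error" | "cancelled";
  error?: ErrorPayload;
  errors?: ErrorPayload[]; // per-path, GraphQL-style partial errors
}

class SessionEmitter {
  private partialErrors: ErrorPayload[] = [];

  emitPartialResult(path: string[], error?: ErrorPayload): void {
    if (error) this.partialErrors.push({ ...error, path });
  }

  getPartialErrors(): ErrorPayload[] {
    return [...this.partialErrors];
  }

  /** Final error now defaults to carrying the accumulated partial errors. */
  emitFinalError(
    error: ErrorPayload,
    errors: ErrorPayload[] = this.getPartialErrors(),
  ): FinalPayload {
    return {
      finalState: "error",
      error,
      errors: errors.length > 0 ? errors : undefined,
    };
  }
}
```

With a default parameter, existing call sites keep working while any accumulated partial errors still reach `FinalPayload.errors`.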
In `@libs/broker/src/openapi/catalog-handler.ts`:
- Around line 17-48: Add a new section to
docs/enclave/api-reference/enclavejs-broker.mdx documenting the GET
/code/actions endpoint exported by the broker: describe that it returns 200 with
a JSON body matching the CatalogResponse interface (fields: actions:
CatalogAction[] and services: CatalogService[], plus version string), then
enumerate the CatalogAction fields (name, description?, inputSchema?, service,
tags?, deprecated?) and CatalogService fields (name, specUrl, lastUpdated,
actionCount) and include an example response payload and curl request; finally
document version semantics for the version field (when it increments for
non-breaking vs breaking catalog changes) and add short notes that
CatalogHandler is the HTTP handler to wire to this route and that these types
(CatalogAction, CatalogService, CatalogResponse, CatalogHandler) are exported
from the broker package for consumers to import.
In `@libs/broker/src/openapi/openapi-tool-loader.ts`:
- Around line 1-13: The code imports createHash from node:crypto which breaks
browser compatibility; remove the node:crypto import and replace any use of
createHash with a cross-platform hashing helper that uses the Web Crypto API
(crypto.subtle.digest) when available and falls back to a small universal hash
library or a JS implementation for older environments; update the function(s)
that compute the hash (references to createHash) to be async or compute the hash
lazily (e.g., expose an async getHash() or await the factory) so callers using
functions like the OpenAPI tool loader can await the hash; ensure the helper is
used wherever createHash was referenced so both Node and browser builds work.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: f591fd32-8bbe-464c-9ee7-8a9a45874495
⛔ Files ignored due to path filters (1)
- `yarn.lock` is excluded by `!**/yarn.lock`, `!**/*.lock`
📒 Files selected for processing (9)
- libs/broker/package.json
- libs/broker/src/broker-session.ts
- libs/broker/src/openapi/catalog-handler.ts
- libs/broker/src/openapi/openapi-source.ts
- libs/broker/src/openapi/openapi-spec-poller.ts
- libs/broker/src/openapi/openapi-tool-loader.ts
- libs/client/src/client.ts
- libs/client/src/types.ts
- package.json
✅ Files skipped from review due to trivial changes (3)
- libs/broker/package.json
- package.json
- libs/broker/src/openapi/openapi-source.ts
🚧 Files skipped from review as they are similar to previous changes (1)
- libs/broker/src/openapi/openapi-spec-poller.ts
Actionable comments posted: 5
♻️ Duplicate comments (3)
libs/broker/src/broker-session.ts (2)
235-240: ⚠️ Potential issue | 🟠 Major

Carry `partialErrors` through the cancelled terminal path.

This is the only final-emission branch that still skips `this.getPartialErrors()`. If the session is cancelled after one or more `emitPartialResult(..., error)` calls, `FinalEvent.payload.errors` stays empty and `libs/client/src/client.ts` cannot populate `SessionResult.errors`.

Proposed fix:

```diff
- this.emitter.emitFinalError(cancelError, eventStats);
+ this.emitter.emitFinalError(cancelError, eventStats, this.getPartialErrors());
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@libs/broker/src/broker-session.ts` around lines 235 - 240, The cancel/terminal branch currently builds cancelError and calls this.emitter.emitFinalError without including any partial errors; update the terminal-cancel path in the isTerminal() branch so it carries through partial errors from previous emitPartialResult calls by calling this.getPartialErrors() and attaching them to the final emission (e.g., include them on the cancelError object as an errors field or pass them into this.emitter.emitFinalError so FinalEvent.payload.errors is populated); modify the code around this.isTerminal(), cancelError, and the this.emitter.emitFinalError call to ensure partial errors are forwarded.
374-405: ⚠️ Potential issue | 🟠 Major

`perToolDeadlineMs` is currently a no-op.

`libs/types/src/protocol.ts` exposes `perToolDeadlineMs` as a supported session limit, but `executeTool()` never reads it and passes the unmodified session abort signal into `this.toolRegistry.execute(...)`. A slow tool is only stopped by the overall session deadline, so the new per-tool budget is not actually enforced. Reintroduce per-tool timeout enforcement around the registry call, capped by the remaining session budget.

As per coding guidelines, `libs/**` contains publishable SDK libraries. Review for API correctness, breaking changes, and consistency with docs.

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@libs/broker/src/broker-session.ts` around lines 374 - 405, executeTool currently ignores perToolDeadlineMs so slow tools only respect the overall session deadline; compute the per-tool timeout by reading this.limits.perToolDeadlineMs and cap it with the remaining session budget (remaining = this.limits.deadlineMs - (Date.now() - this.createdAt) when deadlineMs>0), then create a dedicated AbortController for the tool that races the parent this.abortController.signal and a timer that calls toolAbort.abort() after the computed timeout (if timeout>0); pass toolAbort.signal (not the parent signal) into this.toolRegistry.execute(toolName, args, context) where context.signal is set to that signal, ensure the timer is cleared after the call and propagate/throw a clear timeout error (e.g., "Tool deadline exceeded") when the per-tool timer fires; touch symbols: executeTool, this.limits.perToolDeadlineMs, this.limits.deadlineMs, this.abortController.signal, this.toolRegistry.execute, callId.

libs/broker/src/openapi/openapi-spec-poller.ts (1)
107-114: ⚠️ Potential issue | 🟠 Major

A fast stop/start can resurrect the previous poll cycle.

`stop()` only aborts the active fetch. If it fires while the abort is unwinding or the retry sleep is pending, `start()` resets `_stopped` to `false` before the old promise settles, so that stale cycle can resume retries and emit into the new run. Please keep a per-run abort/generation token and make the backoff wait abortable too.

Also applies to: 120-129, 178-195
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@libs/broker/src/openapi/openapi-spec-poller.ts` around lines 107 - 114, The start/stop race allows an old poll cycle to continue after a new start because stop() only aborts the active fetch via the shared _stopped flag; fix this by introducing a per-run AbortController/Generation token (e.g., create a new AbortController in start() and store it as currentRunAbort or runId) and pass its signal into poll(), any fetch calls inside poll(), and into the retry/backoff sleep so the wait is abortable; update stop() to call currentRunAbort.abort() and clear/reset the currentRunAbort, ensure poll() checks the run token before scheduling retries and that intervalTimer callbacks capture the current run token so stale cycles cannot resume retries or emit into a new run (apply same pattern to functions referenced at poll(), start(), stop(), and the backoff/sleep logic).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@libs/broker/src/broker-session.ts`:
- Around line 167-196: The pre-start terminal branches (e.g., the
DeadlineExceeded path that sets this._deadlineExceeded, calls stopHeartbeat(),
sets this._state='failed', emits events via this.emitter.emit/emitFinalError)
finalize the session but do not populate the cached execution result used by
wait(), so callers of wait() still get "Session has not started execution"; fix
by assigning a resolved SessionFinalResult to this.executionPromise (or another
internal cached resolved result that wait() consults) in these pre-start
terminal branches (the DeadlineExceeded branch and the cancel-before-start
branch), ensuring runExecution()/wait() will return the same final result
already emitted via emitter; keep the existing emitted events and state changes
but set this.executionPromise = Promise.resolve(<the SessionFinalResult object>)
before returning.
- Around line 385-391: The guard that checks this.limits.deadlineMs currently
throws a generic Error when remaining <= 0, which doesn't set
this._deadlineExceeded and causes downstream code to report EXECUTION_ERROR
instead of DEADLINE_EXCEEDED; modify the branch so that before throwing you set
this._deadlineExceeded = true and throw (or construct) a
DeadlineExceeded-specific error (or an Error with a distinct type/message that
your catch path recognizes as DeadlineExceeded) so the catch logic will emit the
DeadlineExceeded event and tag the tool-start as DEADLINE_EXCEEDED; update the
check in the same method where this.limits.deadlineMs, this.createdAt,
remaining, and this._deadlineExceeded are referenced.
In `@libs/broker/src/openapi/openapi-tool-loader.ts`:
- Around line 231-234: The current code always models request bodies as a
generic object (shape['body']) and later sends them as JSON; instead, inspect
op.requestBody.content to determine the media type and proper Zod shape and
encoding: look up op.requestBody.content keys, prefer application/json or +json
variants and build a Zod schema from the media.schema (or fallback to
z.unknown()/z.any()), for application/x-www-form-urlencoded and
multipart/form-data use z.record(z.string(), z.unknown()) (or
form-field-appropriate types), for text/* use z.string(), and for
array/primitive JSON schemas generate corresponding Zod array/string/number
schemas rather than forcing an object; respect op.requestBody.required when
wrapping .optional(); also attach or return the chosen contentType/mediaType
(e.g., bodyMediaType) so the request sender uses the correct Content-Type
instead of always using application/json; update the logic around shape['body'],
op.requestBody, and any sender/encoder that reads shape['body'] to use this
media-type-driven schema and encoding.
- Around line 105-107: The fallback for baseUrl currently uses the fetched
document origin; change it to prefer the OpenAPI spec's servers entry: when
options?.baseUrl is not provided, read spec.servers (array) and if the first
server object has a url, resolve that URL against the fetch request URL (use new
URL(serverUrl, url) to handle relative paths like "/v1") and pass the resolved
server URL as baseUrl into the OpenApiToolLoader constructor (keep using spec,
hash, options, auth and only replace the baseUrl resolution logic around
OpenApiToolLoader).
- Around line 194-209: The loop that builds operations currently uses only
operation['parameters'] and drops shared path-level params from
pathItem.parameters; update the logic inside the for (const [path, pathItem]
...) / for (const method of httpMethods) block (where operation is derived and
operations.push is called) to merge pathItem.parameters (if present) with
operation['parameters'], placing path-level params before operation-level params
and de-duplicating by the tuple (name, in) so operation-level entries override
path-level ones; ensure the merged array is assigned to the operation.parameters
field passed into operations.push (maintain types like
ParsedOperation['parameters'] and preserve undefined handling).
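The path-level/operation-level parameter merge described above can be sketched as below. The `OpenApiParam` shape and `mergeParameters` name are illustrative, not the loader's actual types:

```typescript
// Sketch: path-level parameters come first; operation-level parameters
// override on the (name, in) tuple, per the OpenAPI precedence rules.
interface OpenApiParam {
  name: string;
  in: 'path' | 'query' | 'header' | 'cookie';
  required?: boolean;
}

function mergeParameters(
  pathLevel: OpenApiParam[] | undefined,
  opLevel: OpenApiParam[] | undefined,
): OpenApiParam[] {
  const byKey = new Map<string, OpenApiParam>();
  for (const p of pathLevel ?? []) byKey.set(`${p.in}:${p.name}`, p);
  // Operation-level entries overwrite path-level ones with the same key,
  // while Map insertion order keeps path-level params first.
  for (const p of opLevel ?? []) byKey.set(`${p.in}:${p.name}`, p);
  return [...byKey.values()];
}
```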
---
Duplicate comments:
In `@libs/broker/src/broker-session.ts`:
- Around line 235-240: The cancel/terminal branch currently builds cancelError
and calls this.emitter.emitFinalError without including any partial errors;
update the terminal-cancel path in the isTerminal() branch so it carries through
partial errors from previous emitPartialResult calls by calling
this.getPartialErrors() and attaching them to the final emission (e.g., include
them on the cancelError object as an errors field or pass them into
this.emitter.emitFinalError so FinalEvent.payload.errors is populated); modify
the code around this.isTerminal(), cancelError, and the
this.emitter.emitFinalError call to ensure partial errors are forwarded.
- Around line 374-405: executeTool currently ignores perToolDeadlineMs so slow
tools only respect the overall session deadline; compute the per-tool timeout by
reading this.limits.perToolDeadlineMs and cap it with the remaining session
budget (remaining = this.limits.deadlineMs - (Date.now() - this.createdAt) when
deadlineMs>0), then create a dedicated AbortController for the tool that races
the parent this.abortController.signal and a timer that calls toolAbort.abort()
after the computed timeout (if timeout>0); pass toolAbort.signal (not the parent
signal) into this.toolRegistry.execute(toolName, args, context) where
context.signal is set to that signal, ensure the timer is cleared after the call
and propagate/throw a clear timeout error (e.g., "Tool deadline exceeded") when
the per-tool timer fires; touch symbols: executeTool,
this.limits.perToolDeadlineMs, this.limits.deadlineMs,
this.abortController.signal, this.toolRegistry.execute, callId.
In `@libs/broker/src/openapi/openapi-spec-poller.ts`:
- Around line 107-114: The start/stop race allows an old poll cycle to continue
after a new start because stop() only aborts the active fetch via the shared
_stopped flag; fix this by introducing a per-run AbortController/Generation
token (e.g., create a new AbortController in start() and store it as
currentRunAbort or runId) and pass its signal into poll(), any fetch calls
inside poll(), and into the retry/backoff sleep so the wait is abortable; update
stop() to call currentRunAbort.abort() and clear/reset the currentRunAbort,
ensure poll() checks the run token before scheduling retries and that
intervalTimer callbacks capture the current run token so stale cycles cannot
resume retries or emit into a new run (apply same pattern to functions
referenced at poll(), start(), stop(), and the backoff/sleep logic).
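The per-tool deadline described in the `executeTool` comment (a child controller racing the parent signal against a timer capped by the remaining session budget) can be sketched as below. `withToolDeadline` is a hypothetical helper, not code from the PR:

```typescript
// Sketch: derive a per-tool AbortSignal that fires when either the
// parent (session) signal aborts or the capped per-tool budget elapses.
function withToolDeadline(
  parent: AbortSignal,
  perToolMs: number,
  remainingSessionMs: number,
): { signal: AbortSignal; done: () => void } {
  const child = new AbortController();
  const budget = Math.min(perToolMs, remainingSessionMs);
  const onParentAbort = () => child.abort(parent.reason);
  if (parent.aborted) child.abort(parent.reason);
  else parent.addEventListener('abort', onParentAbort, { once: true });
  const timer =
    budget > 0
      ? setTimeout(() => child.abort(new Error('Tool deadline exceeded')), budget)
      : undefined;
  return {
    signal: child.signal,
    // Callers must invoke done() once the tool settles, to clear the
    // timer and detach the parent listener.
    done: () => {
      if (timer !== undefined) clearTimeout(timer);
      parent.removeEventListener('abort', onParentAbort);
    },
  };
}
```

The tool registry would then receive `child.signal` in its execution context instead of the session's parent signal.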
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 90f93c66-e784-47f9-8c82-e1b367f089a2
📒 Files selected for processing (8)
- docs/enclave/api-reference/enclavejs-broker.mdx
- libs/broker/src/broker-session.ts
- libs/broker/src/openapi/catalog-handler.ts
- libs/broker/src/openapi/hash-utils.ts
- libs/broker/src/openapi/openapi-spec-poller.ts
- libs/broker/src/openapi/openapi-tool-loader.ts
- libs/client/src/types.ts
- libs/core/src/session/session-emitter.ts
✅ Files skipped from review due to trivial changes (1)
- docs/enclave/api-reference/enclavejs-broker.mdx
🚧 Files skipped from review as they are similar to previous changes (2)
- libs/client/src/types.ts
- libs/broker/src/openapi/catalog-handler.ts
…th parameter merging
Actionable comments posted: 7
♻️ Duplicate comments (1)
libs/broker/src/openapi/openapi-tool-loader.ts (1)
314-335: ⚠️ Potential issue | 🟠 Major
Request bodies are still sent as JSON regardless of `requestBody.content`.

These lines infer text/form/media-specific schemas, but Line 372 only sends bodies for `post`/`put`/`patch`, and Lines 374/410 always use `application/json` plus `JSON.stringify`. `text/plain`, `multipart/form-data`, `application/x-www-form-urlencoded`, and bodies on other methods will validate and then be encoded incorrectly or dropped.

Also applies to: 371-410
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@libs/broker/src/openapi/openapi-tool-loader.ts` around lines 314 - 335, The code validates multiple content types in buildBodySchema but always serializes and sends bodies as JSON; update the implementation so buildBodySchema (or its caller) tracks/returns the chosen media type (e.g., 'application/json', 'text/plain', 'application/x-www-form-urlencoded', 'multipart/form-data') alongside the Zod schema, and change the request-body serialization logic (the place that currently uses JSON.stringify for post/put/patch bodies) to branch on that media type: for application/json use JSON.stringify, for text/plain send raw string, for application/x-www-form-urlencoded use URLSearchParams or form-encoded string and set header Content-Type accordingly, for multipart/form-data build a FormData (and let the browser/node set the boundary instead of forcing Content-Type), and ensure bodies for other HTTP verbs are honored instead of only post/put/patch; also set the appropriate Content-Type header where the body is set.
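The media-type-driven encoding the fix prompt describes can be sketched as below. The `encodeBody` helper is illustrative (multipart/form-data is omitted for brevity; it would use `FormData` and let the runtime set the boundary):

```typescript
// Sketch: branch on the operation's declared media type instead of
// unconditionally JSON.stringify-ing every request body.
function encodeBody(
  mediaType: string,
  body: unknown,
): { contentType: string; payload: string } {
  if (mediaType.includes('json')) {
    // Covers application/json and +json variants like application/problem+json.
    return { contentType: mediaType, payload: JSON.stringify(body) };
  }
  if (mediaType === 'application/x-www-form-urlencoded') {
    const params = new URLSearchParams();
    for (const [k, v] of Object.entries(body as Record<string, unknown>)) {
      params.set(k, String(v));
    }
    return { contentType: mediaType, payload: params.toString() };
  }
  if (mediaType.startsWith('text/')) {
    return { contentType: mediaType, payload: String(body) };
  }
  // Fallback: pass through as JSON rather than silently dropping the body.
  return { contentType: 'application/json', payload: JSON.stringify(body) };
}
```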
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@libs/broker/src/broker-session.ts`:
- Around line 236-248: The terminal-branch in BrokerSession (the block using
this.isTerminal(), this._deadlineExceeded, emitter.emitFinalError,
getPartialErrors()) must preserve the deadline path: if this._deadlineExceeded
is true, set/emit EventType.DeadlineExceeded in eventStats (or adjust
eventStats.type accordingly) and return result.finalState = 'failed' (and keep
error.code = 'DEADLINE_EXCEEDED'); otherwise keep the existing 'cancelled' flow.
Update the branch so emitter.emitFinalError receives the deadline eventStats
when _deadlineExceeded is true and the result object uses finalState 'failed'
for the deadline case to match the pre-start timeout behavior.
- Around line 380-399: The code currently emits a ToolCall and initial
ToolProgress and increments toolCallCount before checking session deadline,
causing phantom calls when remaining budget is <= 0; move the deadline-check
logic (the block computing elapsed/remaining, setting _deadlineExceeded and
throwing the DEADLINE_EXCEEDED error) to run before calling
this.emitter.emitToolCall, this.emitToolProgress and this.toolCallCount++ (or
alternatively compute remaining first and bail out early), so that functions
emitToolCall, emitToolProgress and the counter increment only execute when there
is positive remaining budget; update references in this block:
emitter.emitToolCall, emitToolProgress, toolCallCount, limits.deadlineMs,
limits.perToolDeadlineMs, _deadlineExceeded.
In `@libs/broker/src/openapi/openapi-spec-poller.ts`:
- Around line 198-207: The backoff delay code in openapi-spec-poller.ts (inside
the start() retry loop) registers an 'abort' listener on runSignal each attempt
but never removes it when the timeout completes, causing listener accumulation;
fix it by capturing the onAbort handler and, in the timeout completion path,
call runSignal.removeEventListener('abort', onAbort) (and still clearTimeout) so
the listener is removed whether the delay resolves normally or via abort—ensure
the same onAbort reference is used for both addEventListener and
removeEventListener to clean up after each await.
- Around line 249-267: The code stores response headers (this.lastEtag /
this.lastModified) before the body is fully processed, so if response.text() or
sha256Hex(body) fails the poller can mark a version as cached without having
emitted/hashed it; move assignment of this.lastEtag and this.lastModified to
after successful await response.text() and successful sha256Hex(body) and after
updating this.lastHash and emitting 'changed' (i.e., only update header cache
once hashing/emitting succeeded), keeping the existing checks that emit
'unchanged' when lastHash matches to avoid advancing validators prematurely.
In `@libs/broker/src/openapi/openapi-tool-loader.ts`:
- Around line 286-307: mapOpenApiType currently returns z.string() for objects
or schemas without an explicit "type", which collapses JSON/$ref bodies; update
mapOpenApiType to handle type === 'object' by returning a generic object schema
(e.g., z.record(z.any()) or z.unknown()) and, for schemas missing a "type"
(including $ref cases), return z.unknown() or a generic object schema instead of
z.string(); ensure the same change is applied where request bodies are routed
through mapOpenApiType so $ref and object payloads are not coerced to strings.
- Around line 108-123: fromSpec() currently leaves options.baseUrl empty when
callers omit it, causing generated handlers to fetch relative paths instead of
the upstream declared in spec.servers; update the base-url resolution in
fromSpec() to mirror the logic used in fromURL(): inspect
spec['servers']?.[0]?.url, try to construct an absolute base (using new
URL(servers[0].url, url) to resolve relative server entries) and assign that to
resolvedBaseUrl, falling back to new URL(url).origin only on failure; if the
server URL is relative and cannot be resolved to an absolute origin+pathname,
fail fast with a clear error instead of returning an empty baseUrl so
OpenApiToolLoader (constructed via new OpenApiToolLoader(spec, hash, {
...options, baseUrl: resolvedBaseUrl }, auth)) always receives a valid baseUrl.
- Around line 373-399: The requestHeaders merge currently allows op.parameters
header values to overwrite credentials set from auth; update the logic in
openapi-tool-loader.ts (symbols: requestHeaders, auth, op.parameters, params) so
protected headers (at minimum 'Authorization' and the configured api-key header
name auth.header ?? 'X-API-Key') cannot be replaced by parameter headers—either
ignore op.parameters entries that match those protected header names or throw a
clear error when a collision is detected; ensure the auth-derived values always
take precedence and add a comment explaining the protection policy.
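The protected-header policy requested above can be sketched as follows; `mergeHeaders` is a hypothetical helper showing the precedence rule, not the loader's actual code:

```typescript
// Sketch: auth-derived headers always win. Parameter headers that would
// shadow a credential header (compared case-insensitively) are ignored.
function mergeHeaders(
  paramHeaders: Record<string, string>,
  authHeaders: Record<string, string>,
): Record<string, string> {
  const protectedNames = new Set(
    Object.keys(authHeaders).map((n) => n.toLowerCase()),
  );
  const out: Record<string, string> = {};
  for (const [name, value] of Object.entries(paramHeaders)) {
    // Drop any parameter header that collides with a credential header.
    if (!protectedNames.has(name.toLowerCase())) out[name] = value;
  }
  return { ...out, ...authHeaders };
}
```

A stricter variant would throw on collision instead of dropping, which surfaces misconfigured specs earlier.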
---
Duplicate comments:
In `@libs/broker/src/openapi/openapi-tool-loader.ts`:
- Around line 314-335: The code validates multiple content types in
buildBodySchema but always serializes and sends bodies as JSON; update the
implementation so buildBodySchema (or its caller) tracks/returns the chosen
media type (e.g., 'application/json', 'text/plain',
'application/x-www-form-urlencoded', 'multipart/form-data') alongside the Zod
schema, and change the request-body serialization logic (the place that
currently uses JSON.stringify for post/put/patch bodies) to branch on that media
type: for application/json use JSON.stringify, for text/plain send raw string,
for application/x-www-form-urlencoded use URLSearchParams or form-encoded string
and set header Content-Type accordingly, for multipart/form-data build a
FormData (and let the browser/node set the boundary instead of forcing
Content-Type), and ensure bodies for other HTTP verbs are honored instead of
only post/put/patch; also set the appropriate Content-Type header where the body
is set.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: bbc4e19a-5edf-4db0-a246-172587deb0d6
📒 Files selected for processing (3)
- libs/broker/src/broker-session.ts
- libs/broker/src/openapi/openapi-spec-poller.ts
- libs/broker/src/openapi/openapi-tool-loader.ts
…ved body handling
Actionable comments posted: 7
♻️ Duplicate comments (1)
libs/broker/src/openapi/openapi-spec-poller.ts (1)
260-274: ⚠️ Potential issue | 🟠 Major
Re-check cancellation before committing the new hash.

After Line 260 awaits `sha256Hex(body)`, `stop()` can abort while hashing is still in flight. Lines 262-274 then still emit and/or advance `lastHash`; if a `changed` listener throws, the next retry collapses to `unchanged` and the new spec is never replayed.

🩹 Suggested fix

```diff
  const hash = await sha256Hex(body);
+ if (runSignal.aborted) return;

  if (this.lastHash && this.lastHash === hash) {
    this.emit('unchanged');
+   if (runSignal.aborted) return;
    if (etag) this.lastEtag = etag;
    if (lastModified) this.lastModified = lastModified;
    return;
  }

- this.lastHash = hash;
  this.emit('changed', body, hash);
+ if (runSignal.aborted) return;
+ this.lastHash = hash;

  // Store headers only after successful body processing
  if (etag) this.lastEtag = etag;
  if (lastModified) this.lastModified = lastModified;
```

(Reference: Node.js EventEmitter `emit()` behavior when a listener throws synchronously.)
Verify each finding against the current code and only fix it if needed. In `@libs/broker/src/openapi/openapi-spec-poller.ts` around lines 260 - 274, After awaiting sha256Hex(body) re-check that the poller wasn't stopped before you compare/update state or emit events: add a guard (e.g., if (this.stopped || this.isStopped()) return;) immediately after the await to abort if stop() ran; also avoid advancing this.lastHash before a successful listener run — move the this.lastHash = hash assignment to after emit('changed', body, hash) and/or wrap the emit in try/catch so you only update lastHash/lastEtag/lastModified when emit completes without throwing (and still respect the cancellation guard). Ensure you reference sha256Hex, this.lastHash, emit('changed'), stop(), this.lastEtag and this.lastModified when applying the changes.
🧹 Nitpick comments (4)
libs/broker/src/broker-session.ts (4)
336-379: New public session APIs may need documentation.

The new public methods `emitToolProgress()`, `emitPartialResult()`, and `getPartialErrors()` extend the `BrokerSession` public API. Per coding guidelines for `libs/**`, ensure there is matching documentation under `docs/enclave/**` for these new capabilities.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@libs/broker/src/broker-session.ts` around lines 336 - 379, The new public BrokerSession methods emitToolProgress, emitPartialResult, and getPartialErrors were added but lack corresponding docs; add documentation entries under docs/enclave/** describing each API (purpose, parameters, return values, examples/usage, and error semantics), referencing BrokerSession and the EventType.ToolProgress and EventType.PartialResult behaviors, and update any public API index or changelog so these methods are discoverable in the libs/** documentation set.
346-354: Optional fields may include `undefined` in payload.

When `bytesReceived` and `totalBytes` are not provided, they're included as `undefined` in the payload object. Consider filtering them out:

♻️ Suggested improvement

```diff
  this.emitter.emit(
    this.makeCustomEvent(EventType.ToolProgress, {
      callId,
      phase,
      elapsedMs,
-     bytesReceived,
-     totalBytes,
+     ...(bytesReceived !== undefined && { bytesReceived }),
+     ...(totalBytes !== undefined && { totalBytes }),
    }),
  );
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@libs/broker/src/broker-session.ts` around lines 346 - 354, The payload passed to this.emitter.emit for EventType.ToolProgress includes optional properties bytesReceived and totalBytes even when undefined; update the emit call to build the event payload for makeCustomEvent by only including bytesReceived and totalBytes when they are not undefined (e.g., create a payload object with callId, phase, elapsedMs and conditionally add bytesReceived/totalBytes or delete them if undefined) so the emitted event does not contain undefined fields; target the emitter.emit / makeCustomEvent invocation and the variables callId, phase, elapsedMs, bytesReceived, totalBytes when making the change.
308-317: Consider extracting error code logic for clarity.

The nested ternary for determining `errorInfo.code` works correctly but is somewhat hard to follow. Consider extracting to a helper or using if-else for readability:

```ts
let code: string;
if (isCancelled) {
  code = this._deadlineExceeded ? 'DEADLINE_EXCEEDED' : 'SESSION_CANCELLED';
} else {
  code = (err as Error & { code?: string }).code ?? 'EXECUTION_ERROR';
}
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@libs/broker/src/broker-session.ts` around lines 308 - 317, The nested ternary that sets errorInfo.code is hard to read; extract that logic into a small helper (e.g., determineErrorCode(err) or a local function) or replace the ternary with an if/else block: use this._state === 'cancelled' (or the existing isCancelled variable) and this._deadlineExceeded to return 'DEADLINE_EXCEEDED' or 'SESSION_CANCELLED', otherwise return (err as Error & { code?: string }).code ?? 'EXECUTION_ERROR'; then use that result when building errorInfo before calling this.emitter.emitFinalError(..., eventStats, this.getPartialErrors()).
463-474: Double cast is a necessary pragmatic solution given SessionEmitter's lack of typed methods for these events.

The `as unknown as StreamEvent` cast correctly identifies a type safety gap: the generic `Record<string, unknown>` payload cannot satisfy the specific typed payloads (`DeadlineExceededPayload`, `ToolProgressPayload`, `PartialResultPayload`). However, the actual payloads being passed match their event types correctly (verified against event definitions):

- `DeadlineExceeded`: elapsedMs/budgetMs ✓
- `ToolProgress`: callId/phase/bytesReceived/totalBytes ✓
- `PartialResult`: path/data/error/hasNext ✓

Consider adding dedicated typed methods to SessionEmitter (e.g., `emitDeadlineExceeded`, `emitToolProgress`, `emitPartialResult`) to eliminate the cast and match the pattern used elsewhere in the codebase for type-safe event creation.

🤖 Prompt for AI Agents
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@libs/broker/src/openapi/openapi-spec-poller.ts`:
- Around line 106-116: The start/stop race drops the new run's initial poll if a
prior poll is aborted but still unwinding; fix by tracking the active poll
promise (e.g., add a field like this._activePollPromise or this._runId) and when
start() creates a new AbortController/ signal, if there is an existing active
poll promise attach a finally handler to that promise so once the previous run
settles you immediately invoke pollWithSignal(newSignal) for the new run
(instead of waiting for the next interval tick); update start(), stop(), and
poll invocation sites (start(), stop(), pollWithSignal(), this._runAbort,
this.intervalTimer, config.intervalMs) to set/clear the tracked promise/run id
so only the correct run triggers the queued immediate poll.
- Around line 55-60: The EventEmitter 'error' event in OpenApiPoller is reserved
and must be renamed: update the OpenApiPollerEvents interface to replace the
error tuple (error: Error) with a non-reserved name like pollError or fetchError
(e.g., pollError: [error: Error]), then change all emit calls from
this.emit('error', ...) to this.emit('pollError', ...) inside the OpenApiPoller
implementation (where consecutiveFailures, wasUnhealthy and
config.unhealthyThreshold are handled) and update any external
listeners/subscribers to listen for the new 'pollError' event name.
In `@libs/broker/src/openapi/openapi-tool-loader.ts`:
- Around line 405-407: The code currently limits request bodies to
POST/PUT/PATCH and drops explicit null bodies; change the body-detection logic
used when building/sending requests (the hasBody calculation around op.method
and params['body']) to instead check whether the operation defines a requestBody
(op.requestBody != null) and whether the caller actually provided a body key
(use Object.prototype.hasOwnProperty.call(params, 'body') or
params.hasOwnProperty('body')) so null is preserved; update the same logic in
the later send branch (the block around resolvedMediaType and header setting at
lines ~452-469) so DELETE/OPTIONS and other methods with requestBody will
include the body and Content-Type when present.
- Around line 200-205: The code currently generates baseName and toolName
(variables baseName and toolName computed from op.operationId,
this.options.namingStrategy, and this.options.sourceName) but silently allows
duplicates later when deduping toolNames; change the generator to fail fast:
after computing toolName for each operation, check a shared Set of already-seen
toolNames and if a collision is detected throw an Error (include the conflicting
toolName and operationId/method+path context); also apply the same uniqueness
check where toolNames are deduped (the logic around the toolNames array/Set used
at lines 220-221) so duplicates are rejected rather than overwritten. Ensure the
thrown error halts the refresh/registration flow so duplicate names are explicit
and must be resolved.
- Around line 286-291: The generated parameter shapes use mapOpenApiType and set
shape[param.name] but later code collapses all runtime parameter values with
String(...), which mangles arrays/objects; update the serialization logic that
currently wraps parameter values with String(...) (the call sites that serialize
path/query/header params) to detect when the mapped type is an object/array (use
the schema info from op.parameters and/or mapOpenApiType result) and serialize
such values with JSON.stringify (or the OpenAPI style/explode logic if present)
while keeping scalar values coerced with String; ensure null/undefined are
handled consistently and preserve shape[param.name] optionality when choosing
the serialization branch.
- Around line 473-490: The code only treats content-type values that include
'application/json' as JSON; change both checks that inspect
response.headers.get('content-type') (the local variable contentType used before
throwing and the later contentType check before returning) to also detect
vendor/problem JSON types by checking for '+json' (e.g., treat as JSON when
contentType includes 'application/json' OR includes '+json' / matches
/(?:^|\s)application\/(?:.+\+json|json)(?:;|$)/). Update both the body-parsing
branch and the final return branch so responses like 'application/problem+json'
or 'application/vnd.api+json' are parsed with response.json() instead of text.
- Around line 382-383: The code builds the request URL by naively concatenating
baseUrl and op.path into the variable url, which can produce double slashes when
servers[0].url or options.baseUrl ends with '/' and op.path begins with '/';
update the logic that sets url (using the existing baseUrl and op.path
variables) to normalize slashes by trimming any trailing slashes from baseUrl
(servers[0].url / options.baseUrl) and any leading slashes from op.path before
joining them with a single '/' so the result never contains '//' between host
and path.
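The server-URL resolution and slash normalization covered by the two comments above can be sketched together; `resolveBaseUrl` and `joinUrl` are hypothetical helper names:

```typescript
// Sketch: resolve a (possibly relative) servers[0].url entry like "/v1"
// against the URL the spec was fetched from, falling back to the origin.
function resolveBaseUrl(specUrl: string, serverUrl?: string): string {
  // new URL(relative, base) handles both "/v1" and absolute server URLs.
  return serverUrl
    ? new URL(serverUrl, specUrl).toString()
    : new URL(specUrl).origin;
}

// Sketch: join base and path with exactly one slash between them.
function joinUrl(baseUrl: string, path: string): string {
  return `${baseUrl.replace(/\/+$/, '')}/${path.replace(/^\/+/, '')}`;
}
```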
---
Duplicate comments:
In `@libs/broker/src/openapi/openapi-spec-poller.ts`:
- Around line 260-274: After awaiting sha256Hex(body) re-check that the poller
wasn't stopped before you compare/update state or emit events: add a guard
(e.g., if (this.stopped || this.isStopped()) return;) immediately after the
await to abort if stop() ran; also avoid advancing this.lastHash before a
successful listener run — move the this.lastHash = hash assignment to after
emit('changed', body, hash) and/or wrap the emit in try/catch so you only update
lastHash/lastEtag/lastModified when emit completes without throwing (and still
respect the cancellation guard). Ensure you reference sha256Hex, this.lastHash,
emit('changed'), stop(), this.lastEtag and this.lastModified when applying the
changes.
---
Nitpick comments:
In `@libs/broker/src/broker-session.ts`:
- Around line 336-379: The new public BrokerSession methods emitToolProgress,
emitPartialResult, and getPartialErrors were added but lack corresponding docs;
add documentation entries under docs/enclave/** describing each API (purpose,
parameters, return values, examples/usage, and error semantics), referencing
BrokerSession and the EventType.ToolProgress and EventType.PartialResult
behaviors, and update any public API index or changelog so these methods are
discoverable in the libs/** documentation set.
- Around line 346-354: The payload passed to this.emitter.emit for
EventType.ToolProgress includes optional properties bytesReceived and totalBytes
even when undefined; update the emit call to build the event payload for
makeCustomEvent by only including bytesReceived and totalBytes when they are not
undefined (e.g., create a payload object with callId, phase, elapsedMs and
conditionally add bytesReceived/totalBytes or delete them if undefined) so the
emitted event does not contain undefined fields; target the emitter.emit /
makeCustomEvent invocation and the variables callId, phase, elapsedMs,
bytesReceived, totalBytes when making the change.
- Around line 308-317: The nested ternary that sets errorInfo.code is hard to
read; extract that logic into a small helper (e.g., determineErrorCode(err) or a
local function) or replace the ternary with an if/else block: use this._state
=== 'cancelled' (or the existing isCancelled variable) and
this._deadlineExceeded to return 'DEADLINE_EXCEEDED' or 'SESSION_CANCELLED',
otherwise return (err as Error & { code?: string }).code ?? 'EXECUTION_ERROR';
then use that result when building errorInfo before calling
this.emitter.emitFinalError(..., eventStats, this.getPartialErrors()).
- Around line 463-474: The makeCustomEvent function is using a double cast to
StreamEvent because SessionEmitter lacks typed emitters for specific event
payloads; add dedicated, type-safe emitter methods on SessionEmitter (e.g.,
emitDeadlineExceeded, emitToolProgress, emitPartialResult) that accept
strongly-typed payloads and construct StreamEvent objects rather than passing
Record<string, unknown>, then update callers to use those new methods so
makeCustomEvent can either be removed or typed properly without the as unknown
as StreamEvent cast; ensure new methods mirror existing emitter patterns and
produce StreamEvent instances with protocolVersion, sessionId, seq (via
this.emitter.nextSeq()), type, and typed payload.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: c3c5c50f-0de8-4005-a43f-17609b921f82
📒 Files selected for processing (3)
- libs/broker/src/broker-session.ts
- libs/broker/src/openapi/openapi-spec-poller.ts
- libs/broker/src/openapi/openapi-tool-loader.ts
Summary by CodeRabbit
New Features
Documentation