Conversation


@sibidharan sibidharan commented Oct 28, 2025

Problem

The slowdown, which gets worse as a chat session grows, was traced back to the request log view inside the Copilot Chat extension. Every time a new chat request or tool call arrived, the old code rebuilt the entire tree of prompts and children from scratch, reflattening the full history and firing a full refresh event. In long chat sessions that tree gets large, so each update became progressively heavier and the UI stuttered.

I fixed it by making ChatRequestProvider incremental:

It now tracks how many log entries have already been processed, keeps the existing prompt nodes in a cache, and only appends new children. If the log ever shrinks (e.g., on reload) it resets cleanly; otherwise it never rebuilds everything.
Refresh events are fired only for the nodes that genuinely changed; a root-level refresh happens once when the structure changes, instead of on every update. With this change the request log stops rebuilding itself over and over, so the chat UI stays responsive even during long-running sessions.
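In code terms, the provider now looks roughly like the sketch below. This is a minimal illustration, not the actual contents of requestLogTree.ts: the LogEntry shape, the ChatPromptItem class, and the exact method signatures are assumptions, though the field and method names (rootItems, processedCount, currentPrompt, appendNewEntries) follow the description above.

```typescript
import * as vscode from 'vscode';

// Hypothetical log entry shape; the real type comes from the request log service.
interface LogEntry {
	prompt: string;
	detail: string;
}

// One root node per prompt, holding the requests / tool calls made for it.
class ChatPromptItem extends vscode.TreeItem {
	readonly children: vscode.TreeItem[] = [];
	constructor(label: string) {
		super(label, vscode.TreeItemCollapsibleState.Expanded);
	}
}

export class ChatRequestProvider implements vscode.TreeDataProvider<vscode.TreeItem> {
	private readonly _onDidChangeTreeData = new vscode.EventEmitter<vscode.TreeItem | undefined>();
	readonly onDidChangeTreeData = this._onDidChangeTreeData.event;

	// Cached state, so an update never rebuilds the whole tree.
	private rootItems: ChatPromptItem[] = [];
	private processedCount = 0;
	private currentPrompt: ChatPromptItem | undefined;

	// Called whenever the request log changes.
	update(entries: readonly LogEntry[]): void {
		if (entries.length < this.processedCount) {
			// The log shrank (e.g. it was cleared on reload): drop the cache and start over.
			this.rootItems = [];
			this.processedCount = 0;
			this.currentPrompt = undefined;
		}
		this.appendNewEntries(entries);
	}

	private appendNewEntries(entries: readonly LogEntry[]): void {
		if (entries.length === this.processedCount) {
			return; // nothing new to process
		}
		let structureChanged = false;
		for (let i = this.processedCount; i < entries.length; i++) {
			const entry = entries[i];
			if (!this.currentPrompt || this.currentPrompt.label !== entry.prompt) {
				// A new prompt starts a new root node.
				this.currentPrompt = new ChatPromptItem(entry.prompt);
				this.rootItems.push(this.currentPrompt);
				structureChanged = true;
			}
			this.currentPrompt.children.push(new vscode.TreeItem(entry.detail));
		}
		this.processedCount = entries.length;

		if (structureChanged) {
			this._onDidChangeTreeData.fire(undefined); // one root-level refresh
		} else {
			this._onDidChangeTreeData.fire(this.currentPrompt); // refresh only the node that grew
		}
	}

	getTreeItem(element: vscode.TreeItem): vscode.TreeItem {
		return element;
	}

	getChildren(element?: vscode.TreeItem): vscode.TreeItem[] {
		return element instanceof ChatPromptItem ? element.children : this.rootItems;
	}
}
```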

Fix Summary

  • reworked ChatRequestProvider so it caches prompt/root nodes, appends new log entries incrementally, and only refreshes affected tree items (see the refresh sketch after this list)
  • reset cached state automatically if the request log shrinks (e.g., on reload) to avoid stale UI
  • no other files or dependencies were touched; the change lives entirely in src/extension/log/vscode-node/requestLogTree.ts
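The targeted refresh relies on the standard vscode.TreeDataProvider contract: firing onDidChangeTreeData with a specific element makes VS Code re-query only that element's subtree, while firing with undefined re-queries from the root. A minimal sketch (class and method names here are illustrative):

```typescript
import * as vscode from 'vscode';

class RequestLogTree implements vscode.TreeDataProvider<vscode.TreeItem> {
	private readonly _onDidChangeTreeData = new vscode.EventEmitter<vscode.TreeItem | undefined>();
	readonly onDidChangeTreeData = this._onDidChangeTreeData.event;

	// Re-render only the given node and its children.
	refreshNode(node: vscode.TreeItem): void {
		this._onDidChangeTreeData.fire(node);
	}

	// Re-render the whole tree; reserved for structural changes such as a new root node.
	refreshRoot(): void {
		this._onDidChangeTreeData.fire(undefined);
	}

	getTreeItem(element: vscode.TreeItem): vscode.TreeItem {
		return element;
	}

	getChildren(): vscode.TreeItem[] {
		return [];
	}
}
```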

Testing

  • npm run compile
  • npm run test:unit
  • Manual: built a VSIX, installed it on the machine where the slowdown was originally observed, and confirmed the request log view stays responsive during long chat sessions

Copilot AI review requested due to automatic review settings October 28, 2025 06:31

Copilot AI left a comment

Pull Request Overview

This PR refactors the ChatRequestProvider class to improve performance by introducing incremental tree updates instead of rebuilding the entire tree on every request log change. The key optimization is caching prompt/root nodes and only refreshing affected tree items when new entries are added.

Key Changes:

  • Introduced state caching (rootItems, seenChatRequests, processedCount, currentPrompt) to track processed entries
  • Replaced the full tree rebuild with an incremental appendNewEntries() method that only processes new log entries
  • Added automatic cache reset when the request log shrinks to prevent stale UI state (see the sketch below)
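For illustration, the cached state and the reset path described above might look like the following; the field names follow the bullets, but their exact types and the reset trigger in requestLogTree.ts are assumptions:

```typescript
import * as vscode from 'vscode';

class CachedRequestLogState {
	rootItems: vscode.TreeItem[] = [];          // cached root/prompt nodes shown in the view
	seenChatRequests = new Set<string>();       // requests that already have a tree node
	processedCount = 0;                         // how many log entries have been handled so far
	currentPrompt: vscode.TreeItem | undefined; // node that new children are appended to

	// If the incoming log is shorter than what was already processed (e.g. it was cleared
	// on reload), drop the cache so the view does not keep nodes for entries that no longer exist.
	resetIfShrunk(logLength: number): void {
		if (logLength < this.processedCount) {
			this.rootItems = [];
			this.seenChatRequests.clear();
			this.processedCount = 0;
			this.currentPrompt = undefined;
		}
	}
}
```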

@sibidharan

@microsoft-github-policy-service agree

@sibidharan sibidharan force-pushed the feature/request-log-refresh branch from b198148 to da80e7b on October 28, 2025 06:37
@sibidharan sibidharan requested a review from Copilot October 28, 2025 08:26

Copilot AI left a comment

Pull Request Overview

Copilot reviewed 1 out of 1 changed files in this pull request and generated 4 comments.
