Improve request log tree performance #1671
base: main
Conversation
Pull Request Overview
This PR refactors the ChatRequestProvider class to improve performance by introducing incremental tree updates instead of rebuilding the entire tree on every request log change. The key optimization is caching prompt/root nodes and only refreshing affected tree items when new entries are added.
Key Changes:
- Introduced state caching (`rootItems`, `seenChatRequests`, `processedCount`, `currentPrompt`) to track processed entries (sketched below)
- Replaced full tree rebuild with an incremental `appendNewEntries()` method that only processes new log entries
- Added automatic cache reset when the request log shrinks to prevent stale UI state
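Roughly, the cached state could be shaped like the sketch below. This is a minimal illustration only: the field names come from this PR, but the placeholder `PromptTreeItem` type and the exact field types are assumptions.

```ts
import * as vscode from 'vscode';

// Placeholder node type standing in for the extension's real prompt tree item.
class PromptTreeItem extends vscode.TreeItem {
	children: vscode.TreeItem[] = [];
}

class ChatRequestProvider implements vscode.TreeDataProvider<vscode.TreeItem> {
	// Cached state, so the tree is not rebuilt from scratch on every log change.
	// (These fields are maintained by appendNewEntries(), sketched further down.)
	private rootItems: PromptTreeItem[] = [];          // cached prompt/root nodes
	private seenChatRequests = new Set<string>();      // request ids already turned into prompt nodes
	private processedCount = 0;                        // number of log entries already consumed
	private currentPrompt: PromptTreeItem | undefined; // prompt node that new children attach to

	getTreeItem(element: vscode.TreeItem): vscode.TreeItem {
		return element;
	}

	// Children are served straight from the cache; nothing is re-flattened here.
	getChildren(element?: vscode.TreeItem): vscode.TreeItem[] {
		if (!element) {
			return this.rootItems;
		}
		return element instanceof PromptTreeItem ? element.children : [];
	}
}
```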
@microsoft-github-policy-service agree
Force-pushed from b198148 to da80e7b
Pull Request Overview
Copilot reviewed 1 out of 1 changed files in this pull request and generated 4 comments.
Problem
The slowdown showed up as chat sessions grew larger and was traced back to the request log view inside the Copilot Chat extension. Every time a new chat request or tool call arrived, the old code rebuilt the entire tree of prompts and children from scratch, re-flattening the full history and firing a full refresh event. In long chat sessions that tree grows large, so each update became progressively heavier and the UI stuttered.
I fixed it by making `ChatRequestProvider` incremental:
It now tracks how many log entries have already been processed, keeps the existing prompt nodes in a cache, and only appends new children. If the log ever shrinks (e.g., on reload) it resets the cache cleanly, but otherwise it avoids rebuilding everything.
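To make that concrete, here is a minimal sketch of what the incremental append could look like. The `LogEntry` shape, the `PromptTreeItem` helper, and the way prompt entries are distinguished from child entries are assumptions for illustration; only the cached field names and `appendNewEntries()` come from the change itself.

```ts
import * as vscode from 'vscode';

// Assumed shapes; the extension's real log entry and node types differ.
interface LogEntry {
	requestId: string;
	prompt: string;
	label: string;
}

class PromptTreeItem extends vscode.TreeItem {
	children: vscode.TreeItem[] = [];
}

class ChatRequestProvider {
	private rootItems: PromptTreeItem[] = [];
	private seenChatRequests = new Set<string>();
	private processedCount = 0;
	private currentPrompt: PromptTreeItem | undefined;

	// Called whenever the request log changes; only entries past processedCount are touched.
	appendNewEntries(log: readonly LogEntry[]): void {
		// If the log shrank (e.g. it was cleared on reload), the cache is stale: reset everything.
		if (log.length < this.processedCount) {
			this.rootItems = [];
			this.seenChatRequests.clear();
			this.processedCount = 0;
			this.currentPrompt = undefined;
		}

		// Walk only the entries that have not been processed yet.
		for (let i = this.processedCount; i < log.length; i++) {
			const entry = log[i];
			if (!this.seenChatRequests.has(entry.requestId)) {
				// First entry for a request: create a prompt/root node and remember it.
				this.seenChatRequests.add(entry.requestId);
				this.currentPrompt = new PromptTreeItem(entry.prompt, vscode.TreeItemCollapsibleState.Expanded);
				this.rootItems.push(this.currentPrompt);
			} else if (this.currentPrompt) {
				// Later entries for the same request (e.g. tool calls) become children of that prompt.
				this.currentPrompt.children.push(new vscode.TreeItem(entry.label));
			}
		}
		this.processedCount = log.length;
	}
}
```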
Refresh events are fired only for the nodes that genuinely changed; a root-level refresh happens once when the structure changes, instead of on every update. Once that change is in, the request log stops rebuilding itself over and over, so the chat UI stays responsive even during long-running sessions.
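The targeted refresh part relies on the standard `TreeDataProvider.onDidChangeTreeData` contract: firing the event with a specific element refreshes just that subtree, while firing it with `undefined` refreshes from the root. A sketch of that pattern, with a hypothetical `notifyChanged` helper rather than the PR's exact code:

```ts
import * as vscode from 'vscode';

class ChatRequestProvider implements vscode.TreeDataProvider<vscode.TreeItem> {
	private readonly _onDidChangeTreeData = new vscode.EventEmitter<vscode.TreeItem | undefined>();
	readonly onDidChangeTreeData = this._onDidChangeTreeData.event;

	getTreeItem(element: vscode.TreeItem): vscode.TreeItem {
		return element;
	}

	getChildren(): vscode.TreeItem[] {
		return []; // children come from the cache, as in the earlier sketches
	}

	// Hypothetical helper: refresh only what actually changed.
	notifyChanged(changedPrompts: Iterable<vscode.TreeItem>, rootsChanged: boolean): void {
		if (rootsChanged) {
			// Structure changed (a new prompt node was appended): one root-level refresh.
			this._onDidChangeTreeData.fire(undefined);
			return;
		}
		// Otherwise refresh only the prompt nodes whose children grew.
		for (const prompt of changedPrompts) {
			this._onDidChangeTreeData.fire(prompt);
		}
	}
}
```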
Fix Summary
Refactored `ChatRequestProvider` (`src/extension/log/vscode-node/requestLogTree.ts`) so it caches prompt/root nodes, appends new log entries incrementally, and only refreshes affected tree items.
Testing