* feat: make stream a caller-controlled passthrough in Messages API
Allow callers to set stream: false via requestOptions instead of
hardcoding stream: true. Add non-streaming response handler for the
Anthropic Messages API that parses single JSON responses.
- createMessagesRequestBody: stream: true → options.requestOptions?.stream ?? true
- preparePostOptions: stream: true as default before spread (callers can override)
- processResponseFromMessagesEndpoint: auto-detect via Content-Type header
- processNonStreamingResponseFromMessagesEndpoint: new handler for JSON responses
with tool call support in finishedCb delta, defensive parsing, cache-token
consistency warning, unknown block type logging
- Remove stale 'stream not respected' comment from fetch.ts
- Remove stream: false from agentIntent.ts inline summarization
- 10 new tests for non-streaming handler
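The default-with-override pattern described above can be sketched as follows. The function names `createMessagesRequestBody` and `preparePostOptions` come from the commit; the bodies here are illustrative reductions, not the real implementations:

```typescript
// Hypothetical shape of the caller-supplied options; the real interface lives elsewhere.
interface RequestOptions { stream?: boolean; }

// Sketch of the createMessagesRequestBody change: honor a caller-supplied
// `stream`, falling back to the old hardcoded `true` when unset.
function resolveStream(requestOptions?: RequestOptions): boolean {
    return requestOptions?.stream ?? true;
}

// Sketch of preparePostOptions: the default is placed before the spread,
// so anything the caller passes wins over `stream: true`.
function preparePostOptions(callerOptions: Record<string, unknown>): Record<string, unknown> {
    return { stream: true, ...callerOptions };
}
```

The key detail is ordering: because `stream: true` precedes the spread, it acts as a default rather than an override.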
* fix: add telemetry parity for non-streaming path and bump cache salt
* regenerate simulation cache for review-inline tests
* Regenerate simulation cache after rebase
* Temporarily disable multifile-edit-claude variant (#315940)
claude-3.5-sonnet returns model_not_supported from the endpoint, breaking
simulation cache regen. Re-enable when the test is updated to use a
currently-supported Claude model.
* Fix terminal strict-mode crash on empty suggestions + update baseline
- terminal.stest.ts: guard strict-mode `ok()` predicate so when the model
returns no code block, the test fails cleanly with the existing message
instead of crashing with 'Cannot read properties of undefined (reading match)'.
Also drop the stale commented-out debug block.
- baseline.json: refresh scores (68.01 -> 68.69) and drop the 14 entries for
the disabled multifile-edit-claude variant (see #315940).
- Remove now-orphaned multifile-edit-claude-panel.json outcome file.
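The strict-mode guard described above can be sketched like this. The regex and helper name are assumptions; the point is the optional chaining that turns a missing code block into a clean `false` instead of a TypeError:

```typescript
// Sketch of the guarded predicate: if the model response is missing or has no
// fenced code block, return false so `ok()` fails with the test's own message
// instead of crashing with "Cannot read properties of undefined (reading 'match')".
function hasCodeBlock(response: string | undefined): boolean {
    return !!response?.match(/```[\s\S]*?```/);
}
```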
* Apply CI-observed score improvements for cpp inline scenarios
CI on Linux scores 4 cpp InlineChatIntent scenarios higher than my local
macOS run does (likely platform-specific line-ending/whitespace normalization
in the cpp grader). Update baseline.json to match the Linux scores:
- edit-InlineChatIntent [inline] [cpp] - edit for cpp: 5 -> 9
- edit-InlineChatIntent [inline] [cpp] - edit for macro: 0 -> 2
- generate-InlineChatIntent [inline] [cpp] - cpp code generation: 3 -> 10
- generate-InlineChatIntent [inline] [cpp] - templated code gen: 0 -> 10
Overall score: 68.69 -> 68.86.
* Populate cpp diagnostic cache via Docker for cross-platform parity
The earlier rebase cache regen produced new LLM responses for the cpp
inline tests but failed to populate the clang diagnostic provider cache
for those new inputs, because clang detection on macOS is broken (Apple
clang prints '-v' output to stderr, but findIfInstalled only checks
stdout) and Docker wasn't running. As a result the cpp diagnostic cache
was missing entries for the new LLM responses, and CI re-ran clang live
on each platform with diverging results:
- Linux CI: clang available, scored highest (9, 2, 10, 10)
- Windows CI: no clang, errored out (5, 0, 10, 10 with worsening)
- macOS: Apple clang misdetected as missing, Docker off, errored
This commit:
1. Bumps CLANG_DIAGNOSTICS_PROVIDER_CACHE_SALT 5 -> 6 to invalidate
any contaminated entries.
2. Adds two new cache layers populated by running cpp tests via Docker
(using the mcr.microsoft.com/devcontainers/cpp image, same Linux
clang as CI). All 14 cpp scenarios now produce deterministic,
platform-independent diagnostic results when read from cache.
Verified with --require-cache: all cpp scenarios pass without invoking
clang/docker at runtime.
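The stdout-only detection bug described above suggests a fix along these lines. This is a minimal sketch assuming a `findIfInstalled`-style helper; the real signature may differ:

```typescript
import { execFile } from 'node:child_process';

// Sketch of a fixed findIfInstalled: Apple clang writes its `-v` banner to
// stderr, so detection must inspect both streams, not stdout alone.
function findIfInstalled(binary: string): Promise<boolean> {
    return new Promise(resolve => {
        execFile(binary, ['-v'], (error, stdout, stderr) => {
            if (error) { return resolve(false); }
            // Combine both streams before looking for a version banner.
            resolve(/version/i.test(stdout + stderr));
        });
    });
}
```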
* tools: add binary file support with hexdump display
Adds support for reading and displaying binary files in the read file tool
with a hexdump-formatted view. This enables better handling of binary content
in the IDE context without attempting to interpret it as text.
- Adds hexdump utility to format binary data in a readable hex/ASCII view
- Extends readFileTool to detect binary files and provide formatted output
- Adds binaryFileHexdump prompt component for displaying binary content
- Integrates binary file variable support in file variable display
- Updates test fixtures with binary file handling scenarios
Fixes https://github.com/microsoft/vscode/issues/284178
Fixes https://github.com/microsoft/vscode/issues/299973
(Commit message generated by Copilot)
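A hexdump utility of the kind described above typically looks like this. The 16-byte rows and offset/hex/ASCII column layout are assumptions; the tool's actual formatting may differ:

```typescript
// Sketch of a hexdump formatter: each row shows a zero-padded offset,
// the bytes as hex pairs, and a printable-ASCII rendering ('.' for the rest).
function hexdump(data: Uint8Array, bytesPerRow = 16): string {
    const rows: string[] = [];
    for (let offset = 0; offset < data.length; offset += bytesPerRow) {
        const chunk = data.subarray(offset, offset + bytesPerRow);
        const hex = Array.from(chunk, b => b.toString(16).padStart(2, '0')).join(' ');
        const ascii = Array.from(chunk, b =>
            (b >= 0x20 && b < 0x7f) ? String.fromCharCode(b) : '.').join('');
        rows.push(`${offset.toString(16).padStart(8, '0')}  ${hex.padEnd(bytesPerRow * 3 - 1)}  ${ascii}`);
    }
    return rows.join('\n');
}
```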
* pr comments
* baseline update
* baseline update
* Add session transcript service for hooks
Implement JSONL-based session transcript service that records
conversation turns, tool executions, and assistant messages for
hook consumers.
Key changes:
- New ISessionTranscriptService interface and SessionTranscriptService
implementation with buffered writes and automatic old transcript cleanup
- Integrate transcript logging into tool calling loop: user messages,
assistant messages, tool execution start/complete, turn boundaries
- Auto-flush transcript and inject transcript_path into hook input
- Race flush with 500ms timeout to avoid blocking hook execution
- Gate transcript creation on ChatRequest.hasHooksEnabled
- Include copilotVersion and vscodeVersion in session.start entry
- Add timestamp to IToolCallRound for transcript timing
- Add hasHooksEnabled to ChatRequest interface and all implementors
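The flush race described above can be sketched with `Promise.race`. The helper name and the 500ms budget are modeled on the commit description; the real service wires this into hook invocation:

```typescript
// Sketch: race a transcript flush against a timeout so a slow disk write
// cannot block hook execution. The flush keeps running in the background
// if the timeout wins.
function raceFlush(flush: Promise<void>, timeoutMs = 500): Promise<'flushed' | 'timeout'> {
    let timer: ReturnType<typeof setTimeout> | undefined;
    const timeout = new Promise<'timeout'>(resolve => {
        timer = setTimeout(() => resolve('timeout'), timeoutMs);
    });
    return Promise.race([flush.then(() => 'flushed' as const), timeout])
        .finally(() => clearTimeout(timer));
}
```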
* Fixes
* Remove `@workspace` chat participant
For https://github.com/microsoft/vscode/issues/292972
Removes the `@workspace` chat participant since this is now an outdated (and confusing) way to use code search. For now we'll keep the commands, but I've moved them under the default agent instead.
* Updating tests too and fixing some references
* Remove leftover 'pause' feature logic
We removed the 'pause' feature back in ~June; this cleans up logic we still
had around it. It removes:
- The PauseController class which was no longer being used
- onPaused event parameters from chat participant, request, and intent
handlers
- Pause-related event listening and stream pausing logic
- Simplified throwIfCancelled to be synchronous since it no longer needs
to wait for pause resumption
(Commit message generated by Copilot)
* chat: wire up yieldrequested for steering messages
Allows the client to do a 'soft cancel' after a tool call completes but before returning to the model, or before the next turn.
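The soft-cancel semantics can be sketched as follows. All names here are illustrative, not the real API; the distinction from a hard `CancellationToken` cancel is that in-flight work finishes before the loop stops:

```typescript
// Hypothetical steering state; the real wiring goes through the chat request.
interface SteeringState { yieldRequested: boolean; }

// Sketch: a yield request lets pending tool calls finish, but prevents
// starting another model turn (unlike a hard cancel, which aborts immediately).
function shouldContinueLoop(state: SteeringState, pendingToolCalls: number): boolean {
    return !state.yieldRequested || pendingToolCalls > 0;
}
```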
* fix compile
* Add question carousel functionality and related types for chat interactions
* feat: enhance logging for question carousel processing and answer conversion
* feat: enhance answer processing in AskQuestionsTool to support freeform text input
* cleanup: remove unused AskQuestionResult type from IQuestionAnswer
* cleanup: restore package.json
* feat: enhance AskQuestionsTool to handle no stream scenario and improve carousel answer conversion
* Enable the `no-unexternalized-strings` lint rule
Enables the same `no-unexternalized-strings` rule we have in `vscode` in this repo. This makes sure we have a more consistent style across repos and when generating edits.
* Always use `github.copilot.editingSessionEditor`, enable slash-command based on v2-config
This makes all requests go to the `InlineChatIntent`, which decides (based on the v2-config) whether to use the "old world" path for inline chat requests.
* re-run tests