mirror of https://github.com/openclaw/openclaw.git
The Ollama stream function requested `stream: true` from the API but accumulated all content chunks internally, emitting only a single `done` event at the end. This prevented downstream consumers (block streaming pipeline, typing indicators, draft stream) from receiving incremental text updates during generation.

It now emits the full `start → text_start → text_delta* → text_end → done` event sequence, matching the AssistantMessageEvent contract used by the Anthropic, OpenAI, and Google providers. Each `text_delta` carries both the incremental `delta` and an accumulated `partial` snapshot. Tool-call-only responses (no text content) continue to emit only the `done` event, preserving backward compatibility.

Signed-off-by: Jakub Rusz <jrusz@proton.me>
Co-authored-by: Claude <claude-opus-4-6> <noreply@anthropic.com>
Co-authored-by: Bruce MacDonald <brucewmacdonald@gmail.com>
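The event sequence described above can be sketched as a small generator. This is a minimal illustration, not the plugin's actual implementation: the `AssistantMessageEvent` shape and the `streamEvents` helper name are assumptions based on the commit description.

```typescript
// Hypothetical event type modeled on the contract described in the commit
// message; the real AssistantMessageEvent in the codebase may differ.
type AssistantMessageEvent =
  | { type: "start" }
  | { type: "text_start" }
  | { type: "text_delta"; delta: string; partial: string }
  | { type: "text_end" }
  | { type: "done"; content: string };

// Turn raw incremental text chunks into the full event sequence.
// A response with no text content (e.g. tool-call-only) emits only `done`,
// preserving the backward-compatible behavior noted above.
function* streamEvents(chunks: string[]): Generator<AssistantMessageEvent> {
  const textChunks = chunks.filter((c) => c.length > 0);
  if (textChunks.length === 0) {
    yield { type: "done", content: "" };
    return;
  }
  yield { type: "start" };
  yield { type: "text_start" };
  let partial = "";
  for (const delta of textChunks) {
    partial += delta; // accumulate the snapshot carried by each text_delta
    yield { type: "text_delta", delta, partial };
  }
  yield { type: "text_end" };
  yield { type: "done", content: partial };
}
```

For two chunks `["Hel", "lo"]` this yields `start`, `text_start`, two `text_delta` events (the second with `partial: "Hello"`), `text_end`, and `done`; for an empty chunk list it yields only `done`.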
Files:
- src/
- README.md
- api.ts
- index.test.ts
- index.ts
- openclaw.plugin.json
- package.json
- runtime-api.ts
README.md
Ollama Provider
Bundled provider plugin for Ollama discovery and setup.