openclaw/extensions/ollama
Jakub Rusz 8f44bd6426
fix(ollama): emit streaming events for text content during generation (#53891)
The Ollama stream function requested `stream: true` from the API but
accumulated all content chunks internally, emitting only a single `done`
event at the end. This prevented downstream consumers (block streaming
pipeline, typing indicators, draft stream) from receiving incremental
text updates during generation.

Emit the full `start → text_start → text_delta* → text_end → done`
event sequence matching the AssistantMessageEvent contract used by
Anthropic, OpenAI, and Google providers. Each `text_delta` carries both
the incremental `delta` and an accumulated `partial` snapshot.

Tool-call-only responses (no text content) continue to emit only the
`done` event, preserving backward compatibility.
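
A minimal sketch of the emission logic described above, as an async generator. The event names follow the commit message; the chunk shape (`message.content`, `done`) and the `AssistantMessageEvent` field names are assumptions for illustration, not the actual OpenClaw types. `start` is deferred until the first text delta so that tool-call-only responses fall through to a lone `done` event.

```typescript
// Hypothetical event union modeled on the sequence named in the commit
// message; the real AssistantMessageEvent contract may differ.
type AssistantMessageEvent =
  | { type: "start" }
  | { type: "text_start" }
  | { type: "text_delta"; delta: string; partial: string }
  | { type: "text_end" }
  | { type: "done"; content: string };

// Simplified stand-in for an Ollama streaming response chunk.
interface OllamaChunk {
  message?: { content?: string };
  done: boolean;
}

async function* streamEvents(
  chunks: AsyncIterable<OllamaChunk>,
): AsyncGenerator<AssistantMessageEvent> {
  let partial = "";
  let textStarted = false;

  for await (const chunk of chunks) {
    const delta = chunk.message?.content ?? "";
    if (delta.length === 0) continue;

    if (!textStarted) {
      textStarted = true;
      yield { type: "start" };
      yield { type: "text_start" };
    }

    partial += delta;
    // Each delta event carries both the increment and the
    // accumulated snapshot, per the commit description.
    yield { type: "text_delta", delta, partial };
  }

  if (textStarted) yield { type: "text_end" };
  // Tool-call-only responses (no text) emit only this event.
  yield { type: "done", content: partial };
}
```

For a response streaming `"He"` then `"llo"`, this yields `start`, `text_start`, two `text_delta` events (the second with `partial === "Hello"`), `text_end`, and `done`; for a chunk stream with no text content it yields only `done`.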


Signed-off-by: Jakub Rusz <jrusz@proton.me>
Co-authored-by: Claude <claude-opus-4-6> <noreply@anthropic.com>
Co-authored-by: Bruce MacDonald <brucewmacdonald@gmail.com>
2026-03-27 11:12:09 -07:00
src fix(ollama): emit streaming events for text content during generation (#53891) 2026-03-27 11:12:09 -07:00
README.md
api.ts refactor: route ollama sdk through public barrels 2026-03-27 13:46:17 +00:00
index.test.ts refactor: move provider runtime into extensions 2026-03-27 05:38:58 +00:00
index.ts refactor: move provider runtime into extensions 2026-03-27 05:38:58 +00:00
openclaw.plugin.json refactor: move bundled plugin policy into manifests 2026-03-27 16:40:27 +00:00
package.json chore: bump versions to 2026.3.26 2026-03-27 02:03:22 +00:00
runtime-api.ts refactor: route ollama sdk through public barrels 2026-03-27 13:46:17 +00:00

README.md

Ollama Provider

Bundled provider plugin for Ollama discovery and setup.