mirror of https://github.com/openclaw/openclaw.git
docs: refresh minimax thinking refs
This commit is contained in:
parent
3dda75894b
commit
3ec0463da9
@ -557,6 +557,10 @@ MiniMax is configured via `models.providers` because it uses custom endpoints:
See [/providers/minimax](/providers/minimax) for setup details, model options, and config snippets.

On MiniMax's Anthropic-compatible streaming path, OpenClaw disables thinking by default unless you explicitly set it, and `/fast on` rewrites `MiniMax-M2.7` to `MiniMax-M2.7-highspeed`.

### Ollama
Ollama ships as a bundled provider plugin and uses Ollama's native API:
@ -2455,6 +2455,10 @@ Set `MINIMAX_API_KEY`. Shortcuts:

`openclaw onboard --auth-choice minimax-global-api` or
`openclaw onboard --auth-choice minimax-cn-api`.
The model catalog now defaults to M2.7 only.

On the Anthropic-compatible streaming path, OpenClaw disables MiniMax thinking by default unless you explicitly set `thinking` yourself. `/fast on` or `params.fastMode: true` rewrites `MiniMax-M2.7` to `MiniMax-M2.7-highspeed`.

</Accordion>
@ -143,6 +143,12 @@ openclaw onboard --auth-choice minimax-cn-api
}
```

On the Anthropic-compatible streaming path, OpenClaw now disables MiniMax thinking by default unless you explicitly set `thinking` yourself. MiniMax's streaming endpoint emits `reasoning_content` in OpenAI-style delta chunks instead of native Anthropic thinking blocks, which can leak internal reasoning into visible output if left enabled implicitly.

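The leak scenario above can be sketched in a few lines. The chunk shape below is an assumption modeled on OpenAI-style streaming deltas, and `visible_text` is a hypothetical helper, not OpenClaw's actual wire handling:

```python
def visible_text(chunks):
    """Concatenate only user-visible content, dropping reasoning deltas."""
    out = []
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"]
        # MiniMax streams reasoning as `reasoning_content` rather than native
        # Anthropic thinking blocks, so it must be filtered out explicitly --
        # or never requested, which is what disabling thinking by default does.
        if delta.get("content"):
            out.append(delta["content"])
    return "".join(out)

chunks = [
    {"choices": [{"delta": {"reasoning_content": "Let me think..."}}]},
    {"choices": [{"delta": {"content": "Hello"}}]},
    {"choices": [{"delta": {"content": "!"}}]},
]

print(visible_text(chunks))  # -> Hello!
```

A client that naively concatenates every delta field would surface the `reasoning_content` text to the user, which is the leak the default guards against.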
### MiniMax M2.7 as fallback (example)
**Best for:** keep your strongest latest-generation model as primary, fail over to MiniMax M2.7.
@ -196,6 +202,11 @@ Current MiniMax auth choices in the wizard/CLI:
- Model refs are `minimax/<model>`.
- Default chat model: `MiniMax-M2.7`
- Alternate chat model: `MiniMax-M2.7-highspeed`
- On `api: "anthropic-messages"`, OpenClaw injects `thinking: { type: "disabled" }` unless thinking is already explicitly set in params/config.
- `/fast on` or `params.fastMode: true` rewrites `MiniMax-M2.7` to `MiniMax-M2.7-highspeed` on the Anthropic-compatible stream path.
- Onboarding and direct API-key setup write explicit model definitions with `input: ["text", "image"]` for both M2.7 variants.
- The bundled provider catalog currently exposes the chat refs as text-only.

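The thinking injection and fast-mode rewrite in the list above can be sketched as a small helper. `build_request` is hypothetical and only illustrative; it is not OpenClaw's actual implementation:

```python
def build_request(model: str, params: dict) -> dict:
    """Sketch of request assembly for the Anthropic-compatible MiniMax path."""
    request = {"model": model, **params}
    # Inject disabled thinking unless the caller set `thinking` explicitly.
    request.setdefault("thinking", {"type": "disabled"})
    # Fast mode rewrites the base model to its high-speed variant.
    if request.pop("fastMode", False) and request["model"] == "MiniMax-M2.7":
        request["model"] = "MiniMax-M2.7-highspeed"
    return request

print(build_request("MiniMax-M2.7", {"fastMode": True}))
# {'model': 'MiniMax-M2.7-highspeed', 'thinking': {'type': 'disabled'}}
```

Note that an explicit `thinking` param survives untouched: `build_request("MiniMax-M2.7", {"thinking": {"type": "enabled"}})` keeps thinking enabled.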
@ -21,6 +21,7 @@ title: "Thinking Levels"
- `highest`, `max` map to `high`.
- Provider notes:
  - Anthropic Claude 4.6 models default to `adaptive` when no explicit thinking level is set.
  - MiniMax (`minimax/*`) on the Anthropic-compatible streaming path defaults to `thinking: { type: "disabled" }` unless you explicitly set thinking in model params or request params. This avoids leaked `reasoning_content` deltas from MiniMax's non-native Anthropic stream format.
  - Z.AI (`zai/*`) only supports binary thinking (`on`/`off`). Any non-`off` level is treated as `on` (mapped to `low`).
  - Moonshot (`moonshot/*`) maps `/think off` to `thinking: { type: "disabled" }` and any non-`off` level to `thinking: { type: "enabled" }`. When thinking is enabled, Moonshot only accepts `tool_choice` `auto|none`; OpenClaw normalizes incompatible values to `auto`.

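The level aliasing and the Z.AI binary mapping described above can be sketched as follows. `normalize_level` is a hypothetical helper name, not OpenClaw's actual code:

```python
# `highest` and `max` are aliases for `high` (from the doc above).
ALIASES = {"highest": "high", "max": "high"}

def normalize_level(level: str, model_ref: str) -> str:
    """Resolve a requested thinking level for a given `provider/model` ref."""
    level = ALIASES.get(level, level)
    # Z.AI only supports binary thinking: any non-`off` level is treated
    # as `on`, which the doc says maps to `low`.
    if model_ref.startswith("zai/") and level != "off":
        return "low"
    return level

print(normalize_level("max", "openai/gpt-5"))  # high
print(normalize_level("high", "zai/glm"))      # low
print(normalize_level("off", "zai/glm"))       # off
```

The model ref `openai/gpt-5` and the Z.AI model name are placeholders for illustration only.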
@ -57,6 +58,7 @@ title: "Thinking Levels"

- For `openai/*`, fast mode maps to OpenAI priority processing by sending `service_tier=priority` on supported Responses requests.
- For `openai-codex/*`, fast mode sends the same `service_tier=priority` flag on Codex Responses. OpenClaw keeps one shared `/fast` toggle across both auth paths.
- For direct public `anthropic/*` requests, including OAuth-authenticated traffic sent to `api.anthropic.com`, fast mode maps to Anthropic service tiers: `/fast on` sets `service_tier=auto`, `/fast off` sets `service_tier=standard_only`.
- For `minimax/*` on the Anthropic-compatible path, `/fast on` (or `params.fastMode: true`) rewrites `MiniMax-M2.7` to `MiniMax-M2.7-highspeed`.
- Explicit Anthropic `serviceTier` / `service_tier` model params override the fast-mode default when both are set. OpenClaw still skips Anthropic service-tier injection for non-Anthropic proxy base URLs.

## Verbose directives (/verbose or /v)