Ollama announces it is an official model provider for OpenClaw: one-command onboarding with hybrid cloud and local model invocation

According to 1M AI News, Ollama, the tool for running large language models locally, today announced its integration as an official model provider for OpenClaw. Users can onboard by running openclaw onboard --auth-choice ollama, which enables all Ollama models to work seamlessly with OpenClaw.

With this integration, Ollama offers a "cloud + local" hybrid mode, letting users work with Ollama's cloud-hosted models and locally run models concurrently. OpenClaw's onboarding wizard automatically detects locally installed Ollama models and supports streaming and tool invocation through Ollama's native API. Peter Steinberger, the founder of OpenClaw, took part in reviewing the integration.
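For readers curious what "streaming and tool invocation through Ollama's native API" involves, the sketch below builds a request body for Ollama's local /api/chat endpoint (default http://localhost:11434). The model name and the get_weather tool are illustrative assumptions, not part of the announcement; the payload shape (messages, stream, and an OpenAI-style tools array) follows Ollama's documented chat API.

```python
import json

# Ollama's default local chat endpoint (assumption: default port, no auth)
OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"

def build_chat_request(model, user_message, tools=None, stream=True):
    """Build the JSON body for a request to Ollama's native /api/chat endpoint.

    With stream=True, Ollama returns the reply as newline-delimited JSON
    chunks, which is how a client such as OpenClaw can stream tokens.
    """
    body = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": stream,
    }
    if tools:
        # Tool definitions let the model emit structured tool calls
        # instead of plain text when it decides a tool is needed.
        body["tools"] = tools
    return body

# Hypothetical tool definition for illustration only.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

request_body = build_chat_request(
    "llama3.1", "What's the weather in Berlin?", tools=[weather_tool]
)
print(json.dumps(request_body, indent=2))
```

In practice a client would POST this body to OLLAMA_CHAT_URL and read the streamed chunks; the snippet stops at constructing the payload so it runs without a live Ollama server.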
