
Ollama v4: Distributed Inference Across Multiple Machines

By RekCore
Explaining everything happening in the AI world in plain, accessible language

Ollama v4 Brings Distributed Inference to Everyone

The Ollama project has released v4.0, a landmark update that introduces native distributed inference support. Users can now split large language model workloads across multiple machines on the same network, making it feasible to run 70B+ parameter models on clusters of consumer hardware.
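The excerpt doesn't describe Ollama's actual partitioning scheme, but the general idea behind splitting a large model across machines is layer-wise (pipeline) sharding: each machine owns a contiguous slice of the model's transformer layers, and activations are handed from one shard to the next. A minimal sketch of the layer-assignment step, with all names and numbers purely illustrative:

```python
# Illustrative sketch only -- not Ollama's real v4 API or protocol.
# Shows layer-wise sharding: divide a model's layers into contiguous
# blocks, one block per machine, as evenly as possible.

def split_layers(num_layers: int, num_machines: int) -> list[range]:
    """Assign a contiguous block of layers to each machine."""
    base, extra = divmod(num_layers, num_machines)
    shards, start = [], 0
    for i in range(num_machines):
        size = base + (1 if i < extra else 0)  # spread the remainder
        shards.append(range(start, start + size))
        start += size
    return shards

# An 80-layer model (typical depth for a 70B transformer) over 4 machines:
shards = split_layers(80, 4)
print([len(s) for s in shards])  # -> [20, 20, 20, 20]
```

During inference each token's activations would flow shard to shard in order, so per-machine memory drops roughly by a factor of the cluster size, at the cost of network hops between shards.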

Key Features