Ollama LLM
Ollama is a popular open-source command-line tool and engine that allows you to download quantized versions of the most popular LLM chat models.
Ollama is a separate application that you need to download and run before connecting to it. It supports running LLMs on both CPU and GPU.
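For example, once Ollama is installed and running, you can download a model either from the command line (`ollama pull llama3.1`) or through its local HTTP API. Below is a minimal Python sketch using the `/api/pull` endpoint; the model name `llama3.1` is only an example, and any model from the Ollama library works:

```python
import json
import urllib.request

# Ask the local Ollama server to download a model.
# "llama3.1" is an example name; substitute any model from the Ollama library.
request = urllib.request.Request(
    "http://127.0.0.1:11434/api/pull",
    data=json.dumps({"model": "llama3.1"}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Ollama streams download progress as one JSON object per line.
with urllib.request.urlopen(request) as response:
    for line in response:
        print(json.loads(line)["status"])
```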
Connecting to Ollama
When running Ollama locally with the default settings, connect to it at http://127.0.0.1:11434.
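A quick way to confirm that address is reachable is to query Ollama's local API for the models it has already downloaded. This is a minimal sketch using only the Python standard library and the default address above:

```python
import json
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434"  # default address when running Ollama locally

# Connectivity check: list the models Ollama has downloaded.
with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as response:
    models = json.loads(response.read())["models"]

for model in models:
    print(model["name"])
```

If this prints an empty list, Ollama is reachable but has no models downloaded yet; if the request fails to connect, make sure the Ollama application is running.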
You can switch to a different model at any time in the Settings.