LM Studio Embedder
⚠️ Heads up!
LM Studio's inference server can load multiple LLMs or a single embedding model, but not both at the same time. This means LM Studio cannot act as both your LLM and your embedder.
LM Studio supports LLM and embedding GGUF models from HuggingFace that can be run on CPU or GPU.
LM Studio is a separate application that you need to download and run before connecting to it.
Connecting to LM Studio
When running LM Studio locally, connect to it by first starting the built-in inference server.
You must explicitly load the embedding model before starting the inference server.
You can switch to a different model at any time in Settings.
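Before pointing AnythingLLM at LM Studio, it can help to confirm the inference server is actually serving embeddings. Below is a minimal sketch that calls LM Studio's OpenAI-compatible embeddings endpoint; it assumes the default server address (http://localhost:1234) and uses a hypothetical embedding model name, so adjust both to match your setup.

```ts
// Minimal sketch: verify the LM Studio inference server returns embeddings.
// Assumes the default address http://localhost:1234 and the OpenAI-compatible
// /v1/embeddings endpoint; the model name below is a placeholder.
async function checkLmStudioEmbeddings(): Promise<void> {
  const response = await fetch("http://localhost:1234/v1/embeddings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "your-embedding-model", // hypothetical: use the model you loaded
      input: "Hello from AnythingLLM",
    }),
  });

  if (!response.ok) {
    throw new Error(`Embedding request failed with status ${response.status}`);
  }

  const payload = await response.json();
  // OpenAI-style responses nest vectors under data[].embedding.
  console.log(`Received a ${payload.data[0].embedding.length}-dimension embedding`);
}

checkLmStudioEmbeddings().catch(console.error);
```

If this request succeeds, the same host, port, and model name are what you would enter in AnythingLLM's embedder settings.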