书栈网 · BookStack: this search took 0.018 seconds and returned 323 relevant results.
  • Running LLM models on a GPU with Ollama

    How to run LLM models with Ollama on a GPU 1. Install the NVIDIA container toolkit 2. Run Ollama with the GPU 3. Download a model with Ollama 4. Add the model in MaxKB's model settings to connect it How to run LLM models with Ollama on a GPU Note: running Ollama in GPU mode requires NVIDIA graphics card support. 1. Install the NVI...
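The guide above covers running Ollama itself; once it is up (GPU or CPU), MaxKB and other clients reach it over Ollama's HTTP API, which listens on port 11434 by default. A minimal sketch of such a call, assuming a model such as llama3 has already been pulled; the host and model name are placeholders:

```python
import requests

# Assumes a local Ollama server on its default port; adjust host/model as needed.
OLLAMA_URL = "http://localhost:11434/api/generate"

response = requests.post(
    OLLAMA_URL,
    json={
        "model": "llama3",   # any model previously pulled with `ollama pull`
        "prompt": "In one sentence, why does a GPU speed up LLM inference?",
        "stream": False,     # return one JSON object instead of a token stream
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])
```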
  • Overview

    Large Language Models Types of LLMs in AnythingLLM System LLM Workspace LLM Agent LLM Supported LLM Providers Local Language Model Providers Cloud Language Model Providers ...
  • Mistral

    Upstream Formats OpenAI Format Ollama Format Using the plugin with Mistral Prerequisites Set up route and plugin Test the configuration This guide walks you through setting...
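For the Kong AI Gateway guide above, the "Test the configuration" step amounts to sending an OpenAI-format chat request to the route the plugin is attached to. A minimal sketch; the gateway address and route path are assumptions, and authentication depends on how the plugin and the Mistral credentials were configured:

```python
import requests

# Assumed values: point these at your Kong proxy and the route you created.
GATEWAY_ROUTE = "http://localhost:8000/mistral-chat"

resp = requests.post(
    GATEWAY_ROUTE,
    json={
        # OpenAI-format chat body; the plugin forwards it to the Mistral upstream.
        "messages": [{"role": "user", "content": "Say hello in one short sentence."}]
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```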
  • OpenAI (generic)

    OpenAI (Generic) LLM Connecting to OpenAI (Generic) OpenAI (Generic) LLM ‼️ Caution! This is a developer-focused LLM provider - you should not use it unless you know what...
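Because the entry above targets any OpenAI-compatible endpoint rather than OpenAI itself, the connection can be exercised with the standard OpenAI client by overriding the base URL. A minimal sketch; the server URL, API key, and model name are placeholders for whatever compatible backend (vLLM, LocalAI, LM Studio, etc.) is being pointed at:

```python
from openai import OpenAI

# Assumed values for a generic OpenAI-compatible server.
client = OpenAI(
    base_url="http://localhost:8000/v1",          # the generic endpoint, not api.openai.com
    api_key="not-checked-by-many-local-servers",  # some backends ignore the key entirely
)

completion = client.chat.completions.create(
    model="my-local-model",
    messages=[{"role": "user", "content": "Give a one-line summary of RAG."}],
)
print(completion.choices[0].message.content)
```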
  • Overview

    Kong AI Gateway How to get started What is the Kong AI Gateway? AI Gateway capabilities AI Provider Proxy AI usage governance Data governance Prompt engineering Request transf...
  • Use Cases

    Use Cases Semantic Search LLM Orchestration Chains Retrieval augmented generation Language Model Workflows Use Cases The following sections introduce common txtai use cases...
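Of the txtai use cases listed above, semantic search is the shortest to demonstrate: build an Embeddings index over text and query it by meaning instead of keywords. A minimal sketch, assuming the sentence-transformers model shown (any embedding model txtai supports works):

```python
from txtai.embeddings import Embeddings

# Assumed embedding model; swap in any model txtai supports.
embeddings = Embeddings({"path": "sentence-transformers/all-MiniLM-L6-v2"})

data = [
    "Ollama runs large language models locally",
    "Kong AI Gateway proxies requests to LLM providers",
    "OpenSearch supports conversational search with RAG",
]

# Index (id, text, tags) tuples, then search by meaning rather than exact words.
embeddings.index([(uid, text, None) for uid, text in enumerate(data)])

uid, score = embeddings.search("local model runtime", 1)[0]
print(data[uid], score)
```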
  • Conversational search

    Conversational search Conversation history RAG Prerequisites Using conversational search Step 1: Create a connector to a model Step 2: Register and deploy the model Step 3: Cr...
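The first steps of the conversational-search setup above (create a connector, register the model, deploy it) are plain REST calls against the OpenSearch ML plugin. A heavily abbreviated sketch; the cluster address is assumed, and the connector body is only a skeleton that must be filled in from the provider-specific connector blueprint in the OpenSearch documentation:

```python
import requests

OPENSEARCH = "http://localhost:9200"   # assumed local, unsecured cluster

# Step 1: create a connector to the external model provider.
connector = requests.post(
    f"{OPENSEARCH}/_plugins/_ml/connectors/_create",
    json={
        "name": "chat-model-connector",
        "protocol": "http",
        "parameters": {},   # provider endpoint, model name, ... (from the blueprint)
        "credential": {},   # provider API key
        "actions": [],      # predict action mapping request/response formats
    },
).json()
connector_id = connector["connector_id"]

# Step 2: register a remote model backed by that connector (asynchronous;
# poll the returned task to obtain the model_id).
requests.post(
    f"{OPENSEARCH}/_plugins/_ml/models/_register",
    json={"name": "chat-model", "function_name": "remote", "connector_id": connector_id},
)

# Step 3: once the register task reports a model_id, deploy it:
#   POST /_plugins/_ml/models/<model_id>/_deploy
```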