---
title: Connecting DocsGPT to Cloud LLM Providers
description: Connect DocsGPT to various Cloud Large Language Model (LLM) providers to power your document Q&A.
---

# Connecting DocsGPT to Cloud LLM Providers

DocsGPT is designed to seamlessly integrate with a variety of Cloud Large Language Model (LLM) providers, giving you access to state-of-the-art AI models for document question answering.

## Configuration via `.env` file

The primary method for configuring your LLM provider in DocsGPT is through the `.env` file. For a comprehensive understanding of all available settings, please refer to the detailed [DocsGPT Settings Guide](/Deploying/DocsGPT-Settings).

To connect to a cloud LLM provider, you will typically need to configure the following basic settings in your `.env` file (a minimal combined example follows the list):

* **`LLM_NAME`**: This setting is essential and identifies the specific cloud provider you wish to use (e.g., `openai`, `google`, `anthropic`).
* **`MODEL_NAME`**: Specifies the exact model you want to utilize from your chosen provider (e.g., `gpt-4o`, `gemini-2.0-flash`, `claude-3-5-sonnet-latest`). Refer to your provider's documentation for a list of available models.
* **`API_KEY`**: Almost all cloud LLM providers require an API key for authentication. Obtain your API key from your chosen provider's platform and securely store it in your `.env` file.
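
Putting these together, a minimal `.env` sketch for an OpenAI-backed setup might look like the following (the key value is a placeholder, not a real credential):

```
LLM_NAME=openai
MODEL_NAME=gpt-4o
API_KEY=YOUR_OPENAI_API_KEY  # placeholder -- substitute your real key
```
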
## Explicitly Supported Cloud Providers

DocsGPT offers direct, streamlined support for the following cloud LLM providers, making configuration straightforward. The table below outlines the `LLM_NAME` and example `MODEL_NAME` values to use for each provider in your `.env` file.

| Provider                   | `LLM_NAME`     | Example `MODEL_NAME`               |
| :------------------------- | :------------- | :--------------------------------- |
| DocsGPT Public API         | `docsgpt`      | `None`                             |
| OpenAI                     | `openai`       | `gpt-4o`                           |
| Google (Vertex AI, Gemini) | `google`       | `gemini-2.0-flash`                 |
| Anthropic (Claude)         | `anthropic`    | `claude-3-5-sonnet-latest`         |
| Groq                       | `groq`         | `llama-3.1-8b-instant`             |
| HuggingFace Inference API  | `huggingface`  | `meta-llama/Llama-3.1-8B-Instruct` |
| Azure OpenAI               | `azure_openai` | `gpt-4o`                           |
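
For instance, based on the table above, pointing DocsGPT at Anthropic's Claude would mean a `.env` along these lines (the key is a placeholder):

```
LLM_NAME=anthropic
MODEL_NAME=claude-3-5-sonnet-latest
API_KEY=YOUR_ANTHROPIC_API_KEY  # placeholder -- use your Anthropic key
```
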
## Connecting to OpenAI-Compatible Cloud APIs

DocsGPT's flexible architecture allows you to connect to any cloud provider that offers an API compatible with the OpenAI API standard. This opens up a vast ecosystem of LLM services.

To connect to an OpenAI-compatible cloud provider, you still use `LLM_NAME=openai` in your `.env` file, but you also specify the provider's API endpoint via the `OPENAI_BASE_URL` setting. You will likely need to provide an `API_KEY` and `MODEL_NAME` as required by that provider as well.

**Example for DeepSeek (OpenAI-Compatible API):**

To connect to DeepSeek, which offers an OpenAI-compatible API, your `.env` file could be configured as follows:

```
LLM_NAME=openai
API_KEY=YOUR_API_KEY                          # Your DeepSeek API key
MODEL_NAME=deepseek-chat                      # Or your desired DeepSeek model name
OPENAI_BASE_URL=https://api.deepseek.com/v1   # DeepSeek's OpenAI API URL
```

Remember to consult the documentation of your chosen OpenAI-compatible cloud provider for their specific API endpoint, required model names, and authentication methods.

## Adding Support for Other Cloud Providers

If you wish to connect to a cloud provider that is not explicitly listed above and does not offer OpenAI API compatibility, you can extend DocsGPT to support it. In the DocsGPT repository, navigate to the `application/llm` directory, where Python files define the existing LLM integrations; use them as examples when creating a new module for your desired provider. After creating your new LLM module, register it in the `llm_creator.py` file. This process involves some coding, but it allows for virtually unlimited extensibility, letting DocsGPT connect to any cloud-based LLM service with an accessible API.
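
As a rough sketch only -- the class name, method name, and endpoint below are hypothetical, and the real base-class interface in `application/llm/base.py` may differ, so check the existing integrations before copying -- a new provider module could be shaped like this:

```
# application/llm/my_provider.py -- hypothetical sketch, not DocsGPT's actual API.
# Check application/llm/base.py and the existing modules for the real interface.
import requests

from application.llm.base import BaseLLM  # assumed import path


class MyProviderLLM(BaseLLM):
    def __init__(self, api_key=None, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.api_key = api_key

    def gen(self, model, messages, **kwargs):
        # Assumed hook: turn DocsGPT's message list into a request against
        # a hypothetical provider endpoint and return the generated text.
        response = requests.post(
            "https://api.example-provider.com/v1/chat",  # hypothetical endpoint
            headers={"Authorization": f"Bearer {self.api_key}"},
            json={"model": model, "messages": messages},
            timeout=60,
        )
        response.raise_for_status()
        return response.json()["output"]  # hypothetical response shape
```

You would then register the new class in `llm_creator.py`, e.g. by mapping the `LLM_NAME` value you plan to use in `.env` to `MyProviderLLM`.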