refactor: update docs LLM_NAME and MODEL_NAME to LLM_PROVIDER and LLM_NAME
@@ -13,15 +13,15 @@ The primary method for configuring your LLM provider in DocsGPT is through the `
To connect to a cloud LLM provider, you will typically need to configure the following basic settings in your `.env` file:
-* **`LLM_NAME`**: This setting is essential and identifies the specific cloud provider you wish to use (e.g., `openai`, `google`, `anthropic`).
-* **`MODEL_NAME`**: Specifies the exact model you want to utilize from your chosen provider (e.g., `gpt-4o`, `gemini-2.0-flash`, `claude-3-5-sonnet-latest`). Refer to your provider's documentation for a list of available models.
+* **`LLM_PROVIDER`**: This setting is essential and identifies the specific cloud provider you wish to use (e.g., `openai`, `google`, `anthropic`).
+* **`LLM_NAME`**: Specifies the exact model you want to utilize from your chosen provider (e.g., `gpt-4o`, `gemini-2.0-flash`, `claude-3-5-sonnet-latest`). Refer to your provider's documentation for a list of available models.
* **`API_KEY`**: Almost all cloud LLM providers require an API key for authentication. Obtain your API key from your chosen provider's platform and securely store it in your `.env` file.
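For instance, a minimal `.env` sketch combining these three settings might look like the following (all values are placeholders to replace with your own provider, model, and key):

```
LLM_PROVIDER=your_provider_here   # e.g. openai, google, anthropic
LLM_NAME=your_model_name_here     # e.g. gpt-4o, gemini-2.0-flash
API_KEY=YOUR_API_KEY              # key obtained from your provider's platform
```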
## Explicitly Supported Cloud Providers
-DocsGPT offers direct, streamlined support for the following cloud LLM providers, making configuration straightforward. The table below outlines the `LLM_NAME` and example `MODEL_NAME` values to use for each provider in your `.env` file.
+DocsGPT offers direct, streamlined support for the following cloud LLM providers, making configuration straightforward. The table below outlines the `LLM_PROVIDER` and example `LLM_NAME` values to use for each provider in your `.env` file.
-| Provider | `LLM_NAME` | Example `MODEL_NAME` |
+| Provider | `LLM_PROVIDER` | Example `LLM_NAME` |
| :--------------------------- | :------------- | :-------------------------- |
| DocsGPT Public API | `docsgpt` | `None` |
| OpenAI | `openai` | `gpt-4o` |
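For example, applying the OpenAI row of this table, the corresponding `.env` entries might read as follows (the key value is a placeholder for your own OpenAI API key):

```
LLM_PROVIDER=openai
LLM_NAME=gpt-4o
API_KEY=YOUR_API_KEY
```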
@@ -35,16 +35,16 @@ DocsGPT offers direct, streamlined support for the following cloud LLM providers
DocsGPT's flexible architecture allows you to connect to any cloud provider that offers an API compatible with the OpenAI API standard. This opens up a vast ecosystem of LLM services.
-To connect to an OpenAI-compatible cloud provider, you will still use `LLM_NAME=openai` in your `.env` file. However, you will also need to specify the API endpoint of your chosen provider using the `OPENAI_BASE_URL` setting. You will also likely need to provide an `API_KEY` and `MODEL_NAME` as required by that provider.
+To connect to an OpenAI-compatible cloud provider, you will still use `LLM_PROVIDER=openai` in your `.env` file. However, you will also need to specify the API endpoint of your chosen provider using the `OPENAI_BASE_URL` setting. You will also likely need to provide an `API_KEY` and `LLM_NAME` as required by that provider.
**Example for DeepSeek (OpenAI-Compatible API):**
To connect to DeepSeek, which offers an OpenAI-compatible API, your `.env` file could be configured as follows:
```
-LLM_NAME=openai
+LLM_PROVIDER=openai
API_KEY=YOUR_API_KEY # Your DeepSeek API key
-MODEL_NAME=deepseek-chat # Or your desired DeepSeek model name
+LLM_NAME=deepseek-chat # Or your desired DeepSeek model name
OPENAI_BASE_URL=https://api.deepseek.com/v1 # DeepSeek's OpenAI API URL
```