mirror of https://github.com/dalekurt/local-llm-stack.git
synced 2026-01-28 15:50:27 +00:00
Update README
README.md
@@ -2,6 +2,20 @@
 This repository contains a Docker Compose setup for running a local AI and automation environment with multiple services.

+## Overview
+
+The Local LLM Stack is a comprehensive, Docker-based environment that provides a complete ecosystem for working with Large Language Models (LLMs) locally. It integrates several powerful open-source tools that together offer document management, workflow automation, vector search, and LLM inference.
+
+This stack is designed with flexibility in mind, allowing you to use either a locally installed Ollama instance or to run Ollama as a containerized service. All components are configured to work together out of the box, with sensible defaults that can be customized to suit your specific needs.
+
+Key features of this stack include:
+- Document processing and management with AnythingLLM
+- AI workflow automation with Flowise
+- Direct LLM interaction through Open WebUI
+- Workflow automation with n8n
+- Vector storage with Qdrant
+- Seamless integration with Ollama for LLM inference
+
 ## Services

 ### Flowise
@@ -37,6 +51,13 @@ This repository contains a Docker Compose setup for running a local AI and autom
 - Database used by n8n
 - Port: 5432

+### Ollama (Optional)
+- LLM inference engine
+- Port: 11434
+- API endpoint: http://localhost:11434
+- Can be run natively (default) or as a containerized service
+- Used by AnythingLLM and can be used by Flowise
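As a quick sanity check that the API endpoint above is reachable, you can query Ollama's HTTP API directly (a sketch; the model name is only an example and must already have been pulled):

```
# List the models available to this Ollama instance
curl http://localhost:11434/api/tags

# Request a completion (example model; pull it first with `ollama pull llama3.2`)
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3.2", "prompt": "Hello", "stream": false}'
```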
 ## Prerequisites

 - Docker and Docker Compose installed on your system
@@ -47,8 +68,8 @@ This repository contains a Docker Compose setup for running a local AI and autom
 1. Clone this repository:
 ```
-git clone https://github.com/yourusername/docker-local-ai-llm.git
-cd docker-local-ai-llm
+git clone https://github.com/yourusername/local-llm-stack.git
+cd local-llm-stack
 ```

 2. Create a `.env` file based on the provided `.env.sample` file:
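For step 2, the usual pattern is to copy the sample file and then adjust values (a sketch; the authoritative variable list is whatever `.env.sample` actually contains):

```
cp .env.sample .env
# e.g. point the stack at a natively installed Ollama:
#   OLLAMA_BASE_PATH=http://host.docker.internal:11434
#   EMBEDDING_BASE_PATH=http://host.docker.internal:11434
```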
@@ -60,7 +81,10 @@ This repository contains a Docker Compose setup for running a local AI and autom
 4. Ollama Configuration:
    - By default, the setup is configured to use a locally installed Ollama instance
-   - If you want to run Ollama as a container, uncomment the Ollama service in docker-compose.yml and update the OLLAMA_BASE_PATH and EMBEDDING_BASE_PATH in .env to use http://ollama:11434
+   - If you want to run Ollama as a container, uncomment the following in docker-compose.yml:
+     - The `ollama_storage` volume in the volumes section
+     - The entire `ollama` service definition
+   - Update the OLLAMA_BASE_PATH and EMBEDDING_BASE_PATH in .env to use http://ollama:11434

 5. Start the services:
 ```
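# Typical start command (an assumption; the rest of this block is truncated here):
docker compose up -d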
@@ -101,6 +125,20 @@ AnythingLLM is configured to use:
 - Ollama for LLM capabilities and embeddings
 - The configuration can be adjusted in the .env file

+### Ollama Configuration
+
+Ollama can be configured in two ways:
+1. **Native Installation (Default)**:
+   - Install Ollama directly on your host machine
+   - The services are configured to access Ollama via host.docker.internal
+   - Requires Ollama to be running on your host (`ollama serve`)
+
+2. **Containerized Installation**:
+   - Uncomment the Ollama service in docker-compose.yml
+   - Uncomment the ollama_storage volume
+   - Update the OLLAMA_BASE_PATH and EMBEDDING_BASE_PATH in .env to http://ollama:11434
+   - No need to run Ollama separately on your host
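For the containerized route, the pieces being uncommented would look roughly like this (a sketch based on the instructions above, not the repository's verbatim definition; the image tag and mount path follow Ollama's published Docker image):

```
services:
  ollama:
    image: ollama/ollama              # official image on Docker Hub
    ports:
      - "11434:11434"                 # same port as a native install
    volumes:
      - ollama_storage:/root/.ollama  # keeps pulled models across restarts

volumes:
  ollama_storage:
```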
 ## Troubleshooting

 If you encounter any issues:
@@ -121,6 +159,3 @@
 5. If you're having connection issues between containers and your host machine, check that the `extra_hosts` configuration is correctly set in docker-compose.yml.
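The stanza it refers to is typically of this shape on any service that must reach the host (a sketch; the service name is illustrative, and `host-gateway` is Docker's standard alias for the host machine):

```
services:
  anythingllm:                        # illustrative; apply to each service that calls the host
    extra_hosts:
      - "host.docker.internal:host-gateway"
```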
-## License
-
-[Specify your license here]