Initial commit: Docker-based local LLM environment

This commit sets up a complete Docker Compose environment for running local AI and LLM services:

Features:
- AnythingLLM with Qdrant vector database integration
- Flowise AI workflow automation
- Open WebUI for Ollama interaction
- n8n workflow automation platform
- Qdrant vector database
- PostgreSQL database for n8n

Configuration:
- Environment variables for all services
- Support for both native and containerized Ollama
- Proper data persistence with Docker volumes
- Network configuration for inter-service communication
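The configuration points above can be sketched as a Compose excerpt. This is a hypothetical illustration, not the repository's actual docker-compose.yml: the image tags, port, environment variable names, and volume paths are assumptions based on the services listed in this commit.

```yaml
# Hypothetical sketch of the service wiring described above.
# Service names, images, and paths are assumptions, not verbatim from this commit.
services:
  qdrant:
    image: qdrant/qdrant
    volumes:
      - ./qdrant_storage:/qdrant/storage   # data persistence via a host-mounted volume

  postgres:
    image: postgres:16
    env_file: .env                          # credentials come from environment variables
    volumes:
      - ./postgres_storage:/var/lib/postgresql/data

  n8n:
    image: n8nio/n8n
    depends_on:
      - postgres
    environment:
      - DB_TYPE=postgresdb
      - DB_POSTGRESDB_HOST=postgres         # inter-service DNS name on the Compose network
    ports:
      - "5678:5678"
    volumes:
      - ./n8n_storage:/home/node/.n8n
```

All services on one Compose file share a default network, so containers reach each other by service name (e.g. n8n reaching `postgres`); a natively installed Ollama would instead be reached from containers via `host.docker.internal`.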

Documentation:
- Comprehensive README with setup instructions
- Troubleshooting guidance
- Service descriptions and port mappings

Directory structure:
- Organized directories for each service
- .gitignore configured to track directories but ignore contents
- Sample environment configuration
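The "track directories but ignore contents" pattern can be verified with `git check-ignore`. A minimal sketch using the n8n rules from the diff below; the temporary repository, file names, and echo message are illustrative:

```shell
#!/usr/bin/env bash
# Demonstrate the gitignore pattern: ignore a directory's contents
# but re-include the directory and its .gitkeep placeholder.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
mkdir -p n8n/backup
cat > .gitignore <<'EOF'
n8n/*
!n8n/backup/
n8n/backup/*
!n8n/backup/.gitkeep
EOF
touch n8n/data.sqlite n8n/backup/.gitkeep n8n/backup/dump.json

git check-ignore -q n8n/data.sqlite        # ignored by n8n/*          -> exits 0
git check-ignore -q n8n/backup/dump.json   # ignored by n8n/backup/*   -> exits 0
! git check-ignore -q n8n/backup/.gitkeep  # re-included, not ignored  -> check-ignore exits 1
echo "pattern behaves as expected"
```

The trailing-slash negation `!n8n/backup/` matters: git cannot re-include a file if its parent directory is still excluded, so the directory must be un-ignored before `.gitkeep` can be.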
commit 125a71fc91 (parent 71dae881d0)
Author: Dale-Kurt Murray
Date:   2025-03-10 22:40:34 -04:00
4448 changed files with 26 additions and 493050 deletions

.gitignore

@@ -1,5 +1,6 @@
# Ignore environment variables
.env
.env.sample
# Keep directories but ignore their contents
shared/*
@@ -10,3 +11,28 @@ n8n/*
!n8n/backup/
n8n/backup/*
!n8n/backup/.gitkeep
flowise/*
!flowise/.gitkeep
n8n-tool-workflows/*
!n8n-tool-workflows/.gitkeep
# Docker volumes
postgres_storage/
n8n_storage/
qdrant_storage/
open-webui/
flowise/
anythingllm_storage/
# ollama_storage/ # Uncomment if using containerized Ollama
# OS specific files
.DS_Store
Thumbs.db
# Editor directories and files
.idea/
.vscode/
*.swp
*.swo