Dale-Kurt Murray 125a71fc91 Initial commit: Docker-based local LLM environment (2025-03-10 22:40:34 -04:00)
This commit sets up a complete Docker Compose environment for running local AI and LLM services; a trimmed compose sketch follows the feature list:

Features:
- AnythingLLM with Qdrant vector database integration
- Flowise AI workflow automation
- Open WebUI for Ollama interaction
- n8n workflow automation platform
- Qdrant vector database
- PostgreSQL database for n8n
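
As a rough illustration of how these services might be wired together, here is a trimmed docker-compose.yml sketch. The image names and container ports follow the upstream defaults for each project; the service names, host-side ports, bind-mount paths, and the llm-net network name are assumptions for illustration, not taken from the actual file in this commit.

services:
  qdrant:
    image: qdrant/qdrant:latest            # vector database used by AnythingLLM
    ports:
      - "6333:6333"
    volumes:
      - ./qdrant/storage:/qdrant/storage   # persist collections across restarts
    networks:
      - llm-net

  anythingllm:
    image: mintplexlabs/anythingllm:latest
    ports:
      - "3001:3001"
    environment:
      - VECTOR_DB=qdrant                   # tell AnythingLLM to use the Qdrant service
      - QDRANT_ENDPOINT=http://qdrant:6333 # reachable by service name on the shared network
    volumes:
      - ./anythingllm/storage:/app/server/storage
    depends_on:
      - qdrant
    networks:
      - llm-net

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=${OLLAMA_BASE_URL} # native or containerized Ollama, see Configuration
    volumes:
      - ./open-webui/data:/app/backend/data
    networks:
      - llm-net

networks:
  llm-net:
    driver: bridge

Flowise, n8n, and PostgreSQL would follow the same pattern: one service block each, a bind mount or named volume for persistence, and membership in the shared network so the services can reach each other by name.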

Configuration:
- Environment variables for all services
- Support for both native and containerized Ollama (see the example after this list)
- Proper data persistence with Docker volumes
- Network configuration for inter-service communication
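
For the Ollama split specifically, a single environment variable is usually enough to switch between the two modes. The variable name OLLAMA_BASE_URL matches the Open WebUI convention; the exact values below are assumptions about how this setup might be configured, not taken from the committed .env file.

# Native Ollama running directly on the host (works out of the box on Docker Desktop;
# on Linux the container needs an extra_hosts entry mapping host.docker.internal to host-gateway):
OLLAMA_BASE_URL=http://host.docker.internal:11434

# Containerized Ollama running as a compose service named "ollama" on the shared network:
# OLLAMA_BASE_URL=http://ollama:11434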

Documentation:
- Comprehensive README with setup instructions
- Troubleshooting guidance
- Service descriptions and port mappings

Directory structure:
- Organized directories for each service
- .gitignore rules that keep each service directory in the repo while ignoring its runtime contents (pattern shown below)
- Sample environment configuration
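
The directory-tracking setup described above is commonly achieved with an ignore-everything-except-a-placeholder pattern. The directory names below mirror the services listed earlier and the .gitkeep placeholder name is an assumption; the committed .gitignore may differ in detail.

# Ignore runtime data inside each service directory...
qdrant/*
n8n/*
postgres/*

# ...but keep an empty placeholder so the directory itself stays in the repo.
!qdrant/.gitkeep
!n8n/.gitkeep
!postgres/.gitkeep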