This commit sets up a complete Docker Compose environment for running local AI and LLM services:
Features:
- AnythingLLM with Qdrant vector database integration
- Flowise AI workflow automation
- Open WebUI for Ollama interaction
- n8n workflow automation platform
- Qdrant vector database
- PostgreSQL database for n8n
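The service list above can be sketched as a compose skeleton. This is a hedged reconstruction, not the committed file: image tags, ports, and service names are assumptions based on each project's common defaults.

```yaml
# Sketch of the services this commit wires together (images/ports assumed).
services:
  anythingllm:
    image: mintplexlabs/anythingllm
    ports: ["3001:3001"]
  flowise:
    image: flowiseai/flowise
    ports: ["3000:3000"]
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports: ["8080:8080"]
  n8n:
    image: n8nio/n8n
    ports: ["5678:5678"]
    depends_on: [postgres]
  qdrant:
    image: qdrant/qdrant
    ports: ["6333:6333"]
  postgres:
    image: postgres:16
```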
Configuration:
- Environment variables for all services
- Support for both native and containerized Ollama
- Data persistence via named Docker volumes
- Network configuration for inter-service communication
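The persistence, networking, and native-vs-containerized Ollama points might look like the following fragment. Variable names, volume paths, and the `ai-stack` network name are illustrative assumptions.

```yaml
# Sketch: one service showing env vars, volumes, and the shared network.
services:
  open-webui:
    environment:
      # Native Ollama on the host: reach it via host.docker.internal.
      # For containerized Ollama, point at the Ollama service name instead.
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    extra_hosts:
      - "host.docker.internal:host-gateway"
    volumes:
      - open-webui-data:/app/backend/data  # survives container recreation
    networks: [ai-stack]

volumes:
  open-webui-data:

networks:
  ai-stack:  # shared network so services resolve each other by name
```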
Documentation:
- Comprehensive README with setup instructions
- Troubleshooting guidance
- Service descriptions and port mappings
Directory structure:
- Organized directories for each service
- .gitignore rules that keep each service directory in version control while ignoring its runtime contents
- Sample environment configuration
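The "track directories but ignore contents" setup is typically done with a negation pattern plus a placeholder file; directory and file names here are assumptions for illustration.

```gitignore
# Sketch: ignore a service's runtime data but keep the directory tracked.
qdrant/storage/*
!qdrant/storage/.gitkeep
```

Git does not track empty directories, so an empty `.gitkeep` (any name works) is committed and explicitly un-ignored.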