Mirror of https://github.com/dalekurt/local-llm-stack.git, synced 2026-01-28 15:50:27 +00:00
This commit sets up a complete Docker Compose environment for running local AI and LLM services.

Features:
- AnythingLLM with Qdrant vector database integration
- Flowise AI workflow automation
- Open WebUI for Ollama interaction
- n8n workflow automation platform
- Qdrant vector database
- PostgreSQL database for n8n

Configuration:
- Environment variables for all services
- Support for both native and containerized Ollama
- Proper data persistence with Docker volumes
- Network configuration for inter-service communication

Documentation:
- Comprehensive README with setup instructions
- Troubleshooting guidance
- Service descriptions and port mappings

Directory structure:
- Organized directories for each service
- .gitignore configured to track directories but ignore contents
- Sample environment configuration
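The configuration points above (per-service environment variables, Docker volumes for persistence, and a shared network for inter-service communication) can be sketched as a docker-compose excerpt. This is a minimal illustration, not the repository's actual file: the image tags, ports, volume paths, and the `llm-net` network name are assumptions for the sketch, and only two of the listed services are shown.

```yaml
# Hypothetical docker-compose.yml excerpt illustrating the setup described
# in the commit message. Image names, ports, and paths are assumptions;
# see the repository for the real configuration.
services:
  qdrant:
    image: qdrant/qdrant
    volumes:
      # Bind-mount a local directory so vector data persists across restarts.
      - ./qdrant:/qdrant/storage
    networks:
      - llm-net

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      # Supports either a native Ollama on the host (as below) or a
      # containerized Ollama service reachable on llm-net.
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    networks:
      - llm-net

networks:
  # One shared bridge network lets services reach each other by service name.
  llm-net:
    driver: bridge
```

With a layout like this, `docker compose up -d` would start the services, and other containers on `llm-net` could reach Qdrant at `http://qdrant:6333` by service name.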