Mirror of https://github.com/dalekurt/local-llm-stack.git, synced 2026-03-02 00:01:01 +00:00
Initial commit: Docker-based local LLM environment
This commit sets up a complete Docker Compose environment for running local AI and LLM services.

Features:
- AnythingLLM with Qdrant vector database integration
- Flowise AI workflow automation
- Open WebUI for Ollama interaction
- n8n workflow automation platform
- Qdrant vector database
- PostgreSQL database for n8n

Configuration:
- Environment variables for all services
- Support for both native and containerized Ollama
- Proper data persistence with Docker volumes
- Network configuration for inter-service communication

Documentation:
- Comprehensive README with setup instructions
- Troubleshooting guidance
- Service descriptions and port mappings

Directory structure:
- Organized directories for each service
- .gitignore configured to track directories but ignore their contents
- Sample environment configuration
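The services named in the commit message could be wired together roughly as follows. This is a hypothetical sketch, not the repository's actual `docker-compose.yml`: image names, ports, and volume paths are assumptions based only on the services and storage directories listed above.

```yaml
# Hypothetical sketch of the services named in the commit message.
# Image tags, ports, and volume paths are assumptions, not the repo's file.
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    volumes:
      - ./postgres_storage:/var/lib/postgresql/data
  qdrant:
    image: qdrant/qdrant
    volumes:
      - ./qdrant_storage:/qdrant/storage
  n8n:
    image: n8nio/n8n
    ports: ["5678:5678"]
    depends_on: [postgres]
    volumes:
      - ./n8n_storage:/home/node/.n8n
  flowise:
    image: flowiseai/flowise
    ports: ["3001:3000"]
    volumes:
      - ./flowise:/root/.flowise
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports: ["3000:8080"]
    volumes:
      - ./open-webui:/app/backend/data
  anythingllm:
    image: mintplexlabs/anythingllm
    ports: ["3002:3001"]
    depends_on: [qdrant]
    volumes:
      - ./anythingllm_storage:/app/server/storage
```

All services share the default Compose network, so containers can reach each other by service name (e.g. `qdrant:6333` from AnythingLLM); host-side ports would need adjusting to avoid conflicts.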
.gitignore (vendored): 26 lines changed
```
@@ -1,5 +1,6 @@
# Ignore environment variables
.env
.env.sample

# Keep directories but ignore their contents
shared/*
@@ -10,3 +11,28 @@ n8n/*
!n8n/backup/
n8n/backup/*
!n8n/backup/.gitkeep

flowise/*
!flowise/.gitkeep

n8n-tool-workflows/*
!n8n-tool-workflows/.gitkeep

# Docker volumes
postgres_storage/
n8n_storage/
qdrant_storage/
open-webui/
flowise/
anythingllm_storage/
# ollama_storage/ # Uncomment if using containerized Ollama

# OS specific files
.DS_Store
Thumbs.db

# Editor directories and files
.idea/
.vscode/
*.swp
*.swo
```
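The `dir/*` plus `!dir/.gitkeep` pairs above keep a directory present in the repository while ignoring everything inside it. The behavior can be checked in a throwaway repository; the file names here are illustrative, requiring only `git` on the PATH:

```shell
# Demonstrate the "keep directory, ignore contents" .gitignore pattern
# in a throwaway repository (file names are illustrative).
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q

mkdir -p flowise
printf 'flowise/*\n!flowise/.gitkeep\n' > .gitignore
touch flowise/.gitkeep flowise/database.sqlite

# Directory contents match flowise/* and are ignored;
# .gitkeep is re-included by the ! negation and stays trackable.
ignored=$(git check-ignore flowise/database.sqlite || true)
kept=$(git check-ignore flowise/.gitkeep >/dev/null 2>&1 && echo ignored || echo tracked)
echo "$ignored is ignored; .gitkeep is $kept"
```

Note the order matters: the `!flowise/.gitkeep` negation must come after the `flowise/*` rule it overrides.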