local-llm-stack/.gitignore
Dale-Kurt Murray 125a71fc91 Initial commit: Docker-based local LLM environment
This commit sets up a complete Docker Compose environment for running local AI and LLM services:

Features:
- AnythingLLM with Qdrant vector database integration
- Flowise AI workflow automation
- Open WebUI for Ollama interaction
- n8n workflow automation platform
- Qdrant vector database
- PostgreSQL database for n8n
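The services above are wired together in docker-compose.yml. A minimal sketch of what two of those service definitions might look like follows; the image tags, host ports, volume paths, and the llm-net network name are illustrative assumptions rather than values taken from this commit:

# Hypothetical docker-compose.yml excerpt (illustrative only)
services:
  qdrant:
    image: qdrant/qdrant
    ports:
      - "6333:6333"                 # assumed default Qdrant REST port
    volumes:
      - ./qdrant_storage:/qdrant/storage
    networks:
      - llm-net

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      - OLLAMA_BASE_URL=${OLLAMA_BASE_URL}   # assumed env var, see Configuration
    ports:
      - "3000:8080"                 # assumed host:container mapping
    volumes:
      - ./open-webui:/app/backend/data
    networks:
      - llm-net

networks:
  llm-net:
    driver: bridge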

Configuration:
- Environment variables for all services
- Support for both native and containerized Ollama
- Proper data persistence with Docker volumes
- Network configuration for inter-service communication
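To illustrate the native-versus-containerized Ollama option, a hypothetical compose fragment might look like the following; the OLLAMA_BASE_URL variable, the host-gateway mapping, and the commented ollama service are assumptions for this sketch, not contents of the commit:

# Hypothetical excerpt: containers reach a natively installed Ollama on the host
services:
  open-webui:
    environment:
      # Assumed default pointing at the host's Ollama instance
      - OLLAMA_BASE_URL=${OLLAMA_BASE_URL:-http://host.docker.internal:11434}
    extra_hosts:
      - "host.docker.internal:host-gateway"   # required on Linux for native Ollama

  # For containerized Ollama instead: add an ollama service, point
  # OLLAMA_BASE_URL at http://ollama:11434, and uncomment ollama_storage
  # in .gitignore.
  # ollama:
  #   image: ollama/ollama
  #   volumes:
  #     - ./ollama_storage:/root/.ollama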

Documentation:
- Comprehensive README with setup instructions
- Troubleshooting guidance
- Service descriptions and port mappings

Directory structure:
- Organized directories for each service
- .gitignore configured to track directories but ignore their contents
- Sample environment configuration
2025-03-10 22:40:34 -04:00

# Ignore environment variables
.env
# Keep the sample environment file tracked
!.env.sample
# Keep directories but ignore their contents
shared/*
!shared/.gitkeep
n8n/*
!n8n/.gitkeep
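# Re-include the backup directory itself before its contents: Git does not
# descend into an excluded directory, so the negations below would have no
# effect without this line.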
!n8n/backup/
n8n/backup/*
!n8n/backup/.gitkeep
flowise/*
!flowise/.gitkeep
n8n-tool-workflows/*
!n8n-tool-workflows/.gitkeep
# Docker volumes
postgres_storage/
n8n_storage/
qdrant_storage/
open-webui/
anythingllm_storage/
# ollama_storage/ # Uncomment if using containerized Ollama
# OS specific files
.DS_Store
Thumbs.db
# Editor directories and files
.idea/
.vscode/
*.swp
*.swo