Files
DocsGPT/application/streaming/keys.py
Alex ed9444cf3d feat: SSE notification system
Adds a per-user SSE pipe (GET /api/events) plus a per-message
chat-stream reconnect endpoint (GET /api/messages/<id>/events).

Backend substrate:
- application/events/ — durable journal (Redis Streams) + live
  pub/sub for user-scoped events, with publish_user_event() as
  the worker-side entrypoint.
- application/streaming/ — broadcast_channel for pub/sub fanout
  and event_replay for the per-message snapshot+tail path.
- application/storage/db/repositories/message_events.py +
  alembic 0007 — Postgres journal for chat-stream events.
- application/worker.py — ingest/reingest/remote/connector/
  attachment/mcp_oauth tasks publish queued/progress/completed/
  failed envelopes alongside their existing status updates.

Frontend client:
- frontend/src/events/ — connect/reconnect, Last-Event-ID cursor,
  backoff with jitter. Each tab runs its own connection; no
  cross-tab dedup (future work).
- frontend/src/notifications/ — recentEvents ring, cursor
  tracking, tool-approval toast.
- frontend/src/upload/uploadSlice.ts — extraReducers for
  source.ingest.* and attachment.* events.
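The reconnect policy the client list above mentions (backoff with jitter) can be sketched in a few lines; the function name, base, and cap below are illustrative assumptions, not the repository's actual constants.

```python
import random


def backoff_delay(attempt: int, base: float = 0.5, cap: float = 30.0) -> float:
    """Exponential backoff with full jitter (sketch of the client policy).

    Returns a random delay in [0, min(cap, base * 2**attempt)] so that many
    tabs reconnecting after the same outage do not stampede the server.
    """
    return random.uniform(0.0, min(cap, base * 2 ** attempt))
```

Full jitter (rather than jittering around a fixed point) keeps the retry load spread evenly even when every client disconnects at the same instant.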

Coverage: 132 SSE tests across events substrate, replay, journal,
routes, and worker publishes.
2026-05-12 14:29:45 +01:00
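The worker-side publish path the commit describes (lifecycle envelopes fanned out via `publish_user_event()`) can be sketched roughly as below; the envelope fields (`type`, `status`, `payload`, `ts`) and the function signatures are assumptions for illustration, not the repository's actual schema.

```python
import json
import time
from typing import Any

# Lifecycle statuses the commit message names for worker-task events.
STATUSES = ("queued", "progress", "completed", "failed")


def build_envelope(event_type: str, status: str, payload: dict[str, Any]) -> dict[str, Any]:
    """Assemble one user-scoped event envelope (hypothetical schema)."""
    if status not in STATUSES:
        raise ValueError(f"unknown status: {status}")
    return {
        "type": event_type,   # e.g. "source.ingest" or "attachment"
        "status": status,     # queued | progress | completed | failed
        "payload": payload,   # task-specific details
        "ts": time.time(),    # publish timestamp
    }


def publish_user_event(channel: Any, user_id: str, envelope: dict[str, Any]) -> None:
    """Fan the envelope out on a per-user topic (sketch of the entrypoint)."""
    channel.publish(f"user:{user_id}", json.dumps(envelope))
```

In the real system the channel would be backed by Redis Streams plus live pub/sub; here any object with a `publish(topic, data)` method works, which also makes the sketch easy to exercise with a fake.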

20 lines
707 B
Python

"""Per-chat-message stream key derivations.
Single source of truth for the Redis pub/sub topic name and any
auxiliary keys that the chat-stream snapshot+tail reconnect path
shares between the writer (``complete_stream`` + journal) and the
reader (``/api/messages/<id>/events`` reconnect endpoint).
"""
from __future__ import annotations
def message_topic_name(message_id: str) -> str:
"""Redis pub/sub channel for live fan-out of one chat message.
Subscribers tail this topic for every event that ``complete_stream``
yielded after the SUBSCRIBE-ack arrived; older events are recovered
from the ``message_events`` snapshot half of the pattern.
"""
return f"channel:{message_id}"