Compare commits

...

42 Commits

Author SHA1 Message Date
Alex
107eb56b1d fix: mini cors hardening 2026-04-23 11:58:20 +01:00
Alex
5c07f5f340 fix: asgi issues 2026-04-23 10:41:27 +01:00
Alex
7ad78d4219 feat: asgi and mcp tool server 2026-04-22 22:19:20 +01:00
Alex
a5153d5212 feat: asgi and search service 2026-04-22 09:12:29 +01:00
Alex
d4b1c1fd81 chore: 0.17.0 version 2026-04-21 16:16:11 +01:00
Alex
2de84acf81 fix: mini callout 2026-04-21 16:14:08 +01:00
Alex
2702750861 docs: upgrading guide 2026-04-21 15:04:17 +01:00
Alex
2b5f20d0ec fix: safer version 2026-04-21 14:22:32 +01:00
Alex
619b41dc5b fix: better version fetch 2026-04-21 14:07:26 +01:00
Alex
76d8f49ccb feat: security version check 2026-04-21 09:16:52 +01:00
Manish Madan
9fe96fb50f Merge pull request #2397 from arc53/dependabot/npm_and_yarn/extensions/react-widget/multi-0193e73c84
chore(deps): bump react and @types/react in /extensions/react-widget
2026-04-21 01:35:29 +05:30
Alex
08822c3379 feat: lazy pymongo 2026-04-20 15:58:02 +01:00
ManishMadan2882
68ca8ff9ea (chore) update react-dom 2026-04-20 19:30:14 +05:30
dependabot[bot]
81be3cdccc chore(deps): bump react and @types/react in /extensions/react-widget
Bumps [react](https://github.com/facebook/react/tree/HEAD/packages/react) and [@types/react](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/react). These dependencies needed to be updated together.

Updates `react` from 18.3.1 to 19.2.5
- [Release notes](https://github.com/facebook/react/releases)
- [Changelog](https://github.com/facebook/react/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebook/react/commits/v19.2.5/packages/react)

Updates `@types/react` from 18.3.3 to 19.2.14
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/react)

---
updated-dependencies:
- dependency-name: react
  dependency-version: 19.2.5
  dependency-type: direct:production
  update-type: version-update:semver-major
- dependency-name: "@types/react"
  dependency-version: 19.2.14
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-20 13:20:46 +00:00
Manish Madan
3ceabed8ad Merge pull request #2406 from arc53/dependabot/npm_and_yarn/extensions/react-widget/npm_and_yarn-2a73d1bbcf
chore(deps): bump dompurify from 3.3.3 to 3.4.0 in /extensions/react-widget in the npm_and_yarn group across 1 directory
2026-04-20 14:45:32 +05:30
dependabot[bot]
422a4b139e chore(deps): bump dompurify
Bumps the npm_and_yarn group with 1 update in the /extensions/react-widget directory: [dompurify](https://github.com/cure53/DOMPurify).


Updates `dompurify` from 3.3.3 to 3.4.0
- [Release notes](https://github.com/cure53/DOMPurify/releases)
- [Commits](https://github.com/cure53/DOMPurify/compare/3.3.3...3.4.0)

---
updated-dependencies:
- dependency-name: dompurify
  dependency-version: 3.4.0
  dependency-type: direct:production
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-20 08:58:53 +00:00
Manish Madan
e85935eed0 Merge pull request #2402 from arc53/dependabot/npm_and_yarn/extensions/react-widget/flow-bin-0.309.0
chore(deps): bump flow-bin from 0.306.0 to 0.309.0 in /extensions/react-widget
2026-04-20 14:27:30 +05:30
dependabot[bot]
6a69b8aca0 chore(deps): bump flow-bin in /extensions/react-widget
Bumps [flow-bin](https://github.com/flowtype/flow-bin) from 0.306.0 to 0.309.0.
- [Release notes](https://github.com/flowtype/flow-bin/releases)
- [Commits](https://github.com/flowtype/flow-bin/commits)

---
updated-dependencies:
- dependency-name: flow-bin
  dependency-version: 0.309.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-20 08:55:07 +00:00
Manish Madan
33c2cc9660 Merge pull request #2400 from arc53/dependabot/npm_and_yarn/extensions/react-widget/styled-components-6.4.0
chore(deps): bump styled-components from 6.3.12 to 6.4.0 in /extensions/react-widget
2026-04-20 13:30:30 +05:30
dependabot[bot]
175d4d5a68 chore(deps): bump styled-components in /extensions/react-widget
Bumps [styled-components](https://github.com/styled-components/styled-components) from 6.3.12 to 6.4.0.
- [Release notes](https://github.com/styled-components/styled-components/releases)
- [Commits](https://github.com/styled-components/styled-components/compare/styled-components@6.3.12...styled-components@6.4.0)

---
updated-dependencies:
- dependency-name: styled-components
  dependency-version: 6.4.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-20 07:59:02 +00:00
Manish Madan
6c3ead1071 Merge pull request #2413 from arc53/dependabot/npm_and_yarn/docs/npm_and_yarn-a087653e68
chore(deps): bump the npm_and_yarn group across 1 directory with 2 updates
2026-04-20 00:53:14 +05:30
dependabot[bot]
d23f88f825 chore(deps): bump the npm_and_yarn group across 1 directory with 2 updates
Bumps the npm_and_yarn group with 2 updates in the /docs directory: [@xmldom/xmldom](https://github.com/xmldom/xmldom) and [lodash-es](https://github.com/lodash/lodash).


Updates `@xmldom/xmldom` from 0.9.8 to 0.9.9
- [Release notes](https://github.com/xmldom/xmldom/releases)
- [Changelog](https://github.com/xmldom/xmldom/blob/master/CHANGELOG.md)
- [Commits](https://github.com/xmldom/xmldom/compare/0.9.8...0.9.9)

Updates `lodash-es` from 4.17.23 to 4.18.1
- [Release notes](https://github.com/lodash/lodash/releases)
- [Commits](https://github.com/lodash/lodash/compare/4.17.23...4.18.1)

---
updated-dependencies:
- dependency-name: "@xmldom/xmldom"
  dependency-version: 0.9.9
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: lodash-es
  dependency-version: 4.18.1
  dependency-type: indirect
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-19 17:55:14 +00:00
Manish Madan
da1df515f7 Merge pull request #2411 from arc53/dependabot/npm_and_yarn/docs/npm_and_yarn-bfabbafa3d
chore(deps): bump the npm_and_yarn group across 3 directories with 10 updates
2026-04-19 23:23:34 +05:30
dependabot[bot]
671a9d75ad chore(deps): bump the npm_and_yarn group across 3 directories with 10 updates
Bumps the npm_and_yarn group with 6 updates in the /docs directory:

| Package | From | To |
| --- | --- | --- |
| [next](https://github.com/vercel/next.js) | `15.5.9` | `15.5.15` |
| [brace-expansion](https://github.com/juliangruber/brace-expansion) | `5.0.2` | `5.0.5` |
| [dompurify](https://github.com/cure53/DOMPurify) | `3.3.1` | `3.4.0` |
| [markdown-it](https://github.com/markdown-it/markdown-it) | `14.1.0` | `14.1.1` |
| [minimatch](https://github.com/isaacs/minimatch) | `10.2.1` | `10.2.5` |
| [yaml](https://github.com/eemeli/yaml) | `1.10.2` | `1.10.3` |

Bumps the npm_and_yarn group with 2 updates in the /extensions/chrome directory: [yaml](https://github.com/eemeli/yaml) and [picomatch](https://github.com/micromatch/picomatch).
Bumps the npm_and_yarn group with 4 updates in the /extensions/web-widget directory: [brace-expansion](https://github.com/juliangruber/brace-expansion), [minimatch](https://github.com/isaacs/minimatch), [yaml](https://github.com/eemeli/yaml) and [picomatch](https://github.com/micromatch/picomatch).


Updates `next` from 15.5.9 to 15.5.15
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/compare/v15.5.9...v15.5.15)

Updates `brace-expansion` from 5.0.2 to 5.0.5
- [Release notes](https://github.com/juliangruber/brace-expansion/releases)
- [Commits](https://github.com/juliangruber/brace-expansion/compare/v5.0.2...v5.0.5)

Updates `dompurify` from 3.3.1 to 3.4.0
- [Release notes](https://github.com/cure53/DOMPurify/releases)
- [Commits](https://github.com/cure53/DOMPurify/compare/3.3.1...3.4.0)

Updates `markdown-it` from 14.1.0 to 14.1.1
- [Changelog](https://github.com/markdown-it/markdown-it/blob/master/CHANGELOG.md)
- [Commits](https://github.com/markdown-it/markdown-it/compare/14.1.0...14.1.1)

Updates `minimatch` from 10.2.1 to 10.2.5
- [Changelog](https://github.com/isaacs/minimatch/blob/main/changelog.md)
- [Commits](https://github.com/isaacs/minimatch/compare/v10.2.1...v10.2.5)

Updates `yaml` from 1.10.2 to 1.10.3
- [Release notes](https://github.com/eemeli/yaml/releases)
- [Commits](https://github.com/eemeli/yaml/compare/v1.10.2...v1.10.3)

Updates `diff` from 5.2.0 to 5.2.2
- [Changelog](https://github.com/kpdecker/jsdiff/blob/master/release-notes.md)
- [Commits](https://github.com/kpdecker/jsdiff/compare/v5.2.0...v5.2.2)

Updates `glob` from 10.3.12 to 10.5.0
- [Changelog](https://github.com/isaacs/node-glob/blob/main/changelog.md)
- [Commits](https://github.com/isaacs/node-glob/compare/v10.3.12...v10.5.0)

Updates `tar` from 6.2.1 to 7.5.11
- [Release notes](https://github.com/isaacs/node-tar/releases)
- [Changelog](https://github.com/isaacs/node-tar/blob/main/CHANGELOG.md)
- [Commits](https://github.com/isaacs/node-tar/compare/v6.2.1...v7.5.11)

Updates `picomatch` from 2.3.1 to 2.3.2
- [Release notes](https://github.com/micromatch/picomatch/releases)
- [Changelog](https://github.com/micromatch/picomatch/blob/master/CHANGELOG.md)
- [Commits](https://github.com/micromatch/picomatch/compare/2.3.1...2.3.2)

Updates `picomatch` from 2.3.1 to 2.3.2
- [Release notes](https://github.com/micromatch/picomatch/releases)
- [Changelog](https://github.com/micromatch/picomatch/blob/master/CHANGELOG.md)
- [Commits](https://github.com/micromatch/picomatch/compare/2.3.1...2.3.2)

Updates `yaml` from 1.10.2 to 1.10.3
- [Release notes](https://github.com/eemeli/yaml/releases)
- [Commits](https://github.com/eemeli/yaml/compare/v1.10.2...v1.10.3)

Updates `brace-expansion` from 2.0.1 to 2.0.2
- [Release notes](https://github.com/juliangruber/brace-expansion/releases)
- [Commits](https://github.com/juliangruber/brace-expansion/compare/v2.0.1...v2.0.2)

Updates `glob` from 10.3.12 to 10.5.0
- [Changelog](https://github.com/isaacs/node-glob/blob/main/changelog.md)
- [Commits](https://github.com/isaacs/node-glob/compare/v10.3.12...v10.5.0)

Updates `minimatch` from 9.0.4 to 9.0.9
- [Changelog](https://github.com/isaacs/minimatch/blob/main/changelog.md)
- [Commits](https://github.com/isaacs/minimatch/compare/v9.0.4...v9.0.9)

Updates `picomatch` from 2.3.1 to 2.3.2
- [Release notes](https://github.com/micromatch/picomatch/releases)
- [Changelog](https://github.com/micromatch/picomatch/blob/master/CHANGELOG.md)
- [Commits](https://github.com/micromatch/picomatch/compare/2.3.1...2.3.2)

Updates `yaml` from 1.10.2 to 1.10.3
- [Release notes](https://github.com/eemeli/yaml/releases)
- [Commits](https://github.com/eemeli/yaml/compare/v1.10.2...v1.10.3)

Updates `yaml` from 1.10.2 to 1.10.3
- [Release notes](https://github.com/eemeli/yaml/releases)
- [Commits](https://github.com/eemeli/yaml/compare/v1.10.2...v1.10.3)

Updates `picomatch` from 2.3.1 to 2.3.2
- [Release notes](https://github.com/micromatch/picomatch/releases)
- [Changelog](https://github.com/micromatch/picomatch/blob/master/CHANGELOG.md)
- [Commits](https://github.com/micromatch/picomatch/compare/2.3.1...2.3.2)

Updates `picomatch` from 2.3.1 to 2.3.2
- [Release notes](https://github.com/micromatch/picomatch/releases)
- [Changelog](https://github.com/micromatch/picomatch/blob/master/CHANGELOG.md)
- [Commits](https://github.com/micromatch/picomatch/compare/2.3.1...2.3.2)

Updates `yaml` from 1.10.2 to 1.10.3
- [Release notes](https://github.com/eemeli/yaml/releases)
- [Commits](https://github.com/eemeli/yaml/compare/v1.10.2...v1.10.3)

Updates `picomatch` from 2.3.1 to 2.3.2
- [Release notes](https://github.com/micromatch/picomatch/releases)
- [Changelog](https://github.com/micromatch/picomatch/blob/master/CHANGELOG.md)
- [Commits](https://github.com/micromatch/picomatch/compare/2.3.1...2.3.2)

Updates `yaml` from 1.10.2 to 1.10.3
- [Release notes](https://github.com/eemeli/yaml/releases)
- [Commits](https://github.com/eemeli/yaml/compare/v1.10.2...v1.10.3)

Updates `brace-expansion` from 1.1.11 to 1.1.14
- [Release notes](https://github.com/juliangruber/brace-expansion/releases)
- [Commits](https://github.com/juliangruber/brace-expansion/compare/v1.1.11...v1.1.14)

Updates `minimatch` from 3.1.2 to 3.1.5
- [Changelog](https://github.com/isaacs/minimatch/blob/main/changelog.md)
- [Commits](https://github.com/isaacs/minimatch/compare/v3.1.2...v3.1.5)

Updates `yaml` from 1.10.2 to 1.10.3
- [Release notes](https://github.com/eemeli/yaml/releases)
- [Commits](https://github.com/eemeli/yaml/compare/v1.10.2...v1.10.3)

Updates `picomatch` from 2.3.1 to 2.3.2
- [Release notes](https://github.com/micromatch/picomatch/releases)
- [Changelog](https://github.com/micromatch/picomatch/blob/master/CHANGELOG.md)
- [Commits](https://github.com/micromatch/picomatch/compare/2.3.1...2.3.2)

Updates `picomatch` from 2.3.1 to 2.3.2
- [Release notes](https://github.com/micromatch/picomatch/releases)
- [Changelog](https://github.com/micromatch/picomatch/blob/master/CHANGELOG.md)
- [Commits](https://github.com/micromatch/picomatch/compare/2.3.1...2.3.2)

Updates `yaml` from 1.10.2 to 1.10.3
- [Release notes](https://github.com/eemeli/yaml/releases)
- [Commits](https://github.com/eemeli/yaml/compare/v1.10.2...v1.10.3)

Updates `brace-expansion` from 1.1.11 to 1.1.14
- [Release notes](https://github.com/juliangruber/brace-expansion/releases)
- [Commits](https://github.com/juliangruber/brace-expansion/compare/v1.1.11...v1.1.14)

Updates `minimatch` from 3.1.2 to 3.1.5
- [Changelog](https://github.com/isaacs/minimatch/blob/main/changelog.md)
- [Commits](https://github.com/isaacs/minimatch/compare/v3.1.2...v3.1.5)

Updates `picomatch` from 2.3.1 to 2.3.2
- [Release notes](https://github.com/micromatch/picomatch/releases)
- [Changelog](https://github.com/micromatch/picomatch/blob/master/CHANGELOG.md)
- [Commits](https://github.com/micromatch/picomatch/compare/2.3.1...2.3.2)

Updates `yaml` from 1.10.2 to 1.10.3
- [Release notes](https://github.com/eemeli/yaml/releases)
- [Commits](https://github.com/eemeli/yaml/compare/v1.10.2...v1.10.3)

---
updated-dependencies:
- dependency-name: next
  dependency-version: 15.5.15
  dependency-type: direct:production
  dependency-group: npm_and_yarn
- dependency-name: brace-expansion
  dependency-version: 5.0.5
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: dompurify
  dependency-version: 3.4.0
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: markdown-it
  dependency-version: 14.1.1
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: minimatch
  dependency-version: 10.2.5
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: yaml
  dependency-version: 1.10.3
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: diff
  dependency-version: 5.2.2
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: glob
  dependency-version: 10.5.0
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: tar
  dependency-version: 7.5.11
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: picomatch
  dependency-version: 2.3.2
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: picomatch
  dependency-version: 2.3.2
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: yaml
  dependency-version: 1.10.3
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: brace-expansion
  dependency-version: 2.0.2
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: glob
  dependency-version: 10.5.0
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: minimatch
  dependency-version: 9.0.9
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: picomatch
  dependency-version: 2.3.2
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: yaml
  dependency-version: 1.10.3
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: yaml
  dependency-version: 1.10.3
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: picomatch
  dependency-version: 2.3.2
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: picomatch
  dependency-version: 2.3.2
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: yaml
  dependency-version: 1.10.3
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: picomatch
  dependency-version: 2.3.2
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: yaml
  dependency-version: 1.10.3
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: brace-expansion
  dependency-version: 1.1.14
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: minimatch
  dependency-version: 3.1.5
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: yaml
  dependency-version: 1.10.3
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: picomatch
  dependency-version: 2.3.2
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: picomatch
  dependency-version: 2.3.2
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: yaml
  dependency-version: 1.10.3
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: brace-expansion
  dependency-version: 1.1.14
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: minimatch
  dependency-version: 3.1.5
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: picomatch
  dependency-version: 2.3.2
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: yaml
  dependency-version: 1.10.3
  dependency-type: indirect
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-19 17:47:08 +00:00
Manish Madan
1c829667ff Merge pull request #2326 from arc53/dependabot/npm_and_yarn/extensions/react-widget/class-variance-authority-0.7.1
chore(deps): bump class-variance-authority from 0.7.0 to 0.7.1 in /extensions/react-widget
2026-04-19 23:13:02 +05:30
Manish Madan
3ab0ebb16d Merge pull request #2398 from arc53/dependabot/pip/application/openai-2.31.0
chore(deps): bump openai from 2.30.0 to 2.31.0 in /application
2026-04-19 23:10:36 +05:30
dependabot[bot]
988c4a5a15 chore(deps): bump openai from 2.30.0 to 2.31.0 in /application
Bumps [openai](https://github.com/openai/openai-python) from 2.30.0 to 2.31.0.
- [Release notes](https://github.com/openai/openai-python/releases)
- [Changelog](https://github.com/openai/openai-python/blob/main/CHANGELOG.md)
- [Commits](https://github.com/openai/openai-python/compare/v2.30.0...v2.31.0)

---
updated-dependencies:
- dependency-name: openai
  dependency-version: 2.31.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-19 17:14:43 +00:00
Manish Madan
01db8b2c41 Merge pull request #2395 from arc53/dependabot/pip/application/google-genai-1.73.1
chore(deps): bump google-genai from 1.69.0 to 1.73.1 in /application
2026-04-19 22:43:14 +05:30
Alex
ef19da9516 fix: pg bouncer compatible 2026-04-19 17:53:22 +01:00
dependabot[bot]
cc1275c3f9 chore(deps): bump google-genai from 1.69.0 to 1.73.1 in /application
Bumps [google-genai](https://github.com/googleapis/python-genai) from 1.69.0 to 1.73.1.
- [Release notes](https://github.com/googleapis/python-genai/releases)
- [Changelog](https://github.com/googleapis/python-genai/blob/main/CHANGELOG.md)
- [Commits](https://github.com/googleapis/python-genai/compare/v1.69.0...v1.73.1)

---
updated-dependencies:
- dependency-name: google-genai
  dependency-version: 1.73.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-19 15:30:12 +00:00
Manish Madan
14c2f4890f Merge pull request #2394 from arc53/dependabot/pip/application/fastmcp-3.2.4
chore(deps): bump fastmcp from 3.2.0 to 3.2.4 in /application
2026-04-19 20:59:04 +05:30
dependabot[bot]
b3aec36aa2 chore(deps): bump fastmcp from 3.2.0 to 3.2.4 in /application
Bumps [fastmcp](https://github.com/PrefectHQ/fastmcp) from 3.2.0 to 3.2.4.
- [Release notes](https://github.com/PrefectHQ/fastmcp/releases)
- [Changelog](https://github.com/PrefectHQ/fastmcp/blob/main/docs/changelog.mdx)
- [Commits](https://github.com/PrefectHQ/fastmcp/compare/v3.2.0...v3.2.4)

---
updated-dependencies:
- dependency-name: fastmcp
  dependency-version: 3.2.4
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-19 15:13:45 +00:00
Manish Madan
50f62beaeb Merge pull request #2392 from arc53/dependabot/pip/application/elevenlabs-2.43.0
chore(deps): bump elevenlabs from 2.41.0 to 2.43.0 in /application
2026-04-19 20:42:32 +05:30
dependabot[bot]
423b4c6494 chore(deps): bump elevenlabs from 2.41.0 to 2.43.0 in /application
Bumps [elevenlabs](https://github.com/elevenlabs/elevenlabs-python) from 2.41.0 to 2.43.0.
- [Release notes](https://github.com/elevenlabs/elevenlabs-python/releases)
- [Commits](https://github.com/elevenlabs/elevenlabs-python/compare/v2.41.0...v2.43.0)

---
updated-dependencies:
- dependency-name: elevenlabs
  dependency-version: 2.43.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-19 14:49:32 +00:00
Manish Madan
54f615c59d Merge pull request #2407 from arc53/dependabot/npm_and_yarn/frontend/npm_and_yarn-f0540bac9f
chore(deps): bump the npm_and_yarn group across 1 directory with 4 updates
2026-04-19 20:17:07 +05:30
dependabot[bot]
223b3de66e chore(deps): bump the npm_and_yarn group across 1 directory with 4 updates
Bumps the npm_and_yarn group with 4 updates in the /frontend directory: [lodash](https://github.com/lodash/lodash), [vite](https://github.com/vitejs/vite/tree/HEAD/packages/vite), [picomatch](https://github.com/micromatch/picomatch) and [dompurify](https://github.com/cure53/DOMPurify).


Updates `lodash` from 4.17.23 to 4.18.1
- [Release notes](https://github.com/lodash/lodash/releases)
- [Commits](https://github.com/lodash/lodash/compare/4.17.23...4.18.1)

Updates `vite` from 8.0.0 to 8.0.8
- [Release notes](https://github.com/vitejs/vite/releases)
- [Changelog](https://github.com/vitejs/vite/blob/main/packages/vite/CHANGELOG.md)
- [Commits](https://github.com/vitejs/vite/commits/v8.0.8/packages/vite)

Updates `picomatch` from 4.0.3 to 4.0.4
- [Release notes](https://github.com/micromatch/picomatch/releases)
- [Changelog](https://github.com/micromatch/picomatch/blob/master/CHANGELOG.md)
- [Commits](https://github.com/micromatch/picomatch/compare/4.0.3...4.0.4)

Updates `dompurify` from 3.3.3 to 3.4.0
- [Release notes](https://github.com/cure53/DOMPurify/releases)
- [Commits](https://github.com/cure53/DOMPurify/compare/3.3.3...3.4.0)

---
updated-dependencies:
- dependency-name: lodash
  dependency-version: 4.18.1
  dependency-type: direct:production
  dependency-group: npm_and_yarn
- dependency-name: vite
  dependency-version: 8.0.8
  dependency-type: direct:development
  dependency-group: npm_and_yarn
- dependency-name: picomatch
  dependency-version: 4.0.4
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: dompurify
  dependency-version: 3.4.0
  dependency-type: indirect
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-19 14:06:01 +00:00
Manish Madan
4db9622ef5 Merge pull request #2403 from arc53/dependabot/npm_and_yarn/frontend/types/react-19.2.14
chore(deps-dev): bump @types/react from 19.2.2 to 19.2.14 in /frontend
2026-04-19 19:34:04 +05:30
dependabot[bot]
e8d1bbfb68 chore(deps-dev): bump @types/react from 19.2.2 to 19.2.14 in /frontend
Bumps [@types/react](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/react) from 19.2.2 to 19.2.14.
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/react)

---
updated-dependencies:
- dependency-name: "@types/react"
  dependency-version: 19.2.14
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-19 13:54:47 +00:00
Manish Madan
aff1345ae4 Merge pull request #2399 from arc53/dependabot/npm_and_yarn/frontend/react-router-dom-7.14.1
chore(deps): bump react-router-dom from 7.13.1 to 7.14.1 in /frontend
2026-04-19 04:45:57 +05:30
Alex
ee430aff1e fix: tests 2026-04-18 13:28:03 +01:00
dependabot[bot]
ebb7938d1b chore(deps): bump react-router-dom from 7.13.1 to 7.14.1 in /frontend
Bumps [react-router-dom](https://github.com/remix-run/react-router/tree/HEAD/packages/react-router-dom) from 7.13.1 to 7.14.1.
- [Release notes](https://github.com/remix-run/react-router/releases)
- [Changelog](https://github.com/remix-run/react-router/blob/main/packages/react-router-dom/CHANGELOG.md)
- [Commits](https://github.com/remix-run/react-router/commits/react-router-dom@7.14.1/packages/react-router-dom)

---
updated-dependencies:
- dependency-name: react-router-dom
  dependency-version: 7.14.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-17 08:55:15 +00:00
dependabot[bot]
cdb71a54f0 chore(deps): bump class-variance-authority in /extensions/react-widget
Bumps [class-variance-authority](https://github.com/joe-bell/cva) from 0.7.0 to 0.7.1.
- [Release notes](https://github.com/joe-bell/cva/releases)
- [Commits](https://github.com/joe-bell/cva/compare/v0.7.0...v0.7.1)

---
updated-dependencies:
- dependency-name: class-variance-authority
  dependency-version: 0.7.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-03-24 20:53:18 +00:00
47 changed files with 4121 additions and 3428 deletions

View File

@@ -37,6 +37,22 @@ Run the Flask API (if needed):
flask --app application/app.py run --host=0.0.0.0 --port=7091
```
That's the fast inner-loop option — quick startup, the Werkzeug interactive
debugger still works, and it hot-reloads on source changes. It serves the
Flask routes only (`/api/*`, `/stream`, etc.).
If you need to exercise the full ASGI stack — the `/mcp` FastMCP endpoint,
or to match the production runtime exactly — run the ASGI composition under
uvicorn instead:
```bash
uvicorn application.asgi:asgi_app --host 0.0.0.0 --port 7091 --reload
```
Production uses `gunicorn -k uvicorn_worker.UvicornWorker` against the same
`application.asgi:asgi_app` target; see `application/Dockerfile` for the
full flag set.
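If you're unsure which entrypoint a local server is actually running, the `/mcp` mount is the tell: it only exists under the ASGI composition. A minimal smoke-test sketch (not part of this diff; the port and the exact non-404 status are assumptions based on the commands above):
```python
# Probe the /mcp mount: plain `flask run` serves only the Flask routes,
# so /mcp is a 404 there. Under the ASGI app the FastMCP endpoint
# answers (a bare GET without an MCP handshake is typically rejected
# with a 4xx other than 404).
import requests

status = requests.get("http://localhost:7091/mcp", timeout=5).status_code
print("ASGI stack" if status != 404 else "Flask-only", status)
```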
Run the Celery worker in a separate terminal (if needed):
```bash

View File

@@ -88,5 +88,15 @@ EXPOSE 7091
# Switch to non-root user
USER appuser
# Start Gunicorn
CMD ["gunicorn", "-w", "1", "--timeout", "120", "--bind", "0.0.0.0:7091", "--preload", "application.wsgi:app"]
CMD ["gunicorn", \
"-w", "1", \
"-k", "uvicorn_worker.UvicornWorker", \
"--bind", "0.0.0.0:7091", \
"--timeout", "180", \
"--graceful-timeout", "120", \
"--keep-alive", "5", \
"--worker-tmp-dir", "/dev/shm", \
"--max-requests", "1000", \
"--max-requests-jitter", "100", \
"--config", "application/gunicorn_conf.py", \
"application.asgi:asgi_app"]

View File

@@ -0,0 +1,37 @@
"""0002 app_metadata — singleton key/value table for instance-wide state.
Used by the startup version-check client to persist the anonymous
instance UUID and a one-shot "notice shown" flag. Both values are tiny
plain-text strings; this is a deliberate generic-config table rather
than dedicated columns so future one-off settings (telemetry opt-in
timestamps, feature-flag overrides, etc.) don't each need their own
migration.
Revision ID: 0002_app_metadata
Revises: 0001_initial
"""
from typing import Sequence, Union
from alembic import op
revision: str = "0002_app_metadata"
down_revision: Union[str, None] = "0001_initial"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
op.execute(
"""
CREATE TABLE app_metadata (
key TEXT PRIMARY KEY,
value TEXT NOT NULL
);
"""
)
def downgrade() -> None:
op.execute("DROP TABLE IF EXISTS app_metadata;")
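Applying or rolling back this revision goes through the normal alembic flow; a Python-side sketch (assuming a working `alembic.ini` for the project, which this diff doesn't show):
```python
# Run the 0002_app_metadata migration programmatically. Equivalent to
# `alembic upgrade head` from a shell.
from alembic.config import main as alembic_main

alembic_main(argv=["upgrade", "head"])               # create app_metadata
# alembic_main(argv=["downgrade", "0001_initial"])   # drop it again
```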

View File

@@ -1,21 +1,21 @@
import logging
from typing import Any, Dict, List
from flask import make_response, request
from flask_restx import fields, Resource
from application.api.answer.routes.base import answer_ns
from application.core.settings import settings
from application.storage.db.repositories.agents import AgentsRepository
from application.storage.db.session import db_readonly
from application.vectorstore.vector_creator import VectorCreator
from application.services.search_service import (
InvalidAPIKey,
SearchFailed,
search,
)
logger = logging.getLogger(__name__)
@answer_ns.route("/api/search")
class SearchResource(Resource):
"""Fast search endpoint for retrieving relevant documents"""
"""Fast search endpoint for retrieving relevant documents."""
search_model = answer_ns.model(
"SearchModel",
@@ -32,102 +32,10 @@ class SearchResource(Resource):
},
)
def _get_sources_from_api_key(self, api_key: str) -> List[str]:
"""Get source IDs connected to the API key/agent."""
with db_readonly() as conn:
agent_data = AgentsRepository(conn).find_by_key(api_key)
if not agent_data:
return []
source_ids: List[str] = []
# extra_source_ids is a PG ARRAY(UUID) of source UUIDs.
extra = agent_data.get("extra_source_ids") or []
for src in extra:
if src:
source_ids.append(str(src))
if not source_ids:
single = agent_data.get("source_id")
if single:
source_ids.append(str(single))
return source_ids
def _search_vectorstores(
self, query: str, source_ids: List[str], chunks: int
) -> List[Dict[str, Any]]:
"""Search across vectorstores and return results"""
if not source_ids:
return []
results = []
chunks_per_source = max(1, chunks // len(source_ids))
seen_texts = set()
for source_id in source_ids:
if not source_id or not source_id.strip():
continue
try:
docsearch = VectorCreator.create_vectorstore(
settings.VECTOR_STORE, source_id, settings.EMBEDDINGS_KEY
)
docs = docsearch.search(query, k=chunks_per_source * 2)
for doc in docs:
if len(results) >= chunks:
break
if hasattr(doc, "page_content") and hasattr(doc, "metadata"):
page_content = doc.page_content
metadata = doc.metadata
else:
page_content = doc.get("text", doc.get("page_content", ""))
metadata = doc.get("metadata", {})
# Skip duplicates
text_hash = hash(page_content[:200])
if text_hash in seen_texts:
continue
seen_texts.add(text_hash)
title = metadata.get(
"title", metadata.get("post_title", "")
)
if not isinstance(title, str):
title = str(title) if title else ""
# Clean up title
if title:
title = title.split("/")[-1]
else:
# Use filename or first part of content as title
title = metadata.get("filename", page_content[:50] + "...")
source = metadata.get("source", source_id)
results.append({
"text": page_content,
"title": title,
"source": source,
})
if len(results) >= chunks:
break
except Exception as e:
logger.error(
f"Error searching vectorstore {source_id}: {e}",
exc_info=True,
)
continue
return results[:chunks]
@answer_ns.expect(search_model)
@answer_ns.doc(description="Search for relevant documents based on query")
def post(self):
data = request.get_json()
data = request.get_json() or {}
question = data.get("question")
api_key = data.get("api_key")
@@ -135,32 +43,13 @@ class SearchResource(Resource):
if not question:
return make_response({"error": "question is required"}, 400)
if not api_key:
return make_response({"error": "api_key is required"}, 400)
# Validate API key
with db_readonly() as conn:
agent = AgentsRepository(conn).find_by_key(api_key)
if not agent:
return make_response({"error": "Invalid API key"}, 401)
try:
# Get sources connected to this API key
source_ids = self._get_sources_from_api_key(api_key)
if not source_ids:
return make_response([], 200)
# Perform search
results = self._search_vectorstores(question, source_ids, chunks)
return make_response(results, 200)
except Exception as e:
logger.error(
f"/api/search - error: {str(e)}",
extra={"error": str(e)},
exc_info=True,
)
return make_response(search(api_key, question, chunks), 200)
except InvalidAPIKey:
return make_response({"error": "Invalid API key"}, 401)
except SearchFailed:
logger.exception("/api/search failed")
return make_response({"error": "Search failed"}, 500)
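For reference, a request against the slimmed-down route above might look like the sketch below. Host, port, and the key are placeholders; the body shape (`question`, `api_key`, optional `chunks`) follows the handler and its `SearchModel`.
```python
# POST /api/search with the fields the handler reads. The error paths
# mirror the new exception mapping: 400 when `question` or `api_key`
# is missing, 401 for InvalidAPIKey, 500 for SearchFailed.
import requests

resp = requests.post(
    "http://localhost:7091/api/search",
    json={
        "question": "How do I configure the vector store?",
        "api_key": "<agent-api-key>",  # placeholder
        "chunks": 5,
    },
    timeout=30,
)
print(resp.status_code)
for hit in resp.json():
    print(hit["title"], "->", hit["source"])
```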

View File

@@ -4,7 +4,7 @@ import platform
import uuid
import dotenv
from flask import Flask, jsonify, redirect, request
from flask import Flask, Response, jsonify, redirect, request
from jose import jwt
from application.auth import handle_auth
@@ -149,12 +149,11 @@ def authenticate_request():
@app.after_request
def after_request(response):
response.headers.add("Access-Control-Allow-Origin", "*")
response.headers.add("Access-Control-Allow-Headers", "Content-Type, Authorization")
response.headers.add(
"Access-Control-Allow-Methods", "GET, POST, PUT, DELETE, OPTIONS"
)
def after_request(response: Response) -> Response:
"""Add CORS headers for the pure Flask development entrypoint."""
response.headers["Access-Control-Allow-Origin"] = "*"
response.headers["Access-Control-Allow-Headers"] = "Content-Type, Authorization"
response.headers["Access-Control-Allow-Methods"] = "GET, POST, PUT, DELETE, OPTIONS"
return response

application/asgi.py (new file, 33 lines)
View File

@@ -0,0 +1,33 @@
"""ASGI entrypoint: Flask (WSGI) + FastMCP on the same process."""
from __future__ import annotations
from a2wsgi import WSGIMiddleware
from starlette.applications import Starlette
from starlette.middleware import Middleware
from starlette.middleware.cors import CORSMiddleware
from starlette.routing import Mount
from application.app import app as flask_app
from application.mcp_server import mcp
_WSGI_THREADPOOL = 32
mcp_app = mcp.http_app(path="/")
asgi_app = Starlette(
routes=[
Mount("/mcp", app=mcp_app),
Mount("/", app=WSGIMiddleware(flask_app, workers=_WSGI_THREADPOOL)),
],
middleware=[
Middleware(
CORSMiddleware,
allow_origins=["*"],
allow_methods=["GET", "POST", "PUT", "DELETE", "OPTIONS"],
allow_headers=["Content-Type", "Authorization", "Mcp-Session-Id"],
expose_headers=["Mcp-Session-Id"],
),
],
lifespan=mcp_app.lifespan,
)

View File

@@ -1,6 +1,8 @@
import threading
from celery import Celery
from application.core.settings import settings
from celery.signals import setup_logging, worker_process_init
from celery.signals import setup_logging, worker_process_init, worker_ready
def make_celery(app_name=__name__):
@@ -39,5 +41,25 @@ def _dispose_db_engine_on_fork(*args, **kwargs):
dispose_engine()
@worker_ready.connect
def _run_version_check(*args, **kwargs):
"""Kick off the anonymous version check on worker startup.
Runs in a daemon thread so a slow endpoint or bad DNS never holds
up the worker becoming ready for tasks. The check itself is
fail-silent (see ``application.updates.version_check.run_check``);
this handler's only job is to launch it and get out of the way.
Import is lazy so the symbol resolution never fires at module
import time — consistent with the ``_dispose_db_engine_on_fork``
pattern above.
"""
try:
from application.updates.version_check import run_check
except Exception:
return
threading.Thread(target=run_check, name="version-check", daemon=True).start()
celery = make_celery()
celery.config_from_object("application.celeryconfig")

View File

@@ -149,6 +149,9 @@ class Settings(BaseSettings):
FLASK_DEBUG_MODE: bool = False
STORAGE_TYPE: str = "local" # local or s3
# Anonymous startup version check for security issues.
VERSION_CHECK: bool = True
URL_STRATEGY: str = "backend" # backend or s3
JWT_SECRET_KEY: str = ""

View File

@@ -0,0 +1,72 @@
"""Gunicorn config — keeps uvicorn's access log in NCSA format."""
from __future__ import annotations
import logging
import logging.config
# NCSA common log format:
# %(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"
# Uvicorn's access formatter exposes a ``client_addr``/``request_line``/
# ``status_code`` trio but not the full NCSA field set, so we re-derive
# what we can.
_NCSA_FMT = (
'%(client_addr)s - - [%(asctime)s] "%(request_line)s" %(status_code)s'
)
logconfig_dict = {
"version": 1,
"disable_existing_loggers": False,
"formatters": {
"ncsa_access": {
"()": "uvicorn.logging.AccessFormatter",
"fmt": _NCSA_FMT,
"datefmt": "%d/%b/%Y:%H:%M:%S %z",
"use_colors": False,
},
"default": {
"format": "[%(asctime)s] [%(process)d] [%(levelname)s] %(name)s: %(message)s",
},
},
"handlers": {
"access": {
"class": "logging.StreamHandler",
"formatter": "ncsa_access",
"stream": "ext://sys.stdout",
},
"default": {
"class": "logging.StreamHandler",
"formatter": "default",
"stream": "ext://sys.stderr",
},
},
"loggers": {
"uvicorn": {"handlers": ["default"], "level": "INFO", "propagate": False},
"uvicorn.error": {
"handlers": ["default"],
"level": "INFO",
"propagate": False,
},
"uvicorn.access": {
"handlers": ["access"],
"level": "INFO",
"propagate": False,
},
"gunicorn.error": {
"handlers": ["default"],
"level": "INFO",
"propagate": False,
},
"gunicorn.access": {
"handlers": ["access"],
"level": "INFO",
"propagate": False,
},
},
"root": {"handlers": ["default"], "level": "INFO"},
}
def on_starting(server): # pragma: no cover — gunicorn hook
"""Ensure gunicorn's own loggers use the configured handlers."""
logging.config.dictConfig(logconfig_dict)
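With `_NCSA_FMT` above, an access line comes out roughly as `203.0.113.7 - - [19/Apr/2026:14:06:01 +0000] "POST /api/search HTTP/1.1" 200` (address, timestamp, and request here are illustrative).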

application/mcp_server.py (new file, 59 lines)
View File

@@ -0,0 +1,59 @@
"""FastMCP server exposing DocsGPT retrieval over streamable HTTP.
Mounted at ``/mcp`` by ``application/asgi.py``. Bearer tokens are the
existing DocsGPT agent API keys — no new credential surface.
The tool reads the ``Authorization`` header directly via
``get_http_headers(include={"authorization"})``. The ``include`` kwarg
is required: by default ``get_http_headers`` strips ``authorization``
(and a handful of other hop-by-hop headers) so they aren't forwarded
to downstream services — since we deliberately want the caller's
token, we opt it back in.
"""
from __future__ import annotations
import asyncio
import logging
from fastmcp import FastMCP
from fastmcp.server.dependencies import get_http_headers
from application.services.search_service import (
InvalidAPIKey,
SearchFailed,
search,
)
logger = logging.getLogger(__name__)
mcp = FastMCP("docsgpt")
def _extract_bearer_token() -> str | None:
auth = get_http_headers(include={"authorization"}).get("authorization", "")
parts = auth.split(None, 1)
if len(parts) != 2 or parts[0].lower() != "bearer" or not parts[1]:
return None
return parts[1]
@mcp.tool
async def search_docs(query: str, chunks: int = 5) -> list[dict]:
"""Search the caller's DocsGPT knowledge base.
Authentication is via ``Authorization: Bearer <agent-api-key>`` on
the MCP request — the same opaque key that ``/api/search`` accepts
in its JSON body. Returns at most ``chunks`` hits, each a dict with
``text``, ``title``, ``source`` keys.
"""
api_key = _extract_bearer_token()
if not api_key:
raise PermissionError("Missing Bearer token")
try:
return await asyncio.to_thread(search, api_key, query, chunks)
except InvalidAPIKey as exc:
raise PermissionError("Invalid API key") from exc
except SearchFailed:
logger.exception("search_docs failed")
raise
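Calling the tool end-to-end might look like the sketch below. Assumptions flagged: that fastmcp's `Client`/`StreamableHttpTransport` accept a `headers` mapping for the bearer token, and that the server is the local ASGI app; the key is a placeholder.
```python
# Exercise search_docs over streamable HTTP with an agent API key as
# the bearer token. Sketch under the assumptions noted above.
import asyncio

from fastmcp import Client
from fastmcp.client.transports import StreamableHttpTransport

async def main() -> None:
    transport = StreamableHttpTransport(
        "http://localhost:7091/mcp",
        headers={"Authorization": "Bearer <agent-api-key>"},  # placeholder key
    )
    async with Client(transport) as client:
        result = await client.call_tool(
            "search_docs", {"query": "vector store setup", "chunks": 3}
        )
        print(result)

asyncio.run(main())
```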

View File

@@ -1,5 +1,7 @@
a2wsgi==1.10.10
alembic>=1.13,<2
anthropic==0.88.0
asgiref>=3.11.1
boto3==1.42.83
beautifulsoup4==4.14.3
cel-python==0.5.0
@@ -13,12 +15,12 @@ onnxruntime>=1.19.0
docx2txt==0.9
ddgs>=8.0.0
fast-ebook
elevenlabs==2.41.0
Flask==3.1.3
elevenlabs==2.43.0
Flask==3.1.1
faiss-cpu==1.13.2
fastmcp==3.2.0
fastmcp==3.2.4
flask-restx==1.3.2
google-genai==1.69.0
google-genai==1.73.1
google-api-python-client==2.193.0
google-auth-httplib2==0.3.1
google-auth-oauthlib==1.3.1
@@ -47,7 +49,7 @@ msal==1.35.1
mypy-extensions==1.1.0
networkx==3.6.1
numpy==2.4.4
openai==2.30.0
openai==2.32.0
openapi3-parser==1.1.22
orjson==3.11.7
packaging==26.0
@@ -76,6 +78,7 @@ requests==2.33.1
retry==0.9.2
sentence-transformers==5.3.0
sqlalchemy>=2.0,<3
starlette>=1.0,<2
tiktoken==0.12.0
tokenizers==0.22.2
torch==2.11.0
@@ -85,6 +88,8 @@ typing-extensions==4.15.0
typing-inspect==0.9.0
tzdata==2026.1
urllib3==2.6.3
uvicorn[standard]>=0.30,<1
uvicorn-worker>=0.4,<1
vine==5.1.0
wcwidth==0.6.0
werkzeug>=3.1.0

View File

@@ -0,0 +1,153 @@
"""Shared retrieval service used by the HTTP search route and the MCP tool.
Flask-free. Raises domain exceptions (``InvalidAPIKey``, ``SearchFailed``)
that callers translate into their own wire protocol (HTTP status codes,
MCP error responses, etc.).
"""
from __future__ import annotations
import logging
from typing import Any, Dict, List
from application.core.settings import settings
from application.storage.db.repositories.agents import AgentsRepository
from application.storage.db.session import db_readonly
from application.vectorstore.vector_creator import VectorCreator
logger = logging.getLogger(__name__)
class InvalidAPIKey(Exception):
"""The supplied ``api_key`` does not resolve to an agent."""
class SearchFailed(Exception):
"""Unexpected error during retrieval (e.g. DB outage). Caller maps to 5xx."""
def _collect_source_ids(agent: Dict[str, Any]) -> List[str]:
"""Extract the ordered list of source UUIDs to search.
Prefers ``extra_source_ids`` (PG ARRAY(UUID) of multi-source agents);
falls back to the legacy single ``source_id`` field.
"""
source_ids: List[str] = []
extra = agent.get("extra_source_ids") or []
for src in extra:
if src:
source_ids.append(str(src))
if not source_ids:
single = agent.get("source_id")
if single:
source_ids.append(str(single))
return source_ids
def _search_sources(
query: str, source_ids: List[str], chunks: int
) -> List[Dict[str, Any]]:
"""Search across each source's vectorstore and return up to ``chunks`` hits.
Per-source errors are logged and skipped so one broken index doesn't
take down the whole search. Results are de-duplicated by content hash.
"""
if chunks <= 0 or not source_ids:
return []
results: List[Dict[str, Any]] = []
chunks_per_source = max(1, chunks // len(source_ids))
seen_texts: set[int] = set()
for source_id in source_ids:
if not source_id or not source_id.strip():
continue
try:
docsearch = VectorCreator.create_vectorstore(
settings.VECTOR_STORE, source_id, settings.EMBEDDINGS_KEY
)
docs = docsearch.search(query, k=chunks_per_source * 2)
for doc in docs:
if len(results) >= chunks:
break
if hasattr(doc, "page_content") and hasattr(doc, "metadata"):
page_content = doc.page_content
metadata = doc.metadata
else:
page_content = doc.get("text", doc.get("page_content", ""))
metadata = doc.get("metadata", {})
text_hash = hash(page_content[:200])
if text_hash in seen_texts:
continue
seen_texts.add(text_hash)
title = metadata.get("title", metadata.get("post_title", ""))
if not isinstance(title, str):
title = str(title) if title else ""
if title:
title = title.split("/")[-1]
else:
title = metadata.get("filename", page_content[:50] + "...")
source = metadata.get("source", source_id)
results.append(
{
"text": page_content,
"title": title,
"source": source,
}
)
if len(results) >= chunks:
break
except Exception as e:
logger.error(
f"Error searching vectorstore {source_id}: {e}",
exc_info=True,
)
continue
return results[:chunks]
def search(api_key: str, query: str, chunks: int = 5) -> List[Dict[str, Any]]:
"""Resolve an agent by API key and search its sources.
Args:
api_key: Agent API key (the opaque string stored on
``agents.key`` in Postgres).
query: Free-text search query.
chunks: Max number of hits to return.
Returns:
List of hit dicts with ``text``, ``title``, ``source`` keys.
Empty list if the agent has no sources configured.
Raises:
InvalidAPIKey: if ``api_key`` does not resolve to an agent.
SearchFailed: on unexpected DB / infrastructure errors.
"""
if chunks <= 0:
return []
try:
with db_readonly() as conn:
agent = AgentsRepository(conn).find_by_key(api_key)
except Exception as e:
raise SearchFailed("agent lookup failed") from e
if not agent:
raise InvalidAPIKey()
source_ids = _collect_source_ids(agent)
if not source_ids:
return []
return _search_sources(query, source_ids, chunks)

View File

@@ -16,7 +16,7 @@ don't need to know about SQLAlchemy dialect prefixes.
from typing import Optional
from sqlalchemy import Engine, create_engine
from sqlalchemy import Engine, create_engine, event
from application.core.settings import settings
@@ -43,8 +43,7 @@ def _resolve_uri() -> str:
#: Per-statement wall-clock cap applied to every connection handed out by
#: the engine. 30s is generous for interactive hot paths (reads under a few
#: hundred ms are normal) but still catches a runaway query before it
#: stacks up on PgBouncer or holds locks indefinitely. Override by
#: rebuilding the engine with a different ``connect_args`` in tests.
#: stacks up on PgBouncer or holds locks indefinitely.
STATEMENT_TIMEOUT_MS = 30_000
@@ -52,8 +51,9 @@ def get_engine() -> Engine:
"""Return the process-wide SQLAlchemy Engine, creating it if needed.
The engine applies a server-side ``statement_timeout`` to every
connection it hands out, so both :func:`db_session` and
:func:`db_readonly` inherit the same guardrail.
connection it hands out via a ``connect`` event, so both
:func:`db_session` and :func:`db_readonly` inherit the same
guardrail.
Returns:
A SQLAlchemy ``Engine`` configured with a pooled connection to
@@ -68,13 +68,20 @@ def get_engine() -> Engine:
pool_pre_ping=True, # survive PgBouncer / idle-disconnect recycles
pool_recycle=1800,
future=True,
connect_args={
# ``-c`` passes a GUC to the backend at connect time. This
# covers *all* sessions — interactive, Celery, seeder — so
# no route-handler can opt out by accident.
"options": f"-c statement_timeout={STATEMENT_TIMEOUT_MS}",
},
)
@event.listens_for(_engine, "connect")
def _apply_session_guardrails(dbapi_conn, _record):
# Apply as a SQL ``SET`` (not a libpq ``options=-c ...``
# startup parameter) so the engine works behind
# PgBouncer-style poolers — notably Neon's ``-pooler``
# endpoint, which rejects startup options. Explicit
# ``commit()`` so the session-level SET survives SA's
# transaction resets on pool return.
with dbapi_conn.cursor() as cur:
cur.execute(f"SET statement_timeout = {STATEMENT_TIMEOUT_MS}")
dbapi_conn.commit()
return _engine
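A quick way to confirm the guardrail survives the pooler is to read the GUC back off a pooled connection; a sketch using the helpers above (assumes Postgres is reachable with the configured URI):
```python
# Every connection the engine hands out should report the session-level
# cap applied by the `connect` listener above.
from sqlalchemy import text

from application.storage.db.session import STATEMENT_TIMEOUT_MS, get_engine

with get_engine().connect() as conn:
    print(conn.execute(text("SHOW statement_timeout")).scalar())  # "30s"
    # Anything slower than STATEMENT_TIMEOUT_MS is cancelled server-side:
    # conn.execute(text("SELECT pg_sleep(40)"))  # raises QueryCanceled (57014)
```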

View File

@@ -117,6 +117,16 @@ stack_logs_table = Table(
Column("timestamp", DateTime(timezone=True), nullable=False, server_default=func.now()),
)
# Singleton key/value table for instance-wide state (e.g. anonymous
# instance UUID, one-shot notice flags). Added in migration
# ``0002_app_metadata``.
app_metadata_table = Table(
"app_metadata",
metadata,
Column("key", Text, primary_key=True),
Column("value", Text, nullable=False),
)
# --- Phase 2, Tier 2 --------------------------------------------------------

View File

@@ -0,0 +1,60 @@
"""Repository for the ``app_metadata`` singleton key/value table.
Owns the instance-wide state the version-check client needs:
``instance_id`` (anonymous UUID sent with each check) and
``version_check_notice_shown`` (one-shot flag for the first-run
telemetry notice). Kept deliberately generic so future one-off config
values can piggyback without a new migration each time.
"""
from __future__ import annotations
import uuid
from typing import Optional
from sqlalchemy import Connection, text
class AppMetadataRepository:
"""Postgres-backed ``app_metadata`` store. Tiny by design."""
def __init__(self, conn: Connection) -> None:
self._conn = conn
def get(self, key: str) -> Optional[str]:
row = self._conn.execute(
text("SELECT value FROM app_metadata WHERE key = :key"),
{"key": key},
).fetchone()
return row[0] if row is not None else None
def set(self, key: str, value: str) -> None:
self._conn.execute(
text(
"INSERT INTO app_metadata (key, value) VALUES (:key, :value) "
"ON CONFLICT (key) DO UPDATE SET value = EXCLUDED.value"
),
{"key": key, "value": value},
)
def get_or_create_instance_id(self) -> str:
"""Return the anonymous instance UUID, generating one if absent.
Uses ``INSERT ... ON CONFLICT DO NOTHING`` + re-read so two
workers racing on the very first startup converge on a single
UUID instead of each persisting their own.
"""
existing = self.get("instance_id")
if existing:
return existing
candidate = str(uuid.uuid4())
self._conn.execute(
text(
"INSERT INTO app_metadata (key, value) VALUES ('instance_id', :value) "
"ON CONFLICT (key) DO NOTHING"
),
{"value": candidate},
)
# Re-read: if another worker won the race, their UUID is now authoritative.
winner = self.get("instance_id")
return winner or candidate

View File

@@ -0,0 +1,302 @@
"""Anonymous startup version-check client.
Called once per Celery worker boot (see ``application/celery_init.py``
``worker_ready`` handler). Posts the running version + anonymous
instance UUID to ``gptcloud.arc53.com/api/check``, caches the response
in Redis, and surfaces any advisories to stdout + logs.
Design invariants — all enforced by a broad ``try/except`` at the top
of :func:`run_check`:
* Never blocks worker startup (fired from a daemon thread).
* Never raises to the caller (every failure is swallowed + logged at
``DEBUG``).
* Opt-out via ``VERSION_CHECK=0`` short-circuits before any Postgres
write, Redis access, or outbound request.
* Redis coordinates multi-worker and multi-replica deployments — the
first worker to acquire ``docsgpt:version_check:lock`` fetches, the
rest read from the cached response on the next cycle.
"""
from __future__ import annotations
import json
import logging
import os
import platform
import socket
import sys
from typing import Any, Dict, Optional
import requests
from application.cache import get_redis_instance
from application.core.settings import settings
from application.storage.db.repositories.app_metadata import AppMetadataRepository
from application.storage.db.session import db_session
from application.version import get_version
logger = logging.getLogger(__name__)
ENDPOINT_URL = "https://gptcloud.arc53.com/api/check"
CLIENT_NAME = "docsgpt-backend"
REQUEST_TIMEOUT_SECONDS = 5
CACHE_KEY = "docsgpt:version_check:response"
LOCK_KEY = "docsgpt:version_check:lock"
CACHE_TTL_SECONDS = 6 * 3600 # 6h default; shortened by response `next_check_after`.
LOCK_TTL_SECONDS = 60
NOTICE_KEY = "version_check_notice_shown"
INSTANCE_ID_KEY = "instance_id"
_HIGH_SEVERITIES = {"high", "critical"}
_ANSI_RESET = "\033[0m"
_ANSI_RED = "\033[31m"
_ANSI_YELLOW = "\033[33m"
def run_check() -> None:
"""Entry point for the worker-startup daemon thread.
Safe to call unconditionally: the opt-out, Redis-outage, and
Postgres-outage paths all return silently. No exception propagates.
"""
try:
_run_check_inner()
except Exception as exc: # noqa: BLE001 — belt-and-braces; nothing escapes.
logger.debug("version check crashed: %s", exc, exc_info=True)
def _run_check_inner() -> None:
if not settings.VERSION_CHECK:
return
instance_id = _resolve_instance_id_and_notice()
if instance_id is None:
# Postgres unavailable — per spec we skip the check entirely
# rather than phone home with a synthetic/ephemeral UUID.
return
redis_client = get_redis_instance()
cached = _read_cache(redis_client)
if cached is not None:
_render_advisories(cached)
return
# Cache miss. Try to win the lock; if another worker has it, skip.
# ``redis_client is None`` here means Redis is unreachable — per the
# spec we still proceed uncached (acceptable duplicate calls in
# multi-worker Redis-less deploys).
if redis_client is not None and not _acquire_lock(redis_client):
return
response = _fetch(instance_id)
if response is None:
if redis_client is not None:
_release_lock(redis_client)
return
_write_cache(redis_client, response)
_render_advisories(response)
if redis_client is not None:
_release_lock(redis_client)
def _resolve_instance_id_and_notice() -> Optional[str]:
"""Load (or create) the instance UUID and emit the first-run notice.
The notice is printed at most once across the lifetime of the
installation — tracked via the ``version_check_notice_shown`` row
in ``app_metadata``. Both reads and the write happen inside one
short transaction so two racing workers can't each emit the notice.
"""
try:
with db_session() as conn:
repo = AppMetadataRepository(conn)
instance_id = repo.get_or_create_instance_id()
if repo.get(NOTICE_KEY) is None:
_print_first_run_notice()
repo.set(NOTICE_KEY, "1")
return instance_id
except Exception as exc: # noqa: BLE001 — Postgres down, bad URI, etc.
logger.debug("version check: Postgres unavailable (%s)", exc, exc_info=True)
return None
def _print_first_run_notice() -> None:
message = (
"Anonymous version check enabled — sends version to "
"gptcloud.arc53.com.\nDisable with VERSION_CHECK=0."
)
print(message, flush=True)
logger.info("version check: first-run notice shown")
def _read_cache(redis_client) -> Optional[Dict[str, Any]]:
if redis_client is None:
    return None
    try:
        raw = redis_client.get(CACHE_KEY)
    except Exception as exc:  # noqa: BLE001 — Redis transient errors.
        logger.debug("version check: cache GET failed (%s)", exc, exc_info=True)
        return None
    if raw is None:
        return None
    try:
        return json.loads(raw.decode("utf-8") if isinstance(raw, bytes) else raw)
    except (ValueError, AttributeError) as exc:
        logger.debug("version check: cache decode failed (%s)", exc, exc_info=True)
        return None


def _write_cache(redis_client, response: Dict[str, Any]) -> None:
    if redis_client is None:
        return
    ttl = _compute_ttl(response)
    try:
        redis_client.setex(CACHE_KEY, ttl, json.dumps(response))
    except Exception as exc:  # noqa: BLE001
        logger.debug("version check: cache SETEX failed (%s)", exc, exc_info=True)


def _compute_ttl(response: Dict[str, Any]) -> int:
    """Cap the cache at 6h but honor a shorter server-specified window."""
    next_after = response.get("next_check_after")
    if isinstance(next_after, (int, float)) and next_after > 0:
        return max(1, min(CACHE_TTL_SECONDS, int(next_after)))
    return CACHE_TTL_SECONDS


def _acquire_lock(redis_client) -> bool:
    try:
        owner = f"{socket.gethostname()}:{os.getpid()}"
        return bool(
            redis_client.set(LOCK_KEY, owner, nx=True, ex=LOCK_TTL_SECONDS)
        )
    except Exception as exc:  # noqa: BLE001
        # Treat a failing Redis the same as "no lock infra" — skip rather
        # than fire without coordination, because Redis outage is
        # usually transient and one missed cycle is harmless.
        logger.debug("version check: lock acquire failed (%s)", exc, exc_info=True)
        return False


def _release_lock(redis_client) -> None:
    try:
        redis_client.delete(LOCK_KEY)
    except Exception as exc:  # noqa: BLE001
        logger.debug("version check: lock release failed (%s)", exc, exc_info=True)


def _fetch(instance_id: str) -> Optional[Dict[str, Any]]:
    version = get_version()
    if version in ("", "unknown"):
        # The endpoint rejects payloads without a valid semver, and the
        # rejection is otherwise logged at DEBUG — invisible under the
        # usual ``-l INFO`` Celery worker start. Surface it loudly so a
        # misconfigured release (missing or unset ``__version__``) is
        # obvious instead of silently disabling the check.
        logger.warning(
            "version check: skipping — get_version() returned %r. "
            "Set __version__ in application/version.py to a valid "
            "version string.",
            version,
        )
        return None
    payload = {
        "version": version,
        "instance_id": instance_id,
        "python_version": platform.python_version(),
        "platform": sys.platform,
        "client": CLIENT_NAME,
    }
    try:
        resp = requests.post(
            ENDPOINT_URL,
            json=payload,
            timeout=REQUEST_TIMEOUT_SECONDS,
        )
    except requests.RequestException as exc:
        logger.debug("version check: request failed (%s)", exc, exc_info=True)
        return None
    if resp.status_code >= 400:
        logger.debug("version check: non-2xx response %s", resp.status_code)
        return None
    try:
        return resp.json()
    except ValueError as exc:
        logger.debug("version check: response decode failed (%s)", exc, exc_info=True)
        return None


def _render_advisories(response: Dict[str, Any]) -> None:
    advisories = response.get("advisories") or []
    if not isinstance(advisories, list):
        return
    current_version = get_version()
    for advisory in advisories:
        if not isinstance(advisory, dict):
            continue
        severity = str(advisory.get("severity", "")).lower()
        advisory_id = advisory.get("id", "UNKNOWN")
        title = advisory.get("title", "")
        url = advisory.get("url", "")
        fixed_in = advisory.get("fixed_in")
        summary = advisory.get(
            "summary",
            f"Your DocsGPT version {current_version} is vulnerable.",
        )
        logger.warning(
            "security advisory %s (severity=%s) affects version %s: %s%s%s",
            advisory_id,
            severity or "unknown",
            current_version,
            title or summary,
            f" — fixed in {fixed_in}" if fixed_in else "",
            f" {url}" if url else "",
        )
        if severity in _HIGH_SEVERITIES:
            _print_console_advisory(
                advisory_id=advisory_id,
                title=title,
                severity=severity,
                summary=summary,
                fixed_in=fixed_in,
                url=url,
            )


def _print_console_advisory(
    *,
    advisory_id: str,
    title: str,
    severity: str,
    summary: str,
    fixed_in: Optional[str],
    url: str,
) -> None:
    color = _ANSI_RED if severity == "critical" else _ANSI_YELLOW
    bar = "=" * 60
    upgrade_line = ""
    if fixed_in and url:
        upgrade_line = f" Upgrade to {fixed_in}+ — {url}"
    elif fixed_in:
        upgrade_line = f" Upgrade to {fixed_in}+"
    elif url:
        upgrade_line = f" {url}"
    lines = [
        bar,
        f"\u26a0 SECURITY ADVISORY: {advisory_id}",
        f" {summary}",
        f" {title} (severity: {severity})" if title else f" severity: {severity}",
    ]
    if upgrade_line:
        lines.append(upgrade_line)
    lines.append(bar)
    print(f"{color}{chr(10).join(lines)}{_ANSI_RESET}", flush=True)


@@ -1,9 +1,23 @@
import logging
from functools import cached_property
from application.core.settings import settings
from application.vectorstore.base import BaseVectorStore
from application.vectorstore.document_class import Document


def _lazy_import_pymongo():
    """Lazy import of pymongo so installations that don't use the MongoDB vectorstore don't need it."""
    try:
        import pymongo
    except ImportError as exc:
        raise ImportError(
            "Could not import pymongo python package. "
            "Please install it with `pip install pymongo`."
        ) from exc
    return pymongo


class MongoDBVectorStore(BaseVectorStore):
    def __init__(
        self,
@@ -20,20 +34,23 @@ class MongoDBVectorStore(BaseVectorStore):
        self._embedding_key = embedding_key
        self._embeddings_key = embeddings_key
        self._mongo_uri = settings.MONGO_URI
        self._database_name = database
        self._collection_name = collection
        self._source_id = source_id.replace("application/indexes/", "").rstrip("/")
        self._embedding = self._get_embeddings(settings.EMBEDDINGS_NAME, embeddings_key)
        try:
            import pymongo
        except ImportError:
            raise ImportError(
                "Could not import pymongo python package. "
                "Please install it with `pip install pymongo`."
            )

    @cached_property
    def _client(self):
        pymongo = _lazy_import_pymongo()
        return pymongo.MongoClient(self._mongo_uri)
        self._client = pymongo.MongoClient(self._mongo_uri)
        self._database = self._client[database]
        self._collection = self._database[collection]

    @cached_property
    def _database(self):
        return self._client[self._database_name]

    @cached_property
    def _collection(self):
        return self._database[self._collection_name]

    def search(self, question, k=2, *args, **kwargs):
        query_vector = self._embedding.embed_query(question)
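The refactor above defers both the pymongo import and the connection until first use: `cached_property` runs its body once on first attribute access and memoizes the result on the instance. A minimal self-contained sketch of the same idiom, with toy names rather than the DocsGPT classes:

```python
from functools import cached_property


class LazyClient:
    """Toy illustration of the lazy-connection pattern used above."""

    def __init__(self, uri: str) -> None:
        self._uri = uri  # cheap: nothing is imported or connected here

    @cached_property
    def _client(self) -> dict:
        # Stands in for pymongo.MongoClient(self._uri); evaluated once,
        # then cached on the instance for every later access.
        print(f"connecting to {self._uri}")
        return {"uri": self._uri}


store = LazyClient("mongodb://localhost:27017")  # prints nothing
_ = store._client  # prints "connecting to ..." exactly once
_ = store._client  # cached; no second connect
```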

application/version.py Normal file

@@ -0,0 +1,10 @@
"""DocsGPT backend version string."""
from __future__ import annotations
__version__ = "0.17.0"
def get_version() -> str:
"""Return the DocsGPT backend version."""
return __version__


@@ -104,7 +104,15 @@ To run the DocsGPT backend locally, you'll need to set up a Python environment a
flask --app application/app.py run --host=0.0.0.0 --port=7091
```
This command will launch the backend server, making it accessible on `http://localhost:7091`.
This command will launch the backend server, making it accessible on `http://localhost:7091`. It's the fastest inner-loop option for day-to-day development — the Werkzeug interactive debugger still works and it hot-reloads on source changes. It serves the Flask routes only.
If you need to exercise the full ASGI stack — for example the `/mcp` endpoint (FastMCP server) — or want to match the production runtime, run the ASGI composition under uvicorn instead:
```bash
uvicorn application.asgi:asgi_app --host 0.0.0.0 --port 7091 --reload
```
Production uses `gunicorn -k uvicorn_worker.UvicornWorker` against the same `application.asgi:asgi_app` target.
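For reference, a production-style start command would look like the sketch below; the worker count and bind address are illustrative choices, not values taken from the deployment files:

```bash
gunicorn -k uvicorn_worker.UvicornWorker \
  --workers 2 \
  --bind 0.0.0.0:7091 \
  application.asgi:asgi_app
```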
6. **Start the Celery Worker:**


@@ -1,6 +1,7 @@
export default {
"index": "Home",
"quickstart": "Quickstart",
"upgrading": "Upgrading",
"Deploying": "Deploying",
"Models": "Models",
"Tools": "Tools",


@@ -0,0 +1,66 @@
---
title: Upgrading DocsGPT
description: Upgrade your DocsGPT deployment across Docker Compose, source builds, and Kubernetes.
---
import { Callout } from 'nextra/components'
# Upgrading DocsGPT
<Callout type="warning">
**Upgrading from 0.16.x?** User data moved from MongoDB to Postgres in 0.17.0. Follow the [Postgres Migration guide](/Deploying/Postgres-Migration) before running `docker compose pull` or `git pull` — existing deployments will not start cleanly without it.
</Callout>
## Check your version
```bash
docker compose exec backend python -c "from application.version import get_version; print(get_version())"
```
Release notes live in the [changelog](/changelog); tagged releases are listed on [GitHub releases](https://github.com/arc53/DocsGPT/releases).
## Docker Compose — hub images
```bash
cd DocsGPT/deployment
docker compose -f docker-compose-hub.yaml pull
docker compose -f docker-compose-hub.yaml up -d
```
`pull` fetches the latest image for whichever tag your compose file references. To move to a specific release, edit `image: arc53/docsgpt:<tag>` first.
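For example, a pinned image line would look like this (the service name is illustrative; match it to your compose file):

```yaml
services:
  backend:
    image: arc53/docsgpt:0.17.0  # pinned release instead of a floating tag
```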
## Docker Compose — from source
```bash
cd DocsGPT
git pull
docker compose -f deployment/docker-compose.yaml build
docker compose -f deployment/docker-compose.yaml up -d
```
Swap `git pull` for `git checkout <tag>` if you want to pin a specific release.
## Kubernetes
```bash
kubectl set image deployment/docsgpt-backend backend=arc53/docsgpt:<tag>
kubectl set image deployment/docsgpt-worker worker=arc53/docsgpt:<tag>
kubectl rollout status deployment/docsgpt-backend
kubectl rollout status deployment/docsgpt-worker
```
Full manifests: [Kubernetes deployment guide](/Deploying/Kubernetes-Deploying).
## Migrations
Alembic migrations run on worker startup. To apply manually:
```bash
docker compose exec backend alembic -c application/alembic.ini upgrade head
```
`upgrade head` is idempotent.
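To see which revision a deployment is currently on, the same container and config path work with `alembic current`:

```bash
docker compose exec backend alembic -c application/alembic.ini current
```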
## Rollback
Set the image tag to the previous release and `up -d` again. Schema changes are not reversible without a backup — take one before upgrading any release that mentions migrations in the changelog.
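A minimal backup sketch, assuming a Compose service named `postgres` with default credentials and a database named `docsgpt`; adjust all three to your deployment:

```bash
# -T disables TTY allocation so the SQL dump is not corrupted on the way out.
docker compose exec -T postgres pg_dump -U postgres docsgpt > docsgpt-backup-$(date +%F).sql
```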

docs/package-lock.json generated

File diff suppressed because it is too large.

@@ -9,7 +9,7 @@
"dependencies": {
"@vercel/analytics": "^1.1.1",
"docsgpt-react": "^0.5.1",
"next": "^15.5.9",
"next": "^15.5.15",
"nextra": "^4.6.1",
"nextra-theme-docs": "^4.6.1",
"react": "^18.2.0",


@@ -458,10 +458,11 @@
"dev": true
},
"node_modules/picomatch": {
"version": "2.3.1",
"resolved": "https://registry.npmjs.org/picomatch/-/picomatch-2.3.1.tgz",
"integrity": "sha512-JU3teHTNjmE2VCGFzuY8EXzCDVwEqB2a8fsIvwaStHhAWJEeVd1o1QD80CU6+ZdEXXSLbSsuLwJjkCBWqRQUVA==",
"version": "2.3.2",
"resolved": "https://registry.npmjs.org/picomatch/-/picomatch-2.3.2.tgz",
"integrity": "sha512-V7+vQEJ06Z+c5tSye8S+nHUfI51xoXIXjHQ99cQtKUkQqqO1kO/KCJUfZXuB47h/YBlDhah2H3hdUGXn8ie0oA==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=8.6"
},
@@ -802,10 +803,11 @@
}
},
"node_modules/yaml": {
"version": "1.10.2",
"resolved": "https://registry.npmjs.org/yaml/-/yaml-1.10.2.tgz",
"integrity": "sha512-r3vXyErRCYJ7wg28yvBY5VSoAF8ZvlcW9/BwUzEtUsjvX/DKs24dIkuwjtuprwJJHsbyUbLApepYTR1BN4uHrg==",
"version": "1.10.3",
"resolved": "https://registry.npmjs.org/yaml/-/yaml-1.10.3.tgz",
"integrity": "sha512-vIYeF1u3CjlhAFekPPAk2h/Kv4T3mAkMox5OymRiJQB0spDP10LHvt+K7G9Ny6NuuMAb25/6n1qyUjAcGNf/AA==",
"dev": true,
"license": "ISC",
"engines": {
"node": ">= 6"
}
@@ -1137,9 +1139,9 @@
"dev": true
},
"picomatch": {
"version": "2.3.1",
"resolved": "https://registry.npmjs.org/picomatch/-/picomatch-2.3.1.tgz",
"integrity": "sha512-JU3teHTNjmE2VCGFzuY8EXzCDVwEqB2a8fsIvwaStHhAWJEeVd1o1QD80CU6+ZdEXXSLbSsuLwJjkCBWqRQUVA==",
"version": "2.3.2",
"resolved": "https://registry.npmjs.org/picomatch/-/picomatch-2.3.2.tgz",
"integrity": "sha512-V7+vQEJ06Z+c5tSye8S+nHUfI51xoXIXjHQ99cQtKUkQqqO1kO/KCJUfZXuB47h/YBlDhah2H3hdUGXn8ie0oA==",
"dev": true
},
"pify": {
@@ -1335,9 +1337,9 @@
"dev": true
},
"yaml": {
"version": "1.10.2",
"resolved": "https://registry.npmjs.org/yaml/-/yaml-1.10.2.tgz",
"integrity": "sha512-r3vXyErRCYJ7wg28yvBY5VSoAF8ZvlcW9/BwUzEtUsjvX/DKs24dIkuwjtuprwJJHsbyUbLApepYTR1BN4uHrg==",
"version": "1.10.3",
"resolved": "https://registry.npmjs.org/yaml/-/yaml-1.10.3.tgz",
"integrity": "sha512-vIYeF1u3CjlhAFekPPAk2h/Kv4T3mAkMox5OymRiJQB0spDP10LHvt+K7G9Ny6NuuMAb25/6n1qyUjAcGNf/AA==",
"dev": true
}
}


@@ -19,10 +19,10 @@
"class-variance-authority": "^0.7.0",
"clsx": "^2.1.0",
"dompurify": "^3.1.5",
"flow-bin": "^0.306.0",
"flow-bin": "^0.309.0",
"markdown-it": "^14.1.0",
"react": "^18.2.0",
"react-dom": "^18.2.0",
"react": "^19.2.5",
"react-dom": "^19.2.5",
"styled-components": "^6.1.8"
},
"devDependencies": {
@@ -34,8 +34,8 @@
"@parcel/transformer-typescript-types": "^2.16.4",
"@types/dompurify": "^3.0.5",
"@types/markdown-it": "^14.1.2",
"@types/react": "^18.3.3",
"@types/react-dom": "^18.3.0",
"@types/react": "^19.2.14",
"@types/react-dom": "^19.2.3",
"@typescript-eslint/eslint-plugin": "^8.57.2",
"@typescript-eslint/parser": "^8.57.2",
"babel-loader": "^10.1.1",
@@ -1697,12 +1697,6 @@
"integrity": "sha512-30FAj7/EoJ5mwVPOWhAyCX+FPfMDrVecJAM+Iw9NRoSl4BBAQeqj4cApHHUXOVvIPgLVDsCFoz/hGD+5QQD1GQ==",
"license": "MIT"
},
"node_modules/@emotion/unitless": {
"version": "0.10.0",
"resolved": "https://registry.npmjs.org/@emotion/unitless/-/unitless-0.10.0.tgz",
"integrity": "sha512-dFoMUuQA20zvtVTuxZww6OHoJYgrzfKM1t52mVySDJnMSEa08ruEvdYQbhvyu6soU+NeLVd3yKfTfT0NeV6qGg==",
"license": "MIT"
},
"node_modules/@eslint-community/eslint-utils": {
"version": "4.9.1",
"resolved": "https://registry.npmjs.org/@eslint-community/eslint-utils/-/eslint-utils-4.9.1.tgz",
@@ -4517,37 +4511,26 @@
"dev": true,
"license": "MIT"
},
"node_modules/@types/prop-types": {
"version": "15.7.12",
"resolved": "https://registry.npmjs.org/@types/prop-types/-/prop-types-15.7.12.tgz",
"integrity": "sha512-5zvhXYtRNRluoE/jAp4GVsSduVUzNWKkOZrCDBWYtE7biZywwdC2AcEzg+cSMLFRfVgeAFqpfNabiPjxFddV1Q==",
"dev": true
},
"node_modules/@types/react": {
"version": "18.3.3",
"resolved": "https://registry.npmjs.org/@types/react/-/react-18.3.3.tgz",
"integrity": "sha512-hti/R0pS0q1/xx+TsI73XIqk26eBsISZ2R0wUijXIngRK9R/e7Xw/cXVxQK7R5JjW+SV4zGcn5hXjudkN/pLIw==",
"version": "19.2.14",
"resolved": "https://registry.npmjs.org/@types/react/-/react-19.2.14.tgz",
"integrity": "sha512-ilcTH/UniCkMdtexkoCN0bI7pMcJDvmQFPvuPvmEaYA/NSfFTAgdUSLAoVjaRJm7+6PvcM+q1zYOwS4wTYMF9w==",
"dev": true,
"license": "MIT",
"dependencies": {
"@types/prop-types": "*",
"csstype": "^3.0.2"
"csstype": "^3.2.2"
}
},
"node_modules/@types/react-dom": {
"version": "18.3.0",
"resolved": "https://registry.npmjs.org/@types/react-dom/-/react-dom-18.3.0.tgz",
"integrity": "sha512-EhwApuTmMBmXuFOikhQLIBUn6uFg81SwLMOAUgodJF14SOBOCMdU04gDoYi0WOJJHD144TL32z4yDqCW3dnkQg==",
"version": "19.2.3",
"resolved": "https://registry.npmjs.org/@types/react-dom/-/react-dom-19.2.3.tgz",
"integrity": "sha512-jp2L/eY6fn+KgVVQAOqYItbF0VY/YApe5Mz2F0aykSO8gx31bYCZyvSeYxCHKvzHG5eZjc+zyaS5BrBWya2+kQ==",
"dev": true,
"dependencies": {
"@types/react": "*"
"license": "MIT",
"peerDependencies": {
"@types/react": "^19.2.0"
}
},
"node_modules/@types/stylis": {
"version": "4.2.7",
"resolved": "https://registry.npmjs.org/@types/stylis/-/stylis-4.2.7.tgz",
"integrity": "sha512-VgDNokpBoKF+wrdvhAAfS55OMQpL6QRglwTwNC3kIgBrzZxA4WsFj+2eLfEA/uMUDzBcEhYmjSbwQakn/i3ajA==",
"license": "MIT"
},
"node_modules/@types/trusted-types": {
"version": "2.0.7",
"resolved": "https://registry.npmjs.org/@types/trusted-types/-/trusted-types-2.0.7.tgz",
@@ -5311,22 +5294,15 @@
}
},
"node_modules/class-variance-authority": {
"version": "0.7.0",
"resolved": "https://registry.npmjs.org/class-variance-authority/-/class-variance-authority-0.7.0.tgz",
"integrity": "sha512-jFI8IQw4hczaL4ALINxqLEXQbWcNjoSkloa4IaufXCJr6QawJyw7tuRysRsrE8w2p/4gGaxKIt/hX3qz/IbD1A==",
"version": "0.7.1",
"resolved": "https://registry.npmjs.org/class-variance-authority/-/class-variance-authority-0.7.1.tgz",
"integrity": "sha512-Ka+9Trutv7G8M6WT6SeiRWz792K5qEqIGEGzXKhAE6xOWAY6pPH8U+9IY3oCMv6kqTmLsv7Xh/2w2RigkePMsg==",
"license": "Apache-2.0",
"dependencies": {
"clsx": "2.0.0"
"clsx": "^2.1.1"
},
"funding": {
"url": "https://joebell.co.uk"
}
},
"node_modules/class-variance-authority/node_modules/clsx": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/clsx/-/clsx-2.0.0.tgz",
"integrity": "sha512-rQ1+kcj+ttHG0MKVGBUXwayCCF1oh39BF5COIpRzuCEv8Mwjv0XucrI2ExNTOn9IlLifGClWQcU9BrZORvtw6Q==",
"engines": {
"node": ">=6"
"url": "https://polar.sh/cva"
}
},
"node_modules/clone": {
@@ -5727,9 +5703,9 @@
}
},
"node_modules/dompurify": {
"version": "3.3.3",
"resolved": "https://registry.npmjs.org/dompurify/-/dompurify-3.3.3.tgz",
"integrity": "sha512-Oj6pzI2+RqBfFG+qOaOLbFXLQ90ARpcGG6UePL82bJLtdsa6CYJD7nmiU8MW9nQNOtCHV3lZ/Bzq1X0QYbBZCA==",
"version": "3.4.0",
"resolved": "https://registry.npmjs.org/dompurify/-/dompurify-3.4.0.tgz",
"integrity": "sha512-nolgK9JcaUXMSmW+j1yaSvaEaoXYHwWyGJlkoCTghc97KgGDDSnpoU/PlEnw63Ah+TGKFOyY+X5LnxaWbCSfXg==",
"license": "(MPL-2.0 OR Apache-2.0)",
"optionalDependencies": {
"@types/trusted-types": "^2.0.7"
@@ -6508,9 +6484,9 @@
"license": "ISC"
},
"node_modules/flow-bin": {
"version": "0.306.0",
"resolved": "https://registry.npmjs.org/flow-bin/-/flow-bin-0.306.0.tgz",
"integrity": "sha512-NaAyPsFWZSY59NLCL+6lUXPS5KCO4td9h7XO0kV9VGzRr19sypImQYc0DD1Skm+SHt0/mOIYKI21KtmhQ4KHBg==",
"version": "0.309.0",
"resolved": "https://registry.npmjs.org/flow-bin/-/flow-bin-0.309.0.tgz",
"integrity": "sha512-/RH68gcCY8OHzcdSVTUCw+fhDSEYmNHoovfK0EcbB4rs1Xbc5HhxhHTvr7U+h55De4bDRlE52ghH23MRP625cQ==",
"license": "MIT",
"bin": {
"flow": "cli.js"
@@ -7738,6 +7714,7 @@
"version": "1.4.0",
"resolved": "https://registry.npmjs.org/loose-envify/-/loose-envify-1.4.0.tgz",
"integrity": "sha512-lyuxPGr/Wfhrlem2CL/UcnUc1zcqKAImBDzukY7Y5F/yQiNdko6+fRLevlw1HgMySw7f611UIY408EtxRSoK3Q==",
"dev": true,
"dependencies": {
"js-tokens": "^3.0.0 || ^4.0.0"
},
@@ -7889,24 +7866,6 @@
"node-gyp-build-optional-packages-test": "build-test.js"
}
},
"node_modules/nanoid": {
"version": "3.3.11",
"resolved": "https://registry.npmjs.org/nanoid/-/nanoid-3.3.11.tgz",
"integrity": "sha512-N8SpfPUnUp1bK+PMYW8qSWdl9U+wwNWI4QKxOYDy9JAro3WMX7p2OeVRF9v+347pnakNevPmiHhNmZ2HbFA76w==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/ai"
}
],
"license": "MIT",
"bin": {
"nanoid": "bin/nanoid.cjs"
},
"engines": {
"node": "^10 || ^12 || ^13.7 || ^14 || >=15.0.1"
}
},
"node_modules/natural-compare": {
"version": "1.4.0",
"resolved": "https://registry.npmjs.org/natural-compare/-/natural-compare-1.4.0.tgz",
@@ -8355,34 +8314,6 @@
"node": ">= 0.4"
}
},
"node_modules/postcss": {
"version": "8.4.49",
"resolved": "https://registry.npmjs.org/postcss/-/postcss-8.4.49.tgz",
"integrity": "sha512-OCVPnIObs4N29kxTjzLfUryOkvZEq+pf8jTF0lg8E7uETuWHA+v7j3c/xJmiqpX450191LlmZfUKkXxkTry7nA==",
"funding": [
{
"type": "opencollective",
"url": "https://opencollective.com/postcss/"
},
{
"type": "tidelift",
"url": "https://tidelift.com/funding/github/npm/postcss"
},
{
"type": "github",
"url": "https://github.com/sponsors/ai"
}
],
"license": "MIT",
"dependencies": {
"nanoid": "^3.3.7",
"picocolors": "^1.1.1",
"source-map-js": "^1.2.1"
},
"engines": {
"node": "^10 || ^12 || >=14"
}
},
"node_modules/postcss-value-parser": {
"version": "4.2.0",
"resolved": "https://registry.npmjs.org/postcss-value-parser/-/postcss-value-parser-4.2.0.tgz",
@@ -8468,26 +8399,24 @@
}
},
"node_modules/react": {
"version": "18.3.1",
"resolved": "https://registry.npmjs.org/react/-/react-18.3.1.tgz",
"integrity": "sha512-wS+hAgJShR0KhEvPJArfuPVN1+Hz1t0Y6n5jLrGQbkb4urgPE/0Rve+1kMB1v/oWgHgm4WIcV+i7F2pTVj+2iQ==",
"dependencies": {
"loose-envify": "^1.1.0"
},
"version": "19.2.5",
"resolved": "https://registry.npmjs.org/react/-/react-19.2.5.tgz",
"integrity": "sha512-llUJLzz1zTUBrskt2pwZgLq59AemifIftw4aB7JxOqf1HY2FDaGDxgwpAPVzHU1kdWabH7FauP4i1oEeer2WCA==",
"license": "MIT",
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/react-dom": {
"version": "18.3.1",
"resolved": "https://registry.npmjs.org/react-dom/-/react-dom-18.3.1.tgz",
"integrity": "sha512-5m4nQKp+rZRb09LNH59GM4BxTh9251/ylbKIbpe7TpGxfJ+9kv6BLkLBXIjjspbgbnIBNqlI23tRnTWT0snUIw==",
"version": "19.2.5",
"resolved": "https://registry.npmjs.org/react-dom/-/react-dom-19.2.5.tgz",
"integrity": "sha512-J5bAZz+DXMMwW/wV3xzKke59Af6CHY7G4uYLN1OvBcKEsWOs4pQExj86BBKamxl/Ik5bx9whOrvBlSDfWzgSag==",
"license": "MIT",
"dependencies": {
"loose-envify": "^1.1.0",
"scheduler": "^0.23.2"
"scheduler": "^0.27.0"
},
"peerDependencies": {
"react": "^18.3.1"
"react": "^19.2.5"
}
},
"node_modules/react-is": {
@@ -8732,12 +8661,10 @@
}
},
"node_modules/scheduler": {
"version": "0.23.2",
"resolved": "https://registry.npmjs.org/scheduler/-/scheduler-0.23.2.tgz",
"integrity": "sha512-UOShsPwz7NrMUqhR6t0hWjFduvOzbtv7toDH1/hIrfRNIDBnnBWd0CwJTGvTpngVlmwGCdP9/Zl/tVrDqcuYzQ==",
"dependencies": {
"loose-envify": "^1.1.0"
}
"version": "0.27.0",
"resolved": "https://registry.npmjs.org/scheduler/-/scheduler-0.27.0.tgz",
"integrity": "sha512-eNv+WrVbKu1f3vbYJT/xtiF5syA5HPIMtf9IgY/nKg0sWqzAUEvqY/xm7OcZc/qafLx/iO9FgOmeSAp4v5ti/Q==",
"license": "MIT"
},
"node_modules/semver": {
"version": "6.3.1",
@@ -8796,11 +8723,6 @@
"node": ">= 0.4"
}
},
"node_modules/shallowequal": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/shallowequal/-/shallowequal-1.1.0.tgz",
"integrity": "sha512-y0m1JoUZSlPAjXVtPPW70aZWfIL/dSP7AFkRnniLCrK/8MDKog3TySTBmckD+RObVxH0v4Tox67+F14PdED2oQ=="
},
"node_modules/shebang-command": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/shebang-command/-/shebang-command-2.0.0.tgz",
@@ -8914,6 +8836,7 @@
"version": "1.2.1",
"resolved": "https://registry.npmjs.org/source-map-js/-/source-map-js-1.2.1.tgz",
"integrity": "sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA==",
"dev": true,
"license": "BSD-3-Clause",
"engines": {
"node": ">=0.10.0"
@@ -9045,20 +8968,15 @@
}
},
"node_modules/styled-components": {
"version": "6.3.12",
"resolved": "https://registry.npmjs.org/styled-components/-/styled-components-6.3.12.tgz",
"integrity": "sha512-hFR6xsVkVYbsdcUlzPYFvFfoc6o2KlV0VvgRIQwSYMtdThM7SCxnjX9efh/cWce2kTq16I/Kl3xM98xiLptsXA==",
"version": "6.4.0",
"resolved": "https://registry.npmjs.org/styled-components/-/styled-components-6.4.0.tgz",
"integrity": "sha512-BL1EDFpt+q10eAeZB0q9ps6pSlPejaBQWBkiuM16pyoVTG4NhZrPrZK0cqNbrozxSsYwUsJ9SQYN6NyeKJYX9A==",
"license": "MIT",
"dependencies": {
"@emotion/is-prop-valid": "1.4.0",
"@emotion/unitless": "0.10.0",
"@types/stylis": "4.2.7",
"css-to-react-native": "3.2.0",
"csstype": "3.2.3",
"postcss": "8.4.49",
"shallowequal": "1.1.0",
"stylis": "4.3.6",
"tslib": "2.8.1"
"stylis": "4.3.6"
},
"engines": {
"node": ">= 16"
@@ -9068,12 +8986,20 @@
"url": "https://opencollective.com/styled-components"
},
"peerDependencies": {
"css-to-react-native": ">= 3.2.0",
"react": ">= 16.8.0",
"react-dom": ">= 16.8.0"
"react-dom": ">= 16.8.0",
"react-native": ">= 0.68.0"
},
"peerDependenciesMeta": {
"css-to-react-native": {
"optional": true
},
"react-dom": {
"optional": true
},
"react-native": {
"optional": true
}
}
},


@@ -52,10 +52,10 @@
"class-variance-authority": "^0.7.0",
"clsx": "^2.1.0",
"dompurify": "^3.1.5",
"flow-bin": "^0.306.0",
"flow-bin": "^0.309.0",
"markdown-it": "^14.1.0",
"react": "^18.2.0",
"react-dom": "^18.2.0",
"react": "^19.2.5",
"react-dom": "^19.2.5",
"styled-components": "^6.1.8"
},
"devDependencies": {
@@ -67,8 +67,8 @@
"@parcel/transformer-typescript-types": "^2.16.4",
"@types/dompurify": "^3.0.5",
"@types/markdown-it": "^14.1.2",
"@types/react": "^18.3.3",
"@types/react-dom": "^18.3.0",
"@types/react": "^19.2.14",
"@types/react-dom": "^19.2.3",
"@typescript-eslint/eslint-plugin": "^8.57.2",
"@typescript-eslint/parser": "^8.57.2",
"babel-loader": "^10.1.1",


@@ -142,10 +142,11 @@
}
},
"node_modules/brace-expansion": {
"version": "1.1.11",
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.11.tgz",
"integrity": "sha512-iCuPHDFgrHX7H2vEI/5xpz07zSHB00TpugqhmYtVmMO6518mCuRMoOYFldEBl0g187ufozdaHgWKcYFb61qGiA==",
"version": "1.1.14",
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.14.tgz",
"integrity": "sha512-MWPGfDxnyzKU7rNOW9SP/c50vi3xrmrua/+6hfPbCS2ABNWfx24vPidzvC7krjU/RTo235sV776ymlsMtGKj8g==",
"dev": true,
"license": "MIT",
"dependencies": {
"balanced-match": "^1.0.0",
"concat-map": "0.0.1"
@@ -492,10 +493,11 @@
}
},
"node_modules/minimatch": {
"version": "3.1.2",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.2.tgz",
"integrity": "sha512-J7p63hRiAjw1NDEww1W7i37+ByIrOWO5XQQAzZ3VOcL0PNybwpfmV/N05zFAzwQ9USyEcX6t3UO+K5aqBQOIHw==",
"version": "3.1.5",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.5.tgz",
"integrity": "sha512-VgjWUsnnT6n+NUk6eZq77zeFdpW2LWDzP6zFGrCbHXiYNul5Dzqk2HHQ5uFH2DNW5Xbp8+jVzaeNt94ssEEl4w==",
"dev": true,
"license": "ISC",
"dependencies": {
"brace-expansion": "^1.1.7"
},
@@ -590,10 +592,11 @@
"dev": true
},
"node_modules/picomatch": {
"version": "2.3.1",
"resolved": "https://registry.npmjs.org/picomatch/-/picomatch-2.3.1.tgz",
"integrity": "sha512-JU3teHTNjmE2VCGFzuY8EXzCDVwEqB2a8fsIvwaStHhAWJEeVd1o1QD80CU6+ZdEXXSLbSsuLwJjkCBWqRQUVA==",
"version": "2.3.2",
"resolved": "https://registry.npmjs.org/picomatch/-/picomatch-2.3.2.tgz",
"integrity": "sha512-V7+vQEJ06Z+c5tSye8S+nHUfI51xoXIXjHQ99cQtKUkQqqO1kO/KCJUfZXuB47h/YBlDhah2H3hdUGXn8ie0oA==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=8.6"
},
@@ -990,10 +993,11 @@
"dev": true
},
"node_modules/yaml": {
"version": "1.10.2",
"resolved": "https://registry.npmjs.org/yaml/-/yaml-1.10.2.tgz",
"integrity": "sha512-r3vXyErRCYJ7wg28yvBY5VSoAF8ZvlcW9/BwUzEtUsjvX/DKs24dIkuwjtuprwJJHsbyUbLApepYTR1BN4uHrg==",
"version": "1.10.3",
"resolved": "https://registry.npmjs.org/yaml/-/yaml-1.10.3.tgz",
"integrity": "sha512-vIYeF1u3CjlhAFekPPAk2h/Kv4T3mAkMox5OymRiJQB0spDP10LHvt+K7G9Ny6NuuMAb25/6n1qyUjAcGNf/AA==",
"dev": true,
"license": "ISC",
"engines": {
"node": ">= 6"
}


@@ -20,7 +20,7 @@
"copy-to-clipboard": "^3.3.3",
"i18next": "^26.0.4",
"i18next-browser-languagedetector": "^8.2.1",
"lodash": "^4.17.21",
"lodash": "^4.18.1",
"lucide-react": "^1.8.0",
"mermaid": "^11.14.0",
"prop-types": "^15.8.1",
@@ -33,7 +33,7 @@
"react-i18next": "^17.0.2",
"react-markdown": "^9.0.1",
"react-redux": "^9.2.0",
"react-router-dom": "^7.6.1",
"react-router-dom": "^7.14.1",
"react-syntax-highlighter": "^16.1.1",
"reactflow": "^11.11.4",
"rehype-katex": "^7.0.1",
@@ -44,7 +44,7 @@
"devDependencies": {
"@tailwindcss/postcss": "^4.2.2",
"@types/lodash": "^4.17.20",
"@types/react": "^19.1.8",
"@types/react": "^19.2.14",
"@types/react-dom": "^19.2.3",
"@types/react-syntax-highlighter": "^15.5.13",
"@typescript-eslint/eslint-plugin": "^8.58.2",
@@ -66,7 +66,7 @@
"tailwindcss": "^4.2.2",
"tw-animate-css": "^1.4.0",
"typescript": "^5.8.3",
"vite": "^8.0.0",
"vite": "^8.0.8",
"vite-plugin-svgr": "^4.3.0"
}
},
@@ -418,21 +418,21 @@
"license": "Apache-2.0"
},
"node_modules/@emnapi/core": {
"version": "1.9.0",
"resolved": "https://registry.npmjs.org/@emnapi/core/-/core-1.9.0.tgz",
"integrity": "sha512-0DQ98G9ZQZOxfUcQn1waV2yS8aWdZ6kJMbYCJB3oUBecjWYO1fqJ+a1DRfPF3O5JEkwqwP1A9QEN/9mYm2Yd0w==",
"version": "1.9.2",
"resolved": "https://registry.npmjs.org/@emnapi/core/-/core-1.9.2.tgz",
"integrity": "sha512-UC+ZhH3XtczQYfOlu3lNEkdW/p4dsJ1r/bP7H8+rhao3TTTMO1ATq/4DdIi23XuGoFY+Cz0JmCbdVl0hz9jZcA==",
"dev": true,
"license": "MIT",
"optional": true,
"dependencies": {
"@emnapi/wasi-threads": "1.2.0",
"@emnapi/wasi-threads": "1.2.1",
"tslib": "^2.4.0"
}
},
"node_modules/@emnapi/runtime": {
"version": "1.9.0",
"resolved": "https://registry.npmjs.org/@emnapi/runtime/-/runtime-1.9.0.tgz",
"integrity": "sha512-QN75eB0IH2ywSpRpNddCRfQIhmJYBCJ1x5Lb3IscKAL8bMnVAKnRg8dCoXbHzVLLH7P38N2Z3mtulB7W0J0FKw==",
"version": "1.9.2",
"resolved": "https://registry.npmjs.org/@emnapi/runtime/-/runtime-1.9.2.tgz",
"integrity": "sha512-3U4+MIWHImeyu1wnmVygh5WlgfYDtyf0k8AbLhMFxOipihf6nrWC4syIm/SwEeec0mNSafiiNnMJwbza/Is6Lw==",
"dev": true,
"license": "MIT",
"optional": true,
@@ -441,9 +441,9 @@
}
},
"node_modules/@emnapi/wasi-threads": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/@emnapi/wasi-threads/-/wasi-threads-1.2.0.tgz",
"integrity": "sha512-N10dEJNSsUx41Z6pZsXU8FjPjpBEplgH24sfkmITrBED1/U2Esum9F3lfLrMjKHHjmi557zQn7kR9R+XWXu5Rg==",
"version": "1.2.1",
"resolved": "https://registry.npmjs.org/@emnapi/wasi-threads/-/wasi-threads-1.2.1.tgz",
"integrity": "sha512-uTII7OYF+/Mes/MrcIOYp5yOtSMLBWSIoLPpcgwipoiKbli6k322tcoFsxoIIxPDqW01SQGAgko4EzZi2BNv2w==",
"dev": true,
"license": "MIT",
"optional": true,
@@ -772,36 +772,28 @@
}
},
"node_modules/@napi-rs/wasm-runtime": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/@napi-rs/wasm-runtime/-/wasm-runtime-1.1.1.tgz",
"integrity": "sha512-p64ah1M1ld8xjWv3qbvFwHiFVWrq1yFvV4f7w+mzaqiR4IlSgkqhcRdHwsGgomwzBH51sRY4NEowLxnaBjcW/A==",
"version": "1.1.4",
"resolved": "https://registry.npmjs.org/@napi-rs/wasm-runtime/-/wasm-runtime-1.1.4.tgz",
"integrity": "sha512-3NQNNgA1YSlJb/kMH1ildASP9HW7/7kYnRI2szWJaofaS1hWmbGI4H+d3+22aGzXXN9IJ+n+GiFVcGipJP18ow==",
"dev": true,
"license": "MIT",
"optional": true,
"dependencies": {
"@emnapi/core": "^1.7.1",
"@emnapi/runtime": "^1.7.1",
"@tybys/wasm-util": "^0.10.1"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@oxc-project/runtime": {
"version": "0.115.0",
"resolved": "https://registry.npmjs.org/@oxc-project/runtime/-/runtime-0.115.0.tgz",
"integrity": "sha512-Rg8Wlt5dCbXhQnsXPrkOjL1DTSvXLgb2R/KYfnf1/K+R0k6UMLEmbQXPM+kwrWqSmWA2t0B1EtHy2/3zikQpvQ==",
"dev": true,
"license": "MIT",
"engines": {
"node": "^20.19.0 || >=22.12.0"
},
"peerDependencies": {
"@emnapi/core": "^1.7.1",
"@emnapi/runtime": "^1.7.1"
}
},
"node_modules/@oxc-project/types": {
"version": "0.115.0",
"resolved": "https://registry.npmjs.org/@oxc-project/types/-/types-0.115.0.tgz",
"integrity": "sha512-4n91DKnebUS4yjUHl2g3/b2T+IUdCfmoZGhmwsovZCDaJSs+QkVAM+0AqqTxHSsHfeiMuueT75cZaZcT/m0pSw==",
"version": "0.124.0",
"resolved": "https://registry.npmjs.org/@oxc-project/types/-/types-0.124.0.tgz",
"integrity": "sha512-VBFWMTBvHxS11Z5Lvlr3IWgrwhMTXV+Md+EQF0Xf60+wAdsGFTBx7X7K/hP4pi8N7dcm1RvcHwDxZ16Qx8keUg==",
"dev": true,
"license": "MIT",
"funding": {
@@ -2592,9 +2584,9 @@
}
},
"node_modules/@rolldown/binding-android-arm64": {
"version": "1.0.0-rc.9",
"resolved": "https://registry.npmjs.org/@rolldown/binding-android-arm64/-/binding-android-arm64-1.0.0-rc.9.tgz",
"integrity": "sha512-lcJL0bN5hpgJfSIz/8PIf02irmyL43P+j1pTCfbD1DbLkmGRuFIA4DD3B3ZOvGqG0XiVvRznbKtN0COQVaKUTg==",
"version": "1.0.0-rc.15",
"resolved": "https://registry.npmjs.org/@rolldown/binding-android-arm64/-/binding-android-arm64-1.0.0-rc.15.tgz",
"integrity": "sha512-YYe6aWruPZDtHNpwu7+qAHEMbQ/yRl6atqb/AhznLTnD3UY99Q1jE7ihLSahNWkF4EqRPVC4SiR4O0UkLK02tA==",
"cpu": [
"arm64"
],
@@ -2609,9 +2601,9 @@
}
},
"node_modules/@rolldown/binding-darwin-arm64": {
"version": "1.0.0-rc.9",
"resolved": "https://registry.npmjs.org/@rolldown/binding-darwin-arm64/-/binding-darwin-arm64-1.0.0-rc.9.tgz",
"integrity": "sha512-J7Zk3kLYFsLtuH6U+F4pS2sYVzac0qkjcO5QxHS7OS7yZu2LRs+IXo+uvJ/mvpyUljDJ3LROZPoQfgBIpCMhdQ==",
"version": "1.0.0-rc.15",
"resolved": "https://registry.npmjs.org/@rolldown/binding-darwin-arm64/-/binding-darwin-arm64-1.0.0-rc.15.tgz",
"integrity": "sha512-oArR/ig8wNTPYsXL+Mzhs0oxhxfuHRfG7Ikw7jXsw8mYOtk71W0OkF2VEVh699pdmzjPQsTjlD1JIOoHkLP1Fg==",
"cpu": [
"arm64"
],
@@ -2626,9 +2618,9 @@
}
},
"node_modules/@rolldown/binding-darwin-x64": {
"version": "1.0.0-rc.9",
"resolved": "https://registry.npmjs.org/@rolldown/binding-darwin-x64/-/binding-darwin-x64-1.0.0-rc.9.tgz",
"integrity": "sha512-iwtmmghy8nhfRGeNAIltcNXzD0QMNaaA5U/NyZc1Ia4bxrzFByNMDoppoC+hl7cDiUq5/1CnFthpT9n+UtfFyg==",
"version": "1.0.0-rc.15",
"resolved": "https://registry.npmjs.org/@rolldown/binding-darwin-x64/-/binding-darwin-x64-1.0.0-rc.15.tgz",
"integrity": "sha512-YzeVqOqjPYvUbJSWJ4EDL8ahbmsIXQpgL3JVipmN+MX0XnXMeWomLN3Fb+nwCmP/jfyqte5I3XRSm7OfQrbyxw==",
"cpu": [
"x64"
],
@@ -2643,9 +2635,9 @@
}
},
"node_modules/@rolldown/binding-freebsd-x64": {
"version": "1.0.0-rc.9",
"resolved": "https://registry.npmjs.org/@rolldown/binding-freebsd-x64/-/binding-freebsd-x64-1.0.0-rc.9.tgz",
"integrity": "sha512-DLFYI78SCiZr5VvdEplsVC2Vx53lnA4/Ga5C65iyldMVaErr86aiqCoNBLl92PXPfDtUYjUh+xFFor40ueNs4Q==",
"version": "1.0.0-rc.15",
"resolved": "https://registry.npmjs.org/@rolldown/binding-freebsd-x64/-/binding-freebsd-x64-1.0.0-rc.15.tgz",
"integrity": "sha512-9Erhx956jeQ0nNTyif1+QWAXDRD38ZNjr//bSHrt6wDwB+QkAfl2q6Mn1k6OBPerznjRmbM10lgRb1Pli4xZPw==",
"cpu": [
"x64"
],
@@ -2660,9 +2652,9 @@
}
},
"node_modules/@rolldown/binding-linux-arm-gnueabihf": {
"version": "1.0.0-rc.9",
"resolved": "https://registry.npmjs.org/@rolldown/binding-linux-arm-gnueabihf/-/binding-linux-arm-gnueabihf-1.0.0-rc.9.tgz",
"integrity": "sha512-CsjTmTwd0Hri6iTw/DRMK7kOZ7FwAkrO4h8YWKoX/kcj833e4coqo2wzIFywtch/8Eb5enQ/lwLM7w6JX1W5RQ==",
"version": "1.0.0-rc.15",
"resolved": "https://registry.npmjs.org/@rolldown/binding-linux-arm-gnueabihf/-/binding-linux-arm-gnueabihf-1.0.0-rc.15.tgz",
"integrity": "sha512-cVwk0w8QbZJGTnP/AHQBs5yNwmpgGYStL88t4UIaqcvYJWBfS0s3oqVLZPwsPU6M0zlW4GqjP0Zq5MnAGwFeGA==",
"cpu": [
"arm"
],
@@ -2677,9 +2669,9 @@
}
},
"node_modules/@rolldown/binding-linux-arm64-gnu": {
"version": "1.0.0-rc.9",
"resolved": "https://registry.npmjs.org/@rolldown/binding-linux-arm64-gnu/-/binding-linux-arm64-gnu-1.0.0-rc.9.tgz",
"integrity": "sha512-2x9O2JbSPxpxMDhP9Z74mahAStibTlrBMW0520+epJH5sac7/LwZW5Bmg/E6CXuEF53JJFW509uP+lSedaUNxg==",
"version": "1.0.0-rc.15",
"resolved": "https://registry.npmjs.org/@rolldown/binding-linux-arm64-gnu/-/binding-linux-arm64-gnu-1.0.0-rc.15.tgz",
"integrity": "sha512-eBZ/u8iAK9SoHGanqe/jrPnY0JvBN6iXbVOsbO38mbz+ZJsaobExAm1Iu+rxa4S1l2FjG0qEZn4Rc6X8n+9M+w==",
"cpu": [
"arm64"
],
@@ -2694,9 +2686,9 @@
}
},
"node_modules/@rolldown/binding-linux-arm64-musl": {
"version": "1.0.0-rc.9",
"resolved": "https://registry.npmjs.org/@rolldown/binding-linux-arm64-musl/-/binding-linux-arm64-musl-1.0.0-rc.9.tgz",
"integrity": "sha512-JA1QRW31ogheAIRhIg9tjMfsYbglXXYGNPLdPEYrwFxdbkQCAzvpSCSHCDWNl4hTtrol8WeboCSEpjdZK8qrCg==",
"version": "1.0.0-rc.15",
"resolved": "https://registry.npmjs.org/@rolldown/binding-linux-arm64-musl/-/binding-linux-arm64-musl-1.0.0-rc.15.tgz",
"integrity": "sha512-ZvRYMGrAklV9PEkgt4LQM6MjQX2P58HPAuecwYObY2DhS2t35R0I810bKi0wmaYORt6m/2Sm+Z+nFgb0WhXNcQ==",
"cpu": [
"arm64"
],
@@ -2711,9 +2703,9 @@
}
},
"node_modules/@rolldown/binding-linux-ppc64-gnu": {
"version": "1.0.0-rc.9",
"resolved": "https://registry.npmjs.org/@rolldown/binding-linux-ppc64-gnu/-/binding-linux-ppc64-gnu-1.0.0-rc.9.tgz",
"integrity": "sha512-aOKU9dJheda8Kj8Y3w9gnt9QFOO+qKPAl8SWd7JPHP+Cu0EuDAE5wokQubLzIDQWg2myXq2XhTpOVS07qqvT+w==",
"version": "1.0.0-rc.15",
"resolved": "https://registry.npmjs.org/@rolldown/binding-linux-ppc64-gnu/-/binding-linux-ppc64-gnu-1.0.0-rc.15.tgz",
"integrity": "sha512-VDpgGBzgfg5hLg+uBpCLoFG5kVvEyafmfxGUV0UHLcL5irxAK7PKNeC2MwClgk6ZAiNhmo9FLhRYgvMmedLtnQ==",
"cpu": [
"ppc64"
],
@@ -2728,9 +2720,9 @@
}
},
"node_modules/@rolldown/binding-linux-s390x-gnu": {
"version": "1.0.0-rc.9",
"resolved": "https://registry.npmjs.org/@rolldown/binding-linux-s390x-gnu/-/binding-linux-s390x-gnu-1.0.0-rc.9.tgz",
"integrity": "sha512-OalO94fqj7IWRn3VdXWty75jC5dk4C197AWEuMhIpvVv2lw9fiPhud0+bW2ctCxb3YoBZor71QHbY+9/WToadA==",
"version": "1.0.0-rc.15",
"resolved": "https://registry.npmjs.org/@rolldown/binding-linux-s390x-gnu/-/binding-linux-s390x-gnu-1.0.0-rc.15.tgz",
"integrity": "sha512-y1uXY3qQWCzcPgRJATPSOUP4tCemh4uBdY7e3EZbVwCJTY3gLJWnQABgeUetvED+bt1FQ01OeZwvhLS2bpNrAQ==",
"cpu": [
"s390x"
],
@@ -2745,9 +2737,9 @@
}
},
"node_modules/@rolldown/binding-linux-x64-gnu": {
"version": "1.0.0-rc.9",
"resolved": "https://registry.npmjs.org/@rolldown/binding-linux-x64-gnu/-/binding-linux-x64-gnu-1.0.0-rc.9.tgz",
"integrity": "sha512-cVEl1vZtBsBZna3YMjGXNvnYYrOJ7RzuWvZU0ffvJUexWkukMaDuGhUXn0rjnV0ptzGVkvc+vW9Yqy6h8YX4pg==",
"version": "1.0.0-rc.15",
"resolved": "https://registry.npmjs.org/@rolldown/binding-linux-x64-gnu/-/binding-linux-x64-gnu-1.0.0-rc.15.tgz",
"integrity": "sha512-023bTPBod7J3Y/4fzAN6QtpkSABR0rigtrwaP+qSEabUh5zf6ELr9Nc7GujaROuPY3uwdSIXWrvhn1KxOvurWA==",
"cpu": [
"x64"
],
@@ -2762,9 +2754,9 @@
}
},
"node_modules/@rolldown/binding-linux-x64-musl": {
"version": "1.0.0-rc.9",
"resolved": "https://registry.npmjs.org/@rolldown/binding-linux-x64-musl/-/binding-linux-x64-musl-1.0.0-rc.9.tgz",
"integrity": "sha512-UzYnKCIIc4heAKgI4PZ3dfBGUZefGCJ1TPDuLHoCzgrMYPb5Rv6TLFuYtyM4rWyHM7hymNdsg5ik2C+UD9VDbA==",
"version": "1.0.0-rc.15",
"resolved": "https://registry.npmjs.org/@rolldown/binding-linux-x64-musl/-/binding-linux-x64-musl-1.0.0-rc.15.tgz",
"integrity": "sha512-witB2O0/hU4CgfOOKUoeFgQ4GktPi1eEbAhaLAIpgD6+ZnhcPkUtPsoKKHRzmOoWPZue46IThdSgdo4XneOLYw==",
"cpu": [
"x64"
],
@@ -2779,9 +2771,9 @@
}
},
"node_modules/@rolldown/binding-openharmony-arm64": {
"version": "1.0.0-rc.9",
"resolved": "https://registry.npmjs.org/@rolldown/binding-openharmony-arm64/-/binding-openharmony-arm64-1.0.0-rc.9.tgz",
"integrity": "sha512-+6zoiF+RRyf5cdlFQP7nm58mq7+/2PFaY2DNQeD4B87N36JzfF/l9mdBkkmTvSYcYPE8tMh/o3cRlsx1ldLfog==",
"version": "1.0.0-rc.15",
"resolved": "https://registry.npmjs.org/@rolldown/binding-openharmony-arm64/-/binding-openharmony-arm64-1.0.0-rc.15.tgz",
"integrity": "sha512-UCL68NJ0Ud5zRipXZE9dF5PmirzJE4E4BCIOOssEnM7wLDsxjc6Qb0sGDxTNRTP53I6MZpygyCpY8Aa8sPfKPg==",
"cpu": [
"arm64"
],
@@ -2796,9 +2788,9 @@
}
},
"node_modules/@rolldown/binding-wasm32-wasi": {
"version": "1.0.0-rc.9",
"resolved": "https://registry.npmjs.org/@rolldown/binding-wasm32-wasi/-/binding-wasm32-wasi-1.0.0-rc.9.tgz",
"integrity": "sha512-rgFN6sA/dyebil3YTlL2evvi/M+ivhfnyxec7AccTpRPccno/rPoNlqybEZQBkcbZu8Hy+eqNJCqfBR8P7Pg8g==",
"version": "1.0.0-rc.15",
"resolved": "https://registry.npmjs.org/@rolldown/binding-wasm32-wasi/-/binding-wasm32-wasi-1.0.0-rc.15.tgz",
"integrity": "sha512-ApLruZq/ig+nhaE7OJm4lDjayUnOHVUa77zGeqnqZ9pn0ovdVbbNPerVibLXDmWeUZXjIYIT8V3xkT58Rm9u5Q==",
"cpu": [
"wasm32"
],
@@ -2806,16 +2798,18 @@
"license": "MIT",
"optional": true,
"dependencies": {
"@napi-rs/wasm-runtime": "^1.1.1"
"@emnapi/core": "1.9.2",
"@emnapi/runtime": "1.9.2",
"@napi-rs/wasm-runtime": "^1.1.3"
},
"engines": {
"node": ">=14.0.0"
}
},
"node_modules/@rolldown/binding-win32-arm64-msvc": {
"version": "1.0.0-rc.9",
"resolved": "https://registry.npmjs.org/@rolldown/binding-win32-arm64-msvc/-/binding-win32-arm64-msvc-1.0.0-rc.9.tgz",
"integrity": "sha512-lHVNUG/8nlF1IQk1C0Ci574qKYyty2goMiPlRqkC5R+3LkXDkL5Dhx8ytbxq35m+pkHVIvIxviD+TWLdfeuadA==",
"version": "1.0.0-rc.15",
"resolved": "https://registry.npmjs.org/@rolldown/binding-win32-arm64-msvc/-/binding-win32-arm64-msvc-1.0.0-rc.15.tgz",
"integrity": "sha512-KmoUoU7HnN+Si5YWJigfTws1jz1bKBYDQKdbLspz0UaqjjFkddHsqorgiW1mxcAj88lYUE6NC/zJNwT+SloqtA==",
"cpu": [
"arm64"
],
@@ -2830,9 +2824,9 @@
}
},
"node_modules/@rolldown/binding-win32-x64-msvc": {
"version": "1.0.0-rc.9",
"resolved": "https://registry.npmjs.org/@rolldown/binding-win32-x64-msvc/-/binding-win32-x64-msvc-1.0.0-rc.9.tgz",
"integrity": "sha512-G0oA4+w1iY5AGi5HcDTxWsoxF509hrFIPB2rduV5aDqS9FtDg1CAfa7V34qImbjfhIcA8C+RekocJZA96EarwQ==",
"version": "1.0.0-rc.15",
"resolved": "https://registry.npmjs.org/@rolldown/binding-win32-x64-msvc/-/binding-win32-x64-msvc-1.0.0-rc.15.tgz",
"integrity": "sha512-3P2A8L+x75qavWLe/Dll3EYBJLQmtkJN8rfh+U/eR3MqMgL/h98PhYI+JFfXuDPgPeCB7iZAKiqii5vqOvnA0g==",
"cpu": [
"x64"
],
@@ -2876,19 +2870,6 @@
}
}
},
"node_modules/@rollup/pluginutils/node_modules/picomatch": {
"version": "4.0.3",
"resolved": "https://registry.npmjs.org/picomatch/-/picomatch-4.0.3.tgz",
"integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/jonschlinkert"
}
},
"node_modules/@rtsao/scc": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/@rtsao/scc/-/scc-1.1.0.tgz",
@@ -3833,12 +3814,12 @@
"license": "MIT"
},
"node_modules/@types/react": {
"version": "19.2.2",
"resolved": "https://registry.npmjs.org/@types/react/-/react-19.2.2.tgz",
"integrity": "sha512-6mDvHUFSjyT2B2yeNx2nUgMxh9LtOWvkhIU3uePn2I2oyNymUAX1NIsdgviM4CH+JSrp2D2hsMvJOkxY+0wNRA==",
"version": "19.2.14",
"resolved": "https://registry.npmjs.org/@types/react/-/react-19.2.14.tgz",
"integrity": "sha512-ilcTH/UniCkMdtexkoCN0bI7pMcJDvmQFPvuPvmEaYA/NSfFTAgdUSLAoVjaRJm7+6PvcM+q1zYOwS4wTYMF9w==",
"license": "MIT",
"dependencies": {
"csstype": "^3.0.2"
"csstype": "^3.2.2"
}
},
"node_modules/@types/react-dom": {
@@ -4981,9 +4962,9 @@
}
},
"node_modules/csstype": {
"version": "3.1.3",
"resolved": "https://registry.npmjs.org/csstype/-/csstype-3.1.3.tgz",
"integrity": "sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw==",
"version": "3.2.3",
"resolved": "https://registry.npmjs.org/csstype/-/csstype-3.2.3.tgz",
"integrity": "sha512-z1HGKcYy2xA8AGQfwrn0PAy+PB7X/GSj3UVJW9qKyn43xWa+gl5nXmU4qqLMRzWVLFC8KusUX8T/0kCiOYpAIQ==",
"license": "MIT"
},
"node_modules/cytoscape": {
@@ -5675,9 +5656,9 @@
}
},
"node_modules/dompurify": {
"version": "3.3.3",
"resolved": "https://registry.npmjs.org/dompurify/-/dompurify-3.3.3.tgz",
"integrity": "sha512-Oj6pzI2+RqBfFG+qOaOLbFXLQ90ARpcGG6UePL82bJLtdsa6CYJD7nmiU8MW9nQNOtCHV3lZ/Bzq1X0QYbBZCA==",
"version": "3.4.0",
"resolved": "https://registry.npmjs.org/dompurify/-/dompurify-3.4.0.tgz",
"integrity": "sha512-nolgK9JcaUXMSmW+j1yaSvaEaoXYHwWyGJlkoCTghc97KgGDDSnpoU/PlEnw63Ah+TGKFOyY+X5LnxaWbCSfXg==",
"license": "(MPL-2.0 OR Apache-2.0)",
"optionalDependencies": {
"@types/trusted-types": "^2.0.7"
@@ -8317,19 +8298,6 @@
"url": "https://opencollective.com/lint-staged"
}
},
"node_modules/lint-staged/node_modules/picomatch": {
"version": "4.0.3",
"resolved": "https://registry.npmjs.org/picomatch/-/picomatch-4.0.3.tgz",
"integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/jonschlinkert"
}
},
"node_modules/listr2": {
"version": "9.0.5",
"resolved": "https://registry.npmjs.org/listr2/-/listr2-9.0.5.tgz",
@@ -8382,9 +8350,9 @@
}
},
"node_modules/lodash": {
"version": "4.17.23",
"resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.23.tgz",
"integrity": "sha512-LgVTMpQtIopCi79SJeDiP0TfWi5CNEc/L/aRdTh3yIvmZXTnheWpKjSZhnvMl8iXbC1tFg9gdHHDMLoV7CnG+w==",
"version": "4.18.1",
"resolved": "https://registry.npmjs.org/lodash/-/lodash-4.18.1.tgz",
"integrity": "sha512-dMInicTPVE8d1e5otfwmmjlxkZoUpiVLwyeTdUsi/Caj/gfzzblBcCE5sRHV/AsjuCmxWrte2TNGSYuCeCq+0Q==",
"license": "MIT"
},
"node_modules/lodash-es": {
@@ -9932,6 +9900,19 @@
"dev": true,
"license": "ISC"
},
"node_modules/picomatch": {
"version": "4.0.4",
"resolved": "https://registry.npmjs.org/picomatch/-/picomatch-4.0.4.tgz",
"integrity": "sha512-QP88BAKvMam/3NxH6vj2o21R6MjxZUAd6nlwAS/pnGvN9IVLocLHxGYIzFhg6fUQ+5th6P4dv4eW9jX3DSIj7A==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/jonschlinkert"
}
},
"node_modules/pkg-types": {
"version": "2.3.0",
"resolved": "https://registry.npmjs.org/pkg-types/-/pkg-types-2.3.0.tgz",
@@ -10456,9 +10437,9 @@
}
},
"node_modules/react-router": {
"version": "7.13.1",
"resolved": "https://registry.npmjs.org/react-router/-/react-router-7.13.1.tgz",
"integrity": "sha512-td+xP4X2/6BJvZoX6xw++A2DdEi++YypA69bJUV5oVvqf6/9/9nNlD70YO1e9d3MyamJEBQFEzk6mbfDYbqrSA==",
"version": "7.14.1",
"resolved": "https://registry.npmjs.org/react-router/-/react-router-7.14.1.tgz",
"integrity": "sha512-5BCvFskyAAVumqhEKh/iPhLOIkfxcEUz8WqFIARCkMg8hZZzDYX9CtwxXA0e+qT8zAxmMC0x3Ckb9iMONwc5jg==",
"license": "MIT",
"dependencies": {
"cookie": "^1.0.1",
@@ -10478,12 +10459,12 @@
}
},
"node_modules/react-router-dom": {
"version": "7.13.1",
"resolved": "https://registry.npmjs.org/react-router-dom/-/react-router-dom-7.13.1.tgz",
"integrity": "sha512-UJnV3Rxc5TgUPJt2KJpo1Jpy0OKQr0AjgbZzBFjaPJcFOb2Y8jA5H3LT8HUJAiRLlWrEXWHbF1Z4SCZaQjWDHw==",
"version": "7.14.1",
"resolved": "https://registry.npmjs.org/react-router-dom/-/react-router-dom-7.14.1.tgz",
"integrity": "sha512-ZkrQuwwhGibjQLqH1eCdyiZyLWglPxzxdl5tgwgKEyCSGC76vmAjleGocRe3J/MLfzMUIKwaFJWpFVJhK3d2xA==",
"license": "MIT",
"dependencies": {
"react-router": "7.13.1"
"react-router": "7.14.1"
},
"engines": {
"node": ">=20.0.0"
@@ -10807,14 +10788,14 @@
"license": "Unlicense"
},
"node_modules/rolldown": {
"version": "1.0.0-rc.9",
"resolved": "https://registry.npmjs.org/rolldown/-/rolldown-1.0.0-rc.9.tgz",
"integrity": "sha512-9EbgWge7ZH+yqb4d2EnELAntgPTWbfL8ajiTW+SyhJEC4qhBbkCKbqFV4Ge4zmu5ziQuVbWxb/XwLZ+RIO7E8Q==",
"version": "1.0.0-rc.15",
"resolved": "https://registry.npmjs.org/rolldown/-/rolldown-1.0.0-rc.15.tgz",
"integrity": "sha512-Ff31guA5zT6WjnGp0SXw76X6hzGRk/OQq2hE+1lcDe+lJdHSgnSX6nK3erbONHyCbpSj9a9E+uX/OvytZoWp2g==",
"dev": true,
"license": "MIT",
"dependencies": {
"@oxc-project/types": "=0.115.0",
"@rolldown/pluginutils": "1.0.0-rc.9"
"@oxc-project/types": "=0.124.0",
"@rolldown/pluginutils": "1.0.0-rc.15"
},
"bin": {
"rolldown": "bin/cli.mjs"
@@ -10823,27 +10804,27 @@
"node": "^20.19.0 || >=22.12.0"
},
"optionalDependencies": {
"@rolldown/binding-android-arm64": "1.0.0-rc.9",
"@rolldown/binding-darwin-arm64": "1.0.0-rc.9",
"@rolldown/binding-darwin-x64": "1.0.0-rc.9",
"@rolldown/binding-freebsd-x64": "1.0.0-rc.9",
"@rolldown/binding-linux-arm-gnueabihf": "1.0.0-rc.9",
"@rolldown/binding-linux-arm64-gnu": "1.0.0-rc.9",
"@rolldown/binding-linux-arm64-musl": "1.0.0-rc.9",
"@rolldown/binding-linux-ppc64-gnu": "1.0.0-rc.9",
"@rolldown/binding-linux-s390x-gnu": "1.0.0-rc.9",
"@rolldown/binding-linux-x64-gnu": "1.0.0-rc.9",
"@rolldown/binding-linux-x64-musl": "1.0.0-rc.9",
"@rolldown/binding-openharmony-arm64": "1.0.0-rc.9",
"@rolldown/binding-wasm32-wasi": "1.0.0-rc.9",
"@rolldown/binding-win32-arm64-msvc": "1.0.0-rc.9",
"@rolldown/binding-win32-x64-msvc": "1.0.0-rc.9"
"@rolldown/binding-android-arm64": "1.0.0-rc.15",
"@rolldown/binding-darwin-arm64": "1.0.0-rc.15",
"@rolldown/binding-darwin-x64": "1.0.0-rc.15",
"@rolldown/binding-freebsd-x64": "1.0.0-rc.15",
"@rolldown/binding-linux-arm-gnueabihf": "1.0.0-rc.15",
"@rolldown/binding-linux-arm64-gnu": "1.0.0-rc.15",
"@rolldown/binding-linux-arm64-musl": "1.0.0-rc.15",
"@rolldown/binding-linux-ppc64-gnu": "1.0.0-rc.15",
"@rolldown/binding-linux-s390x-gnu": "1.0.0-rc.15",
"@rolldown/binding-linux-x64-gnu": "1.0.0-rc.15",
"@rolldown/binding-linux-x64-musl": "1.0.0-rc.15",
"@rolldown/binding-openharmony-arm64": "1.0.0-rc.15",
"@rolldown/binding-wasm32-wasi": "1.0.0-rc.15",
"@rolldown/binding-win32-arm64-msvc": "1.0.0-rc.15",
"@rolldown/binding-win32-x64-msvc": "1.0.0-rc.15"
}
},
"node_modules/rolldown/node_modules/@rolldown/pluginutils": {
"version": "1.0.0-rc.9",
"resolved": "https://registry.npmjs.org/@rolldown/pluginutils/-/pluginutils-1.0.0-rc.9.tgz",
"integrity": "sha512-w6oiRWgEBl04QkFZgmW+jnU1EC9b57Oihi2ot3HNWIQRqgHp5PnYDia5iZ5FF7rpa4EQdiqMDXjlqKGXBhsoXw==",
"version": "1.0.0-rc.15",
"resolved": "https://registry.npmjs.org/@rolldown/pluginutils/-/pluginutils-1.0.0-rc.15.tgz",
"integrity": "sha512-UromN0peaE53IaBRe9W7CjrZgXl90fqGpK+mIZbA3qSTeYqg3pqpROBdIPvOG3F5ereDHNwoHBI2e50n1BDr1g==",
"dev": true,
"license": "MIT"
},
@@ -11513,19 +11494,6 @@
}
}
},
"node_modules/tinyglobby/node_modules/picomatch": {
"version": "4.0.3",
"resolved": "https://registry.npmjs.org/picomatch/-/picomatch-4.0.3.tgz",
"integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/jonschlinkert"
}
},
"node_modules/toggle-selection": {
"version": "1.0.6",
"resolved": "https://registry.npmjs.org/toggle-selection/-/toggle-selection-1.0.6.tgz",
@@ -11588,19 +11556,6 @@
"typescript": ">=4.0.0"
}
},
"node_modules/ts-declaration-location/node_modules/picomatch": {
"version": "4.0.3",
"resolved": "https://registry.npmjs.org/picomatch/-/picomatch-4.0.3.tgz",
"integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/jonschlinkert"
}
},
"node_modules/ts-dedent": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/ts-dedent/-/ts-dedent-2.2.0.tgz",
@@ -12046,17 +12001,16 @@
}
},
"node_modules/vite": {
"version": "8.0.0",
"resolved": "https://registry.npmjs.org/vite/-/vite-8.0.0.tgz",
"integrity": "sha512-fPGaRNj9Zytaf8LEiBhY7Z6ijnFKdzU/+mL8EFBaKr7Vw1/FWcTBAMW0wLPJAGMPX38ZPVCVgLceWiEqeoqL2Q==",
"version": "8.0.8",
"resolved": "https://registry.npmjs.org/vite/-/vite-8.0.8.tgz",
"integrity": "sha512-dbU7/iLVa8KZALJyLOBOQ88nOXtNG8vxKuOT4I2mD+Ya70KPceF4IAmDsmU0h1Qsn5bPrvsY9HJstCRh3hG6Uw==",
"dev": true,
"license": "MIT",
"dependencies": {
"@oxc-project/runtime": "0.115.0",
"lightningcss": "^1.32.0",
"picomatch": "^4.0.3",
"picomatch": "^4.0.4",
"postcss": "^8.5.8",
"rolldown": "1.0.0-rc.9",
"rolldown": "1.0.0-rc.15",
"tinyglobby": "^0.2.15"
},
"bin": {
@@ -12073,8 +12027,8 @@
},
"peerDependencies": {
"@types/node": "^20.19.0 || >=22.12.0",
"@vitejs/devtools": "^0.0.0-alpha.31",
"esbuild": "^0.27.0",
"@vitejs/devtools": "^0.1.0",
"esbuild": "^0.27.0 || ^0.28.0",
"jiti": ">=1.21.0",
"less": "^4.0.0",
"sass": "^1.70.0",
@@ -12139,19 +12093,6 @@
"vite": ">=2.6.0"
}
},
"node_modules/vite/node_modules/picomatch": {
"version": "4.0.3",
"resolved": "https://registry.npmjs.org/picomatch/-/picomatch-4.0.3.tgz",
"integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/jonschlinkert"
}
},
"node_modules/void-elements": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/void-elements/-/void-elements-3.1.0.tgz",


@@ -31,7 +31,7 @@
"copy-to-clipboard": "^3.3.3",
"i18next": "^26.0.4",
"i18next-browser-languagedetector": "^8.2.1",
"lodash": "^4.17.21",
"lodash": "^4.18.1",
"lucide-react": "^1.8.0",
"mermaid": "^11.14.0",
"prop-types": "^15.8.1",
@@ -44,7 +44,7 @@
"react-i18next": "^17.0.2",
"react-markdown": "^9.0.1",
"react-redux": "^9.2.0",
"react-router-dom": "^7.6.1",
"react-router-dom": "^7.14.1",
"react-syntax-highlighter": "^16.1.1",
"reactflow": "^11.11.4",
"rehype-katex": "^7.0.1",
@@ -55,7 +55,7 @@
"devDependencies": {
"@tailwindcss/postcss": "^4.2.2",
"@types/lodash": "^4.17.20",
"@types/react": "^19.1.8",
"@types/react": "^19.2.14",
"@types/react-dom": "^19.2.3",
"@types/react-syntax-highlighter": "^15.5.13",
"@typescript-eslint/eslint-plugin": "^8.58.2",
@@ -77,7 +77,7 @@
"tailwindcss": "^4.2.2",
"tw-animate-css": "^1.4.0",
"typescript": "^5.8.3",
"vite": "^8.0.0",
"vite": "^8.0.8",
"vite-plugin-svgr": "^4.3.0"
}
}


@@ -16,6 +16,7 @@ markers =
    unit: Unit tests
    integration: Integration tests
    slow: Slow running tests
asyncio_mode = strict
filterwarnings =
    ignore::DeprecationWarning
    ignore::PendingDeprecationWarning
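Under `asyncio_mode = strict`, pytest-asyncio runs only coroutine tests that carry an explicit marker; unmarked `async def` tests are skipped with a warning instead of silently passing. A minimal conforming test, assuming pytest-asyncio is installed (the test name is hypothetical):

```python
import asyncio

import pytest


@pytest.mark.asyncio  # required in strict mode; "auto" mode would infer it
async def test_awaits_cleanly():
    await asyncio.sleep(0)  # stand-in for awaiting real application code
    assert True
```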

scripts/mock_llm.py Normal file

@@ -0,0 +1,137 @@
"""Mock OpenAI-compatible LLM server for benchmarking.
Fixed 5-second generation (100 tokens × 50 ms/token). No auth. Emits SSE
chunks in OpenAI's chat.completions streaming format, or a single response
when stream=false. Run on 127.0.0.1:8090 — point DocsGPT at it via
OPENAI_BASE_URL=http://127.0.0.1:8090/v1.
"""
import asyncio
import json
import logging
import time
import uuid
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse, StreamingResponse
TOKEN_COUNT = 100
TOKEN_DELAY_S = 0.05 # 100 * 0.05 = 5.0 s
logger = logging.getLogger("mock_llm")
logging.basicConfig(level=logging.INFO, format="%(asctime)s mock: %(message)s")
FILLER_TOKENS = [
"Lorem", " ipsum", " dolor", " sit", " amet", ",", " consectetur",
" adipiscing", " elit", ".", " Sed", " do", " eiusmod", " tempor",
" incididunt", " ut", " labore", " et", " dolore", " magna", " aliqua",
".", " Ut", " enim", " ad", " minim", " veniam", ",", " quis", " nostrud",
" exercitation", " ullamco", " laboris", " nisi", " ut", " aliquip",
" ex", " ea", " commodo", " consequat", ".", " Duis", " aute", " irure",
" dolor", " in", " reprehenderit", " in", " voluptate", " velit",
" esse", " cillum", " dolore", " eu", " fugiat", " nulla", " pariatur",
".", " Excepteur", " sint", " occaecat", " cupidatat", " non", " proident",
",", " sunt", " in", " culpa", " qui", " officia", " deserunt",
" mollit", " anim", " id", " est", " laborum", ".", " Curabitur",
" pretium", " tincidunt", " lacus", ".", " Nulla", " gravida", " orci",
" a", " odio", ".", " Nullam", " varius", ",", " turpis", " et",
" commodo", " pharetra", ",", " est", " eros", " bibendum", " elit",
".",
]
app = FastAPI()
def _token_stream_id() -> str:
return f"chatcmpl-mock-{uuid.uuid4().hex[:12]}"
def _sse_chunk(completion_id: str, model: str, delta: dict, finish_reason=None) -> str:
payload = {
"id": completion_id,
"object": "chat.completion.chunk",
"created": int(time.time()),
"model": model,
"choices": [
{
"index": 0,
"delta": delta,
"finish_reason": finish_reason,
}
],
}
return f"data: {json.dumps(payload)}\n\n"
async def _stream_response(model: str, req_id: str):
completion_id = _token_stream_id()
yield _sse_chunk(completion_id, model, {"role": "assistant", "content": ""})
for i, tok in enumerate(FILLER_TOKENS[:TOKEN_COUNT]):
await asyncio.sleep(TOKEN_DELAY_S)
yield _sse_chunk(completion_id, model, {"content": tok})
yield _sse_chunk(completion_id, model, {}, finish_reason="stop")
yield "data: [DONE]\n\n"
logger.info("[%s] stream done", req_id)
@app.post("/v1/chat/completions")
async def chat_completions(request: Request):
body = await request.json()
model = body.get("model", "mock")
stream = bool(body.get("stream", False))
req_id = uuid.uuid4().hex[:8]
logger.info("[%s] /chat/completions stream=%s model=%s max_tokens=%s", req_id, stream, model, body.get("max_tokens"))
if stream:
return StreamingResponse(
_stream_response(model, req_id),
media_type="text/event-stream",
headers={
"Cache-Control": "no-cache, no-transform",
"X-Accel-Buffering": "no",
},
)
await asyncio.sleep(TOKEN_COUNT * TOKEN_DELAY_S)
logger.info("[%s] non-stream done", req_id)
text = "".join(FILLER_TOKENS[:TOKEN_COUNT])
completion_id = _token_stream_id()
return JSONResponse(
{
"id": completion_id,
"object": "chat.completion",
"created": int(time.time()),
"model": model,
"choices": [
{
"index": 0,
"message": {"role": "assistant", "content": text},
"finish_reason": "stop",
}
],
"usage": {
"prompt_tokens": 10,
"completion_tokens": TOKEN_COUNT,
"total_tokens": 10 + TOKEN_COUNT,
},
}
)
@app.get("/v1/models")
async def list_models():
return {
"object": "list",
"data": [{"id": "mock", "object": "model", "owned_by": "mock"}],
}
@app.get("/health")
async def health():
return {"status": "ok"}
if __name__ == "__main__":
import uvicorn
uvicorn.run(app, host="127.0.0.1", port=8090, log_level="info")
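Assuming the server is running as above, a quick smoke test of the non-streaming path (the handler only reads `model`, `stream`, and `max_tokens`, so any messages payload works):

```bash
curl -s http://127.0.0.1:8090/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "mock", "stream": false, "messages": [{"role": "user", "content": "hi"}]}'
# Returns after ~5 s with the full lorem-ipsum completion and a usage block.
```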


@@ -1,3 +1,17 @@
"""Tests for /api/search route (application/api/answer/routes/search.py).
Retrieval logic lives in ``application/services/search_service.py`` and
has its own unit tests in ``tests/services/test_search_service.py``. The
tests below focus on what the route specifically owns:
* Request validation (400 for missing fields).
* Translation of the service's ``InvalidAPIKey`` / ``SearchFailed``
exceptions to HTTP status codes (401 / 500).
* End-to-end happy path against a real ephemeral Postgres via
``pg_conn``, to catch regressions in the route's wiring to the
service and repositories.
"""
from contextlib import contextmanager
from unittest.mock import MagicMock, patch
@@ -6,254 +20,97 @@ import pytest
@pytest.mark.unit
class TestSearchResourceValidation:
    pass

    def test_returns_error_when_question_missing(self, mock_mongo_db, flask_app):
    def test_returns_400_when_question_missing(self, flask_app):
        from application.api.answer.routes.search import SearchResource

        with flask_app.app_context():
            with flask_app.test_request_context(
                json={"api_key": "test_key"}
            ):
                resource = SearchResource()
                result = resource.post()
        with flask_app.test_request_context(json={"api_key": "test_key"}):
            result = SearchResource().post()
        assert result.status_code == 400
        assert "question" in result.json["error"]

    def test_returns_error_when_api_key_missing(self, mock_mongo_db, flask_app):
    def test_returns_400_when_api_key_missing(self, flask_app):
        from application.api.answer.routes.search import SearchResource

        with flask_app.app_context():
            with flask_app.test_request_context(
                json={"question": "test query"}
            ):
                resource = SearchResource()
                result = resource.post()
        with flask_app.test_request_context(json={"question": "test query"}):
            result = SearchResource().post()
        assert result.status_code == 400
        assert "api_key" in result.json["error"]


@pytest.mark.unit
class TestGetSourcesFromApiKey:
    pass
class TestSearchResourceExceptionMapping:
    """Verify the route maps service exceptions to HTTP status codes.

    def test_returns_source_id_via_patched_method(self, mock_mongo_db, flask_app):
        """Test that _get_sources_from_api_key can return multiple sources via patch."""
    The service function itself is patched; these tests do not care about
    the search logic — only that 401/500/200 are produced correctly from
    the three possible service outcomes.
    """

    def test_invalid_api_key_returns_401(self, flask_app):
        from application.api.answer.routes.search import SearchResource
        from application.services.search_service import InvalidAPIKey

        with flask_app.app_context(), flask_app.test_request_context(
            json={"question": "q", "api_key": "bad"}
        ), patch(
            "application.api.answer.routes.search.search",
            side_effect=InvalidAPIKey(),
        ):
            result = SearchResource().post()
        assert result.status_code == 401
        assert result.json == {"error": "Invalid API key"}

    def test_search_failed_returns_500(self, flask_app):
        from application.api.answer.routes.search import SearchResource
        from application.services.search_service import SearchFailed

        with flask_app.app_context(), flask_app.test_request_context(
            json={"question": "q", "api_key": "k"}
        ), patch(
            "application.api.answer.routes.search.search",
            side_effect=SearchFailed("boom"),
        ):
            result = SearchResource().post()
        assert result.status_code == 500
        assert result.json == {"error": "Search failed"}

    def test_happy_path_passes_service_result_through(self, flask_app):
        from application.api.answer.routes.search import SearchResource

        with flask_app.app_context():
            resource = SearchResource()
        hits = [{"text": "t", "title": "T", "source": "s"}]
        with flask_app.app_context(), flask_app.test_request_context(
            json={"question": "q", "api_key": "k", "chunks": 7}
        ), patch(
            "application.api.answer.routes.search.search",
            return_value=hits,
        ) as mock_search:
            result = SearchResource().post()
        assert result.status_code == 200
        assert result.json == hits
        mock_search.assert_called_once_with("k", "q", 7)
            with patch.object(resource, "_get_sources_from_api_key", return_value=["src1", "src2"]):
                result = resource._get_sources_from_api_key("any_key")
                assert len(result) == 2
                assert "src1" in result
                assert "src2" in result


@pytest.mark.unit
class TestSearchVectorstores:
    pass

    def test_returns_empty_when_no_source_ids(self, mock_mongo_db, flask_app):
    def test_default_chunks_is_5(self, flask_app):
        from application.api.answer.routes.search import SearchResource

        with flask_app.app_context():
            resource = SearchResource()
            result = resource._search_vectorstores("test query", [], 5)
            assert result == []

    def test_skips_empty_source_ids(self, mock_mongo_db, flask_app):
        from application.api.answer.routes.search import SearchResource

        with flask_app.app_context():
            resource = SearchResource()
            with patch(
                "application.api.answer.routes.search.VectorCreator.create_vectorstore"
            ) as mock_create:
                mock_vectorstore = MagicMock()
                mock_vectorstore.search.return_value = []
                mock_create.return_value = mock_vectorstore
                result = resource._search_vectorstores("test query", ["", " "], 5)
                mock_create.assert_not_called()
                assert result == []

    def test_returns_search_results(self, mock_mongo_db, flask_app):
        from application.api.answer.routes.search import SearchResource

        with flask_app.app_context():
            resource = SearchResource()
            mock_doc = {
                "text": "Test content",
                "page_content": "Test content",
                "metadata": {
                    "title": "Test Title",
                    "source": "/path/to/doc",
                },
            }
            with patch(
                "application.api.answer.routes.search.VectorCreator.create_vectorstore"
            ) as mock_create:
                mock_vectorstore = MagicMock()
                mock_vectorstore.search.return_value = [mock_doc]
                mock_create.return_value = mock_vectorstore
                result = resource._search_vectorstores("test query", ["source_id"], 5)
                assert len(result) == 1
                assert result[0]["text"] == "Test content"
                assert result[0]["title"] == "Test Title"
                assert result[0]["source"] == "/path/to/doc"

    def test_handles_langchain_document_format(self, mock_mongo_db, flask_app):
        from application.api.answer.routes.search import SearchResource

        with flask_app.app_context():
            resource = SearchResource()
            mock_doc = MagicMock()
            mock_doc.page_content = "Langchain content"
            mock_doc.metadata = {"title": "LC Title", "source": "/lc/path"}
            with patch(
                "application.api.answer.routes.search.VectorCreator.create_vectorstore"
            ) as mock_create:
                mock_vectorstore = MagicMock()
                mock_vectorstore.search.return_value = [mock_doc]
                mock_create.return_value = mock_vectorstore
                result = resource._search_vectorstores("test query", ["source_id"], 5)
                assert len(result) == 1
                assert result[0]["text"] == "Langchain content"
                assert result[0]["title"] == "LC Title"

    def test_respects_chunks_limit(self, mock_mongo_db, flask_app):
        from application.api.answer.routes.search import SearchResource

        with flask_app.app_context():
            resource = SearchResource()
            mock_docs = [
                {"text": f"Content {i}", "metadata": {"title": f"Title {i}"}}
                for i in range(10)
            ]
            with patch(
                "application.api.answer.routes.search.VectorCreator.create_vectorstore"
            ) as mock_create:
                mock_vectorstore = MagicMock()
                mock_vectorstore.search.return_value = mock_docs
                mock_create.return_value = mock_vectorstore
                result = resource._search_vectorstores("test query", ["source_id"], 3)
                assert len(result) == 3

    def test_deduplicates_results(self, mock_mongo_db, flask_app):
        from application.api.answer.routes.search import SearchResource

        with flask_app.app_context():
            resource = SearchResource()
            duplicate_text = "Duplicate content " * 20
            mock_docs = [
                {"text": duplicate_text, "metadata": {"title": "Title 1"}},
                {"text": duplicate_text, "metadata": {"title": "Title 2"}},
                {"text": "Unique content", "metadata": {"title": "Title 3"}},
            ]
            with patch(
                "application.api.answer.routes.search.VectorCreator.create_vectorstore"
            ) as mock_create:
                mock_vectorstore = MagicMock()
                mock_vectorstore.search.return_value = mock_docs
                mock_create.return_value = mock_vectorstore
                result = resource._search_vectorstores("test query", ["source_id"], 5)
                assert len(result) == 2

    def test_handles_vectorstore_error_gracefully(self, mock_mongo_db, flask_app):
        from application.api.answer.routes.search import SearchResource

        with flask_app.app_context():
            resource = SearchResource()
            with patch(
                "application.api.answer.routes.search.VectorCreator.create_vectorstore"
            ) as mock_create:
                mock_create.side_effect = Exception("Vectorstore error")
                result = resource._search_vectorstores("test query", ["source_id"], 5)
                assert result == []

    def test_uses_filename_as_title_fallback(self, mock_mongo_db, flask_app):
        from application.api.answer.routes.search import SearchResource

        with flask_app.app_context():
            resource = SearchResource()
            mock_doc = {
                "text": "Content without title",
                "metadata": {"filename": "document.pdf"},
            }
            with patch(
                "application.api.answer.routes.search.VectorCreator.create_vectorstore"
            ) as mock_create:
                mock_vectorstore = MagicMock()
                mock_vectorstore.search.return_value = [mock_doc]
                mock_create.return_value = mock_vectorstore
                result = resource._search_vectorstores("test query", ["source_id"], 5)
                assert result[0]["title"] == "document.pdf"

    def test_uses_content_snippet_as_title_last_resort(self, mock_mongo_db, flask_app):
from application.api.answer.routes.search import SearchResource
with flask_app.app_context():
resource = SearchResource()
mock_doc = {
"text": "Content without any title metadata at all",
"metadata": {},
}
with patch(
"application.api.answer.routes.search.VectorCreator.create_vectorstore"
) as mock_create:
mock_vectorstore = MagicMock()
mock_vectorstore.search.return_value = [mock_doc]
mock_create.return_value = mock_vectorstore
result = resource._search_vectorstores("test query", ["source_id"], 5)
assert "Content without any title" in result[0]["title"]
assert result[0]["title"].endswith("...")
@pytest.mark.unit
class TestSearchEndpoint:
    def test_default_chunks_is_5(self, flask_app):
        from application.api.answer.routes.search import SearchResource
        with flask_app.app_context(), flask_app.test_request_context(
            json={"question": "q", "api_key": "k"}  # no chunks field
        ), patch(
            "application.api.answer.routes.search.search",
            return_value=[],
        ) as mock_search:
            SearchResource().post()
            mock_search.assert_called_once_with("k", "q", 5)
# ---------------------------------------------------------------------------
# Real-PG tests for SearchResource.
# End-to-end against a real ephemeral Postgres.
#
# These exercise the full route → service → repository → DB path, patching
# only ``VectorCreator.create_vectorstore`` (so we don't need real embeddings
# or a vector index). ``db_readonly`` is redirected at the *service* module
# since that's where the import now lives.
# ---------------------------------------------------------------------------
@@ -264,7 +121,7 @@ def _patch_search_db(conn):
yield conn
with patch(
"application.api.answer.routes.search.db_readonly", _yield
"application.services.search_service.db_readonly", _yield
):
yield
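# For orientation, the full helper after this change presumably reads as
# the sketch below -- a reconstruction from the hunk fragments above, not
# the verbatim source:
from contextlib import contextmanager
from unittest.mock import patch


@contextmanager
def _patch_search_db(conn):
    """Hand the service's ``db_readonly`` an already-open test connection."""

    @contextmanager
    def _yield():
        # Yield the test's ephemeral-Postgres connection instead of
        # opening a real pooled one.
        yield conn

    with patch(
        "application.services.search_service.db_readonly", _yield
    ):
        yield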
@@ -298,9 +155,7 @@ class TestSearchResourcePgConn:
def test_search_returns_results(self, pg_conn, flask_app):
from application.api.answer.routes.search import SearchResource
from application.storage.db.repositories.agents import AgentsRepository
-from application.storage.db.repositories.sources import (
-    SourcesRepository,
-)
+from application.storage.db.repositories.sources import SourcesRepository
src = SourcesRepository(pg_conn).create("src", user_id="u")
AgentsRepository(pg_conn).create(
@@ -315,7 +170,7 @@ class TestSearchResourcePgConn:
]
with _patch_search_db(pg_conn), patch(
"application.api.answer.routes.search.VectorCreator.create_vectorstore",
"application.services.search_service.VectorCreator.create_vectorstore",
return_value=fake_vs,
), flask_app.app_context():
with flask_app.test_request_context(
@@ -328,9 +183,7 @@ class TestSearchResourcePgConn:
def test_search_uses_extra_source_ids(self, pg_conn, flask_app):
from application.api.answer.routes.search import SearchResource
from application.storage.db.repositories.agents import AgentsRepository
-from application.storage.db.repositories.sources import (
-    SourcesRepository,
-)
+from application.storage.db.repositories.sources import SourcesRepository
src1 = SourcesRepository(pg_conn).create("s1", user_id="u")
src2 = SourcesRepository(pg_conn).create("s2", user_id="u")
@@ -345,7 +198,7 @@ class TestSearchResourcePgConn:
{"text": "one", "metadata": {"title": "A"}},
]
with _patch_search_db(pg_conn), patch(
"application.api.answer.routes.search.VectorCreator.create_vectorstore",
"application.services.search_service.VectorCreator.create_vectorstore",
return_value=fake_vs,
), flask_app.app_context():
with flask_app.test_request_context(
@@ -353,71 +206,3 @@ class TestSearchResourcePgConn:
):
result = SearchResource().post()
assert result.status_code == 200
def test_search_exception_returns_500(self, pg_conn, flask_app):
from application.api.answer.routes.search import SearchResource
from application.storage.db.repositories.agents import AgentsRepository
from application.storage.db.repositories.sources import (
SourcesRepository,
)
src = SourcesRepository(pg_conn).create("src", user_id="u")
AgentsRepository(pg_conn).create(
"u", "a", "published",
key="err-key",
source_id=str(src["id"]),
)
with _patch_search_db(pg_conn), patch(
"application.api.answer.routes.search.SearchResource._get_sources_from_api_key",
side_effect=RuntimeError("boom"),
), flask_app.app_context():
with flask_app.test_request_context(
json={"question": "q", "api_key": "err-key"},
):
result = SearchResource().post()
assert result.status_code == 500
class TestGetSourcesFromApiKeyPg:
def test_empty_for_unknown_key(self, pg_conn, flask_app):
from application.api.answer.routes.search import SearchResource
with _patch_search_db(pg_conn), flask_app.app_context():
got = SearchResource()._get_sources_from_api_key("nope")
assert got == []
def test_returns_extra_source_ids(self, pg_conn, flask_app):
from application.api.answer.routes.search import SearchResource
from application.storage.db.repositories.agents import AgentsRepository
from application.storage.db.repositories.sources import (
SourcesRepository,
)
src = SourcesRepository(pg_conn).create("s", user_id="u")
AgentsRepository(pg_conn).create(
"u", "a", "published",
key="sources-key",
extra_source_ids=[str(src["id"])],
)
with _patch_search_db(pg_conn), flask_app.app_context():
got = SearchResource()._get_sources_from_api_key("sources-key")
assert got == [str(src["id"])]
def test_falls_back_to_single_source(self, pg_conn, flask_app):
from application.api.answer.routes.search import SearchResource
from application.storage.db.repositories.agents import AgentsRepository
from application.storage.db.repositories.sources import (
SourcesRepository,
)
src = SourcesRepository(pg_conn).create("s", user_id="u")
AgentsRepository(pg_conn).create(
"u", "a", "published",
key="single-key",
source_id=str(src["id"]),
)
with _patch_search_db(pg_conn), flask_app.app_context():
got = SearchResource()._get_sources_from_api_key("single-key")
assert got == [str(src["id"])]


@@ -1,4 +1,5 @@
pytest>=8.0.0
+pytest-asyncio>=0.23
pytest-cov>=4.1.0
coverage>=7.4.0
pytest-postgresql>=6.0.0


@@ -0,0 +1,134 @@
"""Tests for application/mcp_server.py.
The server module exposes one FastMCP tool, ``search_docs``, that reads
the caller's ``Authorization: Bearer <key>`` header via
``get_http_headers()`` and delegates to
``application.services.search_service.search``. These tests exercise
the tool directly by patching ``get_http_headers`` and ``search``; the
full HTTP-layer plumbing (mount, lifespan, session handshake) is
covered by ``tests/test_asgi.py``.
"""
from unittest.mock import patch
import pytest
@pytest.mark.unit
class TestSearchDocsTool:
@pytest.mark.asyncio
async def test_missing_bearer_raises_permission_error(self):
from application.mcp_server import search_docs
with patch(
"application.mcp_server.get_http_headers", return_value={}
):
with pytest.raises(PermissionError):
await search_docs(query="hi")
@pytest.mark.asyncio
async def test_non_bearer_header_raises_permission_error(self):
from application.mcp_server import search_docs
with patch(
"application.mcp_server.get_http_headers",
return_value={"authorization": "Basic dXNlcjpwYXNz"},
):
with pytest.raises(PermissionError):
await search_docs(query="hi")
@pytest.mark.asyncio
async def test_blank_bearer_token_raises_permission_error(self):
from application.mcp_server import search_docs
with patch(
"application.mcp_server.get_http_headers",
return_value={"authorization": "Bearer "},
):
with pytest.raises(PermissionError):
await search_docs(query="hi")
@pytest.mark.asyncio
async def test_invalid_api_key_raises_permission_error(self):
from application.mcp_server import search_docs
from application.services.search_service import InvalidAPIKey
with (
patch(
"application.mcp_server.get_http_headers",
return_value={"authorization": "Bearer bogus"},
),
patch(
"application.mcp_server.search", side_effect=InvalidAPIKey()
),
):
with pytest.raises(PermissionError):
await search_docs(query="hi")
@pytest.mark.asyncio
async def test_search_failed_bubbles_up(self):
from application.mcp_server import search_docs
from application.services.search_service import SearchFailed
with (
patch(
"application.mcp_server.get_http_headers",
return_value={"authorization": "Bearer k"},
),
patch(
"application.mcp_server.search",
side_effect=SearchFailed("boom"),
),
):
with pytest.raises(SearchFailed):
await search_docs(query="hi")
@pytest.mark.asyncio
async def test_happy_path_passes_args_and_returns_hits(self):
from application.mcp_server import search_docs
hits = [{"text": "t", "title": "T", "source": "s"}]
with (
patch(
"application.mcp_server.get_http_headers",
return_value={"authorization": "Bearer the-key"},
),
patch(
"application.mcp_server.search", return_value=hits
) as mock_search,
):
out = await search_docs(query="q", chunks=7)
assert out == hits
mock_search.assert_called_once_with("the-key", "q", 7)
@pytest.mark.asyncio
async def test_default_chunks_is_5(self):
from application.mcp_server import search_docs
with (
patch(
"application.mcp_server.get_http_headers",
return_value={"authorization": "Bearer k"},
),
patch(
"application.mcp_server.search", return_value=[]
) as mock_search,
):
await search_docs(query="q")
mock_search.assert_called_once_with("k", "q", 5)
@pytest.mark.asyncio
async def test_bearer_scheme_case_insensitive(self):
from application.mcp_server import search_docs
with (
patch(
"application.mcp_server.get_http_headers",
return_value={"authorization": "bearer lowercase-scheme"},
),
patch(
"application.mcp_server.search", return_value=[]
) as mock_search,
):
await search_docs(query="q")
mock_search.assert_called_once_with("lowercase-scheme", "q", 5)


@@ -0,0 +1,230 @@
"""Unit tests for application/services/search_service.py.
Tests exercise the service function in isolation — AgentsRepository is
stubbed via a patched ``db_readonly`` context manager, and
``VectorCreator.create_vectorstore`` is patched to return a fake
vectorstore. No Flask app context, no real DB, no real embeddings.
"""
from contextlib import contextmanager
from unittest.mock import MagicMock, patch
import pytest
from application.services.search_service import (
InvalidAPIKey,
SearchFailed,
_collect_source_ids,
search,
)
@contextmanager
def _fake_db_readonly(agent_data):
"""Patch ``db_readonly`` so ``AgentsRepository.find_by_key`` returns ``agent_data``."""
agents_repo = MagicMock()
agents_repo.find_by_key.return_value = agent_data
@contextmanager
def _yield_conn():
yield MagicMock()
with patch(
"application.services.search_service.db_readonly", _yield_conn
), patch(
"application.services.search_service.AgentsRepository",
return_value=agents_repo,
):
yield
@pytest.mark.unit
class TestCollectSourceIds:
def test_empty_when_no_sources(self):
assert _collect_source_ids({}) == []
def test_returns_extra_source_ids(self):
agent = {"extra_source_ids": ["s1", "s2"], "source_id": "legacy"}
assert _collect_source_ids(agent) == ["s1", "s2"]
def test_falls_back_to_single_source_id(self):
agent = {"extra_source_ids": [], "source_id": "s1"}
assert _collect_source_ids(agent) == ["s1"]
def test_skips_empty_entries_in_extra(self):
agent = {"extra_source_ids": ["", None, "s1"], "source_id": "fallback"}
assert _collect_source_ids(agent) == ["s1"]
@pytest.mark.unit
class TestSearchInvalidAPIKey:
def test_raises_when_key_unknown(self):
with _fake_db_readonly(None):
with pytest.raises(InvalidAPIKey):
search("does-not-exist", "hello", 5)
def test_raises_search_failed_on_db_error(self):
@contextmanager
def _yield_conn():
yield MagicMock()
agents_repo = MagicMock()
agents_repo.find_by_key.side_effect = RuntimeError("db down")
with patch(
"application.services.search_service.db_readonly", _yield_conn
), patch(
"application.services.search_service.AgentsRepository",
return_value=agents_repo,
):
with pytest.raises(SearchFailed):
search("any-key", "hello", 5)
@pytest.mark.unit
class TestSearchEmptyWhenNoSources:
def test_returns_empty_when_agent_has_no_sources(self):
with _fake_db_readonly({"extra_source_ids": [], "source_id": None}):
assert search("k", "q", 5) == []
def test_returns_empty_for_zero_chunks_without_db_lookup(self):
with patch("application.services.search_service.db_readonly") as mock_db:
assert search("k", "q", 0) == []
mock_db.assert_not_called()
def test_returns_empty_for_negative_chunks_without_db_lookup(self):
with patch("application.services.search_service.db_readonly") as mock_db:
assert search("k", "q", -1) == []
mock_db.assert_not_called()
@pytest.mark.unit
class TestSearchResults:
def test_returns_hit_shape(self):
agent = {"source_id": "src-1", "extra_source_ids": []}
fake_vs = MagicMock()
fake_vs.search.return_value = [
{
"text": "Test content",
"metadata": {"title": "Test Title", "source": "/path/to/doc"},
}
]
with _fake_db_readonly(agent), patch(
"application.services.search_service.VectorCreator.create_vectorstore",
return_value=fake_vs,
):
results = search("k", "q", 5)
assert results == [
{"text": "Test content", "title": "Test Title", "source": "/path/to/doc"}
]
def test_handles_langchain_document_format(self):
agent = {"source_id": "src-1", "extra_source_ids": []}
lc_doc = MagicMock()
lc_doc.page_content = "Langchain content"
lc_doc.metadata = {"title": "LC Title", "source": "/lc/path"}
fake_vs = MagicMock()
fake_vs.search.return_value = [lc_doc]
with _fake_db_readonly(agent), patch(
"application.services.search_service.VectorCreator.create_vectorstore",
return_value=fake_vs,
):
results = search("k", "q", 5)
assert len(results) == 1
assert results[0]["text"] == "Langchain content"
assert results[0]["title"] == "LC Title"
def test_respects_chunks_cap(self):
agent = {"source_id": "src-1", "extra_source_ids": []}
docs = [
{"text": f"Content {i}", "metadata": {"title": f"T{i}"}}
for i in range(10)
]
fake_vs = MagicMock()
fake_vs.search.return_value = docs
with _fake_db_readonly(agent), patch(
"application.services.search_service.VectorCreator.create_vectorstore",
return_value=fake_vs,
):
results = search("k", "q", 3)
assert len(results) == 3
def test_deduplicates_results_by_content_prefix(self):
agent = {"source_id": "src-1", "extra_source_ids": []}
dup_text = "Duplicate content " * 20
docs = [
{"text": dup_text, "metadata": {"title": "T1"}},
{"text": dup_text, "metadata": {"title": "T2"}},
{"text": "Unique content", "metadata": {"title": "T3"}},
]
fake_vs = MagicMock()
fake_vs.search.return_value = docs
with _fake_db_readonly(agent), patch(
"application.services.search_service.VectorCreator.create_vectorstore",
return_value=fake_vs,
):
results = search("k", "q", 5)
assert len(results) == 2
def test_skips_broken_source_and_returns_from_healthy_ones(self):
# Two sources — the first raises, the second returns a doc. The
# caller should still get the healthy source's result.
agent = {"extra_source_ids": ["broken", "ok"], "source_id": None}
healthy_vs = MagicMock()
healthy_vs.search.return_value = [
{"text": "ok content", "metadata": {"title": "Ok"}}
]
def create_vs(store, source_id, key):
if source_id == "broken":
raise RuntimeError("vector index missing")
return healthy_vs
with _fake_db_readonly(agent), patch(
"application.services.search_service.VectorCreator.create_vectorstore",
side_effect=create_vs,
):
results = search("k", "q", 5)
assert len(results) == 1
assert results[0]["text"] == "ok content"
def test_uses_filename_when_title_missing(self):
agent = {"source_id": "src-1", "extra_source_ids": []}
fake_vs = MagicMock()
fake_vs.search.return_value = [
{"text": "body", "metadata": {"filename": "document.pdf"}}
]
with _fake_db_readonly(agent), patch(
"application.services.search_service.VectorCreator.create_vectorstore",
return_value=fake_vs,
):
results = search("k", "q", 5)
assert results[0]["title"] == "document.pdf"
def test_uses_content_snippet_as_title_last_resort(self):
agent = {"source_id": "src-1", "extra_source_ids": []}
fake_vs = MagicMock()
fake_vs.search.return_value = [
{"text": "Content without any title metadata at all", "metadata": {}}
]
with _fake_db_readonly(agent), patch(
"application.services.search_service.VectorCreator.create_vectorstore",
return_value=fake_vs,
):
results = search("k", "q", 5)
assert results[0]["title"].endswith("...")
assert "Content without any title" in results[0]["title"]
def test_skips_empty_source_ids(self):
# ``source_id=" "`` only — after strip() this leaves no real source.
agent = {"extra_source_ids": [" ", ""], "source_id": None}
with _fake_db_readonly(agent), patch(
"application.services.search_service.VectorCreator.create_vectorstore"
) as mock_create:
results = search("k", "q", 5)
mock_create.assert_not_called()
assert results == []


@@ -1,164 +0,0 @@
"""Integration tests for ``_backfill_connector_sessions``.
Mongo routinely contains multiple rows for the same
``(user_id, server_url, provider)`` triple — each OAuth-button click
inserts a pending row and only the last one is authorized. The Postgres
schema has a unique index on that triple, so the backfill does a Python-
side dedup *before* the insert, preferring authorized rows over pending
ones and newer ``created_at`` over older.
This test seeds exactly that shape and asserts the dedup keeps the
authorized row.
"""
from __future__ import annotations
import sys
from datetime import datetime, timezone
from pathlib import Path
from typing import Any
import mongomock
import pytest
from sqlalchemy import text
sys.path.insert(0, str(Path(__file__).resolve().parents[3]))
from scripts.db.backfill import _backfill_connector_sessions # noqa: E402
@pytest.fixture
def mongo_db() -> Any:
client = mongomock.MongoClient()
return client["docsgpt_test"]
class TestBackfillConnectorSessions:
def test_dedups_authorized_over_pending(self, pg_conn, mongo_db):
# Two rows, same (user, server_url, provider) triple. The pending
# row was inserted first; the authorized row was inserted later
# when the OAuth redirect completed. Dedup should keep the
# authorized one (has ``token_info``).
mongo_db["connector_sessions"].insert_many(
[
{
"_id": "777777777777777777777771",
"user_id": "alice",
"provider": "google_drive",
"server_url": "https://drive.google.com",
"status": "pending",
"created_at": datetime(2026, 1, 1, tzinfo=timezone.utc),
},
{
"_id": "777777777777777777777772",
"user_id": "alice",
"provider": "google_drive",
"server_url": "https://drive.google.com",
"status": "authorized",
"token_info": {"access_token": "xyz"},
"created_at": datetime(2026, 1, 2, tzinfo=timezone.utc),
},
]
)
stats = _backfill_connector_sessions(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
assert stats["seen"] == 2
assert stats["written"] == 1
assert stats["skipped"] == 1
rows = pg_conn.execute(
text(
"SELECT status, legacy_mongo_id, token_info FROM "
"connector_sessions WHERE user_id = 'alice'"
)
).fetchall()
assert len(rows) == 1
assert rows[0]._mapping["status"] == "authorized"
assert rows[0]._mapping["legacy_mongo_id"] == "777777777777777777777772"
assert rows[0]._mapping["token_info"] == {"access_token": "xyz"}
def test_dedups_newer_created_at_when_both_pending(self, pg_conn, mongo_db):
# Both rows are pending. Newer created_at wins.
mongo_db["connector_sessions"].insert_many(
[
{
"_id": "777777777777777777777771",
"user_id": "alice",
"provider": "github",
"server_url": "",
"status": "pending",
"created_at": datetime(2026, 1, 1, tzinfo=timezone.utc),
},
{
"_id": "777777777777777777777772",
"user_id": "alice",
"provider": "github",
"server_url": "",
"status": "pending",
"created_at": datetime(2026, 2, 1, tzinfo=timezone.utc),
},
]
)
_backfill_connector_sessions(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
rows = pg_conn.execute(
text(
"SELECT legacy_mongo_id FROM connector_sessions "
"WHERE user_id = 'alice' AND provider = 'github'"
)
).fetchall()
assert len(rows) == 1
assert rows[0]._mapping["legacy_mongo_id"] == "777777777777777777777772"
def test_rerun_does_not_duplicate(self, pg_conn, mongo_db):
mongo_db["connector_sessions"].insert_one(
{
"_id": "777777777777777777777771",
"user_id": "alice",
"provider": "google_drive",
"server_url": "https://drive.google.com",
"status": "authorized",
"token_info": {"access_token": "xyz"},
"created_at": datetime(2026, 1, 1, tzinfo=timezone.utc),
}
)
_backfill_connector_sessions(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
_backfill_connector_sessions(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
count = pg_conn.execute(
text(
"SELECT count(*) FROM connector_sessions "
"WHERE legacy_mongo_id = '777777777777777777777771'"
)
).scalar()
assert count == 1
def test_skips_rows_without_user_or_provider(self, pg_conn, mongo_db):
mongo_db["connector_sessions"].insert_many(
[
{
"_id": "777777777777777777777771",
"provider": "github", # no user
},
{
"_id": "777777777777777777777772",
"user_id": "alice", # no provider
},
]
)
stats = _backfill_connector_sessions(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
assert stats["seen"] == 2
assert stats["written"] == 0
assert stats["skipped"] == 2


@@ -1,292 +0,0 @@
"""Integration tests for ``_backfill_conversations`` +
``_backfill_conversation_messages`` flattening and attachment FK
resolution.
The conversations backfill does two things the other backfillers don't:
1. It flattens Mongo's nested ``queries[]`` array into rows in a child
table (``conversation_messages``), with ``position`` = array index.
2. It resolves attachment refs (strings pointing to Mongo ObjectIds) to
Postgres UUIDs via a ``legacy_mongo_id`` lookup. Unresolvable refs are
dropped rather than crashing the whole batch.
These are the two bits we assert below, alongside the standard
happy-shape / idempotency / convergence checks.
"""
from __future__ import annotations
import sys
from pathlib import Path
from typing import Any
import mongomock
import pytest
from sqlalchemy import text
sys.path.insert(0, str(Path(__file__).resolve().parents[3]))
from scripts.db.backfill import ( # noqa: E402
_backfill_attachments,
_backfill_conversations,
)
@pytest.fixture
def mongo_db() -> Any:
client = mongomock.MongoClient()
return client["docsgpt_test"]
# ---------------------------------------------------------------------------
# attachments — prerequisite for conversations (DBRef→UUID via legacy map)
# ---------------------------------------------------------------------------
class TestBackfillAttachments:
def test_attachments_happy_shape_preserves_mime_and_size(
self, pg_conn, mongo_db
):
mongo_db["attachments"].insert_one(
{
"_id": "aaaaaaaaaaaaaaaaaaaaaaaa",
"user": "alice",
"filename": "report.pdf",
# Worker writes the blob path as ``path``; the PG column is
# ``upload_path``. If the backfill ever reads the wrong key,
# this test will fail with an empty string.
"path": "uploads/alice/report.pdf",
"mime_type": "application/pdf",
"size": 12345,
"content": "extracted text",
"token_count": 42,
}
)
_backfill_attachments(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
row = pg_conn.execute(
text(
"SELECT filename, upload_path, mime_type, size, "
"content, token_count, legacy_mongo_id "
"FROM attachments WHERE user_id = 'alice'"
)
).one()._mapping
assert row["filename"] == "report.pdf"
assert row["upload_path"] == "uploads/alice/report.pdf"
assert row["mime_type"] == "application/pdf"
assert row["size"] == 12345
assert row["content"] == "extracted text"
assert row["token_count"] == 42
assert row["legacy_mongo_id"] == "aaaaaaaaaaaaaaaaaaaaaaaa"
def test_attachments_rerun_does_not_duplicate(self, pg_conn, mongo_db):
mongo_db["attachments"].insert_one(
{
"_id": "aaaaaaaaaaaaaaaaaaaaaaaa",
"user": "alice",
"filename": "r.pdf",
"path": "u/alice/r.pdf",
}
)
_backfill_attachments(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
_backfill_attachments(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
count = pg_conn.execute(
text(
"SELECT count(*) FROM attachments "
"WHERE legacy_mongo_id = 'aaaaaaaaaaaaaaaaaaaaaaaa'"
)
).scalar()
assert count == 1
# ---------------------------------------------------------------------------
# conversations + conversation_messages
# ---------------------------------------------------------------------------
def _seed_attachment(mongo_db: Any, _id: str, user: str = "alice") -> None:
mongo_db["attachments"].insert_one(
{
"_id": _id,
"user": user,
"filename": f"{_id}.txt",
"path": f"uploads/{user}/{_id}.txt",
}
)
class TestBackfillConversations:
def test_conversations_flattens_queries_into_messages(
self, pg_conn, mongo_db
):
_seed_attachment(mongo_db, "aaaaaaaaaaaaaaaaaaaaaaa1")
_backfill_attachments(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
mongo_db["conversations"].insert_one(
{
"_id": "cccccccccccccccccccccccc",
"user": "alice",
"name": "chat-1",
"queries": [
{
"prompt": "hello",
"response": "hi",
"attachments": ["aaaaaaaaaaaaaaaaaaaaaaa1"],
},
{
"prompt": "how are you",
"response": "fine",
},
],
}
)
stats = _backfill_conversations(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
assert stats["seen"] == 1
assert stats["written"] == 1
assert stats["messages_written"] == 2
conv_id = pg_conn.execute(
text(
"SELECT id FROM conversations "
"WHERE legacy_mongo_id = 'cccccccccccccccccccccccc'"
)
).scalar()
rows = pg_conn.execute(
text(
"SELECT position, prompt, response, attachments "
"FROM conversation_messages WHERE conversation_id = :cid "
"ORDER BY position"
),
{"cid": str(conv_id)},
).fetchall()
assert [r._mapping["position"] for r in rows] == [0, 1]
assert rows[0]._mapping["prompt"] == "hello"
assert rows[1]._mapping["response"] == "fine"
# Attachment ObjectId string was mapped to the PG UUID.
resolved = rows[0]._mapping["attachments"]
assert len(resolved) == 1
pg_att_id = pg_conn.execute(
text(
"SELECT id FROM attachments "
"WHERE legacy_mongo_id = 'aaaaaaaaaaaaaaaaaaaaaaa1'"
)
).scalar()
assert str(resolved[0]) == str(pg_att_id)
def test_conversations_drops_unresolved_attachments(
self, pg_conn, mongo_db
):
mongo_db["conversations"].insert_one(
{
"_id": "cccccccccccccccccccccccc",
"user": "alice",
"queries": [
{
"prompt": "hi",
# Unknown attachment objectid — not present in PG.
"attachments": ["bbbbbbbbbbbbbbbbbbbbbbb0"],
}
],
}
)
stats = _backfill_conversations(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
assert stats["unresolved_attachment_refs"] == 1
row = pg_conn.execute(
text(
"SELECT attachments FROM conversation_messages "
"WHERE position = 0"
)
).one()._mapping
assert list(row["attachments"]) == []
def test_conversations_rerun_does_not_double_messages(
self, pg_conn, mongo_db
):
mongo_db["conversations"].insert_one(
{
"_id": "cccccccccccccccccccccccc",
"user": "alice",
"queries": [
{"prompt": "a", "response": "1"},
{"prompt": "b", "response": "2"},
],
}
)
_backfill_conversations(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
_backfill_conversations(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
conv_count = pg_conn.execute(
text(
"SELECT count(*) FROM conversations "
"WHERE legacy_mongo_id = 'cccccccccccccccccccccccc'"
)
).scalar()
msg_count = pg_conn.execute(
text(
"SELECT count(*) FROM conversation_messages "
"WHERE conversation_id = ("
" SELECT id FROM conversations "
" WHERE legacy_mongo_id = 'cccccccccccccccccccccccc'"
")"
)
).scalar()
assert conv_count == 1
assert msg_count == 2
def test_conversations_rerun_truncates_removed_tail_messages(
self, pg_conn, mongo_db
):
# First run: 3 messages. Then Mongo truncates to 1. Second run
# should drop positions 1 and 2.
mongo_db["conversations"].insert_one(
{
"_id": "cccccccccccccccccccccccc",
"user": "alice",
"queries": [
{"prompt": "a", "response": "1"},
{"prompt": "b", "response": "2"},
{"prompt": "c", "response": "3"},
],
}
)
_backfill_conversations(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
mongo_db["conversations"].update_one(
{"_id": "cccccccccccccccccccccccc"},
{"$set": {"queries": [{"prompt": "a", "response": "1-updated"}]}},
)
_backfill_conversations(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
rows = pg_conn.execute(
text(
"SELECT position, response FROM conversation_messages "
"WHERE conversation_id = ("
" SELECT id FROM conversations "
" WHERE legacy_mongo_id = 'cccccccccccccccccccccccc'"
") ORDER BY position"
)
).fetchall()
assert len(rows) == 1
assert rows[0]._mapping["response"] == "1-updated"


@@ -1,144 +0,0 @@
"""Integration tests for ``_backfill_sources``.
The notable translation here is the optional Mongo ``user`` field: system
seed rows arrive with no ``user`` (or ``user="system"``) and must land
with ``user_id = '__system__'`` in Postgres so the NOT NULL constraint
accepts them.
"""
from __future__ import annotations
import sys
from pathlib import Path
from typing import Any
import mongomock
import pytest
from sqlalchemy import text
sys.path.insert(0, str(Path(__file__).resolve().parents[3]))
from scripts.db.backfill import ( # noqa: E402
SYSTEM_USER_ID,
_backfill_sources,
)
@pytest.fixture
def mongo_db() -> Any:
client = mongomock.MongoClient()
return client["docsgpt_test"]
class TestBackfillSources:
def test_sources_system_rows_get_sentinel_user(self, pg_conn, mongo_db):
mongo_db["sources"].insert_many(
[
{
"_id": "555555555555555555555555",
"name": "Seed Source",
"type": "url",
# Intentionally no ``user`` field.
},
{
"_id": "666666666666666666666666",
"user": "alice",
"name": "Alice's Upload",
"type": "file",
},
]
)
stats = _backfill_sources(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
assert stats["seen"] == 2
assert stats["written"] == 2
rows = pg_conn.execute(
text(
"SELECT user_id, name FROM sources "
"WHERE legacy_mongo_id IN "
"('555555555555555555555555', '666666666666666666666666') "
"ORDER BY name"
)
).fetchall()
assert len(rows) == 2
# Alphabetical by name: "Alice's Upload" comes before "Seed Source".
assert rows[0]._mapping["user_id"] == "alice"
assert rows[1]._mapping["user_id"] == SYSTEM_USER_ID
def test_sources_preserves_legacy_fields_under_metadata(
self, pg_conn, mongo_db
):
mongo_db["sources"].insert_one(
{
"_id": "555555555555555555555555",
"user": "alice",
"name": "s1",
"type": "url",
"status": "ingested", # legacy ingestion field
"reason": None,
}
)
_backfill_sources(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
meta = pg_conn.execute(
text(
"SELECT metadata FROM sources "
"WHERE legacy_mongo_id = '555555555555555555555555'"
)
).scalar()
assert meta["legacy_fields"]["status"] == "ingested"
def test_sources_rerun_does_not_duplicate(self, pg_conn, mongo_db):
mongo_db["sources"].insert_one(
{
"_id": "555555555555555555555555",
"user": "alice",
"name": "s1",
"type": "url",
}
)
_backfill_sources(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
_backfill_sources(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
count = pg_conn.execute(
text(
"SELECT count(*) FROM sources "
"WHERE legacy_mongo_id = '555555555555555555555555'"
)
).scalar()
assert count == 1
def test_sources_rerun_converges_on_name(self, pg_conn, mongo_db):
mongo_db["sources"].insert_one(
{
"_id": "555555555555555555555555",
"user": "alice",
"name": "old-name",
"type": "url",
}
)
_backfill_sources(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
mongo_db["sources"].update_one(
{"_id": "555555555555555555555555"},
{"$set": {"name": "new-name"}},
)
_backfill_sources(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
name = pg_conn.execute(
text(
"SELECT name FROM sources "
"WHERE legacy_mongo_id = '555555555555555555555555'"
)
).scalar()
assert name == "new-name"


@@ -1,380 +0,0 @@
"""Integration tests for the per-tool-child collections: todos, notes,
and memories.
These all depend on ``user_tools`` having been backfilled first, because
:func:`scripts.db.backfill._build_tool_id_map` joins Mongo
``user_tools._id`` to Postgres ``user_tools.id`` via ``(user_id, name)``.
Each test seeds a single tool row to keep fixtures minimal.
Memories are the odd one out: the SQL uses ``ON CONFLICT DO NOTHING`` and
the table has no ``legacy_mongo_id`` column, so a re-run can't converge
on mutated content. The negative test locks in the current behavior so a
future fix (adding ``legacy_mongo_id`` + DO UPDATE, tracked in
``migration-postgres.md``) will flip the assertion.
"""
from __future__ import annotations
import sys
from pathlib import Path
from typing import Any
import mongomock
import pytest
from sqlalchemy import text
sys.path.insert(0, str(Path(__file__).resolve().parents[3]))
from scripts.db.backfill import ( # noqa: E402
_backfill_memories,
_backfill_notes,
_backfill_todos,
_backfill_user_tools,
)
_TOOL_MONGO_ID = "507f1f77bcf86cd799439011"
@pytest.fixture
def mongo_db() -> Any:
client = mongomock.MongoClient()
return client["docsgpt_test"]
@pytest.fixture
def seeded_tool(pg_conn, mongo_db) -> str:
"""Seed one tool in both Mongo and Postgres and return its PG UUID.
Having a user_tools row on both sides is a prerequisite for every
backfill function in this file — ``_build_tool_id_map`` needs it to
resolve ``tool_id`` FKs.
"""
mongo_db["user_tools"].insert_one(
{
"_id": _TOOL_MONGO_ID,
"user": "alice",
"name": "my-tool",
}
)
_backfill_user_tools(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
return str(
pg_conn.execute(
text("SELECT id FROM user_tools WHERE legacy_mongo_id = :lid"),
{"lid": _TOOL_MONGO_ID},
).scalar()
)
# ---------------------------------------------------------------------------
# todos
# ---------------------------------------------------------------------------
class TestBackfillTodos:
def test_todos_status_completed_translation(self, pg_conn, mongo_db, seeded_tool):
# Two todos: one "open", one "completed". Asserts the Mongo
# ``status`` string is translated to the PG ``completed`` bool.
mongo_db["todos"].insert_many(
[
{
"_id": "111111111111111111111111",
"user_id": "alice",
"tool_id": _TOOL_MONGO_ID,
"todo_id": 1,
"title": "Write tests",
"status": "open",
},
{
"_id": "222222222222222222222222",
"user_id": "alice",
"tool_id": _TOOL_MONGO_ID,
"todo_id": 2,
"title": "Ship it",
"status": "completed",
},
]
)
stats = _backfill_todos(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
assert stats["seen"] == 2
assert stats["written"] == 2
rows = pg_conn.execute(
text(
"SELECT todo_id, title, completed, legacy_mongo_id FROM todos "
"WHERE user_id = 'alice' ORDER BY todo_id"
)
).fetchall()
assert len(rows) == 2
assert rows[0]._mapping["todo_id"] == 1
assert rows[0]._mapping["completed"] is False
assert rows[1]._mapping["todo_id"] == 2
assert rows[1]._mapping["completed"] is True
# legacy_mongo_id preserved for idempotency on re-run.
assert rows[0]._mapping["legacy_mongo_id"] == "111111111111111111111111"
def test_todos_legacy_fields_stashed_in_metadata(
self, pg_conn, mongo_db, seeded_tool
):
# Legacy top-level ``conversation_id`` field should survive under
# metadata.legacy_fields.
mongo_db["todos"].insert_one(
{
"_id": "111111111111111111111111",
"user_id": "alice",
"tool_id": _TOOL_MONGO_ID,
"title": "t",
"status": "open",
"conversation_id": "abc-legacy",
}
)
_backfill_todos(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
meta = pg_conn.execute(
text("SELECT metadata FROM todos WHERE legacy_mongo_id = :lid"),
{"lid": "111111111111111111111111"},
).scalar()
assert meta["legacy_fields"]["conversation_id"] == "abc-legacy"
def test_todos_rerun_does_not_duplicate(self, pg_conn, mongo_db, seeded_tool):
mongo_db["todos"].insert_one(
{
"_id": "111111111111111111111111",
"user_id": "alice",
"tool_id": _TOOL_MONGO_ID,
"todo_id": 1,
"title": "t",
"status": "open",
}
)
_backfill_todos(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
_backfill_todos(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
count = pg_conn.execute(
text(
"SELECT count(*) FROM todos "
"WHERE legacy_mongo_id = '111111111111111111111111'"
)
).scalar()
assert count == 1
def test_todos_rerun_converges_on_status_and_title(
self, pg_conn, mongo_db, seeded_tool
):
mongo_db["todos"].insert_one(
{
"_id": "111111111111111111111111",
"user_id": "alice",
"tool_id": _TOOL_MONGO_ID,
"todo_id": 1,
"title": "Old title",
"status": "open",
}
)
_backfill_todos(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
mongo_db["todos"].update_one(
{"_id": "111111111111111111111111"},
{"$set": {"status": "completed", "title": "Done title"}},
)
_backfill_todos(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
row = pg_conn.execute(
text(
"SELECT title, completed FROM todos "
"WHERE legacy_mongo_id = '111111111111111111111111'"
)
).one()._mapping
assert row["title"] == "Done title"
assert row["completed"] is True
# ---------------------------------------------------------------------------
# notes
# ---------------------------------------------------------------------------
class TestBackfillNotes:
def test_notes_translates_note_to_content(
self, pg_conn, mongo_db, seeded_tool
):
mongo_db["notes"].insert_one(
{
"_id": "333333333333333333333333",
"user_id": "alice",
"tool_id": _TOOL_MONGO_ID,
"note": "hello world",
# No explicit title — backfill should fall back to "note".
}
)
_backfill_notes(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
row = pg_conn.execute(
text(
"SELECT title, content, legacy_mongo_id FROM notes "
"WHERE user_id = 'alice'"
)
).one()._mapping
assert row["content"] == "hello world"
assert row["title"] == "note"
# legacy_mongo_id column landed recently — lock in its presence.
assert row["legacy_mongo_id"] == "333333333333333333333333"
def test_notes_uses_title_when_present(self, pg_conn, mongo_db, seeded_tool):
mongo_db["notes"].insert_one(
{
"_id": "333333333333333333333333",
"user_id": "alice",
"tool_id": _TOOL_MONGO_ID,
"title": "My title",
"content": "direct content",
}
)
_backfill_notes(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
row = pg_conn.execute(
text("SELECT title, content FROM notes WHERE user_id = 'alice'")
).one()._mapping
assert row["title"] == "My title"
assert row["content"] == "direct content"
def test_notes_rerun_converges_on_content(
self, pg_conn, mongo_db, seeded_tool
):
mongo_db["notes"].insert_one(
{
"_id": "333333333333333333333333",
"user_id": "alice",
"tool_id": _TOOL_MONGO_ID,
"note": "v1",
}
)
_backfill_notes(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
mongo_db["notes"].update_one(
{"_id": "333333333333333333333333"},
{"$set": {"note": "v2"}},
)
_backfill_notes(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
rows = pg_conn.execute(
text(
"SELECT content FROM notes "
"WHERE legacy_mongo_id = '333333333333333333333333'"
)
).fetchall()
assert len(rows) == 1
assert rows[0]._mapping["content"] == "v2"
# ---------------------------------------------------------------------------
# memories
# ---------------------------------------------------------------------------
class TestBackfillMemories:
def test_memories_happy_shape(self, pg_conn, mongo_db, seeded_tool):
mongo_db["memories"].insert_one(
{
"_id": "444444444444444444444444",
"user_id": "alice",
"tool_id": _TOOL_MONGO_ID,
"path": "/foo",
"content": "memory body",
}
)
_backfill_memories(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
row = pg_conn.execute(
text("SELECT path, content FROM memories WHERE user_id = 'alice'")
).one()._mapping
assert row["path"] == "/foo"
assert row["content"] == "memory body"
def test_memories_rerun_does_not_duplicate(
self, pg_conn, mongo_db, seeded_tool
):
# The unique index is (user_id, tool_id, path). Same-row re-insert
# is expected to be a no-op because the SQL uses DO NOTHING.
mongo_db["memories"].insert_one(
{
"_id": "444444444444444444444444",
"user_id": "alice",
"tool_id": _TOOL_MONGO_ID,
"path": "/foo",
"content": "v1",
}
)
_backfill_memories(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
_backfill_memories(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
count = pg_conn.execute(
text(
"SELECT count(*) FROM memories "
"WHERE user_id = 'alice' AND path = '/foo'"
)
).scalar()
assert count == 1
def test_memories_rerun_does_not_converge_content(
self, pg_conn, mongo_db, seeded_tool
):
# KNOWN non-idempotent behavior: memories have no legacy_mongo_id
# column and the SQL uses ``ON CONFLICT DO NOTHING`` rather than
# DO UPDATE. A content change in Mongo is NOT reflected in PG on
# re-run. See migration-postgres.md for the tracked fix. When the
# backfill gains a legacy_mongo_id column + DO UPDATE branch,
# flip this assertion to assert the new content wins.
mongo_db["memories"].insert_one(
{
"_id": "444444444444444444444444",
"user_id": "alice",
"tool_id": _TOOL_MONGO_ID,
"path": "/foo",
"content": "original",
}
)
_backfill_memories(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
mongo_db["memories"].update_one(
{"_id": "444444444444444444444444"},
{"$set": {"content": "updated"}},
)
_backfill_memories(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
content = pg_conn.execute(
text(
"SELECT content FROM memories "
"WHERE user_id = 'alice' AND path = '/foo'"
)
).scalar()
# Lock in the non-convergent behavior.
assert content == "original"


@@ -1,348 +0,0 @@
"""Integration tests for ``_backfill_users`` / ``_backfill_prompts`` /
``_backfill_user_tools`` against an ephemeral Postgres.
These exercise the Mongo→PG shape translations end-to-end: a fake Mongo
(mongomock) is populated with representative docs, the ``_backfill_*``
function under test runs against ``pg_conn``, and the resulting rows are
asserted to carry the translated fields. Each function is also run twice
to validate idempotency (no row duplication) and — where the SQL uses
``DO UPDATE`` — convergence on mutated source data.
"""
from __future__ import annotations
import sys
from datetime import datetime, timezone
from pathlib import Path
from typing import Any
import mongomock
import pytest
from sqlalchemy import text
# Make ``scripts.db.backfill`` importable (scripts/ isn't on sys.path by default).
sys.path.insert(0, str(Path(__file__).resolve().parents[3]))
from scripts.db.backfill import ( # noqa: E402
SYSTEM_USER_ID,
_backfill_prompts,
_backfill_user_tools,
_backfill_users,
)
@pytest.fixture
def mongo_db() -> Any:
"""Fresh in-memory Mongo database per test."""
client = mongomock.MongoClient()
return client["docsgpt_test"]
# ---------------------------------------------------------------------------
# users
# ---------------------------------------------------------------------------
class TestBackfillUsers:
def test_users_happy_shape_merges_agent_preferences(self, pg_conn, mongo_db):
mongo_db["users"].insert_many(
[
{
"user_id": "alice",
"agent_preferences": {
"pinned": ["agent-1"],
"theme": "dark",
},
},
{
"user_id": "bob",
"agent_preferences": {},
},
]
)
stats = _backfill_users(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
assert stats["seen"] == 2
assert stats["written"] == 2
assert stats["skipped_no_user_id"] == 0
rows = pg_conn.execute(
text(
"SELECT user_id, agent_preferences FROM users "
"WHERE user_id IN ('alice', 'bob') ORDER BY user_id"
)
).fetchall()
assert len(rows) == 2
alice = rows[0]._mapping
bob = rows[1]._mapping
# Unknown top-level prefs (``theme``) survive untouched.
assert alice["agent_preferences"]["theme"] == "dark"
assert alice["agent_preferences"]["pinned"] == ["agent-1"]
# Missing ``shared_with_me`` gets filled to [].
assert alice["agent_preferences"]["shared_with_me"] == []
assert bob["agent_preferences"]["pinned"] == []
assert bob["agent_preferences"]["shared_with_me"] == []
def test_users_skips_rows_without_user_id(self, pg_conn, mongo_db):
mongo_db["users"].insert_many(
[{"user_id": "alice"}, {"agent_preferences": {"pinned": []}}]
)
stats = _backfill_users(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
assert stats["seen"] == 2
assert stats["skipped_no_user_id"] == 1
assert stats["written"] == 1
def test_users_rerun_does_not_duplicate(self, pg_conn, mongo_db):
mongo_db["users"].insert_one(
{"user_id": "alice", "agent_preferences": {"pinned": ["a-1"]}}
)
_backfill_users(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
_backfill_users(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
count = pg_conn.execute(
text("SELECT count(*) FROM users WHERE user_id = 'alice'")
).scalar()
assert count == 1
def test_users_rerun_converges_on_mutated_pinned(self, pg_conn, mongo_db):
mongo_db["users"].insert_one(
{"user_id": "alice", "agent_preferences": {"pinned": ["a-1"]}}
)
_backfill_users(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
# Mutate Mongo.
mongo_db["users"].update_one(
{"user_id": "alice"},
{"$set": {"agent_preferences": {"pinned": ["a-1", "a-2"]}}},
)
_backfill_users(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
prefs = pg_conn.execute(
text("SELECT agent_preferences FROM users WHERE user_id = 'alice'")
).scalar()
assert prefs["pinned"] == ["a-1", "a-2"]
# ---------------------------------------------------------------------------
# prompts
# ---------------------------------------------------------------------------
class TestBackfillPrompts:
def test_prompts_happy_shape(self, pg_conn, mongo_db):
mongo_db["prompts"].insert_many(
[
{
"_id": "507f1f77bcf86cd799439011",
"user": "alice",
"name": "Greet",
"content": "Say hi",
},
{
"_id": "507f1f77bcf86cd799439012",
"user": "system",
"name": "Template",
"content": "Seed",
},
]
)
stats = _backfill_prompts(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
assert stats["seen"] == 2
assert stats["written"] == 2
rows = pg_conn.execute(
text(
"SELECT user_id, name, content, legacy_mongo_id "
"FROM prompts ORDER BY name"
)
).fetchall()
assert len(rows) == 2
greet = rows[0]._mapping
template = rows[1]._mapping
assert greet["user_id"] == "alice"
assert greet["name"] == "Greet"
assert greet["content"] == "Say hi"
assert greet["legacy_mongo_id"] == "507f1f77bcf86cd799439011"
# Legacy ``user="system"`` collapses to the sentinel.
assert template["user_id"] == SYSTEM_USER_ID
def test_prompts_rerun_does_not_duplicate(self, pg_conn, mongo_db):
mongo_db["prompts"].insert_one(
{
"_id": "507f1f77bcf86cd799439011",
"user": "alice",
"name": "Greet",
"content": "Say hi",
}
)
_backfill_prompts(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
_backfill_prompts(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
count = pg_conn.execute(
text(
"SELECT count(*) FROM prompts WHERE "
"legacy_mongo_id = '507f1f77bcf86cd799439011'"
)
).scalar()
assert count == 1
def test_prompts_rerun_converges_on_content(self, pg_conn, mongo_db):
mongo_db["prompts"].insert_one(
{
"_id": "507f1f77bcf86cd799439011",
"user": "alice",
"name": "Greet",
"content": "v1",
}
)
_backfill_prompts(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
mongo_db["prompts"].update_one(
{"_id": "507f1f77bcf86cd799439011"},
{"$set": {"content": "v2"}},
)
_backfill_prompts(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
row = pg_conn.execute(
text(
"SELECT content FROM prompts WHERE "
"legacy_mongo_id = '507f1f77bcf86cd799439011'"
)
).scalar()
assert row == "v2"
# ---------------------------------------------------------------------------
# user_tools
# ---------------------------------------------------------------------------
class TestBackfillUserTools:
def test_user_tools_nested_values_stringified(self, pg_conn, mongo_db):
# Use a dict with a datetime-like / non-JSON-native type nested
# inside config to verify ``default=str`` gets applied.
mongo_db["user_tools"].insert_one(
{
"_id": "507f1f77bcf86cd799439011",
"user": "alice",
"name": "calendar",
"displayName": "Calendar",
"customName": "My Calendar",
"description": "cal desc",
"config": {
"api_key": "secret",
"expires_at": datetime(2026, 1, 1, tzinfo=timezone.utc),
},
"configRequirements": {"api_key": {"type": "string"}},
"actions": [{"name": "create_event"}],
"status": True,
}
)
_backfill_user_tools(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
row = pg_conn.execute(
text(
"SELECT name, custom_name, display_name, description, "
"config, config_requirements, actions, status "
"FROM user_tools WHERE legacy_mongo_id = :lid"
),
{"lid": "507f1f77bcf86cd799439011"},
).one()._mapping
assert row["name"] == "calendar"
assert row["custom_name"] == "My Calendar"
assert row["display_name"] == "Calendar"
assert row["description"] == "cal desc"
assert row["status"] is True
# Nested config datetime is serialized to a string (default=str).
assert row["config"]["api_key"] == "secret"
assert isinstance(row["config"]["expires_at"], str)
assert row["config_requirements"] == {"api_key": {"type": "string"}}
assert row["actions"] == [{"name": "create_event"}]
def test_user_tools_rerun_converges(self, pg_conn, mongo_db):
mongo_db["user_tools"].insert_one(
{
"_id": "507f1f77bcf86cd799439011",
"user": "alice",
"name": "cal",
"config": {"k": "v1"},
}
)
_backfill_user_tools(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
mongo_db["user_tools"].update_one(
{"_id": "507f1f77bcf86cd799439011"},
{"$set": {"config": {"k": "v2"}, "description": "new"}},
)
_backfill_user_tools(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
rows = pg_conn.execute(
text(
"SELECT config, description FROM user_tools "
"WHERE legacy_mongo_id = '507f1f77bcf86cd799439011'"
)
).fetchall()
assert len(rows) == 1
assert rows[0]._mapping["config"] == {"k": "v2"}
assert rows[0]._mapping["description"] == "new"
def test_user_tools_rerun_does_not_duplicate(self, pg_conn, mongo_db):
mongo_db["user_tools"].insert_one(
{
"_id": "507f1f77bcf86cd799439011",
"user": "alice",
"name": "cal",
}
)
_backfill_user_tools(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
_backfill_user_tools(
conn=pg_conn, mongo_db=mongo_db, batch_size=100, dry_run=False
)
count = pg_conn.execute(
text(
"SELECT count(*) FROM user_tools "
"WHERE legacy_mongo_id = '507f1f77bcf86cd799439011'"
)
).scalar()
assert count == 1


@@ -10,7 +10,7 @@ Two invariants are covered here:
These tests run against a real ephemeral Postgres (via ``pg_engine``)
rather than mocks. They rebuild the module-level engine cache so the
-``statement_timeout`` ``connect_args`` is actually exercised.
+engine factory's ``statement_timeout`` setup is actually exercised.
"""
from __future__ import annotations
@@ -35,9 +35,9 @@ def wired_engine(pg_engine, monkeypatch):
"""Rebuild the module-level engine against the ephemeral DB.
``pg_engine`` already creates its own SQLAlchemy engine, but that
-engine does not carry our ``statement_timeout`` ``connect_args``. We
-reconstruct one via :func:`get_engine` so the real production factory
-code path is exercised.
+engine does not install our ``statement_timeout`` connect-event
+hook. We reconstruct one via :func:`get_engine` so the real
+production factory code path is exercised.
"""
# Reset the module-level cache so get_engine() re-reads the URL and
# applies the production connect_args.
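# The production factory presumably installs the timeout with a connect
# event rather than connect_args; a minimal sketch of that pattern
# (names illustrative, not the real get_engine()):
from sqlalchemy import create_engine, event


def _engine_with_statement_timeout(url, timeout_ms=30_000):
    engine = create_engine(url)

    @event.listens_for(engine, "connect")
    def _set_timeout(dbapi_conn, _record):
        # Runs on every new DBAPI connection, which is exactly what the
        # TestStatementTimeout class below verifies.
        with dbapi_conn.cursor() as cur:
            cur.execute(f"SET statement_timeout = {timeout_ms}")

    return engine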
@@ -129,7 +129,8 @@ class TestDbReadonlyEnforcement:
class TestStatementTimeout:
"""The engine factory installs ``statement_timeout`` via connect_args.
"""The engine factory installs ``statement_timeout`` on every new
connection.
We verify two things:


@@ -105,11 +105,26 @@ class TestAuthenticateRequest:
assert response.status_code == 200
-class TestAfterRequest:
+class TestFlaskCors:
    @pytest.mark.unit
-    def test_cors_headers(self, client):
-        response = client.get("/api/health")
-        assert response.headers.get("Access-Control-Allow-Origin") == "*"
-        assert "Content-Type" in response.headers.get("Access-Control-Allow-Headers", "")
-        assert "GET" in response.headers.get("Access-Control-Allow-Methods", "")
+    def test_cors_headers_on_flask_route(self, client):
+        response = client.get("/api/health", headers={"Origin": "http://localhost:5173"})
+        assert response.headers["Access-Control-Allow-Origin"] == "*"
+        assert response.headers["Access-Control-Allow-Headers"] == "Content-Type, Authorization"
+        assert response.headers["Access-Control-Allow-Methods"] == "GET, POST, PUT, DELETE, OPTIONS"
+    @pytest.mark.unit
+    def test_cors_headers_on_flask_preflight(self, client):
+        response = client.options(
+            "/api/health",
+            headers={
+                "Origin": "http://localhost:5173",
+                "Access-Control-Request-Method": "GET",
+                "Access-Control-Request-Headers": "Content-Type",
+            },
+        )
+        assert response.status_code == 200
+        assert response.headers["Access-Control-Allow-Origin"] == "*"
+        assert response.headers["Access-Control-Allow-Headers"] == "Content-Type, Authorization"
+        assert response.headers["Access-Control-Allow-Methods"] == "GET, POST, PUT, DELETE, OPTIONS"

tests/test_asgi.py Normal file

@@ -0,0 +1,136 @@
"""Smoke tests for application/asgi.py.
The goal isn't to re-test Flask or FastMCP internals — it's to catch
regressions in the wiring: mounts resolve, CORS headers are emitted,
the lifespan runs (without it, the /mcp session manager raises "Task
group is not initialized"), and routing to ``/`` vs ``/mcp`` doesn't
cross paths.
Uses ``starlette.testclient.TestClient`` because it boots the ASGI app
end-to-end and handles the lifespan protocol automatically — ``httpx``
alone does not run lifespan events, which would mask the exact kind of
misconfiguration this test suite exists to catch.
"""
import pytest
@pytest.mark.unit
def test_asgi_app_imports():
from application.asgi import asgi_app
assert asgi_app is not None
@pytest.mark.unit
def test_flask_route_served_through_starlette_mount():
"""GET /api/health should reach the Flask app via a2wsgi and return 200."""
from starlette.testclient import TestClient
from application.asgi import asgi_app
with TestClient(asgi_app) as client:
r = client.get("/api/health")
assert r.status_code == 200
assert r.json() == {"status": "ok"}
@pytest.mark.unit
def test_mcp_endpoint_mounted_and_lifespan_runs():
"""/mcp must be reachable AND the FastMCP session manager must start.
Without ``lifespan=mcp_app.lifespan`` on the outer Starlette app,
every /mcp request raises ``RuntimeError: Task group is not
initialized``. Hitting the endpoint under a real lifespan-aware
client catches that.
"""
from starlette.testclient import TestClient
from application.asgi import asgi_app
with TestClient(asgi_app) as client:
# Minimal MCP initialize request. Anything but a 404 or a lifespan
# RuntimeError already proves the mount is wired; the stricter
# asserts below then pin down a fully successful initialize.
r = client.post(
"/mcp/",
headers={
"Origin": "http://example.com",
"Content-Type": "application/json",
"Accept": "application/json, text/event-stream",
},
json={
"jsonrpc": "2.0",
"id": 1,
"method": "initialize",
"params": {
"protocolVersion": "2025-03-26",
"capabilities": {},
"clientInfo": {"name": "pytest", "version": "0"},
},
},
)
assert r.status_code != 404, f"/mcp mount unreachable: {r.status_code}"
# A successful initialize returns 200 with a Mcp-Session-Id header.
assert r.status_code == 200
assert "mcp-session-id" in {k.lower() for k in r.headers.keys()}
assert r.headers.get("access-control-expose-headers") == "Mcp-Session-Id"
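
A hypothetical continuation, if a later test needs the session beyond initialize: subsequent JSON-RPC messages echo the server-issued id back, and streamable-HTTP servers typically answer notifications with 202, so the assert hedges across both codes.

session_id = r.headers["mcp-session-id"]
r2 = client.post(
    "/mcp/",
    headers={
        "Content-Type": "application/json",
        "Accept": "application/json, text/event-stream",
        "Mcp-Session-Id": session_id,
    },
    json={"jsonrpc": "2.0", "method": "notifications/initialized"},
)
assert r2.status_code in (200, 202)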
@pytest.mark.unit
def test_cors_headers_on_flask_route():
"""CORS middleware should emit allow-origin on actual (non-preflight) requests.
``allow_origins=["*"]`` → header value is literal ``*`` (not an echo).
"""
from starlette.testclient import TestClient
from application.asgi import asgi_app
with TestClient(asgi_app) as client:
r = client.get("/api/health", headers={"Origin": "http://example.com"})
assert r.status_code == 200
assert r.headers.get("access-control-allow-origin") == "*"
@pytest.mark.unit
def test_cors_preflight_on_flask_route():
"""OPTIONS preflight on a Flask route should be handled by Starlette CORSMiddleware."""
from starlette.testclient import TestClient
from application.asgi import asgi_app
with TestClient(asgi_app) as client:
r = client.options(
"/api/health",
headers={
"Origin": "http://example.com",
"Access-Control-Request-Method": "GET",
"Access-Control-Request-Headers": "Content-Type",
},
)
assert r.status_code in (200, 204)
assert r.headers.get("access-control-allow-origin") == "*"
assert "GET" in r.headers.get("access-control-allow-methods", "")
@pytest.mark.unit
def test_cors_preflight_on_mcp_route():
"""Browser clients hitting /mcp should be allowed to send session headers."""
from starlette.testclient import TestClient
from application.asgi import asgi_app
with TestClient(asgi_app) as client:
r = client.options(
"/mcp/",
headers={
"Origin": "http://example.com",
"Access-Control-Request-Method": "POST",
"Access-Control-Request-Headers": (
"Authorization, Content-Type, Mcp-Session-Id"
),
},
)
assert r.status_code in (200, 204)
assert r.headers.get("access-control-allow-origin") == "*"
assert "Mcp-Session-Id" in r.headers.get("access-control-allow-headers", "")

tests/test_version_check.py (new file, 402 lines)

@@ -0,0 +1,402 @@
"""Unit tests for the anonymous startup version-check client.
All external dependencies (Postgres, Redis, HTTP) are mocked so the
suite runs in pure-Python isolation. The focus is on the branching
behavior described in the spec: opt-out, cache-hit, cache-miss,
lock-denied, and the various failure paths that must never propagate.
"""
from __future__ import annotations
import json
from contextlib import contextmanager
from unittest.mock import MagicMock, patch
import pytest
import requests
from application.updates import version_check as vc_module
class _FakeRepo:
"""Stand-in for AppMetadataRepository backed by a plain dict."""
def __init__(self, store: dict | None = None, *, raise_on_get_instance: bool = False):
self._store: dict[str, str] = dict(store) if store else {}
self._raise = raise_on_get_instance
def get(self, key: str):
return self._store.get(key)
def set(self, key: str, value: str) -> None:
self._store[key] = value
def get_or_create_instance_id(self) -> str:
if self._raise:
raise RuntimeError("simulated Postgres outage")
existing = self._store.get("instance_id")
if existing:
return existing
self._store["instance_id"] = "11111111-2222-3333-4444-555555555555"
return self._store["instance_id"]
@contextmanager
def _fake_db_session():
"""Stand-in for ``db_session()`` — yields ``None`` because the fake
repository ignores its connection argument."""
yield None
def _install_repo(monkeypatch, repo: _FakeRepo):
"""Patch the repo constructor so ``AppMetadataRepository(conn)`` → ``repo``."""
monkeypatch.setattr(
vc_module, "AppMetadataRepository", lambda conn: repo
)
def _install_db_session(monkeypatch, *, raise_exc: Exception | None = None):
if raise_exc is not None:
@contextmanager
def boom():
raise raise_exc
yield # pragma: no cover - unreachable
monkeypatch.setattr(vc_module, "db_session", boom)
else:
monkeypatch.setattr(vc_module, "db_session", _fake_db_session)
def _make_redis_mock(*, get_return=None, set_return=True):
client = MagicMock()
client.get.return_value = get_return
client.set.return_value = set_return
client.setex.return_value = True
client.delete.return_value = 1
return client
@pytest.fixture
def enable_check(monkeypatch):
monkeypatch.setattr(vc_module.settings, "VERSION_CHECK", True)
@pytest.mark.unit
def test_opt_out_short_circuits(monkeypatch):
"""VERSION_CHECK=0 → no Postgres, no Redis, no network."""
monkeypatch.setattr(vc_module.settings, "VERSION_CHECK", False)
db_spy = MagicMock()
redis_spy = MagicMock()
post_spy = MagicMock()
monkeypatch.setattr(vc_module, "db_session", db_spy)
monkeypatch.setattr(vc_module, "get_redis_instance", redis_spy)
monkeypatch.setattr(vc_module.requests, "post", post_spy)
vc_module.run_check()
db_spy.assert_not_called()
redis_spy.assert_not_called()
post_spy.assert_not_called()
@pytest.mark.unit
def test_cache_hit_renders_without_lock_or_network(monkeypatch, enable_check, capsys):
repo = _FakeRepo({"version_check_notice_shown": "1"})
_install_repo(monkeypatch, repo)
_install_db_session(monkeypatch)
cached = {
"advisories": [
{
"id": "DOCSGPT-TEST-1",
"title": "Example",
"severity": "high",
"fixed_in": "0.17.0",
"url": "https://example.test/a",
"summary": "Upgrade required.",
}
]
}
redis_client = _make_redis_mock(get_return=json.dumps(cached).encode("utf-8"))
monkeypatch.setattr(vc_module, "get_redis_instance", lambda: redis_client)
post_spy = MagicMock()
monkeypatch.setattr(vc_module.requests, "post", post_spy)
vc_module.run_check()
redis_client.get.assert_called_once_with(vc_module.CACHE_KEY)
redis_client.set.assert_not_called()
redis_client.setex.assert_not_called()
post_spy.assert_not_called()
assert "SECURITY ADVISORY: DOCSGPT-TEST-1" in capsys.readouterr().out
@pytest.mark.unit
def test_cache_miss_lock_acquired_fetches_and_caches(monkeypatch, enable_check):
repo = _FakeRepo({"version_check_notice_shown": "1"})
_install_repo(monkeypatch, repo)
_install_db_session(monkeypatch)
redis_client = _make_redis_mock(get_return=None, set_return=True)
monkeypatch.setattr(vc_module, "get_redis_instance", lambda: redis_client)
response_body = {
"advisories": [
{
"id": "DOCSGPT-LOW-1",
"title": "Minor",
"severity": "low",
"fixed_in": "0.17.0",
"url": "https://example.test/low",
}
],
"next_check_after": 1800,
}
post_response = MagicMock()
post_response.status_code = 200
post_response.json.return_value = response_body
post_spy = MagicMock(return_value=post_response)
monkeypatch.setattr(vc_module.requests, "post", post_spy)
vc_module.run_check()
post_spy.assert_called_once()
call_kwargs = post_spy.call_args
assert call_kwargs.args[0] == vc_module.ENDPOINT_URL
payload = call_kwargs.kwargs["json"]
assert payload["client"] == "docsgpt-backend"
assert payload["instance_id"] == "11111111-2222-3333-4444-555555555555"
assert "version" in payload and "python_version" in payload
# Lock acquired with NX EX, cache written with server-specified TTL,
# lock released.
redis_client.set.assert_called_once()
set_kwargs = redis_client.set.call_args.kwargs
assert set_kwargs == {"nx": True, "ex": vc_module.LOCK_TTL_SECONDS}
redis_client.setex.assert_called_once()
setex_args = redis_client.setex.call_args.args
assert setex_args[0] == vc_module.CACHE_KEY
assert setex_args[1] == 1800 # server override under 6h
redis_client.delete.assert_called_once_with(vc_module.LOCK_KEY)
@pytest.mark.unit
def test_cache_miss_lock_denied_skips_silently(monkeypatch, enable_check):
repo = _FakeRepo({"version_check_notice_shown": "1"})
_install_repo(monkeypatch, repo)
_install_db_session(monkeypatch)
redis_client = _make_redis_mock(get_return=None, set_return=False) # lock not acquired
monkeypatch.setattr(vc_module, "get_redis_instance", lambda: redis_client)
post_spy = MagicMock()
monkeypatch.setattr(vc_module.requests, "post", post_spy)
vc_module.run_check()
post_spy.assert_not_called()
redis_client.setex.assert_not_called()
redis_client.delete.assert_not_called()
@pytest.mark.unit
def test_instance_id_persisted_across_runs(monkeypatch, enable_check):
repo = _FakeRepo({"version_check_notice_shown": "1"})
_install_repo(monkeypatch, repo)
_install_db_session(monkeypatch)
redis_client = _make_redis_mock(get_return=None, set_return=True)
monkeypatch.setattr(vc_module, "get_redis_instance", lambda: redis_client)
post_response = MagicMock()
post_response.status_code = 200
post_response.json.return_value = {}
monkeypatch.setattr(
vc_module.requests, "post", MagicMock(return_value=post_response)
)
vc_module.run_check()
first_id = repo.get("instance_id")
vc_module.run_check()
second_id = repo.get("instance_id")
assert first_id is not None
assert first_id == second_id
@pytest.mark.unit
def test_first_run_notice_emitted_once(monkeypatch, enable_check, capsys):
repo = _FakeRepo() # empty — notice not shown yet
_install_repo(monkeypatch, repo)
_install_db_session(monkeypatch)
# Cache hit so we don't need to mock HTTP. Notice logic runs before cache.
redis_client = _make_redis_mock(get_return=json.dumps({}).encode("utf-8"))
monkeypatch.setattr(vc_module, "get_redis_instance", lambda: redis_client)
vc_module.run_check()
first_out = capsys.readouterr().out
assert "Anonymous version check enabled" in first_out
assert repo.get("version_check_notice_shown") == "1"
vc_module.run_check()
second_out = capsys.readouterr().out
assert "Anonymous version check enabled" not in second_out
@pytest.mark.unit
def test_postgres_unavailable_skips_silently(monkeypatch, enable_check):
_install_db_session(monkeypatch, raise_exc=RuntimeError("db down"))
redis_spy = MagicMock()
post_spy = MagicMock()
monkeypatch.setattr(vc_module, "get_redis_instance", redis_spy)
monkeypatch.setattr(vc_module.requests, "post", post_spy)
vc_module.run_check()
redis_spy.assert_not_called()
post_spy.assert_not_called()
@pytest.mark.unit
def test_postgres_repo_raises_skips_silently(monkeypatch, enable_check):
repo = _FakeRepo(raise_on_get_instance=True)
_install_repo(monkeypatch, repo)
_install_db_session(monkeypatch)
redis_spy = MagicMock()
post_spy = MagicMock()
monkeypatch.setattr(vc_module, "get_redis_instance", redis_spy)
monkeypatch.setattr(vc_module.requests, "post", post_spy)
vc_module.run_check()
redis_spy.assert_not_called()
post_spy.assert_not_called()
@pytest.mark.unit
def test_redis_unavailable_proceeds_uncached(monkeypatch, enable_check):
"""``get_redis_instance()`` → None should not abort the check."""
repo = _FakeRepo({"version_check_notice_shown": "1"})
_install_repo(monkeypatch, repo)
_install_db_session(monkeypatch)
monkeypatch.setattr(vc_module, "get_redis_instance", lambda: None)
post_response = MagicMock()
post_response.status_code = 200
post_response.json.return_value = {"advisories": []}
post_spy = MagicMock(return_value=post_response)
monkeypatch.setattr(vc_module.requests, "post", post_spy)
vc_module.run_check()
post_spy.assert_called_once()
@pytest.mark.unit
def test_unknown_version_warns_and_skips(monkeypatch, enable_check):
"""get_version() → "unknown" must not hit the endpoint silently."""
repo = _FakeRepo({"version_check_notice_shown": "1"})
_install_repo(monkeypatch, repo)
_install_db_session(monkeypatch)
redis_client = _make_redis_mock(get_return=None, set_return=True)
monkeypatch.setattr(vc_module, "get_redis_instance", lambda: redis_client)
monkeypatch.setattr(vc_module, "get_version", lambda: "unknown")
post_spy = MagicMock()
monkeypatch.setattr(vc_module.requests, "post", post_spy)
with patch.object(vc_module, "logger") as mock_logger:
vc_module.run_check()
post_spy.assert_not_called()
redis_client.setex.assert_not_called()
# Lock released so the next cycle can retry.
redis_client.delete.assert_called_once_with(vc_module.LOCK_KEY)
assert mock_logger.warning.called
assert "unknown" in mock_logger.warning.call_args.args[0].lower() \
or mock_logger.warning.call_args.args[1:] == ("unknown",)
@pytest.mark.unit
def test_http_5xx_swallowed(monkeypatch, enable_check):
repo = _FakeRepo({"version_check_notice_shown": "1"})
_install_repo(monkeypatch, repo)
_install_db_session(monkeypatch)
redis_client = _make_redis_mock(get_return=None, set_return=True)
monkeypatch.setattr(vc_module, "get_redis_instance", lambda: redis_client)
post_response = MagicMock()
post_response.status_code = 503
post_response.json.return_value = {}
monkeypatch.setattr(
vc_module.requests, "post", MagicMock(return_value=post_response)
)
vc_module.run_check()
redis_client.setex.assert_not_called()
# Lock still released so the next cycle can retry.
redis_client.delete.assert_called_once_with(vc_module.LOCK_KEY)
@pytest.mark.unit
def test_http_timeout_swallowed(monkeypatch, enable_check):
repo = _FakeRepo({"version_check_notice_shown": "1"})
_install_repo(monkeypatch, repo)
_install_db_session(monkeypatch)
redis_client = _make_redis_mock(get_return=None, set_return=True)
monkeypatch.setattr(vc_module, "get_redis_instance", lambda: redis_client)
monkeypatch.setattr(
vc_module.requests,
"post",
MagicMock(side_effect=requests.Timeout("boom")),
)
# Must not raise.
vc_module.run_check()
redis_client.setex.assert_not_called()
redis_client.delete.assert_called_once_with(vc_module.LOCK_KEY)
@pytest.mark.unit
def test_compute_ttl_honors_server_override():
assert vc_module._compute_ttl({"next_check_after": 300}) == 300
assert vc_module._compute_ttl({"next_check_after": 60000}) == vc_module.CACHE_TTL_SECONDS
assert vc_module._compute_ttl({}) == vc_module.CACHE_TTL_SECONDS
assert vc_module._compute_ttl({"next_check_after": "bad"}) == vc_module.CACHE_TTL_SECONDS
# Zero/negative overrides fall back to the 6h default.
assert vc_module._compute_ttl({"next_check_after": 0}) == vc_module.CACHE_TTL_SECONDS
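
These asserts fully pin the contract; one hypothetical body that satisfies them, assuming only the ``CACHE_TTL_SECONDS`` constant the tests reference:

def _compute_ttl(body: dict) -> int:
    """The server may shorten the next check but never extend past 6h."""
    try:
        override = int(body.get("next_check_after", 0))
    except (TypeError, ValueError):
        return CACHE_TTL_SECONDS
    if 0 < override <= CACHE_TTL_SECONDS:
        return override
    return CACHE_TTL_SECONDS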
@pytest.mark.unit
def test_render_advisories_logs_warning_and_prints_banner(monkeypatch, capsys):
with patch.object(vc_module, "logger") as mock_logger:
vc_module._render_advisories(
{
"advisories": [
{
"id": "DOCSGPT-2025-001",
"title": "SSRF",
"severity": "critical",
"fixed_in": "0.17.0",
"url": "https://example.test/a",
"summary": "Your DocsGPT is vulnerable.",
},
{
"id": "DOCSGPT-2025-002",
"title": "Low-sev",
"severity": "low",
},
]
}
)
# Both advisories logged as warnings.
assert mock_logger.warning.call_count == 2
out = capsys.readouterr().out
# Only the high/critical one gets the console banner.
assert "DOCSGPT-2025-001" in out
assert "DOCSGPT-2025-002" not in out