Compare commits
6 Commits
main ... feature/de

| Author | SHA1 | Date |
| --- | --- | --- |
| | 20d7a66d57 | |
| | d6b61ae8fb | |
| | 5a8b773e40 | |
| | 73fd4d26a5 | |
| | 1423a4e4b2 | |
| | 8ce60fb5e7 | |
.gitignore · vendored · new file · 193 lines
@@ -0,0 +1,193 @@
# ==============================================================================
# PYTHON + NEXT.JS UNIFIED .GITIGNORE
# ==============================================================================

# ----------------------------------------------------------------------------
# Python: Byte-compiled & Cache
# ----------------------------------------------------------------------------
__pycache__/
*.py[cod]
*$py.class
*.so
.cython_debug/
cython_debug/

# ----------------------------------------------------------------------------
# Python: Packaging & Distribution
# ----------------------------------------------------------------------------
build/
dist/
develop-eggs/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# ----------------------------------------------------------------------------
# Python: Virtual Environments & Dependency Managers
# ----------------------------------------------------------------------------
# Virtual environments
.venv
venv/
env/
ENV/
env.bak/
venv.bak/
.pixi/
__pypackages__/

# pyenv
.python-version

# Dependency lock files (usually committed, uncomment if you prefer to ignore)
# Pipfile.lock
# poetry.lock
# uv.lock
# pdm.lock
# pixi.lock

# Tool-specific
.tox/
.nox/
.pdm-python
.pdm-build/
.poetry.toml
.pdm.toml

# ----------------------------------------------------------------------------
# Python: Testing & Coverage
# ----------------------------------------------------------------------------
htmlcov/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py.cover
.hypothesis/
.pytest_cache/
cover/
.ruff_cache/

# ----------------------------------------------------------------------------
# Python: Development & IDE
# ----------------------------------------------------------------------------
# Jupyter / IPython
.ipynb_checkpoints
profile_default/
ipython_config.py

# Type checkers & linters
.mypy_cache/
.dmypy.json
dmypy.json
.pyre/
.pytype/

# Project / IDE settings
.spyderproject
.spyproject
.ropeproject

# PyCharm / JetBrains (uncomment to ignore entire folder)
# .idea/

# VS Code (uncomment to ignore entire folder)
# .vscode/

# ----------------------------------------------------------------------------
# Python: Frameworks & Tools
# ----------------------------------------------------------------------------
# Django
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask
instance/
.webassets-cache

# Scrapy
.scrapy

# Celery
celerybeat-schedule
celerybeat-schedule.*
celerybeat.pid

# Sphinx / MkDocs / Marimo
docs/_build/
/site
marimo/_static/
marimo/_lsp/
__marimo__/

# Streamlit secrets
.streamlit/secrets.toml

# ----------------------------------------------------------------------------
# Next.js / Node.js
# ----------------------------------------------------------------------------
# Dependencies
node_modules
.pnp
.pnp.js
.pnp.loader.mjs

# Build outputs
.next/
out/
build/

# TypeScript
*.tsbuildinfo
next-env.d.ts

# Testing (Jest, etc.)
coverage

# Vercel
.vercel

# ----------------------------------------------------------------------------
# General / OS / Security
# ----------------------------------------------------------------------------
# Environment variables
.env
.env*.local
.envrc

# OS generated files
.DS_Store
Thumbs.db
*.pem

# Logs & debug
npm-debug.log*
yarn-debug.log*
yarn-error.log*
*.log

# PyPI config
.pypirc

# ==============================================================================
# End of file
# ==============================================================================
.vscode/extensions.json · vendored · new file · 10 lines
@@ -0,0 +1,10 @@
{
  "recommendations": [
    "ms-python.python",
    "ms-python.vscode-pylance",
    "charliermarsh.ruff",
    "tamasfe.even-better-toml",
    "aaron-bond.better-comments",
    "bierner.markdown-mermaid",
  ]
}
.vscode/settings.json · vendored · new file · 13 lines
@@ -0,0 +1,13 @@
{
  // Workspace settings: Apply to a specific project or workspace. Overrides User Settings, but only for that workspace.
  // Python settings
  "python.envFile": "${workspaceFolder}/.env",
  "python.terminal.activateEnvironment": true,
  "python.defaultInterpreterPath": "${workspaceFolder}/backend/.venv/bin/python",
  // Test settings
  "python.testing.pytestEnabled": true,
  "python.testing.unittestEnabled": false,
  "python.testing.cwd": "${workspaceFolder}/",
  "python.testing.pytestPath": "${workspaceFolder}/.venv/bin/pytest",
  "python.testing.autoTestDiscoverOnSaveEnabled": true,
}
.vscode/user-settings.json · vendored · new file · 168 lines
@@ -0,0 +1,168 @@
{
  // User Settings: Personal preferences that apply globally across all VS Code workspaces for that user.
  // General settings
  "security.workspace.trust.untrustedFiles": "newWindow",
  "window.zoomLevel": 2,
  "files.exclude": {
    "**/.git": true
  },
  "extensions.autoUpdate": "onlyEnabledExtensions",
  "chat.disableAIFeatures": true,
  // ChatGPT Codex
  "chatgpt.openOnStartup": true,
  // Git settings
  "git.autofetch": true,
  "git.confirmSync": false,
  "git.enableSmartCommit": true,
  "git.showActionButton": {
    "commit": false,
    "publish": false,
    "sync": false
  },
  // Explorer settings
  "explorer.excludeGitIgnore": true,
  "explorer.autoReveal": true,
  "explorer.confirmDelete": false,
  "explorer.confirmDragAndDrop": false,
  "explorer.sortOrder": "filesFirst",
  // Workbench settings
  "workbench.colorTheme": "Default Dark+",
  "workbench.editor.enablePreview": false,
  "workbench.editor.tabSizing": "shrink",
  "workbench.settings.editor": "json",
  // Editor settings
  "ruff.importStrategy": "useBundled",
  "editor.defaultFormatter": "charliermarsh.ruff",
  "editor.formatOnPaste": true,
  "editor.formatOnSave": true,
  "editor.formatOnSaveMode": "file",
  "editor.codeActionsOnSave": {
    "source.organizeImports": "always",
    "source.fixAll": "always"
  },
  "files.autoSave": "onFocusChange",
  "[json]": {
    "editor.defaultFormatter": "vscode.json-language-features"
  },
  "[jsonc]": {
    "editor.defaultFormatter": "vscode.json-language-features"
  },
  // Debug settings
  "debug.toolBarLocation": "docked",
  // Terminal settings
  "terminal.integrated.tabs.enabled": true,
  "terminal.integrated.tabs.hideCondition": "never",
  "terminal.integrated.tabs.location": "right",
  // Markdown settings
  "markdown.preview.scrollEditorWithPreview": true,
  "markdown.preview.scrollPreviewWithEditor": true,
  // Color customization settings
  "workbench.colorCustomizations": {
    // Status bar
    "statusBar.background": "#00D396",
    "statusBar.foreground": "#0c1b29",
    "statusBar.noFolderBackground": "#2A5677",
    "statusBar.debuggingBackground": "#511f1f",
    "statusBarItem.remoteBackground": "#00D396",
    "statusBarItem.remoteForeground": "#0c1b29",
    // Activity Bar (right bar)
    "activityBar.background": "#0c1b29",
    "activityBar.foreground": "#A1F7DB",
    "activityBarBadge.background": "#00D396",
    "activityBarBadge.foreground": "#0c1b29",
    // Side bar (left panel)
    "sideBar.background": "#0c1b29",
    "sideBar.foreground": "#EDEDF0",
    "sideBarTitle.foreground": "#A1F7DB",
    "sideBarSectionHeader.background": "#2A5677",
    // Editor
    "editor.background": "#0c1b29",
    "editor.foreground": "#EDEDF0",
    "editor.lineHighlightBackground": "#005bd330",
    "editor.selectionBackground": "#2A567780",
    "editorCursor.foreground": "#A1F7DB",
    // Tab colors
    "tab.activeBackground": "#0c1b29",
    "tab.activeBorderTop": "#00D396",
    "tab.activeForeground": "#A1F7DB",
    "tab.unfocusedActiveBorder": "#ffffff",
    "tab.inactiveBackground": "#0c1b29",
    "tab.inactiveForeground": "#ffffff",
    // Editor group header
    "editorGroupHeader.tabsBackground": "#0c1b29",
    "editorGroupHeader.tabsBorder": "#00D396",
    "editorGroupHeader.noTabsBackground": "#2A5677",
    // Scrollbar
    "scrollbarSlider.background": "#A1F7DB90",
    "scrollbarSlider.hoverBackground": "#00D39690",
    // Terminal
    "terminal.background": "#0c1b29",
    "terminal.tab.activeBorder": "#00D396",
    "terminal.tab.background": "#2A5677",
    "terminal.tab.activeForeground": "#00D396",
    "terminal.tab.inactiveForeground": "#A1F7DB",
    // Panel
    "panelTitle.activeBorder": "#00D396",
    "panel.background": "#0c1b29",
    // Notifications
    "notification.background": "#2A5677",
    "notification.foreground": "#EDEDF0",
    "notification.infoBackground": "#00D396",
    "notification.warningBackground": "#A1F7DB",
    "notification.errorBackground": "#511f1f",
    // Window
    "window.activeBorder": "#0c1b29",
    "window.inactiveBorder": "#00D396",
    "titleBar.activeBackground": "#0c1b29",
    "titleBar.activeForeground": "#A1F7DB",
    "titleBar.inactiveBackground": "#2A5677",
    "titleBar.inactiveForeground": "#A1F7DB",
    // Button styles
    "button.background": "#00D396",
    "button.foreground": "#0c1b29",
    "button.hoverBackground": "#00B386",
    // Input styles
    "input.background": "#0c1b29",
    "input.foreground": "#ffffff",
    "input.placeholderForeground": "#A1F7DB80",
    "inputValidation.errorBackground": "#511f1f",
    "inputValidation.errorForeground": "#EDEDF0",
    "inputValidation.errorBorder": "#FF5555",
    // Quick Open / Command Palette input box
    "quickInput.background": "#0c1b29",
    "quickInput.foreground": "#ffffff",
    "quickInputTitle.background": "#0F2436",
    "pickerGroup.foreground": "#ffffff",
    "pickerGroup.border": "#00D396",
    "pickerGroup.background": "#00D396",
    // Icons and decorations for quick
    "keybindingLabel.foreground": "#1E3A57",
    "keybindingLabel.background": "#00D396",
    "keybindingLabel.border": "#00D396",
    "keybindingLabel.bottomBorder": "#00D396",
    // Quick Open/Command Palette selected item
    "list.activeSelectionBackground": "#2a567775",
    "list.activeSelectionForeground": "#ffffff",
    "list.activeSelectionIconForeground": "#A1F7DB",
    "list.hoverBackground": "#1E3A57",
    "list.inactiveSelectionBackground": "#2A5677",
    "list.inactiveSelectionForeground": "#A1F7DB",
    // Editor widget (Quick Open, Search, Replace)
    "editorWidget.background": "#0c1b29",
    "editorWidget.border": "#00D396",
    "editorWidget.foreground": "#EDEDF0",
    "editor.findMatchBackground": "#00D39630",
    "editor.findMatchHighlightBackground": "#2A567780",
    "editor.findRangeHighlightBackground": "#2A567780",
    "editor.findMatchBorder": "#00D396",
    "editor.findMatchHighlightBorder": "#00D396"
  },
  "workbench.startupEditor": "none",
  "python.analysis.typeCheckingMode": "strict",
  "markdown-pdf.displayHeaderFooter": false,
  "markdown-pdf.highlightStyle": "github.css",
  "markdown-mermaid.darkModeTheme": "forest",
  "markdown-mermaid.lightModeTheme": "forest",
  "markdown-pdf.mermaidServer": "https://unpkg.com/mermaid@11.12.1/dist/mermaid.js",
  "markdown-pdf.executablePath": "/opt/google/chrome/google-chrome"
}
README.md · 739 lines
@@ -71,749 +71,12 @@

| **⭐ Shine** *(Recommended)* | 50 | **kr 5 999** | The sweet spot for building natural fluency and confidence. |
| **Radiance** | 200 | **kr 17 999** | Designed for dedicated learners seeking transformation. |

## 4. Configuration
## 4. Project Structure

### 4.1 Configure the VPS

#### 4.1.1 Configure the firewall at the VPS host

| Public IP |
| :------------: |
| 217.154.51.242 |

| Action | Allowed IP | Protocol | Port(s) | Description |
| :-----: | :--------: | :------: | ----------: | :------------ |
| Allow | Any | TCP | 80 | HTTP |
| Allow | Any | TCP | 443 | HTTPS |
| Allow | Any | TCP | 2222 | Git SSH |
| Allow | Any | TCP | 2885 | VPS SSH |
| Allow | Any | UDP | 3478 | STUN/TURN |
| Allow | Any | TCP | 5349 | TURN/TLS |
| Allow | Any | TCP | 7881 | LiveKit TCP |
| Allow | Any | UDP | 50000-60000 | LiveKit Media |
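If you also want to enforce the same rules on the VPS itself, rather than only in the hosting provider's panel, a minimal `ufw` sketch is shown below. The ports mirror the table above; using `ufw` at all is an assumption and not part of the documented setup.

```bash
# Hypothetical host-level mirror of the provider firewall rules (assumes ufw is installed)
sudo ufw allow 80/tcp           # HTTP
sudo ufw allow 443/tcp          # HTTPS
sudo ufw allow 2222/tcp         # Git SSH
sudo ufw allow 2885/tcp         # VPS SSH (custom port, see 4.1.3)
sudo ufw allow 3478/udp         # STUN/TURN
sudo ufw allow 5349/tcp         # TURN/TLS
sudo ufw allow 7881/tcp         # LiveKit TCP
sudo ufw allow 50000:60000/udp  # LiveKit media
sudo ufw enable
```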

#### 4.1.2 Configure the DNS settings at the domain registrar

| Host (avaaz.ai) | Type | Value |
| :-------------: | :---: | :------------: |
| @ | A | 217.154.51.242 |
| www | CNAME | avaaz.ai |
| app | A | 217.154.51.242 |
| api | A | 217.154.51.242 |
| rtc | A | 217.154.51.242 |
| git | A | 217.154.51.242 |

#### 4.1.3 Change the SSH port from 22 to 2885

1. Connect to the server.

    ```bash
    ssh username@avaaz.ai
    ```

2. Edit the SSH configuration file.

    ```bash
    sudo nano /etc/ssh/sshd_config
    ```

3. Add port 2885 to the file and comment out port 22.

    ```text
    #Port 22
    Port 2885
    ```

4. Save the file and exit the editor.

    - Press `Ctrl+O`, then `Enter` to save, and `Ctrl+X` to exit.

5. Restart the SSH service.

    ```bash
    sudo systemctl daemon-reload && sudo systemctl restart ssh.socket && sudo systemctl restart ssh.service
    ```

6. **Before closing the current session**, open a new terminal window and connect to the server to verify the changes work correctly.

    ```bash
    ssh username@avaaz.ai # ssh: connect to host avaaz.ai port 22: Connection timed out
    ssh username@avaaz.ai -p 2885
    ```

7. Once the connection is successful, close the original session safely.

#### 4.1.4 Build and deploy the infrastructure

1. Check with `dig git.avaaz.ai +short` whether the DNS settings have been propagated.
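    A quick check might look like this; the exact output depends on propagation at your registrar, and the expected address is the public IP from section 4.1.1:

    ```bash
    dig git.avaaz.ai +short   # should eventually print 217.154.51.242
    ```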

2. SSH into the VPS to install Docker & docker compose.

    ```bash
    ssh username@avaaz.ai -p 2885
    ```

3. Update system packages.

    ```bash
    sudo apt update && sudo apt upgrade -y
    ```

4. Install dependencies for Docker’s official repo.

    ```bash
    sudo apt install -y \
        ca-certificates \
        curl \
        gnupg \
        lsb-release
    ```

5. Add Docker’s official APT repo.

    ```bash
    sudo install -m 0755 -d /etc/apt/keyrings

    curl -fsSL https://download.docker.com/linux/ubuntu/gpg | \
        sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg

    echo \
        "deb [arch=$(dpkg --print-architecture) \
        signed-by=/etc/apt/keyrings/docker.gpg] \
        https://download.docker.com/linux/ubuntu \
        $(lsb_release -cs) stable" | \
        sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

    sudo apt update
    ```

6. Install Docker Engine + compose plugin.

    ```bash
    sudo apt install -y \
        docker-ce \
        docker-ce-cli \
        containerd.io \
        docker-buildx-plugin \
        docker-compose-plugin
    ```

7. Verify the installation.

    ```bash
    sudo docker --version
    sudo docker compose version
    ```

8. Create the `/etc/docker/daemon.json` file to keep container logs from consuming excessive disk space.

    ```bash
    sudo nano /etc/docker/daemon.json
    ```

9. Paste the following.

    ```json
    {
      "log-driver": "local",
      "log-opts": {
        "max-size": "10m",
        "max-file": "3"
      }
    }
    ```

10. Save the file and exit the editor.

    - Press `Ctrl+O`, then `Enter` to save, and `Ctrl+X` to exit.

11. Restart the Docker service to apply changes.

    ```bash
    sudo systemctl daemon-reload
    sudo systemctl restart docker
    ```

12. Create a directory for the infra stack in `/srv/infra`.

    ```bash
    sudo mkdir -p /srv/infra
    sudo chown -R $USER:$USER /srv/infra
    cd /srv/infra
    ```

13. Create directories for Gitea (repos, config, etc.) and Runner persistent data. Gitea runs as UID/GID 1000 by default.

    ```bash
    mkdir -p gitea-data gitea-runner-data
    ```

14. Create the `/srv/infra/docker-compose.yml` (Caddy + Gitea + Runner) file.

    ```bash
    nano docker-compose.yml
    ```

15. Paste the following.

    ```yaml
    services:
      caddy:
        # Use the latest official Caddy image
        image: caddy:latest
        # Docker Compose automatically generates container names: <folder>_<service>_<index>
        container_name: caddy # Fixed name used by Docker engine
        # Automatically restart unless manually stopped
        restart: unless-stopped
        ports:
          # Expose HTTP (ACME + redirect)
          - "80:80"
          # Expose HTTPS/WSS (frontend, backend, LiveKit)
          - "443:443"
        volumes:
          # Mount the Caddy config file read-only
          - ./Caddyfile:/etc/caddy/Caddyfile:ro
          # Caddy TLS certs (persistent Docker volume)
          - caddy_data:/data
          # Internal Caddy state/config
          - caddy_config:/config
        networks:
          # Attach to the shared "proxy" network
          - proxy

      gitea:
        # Official Gitea image with built-in Actions
        image: gitea/gitea:latest
        container_name: gitea # Fixed name used by Docker engine
        # Auto-restart service
        restart: unless-stopped
        environment:
          # Run Gitea as host user 1000 (prevents permission issues)
          - USER_UID=1000
          # Same for group
          - USER_GID=1000
          # Use SQLite (stored inside /data)
          - GITEA__database__DB_TYPE=sqlite3
          # Location of the SQLite DB
          - GITEA__database__PATH=/data/gitea/gitea.db
          # Custom config directory
          - GITEA_CUSTOM=/data/gitea
        volumes:
          # Bind mount instead of Docker volume because:
          # - We want repos, configs, SSH keys, and SQLite DB **visible and editable** on host
          # - Easy backups (just copy `./gitea-data`)
          # - Easy migration
          # - Avoids losing data if Docker volumes are pruned
          - ./gitea-data:/data
        networks:
          - proxy
        ports:
          # SSH for Git operations mapped to host 2222
          - "2222:22"

      gitea-runner:
        # Official Gitea Actions Runner
        image: gitea/act_runner:latest
        container_name: gitea-runner # Fixed name used by Docker engine
        restart: unless-stopped
        depends_on:
          # Runner requires Gitea to be available
          - gitea
        volumes:
          # Runner uses host Docker daemon to spin up job containers (Docker-out-of-Docker)
          - /var/run/docker.sock:/var/run/docker.sock
          # Bind mount instead of volume because:
          # - Runner identity is stored in /data/.runner
          # - Must persist across container recreations
          # - Prevents duplicated runner registrations in Gitea
          # - Easy to inspect/reset via `./gitea-runner-data/.runner`
          - ./gitea-runner-data:/data
        environment:
          # Base URL of your Gitea instance
          - GITEA_INSTANCE_URL=${GITEA_INSTANCE_URL}
          # One-time registration token
          - GITEA_RUNNER_REGISTRATION_TOKEN=${GITEA_RUNNER_REGISTRATION_TOKEN}
          # Human-readable name for the runner
          - GITEA_RUNNER_NAME=${GITEA_RUNNER_NAME}
          # Runner labels (e.g., ubuntu-latest)
          - GITEA_RUNNER_LABELS=${GITEA_RUNNER_LABELS}
          # Set container timezone to UTC for consistent logs
          - TZ=Etc/UTC
        networks:
          - proxy
        # Start runner using persisted config
        command: ["act_runner", "daemon", "--config", "/data/.runner"]

    networks:
      proxy:
        # Shared network for Caddy + Gitea (+ later app stack)
        name: proxy
        # Default Docker bridge network
        driver: bridge

    volumes:
      # Docker volume for Caddy TLS data (safe to keep inside Docker)
      caddy_data:
        name: caddy_data
      # Docker volume for internal Caddy configs/state
      caddy_config:
        name: caddy_config
    ```

16. Save the file and exit the editor.

    - Press `Ctrl+O`, then `Enter` to save, and `Ctrl+X` to exit.

17. Create the `/srv/infra/.env` file with environment variables.

    ```bash
    nano .env
    ```

18. Paste the following:

    ```env
    # Base URL of your Gitea instance (used by the runner to register itself
    # and to send/receive workflow job information).
    GITEA_INSTANCE_URL=https://git.avaaz.ai

    # One-time registration token generated in:
    # Gitea → Site Administration → Actions → Runners → "Generate Token"
    # This MUST be filled in once, so the runner can register.
    # After registration, the runner stores its identity inside ./gitea-runner-data/.runner
    # and this value is no longer needed (can be left blank).
    GITEA_RUNNER_REGISTRATION_TOKEN=

    # Human-readable name for this runner.
    # This is shown in the Gitea UI so you can distinguish multiple runners:
    # Example: "vps-runner", "staging-runner", "gpu-runner"
    GITEA_RUNNER_NAME=gitea-runner

    # Runner labels allow workflows to choose specific runners.
    # The label format is: label[:schema[:args]]
    # - "ubuntu-latest" is the <label> name that workflows request using runs-on: [ "ubuntu-latest" ].
    # - "docker://" is the <schema> indicating the job runs inside a separate Docker container.
    # - "catthehacker/ubuntu:act-latest" is the <args>, specifying the Docker image to use for the container.
    # Workflows can target this using:
    # runs-on: [ "ubuntu-latest" ]
    GITEA_RUNNER_LABELS=ubuntu-latest:docker://catthehacker/ubuntu:act-latest
    ```

19. Save the file and exit the editor.

    - Press `Ctrl+O`, then `Enter` to save, and `Ctrl+X` to exit.

20. Create `/srv/infra/Caddyfile` to configure Caddy.

    ```bash
    nano Caddyfile
    ```

21. Paste the following:

    ```caddy
    {
        # Global Caddy options.
        #
        # auto_https on
        # - Caddy listens on port 80 for every host (ACME + redirect).
        # - Automatically issues HTTPS certificates.
        # - Automatically redirects HTTP → HTTPS unless disabled.
        #
    }

    # ------------------------------------------------------------
    # Redirect www → root domain
    # ------------------------------------------------------------
    www.avaaz.ai {
        # Permanent redirect to naked domain
        redir https://avaaz.ai{uri} permanent
    }

    # ------------------------------------------------------------
    # Marketing site (optional — if frontend handles it, remove this)
    # Redirect root → app
    # ------------------------------------------------------------
    avaaz.ai {
        # If you have a static marketing page, serve it here.
        # If not, redirect visitors to the app.
        redir https://app.avaaz.ai{uri}
    }

    # ------------------------------------------------------------
    # Frontend (Next.js)
    # Public URL: https://app.avaaz.ai
    # Internal target: frontend:3000
    # ------------------------------------------------------------
    app.avaaz.ai {
        # Reverse-proxy HTTPS traffic to the frontend container
        reverse_proxy frontend:3000

        # Access log for debugging frontend activity
        log {
            output file /data/app-access.log
        }

        # Compression for faster delivery of JS, HTML, etc.
        encode gzip zstd
    }

    # ------------------------------------------------------------
    # Backend (FastAPI)
    # Public URL: https://api.avaaz.ai
    # Internal target: backend:8000
    # ------------------------------------------------------------
    api.avaaz.ai {
        # Reverse-proxy all API traffic to FastAPI
        reverse_proxy backend:8000

        # Access log — useful for monitoring API traffic and debugging issues
        log {
            output file /data/api-access.log
        }

        # Enable response compression (JSON, text, etc.)
        encode gzip zstd
    }

    # ------------------------------------------------------------
    # LiveKit (signaling only — media uses direct UDP)
    # Public URL: wss://rtc.avaaz.ai
    # Internal target: livekit:7880
    # ------------------------------------------------------------
    rtc.avaaz.ai {
        # LiveKit uses WebSocket signaling, so we reverse-proxy WS → WS
        reverse_proxy livekit:7880

        # Access log — helps diagnose WebRTC connection failures
        log {
            output file /data/rtc-access.log
        }

        # Compression not needed for WS traffic, but harmless
        encode gzip zstd
    }

    # ------------------------------------------------------------
    # Gitea (Git server UI + HTTPS + SSH clone)
    # Public URL: https://git.avaaz.ai
    # Internal target: gitea:3000
    # ------------------------------------------------------------
    git.avaaz.ai {
        # Route all HTTPS traffic to Gitea’s web UI
        reverse_proxy gitea:3000

        # Log all Git UI requests and API access
        log {
            output file /data/git-access.log
        }

        # Compress UI responses
        encode gzip zstd
    }
    ```

22. Save the file and exit the editor.

    - Press `Ctrl+O`, then `Enter` to save, and `Ctrl+X` to exit.

23. Start the stack from `/srv/infra`.

    ```bash
    sudo docker compose pull # fetch images: caddy, gitea, act_runner
    sudo docker compose up -d # start all containers in the background
    ```

24. Verify that the status of all the containers is `Up`.

    ```bash
    sudo docker compose ps -a
    ```

25. Open `https://git.avaaz.ai` in your browser. Caddy should have already obtained a cert and you should see the Gitea installer.
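    As an optional command-line check (not part of the original steps), you can probe the certificate before opening the browser:

    ```bash
    curl -I https://git.avaaz.ai   # an HTTP response without TLS errors means Caddy obtained the certificate
    ```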

26. Configure database settings.

    - **Database Type:** `SQLite3`
    - **Path:** `/data/gitea/gitea.db` *(matches `GITEA__database__PATH`)*

27. Configure general settings.

    - **Site Title:** default *(`Gitea: Git with a cup of tea`)*
    - **Repository Root Path:** default *(`/data/git/repositories`)*
    - **LFS Root Path:** default *(`/data/git/lfs`)*

28. Configure server settings.

    - **Domain:** `git.avaaz.ai` *(external HTTPS via Caddy)*
    - **SSH Port:** `2222` *(external SSH port)*
    - **HTTP Port:** `3000` *(internal HTTP port)*
    - **Gitea Base URL / ROOT_URL:** `https://git.avaaz.ai/`

29. Create the admin account (username + password + email) and finish installation.

30. Edit Gitea `/data/gitea/conf/app.ini` at the host bind mount `/srv/infra/gitea-data/gitea/conf/app.ini`.

    ```bash
    nano gitea-data/gitea/conf/app.ini
    ```

31. Add/verify the following sections.

    ```ini
    [server]
    ; Gitea serves HTTP internally (Caddy handles HTTPS externally)
    PROTOCOL = http
    ; External hostname used for links and redirects
    DOMAIN = git.avaaz.ai
    ; Hostname embedded in SSH clone URLs
    SSH_DOMAIN = git.avaaz.ai
    ; Internal container port Gitea listens on (Caddy reverse-proxies to this)
    HTTP_PORT = 3000
    ; Public-facing base URL (MUST be HTTPS when behind Caddy)
    ROOT_URL = https://git.avaaz.ai/
    ; Enable Gitea's built-in SSH server inside the container
    DISABLE_SSH = false
    ; Host-side SSH port exposed by Docker (mapped to container:22)
    SSH_PORT = 2222
    ; Container-side SSH port (always 22 inside the container)
    SSH_LISTEN_PORT = 22

    [database]
    ; SQLite database file stored in bind-mounted volume
    PATH = /data/gitea/gitea.db
    ; Using SQLite (sufficient for single-node small/medium setups)
    DB_TYPE = sqlite3

    [security]
    ; Prevent web-based reinstallation (crucial for a secured instance)
    INSTALL_LOCK = true
    ; Auto-generated on first startup; DO NOT change or delete
    SECRET_KEY =

    [actions]
    ; Enable Gitea Actions (CI/CD)
    ENABLED = true
    ; Default platform to get action plugins, github for https://github.com, self for the current Gitea instance.
    DEFAULT_ACTIONS_URL = github
    ```

32. Restart Gitea to apply changes.

    ```bash
    sudo docker compose restart gitea
    ```

33. Check whether Actions is enabled.

    1. Log in as admin at `https://git.avaaz.ai`.
    2. Go to **Site Administration**.
    3. Look for a menu item **Actions**. If `[actions] ENABLED = true` in `app.ini`, there will be options related to **Runners**, allowing management of instance-level action runners. Otherwise, the Actions menu item in the Site Administration panel will not appear, indicating the feature is globally disabled.

34. Get a registration token to register the Gitea Actions runner, and create a *user* account.

    1. Log in as admin at `https://git.avaaz.ai`.
    2. Go to **Site Administration → Actions → Runners**.
    3. Choose **Create new Runner**.
    4. Copy the **Registration Token**.
    5. Create a *user* account.

35. Edit `.env` to add the token.

    ```bash
    nano .env
    ```

36. Paste the Registration Token after `=` without spaces.

    ```env
    # One-time registration token generated in:
    # Gitea → Site Administration → Actions → Runners → "Generate Token"
    # This MUST be filled in once, so the runner can register.
    # After registration, the runner stores its identity inside ./gitea-runner-data/.runner
    # and this value is no longer needed (can be left blank).
    GITEA_RUNNER_REGISTRATION_TOKEN=
    ```

37. Apply the configuration change and recreate the `gitea-runner` container.

    ```bash
    sudo docker compose up -d gitea-runner
    ```

38. Confirm that the Gitea instance URL, Runner name, and Runner labels in the `gitea-runner-data/.runner` file are the same as the values in the `.env` file; a quick way to compare them is shown below. Fix them using `nano gitea-runner-data/.runner` if they differ.
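    A simple way to eyeball both files side by side (paths as used earlier in this guide; the exact layout of `.runner` depends on the act_runner version):

    ```bash
    cat gitea-runner-data/.runner   # registered instance URL, runner name, and labels
    grep GITEA_ .env                # the values the runner was supposed to register with
    ```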

39. Verify that the Runner is connected to `https://git.avaaz.ai` and is polling for jobs.

    ```bash
    sudo docker logs -f gitea-runner
    ```

40. Generate an SSH key on the laptop. Accept the defaults and optionally set a passphrase. The public key is placed in `~/.ssh/id_ed25519.pub`.

    ```bash
    ssh-keygen -t ed25519 -C "user@avaaz.ai"
    ```

41. Add the public key to Gitea.

    1. Log into `https://git.avaaz.ai` as *user*.
    2. Go to **Profile → Settings → SSH / GPG Keys → Add Key**.
    3. Paste the contents of `~/.ssh/id_ed25519.pub` (the line starting with `ssh-ed25519`); a command to print it is shown after this list.
    4. Save.
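    To print the public key for copying (path from step 40):

    ```bash
    cat ~/.ssh/id_ed25519.pub   # copy the whole line, starting with "ssh-ed25519"
    ```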

42. Test the SSH remote on the laptop.

    ```bash
    ssh -T -p 2222 git@git.avaaz.ai
    ```

43. Type `yes` to tell the SSH client to trust the fingerprint and press `Enter`. Enter the passphrase and verify the response *You've successfully authenticated..., but Gitea does not provide shell access.*

44. Confirm that Gitea’s **clone URLs** of a repo show `ssh://git@git.avaaz.ai:2222/<user>/<repo>.git`.

45. Upgrade Docker images safely.

    ```bash
    sudo docker compose pull # pull newer images
    sudo docker compose up -d # recreate containers with new images
    ```

46. Restart the whole infra stack.

    ```bash
    sudo docker compose restart # restart all containers
    ```

47. Check logs for troubleshooting.

    ```bash
    sudo docker logs -f caddy # shows “obtaining certificate” or ACME errors if HTTPS fails.
    sudo docker logs -f gitea # shows DB/permissions problems, config issues, etc.
    sudo docker logs -f gitea-runner # shows registration/connection/job-execution issues.
    ```

#### 4.1.5 Validate the infrastructure

1. Confirm that all containers `caddy`, `gitea`, and `gitea-runner` are `Up`.

    ```bash
    sudo docker compose ps -a
    ```

2. Confirm that `https://git.avaaz.ai` shows the Gitea login page with a valid TLS cert (padlock icon) when opened in a browser.

3. Confirm the response *You've successfully authenticated..., but Gitea does not provide shell access.* when connecting to Gitea over SSH.

    ```bash
    ssh -T -p 2222 git@git.avaaz.ai
    ```

4. Create a `test` repo in Gitea and confirm that it can be cloned.

    ```bash
    git clone ssh://git@git.avaaz.ai:2222/<your-user>/test.git
    ```

5. Confirm that the Actions runner `gitea-runner` is registered and online with status **Idle**.

    1. Log in as admin at `https://git.avaaz.ai`.
    2. Go to **Site Administration → Actions → Runners**.

6. Add `.gitea/workflows/test.yml` to the `test` repo root, commit, and push (an example of the Git commands follows the workflow file below).

    ```yaml
    # Workflow Name
    name: Test Workflow

    # Trigger on a push event to any branch
    on:
      push:
        branches:
          # This means 'any branch'
          - '**'

    # Define the jobs to run
    jobs:
      hello:
        # Specify the runner image to use
        runs-on: [ "ubuntu-latest" ]

        # Define the steps for this job
        steps:
          - name: Run a Test Script
            run: echo "Hello from Gitea Actions!"
    ```
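    A minimal sketch of the commit-and-push step, assuming you are working inside the cloned `test` repo (the commit message is arbitrary):

    ```bash
    git add .gitea/workflows/test.yml
    git commit -m "Add test workflow"
    git push
    ```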

7. Confirm a workflow run appears in Gitea → test repo → **Actions** tab and progresses from queued → in progress → success.

8. Confirm the logs show the job being picked up, the container being created, and the “Hello from Gitea Actions!” output.

    ```bash
    sudo docker logs -f gitea-runner
    ```

### 4.2 Configure the Development Laptop

#### 4.2.1 Run Application

1. Remove all cached Python packages stored by pip, remove local Python cache files, clear the cache used by uv, and forcibly clear the cache for Node.js.

    ```bash
    uv tool install cleanpy
    pip cache purge && cleanpy . && uv cache clean && npm cache clean --force
    ```

2. Resolve dependencies from your *pyproject.toml* and upgrade all packages. Synchronize the virtual environment with the dependencies specified in *uv.lock*, including packages needed for development.

    ```bash
    cd backend
    uv lock --upgrade
    uv sync --dev
    ```

3. Lint and check code for errors, style issues, and potential bugs, and try to fix them. Discover and run tests in *tests/*.

    ```bash
    cd backend
    uv run ruff check --fix && uv run pytest
    ```

4. Start a local development API server on port 8000 that automatically reloads as you make code changes.

    ```bash
    cd backend
    uv run uvicorn src.main:app --reload --port 8000
    ```

5. Scan dependencies for security vulnerabilities and attempt to automatically fix them by force-updating to the latest secure versions.

    ```bash
    cd frontend
    npm audit fix --force
    ```

6. Install dependencies from *package.json*, then update those dependencies to the latest allowed versions based on version ranges. Next, check the source code for stylistic and syntax errors according to configured rules. Finally, compile or bundle the application for deployment or production use.

    ```bash
    cd frontend
    npm install && npm update && npm run lint && npm run build
    ```

7. Execute the start script in *package.json* to launch the Node.js application in production mode.

    ```bash
    cd frontend
    npm run start
    ```

## 5. Example Project Structure

```bash
avaaz.ai/
├── .dockerignore # Specifies files and directories to exclude from Docker builds, such as .git, node_modules, and build artifacts, to optimize image sizes.
├── .gitignore # Lists files and patterns to ignore in Git, including .env, __pycache__, node_modules, and logs, preventing sensitive or temporary files from being committed.
├── .gitattributes # Controls Git’s handling of files across platforms (e.g. normalizing line endings with * text=auto), and can force certain files to be treated as binary or configure diff/merge drivers.
│
├── .env.example # Template for environment variables, showing required keys like DATABASE_URL, GEMINI_API_KEY, LIVEKIT_API_KEY without actual values.
├── docker-compose.dev.yml # Docker Compose file for development environment: defines services for local frontend, backend, postgres, livekit with volume mounts for hot-reloading.
app/.dockerignore · new file · 126 lines
@@ -0,0 +1,126 @@
# ==============================================================================
# .dockerignore – Python + Next.js (Docker Compose)
# ==============================================================================

# ----------------------------------------------------------------------------
# Git & Version Control
# ----------------------------------------------------------------------------
.git
.gitignore
.gitattributes
.github
.gitpod.yml

# ----------------------------------------------------------------------------
# Python-specific (already in .gitignore, but repeat for safety)
# ----------------------------------------------------------------------------
__pycache__/
*.py[cod]
*$py.class
*.so
*.egg-info/
.installed.cfg
*.egg
*.manifest
*.spec

# Virtual environments & caches
.venv
venv/
env/
ENV/
.pixi/
__pypackages__/
.tox/
.nox/
.pdm-python
.pdm-build/

# Testing & coverage
htmlcov/
.coverage
.coverage.*
.pytest_cache/
.coverage/
.ruff_cache/
.mypy_cache/
.pyre/
.pytype/

# Jupyter / notebooks
.ipynb_checkpoints

# IDEs & editors
.idea/
.vscode/
*.swp
*.swo
*~

# ----------------------------------------------------------------------------
# Next.js / Node.js
# ----------------------------------------------------------------------------
node_modules/
.next/
out/
build/
dist/
.npm
.pnp.*
.yarn/
.yarn-cache/
.yarn-unplugged/

# TypeScript build info
*.tsbuildinfo
next-env.d.ts

# Logs & debug
npm-debug.log*
yarn-debug.log*
yarn-error.log*
.pnpm-debug.log*

# ----------------------------------------------------------------------------
# Environment & Secrets (never send to Docker daemon)
# ----------------------------------------------------------------------------
.env
.env.local
.env*.local
.env.production
.env.development
.envrc
*.pem
*.key
*.crt
*.secrets
.streamlit/secrets.toml

# ----------------------------------------------------------------------------
# Docker & Compose (avoid recursive inclusion)
# ----------------------------------------------------------------------------
Dockerfile*
docker-compose*.yml
docker-compose*.yaml
.dockerignore

# ----------------------------------------------------------------------------
# Misc / OS
# ----------------------------------------------------------------------------
.DS_Store
Thumbs.db
desktop.ini

# Local documentation builds
/site
docs/_build/

# Temporary files
tmp/
temp/
*.tmp
*.log

# ==============================================================================
# End of file
# ==============================================================================
app/.env.example · new file · 49 lines
@@ -0,0 +1,49 @@
#
# Sample environment for docker compose. Copy to .env and adjust.
#
# Profiles:
# dev - laptop development (hot reload + localhost ports)
# prod - VPS behind Caddy (no public container ports; secrets provided by CI/CD)
#

COMPOSE_PROFILES=dev
DOCKER_RESTART_POLICY=unless-stopped

# PostgreSQL
POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres
POSTGRES_DB=avaaz
POSTGRES_HOST=postgres
POSTGRES_PORT=5432
DATABASE_URL=postgresql+psycopg://${POSTGRES_USER}:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}

# Backend
ENVIRONMENT=development
SECRET_KEY=dev-secret-change-me
GUNICORN_WORKERS=4
GUNICORN_TIMEOUT=120
OPENAI_API_KEY=
GOOGLE_API_KEY=

# LiveKit
LIVEKIT_API_KEY=devkey
LIVEKIT_API_SECRET=devsecret
LIVEKIT_LOG_LEVEL=info
LIVEKIT_WS_URL=ws://livekit:7880
LIVEKIT_URL=http://livekit:7880

# Frontend
NEXT_PUBLIC_API_URL=http://localhost:8000
NEXT_PUBLIC_LIVEKIT_WS_URL=ws://localhost:7880

# Production overrides (supply via secrets/CI, not committed):
# COMPOSE_PROFILES=prod
# ENVIRONMENT=production
# SECRET_KEY=<strong-random-secret>
# NEXT_PUBLIC_API_URL=https://api.avaaz.ai
# NEXT_PUBLIC_LIVEKIT_WS_URL=wss://rtc.avaaz.ai
# LIVEKIT_WS_URL=ws://livekit:7880
# LIVEKIT_API_KEY=<lk-key>
# LIVEKIT_API_SECRET=<lk-secret>
# OPENAI_API_KEY=<openai-key>
# GOOGLE_API_KEY=<gemini-key>
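A minimal sketch of how this template is meant to be used, assuming the copy sits next to the compose file you start; the compose file name `docker-compose.dev.yml` is taken from the project structure above and may differ in practice:

```bash
cp .env.example .env                              # local, uncommitted copy; fill in the empty keys
docker compose -f docker-compose.dev.yml up -d    # COMPOSE_PROFILES=dev in .env selects the dev profile
```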
app/README.md · new file · 47 lines
@@ -0,0 +1,47 @@
# 1. Run Application

1. Remove all cached Python packages stored by pip, remove local Python cache files, clear the cache used by uv, and forcibly clear the cache for Node.js.

    ```bash
    uv tool install cleanpy
    pip cache purge && cleanpy . && uv cache clean && npm cache clean --force
    ```

2. Resolve dependencies from *pyproject.toml* and upgrade all packages. Synchronize the virtual environment with the dependencies specified in *uv.lock*, including packages needed for **development**.

    ```bash
    cd backend
    uv lock --upgrade
    uv sync --dev
    ```

3. Lint and check code for errors, style issues, and potential bugs, and try to fix them. Discover and run tests in *tests/*.

    ```bash
    uv run ruff check --fix && uv run pytest
    ```

4. Start a local **development** API server on port 8000 that automatically reloads when code changes are made.

    ```bash
    uv run uvicorn src.main:app --reload --port 8000
    ```

5. Open a new terminal. Scan dependencies for security vulnerabilities and attempt to automatically fix them by force-updating to the latest secure versions.

    ```bash
    cd frontend
    npm audit fix --force
    ```

6. Install dependencies from *package.json*, then update those dependencies to the latest allowed versions based on version ranges. Next, check the source code for stylistic and syntax errors according to configured rules. Finally, compile or bundle the application for deployment or production use.

    ```bash
    npm install && npm update && npm run lint && npm run build
    ```

7. Execute the `dev` script in *package.json* to launch the Node.js application in **development** mode.

    ```bash
    npm run dev
    ```
app/backend/Dockerfile · new file · 84 lines
@@ -0,0 +1,84 @@
#
# BACKEND DOCKERFILE
#
# Multi-stage image for the FastAPI + LiveKit Agent backend using uv.
# - production: smallest runtime image with gunicorn/uvicorn worker
# - development: hot-reload friendly image with full toolchain
# - builder: installs dependencies once for reuse across stages
#
# Keep dependency definitions aligned with docs/architecture.md.

FROM python:3.12-slim AS base

ENV PYTHONDONTWRITEBYTECODE=1 \
    PYTHONUNBUFFERED=1 \
    PIP_NO_CACHE_DIR=1 \
    PIP_DISABLE_PIP_VERSION_CHECK=1 \
    UV_PROJECT_ENVIRONMENT=/app/.venv \
    UV_LINK_MODE=copy

RUN apt-get update \
    && apt-get install -y --no-install-recommends \
        build-essential \
        curl \
        libpq-dev \
    && rm -rf /var/lib/apt/lists/*

RUN groupadd --system app && useradd --system --home /app --gid app app
WORKDIR /app

# Install uv globally so subsequent stages share the toolchain.
RUN pip install --upgrade pip uv

# ------------------------------------------------------------------------------
# Builder: install prod dependencies into an in-project virtualenv
# ------------------------------------------------------------------------------
FROM base AS builder

COPY . .
RUN test -f pyproject.toml || (echo "pyproject.toml is required for uv sync"; exit 1)
RUN if [ -f uv.lock ]; then \
        uv sync --frozen --no-dev --compile-bytecode; \
    else \
        uv sync --no-dev --compile-bytecode; \
    fi

# ------------------------------------------------------------------------------
# Production: minimal runtime image with gunicorn as the entrypoint
# ------------------------------------------------------------------------------
FROM python:3.12-slim AS production

ENV PYTHONDONTWRITEBYTECODE=1 \
    PYTHONUNBUFFERED=1 \
    PIP_NO_CACHE_DIR=1

RUN apt-get update \
    && apt-get install -y --no-install-recommends libpq5 \
    && rm -rf /var/lib/apt/lists/*

RUN groupadd --system app && useradd --system --home /app --gid app app
WORKDIR /app

COPY --from=builder --chown=app:app /app /app
ENV PATH="/app/.venv/bin:$PATH"

USER app
EXPOSE 8000
CMD ["gunicorn", "-w", "4", "-k", "uvicorn.workers.UvicornWorker", "main:app", "--bind", "0.0.0.0:8000"]

# ------------------------------------------------------------------------------
# Development: includes dev dependencies and keeps uvicorn reload-friendly
# ------------------------------------------------------------------------------
FROM base AS development

COPY . .
RUN test -f pyproject.toml || (echo "pyproject.toml is required for uv sync"; exit 1)
RUN if [ -f uv.lock ]; then \
        uv sync --frozen --dev --compile-bytecode; \
    else \
        uv sync --dev --compile-bytecode; \
    fi

ENV PATH="/app/.venv/bin:$PATH"
USER app
EXPOSE 8000
1
app/backend/__init__.py
Normal file
@@ -0,0 +1 @@
"""Backend package for avaaz.ai."""
1
app/backend/api/__init__.py
Normal file
@@ -0,0 +1 @@
"""API package with versioned routers."""
1
app/backend/api/v1/__init__.py
Normal file
@@ -0,0 +1 @@
"""API v1 package."""
8
app/backend/api/v1/router.py
Normal file
@@ -0,0 +1,8 @@
"""Version 1 API router aggregator for features."""

from fastapi import APIRouter

from features.auth.router import router as auth_router

router = APIRouter()
router.include_router(auth_router)
1
app/backend/core/__init__.py
Normal file
@@ -0,0 +1 @@
"""Core infrastructure (config, database, etc.)."""
49
app/backend/core/config.py
Normal file
@@ -0,0 +1,49 @@
"""Environment configuration derived from environment variables."""

from functools import lru_cache
from pydantic import SecretStr  # Import SecretStr for sensitive data
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    """
    Application settings class using Pydantic BaseSettings.
    Settings are loaded from environment variables and have default values defined here.
    """

    service_name: str = "avaaz-backend"  # A unique functional identifier for the microservice
    environment: str = "development"  # Defines the current deployment stage (e.g., 'development', 'staging', 'production')

    title: str = "Avaaz Language Tutoring API"
    description: str = """
# Avaaz Language Tutoring API

This API powers the **avaaz.ai** mobile and web applications, providing the robust backend services for our AI-driven oral language skills tutor. The platform is specifically engineered to help students achieve oral proficiency using adaptive, conversational AI agents.

## Key Services Provided:
* **Conversational AI Engine:** Facilitates ultra-low-latency speech-to-speech interaction and provides instant corrective feedback (grammar, pronunciation, fluency).
* **Curriculum Management:** Delivers structured, CEFR-aligned lessons and scenarios focused on real-life immigrant contexts (healthcare, workplace, school).
* **Assessment & Gamification:** Manages progress tracking, mock oral exam simulations, performance summaries, and motivational mechanics (streaks, badges).
* **Cross-Platform Sync:** Ensures seamless learning continuity and progress synchronization across all user devices.
"""

    version: str = "0.1.0"  # The current semantic version of the API application

    # Use SecretStr to prevent accidental logging of credentials.
    # Access the actual value using settings.database_url.get_secret_value()
    database_url: SecretStr = SecretStr("postgresql+psycopg://postgres:postgres@postgres:5432/avaaz")

    model_config = SettingsConfigDict(
        env_prefix="",  # Load variables without a specific prefix (e.g., `DATABASE_URL` instead of `APP_DATABASE_URL`)
        case_sensitive=False  # Environment variable names are treated as case-insensitive during loading
    )

@lru_cache(maxsize=1)
def get_settings() -> Settings:
    """
    Return a cached singleton instance of the application settings.

    This function leverages functools.lru_cache to ensure that environment variables
    are read only once during the application's lifecycle, improving performance
    and ensuring consistency across requests.
    """
    return Settings()
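For illustration, a consumer of this config module typically fetches the cached settings and unwraps the DSN explicitly; this is a minimal sketch, assuming it runs inside `app/backend` where `core.config` is importable (it is not part of the diff itself):

```python
# Minimal usage sketch (assumption: run from app/backend, where core.config is importable).
from core.config import get_settings

settings = get_settings()                        # cached: every caller sees the same instance
print(settings.service_name)                     # "avaaz-backend"
print(settings.database_url)                     # masked by SecretStr: prints "**********"
dsn = settings.database_url.get_secret_value()   # explicit unwrap when the real DSN is needed
```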
14
app/backend/core/database.py
Normal file
@@ -0,0 +1,14 @@
"""Database connection placeholders."""

from contextlib import contextmanager
from typing import Iterator


@contextmanager
def get_db() -> Iterator[None]:
    """
    Yield a database session placeholder.

    Replace with a real session (e.g., SQLAlchemy) when persistence is added.
    """
    yield None
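When persistence is added, the placeholder above could grow into a real session factory; a minimal sketch, assuming SQLAlchemy and the `database_url` setting from `core.config` (names such as `SessionLocal` are illustrative, none of this is part of the diff):

```python
# Hypothetical future shape of core/database.py, assuming SQLAlchemy is adopted.
from contextlib import contextmanager
from typing import Iterator

from sqlalchemy import create_engine
from sqlalchemy.orm import Session, sessionmaker

from core.config import get_settings

# Build the engine from the SecretStr-wrapped DSN defined in core.config.
engine = create_engine(get_settings().database_url.get_secret_value())
SessionLocal = sessionmaker(bind=engine)


@contextmanager
def get_db() -> Iterator[Session]:
    """Yield a session and make sure it is committed, rolled back on error, and closed."""
    session = SessionLocal()
    try:
        yield session
        session.commit()
    except Exception:
        session.rollback()
        raise
    finally:
        session.close()
```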
1
app/backend/features/__init__.py
Normal file
@@ -0,0 +1 @@
"""Domain feature modules."""
1
app/backend/features/auth/__init__.py
Normal file
@@ -0,0 +1 @@
"""Auth feature placeholder."""
1
app/backend/features/auth/adapters/__init__.py
Normal file
@@ -0,0 +1 @@
"""Adapters for auth integrations."""
3
app/backend/features/auth/adapters/fastapi_users.py
Normal file
@@ -0,0 +1,3 @@
"""Placeholder for FastAPI Users integration."""

# Add glue code for FastAPI Users when adopting that library.
3
app/backend/features/auth/dependencies.py
Normal file
@@ -0,0 +1,3 @@
"""Authentication dependencies placeholder."""

# Add FastAPI dependencies (e.g., current_user) when auth is implemented.
3
app/backend/features/auth/models.py
Normal file
@@ -0,0 +1,3 @@
"""Authentication models placeholder."""

# Add ORM models (e.g., SQLAlchemy) when auth is implemented.
3
app/backend/features/auth/permissions.py
Normal file
@@ -0,0 +1,3 @@
"""Authentication permission placeholder."""

# Define scopes/roles when auth is implemented.
11
app/backend/features/auth/router.py
Normal file
@@ -0,0 +1,11 @@
"""Authentication router placeholder."""

from fastapi import APIRouter

router = APIRouter(prefix="/auth", tags=["auth"])


@router.get("/noop", include_in_schema=False)
def auth_not_implemented() -> dict:
    """Placeholder endpoint to keep router wired."""
    return {"status": "not_implemented"}
3
app/backend/features/auth/schemas.py
Normal file
@@ -0,0 +1,3 @@
"""Authentication schemas placeholder."""

# Add Pydantic models for auth requests/responses when implemented.
3
app/backend/features/auth/service.py
Normal file
@@ -0,0 +1,3 @@
"""Authentication service placeholder."""

# Add token generation/verification logic here.
56
app/backend/main.py
Normal file
@@ -0,0 +1,56 @@
"""
API Application Entrypoint and Router Wiring.

This module initializes the core FastAPI application instance, loads
configuration, applies middleware, and wires up all defined API routers.
"""

from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from typing import List

from api.v1.router import router as api_v1_router
from operations.health.router import router as health_router
from core.config import get_settings, Settings  # Import Settings type for clarity

def create_app(settings: Settings = get_settings()) -> FastAPI:
    """
    Create and configure the FastAPI application instance.

    Args:
        settings: Configuration object from core.config. Defaults to current settings.

    Returns:
        A configured FastAPI application instance.
    """
    app = FastAPI(
        title=settings.title,
        description=settings.description,
        version=settings.version,
        docs_url="/docs",
        redoc_url=None,
        openapi_url="/openapi.json",
    )

    # Define allowed origins dynamically based on local dev environment needs
    # In a production setting, this list would typically be sourced from environment variables.
    allowed_origins: List[str] = ["http://localhost:3000", "http://127.0.0.1:3000"]

    # Configure CORS middleware
    # TODO: Tightly restrict origins in production deployment.
    app.add_middleware(
        CORSMiddleware,
        allow_origins=allowed_origins,
        allow_credentials=True,
        allow_methods=["*"],
        allow_headers=["*"],
    )

    # Include API routers
    app.include_router(health_router)
    app.include_router(api_v1_router, prefix="/api/v1")

    return app

# Main entry point for services like Uvicorn (e.g., `uvicorn main:app --reload`)
app = create_app()
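The factory keeps the app easy to exercise in isolation; a minimal sketch, assuming the defaults defined in `core.config`, of building an instance with explicit settings instead of the cached ones (field values here are only examples):

```python
# Building an isolated app with explicit settings (sketch; values are examples).
from fastapi.testclient import TestClient

from core.config import Settings
from main import create_app

app_under_test = create_app(Settings(environment="test", title="Avaaz API (test)"))
client = TestClient(app_under_test)

assert client.get("/health/live").text == "live"
assert client.get("/openapi.json").status_code == 200
```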
1
app/backend/operations/__init__.py
Normal file
@@ -0,0 +1 @@
"""Domain operation modules."""
1
app/backend/operations/health/__init__.py
Normal file
@@ -0,0 +1 @@
"""Health feature package."""
84
app/backend/operations/health/router.py
Normal file
@@ -0,0 +1,84 @@
"""Health endpoints for FastAPI application health checks (Liveness, Readiness, Detailed Status)."""

from fastapi import APIRouter, Depends, HTTPException
from fastapi.responses import PlainTextResponse

# Dependencies for application configuration and schema definitions
from core.config import Settings, get_settings
from operations.health.schemas import HealthStatus, HealthStatusEnum
from operations.health.service import get_detailed_health, readiness_check

# Initialize the API router for health endpoints, grouping them under the "/health" prefix
router = APIRouter(prefix="/health", tags=["health"])


@router.get(
    "/live",
    summary="Liveness Probe",
    response_class=PlainTextResponse,
    status_code=200,
)
def liveness() -> str:
    """
    **Liveness Probe:** Confirms the application process is running and responsive.

    This endpoint is used by automated systems (like Kubernetes) to determine if
    the instance should be kept running or restarted. It must be extremely lightweight,
    performing no deep checks on external dependencies.

    **Success Response:** HTTP 200 OK with "live" body.
    **Failure Response:** The orchestrator will interpret a *TCP connection timeout* as a failure.
    """
    # Simply returning a string confirms the Python process and FastAPI are functional.
    return "live"


@router.get(
    "/ready",
    summary="Readiness Probe",
    response_class=PlainTextResponse,
    status_code=200,
)
async def readiness() -> str:
    """
    **Readiness Probe:** Determines if the application can accept user traffic.

    This endpoint is used by load balancers or service meshes to decide whether
    to route traffic to this specific instance. It performs deep checks
    on all critical dependencies (e.g., database connection, external services).

    **Success Response:** HTTP 200 OK with "ready" body.
    **Failure Response:** HTTP 503 Service Unavailable if any critical dependency fails.
    """
    # Call the service layer function that runs all critical checks concurrently
    ok = await readiness_check()

    if not ok:
        # If any check fails, signal 'Service Unavailable' so traffic is diverted
        raise HTTPException(status_code=503, detail="not ready")

    return "ready"


@router.get(
    "",
    summary="Detailed Health Status Page",
    response_model=HealthStatus,
)
async def detailed_health(settings: Settings = Depends(get_settings)) -> HealthStatus:
    """
    **Detailed Status Page:** Provides granular health information for human operators/monitoring tools.

    This endpoint runs all readiness checks and returns a structured JSON object
    containing the status of each individual component.
    The top-level HTTP status code reflects the overall application health (200 OK for 'pass', 503 for 'fail').
    """
    # Retrieve the comprehensive health status model
    detailed_health = await get_detailed_health(settings)

    if detailed_health.status != HealthStatusEnum.passed:
        # Align the HTTP status code with the overall health status for easy monitoring
        raise HTTPException(status_code=503, detail="not ready")

    # Status code is 200
    return detailed_health
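Seen from the outside, the three endpoints map directly onto orchestrator decisions; a minimal sketch of how an external monitor might read them, using httpx (the base URL is an assumption):

```python
# How an external monitor might interpret the probes (sketch; localhost URL is an assumption).
import httpx

BASE = "http://localhost:8000"

live = httpx.get(f"{BASE}/health/live")    # 200 + "live" as long as the process is up
ready = httpx.get(f"{BASE}/health/ready")  # 200 + "ready", or 503 when a critical check fails
detail = httpx.get(f"{BASE}/health")       # structured report; 200 for "pass", 503 for "fail"

if ready.status_code == 503:
    print("divert traffic away from this instance")
if detail.status_code == 200:
    print({name: check["status"] for name, check in detail.json()["checks"].items()})
```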
65
app/backend/operations/health/schemas.py
Normal file
@@ -0,0 +1,65 @@
"""
Pydantic schemas for defining health check responses, following IETF standards.
"""

from datetime import datetime
from enum import Enum

# Import ConfigDict from pydantic to resolve the deprecation warning
from pydantic import BaseModel, Field, ConfigDict

# Define acceptable statuses as an Enum for robust validation
class HealthStatusEnum(str, Enum):
    """Enumeration for standard health check statuses."""
    passed = "pass"
    warned = "warn"
    failed = "fail"

class ComponentCheck(BaseModel):
    """
    Represents the status and metrics for a single internal component or dependency.
    """
    # Use ConfigDict instead of the class Config approach
    model_config = ConfigDict(
        populate_by_name=True  # Allows instantiation using either 'observed_value' or 'observedValue'
    )

    name: str = Field(description="The unique name of the component being checked (e.g., 'postgres', 'redis').")
    status: HealthStatusEnum = Field(description="The status of the check: 'pass', 'warn', or 'fail'.")
    time: datetime | None = Field(default=None, description="The time at which the check was performed in ISO 8601 format.")
    output: str | None = Field(default=None, description="Additional details, error messages, or logs if the status is 'fail' or 'warn'.")

    # Python uses snake_case internally, JSON uses camelCase for the alias
    observed_value: float | int | None = Field(
        default=None,
        alias="observedValue",
        description="The value observed during the check (e.g., latency in ms)."
    )
    # Python uses snake_case internally, JSON uses camelCase for the alias
    observed_unit: str | None = Field(
        default=None,
        alias="observedUnit",
        description="The unit of the observed value (e.g., 'ms', 'count', 'bytes')."
    )

class HealthStatus(BaseModel):
    """
    The overall system health response model, aggregating all individual component checks.
    """
    # Use ConfigDict instead of the class Config approach
    model_config = ConfigDict(
        populate_by_name=True
    )

    status: HealthStatusEnum = Field(description="The aggregate status of the entire service: 'pass', 'warn', or 'fail'.")
    version: str | None = Field(default=None, description="The application version (e.g., Git SHA or semantic version number).")
    environment: str | None = Field(default=None, description="The deployment environment (e.g., 'production', 'staging').")

    # Python uses snake_case internally, JSON uses camelCase for the alias
    service_name: str | None = Field(
        default=None,
        alias="serviceName",
        description="The name of the service."
    )
    description: str | None = Field(default=None, description="A brief description of the service.")
    checks: dict[str, ComponentCheck] = Field(description="A dictionary mapping check keys (e.g., 'Database') to their detailed ComponentCheck results.")
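The `populate_by_name` / alias pairing is what lets the Python side stay snake_case while the JSON stays camelCase; a minimal sketch of that behaviour (the values are made up):

```python
# Alias behaviour in practice (sketch; numbers are made up).
from datetime import datetime, timezone

from operations.health.schemas import ComponentCheck, HealthStatusEnum

check = ComponentCheck(
    name="postgres",
    status=HealthStatusEnum.passed,
    time=datetime.now(timezone.utc),
    observed_value=12,   # accepted because populate_by_name=True
    observedUnit="ms",   # the camelCase alias is accepted as well
)

print(check.model_dump(by_alias=True))  # keys come out as observedValue / observedUnit
print(check.model_dump())               # keys come out as observed_value / observed_unit
```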
129
app/backend/operations/health/service.py
Normal file
@@ -0,0 +1,129 @@
import asyncio
from datetime import datetime, timezone
from typing import Callable, Coroutine, Any, List, Dict

from core.config import Settings
# Assuming schemas now contains the Enum definition and uses it in the models
from operations.health.schemas import ComponentCheck, HealthStatus, HealthStatusEnum

# Type alias for a function that returns an awaitable ComponentCheck.
HealthCheckFunc = Callable[[], Coroutine[Any, Any, ComponentCheck]]

async def _run_check_with_timeout(
    check_coroutine: Coroutine[Any, Any, None],
    name: str,
    timeout_ms: int
) -> ComponentCheck:
    """
    A utility wrapper that executes a given async coroutine with a strict timeout constraint.
    It standardizes the exception handling and timing calculation for health checks.
    """
    start_time = datetime.now(timezone.utc)
    # Convert milliseconds timeout to a float in seconds for asyncio
    timeout_seconds = timeout_ms / 1000.0

    try:
        # Enforce the timeout using the modern asyncio.timeout context manager (Python 3.11+)
        async with asyncio.timeout(timeout_seconds):
            await check_coroutine

        # If execution reaches here, the check passed within the time limit.
        duration = datetime.now(timezone.utc) - start_time
        observed_value = int(duration.total_seconds() * 1000)  # value stored in ms

        return ComponentCheck(
            name=name,
            # Use the Enum value for status
            status=HealthStatusEnum.passed,
            time=datetime.now(timezone.utc),
            observedValue=observed_value,
            observedUnit="ms",
        )

    except asyncio.TimeoutError:
        # The operation specifically took too long and the timeout context manager raised an exception.
        return ComponentCheck(
            name=name,
            # Use the Enum value for status
            status=HealthStatusEnum.failed,
            time=datetime.now(timezone.utc),
            output=f"Check timed out after {timeout_seconds:.2f}s",
        )
    except Exception as e:
        # Catch any other general exceptions (e.g., connection refused, network down)
        return ComponentCheck(
            name=name,
            # Use the Enum value for status
            status=HealthStatusEnum.failed,
            time=datetime.now(timezone.utc),
            output=f"An error occurred: {str(e)}",
        )

async def check_database_status() -> ComponentCheck:
    """
    Initiates the check for the primary database connection.
    Calls the generic wrapper with specific logic and timeout for Postgres.
    """
    async def db_logic():
        # IMPORTANT: Replace this sleep simulation with the actual async DB client call (e.g., await database.ping())
        await asyncio.sleep(0.045)

    return await _run_check_with_timeout(db_logic(), name="postgres", timeout_ms=50)

async def check_media_server_status() -> ComponentCheck:
    """
    Initiates the check for the media server connection.
    Calls the generic wrapper with specific logic and timeout for LiveKit/Media Server.
    """
    async def media_logic():
        # IMPORTANT: Replace this sleep simulation with the actual network I/O call (e.g., await http_client.get('...'))
        await asyncio.sleep(0.02)

    return await _run_check_with_timeout(media_logic(), name="livekit", timeout_ms=50)

# This dictionary serves as the single source of truth for all critical health checks.
CRITICAL_CHECKS: Dict[str, HealthCheckFunc] = {
    "Database": check_database_status,
    "Media Server": check_media_server_status
}

async def readiness_check() -> bool:
    """
    Performs a readiness probe. The service is considered "ready" only if *all* critical checks pass.
    """
    tasks = [check_func() for check_func in CRITICAL_CHECKS.values()]
    results: List[ComponentCheck] = await asyncio.gather(*tasks)

    # Check if every result status is equal to the HealthStatusEnum.passed value ('pass')
    return all(result.status == HealthStatusEnum.passed for result in results)

async def get_detailed_health(settings: Settings) -> HealthStatus:
    """
    Builds a detailed health payload that conforms to the health+json specification.
    Aggregates results from all CRITICAL_CHECKS and includes system metadata.
    """

    tasks = [check_func() for check_func in CRITICAL_CHECKS.values()]
    results: List[ComponentCheck] = await asyncio.gather(*tasks)

    # Initialize overall status using the Enum value
    overall = HealthStatusEnum.passed
    checks = {}

    # Iterate through results, mapping them back to their original dictionary keys
    for key, result in zip(CRITICAL_CHECKS.keys(), results):
        checks[key] = result
        # Compare status against the Enum value
        if result.status != HealthStatusEnum.passed:
            # If any individual check fails, the overall system status must be 'fail'
            overall = HealthStatusEnum.failed

    # Assemble the final, comprehensive health report object using provided settings
    return HealthStatus(
        status=overall,
        version=settings.version,
        environment=settings.environment,
        serviceName=settings.service_name,
        description=settings.title,
        checks=checks,
    )
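Because `CRITICAL_CHECKS` is the single source of truth, wiring in another dependency is just one more entry; a minimal sketch for a hypothetical Redis check (Redis is not part of this diff, and the registration style is an assumption):

```python
# Registering an additional dependency check (sketch; the Redis check is hypothetical).
import asyncio

from operations.health.schemas import ComponentCheck
from operations.health.service import CRITICAL_CHECKS, _run_check_with_timeout


async def check_cache_status() -> ComponentCheck:
    async def cache_logic():
        # Replace with the real client call, e.g. `await redis.ping()`.
        await asyncio.sleep(0.01)

    return await _run_check_with_timeout(cache_logic(), name="redis", timeout_ms=50)


# Once registered, the check is picked up by /health/ready and /health automatically.
CRITICAL_CHECKS["Cache"] = check_cache_status
```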
42
app/backend/pyproject.toml
Normal file
@@ -0,0 +1,42 @@
[project]
name = "avaaz-backend"
version = "0.1.0"
description = "FastAPI backend for avaaz.ai with health check."
authors = [{ name = "avaaz.ai" }]
requires-python = ">=3.12"
dependencies = [
    "fastapi>=0.115.4,<0.116",
    "uvicorn[standard]>=0.30.6,<0.31",
    "pydantic-settings>=2.6.1,<3",
    "gunicorn>=22.0,<23",
]

[project.optional-dependencies]
dev = [
    "pytest>=8.3,<9",
    "pytest-cov>=5.0,<6",
    "hypothesis>=6.112,<7",
    "httpx>=0.27,<0.28",
]

[dependency-groups]
dev = [
    "pytest>=8.3,<9",
    "pytest-cov>=5.0,<6",
    "hypothesis>=6.112,<7",
    "httpx>=0.27,<0.28",
]

[tool.pytest.ini_options]
addopts = "-ra"
testpaths = ["tests"]

[tool.setuptools]
py-modules = ["main"]

[tool.setuptools.packages.find]
include = ["api*", "core*", "features*", "operations*"]

[build-system]
requires = ["setuptools>=68", "wheel"]
build-backend = "setuptools.build_meta"
1
app/backend/tests/__init__.py
Normal file
@@ -0,0 +1 @@
"""Test suite for backend."""
86
app/backend/tests/test_health.py
Normal file
@@ -0,0 +1,86 @@
from fastapi.testclient import TestClient
from hypothesis import given, settings
from hypothesis import strategies as st

# Import the main FastAPI application instance from your source code
from main import app

# Initialize the TestClient to make requests against your FastAPI app instance
client = TestClient(app)

def test_liveness_ok():
    """
    Test the basic liveness endpoint.
    A liveness probe checks if the container is running and responsive.
    It should always return a 200 OK status and the text 'live'.
    """
    response = client.get("/health/live")
    assert response.status_code == 200
    assert response.text == "live"

@given(st.text(min_size=0, max_size=16))
@settings(max_examples=10)
def test_liveness_resilience_to_query_noise(noise: str):
    """
    Use Hypothesis for property-based testing.
    This test ensures that the liveness endpoint is robust and remains functional
    even when unexpected or garbage query parameters ("noise") are provided in the URL.
    The `given` decorator generates various string inputs for the 'noise' parameter.
    """
    # Pass arbitrary query parameters to the endpoint
    response = client.get("/health/live", params={"noise": noise})
    assert response.status_code == 200
    assert response.text == "live"

def test_readiness_ok():
    """
    Test the basic readiness endpoint.
    A readiness probe checks if the container is ready to accept traffic (e.g., database connection established).
    It should return a 200 OK status and the text 'ready' when healthy.
    """
    response = client.get("/health/ready")
    assert response.status_code == 200
    assert response.text == "ready"

@given(st.text(min_size=0, max_size=16))
@settings(max_examples=10)
def test_readiness_resilience_to_query_noise(noise: str):
    """
    Use Hypothesis for property-based testing.
    This test ensures that the readiness endpoint is robust and remains functional
    even when unexpected or garbage query parameters ("noise") are provided in the URL.
    The `given` decorator generates various string inputs for the 'noise' parameter.
    """
    # Pass arbitrary query parameters to the endpoint
    response = client.get("/health/ready", params={"noise": noise})
    assert response.status_code == 200
    assert response.text == "ready"

def test_detailed_health_pass():
    """
    Test the detailed health check endpoint, often conforming to the
    [IETF health check standard](datatracker.ietf.org).
    It should return a 200 OK status, and the JSON body should have a
    "status" of "pass" and a dictionary of individual "checks".
    """
    response = client.get("/health")
    assert response.status_code == 200
    body = response.json()
    assert body["status"] == "pass"
    assert isinstance(body["checks"], dict)

@given(st.text(min_size=0, max_size=16))
@settings(max_examples=10)
def test_health_resilience_to_query_noise(noise: str):
    """
    Use Hypothesis for property-based testing.
    This test ensures that the health endpoint is robust and remains functional
    even when unexpected or garbage query parameters ("noise") are provided in the URL.
    The `given` decorator generates various string inputs for the 'noise' parameter.
    """
    # Pass arbitrary query parameters to the endpoint
    response = client.get("/health", params={"noise": noise})
    assert response.status_code == 200
    body = response.json()
    assert body["status"] == "pass"
    assert isinstance(body["checks"], dict)
720
app/backend/uv.lock
generated
Normal file
@@ -0,0 +1,720 @@
|
|||||||
|
version = 1
|
||||||
|
revision = 3
|
||||||
|
requires-python = ">=3.12"
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "annotated-types"
|
||||||
|
version = "0.7.0"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081, upload-time = "2024-05-20T21:33:25.928Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643, upload-time = "2024-05-20T21:33:24.1Z" },
|
||||||
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "anyio"
|
||||||
|
version = "4.11.0"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
dependencies = [
|
||||||
|
{ name = "idna" },
|
||||||
|
{ name = "sniffio" },
|
||||||
|
{ name = "typing-extensions", marker = "python_full_version < '3.13'" },
|
||||||
|
]
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/c6/78/7d432127c41b50bccba979505f272c16cbcadcc33645d5fa3a738110ae75/anyio-4.11.0.tar.gz", hash = "sha256:82a8d0b81e318cc5ce71a5f1f8b5c4e63619620b63141ef8c995fa0db95a57c4", size = 219094, upload-time = "2025-09-23T09:19:12.58Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/15/b3/9b1a8074496371342ec1e796a96f99c82c945a339cd81a8e73de28b4cf9e/anyio-4.11.0-py3-none-any.whl", hash = "sha256:0287e96f4d26d4149305414d4e3bc32f0dcd0862365a4bddea19d7a1ec38c4fc", size = 109097, upload-time = "2025-09-23T09:19:10.601Z" },
|
||||||
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "avaaz-backend"
|
||||||
|
version = "0.1.0"
|
||||||
|
source = { editable = "." }
|
||||||
|
dependencies = [
|
||||||
|
{ name = "fastapi" },
|
||||||
|
{ name = "gunicorn" },
|
||||||
|
{ name = "pydantic-settings" },
|
||||||
|
{ name = "uvicorn", extra = ["standard"] },
|
||||||
|
]
|
||||||
|
|
||||||
|
[package.optional-dependencies]
|
||||||
|
dev = [
|
||||||
|
{ name = "httpx" },
|
||||||
|
{ name = "hypothesis" },
|
||||||
|
{ name = "pytest" },
|
||||||
|
{ name = "pytest-cov" },
|
||||||
|
]
|
||||||
|
|
||||||
|
[package.dev-dependencies]
|
||||||
|
dev = [
|
||||||
|
{ name = "httpx" },
|
||||||
|
{ name = "hypothesis" },
|
||||||
|
{ name = "pytest" },
|
||||||
|
{ name = "pytest-cov" },
|
||||||
|
]
|
||||||
|
|
||||||
|
[package.metadata]
|
||||||
|
requires-dist = [
|
||||||
|
{ name = "fastapi", specifier = ">=0.115.4,<0.116" },
|
||||||
|
{ name = "gunicorn", specifier = ">=22.0,<23" },
|
||||||
|
{ name = "httpx", marker = "extra == 'dev'", specifier = ">=0.27,<0.28" },
|
||||||
|
{ name = "hypothesis", marker = "extra == 'dev'", specifier = ">=6.112,<7" },
|
||||||
|
{ name = "pydantic-settings", specifier = ">=2.6.1,<3" },
|
||||||
|
{ name = "pytest", marker = "extra == 'dev'", specifier = ">=8.3,<9" },
|
||||||
|
{ name = "pytest-cov", marker = "extra == 'dev'", specifier = ">=5.0,<6" },
|
||||||
|
{ name = "uvicorn", extras = ["standard"], specifier = ">=0.30.6,<0.31" },
|
||||||
|
]
|
||||||
|
provides-extras = ["dev"]
|
||||||
|
|
||||||
|
[package.metadata.requires-dev]
|
||||||
|
dev = [
|
||||||
|
{ name = "httpx", specifier = ">=0.27,<0.28" },
|
||||||
|
{ name = "hypothesis", specifier = ">=6.112,<7" },
|
||||||
|
{ name = "pytest", specifier = ">=8.3,<9" },
|
||||||
|
{ name = "pytest-cov", specifier = ">=5.0,<6" },
|
||||||
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "certifi"
|
||||||
|
version = "2025.11.12"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/a2/8c/58f469717fa48465e4a50c014a0400602d3c437d7c0c468e17ada824da3a/certifi-2025.11.12.tar.gz", hash = "sha256:d8ab5478f2ecd78af242878415affce761ca6bc54a22a27e026d7c25357c3316", size = 160538, upload-time = "2025-11-12T02:54:51.517Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/70/7d/9bc192684cea499815ff478dfcdc13835ddf401365057044fb721ec6bddb/certifi-2025.11.12-py3-none-any.whl", hash = "sha256:97de8790030bbd5c2d96b7ec782fc2f7820ef8dba6db909ccf95449f2d062d4b", size = 159438, upload-time = "2025-11-12T02:54:49.735Z" },
|
||||||
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "click"
|
||||||
|
version = "8.3.1"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
dependencies = [
|
||||||
|
{ name = "colorama", marker = "sys_platform == 'win32'" },
|
||||||
|
]
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/3d/fa/656b739db8587d7b5dfa22e22ed02566950fbfbcdc20311993483657a5c0/click-8.3.1.tar.gz", hash = "sha256:12ff4785d337a1bb490bb7e9c2b1ee5da3112e94a8622f26a6c77f5d2fc6842a", size = 295065, upload-time = "2025-11-15T20:45:42.706Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/98/78/01c019cdb5d6498122777c1a43056ebb3ebfeef2076d9d026bfe15583b2b/click-8.3.1-py3-none-any.whl", hash = "sha256:981153a64e25f12d547d3426c367a4857371575ee7ad18df2a6183ab0545b2a6", size = 108274, upload-time = "2025-11-15T20:45:41.139Z" },
|
||||||
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "colorama"
|
||||||
|
version = "0.4.6"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697, upload-time = "2022-10-25T02:36:22.414Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" },
|
||||||
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "coverage"
|
||||||
|
version = "7.12.0"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/89/26/4a96807b193b011588099c3b5c89fbb05294e5b90e71018e065465f34eb6/coverage-7.12.0.tar.gz", hash = "sha256:fc11e0a4e372cb5f282f16ef90d4a585034050ccda536451901abfb19a57f40c", size = 819341, upload-time = "2025-11-18T13:34:20.766Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/02/bf/638c0427c0f0d47638242e2438127f3c8ee3cfc06c7fdeb16778ed47f836/coverage-7.12.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:29644c928772c78512b48e14156b81255000dcfd4817574ff69def189bcb3647", size = 217704, upload-time = "2025-11-18T13:32:28.906Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/08/e1/706fae6692a66c2d6b871a608bbde0da6281903fa0e9f53a39ed441da36a/coverage-7.12.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8638cbb002eaa5d7c8d04da667813ce1067080b9a91099801a0053086e52b736", size = 218064, upload-time = "2025-11-18T13:32:30.161Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/a9/8b/eb0231d0540f8af3ffda39720ff43cb91926489d01524e68f60e961366e4/coverage-7.12.0-cp312-cp312-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:083631eeff5eb9992c923e14b810a179798bb598e6a0dd60586819fc23be6e60", size = 249560, upload-time = "2025-11-18T13:32:31.835Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/e9/a1/67fb52af642e974d159b5b379e4d4c59d0ebe1288677fbd04bbffe665a82/coverage-7.12.0-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:99d5415c73ca12d558e07776bd957c4222c687b9f1d26fa0e1b57e3598bdcde8", size = 252318, upload-time = "2025-11-18T13:32:33.178Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/41/e5/38228f31b2c7665ebf9bdfdddd7a184d56450755c7e43ac721c11a4b8dab/coverage-7.12.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e949ebf60c717c3df63adb4a1a366c096c8d7fd8472608cd09359e1bd48ef59f", size = 253403, upload-time = "2025-11-18T13:32:34.45Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/ec/4b/df78e4c8188f9960684267c5a4897836f3f0f20a20c51606ee778a1d9749/coverage-7.12.0-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:6d907ddccbca819afa2cd014bc69983b146cca2735a0b1e6259b2a6c10be1e70", size = 249984, upload-time = "2025-11-18T13:32:35.747Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/ba/51/bb163933d195a345c6f63eab9e55743413d064c291b6220df754075c2769/coverage-7.12.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:b1518ecbad4e6173f4c6e6c4a46e49555ea5679bf3feda5edb1b935c7c44e8a0", size = 251339, upload-time = "2025-11-18T13:32:37.352Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/15/40/c9b29cdb8412c837cdcbc2cfa054547dd83affe6cbbd4ce4fdb92b6ba7d1/coverage-7.12.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:51777647a749abdf6f6fd8c7cffab12de68ab93aab15efc72fbbb83036c2a068", size = 249489, upload-time = "2025-11-18T13:32:39.212Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/c8/da/b3131e20ba07a0de4437a50ef3b47840dfabf9293675b0cd5c2c7f66dd61/coverage-7.12.0-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:42435d46d6461a3b305cdfcad7cdd3248787771f53fe18305548cba474e6523b", size = 249070, upload-time = "2025-11-18T13:32:40.598Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/70/81/b653329b5f6302c08d683ceff6785bc60a34be9ae92a5c7b63ee7ee7acec/coverage-7.12.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:5bcead88c8423e1855e64b8057d0544e33e4080b95b240c2a355334bb7ced937", size = 250929, upload-time = "2025-11-18T13:32:42.915Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/a3/00/250ac3bca9f252a5fb1338b5ad01331ebb7b40223f72bef5b1b2cb03aa64/coverage-7.12.0-cp312-cp312-win32.whl", hash = "sha256:dcbb630ab034e86d2a0f79aefd2be07e583202f41e037602d438c80044957baa", size = 220241, upload-time = "2025-11-18T13:32:44.665Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/64/1c/77e79e76d37ce83302f6c21980b45e09f8aa4551965213a10e62d71ce0ab/coverage-7.12.0-cp312-cp312-win_amd64.whl", hash = "sha256:2fd8354ed5d69775ac42986a691fbf68b4084278710cee9d7c3eaa0c28fa982a", size = 221051, upload-time = "2025-11-18T13:32:46.008Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/31/f5/641b8a25baae564f9e52cac0e2667b123de961985709a004e287ee7663cc/coverage-7.12.0-cp312-cp312-win_arm64.whl", hash = "sha256:737c3814903be30695b2de20d22bcc5428fdae305c61ba44cdc8b3252984c49c", size = 219692, upload-time = "2025-11-18T13:32:47.372Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/b8/14/771700b4048774e48d2c54ed0c674273702713c9ee7acdfede40c2666747/coverage-7.12.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:47324fffca8d8eae7e185b5bb20c14645f23350f870c1649003618ea91a78941", size = 217725, upload-time = "2025-11-18T13:32:49.22Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/17/a7/3aa4144d3bcb719bf67b22d2d51c2d577bf801498c13cb08f64173e80497/coverage-7.12.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:ccf3b2ede91decd2fb53ec73c1f949c3e034129d1e0b07798ff1d02ea0c8fa4a", size = 218098, upload-time = "2025-11-18T13:32:50.78Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/fc/9c/b846bbc774ff81091a12a10203e70562c91ae71badda00c5ae5b613527b1/coverage-7.12.0-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:b365adc70a6936c6b0582dc38746b33b2454148c02349345412c6e743efb646d", size = 249093, upload-time = "2025-11-18T13:32:52.554Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/76/b6/67d7c0e1f400b32c883e9342de4a8c2ae7c1a0b57c5de87622b7262e2309/coverage-7.12.0-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:bc13baf85cd8a4cfcf4a35c7bc9d795837ad809775f782f697bf630b7e200211", size = 251686, upload-time = "2025-11-18T13:32:54.862Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/cc/75/b095bd4b39d49c3be4bffbb3135fea18a99a431c52dd7513637c0762fecb/coverage-7.12.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:099d11698385d572ceafb3288a5b80fe1fc58bf665b3f9d362389de488361d3d", size = 252930, upload-time = "2025-11-18T13:32:56.417Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/6e/f3/466f63015c7c80550bead3093aacabf5380c1220a2a93c35d374cae8f762/coverage-7.12.0-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:473dc45d69694069adb7680c405fb1e81f60b2aff42c81e2f2c3feaf544d878c", size = 249296, upload-time = "2025-11-18T13:32:58.074Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/27/86/eba2209bf2b7e28c68698fc13437519a295b2d228ba9e0ec91673e09fa92/coverage-7.12.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:583f9adbefd278e9de33c33d6846aa8f5d164fa49b47144180a0e037f0688bb9", size = 251068, upload-time = "2025-11-18T13:32:59.646Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/ec/55/ca8ae7dbba962a3351f18940b359b94c6bafdd7757945fdc79ec9e452dc7/coverage-7.12.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:b2089cc445f2dc0af6f801f0d1355c025b76c24481935303cf1af28f636688f0", size = 249034, upload-time = "2025-11-18T13:33:01.481Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/7a/d7/39136149325cad92d420b023b5fd900dabdd1c3a0d1d5f148ef4a8cedef5/coverage-7.12.0-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:950411f1eb5d579999c5f66c62a40961f126fc71e5e14419f004471957b51508", size = 248853, upload-time = "2025-11-18T13:33:02.935Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/fe/b6/76e1add8b87ef60e00643b0b7f8f7bb73d4bf5249a3be19ebefc5793dd25/coverage-7.12.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:b1aab7302a87bafebfe76b12af681b56ff446dc6f32ed178ff9c092ca776e6bc", size = 250619, upload-time = "2025-11-18T13:33:04.336Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/95/87/924c6dc64f9203f7a3c1832a6a0eee5a8335dbe5f1bdadcc278d6f1b4d74/coverage-7.12.0-cp313-cp313-win32.whl", hash = "sha256:d7e0d0303c13b54db495eb636bc2465b2fb8475d4c8bcec8fe4b5ca454dfbae8", size = 220261, upload-time = "2025-11-18T13:33:06.493Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/91/77/dd4aff9af16ff776bf355a24d87eeb48fc6acde54c907cc1ea89b14a8804/coverage-7.12.0-cp313-cp313-win_amd64.whl", hash = "sha256:ce61969812d6a98a981d147d9ac583a36ac7db7766f2e64a9d4d059c2fe29d07", size = 221072, upload-time = "2025-11-18T13:33:07.926Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/70/49/5c9dc46205fef31b1b226a6e16513193715290584317fd4df91cdaf28b22/coverage-7.12.0-cp313-cp313-win_arm64.whl", hash = "sha256:bcec6f47e4cb8a4c2dc91ce507f6eefc6a1b10f58df32cdc61dff65455031dfc", size = 219702, upload-time = "2025-11-18T13:33:09.631Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/9b/62/f87922641c7198667994dd472a91e1d9b829c95d6c29529ceb52132436ad/coverage-7.12.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:459443346509476170d553035e4a3eed7b860f4fe5242f02de1010501956ce87", size = 218420, upload-time = "2025-11-18T13:33:11.153Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/85/dd/1cc13b2395ef15dbb27d7370a2509b4aee77890a464fb35d72d428f84871/coverage-7.12.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:04a79245ab2b7a61688958f7a855275997134bc84f4a03bc240cf64ff132abf6", size = 218773, upload-time = "2025-11-18T13:33:12.569Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/74/40/35773cc4bb1e9d4658d4fb669eb4195b3151bef3bbd6f866aba5cd5dac82/coverage-7.12.0-cp313-cp313t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:09a86acaaa8455f13d6a99221d9654df249b33937b4e212b4e5a822065f12aa7", size = 260078, upload-time = "2025-11-18T13:33:14.037Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/ec/ee/231bb1a6ffc2905e396557585ebc6bdc559e7c66708376d245a1f1d330fc/coverage-7.12.0-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:907e0df1b71ba77463687a74149c6122c3f6aac56c2510a5d906b2f368208560", size = 262144, upload-time = "2025-11-18T13:33:15.601Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/28/be/32f4aa9f3bf0b56f3971001b56508352c7753915345d45fab4296a986f01/coverage-7.12.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9b57e2d0ddd5f0582bae5437c04ee71c46cd908e7bc5d4d0391f9a41e812dd12", size = 264574, upload-time = "2025-11-18T13:33:17.354Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/68/7c/00489fcbc2245d13ab12189b977e0cf06ff3351cb98bc6beba8bd68c5902/coverage-7.12.0-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:58c1c6aa677f3a1411fe6fb28ec3a942e4f665df036a3608816e0847fad23296", size = 259298, upload-time = "2025-11-18T13:33:18.958Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/96/b4/f0760d65d56c3bea95b449e02570d4abd2549dc784bf39a2d4721a2d8ceb/coverage-7.12.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:4c589361263ab2953e3c4cd2a94db94c4ad4a8e572776ecfbad2389c626e4507", size = 262150, upload-time = "2025-11-18T13:33:20.644Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/c5/71/9a9314df00f9326d78c1e5a910f520d599205907432d90d1c1b7a97aa4b1/coverage-7.12.0-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:91b810a163ccad2e43b1faa11d70d3cf4b6f3d83f9fd5f2df82a32d47b648e0d", size = 259763, upload-time = "2025-11-18T13:33:22.189Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/10/34/01a0aceed13fbdf925876b9a15d50862eb8845454301fe3cdd1df08b2182/coverage-7.12.0-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:40c867af715f22592e0d0fb533a33a71ec9e0f73a6945f722a0c85c8c1cbe3a2", size = 258653, upload-time = "2025-11-18T13:33:24.239Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/8d/04/81d8fd64928acf1574bbb0181f66901c6c1c6279c8ccf5f84259d2c68ae9/coverage-7.12.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:68b0d0a2d84f333de875666259dadf28cc67858bc8fd8b3f1eae84d3c2bec455", size = 260856, upload-time = "2025-11-18T13:33:26.365Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/f2/76/fa2a37bfaeaf1f766a2d2360a25a5297d4fb567098112f6517475eee120b/coverage-7.12.0-cp313-cp313t-win32.whl", hash = "sha256:73f9e7fbd51a221818fd11b7090eaa835a353ddd59c236c57b2199486b116c6d", size = 220936, upload-time = "2025-11-18T13:33:28.165Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/f9/52/60f64d932d555102611c366afb0eb434b34266b1d9266fc2fe18ab641c47/coverage-7.12.0-cp313-cp313t-win_amd64.whl", hash = "sha256:24cff9d1f5743f67db7ba46ff284018a6e9aeb649b67aa1e70c396aa1b7cb23c", size = 222001, upload-time = "2025-11-18T13:33:29.656Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/77/df/c303164154a5a3aea7472bf323b7c857fed93b26618ed9fc5c2955566bb0/coverage-7.12.0-cp313-cp313t-win_arm64.whl", hash = "sha256:c87395744f5c77c866d0f5a43d97cc39e17c7f1cb0115e54a2fe67ca75c5d14d", size = 220273, upload-time = "2025-11-18T13:33:31.415Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/bf/2e/fc12db0883478d6e12bbd62d481210f0c8daf036102aa11434a0c5755825/coverage-7.12.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:a1c59b7dc169809a88b21a936eccf71c3895a78f5592051b1af8f4d59c2b4f92", size = 217777, upload-time = "2025-11-18T13:33:32.86Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/1f/c1/ce3e525d223350c6ec16b9be8a057623f54226ef7f4c2fee361ebb6a02b8/coverage-7.12.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:8787b0f982e020adb732b9f051f3e49dd5054cebbc3f3432061278512a2b1360", size = 218100, upload-time = "2025-11-18T13:33:34.532Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/15/87/113757441504aee3808cb422990ed7c8bcc2d53a6779c66c5adef0942939/coverage-7.12.0-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:5ea5a9f7dc8877455b13dd1effd3202e0bca72f6f3ab09f9036b1bcf728f69ac", size = 249151, upload-time = "2025-11-18T13:33:36.135Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d9/1d/9529d9bd44049b6b05bb319c03a3a7e4b0a8a802d28fa348ad407e10706d/coverage-7.12.0-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:fdba9f15849534594f60b47c9a30bc70409b54947319a7c4fd0e8e3d8d2f355d", size = 251667, upload-time = "2025-11-18T13:33:37.996Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/11/bb/567e751c41e9c03dc29d3ce74b8c89a1e3396313e34f255a2a2e8b9ebb56/coverage-7.12.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a00594770eb715854fb1c57e0dea08cce6720cfbc531accdb9850d7c7770396c", size = 253003, upload-time = "2025-11-18T13:33:39.553Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/e4/b3/c2cce2d8526a02fb9e9ca14a263ca6fc074449b33a6afa4892838c903528/coverage-7.12.0-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:5560c7e0d82b42eb1951e4f68f071f8017c824ebfd5a6ebe42c60ac16c6c2434", size = 249185, upload-time = "2025-11-18T13:33:42.086Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/0e/a7/967f93bb66e82c9113c66a8d0b65ecf72fc865adfba5a145f50c7af7e58d/coverage-7.12.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:d6c2e26b481c9159c2773a37947a9718cfdc58893029cdfb177531793e375cfc", size = 251025, upload-time = "2025-11-18T13:33:43.634Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/b9/b2/f2f6f56337bc1af465d5b2dc1ee7ee2141b8b9272f3bf6213fcbc309a836/coverage-7.12.0-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:6e1a8c066dabcde56d5d9fed6a66bc19a2883a3fe051f0c397a41fc42aedd4cc", size = 248979, upload-time = "2025-11-18T13:33:46.04Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/f4/7a/bf4209f45a4aec09d10a01a57313a46c0e0e8f4c55ff2965467d41a92036/coverage-7.12.0-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:f7ba9da4726e446d8dd8aae5a6cd872511184a5d861de80a86ef970b5dacce3e", size = 248800, upload-time = "2025-11-18T13:33:47.546Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/b8/b7/1e01b8696fb0521810f60c5bbebf699100d6754183e6cc0679bf2ed76531/coverage-7.12.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:e0f483ab4f749039894abaf80c2f9e7ed77bbf3c737517fb88c8e8e305896a17", size = 250460, upload-time = "2025-11-18T13:33:49.537Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/71/ae/84324fb9cb46c024760e706353d9b771a81b398d117d8c1fe010391c186f/coverage-7.12.0-cp314-cp314-win32.whl", hash = "sha256:76336c19a9ef4a94b2f8dc79f8ac2da3f193f625bb5d6f51a328cd19bfc19933", size = 220533, upload-time = "2025-11-18T13:33:51.16Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/e2/71/1033629deb8460a8f97f83e6ac4ca3b93952e2b6f826056684df8275e015/coverage-7.12.0-cp314-cp314-win_amd64.whl", hash = "sha256:7c1059b600aec6ef090721f8f633f60ed70afaffe8ecab85b59df748f24b31fe", size = 221348, upload-time = "2025-11-18T13:33:52.776Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/0a/5f/ac8107a902f623b0c251abdb749be282dc2ab61854a8a4fcf49e276fce2f/coverage-7.12.0-cp314-cp314-win_arm64.whl", hash = "sha256:172cf3a34bfef42611963e2b661302a8931f44df31629e5b1050567d6b90287d", size = 219922, upload-time = "2025-11-18T13:33:54.316Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/79/6e/f27af2d4da367f16077d21ef6fe796c874408219fa6dd3f3efe7751bd910/coverage-7.12.0-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:aa7d48520a32cb21c7a9b31f81799e8eaec7239db36c3b670be0fa2403828d1d", size = 218511, upload-time = "2025-11-18T13:33:56.343Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/67/dd/65fd874aa460c30da78f9d259400d8e6a4ef457d61ab052fd248f0050558/coverage-7.12.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:90d58ac63bc85e0fb919f14d09d6caa63f35a5512a2205284b7816cafd21bb03", size = 218771, upload-time = "2025-11-18T13:33:57.966Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/55/e0/7c6b71d327d8068cb79c05f8f45bf1b6145f7a0de23bbebe63578fe5240a/coverage-7.12.0-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:ca8ecfa283764fdda3eae1bdb6afe58bf78c2c3ec2b2edcb05a671f0bba7b3f9", size = 260151, upload-time = "2025-11-18T13:33:59.597Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/49/ce/4697457d58285b7200de6b46d606ea71066c6e674571a946a6ea908fb588/coverage-7.12.0-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:874fe69a0785d96bd066059cd4368022cebbec1a8958f224f0016979183916e6", size = 262257, upload-time = "2025-11-18T13:34:01.166Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/2f/33/acbc6e447aee4ceba88c15528dbe04a35fb4d67b59d393d2e0d6f1e242c1/coverage-7.12.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5b3c889c0b8b283a24d721a9eabc8ccafcfc3aebf167e4cd0d0e23bf8ec4e339", size = 264671, upload-time = "2025-11-18T13:34:02.795Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/87/ec/e2822a795c1ed44d569980097be839c5e734d4c0c1119ef8e0a073496a30/coverage-7.12.0-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:8bb5b894b3ec09dcd6d3743229dc7f2c42ef7787dc40596ae04c0edda487371e", size = 259231, upload-time = "2025-11-18T13:34:04.397Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/72/c5/a7ec5395bb4a49c9b7ad97e63f0c92f6bf4a9e006b1393555a02dae75f16/coverage-7.12.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:79a44421cd5fba96aa57b5e3b5a4d3274c449d4c622e8f76882d76635501fd13", size = 262137, upload-time = "2025-11-18T13:34:06.068Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/67/0c/02c08858b764129f4ecb8e316684272972e60777ae986f3865b10940bdd6/coverage-7.12.0-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:33baadc0efd5c7294f436a632566ccc1f72c867f82833eb59820ee37dc811c6f", size = 259745, upload-time = "2025-11-18T13:34:08.04Z" },
{ url = "https://files.pythonhosted.org/packages/5a/04/4fd32b7084505f3829a8fe45c1a74a7a728cb251aaadbe3bec04abcef06d/coverage-7.12.0-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:c406a71f544800ef7e9e0000af706b88465f3573ae8b8de37e5f96c59f689ad1", size = 258570, upload-time = "2025-11-18T13:34:09.676Z" },
{ url = "https://files.pythonhosted.org/packages/48/35/2365e37c90df4f5342c4fa202223744119fe31264ee2924f09f074ea9b6d/coverage-7.12.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:e71bba6a40883b00c6d571599b4627f50c360b3d0d02bfc658168936be74027b", size = 260899, upload-time = "2025-11-18T13:34:11.259Z" },
{ url = "https://files.pythonhosted.org/packages/05/56/26ab0464ca733fa325e8e71455c58c1c374ce30f7c04cebb88eabb037b18/coverage-7.12.0-cp314-cp314t-win32.whl", hash = "sha256:9157a5e233c40ce6613dead4c131a006adfda70e557b6856b97aceed01b0e27a", size = 221313, upload-time = "2025-11-18T13:34:12.863Z" },
{ url = "https://files.pythonhosted.org/packages/da/1c/017a3e1113ed34d998b27d2c6dba08a9e7cb97d362f0ec988fcd873dcf81/coverage-7.12.0-cp314-cp314t-win_amd64.whl", hash = "sha256:e84da3a0fd233aeec797b981c51af1cabac74f9bd67be42458365b30d11b5291", size = 222423, upload-time = "2025-11-18T13:34:15.14Z" },
{ url = "https://files.pythonhosted.org/packages/4c/36/bcc504fdd5169301b52568802bb1b9cdde2e27a01d39fbb3b4b508ab7c2c/coverage-7.12.0-cp314-cp314t-win_arm64.whl", hash = "sha256:01d24af36fedda51c2b1aca56e4330a3710f83b02a5ff3743a6b015ffa7c9384", size = 220459, upload-time = "2025-11-18T13:34:17.222Z" },
{ url = "https://files.pythonhosted.org/packages/ce/a3/43b749004e3c09452e39bb56347a008f0a0668aad37324a99b5c8ca91d9e/coverage-7.12.0-py3-none-any.whl", hash = "sha256:159d50c0b12e060b15ed3d39f87ed43d4f7f7ad40b8a534f4dd331adbb51104a", size = 209503, upload-time = "2025-11-18T13:34:18.892Z" },
]

[[package]]
name = "fastapi"
version = "0.115.14"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pydantic" },
{ name = "starlette" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/ca/53/8c38a874844a8b0fa10dd8adf3836ac154082cf88d3f22b544e9ceea0a15/fastapi-0.115.14.tar.gz", hash = "sha256:b1de15cdc1c499a4da47914db35d0e4ef8f1ce62b624e94e0e5824421df99739", size = 296263, upload-time = "2025-06-26T15:29:08.21Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/53/50/b1222562c6d270fea83e9c9075b8e8600b8479150a18e4516a6138b980d1/fastapi-0.115.14-py3-none-any.whl", hash = "sha256:6c0c8bf9420bd58f565e585036d971872472b4f7d3f6c73b698e10cffdefb3ca", size = 95514, upload-time = "2025-06-26T15:29:06.49Z" },
]

[[package]]
name = "gunicorn"
version = "22.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "packaging" },
]
sdist = { url = "https://files.pythonhosted.org/packages/1e/88/e2f93c5738a4c1f56a458fc7a5b1676fc31dcdbb182bef6b40a141c17d66/gunicorn-22.0.0.tar.gz", hash = "sha256:4a0b436239ff76fb33f11c07a16482c521a7e09c1ce3cc293c2330afe01bec63", size = 3639760, upload-time = "2024-04-16T22:58:19.218Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/29/97/6d610ae77b5633d24b69c2ff1ac3044e0e565ecbd1ec188f02c45073054c/gunicorn-22.0.0-py3-none-any.whl", hash = "sha256:350679f91b24062c86e386e198a15438d53a7a8207235a78ba1b53df4c4378d9", size = 84443, upload-time = "2024-04-16T22:58:15.233Z" },
]

[[package]]
name = "h11"
version = "0.16.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/01/ee/02a2c011bdab74c6fb3c75474d40b3052059d95df7e73351460c8588d963/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1", size = 101250, upload-time = "2025-04-24T03:35:25.427Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515, upload-time = "2025-04-24T03:35:24.344Z" },
]

[[package]]
name = "httpcore"
version = "1.0.9"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "certifi" },
{ name = "h11" },
]
sdist = { url = "https://files.pythonhosted.org/packages/06/94/82699a10bca87a5556c9c59b5963f2d039dbd239f25bc2a63907a05a14cb/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8", size = 85484, upload-time = "2025-04-24T22:06:22.219Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/7e/f5/f66802a942d491edb555dd61e3a9961140fd64c90bce1eafd741609d334d/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55", size = 78784, upload-time = "2025-04-24T22:06:20.566Z" },
]

[[package]]
name = "httptools"
version = "0.7.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/b5/46/120a669232c7bdedb9d52d4aeae7e6c7dfe151e99dc70802e2fc7a5e1993/httptools-0.7.1.tar.gz", hash = "sha256:abd72556974f8e7c74a259655924a717a2365b236c882c3f6f8a45fe94703ac9", size = 258961, upload-time = "2025-10-10T03:55:08.559Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/53/7f/403e5d787dc4942316e515e949b0c8a013d84078a915910e9f391ba9b3ed/httptools-0.7.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:38e0c83a2ea9746ebbd643bdfb521b9aa4a91703e2cd705c20443405d2fd16a5", size = 206280, upload-time = "2025-10-10T03:54:39.274Z" },
{ url = "https://files.pythonhosted.org/packages/2a/0d/7f3fd28e2ce311ccc998c388dd1c53b18120fda3b70ebb022b135dc9839b/httptools-0.7.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f25bbaf1235e27704f1a7b86cd3304eabc04f569c828101d94a0e605ef7205a5", size = 110004, upload-time = "2025-10-10T03:54:40.403Z" },
{ url = "https://files.pythonhosted.org/packages/84/a6/b3965e1e146ef5762870bbe76117876ceba51a201e18cc31f5703e454596/httptools-0.7.1-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:2c15f37ef679ab9ecc06bfc4e6e8628c32a8e4b305459de7cf6785acd57e4d03", size = 517655, upload-time = "2025-10-10T03:54:41.347Z" },
{ url = "https://files.pythonhosted.org/packages/11/7d/71fee6f1844e6fa378f2eddde6c3e41ce3a1fb4b2d81118dd544e3441ec0/httptools-0.7.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7fe6e96090df46b36ccfaf746f03034e5ab723162bc51b0a4cf58305324036f2", size = 511440, upload-time = "2025-10-10T03:54:42.452Z" },
{ url = "https://files.pythonhosted.org/packages/22/a5/079d216712a4f3ffa24af4a0381b108aa9c45b7a5cc6eb141f81726b1823/httptools-0.7.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:f72fdbae2dbc6e68b8239defb48e6a5937b12218e6ffc2c7846cc37befa84362", size = 495186, upload-time = "2025-10-10T03:54:43.937Z" },
{ url = "https://files.pythonhosted.org/packages/e9/9e/025ad7b65278745dee3bd0ebf9314934c4592560878308a6121f7f812084/httptools-0.7.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e99c7b90a29fd82fea9ef57943d501a16f3404d7b9ee81799d41639bdaae412c", size = 499192, upload-time = "2025-10-10T03:54:45.003Z" },
{ url = "https://files.pythonhosted.org/packages/6d/de/40a8f202b987d43afc4d54689600ff03ce65680ede2f31df348d7f368b8f/httptools-0.7.1-cp312-cp312-win_amd64.whl", hash = "sha256:3e14f530fefa7499334a79b0cf7e7cd2992870eb893526fb097d51b4f2d0f321", size = 86694, upload-time = "2025-10-10T03:54:45.923Z" },
{ url = "https://files.pythonhosted.org/packages/09/8f/c77b1fcbfd262d422f12da02feb0d218fa228d52485b77b953832105bb90/httptools-0.7.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:6babce6cfa2a99545c60bfef8bee0cc0545413cb0018f617c8059a30ad985de3", size = 202889, upload-time = "2025-10-10T03:54:47.089Z" },
{ url = "https://files.pythonhosted.org/packages/0a/1a/22887f53602feaa066354867bc49a68fc295c2293433177ee90870a7d517/httptools-0.7.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:601b7628de7504077dd3dcb3791c6b8694bbd967148a6d1f01806509254fb1ca", size = 108180, upload-time = "2025-10-10T03:54:48.052Z" },
{ url = "https://files.pythonhosted.org/packages/32/6a/6aaa91937f0010d288d3d124ca2946d48d60c3a5ee7ca62afe870e3ea011/httptools-0.7.1-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:04c6c0e6c5fb0739c5b8a9eb046d298650a0ff38cf42537fc372b28dc7e4472c", size = 478596, upload-time = "2025-10-10T03:54:48.919Z" },
{ url = "https://files.pythonhosted.org/packages/6d/70/023d7ce117993107be88d2cbca566a7c1323ccbaf0af7eabf2064fe356f6/httptools-0.7.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:69d4f9705c405ae3ee83d6a12283dc9feba8cc6aaec671b412917e644ab4fa66", size = 473268, upload-time = "2025-10-10T03:54:49.993Z" },
{ url = "https://files.pythonhosted.org/packages/32/4d/9dd616c38da088e3f436e9a616e1d0cc66544b8cdac405cc4e81c8679fc7/httptools-0.7.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:44c8f4347d4b31269c8a9205d8a5ee2df5322b09bbbd30f8f862185bb6b05346", size = 455517, upload-time = "2025-10-10T03:54:51.066Z" },
{ url = "https://files.pythonhosted.org/packages/1d/3a/a6c595c310b7df958e739aae88724e24f9246a514d909547778d776799be/httptools-0.7.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:465275d76db4d554918aba40bf1cbebe324670f3dfc979eaffaa5d108e2ed650", size = 458337, upload-time = "2025-10-10T03:54:52.196Z" },
{ url = "https://files.pythonhosted.org/packages/fd/82/88e8d6d2c51edc1cc391b6e044c6c435b6aebe97b1abc33db1b0b24cd582/httptools-0.7.1-cp313-cp313-win_amd64.whl", hash = "sha256:322d00c2068d125bd570f7bf78b2d367dad02b919d8581d7476d8b75b294e3e6", size = 85743, upload-time = "2025-10-10T03:54:53.448Z" },
{ url = "https://files.pythonhosted.org/packages/34/50/9d095fcbb6de2d523e027a2f304d4551855c2f46e0b82befd718b8b20056/httptools-0.7.1-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:c08fe65728b8d70b6923ce31e3956f859d5e1e8548e6f22ec520a962c6757270", size = 203619, upload-time = "2025-10-10T03:54:54.321Z" },
{ url = "https://files.pythonhosted.org/packages/07/f0/89720dc5139ae54b03f861b5e2c55a37dba9a5da7d51e1e824a1f343627f/httptools-0.7.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:7aea2e3c3953521c3c51106ee11487a910d45586e351202474d45472db7d72d3", size = 108714, upload-time = "2025-10-10T03:54:55.163Z" },
{ url = "https://files.pythonhosted.org/packages/b3/cb/eea88506f191fb552c11787c23f9a405f4c7b0c5799bf73f2249cd4f5228/httptools-0.7.1-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:0e68b8582f4ea9166be62926077a3334064d422cf08ab87d8b74664f8e9058e1", size = 472909, upload-time = "2025-10-10T03:54:56.056Z" },
{ url = "https://files.pythonhosted.org/packages/e0/4a/a548bdfae6369c0d078bab5769f7b66f17f1bfaa6fa28f81d6be6959066b/httptools-0.7.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:df091cf961a3be783d6aebae963cc9b71e00d57fa6f149025075217bc6a55a7b", size = 470831, upload-time = "2025-10-10T03:54:57.219Z" },
{ url = "https://files.pythonhosted.org/packages/4d/31/14df99e1c43bd132eec921c2e7e11cda7852f65619bc0fc5bdc2d0cb126c/httptools-0.7.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:f084813239e1eb403ddacd06a30de3d3e09a9b76e7894dcda2b22f8a726e9c60", size = 452631, upload-time = "2025-10-10T03:54:58.219Z" },
{ url = "https://files.pythonhosted.org/packages/22/d2/b7e131f7be8d854d48cb6d048113c30f9a46dca0c9a8b08fcb3fcd588cdc/httptools-0.7.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:7347714368fb2b335e9063bc2b96f2f87a9ceffcd9758ac295f8bbcd3ffbc0ca", size = 452910, upload-time = "2025-10-10T03:54:59.366Z" },
{ url = "https://files.pythonhosted.org/packages/53/cf/878f3b91e4e6e011eff6d1fa9ca39f7eb17d19c9d7971b04873734112f30/httptools-0.7.1-cp314-cp314-win_amd64.whl", hash = "sha256:cfabda2a5bb85aa2a904ce06d974a3f30fb36cc63d7feaddec05d2050acede96", size = 88205, upload-time = "2025-10-10T03:55:00.389Z" },
]

[[package]]
name = "httpx"
version = "0.27.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "anyio" },
{ name = "certifi" },
{ name = "httpcore" },
{ name = "idna" },
{ name = "sniffio" },
]
sdist = { url = "https://files.pythonhosted.org/packages/78/82/08f8c936781f67d9e6b9eeb8a0c8b4e406136ea4c3d1f89a5db71d42e0e6/httpx-0.27.2.tar.gz", hash = "sha256:f7c2be1d2f3c3c3160d441802406b206c2b76f5947b11115e6df10c6c65e66c2", size = 144189, upload-time = "2024-08-27T12:54:01.334Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/56/95/9377bcb415797e44274b51d46e3249eba641711cf3348050f76ee7b15ffc/httpx-0.27.2-py3-none-any.whl", hash = "sha256:7bb2708e112d8fdd7829cd4243970f0c223274051cb35ee80c03301ee29a3df0", size = 76395, upload-time = "2024-08-27T12:53:59.653Z" },
]

[[package]]
name = "hypothesis"
version = "6.148.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "sortedcontainers" },
]
sdist = { url = "https://files.pythonhosted.org/packages/e1/3d/41da3727e5f3e6b0c79b9657946c742e2f61d24edcde3e1660e337509586/hypothesis-6.148.3.tar.gz", hash = "sha256:bd81221740d8658473060ad900dc831f889f156fdb41210ba2f47cfad10a66ed", size = 469896, upload-time = "2025-11-27T06:34:09.419Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/0a/61/8c9fd9397eb46ac54d974be8b9e619c386d6b47a462d8df962ebb79980f9/hypothesis-6.148.3-py3-none-any.whl", hash = "sha256:e7dd193da9800234ec5e1541c1eddde4bddff49b53faf690ba68a0af55a7abb3", size = 536925, upload-time = "2025-11-27T06:34:06.978Z" },
]

[[package]]
name = "idna"
version = "3.11"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/6f/6d/0703ccc57f3a7233505399edb88de3cbd678da106337b9fcde432b65ed60/idna-3.11.tar.gz", hash = "sha256:795dafcc9c04ed0c1fb032c2aa73654d8e8c5023a7df64a53f39190ada629902", size = 194582, upload-time = "2025-10-12T14:55:20.501Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/0e/61/66938bbb5fc52dbdf84594873d5b51fb1f7c7794e9c0f5bd885f30bc507b/idna-3.11-py3-none-any.whl", hash = "sha256:771a87f49d9defaf64091e6e6fe9c18d4833f140bd19464795bc32d966ca37ea", size = 71008, upload-time = "2025-10-12T14:55:18.883Z" },
]

[[package]]
name = "iniconfig"
version = "2.3.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/72/34/14ca021ce8e5dfedc35312d08ba8bf51fdd999c576889fc2c24cb97f4f10/iniconfig-2.3.0.tar.gz", hash = "sha256:c76315c77db068650d49c5b56314774a7804df16fee4402c1f19d6d15d8c4730", size = 20503, upload-time = "2025-10-18T21:55:43.219Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/cb/b1/3846dd7f199d53cb17f49cba7e651e9ce294d8497c8c150530ed11865bb8/iniconfig-2.3.0-py3-none-any.whl", hash = "sha256:f631c04d2c48c52b84d0d0549c99ff3859c98df65b3101406327ecc7d53fbf12", size = 7484, upload-time = "2025-10-18T21:55:41.639Z" },
]

[[package]]
name = "packaging"
version = "25.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a1/d4/1fc4078c65507b51b96ca8f8c3ba19e6a61c8253c72794544580a7b6c24d/packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f", size = 165727, upload-time = "2025-04-19T11:48:59.673Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/20/12/38679034af332785aac8774540895e234f4d07f7545804097de4b666afd8/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484", size = 66469, upload-time = "2025-04-19T11:48:57.875Z" },
]

[[package]]
name = "pluggy"
version = "1.6.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f9/e2/3e91f31a7d2b083fe6ef3fa267035b518369d9511ffab804f839851d2779/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3", size = 69412, upload-time = "2025-05-15T12:30:07.975Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538, upload-time = "2025-05-15T12:30:06.134Z" },
]

[[package]]
name = "pydantic"
version = "2.12.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "annotated-types" },
{ name = "pydantic-core" },
{ name = "typing-extensions" },
{ name = "typing-inspection" },
]
sdist = { url = "https://files.pythonhosted.org/packages/69/44/36f1a6e523abc58ae5f928898e4aca2e0ea509b5aa6f6f392a5d882be928/pydantic-2.12.5.tar.gz", hash = "sha256:4d351024c75c0f085a9febbb665ce8c0c6ec5d30e903bdb6394b7ede26aebb49", size = 821591, upload-time = "2025-11-26T15:11:46.471Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/5a/87/b70ad306ebb6f9b585f114d0ac2137d792b48be34d732d60e597c2f8465a/pydantic-2.12.5-py3-none-any.whl", hash = "sha256:e561593fccf61e8a20fc46dfc2dfe075b8be7d0188df33f221ad1f0139180f9d", size = 463580, upload-time = "2025-11-26T15:11:44.605Z" },
]

[[package]]
name = "pydantic-core"
version = "2.41.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/71/70/23b021c950c2addd24ec408e9ab05d59b035b39d97cdc1130e1bce647bb6/pydantic_core-2.41.5.tar.gz", hash = "sha256:08daa51ea16ad373ffd5e7606252cc32f07bc72b28284b6bc9c6df804816476e", size = 460952, upload-time = "2025-11-04T13:43:49.098Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/5f/5d/5f6c63eebb5afee93bcaae4ce9a898f3373ca23df3ccaef086d0233a35a7/pydantic_core-2.41.5-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:f41a7489d32336dbf2199c8c0a215390a751c5b014c2c1c5366e817202e9cdf7", size = 2110990, upload-time = "2025-11-04T13:39:58.079Z" },
{ url = "https://files.pythonhosted.org/packages/aa/32/9c2e8ccb57c01111e0fd091f236c7b371c1bccea0fa85247ac55b1e2b6b6/pydantic_core-2.41.5-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:070259a8818988b9a84a449a2a7337c7f430a22acc0859c6b110aa7212a6d9c0", size = 1896003, upload-time = "2025-11-04T13:39:59.956Z" },
{ url = "https://files.pythonhosted.org/packages/68/b8/a01b53cb0e59139fbc9e4fda3e9724ede8de279097179be4ff31f1abb65a/pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e96cea19e34778f8d59fe40775a7a574d95816eb150850a85a7a4c8f4b94ac69", size = 1919200, upload-time = "2025-11-04T13:40:02.241Z" },
{ url = "https://files.pythonhosted.org/packages/38/de/8c36b5198a29bdaade07b5985e80a233a5ac27137846f3bc2d3b40a47360/pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ed2e99c456e3fadd05c991f8f437ef902e00eedf34320ba2b0842bd1c3ca3a75", size = 2052578, upload-time = "2025-11-04T13:40:04.401Z" },
{ url = "https://files.pythonhosted.org/packages/00/b5/0e8e4b5b081eac6cb3dbb7e60a65907549a1ce035a724368c330112adfdd/pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:65840751b72fbfd82c3c640cff9284545342a4f1eb1586ad0636955b261b0b05", size = 2208504, upload-time = "2025-11-04T13:40:06.072Z" },
{ url = "https://files.pythonhosted.org/packages/77/56/87a61aad59c7c5b9dc8caad5a41a5545cba3810c3e828708b3d7404f6cef/pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e536c98a7626a98feb2d3eaf75944ef6f3dbee447e1f841eae16f2f0a72d8ddc", size = 2335816, upload-time = "2025-11-04T13:40:07.835Z" },
{ url = "https://files.pythonhosted.org/packages/0d/76/941cc9f73529988688a665a5c0ecff1112b3d95ab48f81db5f7606f522d3/pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eceb81a8d74f9267ef4081e246ffd6d129da5d87e37a77c9bde550cb04870c1c", size = 2075366, upload-time = "2025-11-04T13:40:09.804Z" },
{ url = "https://files.pythonhosted.org/packages/d3/43/ebef01f69baa07a482844faaa0a591bad1ef129253ffd0cdaa9d8a7f72d3/pydantic_core-2.41.5-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d38548150c39b74aeeb0ce8ee1d8e82696f4a4e16ddc6de7b1d8823f7de4b9b5", size = 2171698, upload-time = "2025-11-04T13:40:12.004Z" },
{ url = "https://files.pythonhosted.org/packages/b1/87/41f3202e4193e3bacfc2c065fab7706ebe81af46a83d3e27605029c1f5a6/pydantic_core-2.41.5-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:c23e27686783f60290e36827f9c626e63154b82b116d7fe9adba1fda36da706c", size = 2132603, upload-time = "2025-11-04T13:40:13.868Z" },
{ url = "https://files.pythonhosted.org/packages/49/7d/4c00df99cb12070b6bccdef4a195255e6020a550d572768d92cc54dba91a/pydantic_core-2.41.5-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:482c982f814460eabe1d3bb0adfdc583387bd4691ef00b90575ca0d2b6fe2294", size = 2329591, upload-time = "2025-11-04T13:40:15.672Z" },
{ url = "https://files.pythonhosted.org/packages/cc/6a/ebf4b1d65d458f3cda6a7335d141305dfa19bdc61140a884d165a8a1bbc7/pydantic_core-2.41.5-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:bfea2a5f0b4d8d43adf9d7b8bf019fb46fdd10a2e5cde477fbcb9d1fa08c68e1", size = 2319068, upload-time = "2025-11-04T13:40:17.532Z" },
{ url = "https://files.pythonhosted.org/packages/49/3b/774f2b5cd4192d5ab75870ce4381fd89cf218af999515baf07e7206753f0/pydantic_core-2.41.5-cp312-cp312-win32.whl", hash = "sha256:b74557b16e390ec12dca509bce9264c3bbd128f8a2c376eaa68003d7f327276d", size = 1985908, upload-time = "2025-11-04T13:40:19.309Z" },
{ url = "https://files.pythonhosted.org/packages/86/45/00173a033c801cacf67c190fef088789394feaf88a98a7035b0e40d53dc9/pydantic_core-2.41.5-cp312-cp312-win_amd64.whl", hash = "sha256:1962293292865bca8e54702b08a4f26da73adc83dd1fcf26fbc875b35d81c815", size = 2020145, upload-time = "2025-11-04T13:40:21.548Z" },
{ url = "https://files.pythonhosted.org/packages/f9/22/91fbc821fa6d261b376a3f73809f907cec5ca6025642c463d3488aad22fb/pydantic_core-2.41.5-cp312-cp312-win_arm64.whl", hash = "sha256:1746d4a3d9a794cacae06a5eaaccb4b8643a131d45fbc9af23e353dc0a5ba5c3", size = 1976179, upload-time = "2025-11-04T13:40:23.393Z" },
{ url = "https://files.pythonhosted.org/packages/87/06/8806241ff1f70d9939f9af039c6c35f2360cf16e93c2ca76f184e76b1564/pydantic_core-2.41.5-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:941103c9be18ac8daf7b7adca8228f8ed6bb7a1849020f643b3a14d15b1924d9", size = 2120403, upload-time = "2025-11-04T13:40:25.248Z" },
{ url = "https://files.pythonhosted.org/packages/94/02/abfa0e0bda67faa65fef1c84971c7e45928e108fe24333c81f3bfe35d5f5/pydantic_core-2.41.5-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:112e305c3314f40c93998e567879e887a3160bb8689ef3d2c04b6cc62c33ac34", size = 1896206, upload-time = "2025-11-04T13:40:27.099Z" },
{ url = "https://files.pythonhosted.org/packages/15/df/a4c740c0943e93e6500f9eb23f4ca7ec9bf71b19e608ae5b579678c8d02f/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0cbaad15cb0c90aa221d43c00e77bb33c93e8d36e0bf74760cd00e732d10a6a0", size = 1919307, upload-time = "2025-11-04T13:40:29.806Z" },
{ url = "https://files.pythonhosted.org/packages/9a/e3/6324802931ae1d123528988e0e86587c2072ac2e5394b4bc2bc34b61ff6e/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:03ca43e12fab6023fc79d28ca6b39b05f794ad08ec2feccc59a339b02f2b3d33", size = 2063258, upload-time = "2025-11-04T13:40:33.544Z" },
{ url = "https://files.pythonhosted.org/packages/c9/d4/2230d7151d4957dd79c3044ea26346c148c98fbf0ee6ebd41056f2d62ab5/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:dc799088c08fa04e43144b164feb0c13f9a0bc40503f8df3e9fde58a3c0c101e", size = 2214917, upload-time = "2025-11-04T13:40:35.479Z" },
{ url = "https://files.pythonhosted.org/packages/e6/9f/eaac5df17a3672fef0081b6c1bb0b82b33ee89aa5cec0d7b05f52fd4a1fa/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:97aeba56665b4c3235a0e52b2c2f5ae9cd071b8a8310ad27bddb3f7fb30e9aa2", size = 2332186, upload-time = "2025-11-04T13:40:37.436Z" },
{ url = "https://files.pythonhosted.org/packages/cf/4e/35a80cae583a37cf15604b44240e45c05e04e86f9cfd766623149297e971/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:406bf18d345822d6c21366031003612b9c77b3e29ffdb0f612367352aab7d586", size = 2073164, upload-time = "2025-11-04T13:40:40.289Z" },
{ url = "https://files.pythonhosted.org/packages/bf/e3/f6e262673c6140dd3305d144d032f7bd5f7497d3871c1428521f19f9efa2/pydantic_core-2.41.5-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b93590ae81f7010dbe380cdeab6f515902ebcbefe0b9327cc4804d74e93ae69d", size = 2179146, upload-time = "2025-11-04T13:40:42.809Z" },
{ url = "https://files.pythonhosted.org/packages/75/c7/20bd7fc05f0c6ea2056a4565c6f36f8968c0924f19b7d97bbfea55780e73/pydantic_core-2.41.5-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:01a3d0ab748ee531f4ea6c3e48ad9dac84ddba4b0d82291f87248f2f9de8d740", size = 2137788, upload-time = "2025-11-04T13:40:44.752Z" },
{ url = "https://files.pythonhosted.org/packages/3a/8d/34318ef985c45196e004bc46c6eab2eda437e744c124ef0dbe1ff2c9d06b/pydantic_core-2.41.5-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:6561e94ba9dacc9c61bce40e2d6bdc3bfaa0259d3ff36ace3b1e6901936d2e3e", size = 2340133, upload-time = "2025-11-04T13:40:46.66Z" },
{ url = "https://files.pythonhosted.org/packages/9c/59/013626bf8c78a5a5d9350d12e7697d3d4de951a75565496abd40ccd46bee/pydantic_core-2.41.5-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:915c3d10f81bec3a74fbd4faebe8391013ba61e5a1a8d48c4455b923bdda7858", size = 2324852, upload-time = "2025-11-04T13:40:48.575Z" },
{ url = "https://files.pythonhosted.org/packages/1a/d9/c248c103856f807ef70c18a4f986693a46a8ffe1602e5d361485da502d20/pydantic_core-2.41.5-cp313-cp313-win32.whl", hash = "sha256:650ae77860b45cfa6e2cdafc42618ceafab3a2d9a3811fcfbd3bbf8ac3c40d36", size = 1994679, upload-time = "2025-11-04T13:40:50.619Z" },
{ url = "https://files.pythonhosted.org/packages/9e/8b/341991b158ddab181cff136acd2552c9f35bd30380422a639c0671e99a91/pydantic_core-2.41.5-cp313-cp313-win_amd64.whl", hash = "sha256:79ec52ec461e99e13791ec6508c722742ad745571f234ea6255bed38c6480f11", size = 2019766, upload-time = "2025-11-04T13:40:52.631Z" },
{ url = "https://files.pythonhosted.org/packages/73/7d/f2f9db34af103bea3e09735bb40b021788a5e834c81eedb541991badf8f5/pydantic_core-2.41.5-cp313-cp313-win_arm64.whl", hash = "sha256:3f84d5c1b4ab906093bdc1ff10484838aca54ef08de4afa9de0f5f14d69639cd", size = 1981005, upload-time = "2025-11-04T13:40:54.734Z" },
{ url = "https://files.pythonhosted.org/packages/ea/28/46b7c5c9635ae96ea0fbb779e271a38129df2550f763937659ee6c5dbc65/pydantic_core-2.41.5-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:3f37a19d7ebcdd20b96485056ba9e8b304e27d9904d233d7b1015db320e51f0a", size = 2119622, upload-time = "2025-11-04T13:40:56.68Z" },
{ url = "https://files.pythonhosted.org/packages/74/1a/145646e5687e8d9a1e8d09acb278c8535ebe9e972e1f162ed338a622f193/pydantic_core-2.41.5-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:1d1d9764366c73f996edd17abb6d9d7649a7eb690006ab6adbda117717099b14", size = 1891725, upload-time = "2025-11-04T13:40:58.807Z" },
{ url = "https://files.pythonhosted.org/packages/23/04/e89c29e267b8060b40dca97bfc64a19b2a3cf99018167ea1677d96368273/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:25e1c2af0fce638d5f1988b686f3b3ea8cd7de5f244ca147c777769e798a9cd1", size = 1915040, upload-time = "2025-11-04T13:41:00.853Z" },
{ url = "https://files.pythonhosted.org/packages/84/a3/15a82ac7bd97992a82257f777b3583d3e84bdb06ba6858f745daa2ec8a85/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:506d766a8727beef16b7adaeb8ee6217c64fc813646b424d0804d67c16eddb66", size = 2063691, upload-time = "2025-11-04T13:41:03.504Z" },
{ url = "https://files.pythonhosted.org/packages/74/9b/0046701313c6ef08c0c1cf0e028c67c770a4e1275ca73131563c5f2a310a/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4819fa52133c9aa3c387b3328f25c1facc356491e6135b459f1de698ff64d869", size = 2213897, upload-time = "2025-11-04T13:41:05.804Z" },
{ url = "https://files.pythonhosted.org/packages/8a/cd/6bac76ecd1b27e75a95ca3a9a559c643b3afcd2dd62086d4b7a32a18b169/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2b761d210c9ea91feda40d25b4efe82a1707da2ef62901466a42492c028553a2", size = 2333302, upload-time = "2025-11-04T13:41:07.809Z" },
{ url = "https://files.pythonhosted.org/packages/4c/d2/ef2074dc020dd6e109611a8be4449b98cd25e1b9b8a303c2f0fca2f2bcf7/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:22f0fb8c1c583a3b6f24df2470833b40207e907b90c928cc8d3594b76f874375", size = 2064877, upload-time = "2025-11-04T13:41:09.827Z" },
{ url = "https://files.pythonhosted.org/packages/18/66/e9db17a9a763d72f03de903883c057b2592c09509ccfe468187f2a2eef29/pydantic_core-2.41.5-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2782c870e99878c634505236d81e5443092fba820f0373997ff75f90f68cd553", size = 2180680, upload-time = "2025-11-04T13:41:12.379Z" },
{ url = "https://files.pythonhosted.org/packages/d3/9e/3ce66cebb929f3ced22be85d4c2399b8e85b622db77dad36b73c5387f8f8/pydantic_core-2.41.5-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:0177272f88ab8312479336e1d777f6b124537d47f2123f89cb37e0accea97f90", size = 2138960, upload-time = "2025-11-04T13:41:14.627Z" },
{ url = "https://files.pythonhosted.org/packages/a6/62/205a998f4327d2079326b01abee48e502ea739d174f0a89295c481a2272e/pydantic_core-2.41.5-cp314-cp314-musllinux_1_1_armv7l.whl", hash = "sha256:63510af5e38f8955b8ee5687740d6ebf7c2a0886d15a6d65c32814613681bc07", size = 2339102, upload-time = "2025-11-04T13:41:16.868Z" },
{ url = "https://files.pythonhosted.org/packages/3c/0d/f05e79471e889d74d3d88f5bd20d0ed189ad94c2423d81ff8d0000aab4ff/pydantic_core-2.41.5-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:e56ba91f47764cc14f1daacd723e3e82d1a89d783f0f5afe9c364b8bb491ccdb", size = 2326039, upload-time = "2025-11-04T13:41:18.934Z" },
{ url = "https://files.pythonhosted.org/packages/ec/e1/e08a6208bb100da7e0c4b288eed624a703f4d129bde2da475721a80cab32/pydantic_core-2.41.5-cp314-cp314-win32.whl", hash = "sha256:aec5cf2fd867b4ff45b9959f8b20ea3993fc93e63c7363fe6851424c8a7e7c23", size = 1995126, upload-time = "2025-11-04T13:41:21.418Z" },
{ url = "https://files.pythonhosted.org/packages/48/5d/56ba7b24e9557f99c9237e29f5c09913c81eeb2f3217e40e922353668092/pydantic_core-2.41.5-cp314-cp314-win_amd64.whl", hash = "sha256:8e7c86f27c585ef37c35e56a96363ab8de4e549a95512445b85c96d3e2f7c1bf", size = 2015489, upload-time = "2025-11-04T13:41:24.076Z" },
{ url = "https://files.pythonhosted.org/packages/4e/bb/f7a190991ec9e3e0ba22e4993d8755bbc4a32925c0b5b42775c03e8148f9/pydantic_core-2.41.5-cp314-cp314-win_arm64.whl", hash = "sha256:e672ba74fbc2dc8eea59fb6d4aed6845e6905fc2a8afe93175d94a83ba2a01a0", size = 1977288, upload-time = "2025-11-04T13:41:26.33Z" },
{ url = "https://files.pythonhosted.org/packages/92/ed/77542d0c51538e32e15afe7899d79efce4b81eee631d99850edc2f5e9349/pydantic_core-2.41.5-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:8566def80554c3faa0e65ac30ab0932b9e3a5cd7f8323764303d468e5c37595a", size = 2120255, upload-time = "2025-11-04T13:41:28.569Z" },
{ url = "https://files.pythonhosted.org/packages/bb/3d/6913dde84d5be21e284439676168b28d8bbba5600d838b9dca99de0fad71/pydantic_core-2.41.5-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:b80aa5095cd3109962a298ce14110ae16b8c1aece8b72f9dafe81cf597ad80b3", size = 1863760, upload-time = "2025-11-04T13:41:31.055Z" },
{ url = "https://files.pythonhosted.org/packages/5a/f0/e5e6b99d4191da102f2b0eb9687aaa7f5bea5d9964071a84effc3e40f997/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3006c3dd9ba34b0c094c544c6006cc79e87d8612999f1a5d43b769b89181f23c", size = 1878092, upload-time = "2025-11-04T13:41:33.21Z" },
{ url = "https://files.pythonhosted.org/packages/71/48/36fb760642d568925953bcc8116455513d6e34c4beaa37544118c36aba6d/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:72f6c8b11857a856bcfa48c86f5368439f74453563f951e473514579d44aa612", size = 2053385, upload-time = "2025-11-04T13:41:35.508Z" },
{ url = "https://files.pythonhosted.org/packages/20/25/92dc684dd8eb75a234bc1c764b4210cf2646479d54b47bf46061657292a8/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5cb1b2f9742240e4bb26b652a5aeb840aa4b417c7748b6f8387927bc6e45e40d", size = 2218832, upload-time = "2025-11-04T13:41:37.732Z" },
{ url = "https://files.pythonhosted.org/packages/e2/09/f53e0b05023d3e30357d82eb35835d0f6340ca344720a4599cd663dca599/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bd3d54f38609ff308209bd43acea66061494157703364ae40c951f83ba99a1a9", size = 2327585, upload-time = "2025-11-04T13:41:40Z" },
{ url = "https://files.pythonhosted.org/packages/aa/4e/2ae1aa85d6af35a39b236b1b1641de73f5a6ac4d5a7509f77b814885760c/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2ff4321e56e879ee8d2a879501c8e469414d948f4aba74a2d4593184eb326660", size = 2041078, upload-time = "2025-11-04T13:41:42.323Z" },
{ url = "https://files.pythonhosted.org/packages/cd/13/2e215f17f0ef326fc72afe94776edb77525142c693767fc347ed6288728d/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d0d2568a8c11bf8225044aa94409e21da0cb09dcdafe9ecd10250b2baad531a9", size = 2173914, upload-time = "2025-11-04T13:41:45.221Z" },
{ url = "https://files.pythonhosted.org/packages/02/7a/f999a6dcbcd0e5660bc348a3991c8915ce6599f4f2c6ac22f01d7a10816c/pydantic_core-2.41.5-cp314-cp314t-musllinux_1_1_aarch64.whl", hash = "sha256:a39455728aabd58ceabb03c90e12f71fd30fa69615760a075b9fec596456ccc3", size = 2129560, upload-time = "2025-11-04T13:41:47.474Z" },
{ url = "https://files.pythonhosted.org/packages/3a/b1/6c990ac65e3b4c079a4fb9f5b05f5b013afa0f4ed6780a3dd236d2cbdc64/pydantic_core-2.41.5-cp314-cp314t-musllinux_1_1_armv7l.whl", hash = "sha256:239edca560d05757817c13dc17c50766136d21f7cd0fac50295499ae24f90fdf", size = 2329244, upload-time = "2025-11-04T13:41:49.992Z" },
{ url = "https://files.pythonhosted.org/packages/d9/02/3c562f3a51afd4d88fff8dffb1771b30cfdfd79befd9883ee094f5b6c0d8/pydantic_core-2.41.5-cp314-cp314t-musllinux_1_1_x86_64.whl", hash = "sha256:2a5e06546e19f24c6a96a129142a75cee553cc018ffee48a460059b1185f4470", size = 2331955, upload-time = "2025-11-04T13:41:54.079Z" },
{ url = "https://files.pythonhosted.org/packages/5c/96/5fb7d8c3c17bc8c62fdb031c47d77a1af698f1d7a406b0f79aaa1338f9ad/pydantic_core-2.41.5-cp314-cp314t-win32.whl", hash = "sha256:b4ececa40ac28afa90871c2cc2b9ffd2ff0bf749380fbdf57d165fd23da353aa", size = 1988906, upload-time = "2025-11-04T13:41:56.606Z" },
{ url = "https://files.pythonhosted.org/packages/22/ed/182129d83032702912c2e2d8bbe33c036f342cc735737064668585dac28f/pydantic_core-2.41.5-cp314-cp314t-win_amd64.whl", hash = "sha256:80aa89cad80b32a912a65332f64a4450ed00966111b6615ca6816153d3585a8c", size = 1981607, upload-time = "2025-11-04T13:41:58.889Z" },
{ url = "https://files.pythonhosted.org/packages/9f/ed/068e41660b832bb0b1aa5b58011dea2a3fe0ba7861ff38c4d4904c1c1a99/pydantic_core-2.41.5-cp314-cp314t-win_arm64.whl", hash = "sha256:35b44f37a3199f771c3eaa53051bc8a70cd7b54f333531c59e29fd4db5d15008", size = 1974769, upload-time = "2025-11-04T13:42:01.186Z" },
{ url = "https://files.pythonhosted.org/packages/09/32/59b0c7e63e277fa7911c2fc70ccfb45ce4b98991e7ef37110663437005af/pydantic_core-2.41.5-graalpy312-graalpy250_312_native-macosx_10_12_x86_64.whl", hash = "sha256:7da7087d756b19037bc2c06edc6c170eeef3c3bafcb8f532ff17d64dc427adfd", size = 2110495, upload-time = "2025-11-04T13:42:49.689Z" },
{ url = "https://files.pythonhosted.org/packages/aa/81/05e400037eaf55ad400bcd318c05bb345b57e708887f07ddb2d20e3f0e98/pydantic_core-2.41.5-graalpy312-graalpy250_312_native-macosx_11_0_arm64.whl", hash = "sha256:aabf5777b5c8ca26f7824cb4a120a740c9588ed58df9b2d196ce92fba42ff8dc", size = 1915388, upload-time = "2025-11-04T13:42:52.215Z" },
{ url = "https://files.pythonhosted.org/packages/6e/0d/e3549b2399f71d56476b77dbf3cf8937cec5cd70536bdc0e374a421d0599/pydantic_core-2.41.5-graalpy312-graalpy250_312_native-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c007fe8a43d43b3969e8469004e9845944f1a80e6acd47c150856bb87f230c56", size = 1942879, upload-time = "2025-11-04T13:42:56.483Z" },
{ url = "https://files.pythonhosted.org/packages/f7/07/34573da085946b6a313d7c42f82f16e8920bfd730665de2d11c0c37a74b5/pydantic_core-2.41.5-graalpy312-graalpy250_312_native-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:76d0819de158cd855d1cbb8fcafdf6f5cf1eb8e470abe056d5d161106e38062b", size = 2139017, upload-time = "2025-11-04T13:42:59.471Z" },
]

[[package]]
name = "pydantic-settings"
version = "2.12.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pydantic" },
{ name = "python-dotenv" },
{ name = "typing-inspection" },
]
sdist = { url = "https://files.pythonhosted.org/packages/43/4b/ac7e0aae12027748076d72a8764ff1c9d82ca75a7a52622e67ed3f765c54/pydantic_settings-2.12.0.tar.gz", hash = "sha256:005538ef951e3c2a68e1c08b292b5f2e71490def8589d4221b95dab00dafcfd0", size = 194184, upload-time = "2025-11-10T14:25:47.013Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c1/60/5d4751ba3f4a40a6891f24eec885f51afd78d208498268c734e256fb13c4/pydantic_settings-2.12.0-py3-none-any.whl", hash = "sha256:fddb9fd99a5b18da837b29710391e945b1e30c135477f484084ee513adb93809", size = 51880, upload-time = "2025-11-10T14:25:45.546Z" },
]

[[package]]
name = "pygments"
version = "2.19.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/b0/77/a5b8c569bf593b0140bde72ea885a803b82086995367bf2037de0159d924/pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887", size = 4968631, upload-time = "2025-06-21T13:39:12.283Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c7/21/705964c7812476f378728bdf590ca4b771ec72385c533964653c68e86bdc/pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b", size = 1225217, upload-time = "2025-06-21T13:39:07.939Z" },
]

[[package]]
name = "pytest"
version = "8.4.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "colorama", marker = "sys_platform == 'win32'" },
{ name = "iniconfig" },
{ name = "packaging" },
{ name = "pluggy" },
{ name = "pygments" },
]
sdist = { url = "https://files.pythonhosted.org/packages/a3/5c/00a0e072241553e1a7496d638deababa67c5058571567b92a7eaa258397c/pytest-8.4.2.tar.gz", hash = "sha256:86c0d0b93306b961d58d62a4db4879f27fe25513d4b969df351abdddb3c30e01", size = 1519618, upload-time = "2025-09-04T14:34:22.711Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a8/a4/20da314d277121d6534b3a980b29035dcd51e6744bd79075a6ce8fa4eb8d/pytest-8.4.2-py3-none-any.whl", hash = "sha256:872f880de3fc3a5bdc88a11b39c9710c3497a547cfa9320bc3c5e62fbf272e79", size = 365750, upload-time = "2025-09-04T14:34:20.226Z" },
]

[[package]]
name = "pytest-cov"
version = "5.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "coverage" },
{ name = "pytest" },
]
sdist = { url = "https://files.pythonhosted.org/packages/74/67/00efc8d11b630c56f15f4ad9c7f9223f1e5ec275aaae3fa9118c6a223ad2/pytest-cov-5.0.0.tar.gz", hash = "sha256:5837b58e9f6ebd335b0f8060eecce69b662415b16dc503883a02f45dfeb14857", size = 63042, upload-time = "2024-03-24T20:16:34.856Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/78/3a/af5b4fa5961d9a1e6237b530eb87dd04aea6eb83da09d2a4073d81b54ccf/pytest_cov-5.0.0-py3-none-any.whl", hash = "sha256:4f0764a1219df53214206bf1feea4633c3b558a2925c8b59f144f682861ce652", size = 21990, upload-time = "2024-03-24T20:16:32.444Z" },
]

[[package]]
name = "python-dotenv"
version = "1.2.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f0/26/19cadc79a718c5edbec86fd4919a6b6d3f681039a2f6d66d14be94e75fb9/python_dotenv-1.2.1.tar.gz", hash = "sha256:42667e897e16ab0d66954af0e60a9caa94f0fd4ecf3aaf6d2d260eec1aa36ad6", size = 44221, upload-time = "2025-10-26T15:12:10.434Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/14/1b/a298b06749107c305e1fe0f814c6c74aea7b2f1e10989cb30f544a1b3253/python_dotenv-1.2.1-py3-none-any.whl", hash = "sha256:b81ee9561e9ca4004139c6cbba3a238c32b03e4894671e181b671e8cb8425d61", size = 21230, upload-time = "2025-10-26T15:12:09.109Z" },
]

[[package]]
name = "pyyaml"
version = "6.0.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/05/8e/961c0007c59b8dd7729d542c61a4d537767a59645b82a0b521206e1e25c2/pyyaml-6.0.3.tar.gz", hash = "sha256:d76623373421df22fb4cf8817020cbb7ef15c725b9d5e45f17e189bfc384190f", size = 130960, upload-time = "2025-09-25T21:33:16.546Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d1/33/422b98d2195232ca1826284a76852ad5a86fe23e31b009c9886b2d0fb8b2/pyyaml-6.0.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:7f047e29dcae44602496db43be01ad42fc6f1cc0d8cd6c83d342306c32270196", size = 182063, upload-time = "2025-09-25T21:32:11.445Z" },
{ url = "https://files.pythonhosted.org/packages/89/a0/6cf41a19a1f2f3feab0e9c0b74134aa2ce6849093d5517a0c550fe37a648/pyyaml-6.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:fc09d0aa354569bc501d4e787133afc08552722d3ab34836a80547331bb5d4a0", size = 173973, upload-time = "2025-09-25T21:32:12.492Z" },
{ url = "https://files.pythonhosted.org/packages/ed/23/7a778b6bd0b9a8039df8b1b1d80e2e2ad78aa04171592c8a5c43a56a6af4/pyyaml-6.0.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9149cad251584d5fb4981be1ecde53a1ca46c891a79788c0df828d2f166bda28", size = 775116, upload-time = "2025-09-25T21:32:13.652Z" },
{ url = "https://files.pythonhosted.org/packages/65/30/d7353c338e12baef4ecc1b09e877c1970bd3382789c159b4f89d6a70dc09/pyyaml-6.0.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5fdec68f91a0c6739b380c83b951e2c72ac0197ace422360e6d5a959d8d97b2c", size = 844011, upload-time = "2025-09-25T21:32:15.21Z" },
{ url = "https://files.pythonhosted.org/packages/8b/9d/b3589d3877982d4f2329302ef98a8026e7f4443c765c46cfecc8858c6b4b/pyyaml-6.0.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ba1cc08a7ccde2d2ec775841541641e4548226580ab850948cbfda66a1befcdc", size = 807870, upload-time = "2025-09-25T21:32:16.431Z" },
{ url = "https://files.pythonhosted.org/packages/05/c0/b3be26a015601b822b97d9149ff8cb5ead58c66f981e04fedf4e762f4bd4/pyyaml-6.0.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:8dc52c23056b9ddd46818a57b78404882310fb473d63f17b07d5c40421e47f8e", size = 761089, upload-time = "2025-09-25T21:32:17.56Z" },
{ url = "https://files.pythonhosted.org/packages/be/8e/98435a21d1d4b46590d5459a22d88128103f8da4c2d4cb8f14f2a96504e1/pyyaml-6.0.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:41715c910c881bc081f1e8872880d3c650acf13dfa8214bad49ed4cede7c34ea", size = 790181, upload-time = "2025-09-25T21:32:18.834Z" },
{ url = "https://files.pythonhosted.org/packages/74/93/7baea19427dcfbe1e5a372d81473250b379f04b1bd3c4c5ff825e2327202/pyyaml-6.0.3-cp312-cp312-win32.whl", hash = "sha256:96b533f0e99f6579b3d4d4995707cf36df9100d67e0c8303a0c55b27b5f99bc5", size = 137658, upload-time = "2025-09-25T21:32:20.209Z" },
{ url = "https://files.pythonhosted.org/packages/86/bf/899e81e4cce32febab4fb42bb97dcdf66bc135272882d1987881a4b519e9/pyyaml-6.0.3-cp312-cp312-win_amd64.whl", hash = "sha256:5fcd34e47f6e0b794d17de1b4ff496c00986e1c83f7ab2fb8fcfe9616ff7477b", size = 154003, upload-time = "2025-09-25T21:32:21.167Z" },
{ url = "https://files.pythonhosted.org/packages/1a/08/67bd04656199bbb51dbed1439b7f27601dfb576fb864099c7ef0c3e55531/pyyaml-6.0.3-cp312-cp312-win_arm64.whl", hash = "sha256:64386e5e707d03a7e172c0701abfb7e10f0fb753ee1d773128192742712a98fd", size = 140344, upload-time = "2025-09-25T21:32:22.617Z" },
{ url = "https://files.pythonhosted.org/packages/d1/11/0fd08f8192109f7169db964b5707a2f1e8b745d4e239b784a5a1dd80d1db/pyyaml-6.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8da9669d359f02c0b91ccc01cac4a67f16afec0dac22c2ad09f46bee0697eba8", size = 181669, upload-time = "2025-09-25T21:32:23.673Z" },
{ url = "https://files.pythonhosted.org/packages/b1/16/95309993f1d3748cd644e02e38b75d50cbc0d9561d21f390a76242ce073f/pyyaml-6.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:2283a07e2c21a2aa78d9c4442724ec1eb15f5e42a723b99cb3d822d48f5f7ad1", size = 173252, upload-time = "2025-09-25T21:32:25.149Z" },
{ url = "https://files.pythonhosted.org/packages/50/31/b20f376d3f810b9b2371e72ef5adb33879b25edb7a6d072cb7ca0c486398/pyyaml-6.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ee2922902c45ae8ccada2c5b501ab86c36525b883eff4255313a253a3160861c", size = 767081, upload-time = "2025-09-25T21:32:26.575Z" },
{ url = "https://files.pythonhosted.org/packages/49/1e/a55ca81e949270d5d4432fbbd19dfea5321eda7c41a849d443dc92fd1ff7/pyyaml-6.0.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a33284e20b78bd4a18c8c2282d549d10bc8408a2a7ff57653c0cf0b9be0afce5", size = 841159, upload-time = "2025-09-25T21:32:27.727Z" },
{ url = "https://files.pythonhosted.org/packages/74/27/e5b8f34d02d9995b80abcef563ea1f8b56d20134d8f4e5e81733b1feceb2/pyyaml-6.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0f29edc409a6392443abf94b9cf89ce99889a1dd5376d94316ae5145dfedd5d6", size = 801626, upload-time = "2025-09-25T21:32:28.878Z" },
{ url = "https://files.pythonhosted.org/packages/f9/11/ba845c23988798f40e52ba45f34849aa8a1f2d4af4b798588010792ebad6/pyyaml-6.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f7057c9a337546edc7973c0d3ba84ddcdf0daa14533c2065749c9075001090e6", size = 753613, upload-time = "2025-09-25T21:32:30.178Z" },
{ url = "https://files.pythonhosted.org/packages/3d/e0/7966e1a7bfc0a45bf0a7fb6b98ea03fc9b8d84fa7f2229e9659680b69ee3/pyyaml-6.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:eda16858a3cab07b80edaf74336ece1f986ba330fdb8ee0d6c0d68fe82bc96be", size = 794115, upload-time = "2025-09-25T21:32:31.353Z" },
{ url = "https://files.pythonhosted.org/packages/de/94/980b50a6531b3019e45ddeada0626d45fa85cbe22300844a7983285bed3b/pyyaml-6.0.3-cp313-cp313-win32.whl", hash = "sha256:d0eae10f8159e8fdad514efdc92d74fd8d682c933a6dd088030f3834bc8e6b26", size = 137427, upload-time = "2025-09-25T21:32:32.58Z" },
{ url = "https://files.pythonhosted.org/packages/97/c9/39d5b874e8b28845e4ec2202b5da735d0199dbe5b8fb85f91398814a9a46/pyyaml-6.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:79005a0d97d5ddabfeeea4cf676af11e647e41d81c9a7722a193022accdb6b7c", size = 154090, upload-time = "2025-09-25T21:32:33.659Z" },
{ url = "https://files.pythonhosted.org/packages/73/e8/2bdf3ca2090f68bb3d75b44da7bbc71843b19c9f2b9cb9b0f4ab7a5a4329/pyyaml-6.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:5498cd1645aa724a7c71c8f378eb29ebe23da2fc0d7a08071d89469bf1d2defb", size = 140246, upload-time = "2025-09-25T21:32:34.663Z" },
{ url = "https://files.pythonhosted.org/packages/9d/8c/f4bd7f6465179953d3ac9bc44ac1a8a3e6122cf8ada906b4f96c60172d43/pyyaml-6.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:8d1fab6bb153a416f9aeb4b8763bc0f22a5586065f86f7664fc23339fc1c1fac", size = 181814, upload-time = "2025-09-25T21:32:35.712Z" },
{ url = "https://files.pythonhosted.org/packages/bd/9c/4d95bb87eb2063d20db7b60faa3840c1b18025517ae857371c4dd55a6b3a/pyyaml-6.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:34d5fcd24b8445fadc33f9cf348c1047101756fd760b4dacb5c3e99755703310", size = 173809, upload-time = "2025-09-25T21:32:36.789Z" },
{ url = "https://files.pythonhosted.org/packages/92/b5/47e807c2623074914e29dabd16cbbdd4bf5e9b2db9f8090fa64411fc5382/pyyaml-6.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:501a031947e3a9025ed4405a168e6ef5ae3126c59f90ce0cd6f2bfc477be31b7", size = 766454, upload-time = "2025-09-25T21:32:37.966Z" },
{ url = "https://files.pythonhosted.org/packages/02/9e/e5e9b168be58564121efb3de6859c452fccde0ab093d8438905899a3a483/pyyaml-6.0.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b3bc83488de33889877a0f2543ade9f70c67d66d9ebb4ac959502e12de895788", size = 836355, upload-time = "2025-09-25T21:32:39.178Z" },
{ url = "https://files.pythonhosted.org/packages/88/f9/16491d7ed2a919954993e48aa941b200f38040928474c9e85ea9e64222c3/pyyaml-6.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c458b6d084f9b935061bc36216e8a69a7e293a2f1e68bf956dcd9e6cbcd143f5", size = 794175, upload-time = "2025-09-25T21:32:40.865Z" },
{ url = "https://files.pythonhosted.org/packages/dd/3f/5989debef34dc6397317802b527dbbafb2b4760878a53d4166579111411e/pyyaml-6.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7c6610def4f163542a622a73fb39f534f8c101d690126992300bf3207eab9764", size = 755228, upload-time = "2025-09-25T21:32:42.084Z" },
{ url = "https://files.pythonhosted.org/packages/d7/ce/af88a49043cd2e265be63d083fc75b27b6ed062f5f9fd6cdc223ad62f03e/pyyaml-6.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:5190d403f121660ce8d1d2c1bb2ef1bd05b5f68533fc5c2ea899bd15f4399b35", size = 789194, upload-time = "2025-09-25T21:32:43.362Z" },
{ url = "https://files.pythonhosted.org/packages/23/20/bb6982b26a40bb43951265ba29d4c246ef0ff59c9fdcdf0ed04e0687de4d/pyyaml-6.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:4a2e8cebe2ff6ab7d1050ecd59c25d4c8bd7e6f400f5f82b96557ac0abafd0ac", size = 156429, upload-time = "2025-09-25T21:32:57.844Z" },
{ url = "https://files.pythonhosted.org/packages/f4/f4/a4541072bb9422c8a883ab55255f918fa378ecf083f5b85e87fc2b4eda1b/pyyaml-6.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:93dda82c9c22deb0a405ea4dc5f2d0cda384168e466364dec6255b293923b2f3", size = 143912, upload-time = "2025-09-25T21:32:59.247Z" },
{ url = "https://files.pythonhosted.org/packages/7c/f9/07dd09ae774e4616edf6cda684ee78f97777bdd15847253637a6f052a62f/pyyaml-6.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:02893d100e99e03eda1c8fd5c441d8c60103fd175728e23e431db1b589cf5ab3", size = 189108, upload-time = "2025-09-25T21:32:44.377Z" },
{ url = "https://files.pythonhosted.org/packages/4e/78/8d08c9fb7ce09ad8c38ad533c1191cf27f7ae1effe5bb9400a46d9437fcf/pyyaml-6.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:c1ff362665ae507275af2853520967820d9124984e0f7466736aea23d8611fba", size = 183641, upload-time = "2025-09-25T21:32:45.407Z" },
{ url = "https://files.pythonhosted.org/packages/7b/5b/3babb19104a46945cf816d047db2788bcaf8c94527a805610b0289a01c6b/pyyaml-6.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6adc77889b628398debc7b65c073bcb99c4a0237b248cacaf3fe8a557563ef6c", size = 831901, upload-time = "2025-09-25T21:32:48.83Z" },
{ url = "https://files.pythonhosted.org/packages/8b/cc/dff0684d8dc44da4d22a13f35f073d558c268780ce3c6ba1b87055bb0b87/pyyaml-6.0.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a80cb027f6b349846a3bf6d73b5e95e782175e52f22108cfa17876aaeff93702", size = 861132, upload-time = "2025-09-25T21:32:50.149Z" },
{ url = "https://files.pythonhosted.org/packages/b1/5e/f77dc6b9036943e285ba76b49e118d9ea929885becb0a29ba8a7c75e29fe/pyyaml-6.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:00c4bdeba853cc34e7dd471f16b4114f4162dc03e6b7afcc2128711f0eca823c", size = 839261, upload-time = "2025-09-25T21:32:51.808Z" },
{ url = "https://files.pythonhosted.org/packages/ce/88/a9db1376aa2a228197c58b37302f284b5617f56a5d959fd1763fb1675ce6/pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:66e1674c3ef6f541c35191caae2d429b967b99e02040f5ba928632d9a7f0f065", size = 805272, upload-time = "2025-09-25T21:32:52.941Z" },
{ url = "https://files.pythonhosted.org/packages/da/92/1446574745d74df0c92e6aa4a7b0b3130706a4142b2d1a5869f2eaa423c6/pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:16249ee61e95f858e83976573de0f5b2893b3677ba71c9dd36b9cf8be9ac6d65", size = 829923, upload-time = "2025-09-25T21:32:54.537Z" },
{ url = "https://files.pythonhosted.org/packages/f0/7a/1c7270340330e575b92f397352af856a8c06f230aa3e76f86b39d01b416a/pyyaml-6.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4ad1906908f2f5ae4e5a8ddfce73c320c2a1429ec52eafd27138b7f1cbe341c9", size = 174062, upload-time = "2025-09-25T21:32:55.767Z" },
{ url = "https://files.pythonhosted.org/packages/f1/12/de94a39c2ef588c7e6455cfbe7343d3b2dc9d6b6b2f40c4c6565744c873d/pyyaml-6.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:ebc55a14a21cb14062aa4162f906cd962b28e2e9ea38f9b4391244cd8de4ae0b", size = 149341, upload-time = "2025-09-25T21:32:56.828Z" },
]

[[package]]
name = "sniffio"
version = "1.3.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372, upload-time = "2024-02-25T23:20:04.057Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235, upload-time = "2024-02-25T23:20:01.196Z" },
]

[[package]]
name = "sortedcontainers"
version = "2.4.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/e8/c4/ba2f8066cceb6f23394729afe52f3bf7adec04bf9ed2c820b39e19299111/sortedcontainers-2.4.0.tar.gz", hash = "sha256:25caa5a06cc30b6b83d11423433f65d1f9d76c4c6a0c90e3379eaa43b9bfdb88", size = 30594, upload-time = "2021-05-16T22:03:42.897Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/32/46/9cb0e58b2deb7f82b84065f37f3bffeb12413f947f9388e4cac22c4621ce/sortedcontainers-2.4.0-py2.py3-none-any.whl", hash = "sha256:a163dcaede0f1c021485e957a39245190e74249897e2ae4b2aa38595db237ee0", size = 29575, upload-time = "2021-05-16T22:03:41.177Z" },
]

[[package]]
name = "starlette"
version = "0.46.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "anyio" },
]
sdist = { url = "https://files.pythonhosted.org/packages/ce/20/08dfcd9c983f6a6f4a1000d934b9e6d626cff8d2eeb77a89a68eef20a2b7/starlette-0.46.2.tar.gz", hash = "sha256:7f7361f34eed179294600af672f565727419830b54b7b084efe44bb82d2fccd5", size = 2580846, upload-time = "2025-04-13T13:56:17.942Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/8b/0c/9d30a4ebeb6db2b25a841afbb80f6ef9a854fc3b41be131d249a977b4959/starlette-0.46.2-py3-none-any.whl", hash = "sha256:595633ce89f8ffa71a015caed34a5b2dc1c0cdb3f0f1fbd1e69339cf2abeec35", size = 72037, upload-time = "2025-04-13T13:56:16.21Z" },
]

[[package]]
name = "typing-extensions"
version = "4.15.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/72/94/1a15dd82efb362ac84269196e94cf00f187f7ed21c242792a923cdb1c61f/typing_extensions-4.15.0.tar.gz", hash = "sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466", size = 109391, upload-time = "2025-08-25T13:49:26.313Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/18/67/36e9267722cc04a6b9f15c7f3441c2363321a3ea07da7ae0c0707beb2a9c/typing_extensions-4.15.0-py3-none-any.whl", hash = "sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548", size = 44614, upload-time = "2025-08-25T13:49:24.86Z" },
]

[[package]]
name = "typing-inspection"
version = "0.4.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/55/e3/70399cb7dd41c10ac53367ae42139cf4b1ca5f36bb3dc6c9d33acdb43655/typing_inspection-0.4.2.tar.gz", hash = "sha256:ba561c48a67c5958007083d386c3295464928b01faa735ab8547c5692e87f464", size = 75949, upload-time = "2025-10-01T02:14:41.687Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/dc/9b/47798a6c91d8bdb567fe2698fe81e0c6b7cb7ef4d13da4114b41d239f65d/typing_inspection-0.4.2-py3-none-any.whl", hash = "sha256:4ed1cacbdc298c220f1bd249ed5287caa16f34d44ef4e9c3d0cbad5b521545e7", size = 14611, upload-time = "2025-10-01T02:14:40.154Z" },
]

[[package]]
name = "uvicorn"
version = "0.30.6"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "click" },
    { name = "h11" },
]
sdist = { url = "https://files.pythonhosted.org/packages/5a/01/5e637e7aa9dd031be5376b9fb749ec20b86f5a5b6a49b87fabd374d5fa9f/uvicorn-0.30.6.tar.gz", hash = "sha256:4b15decdda1e72be08209e860a1e10e92439ad5b97cf44cc945fcbee66fc5788", size = 42825, upload-time = "2024-08-13T09:27:35.098Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/f5/8e/cdc7d6263db313030e4c257dd5ba3909ebc4e4fb53ad62d5f09b1a2f5458/uvicorn-0.30.6-py3-none-any.whl", hash = "sha256:65fd46fe3fda5bdc1b03b94eb634923ff18cd35b2f084813ea79d1f103f711b5", size = 62835, upload-time = "2024-08-13T09:27:33.536Z" },
]

[package.optional-dependencies]
standard = [
    { name = "colorama", marker = "sys_platform == 'win32'" },
    { name = "httptools" },
    { name = "python-dotenv" },
    { name = "pyyaml" },
    { name = "uvloop", marker = "platform_python_implementation != 'PyPy' and sys_platform != 'cygwin' and sys_platform != 'win32'" },
    { name = "watchfiles" },
    { name = "websockets" },
]

[[package]]
name = "uvloop"
version = "0.22.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/06/f0/18d39dbd1971d6d62c4629cc7fa67f74821b0dc1f5a77af43719de7936a7/uvloop-0.22.1.tar.gz", hash = "sha256:6c84bae345b9147082b17371e3dd5d42775bddce91f885499017f4607fdaf39f", size = 2443250, upload-time = "2025-10-16T22:17:19.342Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/3d/ff/7f72e8170be527b4977b033239a83a68d5c881cc4775fca255c677f7ac5d/uvloop-0.22.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:fe94b4564e865d968414598eea1a6de60adba0c040ba4ed05ac1300de402cd42", size = 1359936, upload-time = "2025-10-16T22:16:29.436Z" },
{ url = "https://files.pythonhosted.org/packages/c3/c6/e5d433f88fd54d81ef4be58b2b7b0cea13c442454a1db703a1eea0db1a59/uvloop-0.22.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:51eb9bd88391483410daad430813d982010f9c9c89512321f5b60e2cddbdddd6", size = 752769, upload-time = "2025-10-16T22:16:30.493Z" },
{ url = "https://files.pythonhosted.org/packages/24/68/a6ac446820273e71aa762fa21cdcc09861edd3536ff47c5cd3b7afb10eeb/uvloop-0.22.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:700e674a166ca5778255e0e1dc4e9d79ab2acc57b9171b79e65feba7184b3370", size = 4317413, upload-time = "2025-10-16T22:16:31.644Z" },
{ url = "https://files.pythonhosted.org/packages/5f/6f/e62b4dfc7ad6518e7eff2516f680d02a0f6eb62c0c212e152ca708a0085e/uvloop-0.22.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7b5b1ac819a3f946d3b2ee07f09149578ae76066d70b44df3fa990add49a82e4", size = 4426307, upload-time = "2025-10-16T22:16:32.917Z" },
{ url = "https://files.pythonhosted.org/packages/90/60/97362554ac21e20e81bcef1150cb2a7e4ffdaf8ea1e5b2e8bf7a053caa18/uvloop-0.22.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:e047cc068570bac9866237739607d1313b9253c3051ad84738cbb095be0537b2", size = 4131970, upload-time = "2025-10-16T22:16:34.015Z" },
{ url = "https://files.pythonhosted.org/packages/99/39/6b3f7d234ba3964c428a6e40006340f53ba37993f46ed6e111c6e9141d18/uvloop-0.22.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:512fec6815e2dd45161054592441ef76c830eddaad55c8aa30952e6fe1ed07c0", size = 4296343, upload-time = "2025-10-16T22:16:35.149Z" },
{ url = "https://files.pythonhosted.org/packages/89/8c/182a2a593195bfd39842ea68ebc084e20c850806117213f5a299dfc513d9/uvloop-0.22.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:561577354eb94200d75aca23fbde86ee11be36b00e52a4eaf8f50fb0c86b7705", size = 1358611, upload-time = "2025-10-16T22:16:36.833Z" },
{ url = "https://files.pythonhosted.org/packages/d2/14/e301ee96a6dc95224b6f1162cd3312f6d1217be3907b79173b06785f2fe7/uvloop-0.22.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:1cdf5192ab3e674ca26da2eada35b288d2fa49fdd0f357a19f0e7c4e7d5077c8", size = 751811, upload-time = "2025-10-16T22:16:38.275Z" },
{ url = "https://files.pythonhosted.org/packages/b7/02/654426ce265ac19e2980bfd9ea6590ca96a56f10c76e63801a2df01c0486/uvloop-0.22.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6e2ea3d6190a2968f4a14a23019d3b16870dd2190cd69c8180f7c632d21de68d", size = 4288562, upload-time = "2025-10-16T22:16:39.375Z" },
{ url = "https://files.pythonhosted.org/packages/15/c0/0be24758891ef825f2065cd5db8741aaddabe3e248ee6acc5e8a80f04005/uvloop-0.22.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0530a5fbad9c9e4ee3f2b33b148c6a64d47bbad8000ea63704fa8260f4cf728e", size = 4366890, upload-time = "2025-10-16T22:16:40.547Z" },
{ url = "https://files.pythonhosted.org/packages/d2/53/8369e5219a5855869bcee5f4d317f6da0e2c669aecf0ef7d371e3d084449/uvloop-0.22.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:bc5ef13bbc10b5335792360623cc378d52d7e62c2de64660616478c32cd0598e", size = 4119472, upload-time = "2025-10-16T22:16:41.694Z" },
{ url = "https://files.pythonhosted.org/packages/f8/ba/d69adbe699b768f6b29a5eec7b47dd610bd17a69de51b251126a801369ea/uvloop-0.22.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:1f38ec5e3f18c8a10ded09742f7fb8de0108796eb673f30ce7762ce1b8550cad", size = 4239051, upload-time = "2025-10-16T22:16:43.224Z" },
{ url = "https://files.pythonhosted.org/packages/90/cd/b62bdeaa429758aee8de8b00ac0dd26593a9de93d302bff3d21439e9791d/uvloop-0.22.1-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:3879b88423ec7e97cd4eba2a443aa26ed4e59b45e6b76aabf13fe2f27023a142", size = 1362067, upload-time = "2025-10-16T22:16:44.503Z" },
{ url = "https://files.pythonhosted.org/packages/0d/f8/a132124dfda0777e489ca86732e85e69afcd1ff7686647000050ba670689/uvloop-0.22.1-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:4baa86acedf1d62115c1dc6ad1e17134476688f08c6efd8a2ab076e815665c74", size = 752423, upload-time = "2025-10-16T22:16:45.968Z" },
{ url = "https://files.pythonhosted.org/packages/a3/94/94af78c156f88da4b3a733773ad5ba0b164393e357cc4bd0ab2e2677a7d6/uvloop-0.22.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:297c27d8003520596236bdb2335e6b3f649480bd09e00d1e3a99144b691d2a35", size = 4272437, upload-time = "2025-10-16T22:16:47.451Z" },
{ url = "https://files.pythonhosted.org/packages/b5/35/60249e9fd07b32c665192cec7af29e06c7cd96fa1d08b84f012a56a0b38e/uvloop-0.22.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c1955d5a1dd43198244d47664a5858082a3239766a839b2102a269aaff7a4e25", size = 4292101, upload-time = "2025-10-16T22:16:49.318Z" },
{ url = "https://files.pythonhosted.org/packages/02/62/67d382dfcb25d0a98ce73c11ed1a6fba5037a1a1d533dcbb7cab033a2636/uvloop-0.22.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:b31dc2fccbd42adc73bc4e7cdbae4fc5086cf378979e53ca5d0301838c5682c6", size = 4114158, upload-time = "2025-10-16T22:16:50.517Z" },
{ url = "https://files.pythonhosted.org/packages/f0/7a/f1171b4a882a5d13c8b7576f348acfe6074d72eaf52cccef752f748d4a9f/uvloop-0.22.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:93f617675b2d03af4e72a5333ef89450dfaa5321303ede6e67ba9c9d26878079", size = 4177360, upload-time = "2025-10-16T22:16:52.646Z" },
{ url = "https://files.pythonhosted.org/packages/79/7b/b01414f31546caf0919da80ad57cbfe24c56b151d12af68cee1b04922ca8/uvloop-0.22.1-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:37554f70528f60cad66945b885eb01f1bb514f132d92b6eeed1c90fd54ed6289", size = 1454790, upload-time = "2025-10-16T22:16:54.355Z" },
{ url = "https://files.pythonhosted.org/packages/d4/31/0bb232318dd838cad3fa8fb0c68c8b40e1145b32025581975e18b11fab40/uvloop-0.22.1-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:b76324e2dc033a0b2f435f33eb88ff9913c156ef78e153fb210e03c13da746b3", size = 796783, upload-time = "2025-10-16T22:16:55.906Z" },
{ url = "https://files.pythonhosted.org/packages/42/38/c9b09f3271a7a723a5de69f8e237ab8e7803183131bc57c890db0b6bb872/uvloop-0.22.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:badb4d8e58ee08dad957002027830d5c3b06aea446a6a3744483c2b3b745345c", size = 4647548, upload-time = "2025-10-16T22:16:57.008Z" },
{ url = "https://files.pythonhosted.org/packages/c1/37/945b4ca0ac27e3dc4952642d4c900edd030b3da6c9634875af6e13ae80e5/uvloop-0.22.1-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b91328c72635f6f9e0282e4a57da7470c7350ab1c9f48546c0f2866205349d21", size = 4467065, upload-time = "2025-10-16T22:16:58.206Z" },
{ url = "https://files.pythonhosted.org/packages/97/cc/48d232f33d60e2e2e0b42f4e73455b146b76ebe216487e862700457fbf3c/uvloop-0.22.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:daf620c2995d193449393d6c62131b3fbd40a63bf7b307a1527856ace637fe88", size = 4328384, upload-time = "2025-10-16T22:16:59.36Z" },
{ url = "https://files.pythonhosted.org/packages/e4/16/c1fd27e9549f3c4baf1dc9c20c456cd2f822dbf8de9f463824b0c0357e06/uvloop-0.22.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6cde23eeda1a25c75b2e07d39970f3374105d5eafbaab2a4482be82f272d5a5e", size = 4296730, upload-time = "2025-10-16T22:17:00.744Z" },
]

[[package]]
name = "watchfiles"
version = "1.1.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "anyio" },
]
sdist = { url = "https://files.pythonhosted.org/packages/c2/c9/8869df9b2a2d6c59d79220a4db37679e74f807c559ffe5265e08b227a210/watchfiles-1.1.1.tar.gz", hash = "sha256:a173cb5c16c4f40ab19cecf48a534c409f7ea983ab8fed0741304a1c0a31b3f2", size = 94440, upload-time = "2025-10-14T15:06:21.08Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/74/d5/f039e7e3c639d9b1d09b07ea412a6806d38123f0508e5f9b48a87b0a76cc/watchfiles-1.1.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:8c89f9f2f740a6b7dcc753140dd5e1ab9215966f7a3530d0c0705c83b401bd7d", size = 404745, upload-time = "2025-10-14T15:04:46.731Z" },
{ url = "https://files.pythonhosted.org/packages/a5/96/a881a13aa1349827490dab2d363c8039527060cfcc2c92cc6d13d1b1049e/watchfiles-1.1.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:bd404be08018c37350f0d6e34676bd1e2889990117a2b90070b3007f172d0610", size = 391769, upload-time = "2025-10-14T15:04:48.003Z" },
{ url = "https://files.pythonhosted.org/packages/4b/5b/d3b460364aeb8da471c1989238ea0e56bec24b6042a68046adf3d9ddb01c/watchfiles-1.1.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8526e8f916bb5b9a0a777c8317c23ce65de259422bba5b31325a6fa6029d33af", size = 449374, upload-time = "2025-10-14T15:04:49.179Z" },
{ url = "https://files.pythonhosted.org/packages/b9/44/5769cb62d4ed055cb17417c0a109a92f007114a4e07f30812a73a4efdb11/watchfiles-1.1.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2edc3553362b1c38d9f06242416a5d8e9fe235c204a4072e988ce2e5bb1f69f6", size = 459485, upload-time = "2025-10-14T15:04:50.155Z" },
{ url = "https://files.pythonhosted.org/packages/19/0c/286b6301ded2eccd4ffd0041a1b726afda999926cf720aab63adb68a1e36/watchfiles-1.1.1-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:30f7da3fb3f2844259cba4720c3fc7138eb0f7b659c38f3bfa65084c7fc7abce", size = 488813, upload-time = "2025-10-14T15:04:51.059Z" },
{ url = "https://files.pythonhosted.org/packages/c7/2b/8530ed41112dd4a22f4dcfdb5ccf6a1baad1ff6eed8dc5a5f09e7e8c41c7/watchfiles-1.1.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f8979280bdafff686ba5e4d8f97840f929a87ed9cdf133cbbd42f7766774d2aa", size = 594816, upload-time = "2025-10-14T15:04:52.031Z" },
{ url = "https://files.pythonhosted.org/packages/ce/d2/f5f9fb49489f184f18470d4f99f4e862a4b3e9ac2865688eb2099e3d837a/watchfiles-1.1.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dcc5c24523771db3a294c77d94771abcfcb82a0e0ee8efd910c37c59ec1b31bb", size = 475186, upload-time = "2025-10-14T15:04:53.064Z" },
{ url = "https://files.pythonhosted.org/packages/cf/68/5707da262a119fb06fbe214d82dd1fe4a6f4af32d2d14de368d0349eb52a/watchfiles-1.1.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1db5d7ae38ff20153d542460752ff397fcf5c96090c1230803713cf3147a6803", size = 456812, upload-time = "2025-10-14T15:04:55.174Z" },
{ url = "https://files.pythonhosted.org/packages/66/ab/3cbb8756323e8f9b6f9acb9ef4ec26d42b2109bce830cc1f3468df20511d/watchfiles-1.1.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:28475ddbde92df1874b6c5c8aaeb24ad5be47a11f87cde5a28ef3835932e3e94", size = 630196, upload-time = "2025-10-14T15:04:56.22Z" },
{ url = "https://files.pythonhosted.org/packages/78/46/7152ec29b8335f80167928944a94955015a345440f524d2dfe63fc2f437b/watchfiles-1.1.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:36193ed342f5b9842edd3532729a2ad55c4160ffcfa3700e0d54be496b70dd43", size = 622657, upload-time = "2025-10-14T15:04:57.521Z" },
{ url = "https://files.pythonhosted.org/packages/0a/bf/95895e78dd75efe9a7f31733607f384b42eb5feb54bd2eb6ed57cc2e94f4/watchfiles-1.1.1-cp312-cp312-win32.whl", hash = "sha256:859e43a1951717cc8de7f4c77674a6d389b106361585951d9e69572823f311d9", size = 272042, upload-time = "2025-10-14T15:04:59.046Z" },
{ url = "https://files.pythonhosted.org/packages/87/0a/90eb755f568de2688cb220171c4191df932232c20946966c27a59c400850/watchfiles-1.1.1-cp312-cp312-win_amd64.whl", hash = "sha256:91d4c9a823a8c987cce8fa2690923b069966dabb196dd8d137ea2cede885fde9", size = 288410, upload-time = "2025-10-14T15:05:00.081Z" },
{ url = "https://files.pythonhosted.org/packages/36/76/f322701530586922fbd6723c4f91ace21364924822a8772c549483abed13/watchfiles-1.1.1-cp312-cp312-win_arm64.whl", hash = "sha256:a625815d4a2bdca61953dbba5a39d60164451ef34c88d751f6c368c3ea73d404", size = 278209, upload-time = "2025-10-14T15:05:01.168Z" },
{ url = "https://files.pythonhosted.org/packages/bb/f4/f750b29225fe77139f7ae5de89d4949f5a99f934c65a1f1c0b248f26f747/watchfiles-1.1.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:130e4876309e8686a5e37dba7d5e9bc77e6ed908266996ca26572437a5271e18", size = 404321, upload-time = "2025-10-14T15:05:02.063Z" },
{ url = "https://files.pythonhosted.org/packages/2b/f9/f07a295cde762644aa4c4bb0f88921d2d141af45e735b965fb2e87858328/watchfiles-1.1.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:5f3bde70f157f84ece3765b42b4a52c6ac1a50334903c6eaf765362f6ccca88a", size = 391783, upload-time = "2025-10-14T15:05:03.052Z" },
{ url = "https://files.pythonhosted.org/packages/bc/11/fc2502457e0bea39a5c958d86d2cb69e407a4d00b85735ca724bfa6e0d1a/watchfiles-1.1.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:14e0b1fe858430fc0251737ef3824c54027bedb8c37c38114488b8e131cf8219", size = 449279, upload-time = "2025-10-14T15:05:04.004Z" },
{ url = "https://files.pythonhosted.org/packages/e3/1f/d66bc15ea0b728df3ed96a539c777acfcad0eb78555ad9efcaa1274688f0/watchfiles-1.1.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f27db948078f3823a6bb3b465180db8ebecf26dd5dae6f6180bd87383b6b4428", size = 459405, upload-time = "2025-10-14T15:05:04.942Z" },
{ url = "https://files.pythonhosted.org/packages/be/90/9f4a65c0aec3ccf032703e6db02d89a157462fbb2cf20dd415128251cac0/watchfiles-1.1.1-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:059098c3a429f62fc98e8ec62b982230ef2c8df68c79e826e37b895bc359a9c0", size = 488976, upload-time = "2025-10-14T15:05:05.905Z" },
{ url = "https://files.pythonhosted.org/packages/37/57/ee347af605d867f712be7029bb94c8c071732a4b44792e3176fa3c612d39/watchfiles-1.1.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bfb5862016acc9b869bb57284e6cb35fdf8e22fe59f7548858e2f971d045f150", size = 595506, upload-time = "2025-10-14T15:05:06.906Z" },
{ url = "https://files.pythonhosted.org/packages/a8/78/cc5ab0b86c122047f75e8fc471c67a04dee395daf847d3e59381996c8707/watchfiles-1.1.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:319b27255aacd9923b8a276bb14d21a5f7ff82564c744235fc5eae58d95422ae", size = 474936, upload-time = "2025-10-14T15:05:07.906Z" },
{ url = "https://files.pythonhosted.org/packages/62/da/def65b170a3815af7bd40a3e7010bf6ab53089ef1b75d05dd5385b87cf08/watchfiles-1.1.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c755367e51db90e75b19454b680903631d41f9e3607fbd941d296a020c2d752d", size = 456147, upload-time = "2025-10-14T15:05:09.138Z" },
{ url = "https://files.pythonhosted.org/packages/57/99/da6573ba71166e82d288d4df0839128004c67d2778d3b566c138695f5c0b/watchfiles-1.1.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c22c776292a23bfc7237a98f791b9ad3144b02116ff10d820829ce62dff46d0b", size = 630007, upload-time = "2025-10-14T15:05:10.117Z" },
{ url = "https://files.pythonhosted.org/packages/a8/51/7439c4dd39511368849eb1e53279cd3454b4a4dbace80bab88feeb83c6b5/watchfiles-1.1.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:3a476189be23c3686bc2f4321dd501cb329c0a0469e77b7b534ee10129ae6374", size = 622280, upload-time = "2025-10-14T15:05:11.146Z" },
{ url = "https://files.pythonhosted.org/packages/95/9c/8ed97d4bba5db6fdcdb2b298d3898f2dd5c20f6b73aee04eabe56c59677e/watchfiles-1.1.1-cp313-cp313-win32.whl", hash = "sha256:bf0a91bfb5574a2f7fc223cf95eeea79abfefa404bf1ea5e339c0c1560ae99a0", size = 272056, upload-time = "2025-10-14T15:05:12.156Z" },
{ url = "https://files.pythonhosted.org/packages/1f/f3/c14e28429f744a260d8ceae18bf58c1d5fa56b50d006a7a9f80e1882cb0d/watchfiles-1.1.1-cp313-cp313-win_amd64.whl", hash = "sha256:52e06553899e11e8074503c8e716d574adeeb7e68913115c4b3653c53f9bae42", size = 288162, upload-time = "2025-10-14T15:05:13.208Z" },
{ url = "https://files.pythonhosted.org/packages/dc/61/fe0e56c40d5cd29523e398d31153218718c5786b5e636d9ae8ae79453d27/watchfiles-1.1.1-cp313-cp313-win_arm64.whl", hash = "sha256:ac3cc5759570cd02662b15fbcd9d917f7ecd47efe0d6b40474eafd246f91ea18", size = 277909, upload-time = "2025-10-14T15:05:14.49Z" },
{ url = "https://files.pythonhosted.org/packages/79/42/e0a7d749626f1e28c7108a99fb9bf524b501bbbeb9b261ceecde644d5a07/watchfiles-1.1.1-cp313-cp313t-macosx_10_12_x86_64.whl", hash = "sha256:563b116874a9a7ce6f96f87cd0b94f7faf92d08d0021e837796f0a14318ef8da", size = 403389, upload-time = "2025-10-14T15:05:15.777Z" },
{ url = "https://files.pythonhosted.org/packages/15/49/08732f90ce0fbbc13913f9f215c689cfc9ced345fb1bcd8829a50007cc8d/watchfiles-1.1.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3ad9fe1dae4ab4212d8c91e80b832425e24f421703b5a42ef2e4a1e215aff051", size = 389964, upload-time = "2025-10-14T15:05:16.85Z" },
{ url = "https://files.pythonhosted.org/packages/27/0d/7c315d4bd5f2538910491a0393c56bf70d333d51bc5b34bee8e68e8cea19/watchfiles-1.1.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ce70f96a46b894b36eba678f153f052967a0d06d5b5a19b336ab0dbbd029f73e", size = 448114, upload-time = "2025-10-14T15:05:17.876Z" },
{ url = "https://files.pythonhosted.org/packages/c3/24/9e096de47a4d11bc4df41e9d1e61776393eac4cb6eb11b3e23315b78b2cc/watchfiles-1.1.1-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:cb467c999c2eff23a6417e58d75e5828716f42ed8289fe6b77a7e5a91036ca70", size = 460264, upload-time = "2025-10-14T15:05:18.962Z" },
{ url = "https://files.pythonhosted.org/packages/cc/0f/e8dea6375f1d3ba5fcb0b3583e2b493e77379834c74fd5a22d66d85d6540/watchfiles-1.1.1-cp313-cp313t-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:836398932192dae4146c8f6f737d74baeac8b70ce14831a239bdb1ca882fc261", size = 487877, upload-time = "2025-10-14T15:05:20.094Z" },
{ url = "https://files.pythonhosted.org/packages/ac/5b/df24cfc6424a12deb41503b64d42fbea6b8cb357ec62ca84a5a3476f654a/watchfiles-1.1.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:743185e7372b7bc7c389e1badcc606931a827112fbbd37f14c537320fca08620", size = 595176, upload-time = "2025-10-14T15:05:21.134Z" },
{ url = "https://files.pythonhosted.org/packages/8f/b5/853b6757f7347de4e9b37e8cc3289283fb983cba1ab4d2d7144694871d9c/watchfiles-1.1.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:afaeff7696e0ad9f02cbb8f56365ff4686ab205fcf9c4c5b6fdfaaa16549dd04", size = 473577, upload-time = "2025-10-14T15:05:22.306Z" },
{ url = "https://files.pythonhosted.org/packages/e1/f7/0a4467be0a56e80447c8529c9fce5b38eab4f513cb3d9bf82e7392a5696b/watchfiles-1.1.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3f7eb7da0eb23aa2ba036d4f616d46906013a68caf61b7fdbe42fc8b25132e77", size = 455425, upload-time = "2025-10-14T15:05:23.348Z" },
{ url = "https://files.pythonhosted.org/packages/8e/e0/82583485ea00137ddf69bc84a2db88bd92ab4a6e3c405e5fb878ead8d0e7/watchfiles-1.1.1-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:831a62658609f0e5c64178211c942ace999517f5770fe9436be4c2faeba0c0ef", size = 628826, upload-time = "2025-10-14T15:05:24.398Z" },
{ url = "https://files.pythonhosted.org/packages/28/9a/a785356fccf9fae84c0cc90570f11702ae9571036fb25932f1242c82191c/watchfiles-1.1.1-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:f9a2ae5c91cecc9edd47e041a930490c31c3afb1f5e6d71de3dc671bfaca02bf", size = 622208, upload-time = "2025-10-14T15:05:25.45Z" },
{ url = "https://files.pythonhosted.org/packages/c3/f4/0872229324ef69b2c3edec35e84bd57a1289e7d3fe74588048ed8947a323/watchfiles-1.1.1-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:d1715143123baeeaeadec0528bb7441103979a1d5f6fd0e1f915383fea7ea6d5", size = 404315, upload-time = "2025-10-14T15:05:26.501Z" },
{ url = "https://files.pythonhosted.org/packages/7b/22/16d5331eaed1cb107b873f6ae1b69e9ced582fcf0c59a50cd84f403b1c32/watchfiles-1.1.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:39574d6370c4579d7f5d0ad940ce5b20db0e4117444e39b6d8f99db5676c52fd", size = 390869, upload-time = "2025-10-14T15:05:27.649Z" },
{ url = "https://files.pythonhosted.org/packages/b2/7e/5643bfff5acb6539b18483128fdc0ef2cccc94a5b8fbda130c823e8ed636/watchfiles-1.1.1-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7365b92c2e69ee952902e8f70f3ba6360d0d596d9299d55d7d386df84b6941fb", size = 449919, upload-time = "2025-10-14T15:05:28.701Z" },
{ url = "https://files.pythonhosted.org/packages/51/2e/c410993ba5025a9f9357c376f48976ef0e1b1aefb73b97a5ae01a5972755/watchfiles-1.1.1-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bfff9740c69c0e4ed32416f013f3c45e2ae42ccedd1167ef2d805c000b6c71a5", size = 460845, upload-time = "2025-10-14T15:05:30.064Z" },
{ url = "https://files.pythonhosted.org/packages/8e/a4/2df3b404469122e8680f0fcd06079317e48db58a2da2950fb45020947734/watchfiles-1.1.1-cp314-cp314-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b27cf2eb1dda37b2089e3907d8ea92922b673c0c427886d4edc6b94d8dfe5db3", size = 489027, upload-time = "2025-10-14T15:05:31.064Z" },
{ url = "https://files.pythonhosted.org/packages/ea/84/4587ba5b1f267167ee715b7f66e6382cca6938e0a4b870adad93e44747e6/watchfiles-1.1.1-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:526e86aced14a65a5b0ec50827c745597c782ff46b571dbfe46192ab9e0b3c33", size = 595615, upload-time = "2025-10-14T15:05:32.074Z" },
{ url = "https://files.pythonhosted.org/packages/6a/0f/c6988c91d06e93cd0bb3d4a808bcf32375ca1904609835c3031799e3ecae/watchfiles-1.1.1-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:04e78dd0b6352db95507fd8cb46f39d185cf8c74e4cf1e4fbad1d3df96faf510", size = 474836, upload-time = "2025-10-14T15:05:33.209Z" },
{ url = "https://files.pythonhosted.org/packages/b4/36/ded8aebea91919485b7bbabbd14f5f359326cb5ec218cd67074d1e426d74/watchfiles-1.1.1-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5c85794a4cfa094714fb9c08d4a218375b2b95b8ed1666e8677c349906246c05", size = 455099, upload-time = "2025-10-14T15:05:34.189Z" },
{ url = "https://files.pythonhosted.org/packages/98/e0/8c9bdba88af756a2fce230dd365fab2baf927ba42cd47521ee7498fd5211/watchfiles-1.1.1-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:74d5012b7630714b66be7b7b7a78855ef7ad58e8650c73afc4c076a1f480a8d6", size = 630626, upload-time = "2025-10-14T15:05:35.216Z" },
{ url = "https://files.pythonhosted.org/packages/2a/84/a95db05354bf2d19e438520d92a8ca475e578c647f78f53197f5a2f17aaf/watchfiles-1.1.1-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:8fbe85cb3201c7d380d3d0b90e63d520f15d6afe217165d7f98c9c649654db81", size = 622519, upload-time = "2025-10-14T15:05:36.259Z" },
{ url = "https://files.pythonhosted.org/packages/1d/ce/d8acdc8de545de995c339be67711e474c77d643555a9bb74a9334252bd55/watchfiles-1.1.1-cp314-cp314-win32.whl", hash = "sha256:3fa0b59c92278b5a7800d3ee7733da9d096d4aabcfabb9a928918bd276ef9b9b", size = 272078, upload-time = "2025-10-14T15:05:37.63Z" },
{ url = "https://files.pythonhosted.org/packages/c4/c9/a74487f72d0451524be827e8edec251da0cc1fcf111646a511ae752e1a3d/watchfiles-1.1.1-cp314-cp314-win_amd64.whl", hash = "sha256:c2047d0b6cea13b3316bdbafbfa0c4228ae593d995030fda39089d36e64fc03a", size = 287664, upload-time = "2025-10-14T15:05:38.95Z" },
{ url = "https://files.pythonhosted.org/packages/df/b8/8ac000702cdd496cdce998c6f4ee0ca1f15977bba51bdf07d872ebdfc34c/watchfiles-1.1.1-cp314-cp314-win_arm64.whl", hash = "sha256:842178b126593addc05acf6fce960d28bc5fae7afbaa2c6c1b3a7b9460e5be02", size = 277154, upload-time = "2025-10-14T15:05:39.954Z" },
{ url = "https://files.pythonhosted.org/packages/47/a8/e3af2184707c29f0f14b1963c0aace6529f9d1b8582d5b99f31bbf42f59e/watchfiles-1.1.1-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:88863fbbc1a7312972f1c511f202eb30866370ebb8493aef2812b9ff28156a21", size = 403820, upload-time = "2025-10-14T15:05:40.932Z" },
{ url = "https://files.pythonhosted.org/packages/c0/ec/e47e307c2f4bd75f9f9e8afbe3876679b18e1bcec449beca132a1c5ffb2d/watchfiles-1.1.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:55c7475190662e202c08c6c0f4d9e345a29367438cf8e8037f3155e10a88d5a5", size = 390510, upload-time = "2025-10-14T15:05:41.945Z" },
{ url = "https://files.pythonhosted.org/packages/d5/a0/ad235642118090f66e7b2f18fd5c42082418404a79205cdfca50b6309c13/watchfiles-1.1.1-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3f53fa183d53a1d7a8852277c92b967ae99c2d4dcee2bfacff8868e6e30b15f7", size = 448408, upload-time = "2025-10-14T15:05:43.385Z" },
{ url = "https://files.pythonhosted.org/packages/df/85/97fa10fd5ff3332ae17e7e40e20784e419e28521549780869f1413742e9d/watchfiles-1.1.1-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6aae418a8b323732fa89721d86f39ec8f092fc2af67f4217a2b07fd3e93c6101", size = 458968, upload-time = "2025-10-14T15:05:44.404Z" },
{ url = "https://files.pythonhosted.org/packages/47/c2/9059c2e8966ea5ce678166617a7f75ecba6164375f3b288e50a40dc6d489/watchfiles-1.1.1-cp314-cp314t-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f096076119da54a6080e8920cbdaac3dbee667eb91dcc5e5b78840b87415bd44", size = 488096, upload-time = "2025-10-14T15:05:45.398Z" },
{ url = "https://files.pythonhosted.org/packages/94/44/d90a9ec8ac309bc26db808a13e7bfc0e4e78b6fc051078a554e132e80160/watchfiles-1.1.1-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:00485f441d183717038ed2e887a7c868154f216877653121068107b227a2f64c", size = 596040, upload-time = "2025-10-14T15:05:46.502Z" },
{ url = "https://files.pythonhosted.org/packages/95/68/4e3479b20ca305cfc561db3ed207a8a1c745ee32bf24f2026a129d0ddb6e/watchfiles-1.1.1-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a55f3e9e493158d7bfdb60a1165035f1cf7d320914e7b7ea83fe22c6023b58fc", size = 473847, upload-time = "2025-10-14T15:05:47.484Z" },
{ url = "https://files.pythonhosted.org/packages/4f/55/2af26693fd15165c4ff7857e38330e1b61ab8c37d15dc79118cdba115b7a/watchfiles-1.1.1-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c91ed27800188c2ae96d16e3149f199d62f86c7af5f5f4d2c61a3ed8cd3666c", size = 455072, upload-time = "2025-10-14T15:05:48.928Z" },
{ url = "https://files.pythonhosted.org/packages/66/1d/d0d200b10c9311ec25d2273f8aad8c3ef7cc7ea11808022501811208a750/watchfiles-1.1.1-cp314-cp314t-musllinux_1_1_aarch64.whl", hash = "sha256:311ff15a0bae3714ffb603e6ba6dbfba4065ab60865d15a6ec544133bdb21099", size = 629104, upload-time = "2025-10-14T15:05:49.908Z" },
{ url = "https://files.pythonhosted.org/packages/e3/bd/fa9bb053192491b3867ba07d2343d9f2252e00811567d30ae8d0f78136fe/watchfiles-1.1.1-cp314-cp314t-musllinux_1_1_x86_64.whl", hash = "sha256:a916a2932da8f8ab582f242c065f5c81bed3462849ca79ee357dd9551b0e9b01", size = 622112, upload-time = "2025-10-14T15:05:50.941Z" },
]

[[package]]
name = "websockets"
version = "15.0.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/21/e6/26d09fab466b7ca9c7737474c52be4f76a40301b08362eb2dbc19dcc16c1/websockets-15.0.1.tar.gz", hash = "sha256:82544de02076bafba038ce055ee6412d68da13ab47f0c60cab827346de828dee", size = 177016, upload-time = "2025-03-05T20:03:41.606Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/51/6b/4545a0d843594f5d0771e86463606a3988b5a09ca5123136f8a76580dd63/websockets-15.0.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:3e90baa811a5d73f3ca0bcbf32064d663ed81318ab225ee4f427ad4e26e5aff3", size = 175437, upload-time = "2025-03-05T20:02:16.706Z" },
{ url = "https://files.pythonhosted.org/packages/f4/71/809a0f5f6a06522af902e0f2ea2757f71ead94610010cf570ab5c98e99ed/websockets-15.0.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:592f1a9fe869c778694f0aa806ba0374e97648ab57936f092fd9d87f8bc03665", size = 173096, upload-time = "2025-03-05T20:02:18.832Z" },
{ url = "https://files.pythonhosted.org/packages/3d/69/1a681dd6f02180916f116894181eab8b2e25b31e484c5d0eae637ec01f7c/websockets-15.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0701bc3cfcb9164d04a14b149fd74be7347a530ad3bbf15ab2c678a2cd3dd9a2", size = 173332, upload-time = "2025-03-05T20:02:20.187Z" },
{ url = "https://files.pythonhosted.org/packages/a6/02/0073b3952f5bce97eafbb35757f8d0d54812b6174ed8dd952aa08429bcc3/websockets-15.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e8b56bdcdb4505c8078cb6c7157d9811a85790f2f2b3632c7d1462ab5783d215", size = 183152, upload-time = "2025-03-05T20:02:22.286Z" },
{ url = "https://files.pythonhosted.org/packages/74/45/c205c8480eafd114b428284840da0b1be9ffd0e4f87338dc95dc6ff961a1/websockets-15.0.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0af68c55afbd5f07986df82831c7bff04846928ea8d1fd7f30052638788bc9b5", size = 182096, upload-time = "2025-03-05T20:02:24.368Z" },
{ url = "https://files.pythonhosted.org/packages/14/8f/aa61f528fba38578ec553c145857a181384c72b98156f858ca5c8e82d9d3/websockets-15.0.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:64dee438fed052b52e4f98f76c5790513235efaa1ef7f3f2192c392cd7c91b65", size = 182523, upload-time = "2025-03-05T20:02:25.669Z" },
{ url = "https://files.pythonhosted.org/packages/ec/6d/0267396610add5bc0d0d3e77f546d4cd287200804fe02323797de77dbce9/websockets-15.0.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d5f6b181bb38171a8ad1d6aa58a67a6aa9d4b38d0f8c5f496b9e42561dfc62fe", size = 182790, upload-time = "2025-03-05T20:02:26.99Z" },
{ url = "https://files.pythonhosted.org/packages/02/05/c68c5adbf679cf610ae2f74a9b871ae84564462955d991178f95a1ddb7dd/websockets-15.0.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:5d54b09eba2bada6011aea5375542a157637b91029687eb4fdb2dab11059c1b4", size = 182165, upload-time = "2025-03-05T20:02:30.291Z" },
{ url = "https://files.pythonhosted.org/packages/29/93/bb672df7b2f5faac89761cb5fa34f5cec45a4026c383a4b5761c6cea5c16/websockets-15.0.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3be571a8b5afed347da347bfcf27ba12b069d9d7f42cb8c7028b5e98bbb12597", size = 182160, upload-time = "2025-03-05T20:02:31.634Z" },
{ url = "https://files.pythonhosted.org/packages/ff/83/de1f7709376dc3ca9b7eeb4b9a07b4526b14876b6d372a4dc62312bebee0/websockets-15.0.1-cp312-cp312-win32.whl", hash = "sha256:c338ffa0520bdb12fbc527265235639fb76e7bc7faafbb93f6ba80d9c06578a9", size = 176395, upload-time = "2025-03-05T20:02:33.017Z" },
{ url = "https://files.pythonhosted.org/packages/7d/71/abf2ebc3bbfa40f391ce1428c7168fb20582d0ff57019b69ea20fa698043/websockets-15.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:fcd5cf9e305d7b8338754470cf69cf81f420459dbae8a3b40cee57417f4614a7", size = 176841, upload-time = "2025-03-05T20:02:34.498Z" },
{ url = "https://files.pythonhosted.org/packages/cb/9f/51f0cf64471a9d2b4d0fc6c534f323b664e7095640c34562f5182e5a7195/websockets-15.0.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ee443ef070bb3b6ed74514f5efaa37a252af57c90eb33b956d35c8e9c10a1931", size = 175440, upload-time = "2025-03-05T20:02:36.695Z" },
{ url = "https://files.pythonhosted.org/packages/8a/05/aa116ec9943c718905997412c5989f7ed671bc0188ee2ba89520e8765d7b/websockets-15.0.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:5a939de6b7b4e18ca683218320fc67ea886038265fd1ed30173f5ce3f8e85675", size = 173098, upload-time = "2025-03-05T20:02:37.985Z" },
{ url = "https://files.pythonhosted.org/packages/ff/0b/33cef55ff24f2d92924923c99926dcce78e7bd922d649467f0eda8368923/websockets-15.0.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:746ee8dba912cd6fc889a8147168991d50ed70447bf18bcda7039f7d2e3d9151", size = 173329, upload-time = "2025-03-05T20:02:39.298Z" },
{ url = "https://files.pythonhosted.org/packages/31/1d/063b25dcc01faa8fada1469bdf769de3768b7044eac9d41f734fd7b6ad6d/websockets-15.0.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:595b6c3969023ecf9041b2936ac3827e4623bfa3ccf007575f04c5a6aa318c22", size = 183111, upload-time = "2025-03-05T20:02:40.595Z" },
{ url = "https://files.pythonhosted.org/packages/93/53/9a87ee494a51bf63e4ec9241c1ccc4f7c2f45fff85d5bde2ff74fcb68b9e/websockets-15.0.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c714d2fc58b5ca3e285461a4cc0c9a66bd0e24c5da9911e30158286c9b5be7f", size = 182054, upload-time = "2025-03-05T20:02:41.926Z" },
{ url = "https://files.pythonhosted.org/packages/ff/b2/83a6ddf56cdcbad4e3d841fcc55d6ba7d19aeb89c50f24dd7e859ec0805f/websockets-15.0.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f3c1e2ab208db911594ae5b4f79addeb3501604a165019dd221c0bdcabe4db8", size = 182496, upload-time = "2025-03-05T20:02:43.304Z" },
{ url = "https://files.pythonhosted.org/packages/98/41/e7038944ed0abf34c45aa4635ba28136f06052e08fc2168520bb8b25149f/websockets-15.0.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:229cf1d3ca6c1804400b0a9790dc66528e08a6a1feec0d5040e8b9eb14422375", size = 182829, upload-time = "2025-03-05T20:02:48.812Z" },
{ url = "https://files.pythonhosted.org/packages/e0/17/de15b6158680c7623c6ef0db361da965ab25d813ae54fcfeae2e5b9ef910/websockets-15.0.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:756c56e867a90fb00177d530dca4b097dd753cde348448a1012ed6c5131f8b7d", size = 182217, upload-time = "2025-03-05T20:02:50.14Z" },
{ url = "https://files.pythonhosted.org/packages/33/2b/1f168cb6041853eef0362fb9554c3824367c5560cbdaad89ac40f8c2edfc/websockets-15.0.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:558d023b3df0bffe50a04e710bc87742de35060580a293c2a984299ed83bc4e4", size = 182195, upload-time = "2025-03-05T20:02:51.561Z" },
{ url = "https://files.pythonhosted.org/packages/86/eb/20b6cdf273913d0ad05a6a14aed4b9a85591c18a987a3d47f20fa13dcc47/websockets-15.0.1-cp313-cp313-win32.whl", hash = "sha256:ba9e56e8ceeeedb2e080147ba85ffcd5cd0711b89576b83784d8605a7df455fa", size = 176393, upload-time = "2025-03-05T20:02:53.814Z" },
{ url = "https://files.pythonhosted.org/packages/1b/6c/c65773d6cab416a64d191d6ee8a8b1c68a09970ea6909d16965d26bfed1e/websockets-15.0.1-cp313-cp313-win_amd64.whl", hash = "sha256:e09473f095a819042ecb2ab9465aee615bd9c2028e4ef7d933600a8401c79561", size = 176837, upload-time = "2025-03-05T20:02:55.237Z" },
{ url = "https://files.pythonhosted.org/packages/fa/a8/5b41e0da817d64113292ab1f8247140aac61cbf6cfd085d6a0fa77f4984f/websockets-15.0.1-py3-none-any.whl", hash = "sha256:f7a866fbc1e97b5c617ee4116daaa09b722101d4a3c170c787450ba409f9736f", size = 169743, upload-time = "2025-03-05T20:03:39.41Z" },
]
231
app/docker-compose.yml
Normal file
@@ -0,0 +1,231 @@
# COMPOSE_PROFILES controls which services start:
# - dev : laptop-friendly; mounts source, exposes localhost ports, enables reloaders
# - prod : VPS-friendly; no host ports for app containers, joins proxy network for Caddy

x-service-defaults: &service-defaults
  env_file: .env
  restart: ${DOCKER_RESTART_POLICY:-unless-stopped}
  networks:
    - app
  logging:
    driver: "json-file"
    options:
      max-size: "10m"
      max-file: "3"

x-backend-common: &backend-common
  <<: *service-defaults
  depends_on:
    postgres:
      condition: service_healthy

services:
  # --------------------------------------------------------------------------
  # Next.js Frontend
  # --------------------------------------------------------------------------
  frontend:
    <<: *service-defaults
    profiles: [prod]
    container_name: frontend
    build:
      context: ./frontend
      dockerfile: Dockerfile
      target: runner
    environment:
      NODE_ENV: production
      NEXT_TELEMETRY_DISABLED: 1
      PORT: 3000
    expose:
      - "3000"
    depends_on:
      backend:
        condition: service_healthy
    healthcheck:
      test:
        [
          "CMD-SHELL",
          "node -e \"require('http').get('http://127.0.0.1:3000', (res) => { process.exit(res.statusCode < 500 ? 0 : 1); }).on('error', () => process.exit(1));\"",
        ]
      interval: 30s
      timeout: 5s
      retries: 5
      start_period: 10s
    networks:
      - app
      - proxy
    # Caddy from the infra stack reverse-proxies to this container on the proxy network.

  frontend-dev:
    <<: *service-defaults
    profiles: [dev]
    container_name: frontend-dev
    build:
      context: ./frontend
      dockerfile: Dockerfile
      target: dev
    command: ["npm", "run", "dev", "--", "--hostname", "0.0.0.0", "--port", "3000"]
    environment:
      NODE_ENV: development
      NEXT_TELEMETRY_DISABLED: 1
    ports:
      - "3000:3000" # Localhost access during development
    volumes:
      - ./frontend:/app
      - frontend-node_modules:/app/node_modules
    depends_on:
      backend-dev:
        condition: service_healthy
    networks:
      app:
        aliases:
          - frontend

  # --------------------------------------------------------------------------
  # FastAPI Backend
  # --------------------------------------------------------------------------
  backend:
    <<: *backend-common
    profiles: [prod]
    container_name: backend
    build:
      context: ./backend
      dockerfile: Dockerfile
      target: production
    environment:
      ENVIRONMENT: production
    command:
      [
        "gunicorn",
        "main:app",
        "-k",
        "uvicorn.workers.UvicornWorker",
        "-w",
        "${GUNICORN_WORKERS:-4}",
        "--bind",
        "0.0.0.0:8000",
        "--access-logfile",
        "-",
        "--error-logfile",
        "-",
        "--timeout",
        "${GUNICORN_TIMEOUT:-120}",
      ]
    expose:
      - "8000"
    healthcheck:
      test:
        [
          "CMD-SHELL",
          "python -c \"import urllib.request; urllib.request.urlopen('http://127.0.0.1:8000/health/live').close()\"",
        ]
      interval: 30s
      timeout: 5s
      retries: 5
      start_period: 10s
    networks:
      - app
      - proxy

  backend-dev:
    <<: *backend-common
    profiles: [dev]
    container_name: backend-dev
    build:
      context: ./backend
      dockerfile: Dockerfile
      target: development
    environment:
      ENVIRONMENT: development
      PYTHONPATH: /app
    command: ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000", "--reload"]
    ports:
      - "8000:8000" # Localhost access during development
    volumes:
      - ./backend:/app
      - backend-venv:/app/.venv
    healthcheck:
      test:
        [
          "CMD-SHELL",
          "python -c \"import urllib.request; urllib.request.urlopen('http://127.0.0.1:8000/health/live').close()\"",
        ]
      interval: 15s
      timeout: 5s
      retries: 5
      start_period: 10s
    networks:
      app:
        aliases:
          - backend

  # --------------------------------------------------------------------------
  # PostgreSQL + pgvector
  # --------------------------------------------------------------------------
  postgres:
    <<: *service-defaults
    container_name: postgres
    image: pgvector/pgvector:pg16
    profiles:
      - dev
      - prod
    volumes:
      - postgres-data:/var/lib/postgresql/data
    ports:
      - "127.0.0.1:${POSTGRES_PORT:-5432}:5432" # Local-only binding keeps DB off the public interface
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-postgres} -d ${POSTGRES_DB:-postgres}"]
      interval: 10s
      timeout: 5s
      retries: 5

  # --------------------------------------------------------------------------
  # LiveKit Real-Time Server
  # --------------------------------------------------------------------------
  livekit:
    <<: *service-defaults
    container_name: livekit
    image: livekit/livekit-server:latest
    profiles:
      - dev
      - prod
    # UDP/TCP ports remain published in prod so external clients can complete WebRTC/TURN;
    # Caddy still proxies signaling over the shared proxy network.
    environment:
      LIVEKIT_KEYS: "${LIVEKIT_API_KEY}:${LIVEKIT_API_SECRET}"
      LIVEKIT_PORT: 7880
      LIVEKIT_LOG_LEVEL: ${LIVEKIT_LOG_LEVEL:-info}
    command:
      [
        "livekit-server",
        "--dev",
        "--port",
        "7880",
        "--rtc.port-range-start",
        "${LIVEKIT_RTC_PORT_RANGE_START:-60000}",
        "--rtc.port-range-end",
        "${LIVEKIT_RTC_PORT_RANGE_END:-60100}",
      ]
    ports:
      - "7880:7880" # HTTP/WS signaling (Caddy terminates TLS)
      - "7881:7881" # TCP fallback for WebRTC
      - "3478:3478/udp" # TURN
      - "5349:5349/tcp" # TURN over TLS
      - "${LIVEKIT_RTC_PORT_RANGE_START:-60000}-${LIVEKIT_RTC_PORT_RANGE_END:-60100}:${LIVEKIT_RTC_PORT_RANGE_START:-60000}-${LIVEKIT_RTC_PORT_RANGE_END:-60100}/udp" # WebRTC media plane
    networks:
      - app
      - proxy

volumes:
  postgres-data:
  backend-venv:
  frontend-node_modules:

networks:
  app:
    name: app_network
  proxy:
    name: proxy
    # In prod, set PROXY_NETWORK_EXTERNAL=true so this attaches to the shared
    # Caddy network created by infra. In dev, leave false to let Compose create
    # a local network automatically.
    external: ${PROXY_NETWORK_EXTERNAL:-false}
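The frontend healthcheck above packs a Node HTTP probe into a single "node -e" string. Unrolled, it is roughly equivalent to the TypeScript sketch below (illustrative only; the compose file keeps it inline so the image does not need to ship an extra script, and the URL and status threshold simply mirror the YAML above):

// healthcheck.ts — sketch of the inline liveness probe used by the frontend service
import http from "node:http";

// Same local address the container probes; anything below HTTP 500 counts as alive.
http
  .get("http://127.0.0.1:3000", (res) => {
    process.exit((res.statusCode ?? 500) < 500 ? 0 : 1);
  })
  .on("error", () => process.exit(1));

Compose marks the container unhealthy after the configured number of consecutive failures, which is what gates the depends_on condition service_healthy ordering used by the prod frontend.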
75
app/frontend/Dockerfile
Normal file
@@ -0,0 +1,75 @@
#
# FRONTEND DOCKERFILE
#
# Multi-stage image for the Next.js SPA/SSR frontend.
# - runner: production server with minimal footprint
# - builder: compiles the Next.js app
# - dev: hot-reload friendly image
#
# COMPOSE_PROFILES decides which stage is used by docker-compose.yml.
#

FROM node:22-slim AS base
WORKDIR /app

ENV NPM_CONFIG_LOGLEVEL=warn \
    NODE_OPTIONS="--enable-source-maps" \
    NEXT_TELEMETRY_DISABLED=1

# ------------------------------------------------------------------------------
# Dependencies cache
# ------------------------------------------------------------------------------
FROM base AS deps
COPY package*.json ./
RUN if [ -f package-lock.json ]; then npm ci; else npm install; fi \
    && chown -R node:node /app

# ------------------------------------------------------------------------------
# Production dependencies only (pruned to omit dev tooling)
# ------------------------------------------------------------------------------
FROM base AS prod-deps
COPY package*.json ./
RUN if [ -f package-lock.json ]; then npm ci --omit=dev; else npm install --omit=dev; fi

# ------------------------------------------------------------------------------
# Builder: compile the application for production
# ------------------------------------------------------------------------------
FROM base AS builder

COPY --from=deps /app/node_modules ./node_modules
COPY . .
ENV NODE_ENV=production
ENV NEXT_TELEMETRY_DISABLED=1
RUN npm run build

# ------------------------------------------------------------------------------
# Production runner: serve the built Next.js app
# ------------------------------------------------------------------------------
FROM node:22-slim AS runner
WORKDIR /app

ENV NODE_ENV=production \
    NEXT_TELEMETRY_DISABLED=1

USER node
COPY --from=prod-deps --chown=node:node /app/node_modules ./node_modules
COPY --from=builder --chown=node:node /app/.next ./.next
COPY --from=builder --chown=node:node /app/public ./public
COPY --from=builder --chown=node:node /app/package.json ./package.json
COPY --from=builder --chown=node:node /app/package-lock.json ./package-lock.json

EXPOSE 3000
CMD ["npm", "run", "start"]

# ------------------------------------------------------------------------------
# Development: keeps node_modules and sources mounted for hot reload
# ------------------------------------------------------------------------------
FROM deps AS dev
WORKDIR /app

ENV NODE_ENV=development \
    NEXT_TELEMETRY_DISABLED=1

USER node
EXPOSE 3000
CMD ["npm", "run", "dev", "--", "--hostname", "0.0.0.0", "--port", "3000"]
35
app/frontend/app/globals.css
Normal file
@@ -0,0 +1,35 @@
@import "tailwindcss";

@source "./app/**/*.{js,ts,jsx,tsx,mdx}";
@source "./components/**/*.{js,ts,jsx,tsx,mdx}";

@theme {
  --color-ink: #0f172a;
  --color-inkSoft: #1e293b;
  --color-inkMuted: #64748b;
  --color-sand: #f7f7fb;
  --color-card: #ffffff;
  --color-success: #22c55e;
  --color-danger: #ef4444;
  --color-pulse: #f59e0b;
  --color-accent-blue: #60a5fa;
  --color-accent-mint: #34d399;
  --color-accent-coral: #fb7185;
  --font-sans: "Plus Jakarta Sans", "Inter", system-ui, -apple-system, "Segoe UI",
    sans-serif;
  --font-display: "Plus Jakarta Sans", "Inter", system-ui, -apple-system, "Segoe UI",
    sans-serif;
}

:root {
  color-scheme: light;
  background-color: #f7f7fb;
}

body {
  @apply min-h-screen bg-gradient-to-br from-[#f9fafb] via-[#f2f4f6] to-[#e7eaee] text-ink antialiased;
}

* {
  @apply selection:bg-blue-200 selection:text-ink;
}
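The @theme block above is what lets the TSX in this changeset use utilities such as text-ink, bg-sand, and text-inkMuted: Tailwind CSS v4 derives utility classes from each --color-* and --font-* token declared in @theme. A minimal sketch of how a component would consume these tokens (the component name is illustrative and not part of this diff):

// StatusBadge.tsx — hypothetical example of consuming the @theme color tokens
export function StatusBadge({ healthy }: { healthy: boolean }) {
  // text-success / text-danger resolve to --color-success / --color-danger above.
  return (
    <span className={healthy ? "text-success" : "text-danger"}>
      {healthy ? "Healthy" : "Degraded"}
    </span>
  );
}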
20
app/frontend/app/layout.tsx
Normal file
@@ -0,0 +1,20 @@
import "./globals.css";
import type { Metadata } from "next";
import type { ReactNode } from "react";

export const metadata: Metadata = {
  title: "avaaz.ai | Live Health Console",
  description: "Live ECG-style monitoring for avaaz.ai health endpoints.",
  icons: {
    icon: [{ url: "/favicon.png", type: "image/png" }],
    shortcut: ["/favicon.png"],
  },
};

export default function RootLayout({ children }: { children: ReactNode }) {
  return (
    <html lang="en">
      <body className="bg-sand font-sans text-ink">{children}</body>
    </html>
  );
}
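The page.tsx added next models health responses with a HealthSummary interface whose checks field accepts either a keyed object or an array of named entries. Hypothetical payloads that would satisfy that shape look like the sketch below (values and check names are illustrative, not captured from the running service):

// Assumed example payloads for the health endpoints polled by page.tsx below.
const checksAsRecord: unknown = {
  status: "pass",
  version: "1.0.0",
  checks: {
    database: { status: "pass", output: "connected" },
    livekit: { status: "pass" },
  },
};

const checksAsArray: unknown = {
  status: "pass",
  checks: [
    { name: "database", status: "pass", output: "connected" },
    { name: "livekit", status: "pass" },
  ],
};

Either payload would be treated as healthy by the isStatusHealthy helper in that file, since it accepts any object carrying a recognised status string.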
596
app/frontend/app/page.tsx
Normal file
@@ -0,0 +1,596 @@
"use client";

import Image from "next/image";
import {
  useCallback,
  useEffect,
  useMemo,
  useRef,
  useState,
} from "react";
import { ECGMonitorCard } from "@/components/ECGMonitorCard";
import type { PollState } from "@/types/monitor";

interface HealthSummary {
  status: string;
  version?: string;
  serviceId?: string;
  description?: string;
  checks?:
    | Record<string, { status: string; output?: string; details?: string }>
    | {
        name?: string;
        key?: string;
        check?: string;
        status: string;
        output?: string;
        details?: string;
      }[];
}

const POLL_INTERVALS = {
  live: 10_000,
  ready: 30_000,
  health: 60_000,
} as const;

const WAVE_SAMPLES = 140;
const WAVE_WIDTH = 360;
const BASELINE = 68;
const MIN_Y = 20;
const MAX_Y = 110;
const NOISE = 1.4;

function formatTime(timestamp: string | null) {
  if (!timestamp) return "Pending";
  return new Date(timestamp).toLocaleTimeString([], {
    hour: "2-digit",
    minute: "2-digit",
    second: "2-digit",
  });
}

function useNowString() {
  const [now, setNow] = useState<string>("");

  useEffect(() => {
    const tick = () =>
      setNow(
        new Date().toLocaleTimeString([], {
          hour: "2-digit",
          minute: "2-digit",
          second: "2-digit",
        })
      );

    tick();
    const id = setInterval(tick, 1_000);
    return () => clearInterval(id);
  }, []);

  return now;
}

function hasStatusField(value: unknown): value is { status?: unknown } {
  return typeof value === "object" && value !== null && "status" in value;
}

function isStatusHealthy(value: unknown): boolean {
  if (!value) return false;
  if (typeof value === "string") {
    return ["ok", "pass", "healthy", "up", "ready", "live"].includes(
      value.toLowerCase()
    );
  }
  if (hasStatusField(value) && typeof value.status === "string") {
    return ["ok", "pass", "healthy", "up", "ready", "live"].includes(
      value.status.toLowerCase()
    );
  }
  return false;
}

/**
 * Drives the shared ECG waveform: keeps the line scrolling and injects pulse spikes
 * whenever a poll occurs so the right edge always shows the freshest activity.
 */
function useWaveform() {
  const [wave, setWave] = useState<number[]>(() =>
    Array.from({ length: WAVE_SAMPLES }, () => BASELINE)
  );
  const queueRef = useRef<number[]>([]);
  const tickRef = useRef(0);

  // Queue sharp spikes that get blended into the scrolling ECG line.
  const triggerPulse = useCallback((strength = 18) => {
    queueRef.current.push(
      BASELINE - 6,
      BASELINE + strength,
      BASELINE - 10,
      BASELINE + strength * 0.6
    );
  }, []);

  useEffect(() => {
    const id = setInterval(() => {
      setWave((prev) => {
        const queue = queueRef.current;
        const drift =
          Math.sin(tickRef.current / 6) * NOISE +
          Math.cos(tickRef.current / 9) * (NOISE / 1.6);
|
||||||
|
tickRef.current += 1;
|
||||||
|
const nextVal = queue.length
|
||||||
|
? queue.shift() ?? BASELINE
|
||||||
|
: BASELINE + drift;
|
||||||
|
const clamped = Math.max(MIN_Y, Math.min(MAX_Y, nextVal));
|
||||||
|
const next = prev.slice(1);
|
||||||
|
next.push(clamped);
|
||||||
|
return next;
|
||||||
|
});
|
||||||
|
}, 110);
|
||||||
|
return () => clearInterval(id);
|
||||||
|
}, []);
|
||||||
|
|
||||||
|
const wavePoints = useMemo(
|
||||||
|
() =>
|
||||||
|
wave
|
||||||
|
.map((value, idx) => {
|
||||||
|
const x = (idx / (wave.length - 1)) * WAVE_WIDTH;
|
||||||
|
return `${x.toFixed(1)},${value.toFixed(1)}`;
|
||||||
|
})
|
||||||
|
.join(" "),
|
||||||
|
[wave]
|
||||||
|
);
|
||||||
|
|
||||||
|
return { wavePoints, waveHeight: wave[wave.length - 1], triggerPulse };
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Polls an endpoint on a cadence, parsing the response and surfacing status metadata.
|
||||||
|
*/
|
||||||
|
function usePoller<T>(opts: {
|
||||||
|
baseUrl: string;
|
||||||
|
path: string;
|
||||||
|
intervalMs: number;
|
||||||
|
parser: (res: Response) => Promise<T>;
|
||||||
|
onPoll?: () => void;
|
||||||
|
}): { state: PollState<T>; pollNow: () => Promise<void> } {
|
||||||
|
const { baseUrl, path, intervalMs, parser, onPoll } = opts;
|
||||||
|
const [state, setState] = useState<PollState<T>>({
|
||||||
|
data: null,
|
||||||
|
error: null,
|
||||||
|
loading: false,
|
||||||
|
lastUpdated: null,
|
||||||
|
attemptedUrl: `${baseUrl}${path}`,
|
||||||
|
});
|
||||||
|
|
||||||
|
const poll = useCallback(async () => {
|
||||||
|
const url = `${baseUrl}${path}`;
|
||||||
|
onPoll?.();
|
||||||
|
setState((prev) => ({
|
||||||
|
...prev,
|
||||||
|
loading: true,
|
||||||
|
attemptedUrl: url,
|
||||||
|
}));
|
||||||
|
|
||||||
|
try {
|
||||||
|
const res = await fetch(url, { cache: "no-store" });
|
||||||
|
if (!res.ok) throw new Error(`${path} responded with ${res.status}`);
|
||||||
|
const parsed = await parser(res);
|
||||||
|
setState((prev) => ({
|
||||||
|
...prev,
|
||||||
|
data: parsed,
|
||||||
|
error: null,
|
||||||
|
lastUpdated: new Date().toISOString(),
|
||||||
|
loading: false,
|
||||||
|
}));
|
||||||
|
} catch (error) {
|
||||||
|
const message = error instanceof Error ? error.message : "Unknown error";
|
||||||
|
setState((prev) => ({
|
||||||
|
...prev,
|
||||||
|
data: null,
|
||||||
|
error: message,
|
||||||
|
lastUpdated: new Date().toISOString(),
|
||||||
|
loading: false,
|
||||||
|
}));
|
||||||
|
}
|
||||||
|
}, [baseUrl, path, parser, onPoll]);
|
||||||
|
|
||||||
|
useEffect(() => {
|
||||||
|
void poll();
|
||||||
|
const id = setInterval(() => void poll(), intervalMs);
|
||||||
|
return () => clearInterval(id);
|
||||||
|
}, [poll, intervalMs]);
|
||||||
|
|
||||||
|
return { state, pollNow: poll };
|
||||||
|
}
|
||||||
|
|
||||||
|
export default function Home() {
|
||||||
|
const apiBase = useMemo(
|
||||||
|
() => process.env.NEXT_PUBLIC_API_URL ?? "http://localhost:8000",
|
||||||
|
[]
|
||||||
|
);
|
||||||
|
const now = useNowString();
|
||||||
|
const unifiedWave = useWaveform();
|
||||||
|
const liveWave = useWaveform();
|
||||||
|
const readyWave = useWaveform();
|
||||||
|
const healthWave = useWaveform();
|
||||||
|
|
||||||
|
const parseHealth = useCallback(async (res: Response) => {
|
||||||
|
return (await res.json()) as HealthSummary;
|
||||||
|
}, []);
|
||||||
|
const parseText = useCallback(async (res: Response) => res.text(), []);
|
||||||
|
|
||||||
|
const pulseHealth = useCallback(() => {
|
||||||
|
unifiedWave.triggerPulse(22);
|
||||||
|
healthWave.triggerPulse(22);
|
||||||
|
}, [unifiedWave.triggerPulse, healthWave.triggerPulse]);
|
||||||
|
const pulseReady = useCallback(() => {
|
||||||
|
unifiedWave.triggerPulse(16);
|
||||||
|
readyWave.triggerPulse(16);
|
||||||
|
}, [unifiedWave.triggerPulse, readyWave.triggerPulse]);
|
||||||
|
const pulseLive = useCallback(() => {
|
||||||
|
unifiedWave.triggerPulse(12);
|
||||||
|
liveWave.triggerPulse(12);
|
||||||
|
}, [unifiedWave.triggerPulse, liveWave.triggerPulse]);
|
||||||
|
|
||||||
|
const { state: healthState, pollNow: pollHealth } = usePoller<HealthSummary>({
|
||||||
|
baseUrl: apiBase,
|
||||||
|
path: "/health",
|
||||||
|
intervalMs: POLL_INTERVALS.health,
|
||||||
|
parser: parseHealth,
|
||||||
|
onPoll: pulseHealth,
|
||||||
|
});
|
||||||
|
|
||||||
|
const { state: readyState, pollNow: pollReady } = usePoller<string>({
|
||||||
|
baseUrl: apiBase,
|
||||||
|
path: "/health/ready",
|
||||||
|
intervalMs: POLL_INTERVALS.ready,
|
||||||
|
parser: parseText,
|
||||||
|
onPoll: pulseReady,
|
||||||
|
});
|
||||||
|
|
||||||
|
const { state: liveState, pollNow: pollLive } = usePoller<string>({
|
||||||
|
baseUrl: apiBase,
|
||||||
|
path: "/health/live",
|
||||||
|
intervalMs: POLL_INTERVALS.live,
|
||||||
|
parser: parseText,
|
||||||
|
onPoll: pulseLive,
|
||||||
|
});
|
||||||
|
|
||||||
|
const healthOk = isStatusHealthy(healthState.data?.status) && !healthState.error;
|
||||||
|
const readyOk = isStatusHealthy(readyState.data) && !readyState.error;
|
||||||
|
const liveOk = isStatusHealthy(liveState.data) && !liveState.error;
|
||||||
|
const overallOk = healthOk && readyOk && liveOk;
|
||||||
|
|
||||||
|
const checks = useMemo(() => {
|
||||||
|
const source = healthState.data?.checks;
|
||||||
|
if (!source) return [];
|
||||||
|
if (Array.isArray(source)) {
|
||||||
|
return source.map((item, idx) => ({
|
||||||
|
label: item.name ?? item.key ?? item.check ?? `Check ${idx + 1}`,
|
||||||
|
status: item.status ?? "unknown",
|
||||||
|
output: item.output ?? item.details ?? "",
|
||||||
|
}));
|
||||||
|
}
|
||||||
|
return Object.entries(source).map(([label, val]) => ({
|
||||||
|
label,
|
||||||
|
status: (val as { status?: string }).status ?? "unknown",
|
||||||
|
output:
|
||||||
|
(val as { output?: string; details?: string }).output ??
|
||||||
|
(val as { details?: string }).details ??
|
||||||
|
"",
|
||||||
|
}));
|
||||||
|
}, [healthState.data?.checks]);
|
||||||
|
|
||||||
|
const overallLabel = overallOk
|
||||||
|
? "All probes healthy"
|
||||||
|
: "Attention needed";
|
||||||
|
const overallLoading =
|
||||||
|
healthState.loading || readyState.loading || liveState.loading;
|
||||||
|
const overallStrokeColor = overallLoading
|
||||||
|
? "var(--color-pulse)"
|
||||||
|
: overallOk
|
||||||
|
? "var(--color-success)"
|
||||||
|
: "var(--color-danger)";
|
||||||
|
|
||||||
|
return (
|
||||||
|
<main className="relative min-h-screen overflow-hidden">
|
||||||
|
<div className="pointer-events-none absolute inset-0 bg-[radial-gradient(circle_at_20%_10%,rgba(148,163,184,0.18),transparent_45%),radial-gradient(circle_at_80%_0%,rgba(203,213,225,0.18),transparent_35%),radial-gradient(circle_at_40%_90%,rgba(226,232,240,0.22),transparent_40%)]" />
|
||||||
|
<div className="absolute inset-0 bg-gradient-to-b from-white/80 via-[#f3f4f6cc] to-[#e5e7eb80] backdrop-blur-[1px]" />
|
||||||
|
|
||||||
|
<div className="relative mx-auto flex w-full max-w-6xl flex-col gap-6 px-4 pb-16 pt-10 sm:px-6 lg:px-8">
|
||||||
|
<header className="flex flex-col gap-4 rounded-3xl border border-black/5 bg-white/80 px-5 py-4 shadow-card backdrop-blur md:flex-row md:items-center md:justify-between">
|
||||||
|
<div className="flex items-center gap-3">
|
||||||
|
<div className="flex h-12 w-12 items-center justify-center rounded-2xl border border-black/5 bg-gradient-to-br from-white via-white to-slate-50 shadow-sm">
|
||||||
|
<Image
|
||||||
|
src="/logo.png"
|
||||||
|
alt="avaaz.ai logo"
|
||||||
|
width={40}
|
||||||
|
height={40}
|
||||||
|
priority
|
||||||
|
className="h-9 w-9 rounded-xl object-contain"
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
<div>
|
||||||
|
<p className="text-lg font-semibold text-ink">avaaz.ai</p>
|
||||||
|
<p className="text-sm text-inkMuted">
|
||||||
|
Unified ECG monitoring for live, ready, and health probes.
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
<div className="flex flex-wrap items-center gap-2">
|
||||||
|
<span
|
||||||
|
className={`inline-flex items-center gap-2 rounded-full border px-3 py-2 text-sm font-semibold shadow-sm ${
|
||||||
|
overallOk
|
||||||
|
? "border-emerald-200 bg-emerald-50 text-emerald-800"
|
||||||
|
: "border-rose-200 bg-rose-50 text-rose-800"
|
||||||
|
}`}
|
||||||
|
>
|
||||||
|
<span
|
||||||
|
className={`h-2.5 w-2.5 rounded-full ${
|
||||||
|
overallOk ? "bg-success" : "bg-danger"
|
||||||
|
}`}
|
||||||
|
/>
|
||||||
|
{overallLabel}
|
||||||
|
</span>
|
||||||
|
<span className="inline-flex items-center gap-2 rounded-full border border-black/5 bg-black/5 px-3 py-2 text-xs font-semibold uppercase tracking-[0.1em] text-inkMuted">
|
||||||
|
Live clock
|
||||||
|
<span className="font-mono text-sm">{now || "— — : — —"}</span>
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
</header>
|
||||||
|
|
||||||
|
<section className="grid gap-5 md:grid-cols-[1.1fr,1.4fr]">
|
||||||
|
<div className="rounded-3xl border border-black/5 bg-white/80 p-6 shadow-card backdrop-blur">
|
||||||
|
<p className="text-xs uppercase tracking-[0.2em] text-inkMuted">
|
||||||
|
Live dashboard
|
||||||
|
</p>
|
||||||
|
<h1 className="mt-2 text-3xl font-semibold leading-tight text-ink md:text-4xl">
|
||||||
|
ECG-style observability for avaaz.ai
|
||||||
|
</h1>
|
||||||
|
<p className="mt-3 text-base text-inkMuted">
|
||||||
|
A continuous, scrolling signal shows every poll with sharp blips
|
||||||
|
whenever a probe fires. Color shifts between healthy (green),
|
||||||
|
unhealthy (red), and active polling (amber).
|
||||||
|
</p>
|
||||||
|
<div className="mt-4 grid grid-cols-1 gap-3 text-sm text-ink">
|
||||||
|
<div className="flex items-center justify-between rounded-2xl border border-black/5 bg-white/70 px-4 py-3">
|
||||||
|
<span className="text-inkMuted">/health/live</span>
|
||||||
|
<span className="font-semibold">
|
||||||
|
{liveOk ? "Live" : "Down"} · 10s cadence
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
<div className="flex items-center justify-between rounded-2xl border border-black/5 bg-white/70 px-4 py-3">
|
||||||
|
<span className="text-inkMuted">/health/ready</span>
|
||||||
|
<span className="font-semibold">
|
||||||
|
{readyOk ? "Ready" : "Not ready"} · 30s cadence
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
<div className="flex items-center justify-between rounded-2xl border border-black/5 bg-white/70 px-4 py-3">
|
||||||
|
<span className="text-inkMuted">/health</span>
|
||||||
|
<span className="font-semibold">
|
||||||
|
{healthOk ? "Healthy" : "Degraded"} · 60s cadence
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="relative overflow-hidden rounded-3xl border border-black/5 bg-gradient-to-br from-white via-[#f3f4f6] to-white p-6 shadow-card">
|
||||||
|
<div className="pointer-events-none absolute inset-0 bg-[radial-gradient(circle_at_20%_10%,rgba(148,163,184,0.18),transparent_35%),radial-gradient(circle_at_80%_10%,rgba(203,213,225,0.14),transparent_30%),radial-gradient(circle_at_40%_90%,rgba(226,232,240,0.14),transparent_28%)]" />
|
||||||
|
<div className="flex items-center justify-between text-sm text-inkMuted">
|
||||||
|
<span className="font-semibold uppercase tracking-[0.12em]">
|
||||||
|
Unified ECG strip
|
||||||
|
</span>
|
||||||
|
<span className="inline-flex items-center gap-2 rounded-full border border-black/5 bg-white/70 px-3 py-1 font-semibold text-ink">
|
||||||
|
<span className="h-2 w-2 rounded-full bg-success" />
|
||||||
|
Healthy
|
||||||
|
<span className="h-2 w-2 rounded-full bg-danger" />
|
||||||
|
Unhealthy
|
||||||
|
<span className="h-2 w-2 rounded-full bg-pulse" />
|
||||||
|
Polling
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
<div className="mt-3 rounded-2xl border border-black/5 bg-white/80 p-4 shadow-inner">
|
||||||
|
<svg
|
||||||
|
viewBox="0 0 360 120"
|
||||||
|
preserveAspectRatio="none"
|
||||||
|
className="h-32 w-full"
|
||||||
|
>
|
||||||
|
<polyline
|
||||||
|
points={unifiedWave.wavePoints}
|
||||||
|
fill="none"
|
||||||
|
stroke={overallStrokeColor}
|
||||||
|
strokeWidth="4"
|
||||||
|
strokeLinecap="round"
|
||||||
|
strokeLinejoin="round"
|
||||||
|
/>
|
||||||
|
<circle
|
||||||
|
cx="360"
|
||||||
|
cy={unifiedWave.waveHeight}
|
||||||
|
r="7"
|
||||||
|
className="stroke-white/50"
|
||||||
|
style={{ fill: overallStrokeColor }}
|
||||||
|
/>
|
||||||
|
</svg>
|
||||||
|
</div>
|
||||||
|
<div className="mt-3 grid grid-cols-2 gap-3 text-sm text-ink">
|
||||||
|
<div className="rounded-2xl border border-black/5 bg-white/70 px-3 py-2">
|
||||||
|
<p className="text-xs uppercase tracking-[0.08em] text-inkMuted">
|
||||||
|
Last health update
|
||||||
|
</p>
|
||||||
|
<p className="text-base font-semibold text-ink">
|
||||||
|
{formatTime(healthState.lastUpdated)}
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
<div className="rounded-2xl border border-black/5 bg-white/70 px-3 py-2">
|
||||||
|
<p className="text-xs uppercase tracking-[0.08em] text-inkMuted">
|
||||||
|
Next pulse cadence
|
||||||
|
</p>
|
||||||
|
<p className="text-base font-semibold text-ink">
|
||||||
|
10s / 30s / 60s
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<section className="grid grid-cols-1 gap-5 md:grid-cols-2 xl:grid-cols-3">
|
||||||
|
<ECGMonitorCard
|
||||||
|
title="Live probe"
|
||||||
|
endpoint="/health/live"
|
||||||
|
intervalLabel="10 seconds"
|
||||||
|
tone="blue"
|
||||||
|
now={now}
|
||||||
|
wavePoints={liveWave.wavePoints}
|
||||||
|
waveHeight={liveWave.waveHeight}
|
||||||
|
healthy={liveOk}
|
||||||
|
loading={liveState.loading}
|
||||||
|
statusLabel={
|
||||||
|
liveState.loading
|
||||||
|
? "Polling"
|
||||||
|
: liveOk
|
||||||
|
? "Operational"
|
||||||
|
: "Attention"
|
||||||
|
}
|
||||||
|
statusDetail={liveState.error ?? liveState.data ?? "Awaiting data"}
|
||||||
|
lastUpdatedLabel={formatTime(liveState.lastUpdated)}
|
||||||
|
state={liveState}
|
||||||
|
onManualTrigger={pollLive}
|
||||||
|
>
|
||||||
|
<div className="rounded-2xl border border-black/5 bg-white/80 px-3 py-2 text-sm">
|
||||||
|
<p className="text-xs uppercase tracking-[0.08em] text-inkMuted">
|
||||||
|
Signal
|
||||||
|
</p>
|
||||||
|
<p className="font-semibold text-ink">
|
||||||
|
{liveState.data ?? liveState.error ?? "No response yet"}
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
</ECGMonitorCard>
|
||||||
|
|
||||||
|
<ECGMonitorCard
|
||||||
|
title="Readiness"
|
||||||
|
endpoint="/health/ready"
|
||||||
|
intervalLabel="30 seconds"
|
||||||
|
tone="mint"
|
||||||
|
now={now}
|
||||||
|
wavePoints={readyWave.wavePoints}
|
||||||
|
waveHeight={readyWave.waveHeight}
|
||||||
|
healthy={readyOk}
|
||||||
|
loading={readyState.loading}
|
||||||
|
statusLabel={
|
||||||
|
readyState.loading
|
||||||
|
? "Polling"
|
||||||
|
: readyOk
|
||||||
|
? "Operational"
|
||||||
|
: "Attention"
|
||||||
|
}
|
||||||
|
statusDetail={
|
||||||
|
readyState.error ??
|
||||||
|
readyState.data ??
|
||||||
|
"Waiting for first readiness signal"
|
||||||
|
}
|
||||||
|
lastUpdatedLabel={formatTime(readyState.lastUpdated)}
|
||||||
|
state={readyState}
|
||||||
|
onManualTrigger={pollReady}
|
||||||
|
>
|
||||||
|
<div className="rounded-2xl border border-black/5 bg-white/80 px-3 py-2 text-sm">
|
||||||
|
<p className="text-xs uppercase tracking-[0.08em] text-inkMuted">
|
||||||
|
Response
|
||||||
|
</p>
|
||||||
|
<p className="font-semibold text-ink">
|
||||||
|
{readyState.data ?? readyState.error ?? "No response yet"}
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
</ECGMonitorCard>
|
||||||
|
|
||||||
|
<ECGMonitorCard
|
||||||
|
title="Deep health"
|
||||||
|
endpoint="/health"
|
||||||
|
intervalLabel="1 minute"
|
||||||
|
tone="coral"
|
||||||
|
now={now}
|
||||||
|
wavePoints={healthWave.wavePoints}
|
||||||
|
waveHeight={healthWave.waveHeight}
|
||||||
|
healthy={healthOk}
|
||||||
|
loading={healthState.loading}
|
||||||
|
statusLabel={
|
||||||
|
healthState.loading
|
||||||
|
? "Polling"
|
||||||
|
: healthOk
|
||||||
|
? "Operational"
|
||||||
|
: "Attention"
|
||||||
|
}
|
||||||
|
statusDetail={
|
||||||
|
healthState.error ??
|
||||||
|
healthState.data?.status ??
|
||||||
|
"Waiting for first health payload"
|
||||||
|
}
|
||||||
|
lastUpdatedLabel={formatTime(healthState.lastUpdated)}
|
||||||
|
state={healthState}
|
||||||
|
onManualTrigger={pollHealth}
|
||||||
|
>
|
||||||
|
<div className="grid grid-cols-2 gap-3 text-sm">
|
||||||
|
{healthState.data?.version && (
|
||||||
|
<div className="rounded-2xl border border-black/5 bg-white/80 px-3 py-2">
|
||||||
|
<p className="text-xs uppercase tracking-[0.08em] text-inkMuted">
|
||||||
|
Version
|
||||||
|
</p>
|
||||||
|
<p className="font-semibold text-ink">{healthState.data.version}</p>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
{healthState.data?.serviceId && (
|
||||||
|
<div className="rounded-2xl border border-black/5 bg-white/80 px-3 py-2">
|
||||||
|
<p className="text-xs uppercase tracking-[0.08em] text-inkMuted">
|
||||||
|
Service
|
||||||
|
</p>
|
||||||
|
<p className="font-semibold text-ink">
|
||||||
|
{healthState.data.serviceId}
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
{healthState.data?.description && (
|
||||||
|
<div className="col-span-2 rounded-2xl border border-black/5 bg-white/80 px-3 py-2">
|
||||||
|
<p className="text-xs uppercase tracking-[0.08em] text-inkMuted">
|
||||||
|
Notes
|
||||||
|
</p>
|
||||||
|
<p className="text-ink">{healthState.data.description}</p>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
<div className="mt-3 space-y-2">
|
||||||
|
<p className="text-xs uppercase tracking-[0.08em] text-inkMuted">
|
||||||
|
Checks
|
||||||
|
</p>
|
||||||
|
{checks.length > 0 ? (
|
||||||
|
<div className="grid grid-cols-1 gap-2 sm:grid-cols-2">
|
||||||
|
{checks.map((check) => (
|
||||||
|
<div
|
||||||
|
key={check.label}
|
||||||
|
className="rounded-2xl border border-black/5 bg-white/70 px-3 py-2"
|
||||||
|
>
|
||||||
|
<div className="flex items-center justify-between text-xs text-inkMuted">
|
||||||
|
<span>{check.label}</span>
|
||||||
|
<span
|
||||||
|
className={`h-2 w-2 rounded-full ${
|
||||||
|
isStatusHealthy(check.status) ? "bg-success" : "bg-danger"
|
||||||
|
}`}
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
<p className="text-sm font-semibold text-ink">{check.status}</p>
|
||||||
|
{check.output && (
|
||||||
|
<p className="text-xs text-inkMuted">{check.output}</p>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
))}
|
||||||
|
</div>
|
||||||
|
) : (
|
||||||
|
<p className="text-sm text-inkMuted">Awaiting check details.</p>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
</ECGMonitorCard>
|
||||||
|
</section>
|
||||||
|
</div>
|
||||||
|
</main>
|
||||||
|
);
|
||||||
|
}
|
||||||
162
app/frontend/components/ECGMonitorCard.tsx
Normal file
@@ -0,0 +1,162 @@
import type { ReactNode } from "react";
|
||||||
|
import type { PollState } from "@/types/monitor";
|
||||||
|
|
||||||
|
type Tone = "mint" | "blue" | "coral";
|
||||||
|
|
||||||
|
const toneStyles: Record<Tone, string> = {
|
||||||
|
mint: "from-white via-white to-emerald-50 ring-emerald-100",
|
||||||
|
blue: "from-white via-white to-sky-50 ring-sky-100",
|
||||||
|
coral: "from-white via-white to-rose-50 ring-rose-100",
|
||||||
|
};
|
||||||
|
|
||||||
|
interface ECGMonitorCardProps<T> {
|
||||||
|
title: string;
|
||||||
|
endpoint: string;
|
||||||
|
intervalLabel: string;
|
||||||
|
tone?: Tone;
|
||||||
|
now: string;
|
||||||
|
wavePoints: string;
|
||||||
|
waveHeight: number;
|
||||||
|
healthy: boolean;
|
||||||
|
loading: boolean;
|
||||||
|
statusLabel: string;
|
||||||
|
statusDetail: string;
|
||||||
|
lastUpdatedLabel: string;
|
||||||
|
state: PollState<T>;
|
||||||
|
children?: ReactNode;
|
||||||
|
onManualTrigger: () => void;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Presentational card that renders endpoint status and the shared ECG waveform with
|
||||||
|
* consistent tone, status badges, and contextual metadata.
|
||||||
|
*/
|
||||||
|
export function ECGMonitorCard<T>({
|
||||||
|
title,
|
||||||
|
endpoint,
|
||||||
|
intervalLabel,
|
||||||
|
tone = "mint",
|
||||||
|
now,
|
||||||
|
wavePoints,
|
||||||
|
waveHeight,
|
||||||
|
healthy,
|
||||||
|
loading,
|
||||||
|
statusLabel,
|
||||||
|
statusDetail,
|
||||||
|
lastUpdatedLabel,
|
||||||
|
state,
|
||||||
|
children,
|
||||||
|
onManualTrigger,
|
||||||
|
}: ECGMonitorCardProps<T>) {
|
||||||
|
const signalId = `${endpoint.replace(/[^a-z0-9]/gi, "-")}-stroke`;
|
||||||
|
|
||||||
|
const badgeColor = loading
|
||||||
|
? "border-amber-200 bg-amber-50 text-amber-800"
|
||||||
|
: healthy
|
||||||
|
? "border-emerald-200 bg-emerald-50 text-emerald-800"
|
||||||
|
: "border-rose-200 bg-rose-50 text-rose-800";
|
||||||
|
|
||||||
|
const dotColor = loading ? "bg-pulse" : healthy ? "bg-success" : "bg-danger";
|
||||||
|
const strokeColor = loading ? "#f59e0b" : healthy ? "#22c55e" : "#ef4444";
|
||||||
|
|
||||||
|
return (
|
||||||
|
<article
|
||||||
|
className={`relative overflow-hidden rounded-3xl border border-black/5 bg-gradient-to-br ${toneStyles[tone]} p-5 shadow-card ring-4 ring-transparent transition hover:-translate-y-1 hover:shadow-[0_24px_60px_rgba(15,23,42,0.16)]`}
|
||||||
|
>
|
||||||
|
<div className="absolute inset-0 -z-10 bg-[radial-gradient(circle_at_20%_10%,rgba(96,165,250,0.08),transparent_30%),radial-gradient(circle_at_80%_0%,rgba(52,211,153,0.1),transparent_30%)]" />
|
||||||
|
<div className="flex items-start justify-between gap-3">
|
||||||
|
<div className="space-y-2 min-w-0">
|
||||||
|
<div className="inline-flex max-w-full items-center gap-2 rounded-full border border-black/5 bg-black/5 px-3 py-1 text-xs font-semibold uppercase tracking-[0.12em] text-inkMuted">
|
||||||
|
<span className="break-words">{endpoint}</span>
|
||||||
|
<span className="h-1.5 w-1.5 shrink-0 rounded-full bg-gradient-to-r from-accent-blue to-accent-mint" />
|
||||||
|
</div>
|
||||||
|
<h3 className="text-lg font-semibold text-ink">{title}</h3>
|
||||||
|
<p className="text-sm text-inkMuted">every {intervalLabel}</p>
|
||||||
|
</div>
|
||||||
|
<div className="flex flex-col items-end gap-2 text-right">
|
||||||
|
<div className="flex items-center gap-2">
|
||||||
|
<span
|
||||||
|
className={`inline-flex items-center gap-2 rounded-full border px-3 py-2 text-sm font-semibold shadow-sm ${badgeColor}`}
|
||||||
|
>
|
||||||
|
<span className={`h-2.5 w-2.5 rounded-full ${dotColor}`} />
|
||||||
|
{statusLabel}
|
||||||
|
</span>
|
||||||
|
<button
|
||||||
|
type="button"
|
||||||
|
onClick={onManualTrigger}
|
||||||
|
className="inline-flex items-center gap-2 rounded-full bg-ink text-white px-3 py-2 text-xs font-semibold shadow-card transition hover:bg-inkSoft focus-visible:outline focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:outline-ink"
|
||||||
|
>
|
||||||
|
<span className="h-2 w-2 rounded-full bg-white" aria-hidden />
|
||||||
|
Poll now
|
||||||
|
</button>
|
||||||
|
</div>
|
||||||
|
<span className="text-xs font-semibold uppercase tracking-[0.08em] text-inkMuted">
|
||||||
|
{now}
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="mt-4 flex flex-col gap-3">
|
||||||
|
<div className="rounded-2xl border border-black/5 bg-white/80 px-3 py-2">
|
||||||
|
<p className="text-xs uppercase tracking-[0.08em] text-inkMuted">
|
||||||
|
Last updated
|
||||||
|
</p>
|
||||||
|
<p className="text-base font-semibold text-ink">{lastUpdatedLabel}</p>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="relative overflow-hidden rounded-2xl border border-black/5 bg-gradient-to-br from-white via-white to-slate-50 p-4">
|
||||||
|
<div className="pointer-events-none absolute inset-0 opacity-60 mix-blend-multiply">
|
||||||
|
<div className="absolute inset-0 bg-[radial-gradient(circle_at_10%_20%,rgba(96,165,250,0.12),transparent_35%),radial-gradient(circle_at_80%_0%,rgba(52,211,153,0.12),transparent_30%)]" />
|
||||||
|
</div>
|
||||||
|
<div className="flex items-center justify-between text-xs text-inkMuted">
|
||||||
|
<span>ECG signal</span>
|
||||||
|
<span className="inline-flex items-center gap-2 rounded-full bg-black/5 px-2 py-1 font-semibold">
|
||||||
|
<span className={`h-2 w-2 rounded-full ${dotColor}`} />
|
||||||
|
{loading ? "Polling" : healthy ? "Healthy" : "Unhealthy"}
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
<svg
|
||||||
|
viewBox="0 0 360 120"
|
||||||
|
preserveAspectRatio="none"
|
||||||
|
className="mt-2 h-28 w-full text-success drop-shadow-sm"
|
||||||
|
>
|
||||||
|
<defs>
|
||||||
|
<linearGradient id={signalId} x1="0%" y1="0%" x2="100%" y2="0%">
|
||||||
|
<stop offset="0%" stopColor={strokeColor} stopOpacity="0.9" />
|
||||||
|
<stop offset="50%" stopColor={strokeColor} stopOpacity="0.75" />
|
||||||
|
<stop offset="100%" stopColor={strokeColor} stopOpacity="0.95" />
|
||||||
|
</linearGradient>
|
||||||
|
</defs>
|
||||||
|
<polyline
|
||||||
|
points={wavePoints}
|
||||||
|
fill="none"
|
||||||
|
stroke={`url(#${signalId})`}
|
||||||
|
strokeWidth="3"
|
||||||
|
strokeLinecap="round"
|
||||||
|
strokeLinejoin="round"
|
||||||
|
/>
|
||||||
|
<circle
|
||||||
|
cx="360"
|
||||||
|
cy={waveHeight}
|
||||||
|
r="6"
|
||||||
|
className={`stroke-white/40 ${dotColor}`}
|
||||||
|
/>
|
||||||
|
</svg>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="grid grid-cols-2 gap-3 text-sm">
|
||||||
|
<div className="rounded-2xl border border-black/5 bg-white/80 px-3 py-2">
|
||||||
|
<p className="text-xs uppercase tracking-[0.08em] text-inkMuted">Status</p>
|
||||||
|
<p className="text-base font-semibold text-ink">{statusDetail}</p>
|
||||||
|
</div>
|
||||||
|
<div className="rounded-2xl border border-black/5 bg-white/80 px-3 py-2">
|
||||||
|
<p className="text-xs uppercase tracking-[0.08em] text-inkMuted">Endpoint</p>
|
||||||
|
<p className="text-xs font-mono text-ink">{state.attemptedUrl}</p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{children}
|
||||||
|
</div>
|
||||||
|
</article>
|
||||||
|
);
|
||||||
|
}
|
||||||
44
app/frontend/eslint.config.mjs
Normal file
@@ -0,0 +1,44 @@
import nextPlugin from "@next/eslint-plugin-next";
import globals from "globals";
import tseslint from "typescript-eslint";

export default tseslint.config(
  {
    ignores: [
      "**/.next/**",
      "node_modules/**",
      "dist/**",
      "tailwind.config.js",
      "postcss.config.mjs",
    ],
  },
  {
    files: ["**/*.{js,jsx,ts,tsx}"],
    extends: [
      ...tseslint.configs.recommendedTypeChecked,
      ...tseslint.configs.stylisticTypeChecked,
      nextPlugin.configs.recommended,
      nextPlugin.configs["core-web-vitals"],
    ],
    languageOptions: {
      parserOptions: {
        project: "./tsconfig.json",
      },
      globals: {
        ...globals.browser,
        ...globals.node,
      },
    },
    rules: {
      "@typescript-eslint/consistent-type-imports": [
        "error",
        { prefer: "type-imports", fixStyle: "separate-type-imports" },
      ],
      "@typescript-eslint/no-misused-promises": [
        "error",
        { checksVoidReturn: false },
      ],
      "@typescript-eslint/no-floating-promises": "error",
    },
  }
);
6
app/frontend/next.config.mjs
Normal file
@@ -0,0 +1,6 @@
/** @type {import('next').NextConfig} */
const nextConfig = {
  reactStrictMode: true,
};

export default nextConfig;
6600
app/frontend/package-lock.json
generated
Normal file
File diff suppressed because it is too large
31
app/frontend/package.json
Normal file
@@ -0,0 +1,31 @@
{
  "name": "avaaz-frontend",
  "version": "0.1.0",
  "private": true,
  "scripts": {
    "dev": "next dev",
    "build": "next build --webpack",
    "start": "next start",
    "lint": "eslint ."
  },
  "type": "module",
  "dependencies": {
    "@tailwindcss/postcss": "^4.1.17",
    "next": "^16.0.5",
    "react": "^19.2.0",
    "react-dom": "^19.2.0"
  },
  "devDependencies": {
    "@types/node": "^24.10.1",
    "@types/react": "^19.2.7",
    "@types/react-dom": "^19.2.3",
    "autoprefixer": "^10.4.20",
    "eslint": "^9.39.1",
    "eslint-config-next": "^16.0.5",
    "globals": "^16.5.0",
    "postcss": "^8.4.49",
    "tailwindcss": "^4.1.17",
    "typescript": "^5.9.3",
    "typescript-eslint": "^8.48.0"
  }
}
6
app/frontend/postcss.config.mjs
Normal file
@@ -0,0 +1,6 @@
export default {
  plugins: {
    "@tailwindcss/postcss": {},
    autoprefixer: {},
  },
};
BIN
app/frontend/public/favicon.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 800 KiB |
BIN
app/frontend/public/logo.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 984 KiB |
44
app/frontend/tsconfig.json
Normal file
@@ -0,0 +1,44 @@
{
  "compilerOptions": {
    "target": "ES2022",
    "lib": ["dom", "dom.iterable", "ES2022"],
    "allowJs": false,
    "skipLibCheck": true,
    "strict": true,
    "noEmit": true,
    "esModuleInterop": true,
    "module": "ESNext",
    "moduleResolution": "Bundler",
    "resolveJsonModule": true,
    "isolatedModules": true,
    "jsx": "react-jsx",
    "incremental": true,
    "types": ["node"],
    "plugins": [{ "name": "next" }],
    "paths": { "@/*": ["./*"] }
  },
  "include": ["next-env.d.ts", "**/*.ts", "**/*.tsx", ".next/types/**/*.ts", ".next/dev/types/**/*.ts"],
  "exclude": ["node_modules"]
}
7
app/frontend/types/monitor.ts
Normal file
@@ -0,0 +1,7 @@
export interface PollState<T> {
  data: T | null;
  error: string | null;
  loading: boolean;
  lastUpdated: string | null;
  attemptedUrl: string;
}
568
docs/PRD.md
Normal file
@@ -0,0 +1,568 @@
# Product Requirements Document

## Product Information

**Title:** Avaaz

**Change History:**

| Date | Version | Author | Description |
| ---------- | ------- | ---------------- | ---------------------------------------------------------------------------------------------- |
| 2025-12-04 | 0.6.0 | Internal (Codex) | Tightened health-check requirements and ensured all endpoints and criteria are testable. |
| 2025-12-04 | 0.5.0 | Internal (Codex) | Added requirements from README/architecture; ensured testable, learner-facing feature coverage. |
| 2025-12-04 | 0.4.0 | Internal (Codex) | Simplified language, reduced redundancy, clarified non-functional requirements. |
| 2025-12-03 | 0.3.0 | Internal (Codex) | Reinforced mobile-first learner flow, clarified spoken-skill focus and A1–B2 oral practice scope. |
| 2025-12-03 | 0.2.0 | Internal (Codex) | Clarified A1–B2 scope; added curriculum/exam, authoring, persistence, and health requirements. |
| 2025-12-03 | 0.1.0 | Internal (Codex) | Initial PRD drafted from `README.md` and `docs/architecture.md`. |

**Date:** 2025-12-04

**Status:** Draft

## Product Overview

Avaaz is a mobile and web app with a conversational AI tutor. It teaches speaking skills through structured, interactive, voice-first lessons that adapt to each learner’s pace and performance. Avaaz supports CEFR levels A1–B2, with a primary goal of B2 oral exam readiness and confident real-life conversation in the destination country.

Avaaz combines a CEFR-aligned curriculum with real-time AI conversation to deliver low-latency speech-to-speech practice across devices. Learners primarily use native iOS and Android apps. Instructors, coordinators, and administrators use a responsive web portal to manage curricula, reporting, and settings.

**Problem Statement:**
Adult immigrants and other language learners struggle to achieve confident speaking ability in their target language, especially at the B2 level required for exams, citizenship, or professional roles. Existing solutions (apps, textbooks, group classes) emphasize passive skills (reading, vocabulary drills, grammar) that do not directly translate into fluent speech. Avaaz intentionally keeps reading and writing as contextual supports only: every lesson, scenario, and assessment is designed around spoken interaction, pronunciation, fluency, and comprehension. Human tutors are expensive, scarce in many regions, and difficult to scale, leaving learners underprepared for real-life conversations and high-stakes oral exams.

**Product Vision:**
To be the trusted AI speaking coach for immigrants and global learners. Avaaz should feel like a human tutor, with natural dialogue, rich corrective feedback, and realistic scenarios, while scaling to thousands of learners. Avaaz will measurably improve speaking confidence and B2 exam readiness by:

- Reducing learners’ anxiety in real conversations.
- Increasing B2 oral exam pass rates.
- Shortening the time required to progress from A1 → A2 → B1 → B2 speaking proficiency.

## User and Audience

Avaaz serves adults learning a new language for migration, work, and social integration, with an initial focus on English → Norwegian Bokmål. Learners use the mobile apps or web app to practice speaking; text and transcripts are supporting aids, not the main focus. Instructors, coordinators, and administrators use a web portal to manage curricula, monitor cohorts, and configure settings. Learners can start from A1 and progress through A2 and B1 up to B2; the main goal is to help them reach and pass the B2 oral exam while adding value at each stage.

**Personas:**

- **Primary Persona – Adult Immigrant Exam Candidate:**
  - Age 20–45; recently moved to a new country (e.g., Norway).
  - Needs to pass a B2 oral exam for residency, citizenship, or professional accreditation.
  - Has limited time (work, family duties) and mixed confidence speaking with natives.
  - Uses a mid-range phone or laptop; often learns on evenings and weekends.
  - Pain points: insufficient speaking practice, fear of making mistakes, difficulty accessing affordable tutors, and lack of clear feedback on exam readiness.

- **Secondary Persona – Working Professional Needing Workplace Fluency:**
  - Already employed or seeking employment; needs to operate in the target language at work (meetings, clients, daily conversations).
  - Wants targeted practice around workplace scenarios (e.g., stand-ups, 1:1s, presentations).
  - Pain points: embarrassed about accent/fluency, no safe space to practice, needs domain-specific vocabulary and politeness strategies.

- **Secondary Persona – Language School / Program Coordinator:**
  - Manages groups of learners in a language school, NGO, or integration program.
  - Wants a scalable speaking practice tool that complements classes and provides data on learner progress.
  - Pain points: limited classroom time, uneven speaking opportunities for students, lack of granular speaking analytics.

**User Scenarios:**

- **Daily commute micro-lessons (Primary Persona):**
  On the bus after work, a learner starts a 10-minute speaking session on “Small Talk at the Workplace.” Avaaz adapts prompts based on mistakes, gives immediate pronunciation and grammar feedback, and updates progress toward the learner’s target level.

- **Mock B2 oral exam before test day (Primary Persona):**
  A week before the exam, the learner runs a full mock oral exam in “Exam Mode.” Avaaz simulates an examiner with timed sections, tracks key speaking skills, and produces an exam-style report with an estimated CEFR level and clear improvement suggestions.

- **Preparing for workplace interactions (Working Professional):**
  Before a performance review, a learner practices the “Performance Review Conversation” scenario. Avaaz role-plays manager and colleague, uses realistic workplace language, and coaches polite but assertive phrasing and cultural norms.

- **Program-wide monitoring (Program Coordinator):**
  An instructor encourages all students to complete three speaking sessions per week. The coordinator reviews dashboards (e.g., minutes spoken, estimated CEFR band, completion of key scenarios) to spot learners who need support and to report impact to stakeholders.

## Functional Requirements (Features)

This section describes the core capabilities required for a production-grade Avaaz full-stack application. Each feature is expressed via user stories with acceptance criteria and dependencies.

### 1. Voice-First Conversational Lessons

1. **User Story: Real-Time Voice Tutoring**
   - **As a** learner between A1 and B2 (with a special focus on B1–B2 exam preparation),
   - **I want to** speak with an AI tutor using my microphone in near real time,
   - **so that** I can practice spontaneous spoken interaction and receive immediate feedback.

   **Acceptance Criteria:**
   - Learner can start a voice session from mobile or web with one tap/click.
   - Audio is streamed via WebRTC with end-to-end latency low enough to support natural turn-taking (target < 250 ms one-way).
   - AI tutor responds using synthesized voice and on-screen text.
   - Transcription and persistence of audio and text follow the persistent conversation and transcript requirements described below.
   - If the microphone or network fails, the app displays an actionable error and offers a retry or text-only fallback.

   **Dependencies:** LiveKit server (signaling + media), LLM realtime APIs (OpenAI Realtime, Gemini Live), Caddy reverse proxy, WebRTC-capable browsers/mobile clients, backend session orchestration.
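
   As a rough illustration of this flow on the client, the sketch below connects to a LiveKit room and publishes the learner's microphone, falling back to text-only mode on failure. It is a minimal sketch, not the implementation: the `/api/livekit-token` route and the `startVoiceSession` naming are assumptions, while the `livekit-client` calls themselves are standard SDK usage.

   ```ts
   import { Room, RoomEvent } from "livekit-client";

   // Hypothetical token endpoint; the real backend route is not specified in this PRD.
   async function fetchAccessToken(lessonId: string): Promise<{ url: string; token: string }> {
     const res = await fetch(`/api/livekit-token?lesson=${encodeURIComponent(lessonId)}`);
     if (!res.ok) throw new Error(`token request failed: ${res.status}`);
     return (await res.json()) as { url: string; token: string };
   }

   export async function startVoiceSession(
     lessonId: string,
     onFallbackToText: (reason: string) => void
   ): Promise<Room | null> {
     try {
       const { url, token } = await fetchAccessToken(lessonId);
       const room = new Room();
       // Surface disconnects so the UI can offer a retry or switch to text-only practice.
       room.on(RoomEvent.Disconnected, () => onFallbackToText("connection lost"));
       await room.connect(url, token);
       // Publish the learner's microphone; this triggers the browser permission prompt.
       await room.localParticipant.setMicrophoneEnabled(true);
       return room;
     } catch (err) {
       onFallbackToText(err instanceof Error ? err.message : "unknown error");
       return null;
     }
   }
   ```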

2. **User Story: Adaptive Conversational Flow**
   - **As a** learner with uneven skills,
   - **I want to** receive dynamically adjusted prompts and scaffolding,
   - **so that** conversations stay challenging but not overwhelming.

   **Acceptance Criteria:**
   - AI tutor adjusts prompt complexity based on recent performance (e.g., error rate, hesitation, completion rate) and current CEFR level (A1–B2).
   - System can slow down, rephrase, or switch to simplified questions when the learner struggles.
   - System can increase complexity (longer turns, follow-up questions, abstract topics) when the learner performs well.
   - Explanations include level-appropriate grammar focus (e.g., simple present and basic word order at A1, more complex clause structures and connectors at B1–B2).
   - When explaining, the AI tutor supplements speech with visual and textual aids (images, tables, short written examples) where appropriate.
   - Changes in difficulty are logged for analytics.

   **Dependencies:** Backend lesson/lesson-state models, LLM prompt engineering and agent logic, PostgreSQL + pgvector for storing session metrics.
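
   One possible shape for the difficulty-adjustment heuristic implied by the criteria above is sketched here. The `SessionMetrics` fields and the numeric thresholds are assumptions for illustration; real values would be tuned from logged data.

   ```ts
   interface SessionMetrics {
     errorRate: number;      // 0..1, share of turns with flagged errors
     hesitationRate: number; // 0..1, share of turns with long pauses
     completionRate: number; // 0..1, share of tasks finished
   }

   type CefrLevel = "A1" | "A2" | "B1" | "B2";
   type Adjustment = "simplify" | "hold" | "stretch";

   // Illustrative thresholds only; logging the returned value covers the analytics criterion.
   export function nextAdjustment(metrics: SessionMetrics, level: CefrLevel): Adjustment {
     const struggling =
       metrics.errorRate > 0.4 || metrics.hesitationRate > 0.5 || metrics.completionRate < 0.5;
     const cruising = metrics.errorRate < 0.15 && metrics.completionRate > 0.85;
     if (struggling) return "simplify";                // slow down, rephrase, simpler questions
     if (cruising && level !== "B2") return "stretch"; // longer turns, follow-ups, abstract topics
     return "hold";
   }
   ```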

3. **User Story: Comprehensive Speaking Feedback**
   - **As a** learner preparing for real conversations and exams,
   - **I want to** receive detailed feedback on my speaking, not just pronunciation and grammar,
   - **so that** I understand my strengths and weaknesses across all key speaking skills.

   **Acceptance Criteria:**
   - After a lesson or mock exam, the system can display or generate scores or qualitative ratings for fluency, pronunciation, grammar, vocabulary, and coherence.
   - Feedback includes at least 2–3 concrete examples from the session (e.g., misused word, unclear phrasing, hesitation).
   - Feedback format is consistent across sessions and mock exams so results are comparable over time.
   - Learner can view previous feedback reports from a “History” or equivalent section.

   **Dependencies:** Conversation transcription, scoring and analysis models, feedback formatting logic, persistent storage for feedback reports.
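
   A possible report shape that keeps feedback consistent and comparable across sessions is sketched below; field names and the 0–100 scale are illustrative assumptions, not a fixed contract.

   ```ts
   type SkillDimension = "fluency" | "pronunciation" | "grammar" | "vocabulary" | "coherence";

   interface FeedbackExample {
     utterance: string;   // what the learner actually said
     issue: string;       // e.g. "misused word", "unclear phrasing", "hesitation"
     suggestion: string;  // corrected or more natural alternative
   }

   interface FeedbackReport {
     sessionId: string;
     createdAt: string;                      // ISO timestamp
     scores: Record<SkillDimension, number>; // e.g. 0-100 per dimension
     examples: FeedbackExample[];            // at least 2-3 per the criteria above
     summary: string;
   }
   ```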

### 2. CEFR-Aligned Curriculum & Real-Life Scenarios

1. **User Story: Structured CEFR-Aligned Path**
   - **As a** motivated learner starting anywhere between A1 and B2,
   - **I want to** follow a clear sequence of speaking lessons mapped to CEFR descriptors,
   - **so that** I can track my progress toward B2 and avoid gaps in my skills.

   **Acceptance Criteria:**
   - Curriculum is structured into levels (A1, A2, B1, B2) and modules (e.g., “Everyday Life,” “Workplace,” “Public Services”).
   - Each speaking lesson includes goals, target CEFR descriptors, example prompts, and success criteria.
   - Learner can see which lessons are completed, in progress, or locked.
   - The system records completion, time spent, and estimated performance for each lesson.

   **Dependencies:** Backend curriculum models and APIs, frontend curriculum navigation views, content authoring workflow (internal or admin UI), PostgreSQL for storing lesson metadata.
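
   The curriculum structure described above could be modeled roughly as follows; this is a sketch of one plausible data shape, and every identifier here is an assumption rather than the actual backend schema.

   ```ts
   type CefrLevel = "A1" | "A2" | "B1" | "B2";
   type LessonStatus = "locked" | "in_progress" | "completed";

   interface Lesson {
     id: string;
     title: string;
     goals: string[];
     cefrDescriptors: string[]; // target CEFR descriptors for the lesson
     examplePrompts: string[];
     successCriteria: string[];
   }

   interface Module {
     id: string;
     title: string; // e.g. "Everyday Life", "Workplace", "Public Services"
     lessons: Lesson[];
   }

   interface CurriculumLevel {
     level: CefrLevel;
     modules: Module[];
   }

   interface LessonProgress {
     lessonId: string;
     status: LessonStatus;
     minutesSpent: number;
     estimatedPerformance?: number; // 0..1, absent until the first attempt
   }
   ```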

2. **User Story: Immigrant-Focused Real-Life Scenarios**
   - **As a** newly arrived immigrant,
   - **I want to** practice conversations that match my daily life (e.g., at the doctor, at school, at work, at public offices),
   - **so that** I feel confident handling real interactions in my new country.

   **Acceptance Criteria:**
   - Library of scenario templates linked to CEFR levels and contexts (workplace, healthcare, school, housing, etc.).
   - For each scenario, the AI tutor can role-play multiple participants (e.g., nurse, receptionist, colleague).
   - Visual cues (images, documents, forms) can be shown where relevant.
   - Scenarios are localizable (e.g., cultural norms, common phrases) per destination country.
   - Scenarios can be designed to emphasize key oral communication purposes seen in official exams: self-presentation, describing pictures or situations, exchanging information, expressing opinions, and arguing for or against a statement.
   - Scenario templates support both individual and pair/role-play modes, with configurable durations and turn-taking rules.

   **Dependencies:** Media storage for images/documents, LLM prompt templates by scenario, localization framework, content governance and review processes.
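
   A scenario template satisfying these criteria might look like the sketch below; the enum values and field names are assumptions chosen to mirror the purposes and modes listed above.

   ```ts
   type ScenarioContext = "workplace" | "healthcare" | "school" | "housing" | "public_services";
   type TaskPurpose =
     | "self_presentation"
     | "describe_picture_or_situation"
     | "exchange_information"
     | "express_opinion"
     | "argue_position";

   interface ScenarioRole {
     name: string;          // e.g. "nurse", "receptionist", "colleague"
     playedByTutor: boolean;
   }

   interface ScenarioTemplate {
     id: string;
     title: string;
     cefrLevel: "A1" | "A2" | "B1" | "B2";
     context: ScenarioContext;
     purposes: TaskPurpose[];
     roles: ScenarioRole[];
     mode: "individual" | "pair";
     durationMinutes: number;
     locale?: string;       // destination-country localization, e.g. "nb-NO"
     visualAids?: string[]; // identifiers of images, documents, or forms to show
   }
   ```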

3. **User Story: Curriculum Model with Multi-Skill Objectives**
   - **As a** curriculum designer,
   - **I want to** model learning objectives for each level (A1–B2) across reception, production, interaction, and mediation skills,
   - **so that** I can align the digital curriculum with established language frameworks and reuse it across languages.

   **Acceptance Criteria:**
   - For each CEFR level (A1–B2), the curriculum can capture objectives for listening/reading (reception), speaking/writing (production), interaction (dialogue and conversations), and mediation (explaining and rephrasing).
   - Lessons and mock exams reference one or more of these objectives, enabling coverage analysis and reporting.
   - Objectives and mappings are configurable per language pair and per country-specific curriculum where applicable.

   **Dependencies:** Backend curriculum data model, admin tooling for curriculum management, reporting/analytics based on objectives.

4. **User Story: Accent and Cultural Adaptation**
   - **As a** learner moving to a specific country or region,
   - **I want to** practice with local accents and culturally appropriate language,
   - **so that** my speech sounds natural and polite in real life.

   **Acceptance Criteria:**
   - Lessons and scenarios can be tagged with destination country/region and typical dialect or accent.
   - AI tutor can switch between at least one default accent and one local accent where the target language supports it.
   - Scenarios include common cultural norms and politeness strategies (e.g., formal vs informal address) that the tutor can explain on request.
   - Coordinators or admins can choose which regional variants are enabled for their learners.

   **Dependencies:** Content localization by region, voice configuration options for accents, cultural notes in curriculum content, admin configuration UI or settings.

### 3. Mock Oral Exam Mode & Assessment

1. **User Story: Full B2 Mock Exam**
   - **As a** learner preparing for a B2 oral exam,
   - **I want to** take a timed mock exam that follows the official exam structure,
   - **so that** I know what to expect and can benchmark my readiness.

   **Acceptance Criteria:**
   - System supports predefined exam templates (sections, timings, types of prompts) for levels A1–A2, A2–B1, and B1–B2, based on local exam formats where applicable.
   - Exam templates can include warm-up tasks that are not scored, as well as scored tasks.
   - Each exam part can be configured as individual or pair conversation, and as one of several task types: self-presentation, describing a picture or situation, speaking about a familiar topic, exchanging views, expressing opinions, and taking a position on a statement with arguments.
   - During the exam, the system enforces timing (visible countdown) and turn-taking rules.
   - At the end, the learner receives an exam-like report with an estimated CEFR level and component scores (fluency, pronunciation, vocabulary, grammar, coherence).
   - Report is saved and viewable later in the “Results” or “History” section.
   - The system can optionally present a small number of stretch tasks from the next higher level to detect learners whose skills may exceed the nominal exam level.

   **Dependencies:** Assessment rubric definitions, scoring models (LLM-based + heuristic), backend report generation, persistent storage of exam sessions and scores.
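
   To make the template idea concrete, here is a hedged sketch of an exam-template type plus one example instance. The part titles, timings, and the `stretchLevel` flag are illustrative; real templates would follow the local exam format.

   ```ts
   type ExamBand = "A1-A2" | "A2-B1" | "B1-B2";
   type ExamTaskType =
     | "self_presentation"
     | "describe_picture_or_situation"
     | "familiar_topic"
     | "exchange_views"
     | "express_opinion"
     | "argue_position";

   interface ExamPart {
     title: string;
     taskType: ExamTaskType;
     mode: "individual" | "pair";
     durationSeconds: number; // drives the visible countdown
     scored: boolean;         // warm-up parts can be unscored
     stretchLevel?: boolean;  // optional task drawn from the next level up
   }

   interface ExamTemplate {
     band: ExamBand;
     parts: ExamPart[];
   }

   // Illustrative template only; timings are placeholders.
   const mockB2Exam: ExamTemplate = {
     band: "B1-B2",
     parts: [
       { title: "Warm-up", taskType: "self_presentation", mode: "individual", durationSeconds: 120, scored: false },
       { title: "Picture description", taskType: "describe_picture_or_situation", mode: "individual", durationSeconds: 180, scored: true },
       { title: "Taking a position", taskType: "argue_position", mode: "pair", durationSeconds: 300, scored: true },
     ],
   };
   ```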

2. **User Story: Performance Summaries After Each Session**
   - **As a** learner who just completed a session,
   - **I want to** see a concise summary of what I did well and what to improve,
   - **so that** I can focus my next practice and see my progress over time.

   **Acceptance Criteria:**
   - Post-session screen shows key strengths, common errors, and 2–3 prioritized recommendations.
   - Summary highlights examples from the conversation (e.g., misused prepositions, pronunciation errors).
   - Learner can share or export summaries (e.g., PDF or link) where allowed.
   - Summaries contribute to longitudinal analytics (trends by skill over time).

   **Dependencies:** Conversation transcription, error detection pipeline, LLM feedback processing, analytics storage and querying.

### 4. Multilingual Scaffolding & Integrated Translation

1. **User Story: Localized UI and Instructions**
   - **As a** learner with limited proficiency in the target language,
   - **I want to** see the app’s UI and core instructions in my native or preferred language,
   - **so that** I am not blocked by interface comprehension while focusing on speaking practice.

   **Acceptance Criteria:**
   - App supports multiple UI languages with a clear selector during onboarding and in settings.
   - Static text (menus, buttons, error messages) is localized.
   - Critical flows (onboarding, subscription, exam mode) are fully localized.
   - Default UI language is inferred from the device locale but can always be overridden by the user.

   **Dependencies:** Localization/i18n system on frontend and backend, translations management process, design support for longer text variants.

2. **User Story: On-Demand Translations During Practice**
   - **As a** low-confidence speaker,
   - **I want to** quickly translate AI prompts or my own utterances between my language and the target language,
   - **so that** I can stay engaged rather than getting stuck on unknown words.

   **Acceptance Criteria:**
   - In-session controls allow optional translations of AI messages and user messages.
   - Translation support is clearly marked and can be disabled by instructors (to reduce over-reliance).
   - Translation usage is logged for analytics (e.g., frequency by user, session).
   - Translations are fast enough not to break conversational flow.

   **Dependencies:** LLM-based or external translation APIs, usage limits and cost management, UI surface in chat and transcripts.
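
   One way to keep in-session translation from stalling the conversation is a short client-side timeout with a graceful fallback, as in the sketch below. The `/api/translate` route, request shape, and 1.5 s budget are assumptions for illustration only.

   ```ts
   // Hypothetical backend route; the PRD does not fix an endpoint name.
   const TRANSLATE_ENDPOINT = "/api/translate";

   interface TranslateRequest {
     text: string;
     from: string; // e.g. "nb"
     to: string;   // e.g. "en"
   }

   // Abort after a short timeout so the UI can fall back to showing the original text.
   export async function translate(req: TranslateRequest, timeoutMs = 1500): Promise<string | null> {
     const controller = new AbortController();
     const timer = setTimeout(() => controller.abort(), timeoutMs);
     try {
       const res = await fetch(TRANSLATE_ENDPOINT, {
         method: "POST",
         headers: { "Content-Type": "application/json" },
         body: JSON.stringify(req),
         signal: controller.signal,
       });
       if (!res.ok) return null;
       const { translation } = (await res.json()) as { translation: string };
       return translation;
     } catch {
       return null; // timeout or network error: caller keeps the untranslated text
     } finally {
       clearTimeout(timer);
     }
   }
   ```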
|
||||||
|
|
||||||
|
### 5. Progress Tracking, Gamification, and Analytics
|
||||||
|
|
||||||
|
1. **User Story: Personal Progress Dashboard**
|
||||||
|
- **As a** learner targeting a CEFR speaking level (A1–B2),
|
||||||
|
- **I want to** see my progress over time across key skills,
|
||||||
|
- **so that** I stay motivated and know where to focus and, ultimately, reach my target (often B2).
|
||||||
|
|
||||||
|
**Acceptance Criteria:**
|
||||||
|
- Dashboard shows time spent speaking, session count, streaks, and estimated CEFR band over time.
|
||||||
|
- Learner can view trends in specific skill dimensions (fluency, pronunciation, grammar, vocabulary).
|
||||||
|
- Streaks, badges, and milestones are clearly displayed, with rules explained.
|
||||||
|
- Data refreshes near-real-time after a session.
|
||||||
|
|
||||||
|
**Dependencies:** Analytics database structures, data aggregation jobs, frontend charts, privacy/consent handling.
|
||||||
|
|
||||||
|
2. **User Story: Program-Level Reporting (Secondary Persona)**
|
||||||
|
- **As a** coordinator of a small group of learners,
|
||||||
|
- **I want to** see anonymized or per-learner usage and progress,
|
||||||
|
- **so that** I can measure impact and intervene early for learners who are falling behind.
|
||||||
|
|
||||||
|
**Acceptance Criteria:**
|
||||||
|
- Secure, role-based access for coordinators/instructors.
|
||||||
|
- Metrics include active learners, sessions per week, minutes spoken, and average skill trends.
|
||||||
|
- Simple export (CSV or PDF) for reporting.
|
||||||
|
- Data access respects privacy settings and relevant regulations.
|
||||||
|
|
||||||
|
**Dependencies:** Role-based access control, reporting queries, secure data storage and anonymization, UI components for analytics.
|
||||||
|
|
||||||
|
3. **User Story: Gamified Challenges and Rewards**
|
||||||
|
- **As a** learner who struggles to keep a regular speaking habit,
|
||||||
|
- **I want to** earn streaks, badges, and other rewards when I practice,
|
||||||
|
- **so that** I feel motivated to return and build a long-term habit.
|
||||||
|
|
||||||
|
**Acceptance Criteria:**
|
||||||
|
- System tracks daily and weekly speaking activity and calculates streaks based on defined rules (e.g., at least one completed session per day).
|
||||||
|
- Learners can unlock badges or milestones based on clear criteria (e.g., total minutes spoken, number of sessions, mock exams completed).
|
||||||
|
- Gamification status (streaks, badges, milestones) is visible in the dashboard and updates after each session.
|
||||||
|
- All streak and badge rules are documented in-app so they can be tested and verified.
|
||||||
|
|
||||||
|
**Dependencies:** Analytics and event tracking, gamification rules engine or logic, frontend components to display streaks and badges.
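
A small, self-contained sketch of the streak rule named in the acceptance criteria (at least one completed session per day). The grace rule that a streak survives until the end of the following day is an assumption and would need to match the documented in-app rules.

```python
from datetime import date, timedelta


def current_streak(session_dates: set[date], today: date) -> int:
    """Count consecutive days with at least one completed session, ending today
    or yesterday (assumed grace period)."""
    anchor = today if today in session_dates else today - timedelta(days=1)
    streak = 0
    day = anchor
    while day in session_dates:
        streak += 1
        day -= timedelta(days=1)
    return streak


# Example: sessions on three consecutive days yield a streak of 3.
practised = {date(2025, 1, 1), date(2025, 1, 2), date(2025, 1, 3)}
assert current_streak(practised, today=date(2025, 1, 3)) == 3
```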
|
||||||
|
|
||||||
|
### 6. User Accounts, Authentication, and Subscription Management
|
||||||
|
|
||||||
|
1. **User Story: Account Creation and Sign-In**
|
||||||
|
- **As a** new learner,
|
||||||
|
- **I want to** create an account using my email and password (and optionally social login),
|
||||||
|
- **so that** my progress, preferences, and subscriptions are stored securely.
|
||||||
|
|
||||||
|
**Acceptance Criteria:**
|
||||||
|
- Email + password registration with verification flow.
|
||||||
|
- Login with JWT-based sessions; secure password hashing in storage.
|
||||||
|
- Basic account management (confirm email, change email, password, profile data).
|
||||||
|
- Session expiry and logout behaviors are clearly implemented.
|
||||||
|
|
||||||
|
**Dependencies:** FastAPI Users (or equivalent auth library), PostgreSQL `user` table/schema, email service for verification, frontend auth flows.
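
FastAPI Users (or an equivalent library) would normally provide the full registration and login flows; the sketch below only illustrates the underlying primitives for JWT-based sessions and modern password hashing, assuming the PyJWT, passlib, and argon2-cffi packages are available. Secrets and token lifetimes would come from settings, not constants.

```python
from datetime import datetime, timedelta, timezone

import jwt  # PyJWT
from passlib.context import CryptContext

SECRET_KEY = "change-me"            # loaded from environment/settings in practice
ACCESS_TOKEN_TTL = timedelta(hours=1)

pwd_context = CryptContext(schemes=["argon2"], deprecated="auto")


def hash_password(plain: str) -> str:
    return pwd_context.hash(plain)


def verify_password(plain: str, hashed: str) -> bool:
    return pwd_context.verify(plain, hashed)


def create_access_token(user_id: str) -> str:
    expires = datetime.now(timezone.utc) + ACCESS_TOKEN_TTL
    return jwt.encode({"sub": user_id, "exp": expires}, SECRET_KEY, algorithm="HS256")


def decode_access_token(token: str) -> str:
    # Raises jwt.ExpiredSignatureError / jwt.InvalidTokenError on bad tokens,
    # which maps naturally to a 401 response.
    payload = jwt.decode(token, SECRET_KEY, algorithms=["HS256"])
    return payload["sub"]
```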
|
||||||
|
|
||||||
|
2. **User Story: Subscription Plans and Billing**
|
||||||
|
- **As a** serious learner,
|
||||||
|
- **I want to** choose a subscription plan that fits my needs (e.g., free tier, standard, premium),
|
||||||
|
- **so that** I can access the right level of usage and features.
|
||||||
|
|
||||||
|
**Acceptance Criteria:**
|
||||||
|
- Plan definitions (e.g., “Spark,” “Glow,” etc.) with clearly described limits (minutes per month, features like mock exam mode).
|
||||||
|
- Billing integrated with Stripe (or similar) for recurring subscriptions.
|
||||||
|
- System enforces plan limits gracefully (e.g., warn at 80% usage, block after limit with clear upgrade options).
|
||||||
|
- Admin tooling to manage plans and handle refunds/adjustments.
|
||||||
|
|
||||||
|
**Dependencies:** Payment service integration (Stripe), secure webhook handling, backend plan enforcement, accounting/ledger storage.
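
A minimal sketch of graceful plan-limit enforcement as described above. The plan names echo the examples in this PRD, but the minute limits, feature flags, and the 80% warning threshold are illustrative assumptions, not the real plan definitions.

```python
from dataclasses import dataclass
from enum import Enum


class UsageStatus(str, Enum):
    OK = "ok"
    WARN = "warn"        # at or above the warning threshold of the monthly minutes
    BLOCKED = "blocked"  # limit reached; surface clear upgrade options


@dataclass(frozen=True)
class Plan:
    name: str
    minutes_per_month: int
    mock_exams_enabled: bool


# Hypothetical plan definitions; real names and limits come from product config.
PLANS = {
    "spark": Plan("Spark", minutes_per_month=120, mock_exams_enabled=False),
    "glow": Plan("Glow", minutes_per_month=300, mock_exams_enabled=True),
}

WARN_THRESHOLD = 0.8  # assumed 80% warning point


def usage_status(plan: Plan, minutes_used: float) -> UsageStatus:
    if minutes_used >= plan.minutes_per_month:
        return UsageStatus.BLOCKED
    if minutes_used >= WARN_THRESHOLD * plan.minutes_per_month:
        return UsageStatus.WARN
    return UsageStatus.OK
```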
|
||||||
|
|
||||||
|
### 7. Cross-Device Learning Continuity
|
||||||
|
|
||||||
|
1. **User Story: Seamless Device Switching**
|
||||||
|
- **As a** learner who uses both phone and laptop,
|
||||||
|
- **I want to** continue my learning across devices without losing progress,
|
||||||
|
- **so that** I can practice whenever and wherever it’s convenient.
|
||||||
|
|
||||||
|
**Acceptance Criteria:**
|
||||||
|
- Sessions, progress, and settings are stored server-side and synced across devices.
|
||||||
|
- Resume-last-lesson feature available on login.
|
||||||
|
- PWA support on mobile for near-native experience and offline access to limited features (where feasible).
|
||||||
|
- Conflict-handling behaviors are defined (e.g., two devices active at once).
|
||||||
|
|
||||||
|
**Dependencies:** Next.js PWA configuration, centralized state in backend, device/session tracking, secure token handling.
|
||||||
|
|
||||||
|
2. **User Story: Consistent Tutor Experience Across Devices**
|
||||||
|
- **As a** learner who sometimes uses headphones and sometimes speakers,
|
||||||
|
- **I want to** have a consistent AI tutor voice and behavior on all my devices,
|
||||||
|
- **so that** my listening practice is predictable and comfortable.
|
||||||
|
|
||||||
|
**Acceptance Criteria:**
|
||||||
|
- Tutor voice selection (gender, regional accent) is stored in the user profile and applied to all new sessions on any device.
|
||||||
|
- When a learner changes voice settings on one device, the change is reflected on other devices within one session or logout/login cycle.
|
||||||
|
- At least two distinct tutor voice options are available at launch; more can be added later without breaking existing settings.
|
||||||
|
- A simple test script or admin view can confirm which voice configuration is currently active for a given user.
|
||||||
|
|
||||||
|
**Dependencies:** Voice provider configuration, user profile settings for voice, frontend settings UI, backend APIs for voice preference storage and retrieval.
|
||||||
|
|
||||||
|
### 8. AI-Assisted Curriculum and Lesson Authoring
|
||||||
|
|
||||||
|
1. **User Story: Instructor-Designed Lessons with AI Support**
|
||||||
|
- **As a** language instructor or admin,
|
||||||
|
- **I want to** design and manage lessons for each CEFR level (A1–B2) with AI support, using documents or images as the basis for lessons,
|
||||||
|
- **so that** I can efficiently create high-quality, curriculum-aligned speaking practice tailored to my learners.
|
||||||
|
|
||||||
|
**Acceptance Criteria:**
|
||||||
|
- Instructors can upload documents and images (e.g., forms, articles, exam prompts, everyday photos) into the system.
|
||||||
|
- The backend parses and indexes uploaded material via a document processing and embedding pipeline so that AI can reference it during lessons.
|
||||||
|
- Instructors can select target level(s), objectives, and exam formats when creating or editing a lesson.
|
||||||
|
- AI suggests lesson structures, prompts, and example dialogues that instructors can review and modify before publishing.
|
||||||
|
- Lessons are stored with metadata (level, skills, topics, exam parts) and become available in the learner curriculum and mock exams.
|
||||||
|
|
||||||
|
**Dependencies:** Document upload and processing services, LLM-based content generation, instructor/admin UI, PostgreSQL + pgvector storage.
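
A small sketch of the ingestion side of this pipeline: chunking parsed text for embedding and comparing embeddings by cosine similarity. In production the embeddings would live in a pgvector column and similarity would be computed in SQL (for example with the `<=>` cosine-distance operator); the chunk size and overlap below are illustrative assumptions.

```python
import math


def chunk_text(text: str, max_chars: int = 800, overlap: int = 100) -> list[str]:
    """Split parsed document text into overlapping chunks for embedding."""
    chunks, start = [], 0
    while start < len(text):
        end = min(start + max_chars, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        start = end - overlap  # keep some overlap so context is not lost at boundaries
    return chunks


def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0
```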
|
||||||
|
|
||||||
|
2. **User Story: Learner-Generated Lessons from Uploaded Material**
|
||||||
|
- **As a** learner,
|
||||||
|
- **I want to** upload documents or images that are relevant to my life or exams and have the AI tutor build a lesson around them,
|
||||||
|
- **so that** my practice feels directly useful and is adapted to my current level (A1–B2).
|
||||||
|
|
||||||
|
**Acceptance Criteria:**
|
||||||
|
- Learners can upload files (e.g., work documents, letters from authorities, school forms, pictures from daily life) from web or mobile.
|
||||||
|
- System detects or uses the learner’s current CEFR level to adapt the conversation difficulty and grammar focus appropriately.
|
||||||
|
- AI tutor uses the uploaded material as shared context (e.g., refers to specific sections of a document or objects in an image) during the lesson.
|
||||||
|
- Uploaded content is stored securely, scoped to the learner/account or organization according to configuration and privacy requirements.
|
||||||
|
|
||||||
|
**Dependencies:** Same document ingestion pipeline as instructor authoring, user-facing upload UI, LLM prompts conditioned on user level and uploaded context.
|
||||||
|
|
||||||
|
### 9. Persistent Conversations, Transcripts, and Tutor Greetings
|
||||||
|
|
||||||
|
1. **User Story: Persistent Conversation History and Context Loading**
|
||||||
|
- **As a** returning learner,
|
||||||
|
- **I want to** have my previous conversations, transcripts, and progress persisted and used to initialize new lessons,
|
||||||
|
- **so that** the AI tutor can pick up where we left off and provide a sense of continuity.
|
||||||
|
|
||||||
|
**Acceptance Criteria:**
|
||||||
|
- Audio from both the user and the AI tutor is always transcribed and stored persistently with each session (subject to retention and privacy policies).
|
||||||
|
- Each session stores metadata including date, mode (lesson, mock exam, free conversation), level, topics, and key performance indicators.
|
||||||
|
- The backend exposes an endpoint (e.g., `/sessions/default`) that returns or creates a persistent conversational session containing historical summaries and progress context.
|
||||||
|
- When a user starts a new lesson, the AI tutor’s context includes a short summary of recent sessions plus key goals and challenges.
|
||||||
|
|
||||||
|
**Dependencies:** Session and transcript storage in PostgreSQL + pgvector, summarization logic in backend LLM services, session management API, LiveKit session orchestration.
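
To make the `/sessions/default` behavior concrete, here is a hedged FastAPI sketch that returns or creates a persistent session context. The in-memory store and field names are stand-ins for the PostgreSQL-backed models, and user resolution from the auth token is omitted.

```python
import uuid

from fastapi import APIRouter
from pydantic import BaseModel

router = APIRouter(prefix="/sessions", tags=["sessions"])

# In-memory stand-in for the persistent Session table (PostgreSQL in the real system).
_SESSIONS: dict[str, dict] = {}


class SessionContext(BaseModel):
    session_id: str
    mode: str                  # "lesson" | "mock_exam" | "free"
    summary: str               # rolling summary of recent activity
    suggested_next_step: str


def _get_or_create(user_id: str) -> dict:
    if user_id not in _SESSIONS:
        _SESSIONS[user_id] = {
            "session_id": str(uuid.uuid4()),
            "mode": "lesson",
            "summary": "First session; no history yet.",
            "suggested_next_step": "Start with a short warm-up conversation.",
        }
    return _SESSIONS[user_id]


@router.get("/default", response_model=SessionContext)
async def get_default_session(user_id: str = "demo-user") -> SessionContext:
    # Real implementation: resolve the user from the auth token and load the
    # persistent session, with its stored summary, from PostgreSQL.
    return SessionContext(**_get_or_create(user_id))
```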
|
||||||
|
|
||||||
|
2. **User Story: Contextual Greeting on Login**
|
||||||
|
- **As a** returning learner,
|
||||||
|
- **I want to** hear a short spoken greeting from the AI tutor that reminds me where I left off previously,
|
||||||
|
- **so that** I immediately know what I was working on and can resume with confidence.
|
||||||
|
|
||||||
|
**Acceptance Criteria:**
|
||||||
|
- After login and reconnecting to the conversational session, the AI tutor greets the user verbally and gives a brief, level-appropriate summary of their most recent activity and suggested next step.
|
||||||
|
- Greeting content is generated from stored summaries and progress records, not from scratch each time.
|
||||||
|
- Learners can adjust how much historical detail is included (e.g., “short summary only” vs. “more detailed recap”).
|
||||||
|
|
||||||
|
**Dependencies:** Same as for persistent conversation history; frontend behavior to play greeting early in the session and surface a text version of the summary.
|
||||||
|
|
||||||
|
### 10. Health Checks and Admin Observability
|
||||||
|
|
||||||
|
1. **User Story: Backend Health Check Endpoint**
|
||||||
|
- **As a** platform operator,
|
||||||
|
- **I want to** have standard health check endpoints for liveness, readiness, and detailed status,
|
||||||
|
- **so that** CI/CD pipelines, uptime monitors, and dashboards can verify that the API is running, ready, and healthy.
|
||||||
|
|
||||||
|
**Acceptance Criteria:**
|
||||||
|
- Backend exposes three unauthenticated endpoints:
|
||||||
|
- `GET /health/live` returns HTTP 200 and body `"live"` when the process is running.
|
||||||
|
- `GET /health/ready` returns HTTP 200 and body `"ready"` when critical dependencies are OK, and HTTP 503 otherwise.
|
||||||
|
- `GET /health` returns a JSON body with an overall status field and per-component checks, and uses HTTP 200 for `"pass"` and HTTP 503 for `"fail"`.
|
||||||
|
- Health endpoints are used in deployment pipelines and monitoring (e.g., `https://api.<domain>/health`, `/health/ready`, `/health/live`).
|
||||||
|
- All three endpoints are lightweight enough to be polled on the order of seconds without impacting users.
|
||||||
|
|
||||||
|
**Dependencies:** FastAPI route for health checks, integration with basic internal dependency checks (DB, LiveKit, LLM connectivity where feasible).
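
A minimal FastAPI sketch of the three endpoints described above. The dependency checks are placeholder functions, not the project's actual integrations, and would be replaced by a real database query and a LiveKit/LLM reachability probe.

```python
from fastapi import APIRouter, Response, status

router = APIRouter(prefix="/health", tags=["health"])


async def check_database() -> bool:
    # Placeholder: replace with a real "SELECT 1" against PostgreSQL.
    return True


async def check_livekit() -> bool:
    # Placeholder: replace with a lightweight ping of the LiveKit server.
    return True


@router.get("/live")
async def live() -> str:
    # Process is up; no dependency checks.
    return "live"


@router.get("/ready")
async def ready(response: Response) -> str:
    if await check_database() and await check_livekit():
        return "ready"
    response.status_code = status.HTTP_503_SERVICE_UNAVAILABLE
    return "not ready"


@router.get("")
async def health(response: Response) -> dict:
    checks = {"database": await check_database(), "livekit": await check_livekit()}
    ok = all(checks.values())
    if not ok:
        response.status_code = status.HTTP_503_SERVICE_UNAVAILABLE
    return {"status": "pass" if ok else "fail", "checks": checks}
```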
|
||||||
|
|
||||||
|
2. **User Story: Admin Health Dashboard in Frontend**
|
||||||
|
- **As an** admin or operator,
|
||||||
|
- **I want to** view a dashboard in the frontend showing the health of core components,
|
||||||
|
- **so that** I can quickly detect and diagnose issues without logging into servers directly.
|
||||||
|
|
||||||
|
**Acceptance Criteria:**
|
||||||
|
- Frontend provides an admin-only view that aggregates health data for frontend, backend, database, LiveKit, and external LLM APIs.
|
||||||
|
- Dashboard polls or subscribes to backend health endpoints and visualizes status (e.g., up/down, latency, last check time).
|
||||||
|
- Critical issues are highlighted and optionally surfaced as alerts/notifications.
|
||||||
|
|
||||||
|
**Dependencies:** Backend health endpoints and metrics, role-based access control, frontend admin UI components.
|
||||||
|
|
||||||
|
## Non-Functional Requirements (Technical)
|
||||||
|
|
||||||
|
### Frontend Requirements
|
||||||
|
|
||||||
|
**Supported Browsers/Devices:**
|
||||||
|
|
||||||
|
- Desktop: Latest 2 versions of Chrome, Firefox, Safari, and Edge.
|
||||||
|
- Mobile: Latest 2 major versions of iOS and Android (Safari/Chrome), including PWA install support.
|
||||||
|
- Minimum viewport: responsive layouts down to 360px width.
|
||||||
|
|
||||||
|
**Design/UI:**
|
||||||
|
|
||||||
|
- Voice-first interaction prioritized: microphone and conversation views are obvious and usable with one hand on mobile.
|
||||||
|
- Consistent brand identity for Avaaz across web and mobile (colors, typography, logo).
|
||||||
|
- Both dark and light modes are supported for accessibility and comfort.
|
||||||
|
- UI strings and backend error messages are localized together; language selection is a first-class setting.
|
||||||
|
- UI components built in React/Next.js with Tailwind or equivalent utility-first styling, following design system guidelines (buttons, forms, cards, modals).
|
||||||
|
|
||||||
|
### Backend & Database Requirements
|
||||||
|
|
||||||
|
**API Specifications:**
|
||||||
|
|
||||||
|
- RESTful JSON APIs served by a FastAPI backend for core operations: authentication, user management, lessons, sessions, progress, subscriptions.
|
||||||
|
- Real-time endpoints for voice and agent control:
|
||||||
|
- LiveKit signaling endpoints (`/sessions/default`, `/sessions/default/token` and equivalents).
|
||||||
|
- WebSocket or WebRTC connections from backend to LLM realtime APIs (OpenAI Realtime, Gemini Live).
|
||||||
|
- API documentation exposed via OpenAPI/Swagger (and/or equivalent documentation tooling).
|
||||||
|
- All APIs versioned (e.g., `/api/v1/...`) with change management.
|
||||||
|
|
||||||
|
**Database Schema (High-Level):**
|
||||||
|
|
||||||
|
- PostgreSQL with pgvector for semantic search and embeddings.
|
||||||
|
- Core entities include (indicative, not exhaustive; see the model sketch after this list):
|
||||||
|
- `User` (profile, locale, level, subscription plan).
|
||||||
|
- `Session` (conversation metadata, timestamps, mode, links to transcripts).
|
||||||
|
- `Lesson` / `Scenario` (curriculum structure, CEFR mapping).
|
||||||
|
- `ProgressSnapshot` (aggregated metrics per user over time).
|
||||||
|
- `Subscription` / `Payment` (plan, billing status, Stripe references).
|
||||||
|
- `Embedding` / `Document` (semantic chunks for search and content retrieval).
|
||||||
|
- Migrations managed via Alembic, with reproducible dev/prod schemas.
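
An indicative SQLAlchemy 2.0 sketch of three of the entities listed above, using the `pgvector` SQLAlchemy type for embeddings. Column choices and the embedding dimension are assumptions, not the final schema, and Alembic would generate the corresponding migrations.

```python
from datetime import datetime

from pgvector.sqlalchemy import Vector
from sqlalchemy import DateTime, ForeignKey, String, Text
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):
    pass


class User(Base):
    __tablename__ = "user"

    id: Mapped[int] = mapped_column(primary_key=True)
    email: Mapped[str] = mapped_column(String(320), unique=True)
    locale: Mapped[str] = mapped_column(String(10), default="en")
    target_level: Mapped[str] = mapped_column(String(2), default="B2")  # A1-B2


class Session(Base):
    __tablename__ = "session"

    id: Mapped[int] = mapped_column(primary_key=True)
    user_id: Mapped[int] = mapped_column(ForeignKey("user.id"))
    mode: Mapped[str] = mapped_column(String(20))   # lesson | mock_exam | free
    started_at: Mapped[datetime] = mapped_column(DateTime(timezone=True))
    summary: Mapped[str] = mapped_column(Text, default="")


class DocumentChunk(Base):
    __tablename__ = "document_chunk"

    id: Mapped[int] = mapped_column(primary_key=True)
    document_id: Mapped[int] = mapped_column()
    content: Mapped[str] = mapped_column(Text)
    embedding: Mapped[list[float]] = mapped_column(Vector(1536))  # dimension is an assumption
```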
|
||||||
|
|
||||||
|
**Security:**
|
||||||
|
|
||||||
|
- All traffic between client and server encrypted via HTTPS (Caddy as reverse proxy with automatic TLS).
|
||||||
|
- Authentication via JWT or session tokens implemented with FastAPI Users (or equivalent), with configurable token lifetimes and refresh flows.
|
||||||
|
- Passwords stored with modern hashing algorithms (e.g., Argon2, bcrypt).
|
||||||
|
- Role-based access control (e.g., learner, coordinator, admin) for sensitive features (analytics, content management).
|
||||||
|
- Strict input validation and output encoding following OWASP best practices.
|
||||||
|
- Secrets stored securely (e.g., environment variables, secret manager), never hard-coded in the repository.
|
||||||
|
- Rate limiting, abuse detection, and monitoring around critical endpoints.
|
||||||
|
|
||||||
|
### Deployment & CI/CD
|
||||||
|
|
||||||
|
- Production deployment uses Docker-based stacks on a single VPS, with a separate infra stack (`caddy`, `gitea`, `gitea-runner`) and app stack (`frontend`, `backend`, `postgres`, `livekit`) defined in version-controlled Compose files.
|
||||||
|
- Caddy terminates TLS for all public domains and routes traffic to the correct internal services (frontend, backend, LiveKit, Gitea) over a shared Docker network.
|
||||||
|
- A Gitea Actions-based CI pipeline runs on each feature branch and pull request, executing backend/frontend tests, static analysis, and image builds, and must pass before merge to `main`.
|
||||||
|
- A tag-based CD pipeline (tags matching `v*` on `main`) builds production images and redeploys the app stack on the VPS in a controlled way, minimizing downtime.
|
||||||
|
- CI/CD workflows are themselves versioned in the repository so changes to validation or deployment steps are reviewable and reproducible.
|
||||||
|
|
||||||
|
### Performance, Reliability, and Scalability
|
||||||
|
|
||||||
|
**Performance:**
|
||||||
|
|
||||||
|
- Target median API response time: < 200 ms for standard JSON endpoints under normal load.
|
||||||
|
- Voice interaction round-trip (user speaks → AI responds) tuned for natural conversation with minimal perceived delay; target < 1.5 seconds for most responses.
|
||||||
|
- System supports multiple concurrent voice sessions per LiveKit instance and scales horizontally as needed.
|
||||||
|
- Efficient use of LLM realtime APIs with streaming responses and graceful handling of network jitter.
|
||||||
|
|
||||||
|
**Reliability & Availability:**
|
||||||
|
|
||||||
|
- Initial production target availability: ≥ 99.5%, with a path to ≥ 99.9% as usage grows.
|
||||||
|
- Health checks for all containers (frontend, backend, LiveKit, Postgres) integrated with Docker Compose and any orchestration layer, plus the explicit backend `/health` endpoint and frontend admin dashboard described above.
|
||||||
|
- Graceful degradation: if LLM APIs or LiveKit are temporarily unavailable, the system shows clear messaging to learners and surfaces status indicators in the admin dashboard.
|
||||||
|
- Regular automated backups of PostgreSQL and configuration; tested restore procedures.
|
||||||
|
|
||||||
|
**Scalability:**
|
||||||
|
|
||||||
|
- Docker-based deployment on a production VPS, with clear separation between infra stack (Caddy, Gitea) and app stack (frontend, backend, LiveKit, Postgres).
|
||||||
|
- Horizontal scaling supported for stateless services (frontend, backend, LiveKit) and vertical scaling for PostgreSQL as needed.
|
||||||
|
- Efficient connection pooling for database access.
|
||||||
|
- Architecture designed to move from single VPS to managed services or Kubernetes in future without large rewrites.
|
||||||
|
|
||||||
|
**Technical Specifications:**
|
||||||
|
|
||||||
|
- **Frontend:** Next.js (React, TypeScript), Tailwind or equivalent, PWA enabled; communicates with backend via HTTPS and LiveKit via WebRTC.
|
||||||
|
- **Backend:** FastAPI (Python), Uvicorn/Gunicorn, Pydantic for validation, structured services for LLM, payments, and documents.
|
||||||
|
- **Real-time/Media:** LiveKit server for WebRTC signaling and media; integration with LiveKit Agent framework for AI tutor.
|
||||||
|
- **Database:** PostgreSQL + pgvector; migrations via Alembic.
|
||||||
|
- **LLM Providers:** OpenAI Realtime API, Google Gemini Live API (WebSocket/WebRTC).
|
||||||
|
- **Infra:** Caddy reverse proxy, Docker Compose for local and production stacks, Gitea + Actions for CI/CD.
|
||||||
|
- **Testing/Quality:** Pytest, Hypothesis, httpx for API testing, Ruff and Pyright for linting and static analysis, ESLint for frontend.
|
||||||
|
|
||||||
|
**Accessibility:**
|
||||||
|
|
||||||
|
- Compliance target: WCAG 2.1 AA for web UI.
|
||||||
|
- All key actions accessible via keyboard and screen readers.
|
||||||
|
- Sufficient color contrast and scalable font sizes.
|
||||||
|
- Voice-first design complemented by transcripts and captions; learners can read as well as listen.
|
||||||
|
- Consideration for hearing- or speech-impaired users where feasible (e.g., text-only practice, adjustable speech rate).
|
||||||
|
|
||||||
|
## Metrics & Release Plan
|
||||||
|
|
||||||
|
**Success Metrics (KPIs):**
|
||||||
|
|
||||||
|
- **Learning Outcomes:**
|
||||||
|
- ≥ 60% of learners who complete a defined program (e.g., 30+ speaking sessions) report increased speaking confidence.
|
||||||
|
- ≥ 50% of learners who use Avaaz consistently (e.g., 3+ sessions/week for 8 weeks) pass the B2 oral exam on their first or second attempt.
|
||||||
|
- **Engagement:**
|
||||||
|
- Weekly active learners (WAL) growth rate.
|
||||||
|
- Median speaking minutes per active learner per week.
|
||||||
|
- Retention rates (e.g., 4-week and 12-week).
|
||||||
|
- **Product Quality:**
|
||||||
|
- Average session rating / NPS for speaking sessions.
|
||||||
|
- Error rates and crash-free sessions on mobile/web.
|
||||||
|
- Latency metrics for voice interactions.
|
||||||
|
|
||||||
|
**Timeline & Milestones:**
|
||||||
|
|
||||||
|
- **Phase 1 – Foundation (M0–M2):**
|
||||||
|
- Implement core architecture (backend, frontend, LiveKit, LLM integrations).
|
||||||
|
- Basic authentication, user accounts, and minimal speaking session flow.
|
||||||
|
- Internal alpha with team and close collaborators.
|
||||||
|
- **Phase 2 – Beta Learning Experience (M3–M4):**
|
||||||
|
- CEFR-aligned curriculum MVP, immigrant-focused scenarios, post-session summaries.
|
||||||
|
- Progress dashboard and early gamification (streaks, minutes).
|
||||||
|
- Invite-only beta with small learner cohorts; collect qualitative and quantitative feedback.
|
||||||
|
- **Phase 3 – Exam & Scale Readiness (M5–M6):**
|
||||||
|
- Mock B2 exam mode, robust assessment reports.
|
||||||
|
- Subscription plans and billing.
|
||||||
|
- Production hardening (observability, backups, reliability SLOs).
|
||||||
|
- Public launch in initial target market(s).
|
||||||
|
|
||||||
|
**Release Criteria:**
|
||||||
|
|
||||||
|
- Core features of voice-first lessons, CEFR-aligned curriculum, post-session feedback, and at least one full mock exam template are stable and usable.
|
||||||
|
- User authentication, subscription management, and payment flows validated in staging and production.
|
||||||
|
- System meets agreed performance thresholds (latency, error rates) under expected early-production load.
|
||||||
|
- No open critical security vulnerabilities; penetration testing and reviews completed for auth, payments, and data storage.
|
||||||
|
- Documentation available for learners (help center) and internal teams (runbooks, API docs).
|
||||||
|
|
||||||
|
**Potential Risks & Assumptions:**
|
||||||
|
|
||||||
|
- **Risks:**
|
||||||
|
- Dependence on external LLM realtime APIs and their SLAs, pricing, and model changes.
|
||||||
|
- WebRTC and audio performance may vary across networks and devices, impacting perceived quality.
|
||||||
|
- Assessment accuracy (CEFR-level estimates) may not initially match human examiner judgments, affecting learner trust.
|
||||||
|
- Regulatory or data privacy constraints (e.g., storing voice data, cross-border data flows) may impact certain markets.
|
||||||
|
- **Assumptions:**
|
||||||
|
- Learners have access to a smartphone or laptop with a microphone and an internet connection stable enough for audio sessions.
|
||||||
|
- LLM providers continue to support low-latency realtime APIs suitable for spoken dialogue.
|
||||||
|
- Target institutions and exam boards accept AI-supported practice tools as preparation, even if they do not formally endorse them.
|
||||||
|
- Initial go-to-market focuses on a limited set of language pairs (e.g., English → Norwegian Bokmål) with potential expansion later.
|
||||||
@@ -5,7 +5,7 @@ Below is a summary of the **Production VPS** and **Development Laptop** architec
|
|||||||
```mermaid
|
```mermaid
|
||||||
flowchart LR
|
flowchart LR
|
||||||
%% Client
|
%% Client
|
||||||
A(Browser / PWA)
|
A(Browser)
|
||||||
Y(iOS App / Android App)
|
Y(iOS App / Android App)
|
||||||
|
|
||||||
subgraph User
|
subgraph User
|
||||||
@@ -27,7 +27,7 @@ flowchart LR
|
|||||||
I(Gitea + Actions + Repositories)
|
I(Gitea + Actions + Repositories)
|
||||||
J(Gitea Runner)
|
J(Gitea Runner)
|
||||||
|
|
||||||
D(Next.js Frontend)
|
D(React/Next.js/Tailwind Frontend)
|
||||||
E(FastAPI Backend + Agent Runtime)
|
E(FastAPI Backend + Agent Runtime)
|
||||||
G(LiveKit Server)
|
G(LiveKit Server)
|
||||||
H[(PostgreSQL + pgvector)]
|
H[(PostgreSQL + pgvector)]
|
||||||
@@ -115,7 +115,7 @@ flowchart LR
|
|||||||
|
|
||||||
#### Infra Stack
|
#### Infra Stack
|
||||||
|
|
||||||
Docker Compose from the `avaaz-infra` Git repository is cloned to `/srv/infra/docker-compose.yml` on the VPS.
|
Docker Compose from `./infra/docker-compose.yml` is cloned to `/srv/infra/docker-compose.yml` on the VPS.
|
||||||
|
|
||||||
| Container | Description |
|
| Container | Description |
|
||||||
| -------------- | ----------------------------------------------------------------------------------- |
|
| -------------- | ----------------------------------------------------------------------------------- |
|
||||||
@@ -125,16 +125,16 @@ Docker Compose from the `avaaz-infra` Git repository is cloned to `/srv/infra/do
|
|||||||
|
|
||||||
#### App Stack
|
#### App Stack
|
||||||
|
|
||||||
Docker Compose from the `avaaz-app` Git repository is cloned to `/srv/app/docker-compose.yml` on the VPS.
|
Docker Compose from `./app/docker-compose.yml` is cloned to `/srv/app/docker-compose.yml` on the VPS.
|
||||||
|
|
||||||
| Container | Description |
|
| Container | Description |
|
||||||
| ---------- | ----------------------------------------------------------------------------------------- |
|
| ---------- | ----------------------------------------------------------------------------------------- |
|
||||||
| `frontend` | **Next.js Frontend** – SPA/PWA interface served from a Node.js-based Next.js server. |
|
| `frontend` | **React/Next.js/Tailwind Frontend** – SPA interface served from a Node.js-based Next.js server. |
|
||||||
| `backend` | **FastAPI + Uvicorn Backend** – API, auth, business logic, LiveKit orchestration, agent. |
|
| `backend` | **FastAPI + Uvicorn Backend** – API, auth, business logic, LiveKit orchestration, agent. |
|
||||||
| `postgres` | **PostgreSQL + pgvector** – Persistent relational database with vector search. |
|
| `postgres` | **PostgreSQL + pgvector** – Persistent relational database with vector search. |
|
||||||
| `livekit` | **LiveKit Server** – WebRTC signaling plus UDP media for real-time audio and data. |
|
| `livekit` | **LiveKit Server** – WebRTC signaling plus UDP media for real-time audio and data. |
|
||||||
|
|
||||||
The `backend` uses several Python packages such as UV, Ruff, FastAPI, FastAPI Users, FastAPI-pagination, FastStream, Pydantic, PydanticAI, Pydantic-settings, LiveKit Agent, Google Gemini Live API, OpenAI Realtime API, SQLAlchemy, Alembic, docling, Gunicorn, Uvicorn[standard], Pyright, Pytest, Hypothesis, and Httpx to deliver the services.
|
The `backend` uses several Python packages such as UV, Ruff, FastAPI, FastAPI Users, FastAPI-pagination, FastStream, FastMCP, Pydantic, PydanticAI, Pydantic-settings, LiveKit Agent, Google Gemini Live API, OpenAI Realtime API, SQLAlchemy, Alembic, docling, Gunicorn, Uvicorn[standard], Pyright, Pytest, Hypothesis, and Httpx to deliver the services.
|
||||||
|
|
||||||
### 1.2 Network
|
### 1.2 Network
|
||||||
|
|
||||||
@@ -152,7 +152,7 @@ The `backend` uses several Python packages such as UV, Ruff, FastAPI, FastAPI Us
|
|||||||
| -------------------- | :---------: | -------------- | -------------------------------- |
|
| -------------------- | :---------: | -------------- | -------------------------------- |
|
||||||
| **www\.avaaz\.ai** | CNAME | avaaz.ai | Marketing / landing site |
|
| **www\.avaaz\.ai** | CNAME | avaaz.ai | Marketing / landing site |
|
||||||
| **avaaz.ai** | A | 217.154.51.242 | Root domain |
|
| **avaaz.ai** | A | 217.154.51.242 | Root domain |
|
||||||
| **app.avaaz.ai** | A | 217.154.51.242 | Next.js frontend (SPA/PWA) |
|
| **app.avaaz.ai** | A | 217.154.51.242 | React/Next.js/Tailwind frontend (SPA) |
|
||||||
| **api.avaaz.ai** | A | 217.154.51.242 | FastAPI backend |
|
| **api.avaaz.ai** | A | 217.154.51.242 | FastAPI backend |
|
||||||
| **rtc.avaaz.ai** | A | 217.154.51.242 | LiveKit signaling + media |
|
| **rtc.avaaz.ai** | A | 217.154.51.242 | LiveKit signaling + media |
|
||||||
| **git.avaaz.ai** | A | 217.154.51.242 | Gitea (HTTPS + SSH) |
|
| **git.avaaz.ai** | A | 217.154.51.242 | Gitea (HTTPS + SSH) |
|
||||||
@@ -448,7 +448,7 @@ The user experiences this as a **continuous, ongoing session** with seamless rec
|
|||||||
|
|
||||||
#### App Stack (local Docker)
|
#### App Stack (local Docker)
|
||||||
|
|
||||||
- `frontend` (Next.js SPA)
|
- `frontend` (React/Next.js/Tailwind SPA)
|
||||||
- `backend` (FastAPI)
|
- `backend` (FastAPI)
|
||||||
- `postgres` (PostgreSQL + pgvector)
|
- `postgres` (PostgreSQL + pgvector)
|
||||||
- `livekit` (local LiveKit Server)
|
- `livekit` (local LiveKit Server)
|
||||||
@@ -465,7 +465,7 @@ No Caddy is deployed locally; the browser talks directly to the mapped container
|
|||||||
|
|
||||||
Local development uses:
|
Local development uses:
|
||||||
|
|
||||||
- `http://localhost:3000` → frontend (Next.js dev/server container)
|
- `http://localhost:3000` → frontend (React/Next.js/Tailwind dev/server container)
|
||||||
- `http://localhost:8000` → backend API (FastAPI)
|
- `http://localhost:8000` → backend API (FastAPI)
|
||||||
- Example auth/session endpoints:
|
- Example auth/session endpoints:
|
||||||
- `POST http://localhost:8000/auth/login`
|
- `POST http://localhost:8000/auth/login`
|
||||||
@@ -480,7 +480,7 @@ No `/etc/hosts` changes or TLS certificates are required; `localhost` acts as a
|
|||||||
|
|
||||||
| Port | Protocol | Purpose |
|
| Port | Protocol | Purpose |
|
||||||
|-------------:|:--------:|------------------------------------|
|
|-------------:|:--------:|------------------------------------|
|
||||||
| 3000 | TCP | Frontend (Next.js) |
|
| 3000 | TCP | Frontend (React/Next.js/Tailwind) |
|
||||||
| 8000 | TCP | Backend API (FastAPI) |
|
| 8000 | TCP | Backend API (FastAPI) |
|
||||||
| 5432 | TCP | Postgres + pgvector |
|
| 5432 | TCP | Postgres + pgvector |
|
||||||
| 7880 | TCP | LiveKit HTTP + WS signaling |
|
| 7880 | TCP | LiveKit HTTP + WS signaling |
|
||||||
|
|||||||
@@ -9,6 +9,7 @@ The following command will delete ALL Docker data, including stopped containers,
|
|||||||
|
|
||||||
```bash
|
```bash
|
||||||
cd /srv/infra
|
cd /srv/infra
|
||||||
|
sudo docker stop $(sudo docker ps -a -q) && sudo docker rm $(sudo docker ps -a -q)
|
||||||
sudo docker compose down -v --rmi all --remove-orphans
|
sudo docker compose down -v --rmi all --remove-orphans
|
||||||
```
|
```
|
||||||
|
|
||||||
|
|||||||
@@ -3,11 +3,12 @@
|
|||||||
* [GitHub Flow](https://dev.to/karmpatel/git-branching-strategies-a-comprehensive-guide-24kh) branching strategy is used.
|
* [GitHub Flow](https://dev.to/karmpatel/git-branching-strategies-a-comprehensive-guide-24kh) branching strategy is used.
|
||||||
* Direct push to `main` branch is prohibited.
|
* Direct push to `main` branch is prohibited.
|
||||||
* Only merges to the `main` branch via Pull Requests from `feature/...` or `bugfix/...` branches are allowed.
|
* Only merges to the `main` branch via Pull Requests from `feature/...` or `bugfix/...` branches are allowed.
|
||||||
* Tags are created for releases on the `main` branch.
|
* Tags `v*` are created for releases on the `main` branch.
|
||||||
|
* Gitea configuration is available in Site Administration, User Settings, and Repository Settings.
|
||||||
|
|
||||||
## Pull Request
|
## Pull Request
|
||||||
|
|
||||||
1. Ensure your main branch is protected, so that direct push is disabled.
|
1. Ensure your main branch is protected on Gitea configuration, so that direct push is disabled.
|
||||||
|
|
||||||
2. Update the local main branch.
|
2. Update the local main branch.
|
||||||
|
|
||||||
@@ -68,10 +69,10 @@
|
|||||||
git push origin --delete feature/new-branch
|
git push origin --delete feature/new-branch
|
||||||
```
|
```
|
||||||
|
|
||||||
13. Create a new branch for upcoming work, for example `feature/dev`.
|
13. Create a new branch for upcoming work, for example `feature/new-branch-2`.
|
||||||
|
|
||||||
```bash
|
```bash
|
||||||
git checkout -b feature/dev
|
git checkout -b feature/new-branch-2
|
||||||
```
|
```
|
||||||
|
|
||||||
## Troubleshooting
|
## Troubleshooting
|
||||||
|
|||||||
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
59
docs/norwegian/oppgaver til muntlig prøven.md
Normal file
59
docs/norwegian/oppgaver til muntlig prøven.md
Normal file
@@ -0,0 +1,59 @@
|
|||||||
|
# What the tasks on the oral exam look like
|
||||||
|
|
||||||
|
## Give a short introduction of yourself (on the A1–A2 exam)
|
||||||
|
|
||||||
|
* This is an individual task.
|
||||||
|
* You introduce yourself briefly, and you choose which information to include.
|
||||||
|
* The task is always the same.
|
||||||
|
* You speak for approx. 1-2 minutes.
|
||||||
|
|
||||||
|
## Describe a picture (on the A1–A2 exam)
|
||||||
|
|
||||||
|
* This is an individual task.
|
||||||
|
* You are shown a picture with several people and things happening, and you describe what you see in the picture and what the people in it are doing.
|
||||||
|
* The examiner may ask you follow-up questions.
|
||||||
|
* You speak for approx. 2-3 minutes.
|
||||||
|
|
||||||
|
## Talk together about a topic (on the A1–A2 exam)
|
||||||
|
|
||||||
|
* This is a conversation task in which two candidates talk together.
|
||||||
|
* The examiner gives you a question about a topic from everyday life, for example school, leisure, work, family, food, weather, and so on.
|
||||||
|
* The examiner may ask you follow-up questions.
|
||||||
|
* You speak for approx. 2-3 minutes.
|
||||||
|
|
||||||
|
## Talk about a topic (on the A1–A2 and A2–B1 exams)
|
||||||
|
|
||||||
|
* This is an individual task.
|
||||||
|
* The examiner gives you a question about a topic from everyday life that you talk about, for example school, leisure, work, family, food, weather, and so on.
|
||||||
|
* The examiner may ask you follow-up questions.
|
||||||
|
* You speak for approx. 2-3 minutes.
|
||||||
|
|
||||||
|
## Talk together about a topic (on the A2–B1 exam)
|
||||||
|
|
||||||
|
* This is a conversation task in which two candidates talk together.
|
||||||
|
* You are given a question or an issue that you discuss together.
|
||||||
|
* The examiner may ask you follow-up questions.
|
||||||
|
* You speak for approx. 5-7 minutes in total.
|
||||||
|
|
||||||
|
## Give your opinion on a topic and justify it (on the A2–B1 and B1–B2 exams)
|
||||||
|
|
||||||
|
* This is an individual task.
|
||||||
|
* You are given a question or an issue, and you say what you think about it.
|
||||||
|
* You must justify your opinion.
|
||||||
|
* The examiner may ask you follow-up questions. You speak for approx. 2-3 minutes.
|
||||||
|
|
||||||
|
## Exchange opinions on a topic and justify your opinions (on the B1–B2 exam)
|
||||||
|
|
||||||
|
* This is a conversation task in which two candidates talk together.
|
||||||
|
* You are given a question or an issue that you discuss together and exchange opinions about.
|
||||||
|
* You must justify your opinions.
|
||||||
|
* The examiner does not ask follow-up questions in this task.
|
||||||
|
* You speak for approx. 5-7 minutes in total.
|
||||||
|
|
||||||
|
## Take a position on a statement and justify your views (B1–B2 exam)
|
||||||
|
|
||||||
|
* This is an individual task.
|
||||||
|
* You are presented with a statement, and you take a position on it and justify your views.
|
||||||
|
* You both hear the statement and see it in writing. You are then offered some time to think and note down keywords.
|
||||||
|
* In this task you first speak independently about the statement for 2-3 minutes before the examiner moves on to asking follow-up questions for 2-3 minutes.
|
||||||
|
* In total you speak for approx. 4-6 minutes.
|
||||||
461
docs/plan.md
Normal file
461
docs/plan.md
Normal file
@@ -0,0 +1,461 @@
|
|||||||
|
# Avaaz Implementation Plan
|
||||||
|
|
||||||
|
This implementation plan translates the Product Requirements (`docs/PRD.md`), product description (`README.md`), and system architecture (`docs/architecture.md`) into concrete, phased engineering work for a production-grade Avaaz deployment.
|
||||||
|
|
||||||
|
The goal is to deliver an end-to-end, voice-first AI speaking coach that supports learners from A1–B2, with B2 oral exam readiness as the primary outcome.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 1. Guiding Principles
|
||||||
|
|
||||||
|
- **B2 exam readiness first, A1–B2 capable:** Design features and data models so that they support A1–B2 learners, but prioritize workflows that move learners toward B2 oral exam success.
|
||||||
|
- **Voice-first, text-strong:** Optimize for real-time speech-to-speech interactions, with robust transcripts and text UX as first-class companions.
|
||||||
|
- **Single source of truth:** Keep curriculum, lessons, transcripts, and analytics centralized in PostgreSQL + pgvector; no separate vector store.
|
||||||
|
- **Continuous sessions:** All conversations run within persistent sessions (`/sessions/default`), preserving state across reconnects.
|
||||||
|
- **Infrastructure parity:** Development Docker stack mirrors production VPS stacks (infra/app), as described in `docs/architecture.md`.
|
||||||
|
- **Security and privacy:** Apply strong auth, least-privilege access, safe logging, and clear retention policies for voice/transcript data.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 2. High-Level Phasing
|
||||||
|
|
||||||
|
### Phase 1 – Foundation (M0–M2)
|
||||||
|
|
||||||
|
- Set up core infrastructure (Dockerized backend, frontend, LiveKit, Postgres+pgvector, Caddy).
|
||||||
|
- Implement authentication, user model, and basic session handling.
|
||||||
|
- Implement minimal voice conversation loop (user ↔ AI tutor) with basic transcripts.
|
||||||
|
- Define initial CEFR-aware curriculum data model and seed a small set of lessons.
|
||||||
|
|
||||||
|
### Phase 2 – Learning Experience & Analytics (M3–M4)
|
||||||
|
|
||||||
|
- Implement full A1–B2 curriculum representation, scenarios, and level-aware adaptive tutoring.
|
||||||
|
- Add progress dashboard, gamification basics, and post-session summaries.
|
||||||
|
- Implement AI-assisted lesson authoring and learner-upload-based lessons.
|
||||||
|
- Introduce mock exam templates (A1–A2, A2–B1, B1–B2) and B2-focused exam reports.
|
||||||
|
|
||||||
|
### Phase 3 – Scale, Reliability & Monetization (M5–M6)
|
||||||
|
|
||||||
|
- Harden infrastructure (observability, health checks, admin dashboards).
|
||||||
|
- Add subscription plans and Stripe integration.
|
||||||
|
- Optimize performance (latency, concurrency), tune analytics pipelines, and finalize launch-readiness tasks.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 3. Backend Workstream (FastAPI + LiveKit + LLMs)
|
||||||
|
|
||||||
|
### 3.1 Core Service Setup
|
||||||
|
|
||||||
|
**Goals**
|
||||||
|
|
||||||
|
- Production-ready FastAPI service with auth, sessions, and integrations.
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Use the existing backend layout under `app/backend` as the foundation:
|
||||||
|
- `app/backend/main.py` – app factory and router wiring.
|
||||||
|
- `app/backend/core/config.py` – Pydantic-settings for core configuration, DB_URL, LLM keys, LiveKit, Stripe, etc.
|
||||||
|
- `app/backend/core/database.py` – database/session utilities; extend to add SQLAlchemy, pgvector, and Alembic integration.
|
||||||
|
- `app/backend/api/v1/router.py` – versioned API router aggregator; include routers from feature and operation modules (existing `features.auth`, `operations.health`, plus future `lessons`, `chat`, `documents`).
|
||||||
|
- `app/backend/features/*` and `app/backend/operations/*` – domain logic and HTTP routers (e.g., auth, lessons, chat, payments, document upload, health).
|
||||||
|
- Implement base middleware (CORS, logging, request ID, error handling).
|
||||||
|
- Ensure `/health`, `/health/live`, and `/health/ready` endpoints are wired and return basic dependency checks (DB connectivity, LiveKit reachability, LLM connectivity where safe).
|
||||||
|
|
||||||
|
**Deliverables**
|
||||||
|
|
||||||
|
- Running FastAPI service in Docker with `/health` OK and OpenAPI docs available.
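
A minimal app-factory sketch matching the wiring described in the tasks above. The project-specific router imports are shown commented out because their exact module exports are not fixed here, and the CORS origins are assumptions.

```python
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware


def create_app() -> FastAPI:
    app = FastAPI(title="Avaaz API", version="1.0.0")

    # Base middleware; tighten allowed origins for production.
    app.add_middleware(
        CORSMiddleware,
        allow_origins=["http://localhost:3000", "https://app.avaaz.ai"],
        allow_credentials=True,
        allow_methods=["*"],
        allow_headers=["*"],
    )

    # Versioned API router plus operational endpoints (paths assumed from the layout above):
    # from app.backend.api.v1.router import api_router
    # from app.backend.operations.health import router as health_router
    # app.include_router(api_router, prefix="/api/v1")
    # app.include_router(health_router)
    return app


app = create_app()
```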
|
||||||
|
|
||||||
|
### 3.2 Data Model & Persistence
|
||||||
|
|
||||||
|
**Goals**
|
||||||
|
|
||||||
|
- Support A1–B2 curriculum, lessons, sessions, transcripts, and analytics in PostgreSQL + pgvector.
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Design and implement SQLAlchemy models:
|
||||||
|
- `User` – profile, locale, target level, subscription plan, preferences.
|
||||||
|
- `CurriculumObjective` – per level (A1–B2), skill (reception, production, interaction, mediation), descriptor text.
|
||||||
|
- `Lesson` – CEFR level, objectives, type (lesson, scenario, exam part), metadata (topic, context).
|
||||||
|
- `Scenario` / `ScenarioStep` – structured oral tasks (self-presentation, picture description, opinion exchange, arguing a statement) with configuration for timing and mode (individual/pair).
|
||||||
|
- `Session` – persistent conversational session per user (mode, state, summary, last_activity_at).
|
||||||
|
- `Turn` – individual utterances with role (user/AI), timestamps, raw transcript, audio reference, CEFR difficulty metadata.
|
||||||
|
- `ExamTemplate` / `ExamPart` – A1–A2, A2–B1, B1–B2 templates with timing, task types, scoring dimensions.
|
||||||
|
- `ExamAttempt` / `ExamScore` – attempt metadata, estimated CEFR level, component scores.
|
||||||
|
- `UploadDocument` / `DocumentChunk` – files and parsed chunks with `vector` embeddings (stored alongside or extending the existing backend package under `app/backend`).
|
||||||
|
- `ProgressSnapshot` – aggregate metrics for dashboards (per user and optionally per program).
|
||||||
|
- `Subscription` / `PaymentEvent` – billing state and usage limits.
|
||||||
|
- **Note:** Seed the database with the specific plans defined in `README.md` (First Light, Spark, Glow, Shine, Radiance) and their respective limits.
|
||||||
|
- Add related Alembic migrations; verify they run cleanly on dev DB.
|
||||||
|
|
||||||
|
**Deliverables**
|
||||||
|
|
||||||
|
- Migrations and models aligned with PRD feature set and architecture.
|
||||||
|
|
||||||
|
### 3.3 Authentication & User Management
|
||||||
|
|
||||||
|
**Goals**
|
||||||
|
|
||||||
|
- Secure user auth using FastAPI Users (or equivalent) with JWT and refresh tokens.
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Configure FastAPI Users:
|
||||||
|
- Email/password registration, login, password reset, email verification.
|
||||||
|
- Role support (learner, instructor, admin) for curriculum authoring and admin dashboards.
|
||||||
|
- Integrate auth into routes (`dependencies.py` with `current_user`).
|
||||||
|
- Implement `users` endpoints for profile (target CEFR level, locale) and preferences (greeting verbosity, data retention preferences).
|
||||||
|
|
||||||
|
**Deliverables**
|
||||||
|
|
||||||
|
- Auth flows working in backend and callable from Postman / curl.
|
||||||
|
|
||||||
|
### 3.4 Session Management & Transcripts
|
||||||
|
|
||||||
|
**Goals**
|
||||||
|
|
||||||
|
- Provide continuous session behavior with persistent history, as described in `docs/architecture.md` and PRD.
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Implement `GET /sessions/default`:
|
||||||
|
- Create or fetch `Session` for current user.
|
||||||
|
- Load summary, current lesson state, and progress context.
|
||||||
|
- Implement `POST /sessions/default/token`:
|
||||||
|
- Generate short-lived LiveKit token with room identity tied to the session.
|
||||||
|
- Integrate with LiveKit Agent:
|
||||||
|
- Implement an LLM integration module (for example under `app/backend/features/llm.py` or similar) that configures the realtime session using historical summary, current goals, and mode (lesson/mock exam/free).
|
||||||
|
- Implement transcript persistence:
|
||||||
|
- Receive partial/final transcripts from LiveKit/agent.
|
||||||
|
- Append `Turn` records and maintain rolling summaries for context.
|
||||||
|
- Respect retention settings.
|
||||||
|
- Implement post-session summarization endpoint / background job:
|
||||||
|
- Generate per-session summary, strengths/weaknesses, recommended next steps.
|
||||||
|
- Implement on-demand translation:
|
||||||
|
- Endpoint (e.g., `/chat/translate`) or integrated socket message to translate user/AI text between target and native languages (supporting PRD Section 4.2).
|
||||||
|
|
||||||
|
**Deliverables**
|
||||||
|
|
||||||
|
- API and background flows that maintain continuous conversational context per user.
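
For the `POST /sessions/default/token` step, a hedged sketch of short-lived token generation, assuming the `livekit-api` Python package is available; key handling and the TTL are illustrative.

```python
from datetime import timedelta

from livekit import api  # provided by the livekit-api package (assumed)

LIVEKIT_API_KEY = "devkey"        # from environment/settings in practice
LIVEKIT_API_SECRET = "devsecret"


def session_token(user_id: str, session_id: str) -> str:
    """Short-lived LiveKit token whose room identity is tied to the persistent session."""
    return (
        api.AccessToken(LIVEKIT_API_KEY, LIVEKIT_API_SECRET)
        .with_identity(f"user-{user_id}")
        .with_ttl(timedelta(minutes=15))
        .with_grants(api.VideoGrants(room_join=True, room=f"session-{session_id}"))
        .to_jwt()
    )
```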
|
||||||
|
|
||||||
|
### 3.5 Curriculum & Lesson APIs
|
||||||
|
|
||||||
|
**Goals**
|
||||||
|
|
||||||
|
- Expose CEFR-aligned curriculum and lesson content to frontend and agent.
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Implement `lessons` router:
|
||||||
|
- List lessons by level, topic, recommended next steps.
|
||||||
|
- Fetch details for a specific lesson, including objectives and scenario steps.
|
||||||
|
- Mark lesson progress and completion; update `ProgressSnapshot`.
|
||||||
|
- Implement endpoints for curriculum objectives and mapping to lessons.
|
||||||
|
- Implement endpoints to retrieve scenario templates for mock exams and regular lessons.
|
||||||
|
|
||||||
|
**Deliverables**
|
||||||
|
|
||||||
|
- Stable JSON API for curriculum and lessons, used by frontend and agent system prompts.
|
||||||
|
|
||||||
|
### 3.6 AI-Assisted Authoring & User Uploads
|
||||||
|
|
||||||
|
**Goals**
|
||||||
|
|
||||||
|
- Support instructor-designed and learner-generated lessons from uploaded materials.
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Implement `documents` router:
|
||||||
|
- File upload endpoints for documents and images (instructor and learner scopes).
|
||||||
|
- Trigger document processing pipeline (Docling or similar) to parse text and structure.
|
||||||
|
- Chunk documents and store embeddings in `DocumentChunk` using pgvector.
|
||||||
|
- Implement instructor authoring endpoints:
|
||||||
|
- Create/update/delete lessons referencing uploaded documents/images.
|
||||||
|
- AI-assisted suggestion endpoint that uses LLM to propose lesson structure, prompts, and exam-style tasks conditioned on level and objectives.
|
||||||
|
- Implement learner upload endpoints:
|
||||||
|
- User-specific upload and lesson creation (on-the-fly lessons).
|
||||||
|
- Link created “ad-hoc” lessons to sessions so the tutor can reference them during practice.
|
||||||
|
|
||||||
|
**Deliverables**
|
||||||
|
|
||||||
|
- Endpoints supporting both admin/instructor authoring and user-driven contextual lessons.
|
||||||
|
|
||||||
|
### 3.7 Mock Exam Engine & Scoring
|
||||||
|
|
||||||
|
**Goals**
|
||||||
|
|
||||||
|
- Implement configurable mock oral exam flows for A1–A2, A2–B1, and B1–B2, with B2 focus.
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Implement exam orchestration service:
|
||||||
|
- Given an `ExamTemplate`, manage progression through `ExamPart`s (including warm-up, individual tasks, pair tasks).
|
||||||
|
- Enforce timing and mode flags to drive agent prompts.
|
||||||
|
- Integrate scoring:
|
||||||
|
- Use LLM to derive component scores (fluency, pronunciation, grammar, vocabulary, coherence) from transcripts.
|
||||||
|
- Map to estimated CEFR band and store in `ExamScore`.
|
||||||
|
- Expose endpoints:
|
||||||
|
- Start exam, fetch exam status, retrieve past exam results.
|
||||||
|
|
||||||
|
**Deliverables**
|
||||||
|
|
||||||
|
- End-to-end exam session that runs via the same LiveKit + agent infrastructure and stores exam results.
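
A small sketch of mapping component scores to an estimated CEFR band. The 0-100 scale and the cut-offs are illustrative assumptions that would need calibration against human examiner ratings before being trusted in exam reports.

```python
from statistics import mean

# Hypothetical thresholds on a 0-100 scale, ordered from highest band down.
CEFR_BANDS = [("B2", 75), ("B1", 60), ("A2", 45), ("A1", 30)]


def estimate_cefr(scores: dict[str, float]) -> str:
    """Combine component scores (fluency, pronunciation, grammar, vocabulary,
    coherence) into a single estimated CEFR band."""
    overall = mean(scores.values())
    for band, cutoff in CEFR_BANDS:
        if overall >= cutoff:
            return band
    return "Below A1"


print(estimate_cefr({"fluency": 78, "pronunciation": 70, "grammar": 66,
                     "vocabulary": 72, "coherence": 74}))  # -> "B1" with these cut-offs
```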
|
||||||
|
|
||||||
|
### 3.8 Analytics & Reporting
|
||||||
|
|
||||||
|
**Goals**
|
||||||
|
|
||||||
|
- Provide learner-level dashboards and program-level reporting.
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Implement periodic aggregation (cron/async tasks) populating `ProgressSnapshot`.
|
||||||
|
- Implement analytics endpoints:
|
||||||
|
- Learner metrics (minutes spoken, session counts, trends per skill).
|
||||||
|
- Program-level metrics (for instructors/coordinators) with appropriate role-based access.
|
||||||
|
- Ensure privacy controls (anonymized or pseudonymized data where required).
|
||||||
|
|
||||||
|
**Deliverables**
|
||||||
|
|
||||||
|
- Backend API supporting progress dashboards and reports as per PRD.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 4. Frontend Workstream (Next.js + LiveKit)
|
||||||
|
|
||||||
|
### 4.1 Foundation & Layout
|
||||||
|
|
||||||
|
**Goals**
|
||||||
|
|
||||||
|
- Production-ready Next.js PWA front-end that matches Avaaz branding and supports auth, routing, and basic pages.
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Initialize Next.js app (per `README.md`):
|
||||||
|
- Configure `next.config.js`, TypeScript, ESLint, PWA manifest, and global styles.
|
||||||
|
- Implement `app/layout.tsx` with theme, localization provider, and navigation.
|
||||||
|
- Implement `app/page.tsx` landing page aligned with product positioning (A1–B2, B2 focus).
|
||||||
|
- Implement auth pages (login, register, email verification, forgot password).
|
||||||
|
|
||||||
|
**Deliverables**
|
||||||
|
|
||||||
|
- Frontend skeleton running under Docker, reachable via Caddy in dev stack.
|
||||||
|
|
||||||
|
### 4.2 Chat & Voice Experience
|
||||||
|
|
||||||
|
**Goals**
|
||||||
|
|
||||||
|
- Voice-first conversational UI integrated with LiveKit and backend sessions.
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Build `ChatInterface.tsx`:
|
||||||
|
- Microphone controls, connection status, basic waveform/level visualization.
|
||||||
|
- Rendering of AI and user turns with text and visual aids (images, tables) as provided by backend/agent.
|
||||||
|
- **Translation Support:** UI controls to translate specific messages on demand (toggle or click-to-translate).
|
||||||
|
- Error states for mic and network issues; text-only fallback UI.
|
||||||
|
- Integrate with backend session APIs:
|
||||||
|
- On login, call `GET /sessions/default`, then `POST /sessions/default/token`.
|
||||||
|
- Connect to LiveKit using the token; handle reconnection logic.
|
||||||
|
- Display contextual greeting and summary on session start using data returned from `sessions` API.
|
||||||
|
|
||||||
|
**Deliverables**
|
||||||
|
|
||||||
|
- Usable chat interface capable of sustaining real-time conversation with the AI tutor.
|
||||||
|
|
||||||
|
### 4.3 Curriculum & Lesson UX
|
||||||
|
|
||||||
|
**Goals**
|
||||||
|
|
||||||
|
- Allow learners to browse curriculum, start lessons, and view progress.
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Implement curriculum overview page:
|
||||||
|
- Display modules and lessons grouped by CEFR levels (A1–B2).
|
||||||
|
- Indicate completion and recommended next lessons.
|
||||||
|
- Implement lesson detail page:
|
||||||
|
- Show lesson goals, target level, estimated time, and exam-related tags.
|
||||||
|
- Start lesson → opens chat view in appropriate mode with lesson context.
|
||||||
|
- Integrate progress indicators (streaks, minutes, CEFR band) into dashboard.
|
||||||
|
|
||||||
|
**Deliverables**
|
||||||
|
|
||||||
|
- Navigation and views covering core learning flows described in PRD.
|
||||||
|
|
||||||
|
### 4.4 Mock Exam UX
|
||||||
|
|
||||||
|
**Goals**
|
||||||
|
|
||||||
|
- Implement exam-specific UX consistent with oral exams and PRD.
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Build exam selection page:
|
||||||
|
- Allow user to choose exam level (A1–A2, A2–B1, B1–B2/B2-mock).
|
||||||
|
- In-session exam UI:
|
||||||
|
- Show current exam part, timer, and appropriate instructions.
|
||||||
|
- Indicate whether current part is scored or warm-up.
|
||||||
|
- Results page:
|
||||||
|
- Show estimated CEFR level, component scores, and textual feedback.
|
||||||
|
- Provide links to detailed transcripts, audio, and recommended follow-up lessons.
|
||||||
|
|
||||||
|
**Deliverables**
|
||||||
|
|
||||||
|
- End-to-end exam flow from selection to results.
|
||||||
|
|
||||||
|
### 4.5 AI-Assisted Authoring & Upload UX
|
||||||
|
|
||||||
|
**Goals**
|
||||||
|
|
||||||
|
- Provide UIs for instructors and learners to upload content and create lessons.
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Instructor interface:
|
||||||
|
- Lesson builder UI with level, objectives, exam part, and document selection.
|
||||||
|
- “Generate with AI” action to fetch suggested prompts/structure; edit-in-place and publish.
|
||||||
|
- Learner interface:
|
||||||
|
- Simple upload flow (document/image) to create ad-hoc practice.
|
||||||
|
- Quick-start buttons to jump from uploaded content to a tailored lesson in the chat interface.
|
||||||
|
|
||||||
|
**Deliverables**
|
||||||
|
|
||||||
|
- Authoring tools that map onto backend authoring APIs.
|
||||||
|
|
||||||
|
### 4.6 Analytics & Admin Health Dashboard
|
||||||
|
|
||||||
|
**Goals**
|
||||||
|
|
||||||
|
- Provide admin and instructor dashboards for system health and learner analytics.
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Learner dashboard:
|
||||||
|
- Visualize key metrics and streaks, integrated with backend analytics.
|
||||||
|
- Instructor/program dashboard:
|
||||||
|
- Aggregate usage and progress metrics for groups.
|
||||||
|
- Admin health dashboard:
|
||||||
|
- Surface backend `/health` status, LiveKit status, DB health indicators, and LLM connectivity signals.
|
||||||
|
|
||||||
|
**Deliverables**
|
||||||
|
|
||||||
|
- Dashboards that satisfy PRD’s analytics and health visibility requirements.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 5. Real-Time & Media Workstream (LiveKit + Agents)
|
||||||
|
|
||||||
|
### 5.1 LiveKit Server & Config
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Use the existing `livekit` service in `app/docker-compose.yml` as the basis: keep signaling on port 7880, keep the WebRTC media port range configurable via environment variables (currently defaulting to 60000–60100), and keep the service attached to the shared `proxy` network used by Caddy.
|
||||||
|
- Ensure secure API keys and appropriate room/track settings for voice-only sessions.
|
||||||
|
- Configure UDP ports and signaling endpoints (`rtc.avaaz.ai` → Caddy → `livekit:7880`) as described in `docs/architecture.md` and `infra/Caddyfile`.
|
||||||
|
|
||||||
|
### 5.2 Client Integration
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Wire frontend to LiveKit:
|
||||||
|
- Use `@livekit/client` to join rooms using tokens from backend.
|
||||||
|
- Handle reconnection and session resumption.
|
||||||
|
- Integrate with backend session and agent orchestration.
|
||||||
|
|
||||||
|
### 5.3 Agent Integration with Realtime LLMs
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Implement LiveKit Agent that:
|
||||||
|
- Connects to OpenAI Realtime or Gemini Live according to configuration.
|
||||||
|
- Streams user audio and receives streamed AI audio and partial transcripts.
|
||||||
|
- Forwards transcripts and metadata to backend for persistence.
|
||||||
|
- Implement prompt templates for:
|
||||||
|
- Regular lessons, mock exams, free conversation.
|
||||||
|
- CEFR-level adaptation and exam-specific tasks.
|
||||||
|
|
||||||
|
**Deliverables**
|
||||||
|
|
||||||
|
- Stable real-time pipeline from user microphone to LLM and back, integrated with backend logic.
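
An indicative sketch of how mode- and level-conditioned prompt templates could be assembled for the agent; all prompt wording here is illustrative, not final tutor copy.

```python
BASE_PROMPT = (
    "You are Avaaz, a friendly Norwegian speaking coach. "
    "Keep the learner talking and correct gently."
)

# Level- and mode-specific additions; wording is illustrative only.
LEVEL_HINTS = {
    "A1": "Use very short sentences, high-frequency words, and speak slowly.",
    "A2": "Use simple everyday language and ask concrete follow-up questions.",
    "B1": "Use connected speech about familiar topics and ask for opinions.",
    "B2": "Discuss abstract topics, ask the learner to justify opinions, and push for precision.",
}

MODE_HINTS = {
    "lesson": "Follow the lesson objectives provided in the session context.",
    "mock_exam": "Act as the examiner: keep strict timing and do not help with vocabulary.",
    "free": "Let the learner steer the conversation.",
}


def build_system_prompt(level: str, mode: str, history_summary: str) -> str:
    return "\n".join([
        BASE_PROMPT,
        LEVEL_HINTS[level],
        MODE_HINTS[mode],
        f"Recent history: {history_summary}",
    ])
```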
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 6. Infrastructure & DevOps Workstream
|
||||||
|
|
||||||
|
### 6.1 Docker & Compose
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Define and refine:
|
||||||
|
- `infra/docker-compose.yml` – infra stack (Caddy, Gitea, Gitea runner).
|
||||||
|
- `app/docker-compose.yml` – app stack (frontend, backend, LiveKit, Postgres+pgvector).
|
||||||
|
- Configure volumes and networks (`proxy` network for routing via Caddy).
|
||||||
|
|
||||||
|
### 6.2 CI & CD (Gitea + Actions)
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- CI:
|
||||||
|
- Extend `.gitea/workflows/ci.yml` to run linting, type-checking, and tests for backend and frontend once those projects are scaffolded under `app/backend` and `app/frontend`.
|
||||||
|
- Add build verification for any Docker images produced for the app stack.
|
||||||
|
- CD:
|
||||||
|
- Use `.gitea/workflows/cd.yml` as the tag-based deploy workflow, following the deployment approach in `docs/architecture.md`.
|
||||||
|
- Deploy tags `v*` only if they are on `main`.
|
||||||
|
- Use `/health` and key endpoints for readiness checks; roll back on failures.
|
||||||
|
|
||||||
|
### 6.3 Observability & Monitoring
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Centralize logs and metrics for backend, frontend, LiveKit, and Postgres.
|
||||||
|
- Configure alerting for:
|
||||||
|
- Application errors.
|
||||||
|
- Latency and uptime SLOs for voice and API endpoints.
|
||||||
|
- Resource usage (CPU, memory, DB connections).
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 7. Quality, Security, and Compliance
|
||||||
|
|
||||||
|
### 7.1 Testing Strategy
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Backend:
|
||||||
|
- Unit tests for core logic modules (e.g., health checks, config, LLM/document/payment integration) and any data models.
|
||||||
|
- Integration tests for auth, sessions, and lessons using httpx + pytest.
|
||||||
|
- Frontend:
|
||||||
|
- Component tests for core UI (chat, curriculum, dashboards).
|
||||||
|
- E2E flows for login, start lesson, start exam, and view progress.
|
||||||
|
- Voice stack:
|
||||||
|
- Automated sanity checks for LiveKit connectivity and audio round-trip.
|
||||||
|
|
||||||
|
### 7.2 Security & Privacy
|
||||||
|
|
||||||
|
**Tasks**
|
||||||
|
|
||||||
|
- Apply OWASP-aligned input validation and output encoding.
|
||||||
|
- Enforce HTTPS everywhere via Caddy; add HSTS and secure cookies where applicable (see the Caddy snippet after this list).
|
||||||
|
- Implement appropriate retention and deletion policies for audio and transcripts.
|
||||||
|
- Document data handling for learners and institutions (for future legal review).
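
For the HSTS item above, a snippet of the kind that could be added to the `app.avaaz.ai` and `api.avaaz.ai` site blocks in `infra/Caddyfile`; the `max-age` value is an example, not a decided policy:

```caddy
app.avaaz.ai {
    reverse_proxy frontend:3000

    # HSTS: tell browsers to use HTTPS only for this host (example value)
    header Strict-Transport-Security "max-age=31536000; includeSubDomains"
}
```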
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 8. Rollout Plan
|
||||||
|
|
||||||
|
### 8.1 Internal Alpha
|
||||||
|
|
||||||
|
- Run app stack locally for core team.
|
||||||
|
- Validate foundational flows: auth, voice session, basic lesson, transcripts.
|
||||||
|
|
||||||
|
### 8.2 Closed Beta
|
||||||
|
|
||||||
|
- Onboard a small cohort of A2–B2 learners and one or two programs.
|
||||||
|
- Focus on curriculum fit, tutor behavior, and exam simulation realism.
|
||||||
|
- Collect data to refine prompt templates, lesson design, and dashboard UX.
|
||||||
|
|
||||||
|
### 8.3 Public Launch
|
||||||
|
|
||||||
|
- Enable subscription plans and payment.
|
||||||
|
- Turn on production monitoring and on-call processes.
|
||||||
|
- Iterate on performance, reliability, and content quality based on real usage.
|
||||||
666
infra/README.md
Normal file
@@ -0,0 +1,666 @@
|
|||||||
|
# Configuration
|
||||||
|
|
||||||
|
## 1. Configure the firewall at the VPS host
|
||||||
|
|
||||||
|
| Public IP |
|
||||||
|
| :------------: |
|
||||||
|
| 217.154.51.242 |
|
||||||
|
|
||||||
|
| Action | Allowed IP | Protocol | Port(s) | Description |
|
||||||
|
| :-----: | :--------: | :------: | ----------: | :------------ |
|
||||||
|
| Allow | Any | TCP | 80 | HTTP |
|
||||||
|
| Allow | Any | TCP | 443 | HTTPS |
|
||||||
|
| Allow | Any | TCP | 2222 | Git SSH |
|
||||||
|
| Allow | Any | TCP | 2885 | VPS SSH |
|
||||||
|
| Allow | Any | UDP | 3478 | STUN/TURN |
|
||||||
|
| Allow | Any | TCP | 5349 | TURN/TLS |
|
||||||
|
| Allow | Any | TCP | 7881 | LiveKit TCP |
|
||||||
|
| Allow | Any | UDP | 50000-60000 | LiveKit Media |
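
These rules are intended for the hosting provider's firewall panel, but if you also want to mirror them on the server with `ufw`, a sketch could look like this (assumes `ufw` is installed; allow SSH on 2885 before enabling the firewall so you do not lock yourself out):

```bash
sudo ufw allow 2885/tcp          # VPS SSH (allow first!)
sudo ufw allow 80/tcp            # HTTP
sudo ufw allow 443/tcp           # HTTPS
sudo ufw allow 2222/tcp          # Git SSH
sudo ufw allow 3478/udp          # STUN/TURN
sudo ufw allow 5349/tcp          # TURN/TLS
sudo ufw allow 7881/tcp          # LiveKit TCP
sudo ufw allow 50000:60000/udp   # LiveKit media
sudo ufw enable
```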
|
||||||
|
|
||||||
|
## 2. Configure the DNS settings at domain registrar
|
||||||
|
|
||||||
|
| Host (avaaz.ai) | Type | Value |
|
||||||
|
| :-------------: | :---: | :------------: |
|
||||||
|
| @ | A | 217.154.51.242 |
|
||||||
|
| www | CNAME | avaaz.ai |
|
||||||
|
| app | A | 217.154.51.242 |
|
||||||
|
| api | A | 217.154.51.242 |
|
||||||
|
| rtc | A | 217.154.51.242 |
|
||||||
|
| git | A | 217.154.51.242 |
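
A quick way to verify the records once they have propagated; each command should print `217.154.51.242`, except the CNAME lookup, which should print `avaaz.ai.`:

```bash
dig +short avaaz.ai A
dig +short www.avaaz.ai CNAME
dig +short app.avaaz.ai A
dig +short api.avaaz.ai A
dig +short rtc.avaaz.ai A
dig +short git.avaaz.ai A
```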
|
||||||
|
|
||||||
|
## 3. Change the SSH port from 22 to 2885
|
||||||
|
|
||||||
|
1. Connect to the server.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
ssh username@avaaz.ai
|
||||||
|
```
|
||||||
|
|
||||||
|
2. Edit the SSH configuration file.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo nano /etc/ssh/sshd_config
|
||||||
|
```
|
||||||
|
|
||||||
|
3. Add port 2885 to the file and comment out port 22.
|
||||||
|
|
||||||
|
```text
|
||||||
|
#Port 22
|
||||||
|
Port 2885
|
||||||
|
```
|
||||||
|
|
||||||
|
4. Save the file and exit the editor.
|
||||||
|
|
||||||
|
- Press `Ctrl+O`, then `Enter` to save, and `Ctrl+X` to exit.
|
||||||
|
|
||||||
|
5. Restart the SSH service.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo systemctl daemon-reload && sudo systemctl restart ssh.socket && sudo systemctl restart ssh.service
|
||||||
|
```
|
||||||
|
|
||||||
|
6. **Before closing the current session**, open a new terminal window and connect to the server to verify the changes work correctly.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
ssh username@avaaz.ai # ssh: connect to host avaaz.ai port 22: Connection timed out
|
||||||
|
ssh username@avaaz.ai -p 2885
|
||||||
|
```
|
||||||
|
|
||||||
|
7. Once the connection is successful, close the original session safely.
|
||||||
|
|
||||||
|
## 4. Build and deploy the infrastructure
|
||||||
|
|
||||||
|
1. Check with `dig git.avaaz.ai +short` whether the DNS settings have propagated.
|
||||||
|
|
||||||
|
2. SSH into the VPS to install Docker and Docker Compose.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
ssh username@avaaz.ai -p 2885
|
||||||
|
```
|
||||||
|
|
||||||
|
3. Update system packages.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo apt update && sudo apt upgrade -y
|
||||||
|
```
|
||||||
|
|
||||||
|
4. Install the dependencies for Docker’s official APT repository.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo apt install -y ca-certificates curl gnupg lsb-release
|
||||||
|
```
|
||||||
|
|
||||||
|
5. Add Docker’s official APT repo.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo install -m 0755 -d /etc/apt/keyrings
|
||||||
|
|
||||||
|
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
|
||||||
|
|
||||||
|
sudo chmod a+r /etc/apt/keyrings/docker.gpg
|
||||||
|
|
||||||
|
echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
|
||||||
|
|
||||||
|
sudo apt update
|
||||||
|
```
|
||||||
|
|
||||||
|
6. Install Docker Engine + compose plugin.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo apt install -y docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
|
||||||
|
```
|
||||||
|
|
||||||
|
7. Verify the installation.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo docker --version
|
||||||
|
sudo docker compose version
|
||||||
|
```
|
||||||
|
|
||||||
|
8. Create the `/etc/docker/daemon.json` file to keep container logs from consuming excessive disk space.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo nano /etc/docker/daemon.json
|
||||||
|
```
|
||||||
|
|
||||||
|
9. Paste the following.
|
||||||
|
|
||||||
|
```json
|
||||||
|
{
|
||||||
|
"log-driver": "local",
|
||||||
|
"log-opts": {
|
||||||
|
"max-size": "10m",
|
||||||
|
"max-file": "3"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
10. Save the file and exit the editor.
|
||||||
|
|
||||||
|
- Press `Ctrl+O`, then `Enter` to save, and `Ctrl+X` to exit.
|
||||||
|
|
||||||
|
11. Restart the docker service to apply changes.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo systemctl daemon-reload
|
||||||
|
sudo systemctl restart docker
|
||||||
|
```
|
||||||
|
|
||||||
|
12. Create the directory for the infra stack at `/srv/infra`.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo mkdir -p /srv/infra
|
||||||
|
sudo chown -R $USER:$USER /srv/infra
|
||||||
|
cd /srv/infra
|
||||||
|
```
|
||||||
|
|
||||||
|
13. Create directories for Gitea (repos, config, etc.) and Runner persistent data. Gitea runs as UID/GID 1000 by default.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
mkdir -p gitea-data gitea-runner-data
|
||||||
|
```
|
||||||
|
|
||||||
|
14. Create the `/srv/infra/docker-compose.yml` (Caddy + Gitea + Runner) file.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
nano docker-compose.yml
|
||||||
|
```
|
||||||
|
|
||||||
|
15. Paste the following.
|
||||||
|
|
||||||
|
```yaml
|
||||||
|
services:
|
||||||
|
caddy:
|
||||||
|
# Use the latest official Caddy image
|
||||||
|
image: caddy:latest
|
||||||
|
# Docker Compose automatically generates container names: <folder>_<service>_<index>
|
||||||
|
container_name: caddy # Fixed name used by Docker engine
|
||||||
|
# Automatically restart unless manually stopped
|
||||||
|
restart: unless-stopped
|
||||||
|
ports:
|
||||||
|
# Expose HTTP (ACME + redirect)
|
||||||
|
- "80:80"
|
||||||
|
# Expose HTTPS/WSS (frontend, backend, LiveKit)
|
||||||
|
- "443:443"
|
||||||
|
volumes:
|
||||||
|
# Mount the Caddy config file read-only
|
||||||
|
- ./Caddyfile:/etc/caddy/Caddyfile:ro
|
||||||
|
# Caddy TLS certs (persistent Docker volume)
|
||||||
|
- caddy_data:/data
|
||||||
|
# Internal Caddy state/config
|
||||||
|
- caddy_config:/config
|
||||||
|
networks:
|
||||||
|
# Attach to the shared "proxy" network
|
||||||
|
- proxy
|
||||||
|
|
||||||
|
gitea:
|
||||||
|
# Official Gitea image with built-in Actions
|
||||||
|
image: gitea/gitea:latest
|
||||||
|
container_name: gitea # Fixed name used by Docker engine
|
||||||
|
# Auto-restart service
|
||||||
|
restart: unless-stopped
|
||||||
|
environment:
|
||||||
|
# Run Gitea as host user 1000 (prevents permission issues)
|
||||||
|
- USER_UID=1000
|
||||||
|
# Same for group
|
||||||
|
- USER_GID=1000
|
||||||
|
# Use SQLite (stored inside /data)
|
||||||
|
- GITEA__database__DB_TYPE=sqlite3
|
||||||
|
# Location of the SQLite DB
|
||||||
|
- GITEA__database__PATH=/data/gitea/gitea.db
|
||||||
|
# Custom config directory
|
||||||
|
- GITEA_CUSTOM=/data/gitea
|
||||||
|
volumes:
|
||||||
|
# Bind mount instead of Docker volume because:
|
||||||
|
# - We want repos, configs, SSH keys, and SQLite DB **visible and editable** on host
|
||||||
|
# - Easy backups (just copy `./gitea-data`)
|
||||||
|
# - Easy migration
|
||||||
|
# - Avoids losing data if Docker volumes are pruned
|
||||||
|
- ./gitea-data:/data
|
||||||
|
networks:
|
||||||
|
- proxy
|
||||||
|
ports:
|
||||||
|
# SSH for Git operations mapped to host 2222
|
||||||
|
- "2222:22"
|
||||||
|
|
||||||
|
gitea-runner:
|
||||||
|
# Official Gitea Actions Runner
|
||||||
|
image: gitea/act_runner:latest
|
||||||
|
container_name: gitea-runner # Fixed name used by Docker engine
|
||||||
|
restart: unless-stopped
|
||||||
|
depends_on:
|
||||||
|
# Runner requires Gitea to be available
|
||||||
|
- gitea
|
||||||
|
volumes:
|
||||||
|
# Runner uses host Docker daemon to spin up job containers (Docker-out-of-Docker)
|
||||||
|
- /var/run/docker.sock:/var/run/docker.sock
|
||||||
|
# Bind mount instead of volume because:
|
||||||
|
# - Runner identity is stored in /data/.runner
|
||||||
|
# - Must persist across container recreations
|
||||||
|
# - Prevents duplicated runner registrations in Gitea
|
||||||
|
# - Easy to inspect/reset via `./gitea-runner-data/.runner`
|
||||||
|
- ./gitea-runner-data:/data
|
||||||
|
environment:
|
||||||
|
# Base URL of your Gitea instance
|
||||||
|
- GITEA_INSTANCE_URL=${GITEA_INSTANCE_URL}
|
||||||
|
# One-time registration token
|
||||||
|
- GITEA_RUNNER_REGISTRATION_TOKEN=${GITEA_RUNNER_REGISTRATION_TOKEN}
|
||||||
|
# Human-readable name for the runner
|
||||||
|
- GITEA_RUNNER_NAME=${GITEA_RUNNER_NAME}
|
||||||
|
# Runner labels (e.g., ubuntu-latest)
|
||||||
|
- GITEA_RUNNER_LABELS=${GITEA_RUNNER_LABELS}
|
||||||
|
# Set container timezone to UTC for consistent logs
|
||||||
|
- TZ=Etc/UTC
|
||||||
|
networks:
|
||||||
|
- proxy
|
||||||
|
# Start runner using persisted config
|
||||||
|
command: ["act_runner", "daemon", "--config", "/data/.runner"]
|
||||||
|
|
||||||
|
networks:
|
||||||
|
proxy:
|
||||||
|
# Shared network for Caddy + Gitea (+ later app stack)
|
||||||
|
name: proxy
|
||||||
|
# Default Docker bridge network
|
||||||
|
driver: bridge
|
||||||
|
|
||||||
|
volumes:
|
||||||
|
# Docker volume for Caddy TLS data (safe to keep inside Docker)
|
||||||
|
caddy_data:
|
||||||
|
name: caddy_data
|
||||||
|
# Docker volume for internal Caddy configs/state
|
||||||
|
caddy_config:
|
||||||
|
name: caddy_config
|
||||||
|
```
|
||||||
|
|
||||||
|
16. Save the file and exit the editor.
|
||||||
|
|
||||||
|
- Press `Ctrl+O`, then `Enter` to save, and `Ctrl+X` to exit.
|
||||||
|
|
||||||
|
17. Create the `/srv/infra/.env` file with environment variables.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
nano .env
|
||||||
|
```
|
||||||
|
|
||||||
|
18. Paste the following:
|
||||||
|
|
||||||
|
```env
|
||||||
|
# Base URL of your Gitea instance (used by the runner to register itself
|
||||||
|
# and to send/receive workflow job information).
|
||||||
|
GITEA_INSTANCE_URL=https://git.avaaz.ai
|
||||||
|
|
||||||
|
# One-time registration token generated in:
|
||||||
|
# Gitea → Site Administration → Actions → Runners → "Generate Token"
|
||||||
|
# This MUST be filled in once, so the runner can register.
|
||||||
|
# After registration, the runner stores its identity inside ./gitea-runner-data/.runner
|
||||||
|
# and this value is no longer needed (can be left blank).
|
||||||
|
GITEA_RUNNER_REGISTRATION_TOKEN=
|
||||||
|
|
||||||
|
# Human-readable name for this runner.
|
||||||
|
# This is shown in the Gitea UI so you can distinguish multiple runners:
|
||||||
|
# Example: "vps-runner", "staging-runner", "gpu-runner"
|
||||||
|
GITEA_RUNNER_NAME=gitea-runner
|
||||||
|
|
||||||
|
# Runner labels allow workflows to choose specific runners.
|
||||||
|
# The label format is: label[:schema[:args]]
|
||||||
|
# - "ubuntu-latest" is the <label> name that workflows request using runs-on: [ "ubuntu-latest" ].
|
||||||
|
# - "docker://" is the <schema> indicating the job runs inside a separate Docker container.
|
||||||
|
# - "catthehacker/ubuntu:act-latest" is the <args>, specifying the Docker image to use for the container.
|
||||||
|
# Workflows can target this using:
|
||||||
|
# runs-on: [ "ubuntu-latest" ]
|
||||||
|
GITEA_RUNNER_LABELS=ubuntu-latest:docker://catthehacker/ubuntu:act-latest
|
||||||
|
```
|
||||||
|
|
||||||
|
19. Save the file and exit the editor.
|
||||||
|
|
||||||
|
- Press `Ctrl+O`, then `Enter` to save, and `Ctrl+X` to exit.
|
||||||
|
|
||||||
|
20. Create `/srv/infra/Caddyfile` to configure Caddy.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
nano Caddyfile
|
||||||
|
```
|
||||||
|
|
||||||
|
21. Paste the following:
|
||||||
|
|
||||||
|
```caddy
|
||||||
|
{
|
||||||
|
# Global Caddy options.
|
||||||
|
#
|
||||||
|
# auto_https on
|
||||||
|
# - Caddy listens on port 80 for every host (ACME + redirect).
|
||||||
|
# - Automatically issues HTTPS certificates.
|
||||||
|
# - Automatically redirects HTTP → HTTPS unless disabled.
|
||||||
|
#
|
||||||
|
}
|
||||||
|
|
||||||
|
# ------------------------------------------------------------
|
||||||
|
# Redirect www → root domain
|
||||||
|
# ------------------------------------------------------------
|
||||||
|
www.avaaz.ai {
|
||||||
|
# Permanent redirect to naked domain
|
||||||
|
redir https://avaaz.ai{uri} permanent
|
||||||
|
}
|
||||||
|
|
||||||
|
# ------------------------------------------------------------
|
||||||
|
# Marketing site (optional — if frontend handles it, remove this)
|
||||||
|
# Redirect root → app
|
||||||
|
# ------------------------------------------------------------
|
||||||
|
avaaz.ai {
|
||||||
|
# If you have a static marketing page, serve it here.
|
||||||
|
# If not, redirect visitors to the app.
|
||||||
|
redir https://app.avaaz.ai{uri}
|
||||||
|
}
|
||||||
|
|
||||||
|
# ------------------------------------------------------------
|
||||||
|
# Frontend (Next.js)
|
||||||
|
# Public URL: https://app.avaaz.ai
|
||||||
|
# Internal target: frontend:3000
|
||||||
|
# ------------------------------------------------------------
|
||||||
|
app.avaaz.ai {
|
||||||
|
# Reverse-proxy HTTPS traffic to the frontend container
|
||||||
|
reverse_proxy frontend:3000
|
||||||
|
|
||||||
|
# Access log for debugging frontend activity
|
||||||
|
log {
|
||||||
|
output file /data/app-access.log
|
||||||
|
}
|
||||||
|
|
||||||
|
# Compression for faster delivery of JS, HTML, etc.
|
||||||
|
encode gzip zstd
|
||||||
|
}
|
||||||
|
|
||||||
|
# ------------------------------------------------------------
|
||||||
|
# Backend (FastAPI)
|
||||||
|
# Public URL: https://api.avaaz.ai
|
||||||
|
# Internal target: backend:8000
|
||||||
|
# ------------------------------------------------------------
|
||||||
|
api.avaaz.ai {
|
||||||
|
# Reverse-proxy all API traffic to FastAPI
|
||||||
|
reverse_proxy backend:8000
|
||||||
|
|
||||||
|
# Access log — useful for monitoring API traffic and debugging issues
|
||||||
|
log {
|
||||||
|
output file /data/api-access.log
|
||||||
|
}
|
||||||
|
|
||||||
|
# Enable response compression (JSON, text, etc.)
|
||||||
|
encode gzip zstd
|
||||||
|
}
|
||||||
|
|
||||||
|
# ------------------------------------------------------------
|
||||||
|
# LiveKit (signaling only — media uses direct UDP)
|
||||||
|
# Public URL: wss://rtc.avaaz.ai
|
||||||
|
# Internal target: livekit:7880
|
||||||
|
# ------------------------------------------------------------
|
||||||
|
rtc.avaaz.ai {
|
||||||
|
# LiveKit uses WebSocket signaling, so we reverse-proxy WS → WS
|
||||||
|
reverse_proxy livekit:7880
|
||||||
|
|
||||||
|
# Access log — helps diagnose WebRTC connection failures
|
||||||
|
log {
|
||||||
|
output file /data/rtc-access.log
|
||||||
|
}
|
||||||
|
|
||||||
|
# Compression not needed for WS traffic, but harmless
|
||||||
|
encode gzip zstd
|
||||||
|
}
|
||||||
|
|
||||||
|
# ------------------------------------------------------------
|
||||||
|
# Gitea (Git server UI + HTTPS + SSH clone)
|
||||||
|
# Public URL: https://git.avaaz.ai
|
||||||
|
# Internal target: gitea:3000
|
||||||
|
# ------------------------------------------------------------
|
||||||
|
git.avaaz.ai {
|
||||||
|
# Route all HTTPS traffic to Gitea’s web UI
|
||||||
|
reverse_proxy gitea:3000
|
||||||
|
|
||||||
|
# Log all Git UI requests and API access
|
||||||
|
log {
|
||||||
|
output file /data/git-access.log
|
||||||
|
}
|
||||||
|
|
||||||
|
# Compress UI responses
|
||||||
|
encode gzip zstd
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
22. Save the file and exit the editor.
|
||||||
|
|
||||||
|
- Press `Ctrl+O`, then `Enter` to save, and `Ctrl+X` to exit.
|
||||||
|
|
||||||
|
23. Start the stack from `/srv/infra`.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo docker compose pull # fetch images: caddy, gitea, act_runner
|
||||||
|
sudo docker compose up -d # start all containers in the background
|
||||||
|
```
|
||||||
|
|
||||||
|
24. Verify that all containers have the status `Up`.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo docker compose ps -a
|
||||||
|
```
|
||||||
|
|
||||||
|
25. Open `https://git.avaaz.ai` in your browser. Caddy should have already obtained a cert and you should see the Gitea installer.
|
||||||
|
|
||||||
|
26. Configure database settings.
|
||||||
|
|
||||||
|
- **Database Type:** `SQLite3`
|
||||||
|
- **Path:** `/data/gitea/gitea.db` *(matches `GITEA__database__PATH`)*
|
||||||
|
|
||||||
|
27. Configure general settings.
|
||||||
|
|
||||||
|
- **Site Title:** default *(`Gitea: Git with a cup of tea`)*
|
||||||
|
- **Repository Root Path:** default *(`/data/git/repositories`)*
|
||||||
|
- **LFS Root Path:** default *(`/data/git/lfs`)*
|
||||||
|
|
||||||
|
28. Configure server settings.
|
||||||
|
|
||||||
|
- **Domain:** `git.avaaz.ai` *(external HTTPS via Caddy)*
|
||||||
|
- **SSH Port:** `2222` *(external SSH port)*
|
||||||
|
- **HTTP Port:** `3000` *(internal HTTP port)*
|
||||||
|
- **Gitea Base URL / ROOT_URL:** `https://git.avaaz.ai/`
|
||||||
|
|
||||||
|
29. Create the admin account (username + password + email) and finish installation.
|
||||||
|
|
||||||
|
30. Edit Gitea’s `/data/gitea/conf/app.ini` via the host bind mount at `/srv/infra/gitea-data/gitea/conf/app.ini`.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
nano gitea-data/gitea/conf/app.ini
|
||||||
|
```
|
||||||
|
|
||||||
|
31. Add/verify the following sections.
|
||||||
|
|
||||||
|
```ini
|
||||||
|
[server]
|
||||||
|
; Gitea serves HTTP internally (Caddy handles HTTPS externally)
|
||||||
|
PROTOCOL = http
|
||||||
|
; External hostname used for links and redirects
|
||||||
|
DOMAIN = git.avaaz.ai
|
||||||
|
; Hostname embedded in SSH clone URLs
|
||||||
|
SSH_DOMAIN = git.avaaz.ai
|
||||||
|
; Internal container port Gitea listens on (Caddy reverse-proxies to this)
|
||||||
|
HTTP_PORT = 3000
|
||||||
|
; Public-facing base URL (MUST be HTTPS when behind Caddy)
|
||||||
|
ROOT_URL = https://git.avaaz.ai/
|
||||||
|
; Enable Gitea's built-in SSH server inside the container
|
||||||
|
DISABLE_SSH = false
|
||||||
|
; Host-side SSH port exposed by Docker (mapped to container:22)
|
||||||
|
SSH_PORT = 2222
|
||||||
|
; Container-side SSH port (always 22 inside the container)
|
||||||
|
SSH_LISTEN_PORT = 22
|
||||||
|
|
||||||
|
[database]
|
||||||
|
; SQLite database file stored in bind-mounted volume
|
||||||
|
PATH = /data/gitea/gitea.db
|
||||||
|
; Using SQLite (sufficient for single-node small/medium setups)
|
||||||
|
DB_TYPE = sqlite3
|
||||||
|
|
||||||
|
[security]
|
||||||
|
; Prevent web-based reinstallation (crucial for a secured instance)
|
||||||
|
INSTALL_LOCK = true
|
||||||
|
; Auto-generated on first startup; DO NOT change or delete
|
||||||
|
SECRET_KEY =
|
||||||
|
|
||||||
|
[actions]
|
||||||
|
; Enable Gitea Actions (CI/CD)
|
||||||
|
ENABLED = true
|
||||||
|
; Default platform to get action plugins, github for https://github.com, self for the current Gitea instance.
|
||||||
|
DEFAULT_ACTIONS_URL = github
|
||||||
|
```
|
||||||
|
|
||||||
|
32. Restart Gitea to apply changes.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo docker compose restart gitea
|
||||||
|
```
|
||||||
|
|
||||||
|
33. Check if Actions is enabled.
|
||||||
|
|
||||||
|
1. Log in as admin at `https://git.avaaz.ai`.
|
||||||
|
2. Go to **Site Administration**.
|
||||||
|
3. Look for a menu item **Actions**. If `[actions] ENABLED = true` in `app.ini`, there will be options related to **Runners**, allowing management of instance-level action runners. Otherwise, the Actions menu item in the Site Administration panel will not appear, indicating the feature is globally disabled.
|
||||||
|
|
||||||
|
34. Get a registration token for the Gitea Actions runner and create a regular *user* account.
|
||||||
|
|
||||||
|
1. Log in as admin at `https://git.avaaz.ai`.
|
||||||
|
2. Go to **Site Administration → Actions → Runners**.
|
||||||
|
3. Choose **Create new Runner**.
|
||||||
|
4. Copy the **Registration Token**.
|
||||||
|
5. Create a *user* account.
|
||||||
|
|
||||||
|
35. Edit `.env` to add the token.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
nano .env
|
||||||
|
```
|
||||||
|
|
||||||
|
36. Paste the Registration Token after `=` without spaces.
|
||||||
|
|
||||||
|
```env
|
||||||
|
# One-time registration token generated in:
|
||||||
|
# Gitea → Site Administration → Actions → Runners → "Generate Token"
|
||||||
|
# This MUST be filled in once, so the runner can register.
|
||||||
|
# After registration, the runner stores its identity inside ./gitea-runner-data/.runner
|
||||||
|
# and this value is no longer needed (can be left blank).
|
||||||
|
GITEA_RUNNER_REGISTRATION_TOKEN=
|
||||||
|
```
|
||||||
|
|
||||||
|
37. Apply the configuration change by recreating the `gitea-runner` container.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo docker compose up -d gitea-runner
|
||||||
|
```
|
||||||
|
|
||||||
|
38. Confirm that the Gitea instance URL, runner name, and runner labels in the `gitea-runner-data/.runner` file match the values in the `.env` file. Fix them with `nano gitea-runner-data/.runner` if they differ.
|
||||||
|
|
||||||
|
39. Verify that the Runner is connected to `https://git.avaaz.ai` and is polling for jobs.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo docker logs -f gitea-runner
|
||||||
|
```
|
||||||
|
|
||||||
|
40. Generate an SSH key on your laptop. Accept the defaults and optionally set a passphrase. The public key is written to `~/.ssh/id_ed25519.pub`.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
ssh-keygen -t ed25519 -C "user@avaaz.ai"
|
||||||
|
```
|
||||||
|
|
||||||
|
41. Add the public key to Gitea.
|
||||||
|
|
||||||
|
1. Log into `https://git.avaaz.ai` as *user*.
|
||||||
|
2. Go to **Profile → Settings → SSH / GPG Keys → Add Key**.
|
||||||
|
3. Paste the contents starting with `ssh-ed25519` in `~/.ssh/id_ed25519.pub`.
|
||||||
|
4. Save.
|
||||||
|
|
||||||
|
42. Test the SSH remote from your laptop.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
ssh -T -p 2222 git@git.avaaz.ai
|
||||||
|
```
|
||||||
|
|
||||||
|
43. Type `yes` to tell the SSH client to trust the fingerprint and press `Enter`. Enter the passphrase (if you set one) and verify the response *You've successfully authenticated..., but Gitea does not provide shell access.*
|
||||||
|
|
||||||
|
44. Confirm that Gitea’s **clone URLs** of a repo show `ssh://git@git.avaaz.ai:2222/<user>/<repo>.git`.
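
If you later point an existing local repository at this Gitea instance, the remote can be set with the same URL format (the user and repository names are placeholders, as in the URL above):

```bash
# add a new remote to a local repository
git remote add origin ssh://git@git.avaaz.ai:2222/<user>/<repo>.git

# or update an existing remote in place
git remote set-url origin ssh://git@git.avaaz.ai:2222/<user>/<repo>.git
```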
|
||||||
|
|
||||||
|
45. Upgrade Docker images safely.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo docker compose pull # pull newer images
|
||||||
|
sudo docker compose up -d # recreate containers with new images
|
||||||
|
```
|
||||||
|
|
||||||
|
46. Restart the whole infra stack.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo docker compose restart # restart all containers
|
||||||
|
```
|
||||||
|
|
||||||
|
47. Check logs for troubleshooting.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo docker logs -f caddy # shows “obtaining certificate” or ACME errors if HTTPS fails.
|
||||||
|
sudo docker logs -f gitea # shows DB/permissions problems, config issues, etc.
|
||||||
|
sudo docker logs -f gitea-runner # shows registration/connection/job-execution issues.
|
||||||
|
```
|
||||||
|
|
||||||
|
## 5. Validate the infrastructure
|
||||||
|
|
||||||
|
1. Confirm that all containers `caddy`, `gitea`, and `gitea-runner` are `Up`.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo docker compose ps -a
|
||||||
|
```
|
||||||
|
|
||||||
|
2. Confirm that `https://git.avaaz.ai` shows the Gitea login page with a valid TLS certificate (padlock icon) when opened in a browser.
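
Optionally, run the same check from the command line; a status line without any certificate error indicates that Caddy obtained a valid certificate:

```bash
curl -sI https://git.avaaz.ai | head -n 1
```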
|
||||||
|
|
||||||
|
3. Confirm the response *You've successfully authenticated..., but Gitea does not provide shell access.* when connecting to Gitea over SSH.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
ssh -T -p 2222 git@git.avaaz.ai
|
||||||
|
```
|
||||||
|
|
||||||
|
4. Create a `test` repo in Gitea and confirm that you can clone it.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
git clone ssh://git@git.avaaz.ai:2222/<your-user>/test.git
|
||||||
|
```
|
||||||
|
|
||||||
|
5. Confirm that the Actions runner `gitea-runner` is registered and online with status **Idle**.
|
||||||
|
|
||||||
|
1. Log in as admin at `https://git.avaaz.ai`.
|
||||||
|
2. Go to **Site Administration → Actions → Runners**.
|
||||||
|
|
||||||
|
6. Add `.gitea/workflows/test.yml` to the `test` repo, then commit and push.
|
||||||
|
|
||||||
|
```yaml
|
||||||
|
# Workflow Name
|
||||||
|
name: Test Workflow
|
||||||
|
|
||||||
|
# Trigger on a push event to any branch
|
||||||
|
on:
|
||||||
|
push:
|
||||||
|
branches:
|
||||||
|
# This means 'any branch'
|
||||||
|
- '**'
|
||||||
|
|
||||||
|
# Define the jobs to run
|
||||||
|
jobs:
|
||||||
|
hello:
|
||||||
|
# Specify the runner label to use (maps to the Docker image configured for the runner)
|
||||||
|
runs-on: [ "ubuntu-latest" ]
|
||||||
|
|
||||||
|
# Define the steps for this job
|
||||||
|
steps:
|
||||||
|
- name: Run a Test Script
|
||||||
|
run: echo "Hello from Gitea Actions!"
|
||||||
|
```
|
||||||
|
|
||||||
|
7. Confirm a workflow run appears in Gitea → test repo → **Actions** tab and progresses from queued → in progress → success.
|
||||||
|
|
||||||
|
8. Confirm the logs show the job being picked up, the container being created, and the “Hello from Gitea Actions!” output.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
sudo docker logs -f gitea-runner
|
||||||
|
```
|
||||||