Update PRD, plan, and agent instructions

2025-12-04 08:26:58 +01:00
parent d6b61ae8fb
commit 20d7a66d57
3 changed files with 89 additions and 18 deletions


@@ -84,6 +84,7 @@ The goal is to deliver an end-to-end, voice-first AI speaking coach that support
- `UploadDocument` / `DocumentChunk`: uploaded files and parsed chunks with `vector` embeddings (stored alongside or extending the existing backend package under `app/backend`).
- `ProgressSnapshot`: aggregate metrics for dashboards (per user and optionally per program).
- `Subscription` / `PaymentEvent`: billing state and usage limits.
- **Note:** Seed the database with the specific plans defined in `README.md` (First Light, Spark, Glow, Shine, Radiance) and their respective limits.
- Add related Alembic migrations; verify they run cleanly on the dev DB (a sketch of these models follows below).
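
A minimal sketch, assuming SQLAlchemy 2.0 models with the `pgvector` extension, of how the `DocumentChunk` embedding column and the plan seed data could look; the table/column names, the 1536-dimension embedding, and the limit values are illustrative assumptions rather than details from the PRD.

```python
# Illustrative sketch only: column names, the 1536-dim embedding size, and the
# plan limits below are assumptions, not values specified in the PRD/README.
from pgvector.sqlalchemy import Vector
from sqlalchemy import ForeignKey, String, Text
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):
    pass


class UploadDocument(Base):
    __tablename__ = "upload_documents"

    id: Mapped[int] = mapped_column(primary_key=True)
    # Assumes the existing backend package already defines a "users" table.
    user_id: Mapped[int] = mapped_column(ForeignKey("users.id"))
    filename: Mapped[str] = mapped_column(String(255))


class DocumentChunk(Base):
    __tablename__ = "document_chunks"

    id: Mapped[int] = mapped_column(primary_key=True)
    document_id: Mapped[int] = mapped_column(ForeignKey("upload_documents.id"))
    content: Mapped[str] = mapped_column(Text)
    # pgvector column; the dimension must match the embedding model in use.
    embedding: Mapped[list[float]] = mapped_column(Vector(1536))


# Plan names come from README.md; the usage limits are placeholders that an
# Alembic data migration (e.g. op.bulk_insert) or a startup seeder could insert.
SEED_PLANS = [
    {"name": "First Light", "monthly_minutes": 10},
    {"name": "Spark", "monthly_minutes": 60},
    {"name": "Glow", "monthly_minutes": 150},
    {"name": "Shine", "monthly_minutes": 400},
    {"name": "Radiance", "monthly_minutes": 1000},
]
```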
**Deliverables**
@@ -129,6 +130,8 @@ The goal is to deliver an end-to-end, voice-first AI speaking coach that support
- Respect retention settings.
- Implement post-session summarization endpoint / background job:
- Generate per-session summary, strengths/weaknesses, recommended next steps.
- Implement on-demand translation:
- Endpoint (e.g., `/chat/translate`) or integrated socket message to translate user/AI text between the target and native languages (supporting PRD Section 4.2); a sketch of such an endpoint follows below.
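
A minimal sketch of what the on-demand translation endpoint could look like, assuming a FastAPI backend; the request/response models and the `translate_text` helper are hypothetical and only illustrate the contract the chat UI would call, not the PRD's actual implementation.

```python
# Hypothetical sketch of the on-demand translation endpoint; the request/response
# shapes and the translate_text() helper are assumptions for illustration.
from fastapi import APIRouter
from pydantic import BaseModel

router = APIRouter(prefix="/chat")


class TranslateRequest(BaseModel):
    text: str
    source_lang: str  # e.g. the learner's target language, "en"
    target_lang: str  # e.g. the learner's native language, "de"


class TranslateResponse(BaseModel):
    translated_text: str


async def translate_text(text: str, source_lang: str, target_lang: str) -> str:
    """Placeholder for the actual LLM / translation-provider call."""
    raise NotImplementedError


@router.post("/translate", response_model=TranslateResponse)
async def translate(req: TranslateRequest) -> TranslateResponse:
    translated = await translate_text(req.text, req.source_lang, req.target_lang)
    return TranslateResponse(translated_text=translated)
```

The same request/response contract could instead be carried as a dedicated message type over the realtime socket, per the "integrated socket message" option above.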
**Deliverables**
@@ -248,6 +251,7 @@ The goal is to deliver an end-to-end, voice-first AI speaking coach that support
- Build `ChatInterface.tsx`:
- Microphone controls, connection status, basic waveform/level visualization.
- Rendering of AI and user turns with text and visual aids (images, tables) as provided by backend/agent.
- **Translation Support:** UI controls to translate specific messages on demand (toggle or click-to-translate).
- Error states for mic and network issues; text-only fallback UI.
- Integrate with backend session APIs:
- On login, call `GET /sessions/default`, then `POST /sessions/default/token` (both sketched below).
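
Since the client-side flow above depends on what the backend returns, here is a minimal sketch, assuming FastAPI, of the two session endpoints the UI calls on login; the response fields (session id, ephemeral realtime token, expiry) and their values are illustrative assumptions, not the PRD's defined contract.

```python
# Hypothetical sketch of the session endpoints the chat UI calls on login;
# response fields and values are illustrative assumptions.
from fastapi import APIRouter
from pydantic import BaseModel

router = APIRouter(prefix="/sessions")


class SessionInfo(BaseModel):
    session_id: str
    status: str


class SessionToken(BaseModel):
    token: str
    expires_in_seconds: int


@router.get("/default", response_model=SessionInfo)
async def get_default_session() -> SessionInfo:
    """Return (or lazily create) the user's default coaching session."""
    return SessionInfo(session_id="default", status="ready")


@router.post("/default/token", response_model=SessionToken)
async def create_default_session_token() -> SessionToken:
    """Mint a short-lived token the browser uses to open the realtime voice connection."""
    return SessionToken(token="<ephemeral-token>", expires_in_seconds=60)
```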