PX360 PROMPT ENHANCED
AlHammadi Group (Saudi Arabia).
This is a complex enterprise system. You MUST follow the process, implement incrementally, run checks, and produce a clean codebase. Do not skip steps. Do not guess silently. If a requirement is not implementable without external vendor credentials (MOH/CHI/social APIs), implement robust integration stubs with clear TODOs and testable interfaces.
⸻
- Non-Negotiable Rules (Follow Strictly)
- No monolithic app. Use the modular apps described below.
- No undocumented magic. Every critical business rule must be documented in code comments and in /docs/.
- Config-driven behavior: SLA thresholds, routing rules, journey templates, stage triggers, survey thresholds must be stored in DB models or settings, not hardcoded.
- Event-driven: Patient journey stage completion and survey sending must be triggered via integration events (stored in DB + processed by Celery).
- Everything has an API: Build DRF APIs for the main entities and operations (including journeys and stage events).
- RBAC is mandatory: Role-based access must be implemented early and enforced in views/DRF permissions.
- Auditing is mandatory: Status changes, assignments, SLA escalations, and external events must be logged.
- Deliver in checkpoints: Implement in phases; each phase must compile, migrate, and pass basic tests.
- Use UUIDs for all primary models (except Django internal tables).
- Saudi context: Multi-language support (Arabic/English in surveys and UI where applicable), data residency assumptions, security best practices.
⸻
- Tech Stack Requirements
• Python 3.11+
• Django 5.x
• DRF
• PostgreSQL
• Celery + Redis (celery-beat included)
• JWT auth (SimpleJWT)
• Bootstrap 5 templates (minimal but functional)
• django-environ for .env
• Docker + docker-compose
• Logging: structured logs with separate loggers for integrations
• Tests: pytest-django (preferred) OR Django TestCase
• Formatting/lint: ruff (preferred) OR black/isort/flake8
⸻
- Project Name + Folder Layout
Project name: px360
Create this structure:
px360/
  manage.py
  config/
    __init__.py
    settings/
      __init__.py
      base.py
      dev.py
      prod.py
    urls.py
    asgi.py
    wsgi.py
    celery.py
  apps/
    core/
    accounts/
    organizations/
    journeys/
    surveys/
    complaints/
    feedback/
    callcenter/
    social/
    px_action_center/
    analytics/
    physicians/
    projects/
    integrations/
    notifications/
    ai_engine/
  templates/
  static/
  docs/
  requirements/
  docker/
  docker-compose.yml
  Dockerfile
  .env.example
  README.md
Important: Put all business apps under apps/ for clarity.
- Implementation Phases (Must Follow)
Phase 0 — Bootstrap & Infrastructure
Deliverables: • Django project scaffolding • settings split (base/dev/prod) • environment loading via .env • Dockerfile + docker-compose (web, db, redis, celery, celery-beat) • Celery working and discoverable tasks • Health endpoint: /health/ • Basic README with run commands
Acceptance: • docker compose up works • migrations run • health endpoint returns JSON {status:"ok"}
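For reference, a minimal sketch of the Celery bootstrap and health endpoint this phase expects; the settings-module default and the view's placement under apps/core are assumptions, not final decisions:

```python
# config/celery.py -- minimal Celery bootstrap (sketch; settings module path is an assumption)
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.dev")

app = Celery("px360")
app.config_from_object("django.conf:settings", namespace="CELERY")  # read CELERY_* settings
app.autodiscover_tasks()                                            # find tasks.py in each app


# apps/core/views.py -- health endpoint (placement is an assumption)
from django.http import JsonResponse


def health(request):
    return JsonResponse({"status": "ok"})
```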
⸻
Phase 1 — Core + Accounts + RBAC + Audit
Deliverables:
• core app:
  • TimeStampedModel, UUIDModel, SoftDeleteModel (if needed)
  • AuditEvent model (generic audit log)
  • reusable enums/choices
• accounts app:
  • Custom User model (extends AbstractUser)
  • Role model or role choices with Group mapping
  • Permission utilities + DRF permission classes
  • Create default roles & groups via data migration/management command
• Authentication:
  • JWT endpoints
• Audit logging hooks:
  • record login, role changes, status changes (later) using helper service
Acceptance: • Can create users with roles • RBAC enforced on at least one endpoint • AuditEvent records at least user login and a sample action
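As a reference sketch, one possible shape for the core abstract bases and the generic audit log; the AuditEvent fields shown (actor, verb, target, metadata) are assumptions to adapt:

```python
# apps/core/models.py -- reusable bases + generic audit log (sketch; field names are assumptions)
import uuid

from django.conf import settings
from django.db import models


class UUIDModel(models.Model):
    id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)

    class Meta:
        abstract = True


class TimeStampedModel(models.Model):
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    class Meta:
        abstract = True


class AuditEvent(UUIDModel, TimeStampedModel):
    actor = models.ForeignKey(settings.AUTH_USER_MODEL, null=True, blank=True, on_delete=models.SET_NULL)
    verb = models.CharField(max_length=100)           # e.g. "auth.login", "complaint.status_changed"
    target_type = models.CharField(max_length=100)    # app_label.ModelName of the affected object
    target_id = models.CharField(max_length=64, blank=True)
    metadata = models.JSONField(default=dict, blank=True)
```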
⸻
Phase 2 — Organizations (Hospitals/Departments/People)
Deliverables:
- organizations app models:
- Hospital, Department (hierarchical), Physician, Employee, Patient
- Admin pages for these models
- DRF endpoints for CRUD (admin/authorized only)
Acceptance:
- CRUD works, permissions enforced
⸻
Phase 3 — Journeys (EMS/Inpatient/OPD) + Event Intake
Deliverables:
• journeys app models (templates + instances):
  • JourneyType enum: EMS / INPATIENT / OPD
  • PatientJourneyTemplate
  • PatientJourneyStageTemplate
  • PatientJourneyInstance
  • PatientJourneyStageInstance
• Journey configuration UI (admin) + APIs
• Integration event table (in integrations app or journeys app):
  • InboundEvent model to store events from HIS/Lab/Radiology/Pharmacy etc.
  • Fields: source_system, event_code, payload_json, encounter_id, patient_identifier, received_at, processed_at, status, error
• An API endpoint to receive events:
  • POST /api/integrations/events/ (secured by API key or JWT service account)
• Celery task that processes unprocessed inbound events:
  • resolves journey by encounter_id
  • completes stage based on trigger_event_code
  • writes audit logs
Acceptance: • Can define an OPD journey template with stages • Can create a journey instance for an encounter • Posting an inbound event completes the correct stage instance
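For reference, a sketch of the InboundEvent store using the fields listed above; the status choices and field lengths are assumptions:

```python
# apps/integrations/models.py -- inbound event store (sketch)
from django.db import models

from apps.core.models import TimeStampedModel, UUIDModel


class InboundEvent(UUIDModel, TimeStampedModel):
    class Status(models.TextChoices):
        PENDING = "pending"
        PROCESSED = "processed"
        FAILED = "failed"

    source_system = models.CharField(max_length=50)        # HIS / LAB / RADIOLOGY / PHARMACY ...
    event_code = models.CharField(max_length=100, db_index=True)
    payload_json = models.JSONField(default=dict)
    encounter_id = models.CharField(max_length=100, db_index=True)
    patient_identifier = models.CharField(max_length=100, blank=True)
    received_at = models.DateTimeField(auto_now_add=True)
    processed_at = models.DateTimeField(null=True, blank=True)
    status = models.CharField(max_length=20, choices=Status.choices, default=Status.PENDING)
    error = models.TextField(blank=True)
```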
⸻
Phase 4 — Surveys (Journey Stage Surveys) + Delivery
Deliverables:
• surveys app models:
  • SurveyTemplate, SurveyQuestion
  • SurveyInstance, SurveyResponse
  • Survey language support (ar/en text fields)
• Survey link generation:
  • use signed token URL to prevent guessing
• Survey submission endpoint:
  • POST /api/surveys/submit/ or instance-specific endpoints
• Journey integration:
  • When a stage completes, if:
    • auto_send_survey=True
    • survey_template exists
  • Then create SurveyInstance linked to:
    • journey_instance
    • journey_stage_instance
    • encounter_id
• notifications app:
  • unify send interface:
    • send_sms(), send_whatsapp(), send_email()
  • for now: implement a “console backend” that logs messages in DB (NotificationLog)
• Celery tasks:
  • send_survey_invitation(instance_id)
  • retry logic on failure
• Stage survey examples:
  • OPD_MD_CONSULT → survey for MD interaction
  • LAB → lab experience
  • RADIOLOGY → radiology
  • PHARMACY → pharmacy
Acceptance: • Completing a stage triggers a SurveyInstance • NotificationLog records outbound survey invitation • Survey can be filled and responses saved
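A possible sketch of the stage-completion hook that creates and queues the survey; the attribute names on the stage/template instances are assumptions aligned with the models above:

```python
# apps/surveys/services.py -- called when a stage instance is marked Completed (sketch)
from apps.surveys.models import SurveyInstance
from apps.surveys.tasks import send_survey_invitation


def create_survey_for_completed_stage(stage_instance):
    template = stage_instance.template
    if not (template.auto_send_survey and template.survey_template_id):
        return None
    instance = SurveyInstance.objects.create(
        survey_template=template.survey_template,
        journey_instance=stage_instance.journey_instance,
        journey_stage_instance=stage_instance,
        encounter_id=stage_instance.journey_instance.encounter_id,
    )
    # Deliver asynchronously so event processing never blocks on SMS/WhatsApp/email gateways.
    send_survey_invitation.delay(str(instance.id))
    return instance
```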
⸻
Phase 5 — Complaints/Inquiries/Feedback + Complaint Resolution Satisfaction
Deliverables:
• complaints app models:
  • Complaint, ComplaintAttachment, ComplaintUpdate/Timeline
  • Inquiry
• Complaint SLA: due_at computed based on severity config
• Complaint workflow: open → in progress → resolved → closed
• Complaint Resolution Satisfaction:
  • When complaint closes, automatically send “resolution satisfaction” survey
  • Low score triggers PXAction creation (see next phase)
• API endpoints and admin
Acceptance: • Creating a complaint computes its SLA due date • Closing a complaint triggers the satisfaction survey send
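One way to sketch the due_at computation; the fallback hours and the config model's fields are assumptions standing in for the DB-driven severity config:

```python
# apps/complaints/services.py -- SLA due date from severity config (sketch)
from datetime import timedelta

from django.utils import timezone

DEFAULT_SLA_HOURS = {"low": 120, "medium": 72, "high": 24, "critical": 4}  # placeholder values


def compute_due_at(complaint, config_model=None):
    hours = DEFAULT_SLA_HOURS.get(complaint.severity, 72)
    if config_model is not None:
        # config_model is a hypothetical per-hospital SLA config (see section 4.4)
        row = config_model.objects.filter(
            hospital=complaint.hospital, severity=complaint.severity
        ).first()
        if row:
            hours = row.hours_to_resolve
    return timezone.now() + timedelta(hours=hours)
```
⸻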
Phase 6 — PX Action Center (SLA engine, escalations, evidence, approval)
Deliverables:
• px_action_center models:
  • PXAction, PXActionLog, PXActionAttachment, PXActionSLAConfig, RoutingRule
• Automation triggers: create PXAction when:
  • complaint created (optional config)
  • negative stage survey score
  • negative complaint-resolution satisfaction
  • negative social sentiment
  • low call center rating
  • KPI decline (later)
• SLA reminders and escalation Celery tasks:
  • reminder X hours before due
  • escalate when overdue (assign to next level)
• PX review/approval:
  • closing action requires PX role approval flag
Acceptance: • Action is created automatically by at least 2 triggers • Overdue escalates automatically • PX approval enforced
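A sketch of the overdue-escalation sweep run by celery-beat; the PXAction field names and the routing lookup are assumptions:

```python
# apps/px_action_center/tasks.py -- escalation sweep (sketch)
from celery import shared_task
from django.utils import timezone

from apps.px_action_center.models import PXAction


def resolve_next_level_assignee(action):
    # Hypothetical helper: consult RoutingRule config (department manager first, then PX team).
    return action.assignee  # TODO: replace with real routing lookup


@shared_task
def escalate_overdue_actions():
    overdue = PXAction.objects.filter(
        status__in=["open", "in_progress"], due_at__lt=timezone.now()
    )
    for action in overdue:
        action.escalation_level += 1
        action.assignee = resolve_next_level_assignee(action)
        action.save(update_fields=["escalation_level", "assignee"])
        # Record a PXActionLog / AuditEvent entry and notify the new assignee here.
```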
⸻
Phase 7 — Call Center + Social Media + AI Engine (Foundations)
Deliverables:
• callcenter models for ratings
• social models for mentions
• ai_engine foundation:
  • AISentimentResult model linked generically
  • A service interface for sentiment scoring (stub)
• When new text enters (complaints/comments/social), create AISentimentResult via Celery task (stubbed)
• Negative sentiment flags drive PXActions (config)
Acceptance: • Create mention → sentiment task runs → PXAction created if negative
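A sketch of the stubbed sentiment interface (a keyword heuristic as placeholder); a Celery task would wrap score_text(), persist an AISentimentResult, and open a PXAction when the label is negative:

```python
# apps/ai_engine/services.py -- sentiment scoring stub (sketch; replace with vendor/model later)
from dataclasses import dataclass


@dataclass
class SentimentScore:
    label: str    # "positive" / "neutral" / "negative"
    score: float  # 0.0 (very negative) .. 1.0 (very positive)


NEGATIVE_HINTS = ("bad", "worst", "delay", "rude", "dirty")  # placeholder keyword list


def score_text(text: str) -> SentimentScore:
    lowered = text.lower()
    if any(word in lowered for word in NEGATIVE_HINTS):
        return SentimentScore(label="negative", score=0.2)
    return SentimentScore(label="neutral", score=0.5)
```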
⸻
Phase 8 — Analytics, KPIs, Dashboards, Physician Hub, QI Projects
Deliverables:
• analytics: KPI, KPIValue, dashboard endpoints
• physicians: monthly rating aggregation from surveys
• projects: QI projects and tasks
• Basic “PX Command Center” UI:
  • active complaints count
  • overdue actions count
  • latest negative sentiment
  • stage survey averages by department
Acceptance: • Dashboard endpoint returns aggregated results • Physician monthly rating computed for a test month
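A sketch of the physician monthly aggregation; the SurveyInstance field names (submitted_at, summary_score, physician) are assumptions:

```python
# apps/physicians/services.py -- monthly rating aggregation from surveys (sketch)
from django.db.models import Avg, Count

from apps.surveys.models import SurveyInstance


def physician_monthly_ratings(year: int, month: int):
    return (
        SurveyInstance.objects.filter(
            submitted_at__year=year,
            submitted_at__month=month,
            physician__isnull=False,
        )
        .values("physician_id")
        .annotate(avg_score=Avg("summary_score"), responses=Count("id"))
        .order_by("-avg_score")
    )
```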
⸻
- Detailed Business Logic Specifications
4.1 Patient Journey Definition (Key Requirement)
The system must allow PX Admin to define journeys for: • EMS • INPATIENT • OPD
A journey is a template of ordered stages. Each stage: • has trigger_event_code that maps to an incoming integration event • optionally has a survey_template • can automatically send survey when the stage completes
OPD Example: • OPD_MD_CONSULT (trigger: OPD_VISIT_COMPLETED) → send MD survey • LAB (trigger: LAB_ORDER_COMPLETED) → send Lab survey • RADIOLOGY (trigger: RADIOLOGY_REPORT_FINALIZED) → send Radiology survey • PHARMACY (trigger: PHARMACY_DISPENSED) → send Pharmacy survey
The journey instance is linked to an encounter_id. The same encounter can trigger multiple stages in sequence, and each stage may send its own survey.
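For reference, a sketch of the stage-template fields implied by this definition; ordering and length limits are assumptions:

```python
# apps/journeys/models.py -- stage template (sketch; field names follow this section)
from django.db import models

from apps.core.models import TimeStampedModel, UUIDModel


class PatientJourneyStageTemplate(UUIDModel, TimeStampedModel):
    journey_template = models.ForeignKey(
        "journeys.PatientJourneyTemplate", related_name="stages", on_delete=models.CASCADE
    )
    name = models.CharField(max_length=100)                # e.g. OPD_MD_CONSULT
    order = models.PositiveIntegerField()
    trigger_event_code = models.CharField(max_length=100)  # e.g. LAB_ORDER_COMPLETED
    auto_send_survey = models.BooleanField(default=False)
    survey_template = models.ForeignKey(
        "surveys.SurveyTemplate", null=True, blank=True, on_delete=models.SET_NULL
    )

    class Meta:
        ordering = ["order"]
```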
4.2 Stage Completion Rules
When an inbound event arrives:
1. Identify journey instance via encounter_id
2. Find the first matching stage where:
   • stage_template.trigger_event_code == event.event_code
   • stage_instance.status != Completed
3. Mark it Completed with timestamps
4. Attach optional physician/department from payload
5. Audit log the completion
6. If stage has auto survey enabled → create survey instance and queue send task
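A sketch of a Celery task implementing these steps; the stage_instances related name and the complete_stage() helper are hypothetical:

```python
# apps/integrations/tasks.py -- process pending inbound events (sketch)
from celery import shared_task
from django.utils import timezone

from apps.integrations.models import InboundEvent
from apps.journeys.models import PatientJourneyInstance
from apps.journeys.services import complete_stage  # hypothetical helper covering steps 3-6


@shared_task
def process_inbound_events():
    for event in InboundEvent.objects.filter(status=InboundEvent.Status.PENDING):
        try:
            journey = PatientJourneyInstance.objects.get(encounter_id=event.encounter_id)  # step 1
            stage = (
                journey.stage_instances.filter(template__trigger_event_code=event.event_code)  # step 2
                .exclude(status="completed")
                .order_by("template__order")
                .first()
            )
            if stage:
                complete_stage(stage, event)  # steps 3-6: timestamps, physician/department, audit, survey
            event.status = InboundEvent.Status.PROCESSED
        except PatientJourneyInstance.DoesNotExist:
            event.status = InboundEvent.Status.FAILED
            event.error = "No journey instance found for encounter_id"
        event.processed_at = timezone.now()
        event.save(update_fields=["status", "error", "processed_at"])
```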
4.3 Survey Delivery Rules
• Each survey invitation uses a secure signed link token
• Survey templates are bilingual (ar/en)
• Delivery channels:
  • SMS (default)
  • WhatsApp / Email (optional config)
• Sending must be Celery-based and record NotificationLog
• Survey responses:
  • store per question
  • compute summary score (template-defined scoring rules)
  • if score below threshold → trigger PXAction
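A sketch of the signed link token using django.core.signing; the salt and the 7-day expiry are assumptions that should be made configurable:

```python
# apps/surveys/tokens.py -- signed survey link tokens (sketch)
from django.core import signing

SALT = "px360.survey-invite"
MAX_AGE_SECONDS = 7 * 24 * 3600  # assumed 7-day link validity


def make_survey_token(survey_instance_id) -> str:
    return signing.dumps({"sid": str(survey_instance_id)}, salt=SALT)


def read_survey_token(token: str) -> str:
    """Return the survey instance id; raises signing.BadSignature / SignatureExpired
    for tampered or stale links."""
    data = signing.loads(token, salt=SALT, max_age=MAX_AGE_SECONDS)
    return data["sid"]
```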
4.4 Threshold & Routing Configuration (Must Be DB-Driven)
Add models/config to control:
• per hospital:
  • survey score threshold for “negative”
  • SLA durations by priority
  • routing rules (department manager vs PX team)
• per stage:
  • auto_send_survey
  • assigned survey template
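A sketch of the per-hospital config model (per-stage flags already live on the stage template); app placement and field names are assumptions:

```python
# A possible home: apps/organizations/models.py or a dedicated config app (sketch)
from decimal import Decimal

from django.db import models

from apps.core.models import TimeStampedModel, UUIDModel


class HospitalPXConfig(UUIDModel, TimeStampedModel):
    hospital = models.OneToOneField("organizations.Hospital", on_delete=models.CASCADE)
    negative_survey_threshold = models.DecimalField(
        max_digits=4, decimal_places=2, default=Decimal("3.00")
    )
    sla_hours_by_priority = models.JSONField(default=dict)  # e.g. {"high": 24, "medium": 72}
    route_to_department_manager = models.BooleanField(default=True)  # else route to PX team
```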
⸻
- APIs (Must Implement)
Implement DRF endpoints with filtering, pagination, and permissions: • Auth: • /api/auth/token/, /api/auth/refresh/ • Organizations: • hospitals, departments, physicians, employees, patients • Journeys: • journey templates, stage templates • journey instances, stage instances • Integrations: • inbound events create + list • Surveys: • templates/questions • instances • submit responses • Complaints/Inquiries: • CRUD + status transitions • Action Center: • actions list/detail • status changes • attachments/logs • Call center: • interactions CRUD • Social: • mentions CRUD • Analytics: • dashboard summary endpoint
Include OpenAPI/Swagger via drf-spectacular.
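For reference, the standard drf-spectacular wiring (a sketch; titles and URL names are placeholders):

```python
# config/settings/base.py (excerpt)
REST_FRAMEWORK = {
    "DEFAULT_SCHEMA_CLASS": "drf_spectacular.openapi.AutoSchema",
}
SPECTACULAR_SETTINGS = {"TITLE": "PX360 API", "VERSION": "1.0.0"}

# config/urls.py (excerpt)
from django.urls import path
from drf_spectacular.views import SpectacularAPIView, SpectacularSwaggerView

urlpatterns = [
    path("api/schema/", SpectacularAPIView.as_view(), name="schema"),
    path("api/docs/", SpectacularSwaggerView.as_view(url_name="schema"), name="swagger-ui"),
]
```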
- UI Templates & Admin Console (Comprehensive, Modern, Best-Practice)
You must build a modern, control-panel style web UI (Bootstrap 5 + best-practice Django templates) that provides: • Comprehensive CRUD coverage for all core modules • Dashboards and control panels as first-class features (not an afterthought) • Fast filtering, saved views, and usable tables • Action-oriented workflows: assign, escalate, close, approve, export • Consistent UI layout with reusable components and partials
6.1 UI Framework and Layout Rules
Use Bootstrap 5 and implement a consistent layout system:
Base layout structure • templates/layouts/base.html • top navbar (global search, notifications, user menu) • left sidebar navigation (module links + quick counts) • main content area • footer • templates/layouts/partials/ • sidebar.html (RBAC-based nav) • topbar.html • breadcrumbs.html • flash_messages.html • filters_panel.html (collapsible) • pagination.html • table_toolbar.html (export, bulk actions) • stat_cards.html (dashboard metric cards) • charts_panel.html • modal_form.html (optional)
Front-end best practices • Use Bootstrap components: cards, offcanvas/accordion, badges, progress bars, nav tabs, modals, tooltips. • Use DataTables (or a lightweight alternative) for advanced tables: • search, sorting, server-side pagination where needed • column toggles • export actions • Use Select2 for all large foreign-key selects (patients, departments, physicians, staff). • Use HTMX (recommended) for partial updates: • filters update list without full refresh • status changes inline (e.g., close action) • quick assign from list view • Use Chart.js or ApexCharts for dashboards: • trends, distributions, leaderboards, time series • Ensure RTL-ready templates (Arabic support): • Provide an RTL base CSS toggle and dir="rtl" handling. • Accessibility: • proper labels, aria, keyboard-friendly forms
6.2 UI Apps Coverage (Full CRUD + Control Panels)
You must implement full UI pages for:
A) PX Command Center Dashboard (Group-Level Real-Time Control Panel) Create a primary dashboard: dashboard/command-center/
It must include:
Top KPI strip (cards) • Active complaints • Overdue complaints (SLA breach) • Open PX actions • Overdue PX actions • Negative stage surveys (last 24h / 7d) • Negative social mentions • Call center low ratings • Complaint resolution dissatisfaction rate
Charts (interactive) • Complaints trend by day/week/month • SLA compliance trend (complaints + actions) • Survey satisfaction averages by stage (OPD MD / LAB / Radiology / Pharmacy) • Sentiment distribution (positive/neutral/negative) • Department leaderboard (best/worst) • Physician leaderboard (best/worst)
Live feed widgets • Latest high severity complaints • Latest escalated actions • Latest negative comments/mentions with quick “Create Action” button • Stage events log stream (latest integration events processed)
Filters (persisted) • Date range • Hospital • Department • Journey type (EMS/Inpatient/OPD) • Source (HIS/MOH/CHI/PXConnect/Social/Internal)
Actions • Quick create complaint • Quick create action • Export dashboard snapshot (PDF/Excel)
RBAC rules: • PX Admin can see all • Hospital admin sees their hospital only • Department manager sees their department only
⸻
B) Complaints Console (Complete CRUD + Workflow + SLA) Provide a full complaints module UI:
Pages 1. Complaints List
• table with advanced filters:
• status, severity, priority, SLA status (due soon / overdue)
• source
• hospital, department
• physician (optional)
• date range
• bulk actions:
• assign to department/staff
• change status
• export selected
• quick actions from row:
• “View”
• “Assign”
• “Escalate”
• “Add Note”
• “Close”
• SLA badges and progress indicator
2. Complaint Create / Edit form
• structured form sections:
• patient info (select patient or enter MRN)
• encounter linking (encounter_id)
• classification (category/subcategory)
• description + attachments
• assignment + priority
• Use Select2 for patient/department/physician
3. Complaint Detail View (Case Management)
• case header (status, SLA countdown, assigned owner)
• timeline feed (updates, status changes, attachments, notes)
• tabs:
• Details
• Timeline
• Attachments
• Related Journey Stages & Surveys
• Actions (linked PX actions)
• Buttons:
• assign, escalate, change status
• “Trigger Resolution Satisfaction Survey” (admin only)
• “Create Action” (if not already)
• SLA indicator + audit log summary
4. Complaint Resolution Satisfaction Dashboard
• score distribution
• by department/hospital
• dissatisfied list with drill-down
Exports • export list or selected complaints to CSV/Excel/PDF
⸻
C) PX Action Center Console (Operational Hub) This must be a control room UI, not just CRUD.
Pages 1. Action Center Board (List + Advanced Filters)
• filters: status, SLA, priority, type/source, department, assignee
• views: “My Actions”, “Overdue”, “Escalated”, “New from Surveys”, “From Complaints”, “From Social”
• bulk actions:
• assign
• update status
• escalate
• row quick actions:
• open
• change status
• attach evidence
• request approval
2. Action Detail View (Workflow + Evidence + Approval)
• action header: SLA progress bar, owner, linked source object
• tabs:
• Details
• Evidence & Attachments
• Discussion / Logs
• SLA History
• Approval
• approval flow:
• department closes → PX reviews/approves
• prevent closure without required evidence if configured
• show related complaints/surveys/mentions
3. SLA Control Panel (Admin)
• manage SLA configs by:
• priority/severity
• department/hospital
• manage escalation rules (who gets escalated next)
⸻
D) Patient Journey Console (Templates + Instances + Monitoring) This must be comprehensive because it’s core to your survey logic.
Journey Template Builder UI • Journey templates list (EMS/Inpatient/OPD) • Create/edit template • Stage builder: • reorder stages (drag-drop if possible, else order field) • set trigger_event_code • set auto_send_survey • bind survey template • require physician or not • Test panel: • simulate event code on a test encounter to validate mapping
Journey Instances Monitoring • Instances list with filters: • journey type • hospital/department • stage status • encounter_id • patient MRN • Instance detail: • progress stepper (stages in sequence) • each stage card shows: • status (pending/in progress/completed) • completion time • linked dept/physician • survey instance status + link • “Send survey now” (admin) / “Resend invitation” • event history timeline: • inbound events used to complete stages • analytics snapshot: • average score per stage • negative flags
⸻
E) Surveys Console (Templates + Instances + Analytics) This is not just forms — it’s a survey control center.
Survey Templates • list templates • create/edit template • question builder: • add question • reorder • bilingual text • type • required • branch logic UI (simple JSON editor acceptable, but validate JSON)
Survey Instances • list/filter instances (status, journey type, stage) • detail view: • invitation status • responses • computed score • sentiment results • “Create Action” if negative
Survey Analytics Dashboard • satisfaction by: • hospital • department • physician • journey type • stage • trends over time • negative survey heatmap by department/stage
⸻
F) Survey Public Form (Token Link — Modern UX) This page is used by patients; it must be mobile-first and clean.
Requirements: • token-based access (signed token) • bilingual toggle (Arabic/English) • progress indicator • validation and friendly error messages • thank-you page • optional NPS style components • should work beautifully on mobile
⸻
G) Social Media Monitoring Console • mentions feed with filters (platform, sentiment, hospital) • sentiment trend chart • “Create Action” button per negative mention • mention detail page with AI sentiment breakdown
⸻
H) Call Center Ratings Console • interactions list + filters (agent, dept, score range) • dashboard: • avg waiting time • avg satisfaction • lowest performing agents/depts • “Create Action” on low rating
⸻
I) Analytics / KPI Admin Console • KPI definitions CRUD • KPI values list • KPI dashboard: • threshold breaches • trend lines • department rankings • ability to trigger actions from KPI breaches
⸻
J) Admin “System Configuration” Console Provide a UI for: • routing rules (where actions go) • thresholds (negative survey threshold, dissatisfaction threshold) • notification gateways configuration (toggle SMS/WhatsApp/email modes) • integration endpoint configs (placeholders)
⸻
6.3 UI Architecture Requirements (Don’t Skip) • Use Django Class-Based Views + ModelForm best practices. • Use Form validation + server-side errors displayed in UI. • Use reusable template patterns: • *_list.html, *_detail.html, _form.html, partials/ • Use consistent breadcrumb navigation. • Use consistent UI status badges: • Open/Closed/Overdue/Completed • Add “Empty state” designs when no data exists. • Add “Saved filters”: • store user filter presets (DB model SavedViewFilter) • Add Export: • CSV/Excel for lists • PDF for detail summary
6.4 Performance Requirements for UI • Use server-side pagination everywhere for large lists. • Use indexes in DB for filtering fields (status, dates, hospital, department). • Use select_related/prefetch_related in views.
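An example list view following these rules (a sketch; the Complaint field names are assumptions):

```python
# apps/complaints/views.py -- paginated, join-optimized list view (sketch)
from django.views.generic import ListView

from apps.complaints.models import Complaint


class ComplaintListView(ListView):
    model = Complaint
    paginate_by = 25                      # server-side pagination
    template_name = "complaints/complaint_list.html"

    def get_queryset(self):
        return (
            Complaint.objects.select_related("hospital", "department", "assigned_to")
            .order_by("-created_at")      # pair with DB indexes on status, dates, hospital, department
        )
```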
6.5 Acceptance Criteria for UI (Must Pass)
By the end: • Every core model has full CRUD pages (create/list/detail/edit/delete where safe). • Dashboards show real aggregated metrics (not placeholder values). • Action Center supports assignment, evidence upload, closure approval. • Journey builder can define stages and link surveys. • Journey instance view shows stage progression and survey status. • Public survey form works via token and saves responses. • UI is consistent, modern, and usable as a control panel.
- Security & Compliance • RBAC: enforce in DRF permission classes • Audit logs for critical events • Data encryption: document at-rest & in-transit expectations • Data residency: mention KSA hosting requirement in docs • Rate-limit integration endpoint if possible (simple middleware or DRF throttle)
⸻
- Deliverables Checklist (Must Produce)
When finished, ensure the repository includes: • Full Django project with apps • Migrations for all apps • Admin configuration • DRF APIs + permissions • Celery tasks for events, survey sending, SLA reminders • Docker setup • .env.example • README.md • /docs/ including: • Architecture overview • Journey & survey engine explanation • Integration event contracts (example payload JSON) • SLA and routing config explanation • Minimal tests verifying: • event completes stage • stage triggers survey • negative survey triggers action • overdue action escalates (can be tested with time override)
⸻
- Final Output Requirements
At the end, provide: 1. A list of created apps and their responsibilities 2. Key endpoints list 3. How to run locally (docker compose) 4. Example API calls: • create journey template • create encounter journey instance • post event to complete stage • verify survey send log
⸻
- Start Now
Proceed phase-by-phase. After each phase: • run migrations • run basic tests • confirm acceptance criteria met • then continue.
Do not jump ahead. Do not leave broken code. Do not omit the journey-stage survey requirement.