Survey Analytics Frontend Implementation
Overview
This document describes the Phase 1 implementation of survey tracking analytics in the frontend, providing administrators with comprehensive visibility into patient survey engagement metrics.
Implementation Summary
Phase 1: Quick Wins - Enhanced existing survey instance list page with tracking analytics
What Was Added
1. Stat Cards (8 total)
Primary Statistics Row
- Total Surveys - Overall count of surveys
- Opened - Number of surveys that were opened at least once (NEW)
- Completed - Number of completed surveys with response rate
- Negative - Number of negative surveys requiring attention
Secondary Statistics Row (NEW)
- In Progress - Surveys started but not completed
- Viewed - Surveys opened but not started
- Abandoned - Surveys left incomplete for >24 hours
- Avg Completion Time - Average time in seconds to complete surveys
2. Charts (6 total)
Primary Charts Row (NEW)
- Engagement Funnel Chart
  - Visualizes: Sent → Opened → In Progress → Completed
  - Shows conversion rates at each stage
  - Identifies where patients drop off
  - Horizontal bar chart with percentages
- Completion Time Distribution
  - Categories: < 1 min, 1-5 min, 5-10 min, 10-20 min, 20+ min
  - Shows how long patients take to complete surveys
  - Helps identify optimal survey length
  - Vertical bar chart
- Device Type Distribution
  - Breakdown: Mobile, Tablet, Desktop
  - Shows what devices patients use
  - Helps optimize survey design for devices
  - Donut chart with percentages
Secondary Charts Row (Existing)
- Score Distribution - Distribution of survey scores (1-2, 2-3, 3-4, 4-5)
- Survey Types - Breakdown by survey type (Journey Stage, Complaint Resolution, General, NPS)
- 30-Day Trend - Line chart showing sent vs completed over time
Files Modified
Backend
apps/surveys/ui_views.py
- Added tracking statistics calculation
- Added engagement funnel data
- Added completion time distribution
- Added device type distribution
- Extended survey_instance_list view context
Frontend
templates/surveys/instance_list.html
- Added 4 new stat cards
- Added 3 new chart containers
- Added ApexCharts configurations for new charts
- Maintained existing charts and functionality
Data Flow
SurveyInstance Model (tracking fields)
↓
survey_instance_list view
↓
Calculate statistics:
- open_count > 0 → opened_count
- status='in_progress' → in_progress_count
- status='abandoned' → abandoned_count
- status='viewed' → viewed_count
- Avg(time_spent_seconds) → avg_completion_time
- Avg(opened_at - sent_at) → avg_time_to_open
↓
Generate visualization data:
- Engagement funnel
- Completion time distribution
- Device type distribution
↓
Template context
↓
ApexCharts render
↓
Visual analytics dashboard
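The statistics step above could be implemented roughly as follows. This is a minimal sketch, assuming the field and status names shown in the data flow (open_count, time_spent_seconds, opened_at, sent_at); check them against the actual SurveyInstance model.

```python
# Hypothetical sketch of the statistics step in survey_instance_list.
# Field and status names are taken from the data-flow notes above and
# may differ from the real model.
from django.db.models import Avg, Count, DurationField, ExpressionWrapper, F, Q


def calculate_tracking_stats(queryset):
    """Aggregate engagement statistics over a SurveyInstance queryset."""
    return queryset.aggregate(
        total=Count('id'),
        opened=Count('id', filter=Q(open_count__gt=0)),
        completed=Count('id', filter=Q(status='completed')),
        in_progress=Count('id', filter=Q(status='in_progress')),
        viewed=Count('id', filter=Q(status='viewed')),
        abandoned=Count('id', filter=Q(status='abandoned')),
        avg_completion_time=Avg('time_spent_seconds'),
        avg_time_to_open=Avg(
            ExpressionWrapper(F('opened_at') - F('sent_at'),
                              output_field=DurationField())
        ),
    )
```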
Key Metrics Explained
Open Rate
Open Rate = (Opened / Sent) × 100
- Measures how many patients open the survey link
- Typical benchmark: 30-50%
- Low rate may indicate: email delivery issues, unclear subject lines, timing
Response Rate
Response Rate = (Completed / Total) × 100
- Measures overall completion rate
- Typical benchmark: 20-40%
- Low rate may indicate: survey too long, poor UX, inconvenient timing
Completion Rate
Completion Rate = (Completed / Opened) × 100
- Measures conversion from opened to completed
- Typical benchmark: 60-80%
- Low rate may indicate: confusing questions, technical issues, survey abandonment
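For reference, a few illustrative helpers matching the three formulas above (not the exact view code; function names are for illustration only):

```python
# Illustrative rate helpers; the view computes equivalent values server-side.
def open_rate(opened: int, sent: int) -> float:
    """Open Rate = (Opened / Sent) * 100."""
    return round(opened / sent * 100, 1) if sent else 0.0


def response_rate(completed: int, total: int) -> float:
    """Response Rate = (Completed / Total) * 100."""
    return round(completed / total * 100, 1) if total else 0.0


def completion_rate(completed: int, opened: int) -> float:
    """Completion Rate = (Completed / Opened) * 100."""
    return round(completed / opened * 100, 1) if opened else 0.0
```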
Engagement Funnel Analysis
Sent → Opened (Open Rate)
- < 30%: Review email delivery, subject lines, sending time
- 30-50%: Good performance
- > 50%: Excellent engagement
Opened → In Progress (Start Rate)
- < 50%: Landing page issues, unclear instructions
- 50-70%: Good performance
- > 70%: Excellent first impression
In Progress → Completed (Completion Rate)
- < 60%: Survey too long, complex questions, technical issues
- 60-80%: Good performance
- > 80%: Excellent survey design
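A hedged sketch of how the engagement_funnel context entry (see the API Reference below) could be assembled from the aggregated stats. Treating completed surveys as having passed the In Progress stage, and using total sent surveys as the percentage base, are assumptions.

```python
# Hypothetical assembly of the engagement_funnel structure shown in the
# API reference; stage composition and percentage base are assumptions.
def build_engagement_funnel(stats: dict) -> list:
    sent = stats.get('total', 0)
    stages = [
        ('Sent', sent),
        ('Opened', stats.get('opened', 0)),
        ('In Progress', stats.get('in_progress', 0) + stats.get('completed', 0)),
        ('Completed', stats.get('completed', 0)),
    ]
    return [
        {'stage': name,
         'count': count,
         'percentage': round(count / sent * 100, 1) if sent else 0.0}
        for name, count in stages
    ]
```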
Completion Time Analysis
< 1 min
- May indicate rushed responses
- Low-quality feedback
- Consider requiring minimum time or adding attention checks
1-5 min
- Optimal range for most surveys
- Balanced engagement
- High-quality responses
5-10 min
- Acceptable for detailed surveys
- Higher abandonment risk
- Ensure questions are clear and organized
10-20 min
- High abandonment risk
- Consider splitting into multiple surveys
- Add progress indicators
20+ min
- Very high abandonment risk
- Too long for single session
- Break into multiple parts
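One way the completion time distribution could be bucketed into the ranges above, assuming time_spent_seconds values have already been pulled from the queryset:

```python
# Sketch of bucketing completion times into the chart's ranges.
# Bucket boundaries mirror the categories listed above.
TIME_BUCKETS = [
    ('< 1 min', 0, 60),
    ('1-5 min', 60, 300),
    ('5-10 min', 300, 600),
    ('10-20 min', 600, 1200),
    ('20+ min', 1200, None),
]


def completion_time_distribution(seconds_values):
    total = len(seconds_values)
    rows = []
    for label, low, high in TIME_BUCKETS:
        count = sum(1 for s in seconds_values
                    if s >= low and (high is None or s < high))
        rows.append({
            'range': label,
            'count': count,
            'percentage': round(count / total * 100, 1) if total else 0.0,
        })
    return rows
```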
Device Type Implications
Mobile (>50%)
- Survey must be mobile-optimized
- Shorter, simpler questions
- Avoid complex layouts
- Touch-friendly interfaces
Tablet (10-20%)
- Good middle ground
- Can handle moderate complexity
- Still need responsive design
Desktop (<40%)
- Can support more complex surveys
- Better for longer surveys
- Consider device-specific layouts
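If the tracking data does not already store a device type, a rough user-agent heuristic along these lines could produce the Mobile/Tablet/Desktop split. This is a hypothetical helper, not part of the current implementation.

```python
# Rough, hypothetical user-agent heuristic; if tracking events already
# record a device_type field, that value should be used instead.
def classify_device(user_agent: str) -> str:
    ua = (user_agent or '').lower()
    if 'ipad' in ua or 'tablet' in ua:
        return 'tablet'
    if 'mobi' in ua or 'iphone' in ua or 'android' in ua:
        return 'mobile'
    return 'desktop'
```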
Usage Examples
Monitor Survey Performance
- Navigate to /surveys/instances/
- Review stat cards for overview
- Check engagement funnel for drop-off points
- Analyze completion time distribution
- Identify improvement opportunities
Identify Abandonment Issues
- Look at "Abandoned" stat card
- Check "Viewed" vs "In Progress" counts
- Review engagement funnel conversion rates
- Examine completion time distribution for outliers
- Optimize survey design based on findings
Optimize for Mobile Users
- Check device type distribution
- If mobile > 50%, ensure mobile optimization
- Review completion time by device type (future enhancement)
- Test survey on mobile devices
- Simplify questions and layouts
Track Campaign Effectiveness
- Use date filters to isolate specific campaigns
- Compare open rates across campaigns
- Analyze response rates by survey type
- Review score distribution changes
- Identify best practices
Technical Details
Performance Considerations
- All statistics are calculated server-side using Django ORM
- Queries are optimized with select_related and prefetch_related
- Pagination prevents loading too much data
- Charts render client-side using ApexCharts
- No API calls needed for basic analytics
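An illustrative queryset setup along those lines; the model import path and relation names (hospital, tracking_events) are assumptions rather than the exact view code.

```python
# Illustrative queryset optimization; import path and relation names assumed.
from apps.surveys.models import SurveyInstance

surveys = (
    SurveyInstance.objects
    .select_related('hospital')            # resolve FK joins in the same query
    .prefetch_related('tracking_events')   # batch-load the reverse relation
    .order_by('-created_at')
)
```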
Data Freshness
- Statistics are calculated in real-time on page load
- No caching currently implemented
- For large datasets (>10,000 surveys), consider caching
- Scheduled aggregation jobs could improve performance
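If caching is added later, a minimal sketch using Django's cache framework could look like this; the cache key, timeout, and helper names are illustrative only.

```python
# One possible caching approach if the page-load calculation becomes slow.
from django.core.cache import cache


def get_cached_stats(queryset, cache_key='survey_instance_stats', timeout=300):
    stats = cache.get(cache_key)
    if stats is None:
        stats = calculate_tracking_stats(queryset)  # see the sketch under Data Flow
        cache.set(cache_key, stats, timeout)
    return stats
```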
Browser Compatibility
- ApexCharts supports all modern browsers
- Requires JavaScript enabled
- Responsive design works on all devices
- Charts adapt to screen size
Future Enhancements
Phase 2: Patient-Level Details
- Timeline view for individual patient surveys
- Detailed tracking events table
- Device/browser info per visit
- Time spent per question
- Completion metrics breakdown
Phase 3: Comprehensive Dashboard
- Dedicated analytics page at /surveys/analytics/
- Hourly activity heatmap
- Top 10 fastest/slowest completions
- Patient timeline view
- Export to CSV/Excel
- Print-friendly layout
- Advanced filtering and drill-down
Advanced Analytics
- Time of day analysis
- Day of week patterns
- Seasonal trends
- A/B testing comparison
- Predictive modeling for completion likelihood
- NPS trends over time
- Correlation with patient satisfaction scores
Troubleshooting
Charts Not Rendering
- Check browser console for JavaScript errors
- Verify ApexCharts library is loaded
- Ensure data is being passed to template
- Check for JavaScript syntax errors
Incorrect Statistics
- Verify survey status transitions are working
- Check that tracking background jobs are running
- Ensure abandonment detection is active
- Review database for tracking data integrity
Performance Issues
- Reduce date range for statistics
- Add database indexes on tracking fields
- Implement caching for statistics
- Use aggregation tables for large datasets
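A sketch of what such indexes could look like on the SurveyInstance model; field names are assumed, the real model has many more fields than shown, and the change would be applied through a normal migration.

```python
# Sketch of index declarations on tracking fields (field names assumed).
from django.db import models


class SurveyInstance(models.Model):
    status = models.CharField(max_length=32)
    opened_at = models.DateTimeField(null=True, blank=True)
    created_at = models.DateTimeField(auto_now_add=True)

    class Meta:
        indexes = [
            models.Index(fields=['status', 'created_at']),
            models.Index(fields=['opened_at']),
        ]
```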
Related Documentation
- docs/SURVEY_TRACKING_GUIDE.md - Tracking system overview
- docs/SURVEY_TRACKING_IMPLEMENTATION.md - Backend implementation
- docs/SURVEY_MULTIPLE_ACCESS_FIX.md - Multiple access fix
- docs/SURVEY_TRACKING_FINAL_SUMMARY.md - Complete tracking system
API Reference
View: survey_instance_list
URL: /surveys/instances/
Method: GET
Query Parameters:
- status - Filter by status (sent, completed, pending, etc.)
- survey_type - Filter by survey type
- hospital - Filter by hospital ID
- is_negative - Filter negative surveys only (true/false)
- date_from - Start date filter
- date_to - End date filter
- search - Search by MRN, name, or encounter
- page - Page number
- page_size - Results per page (default: 25)
- order_by - Sort order (default: -created_at)
Context Variables:
{
'page_obj': Paginator object,
'surveys': List of SurveyInstance objects,
'stats': {
'total': int,
'sent': int,
'completed': int,
'negative': int,
'response_rate': float,
'opened': int, # NEW
'open_rate': float, # NEW
'in_progress': int, # NEW
'abandoned': int, # NEW
'viewed': int, # NEW
'avg_completion_time': int, # NEW
'avg_time_to_open': int, # NEW
},
'engagement_funnel': [ # NEW
{'stage': 'Sent', 'count': int, 'percentage': float},
{'stage': 'Opened', 'count': int, 'percentage': float},
{'stage': 'In Progress', 'count': int, 'percentage': float},
{'stage': 'Completed', 'count': int, 'percentage': float},
],
'completion_time_distribution': [ # NEW
{'range': str, 'count': int, 'percentage': float},
# ...
],
'device_distribution': [ # NEW
{'type': str, 'name': str, 'count': int, 'percentage': float},
# ...
],
# ... existing data
}
Conclusion
Phase 1 implementation provides immediate visibility into survey engagement metrics without requiring significant changes to the existing infrastructure. The new stat cards and charts enable administrators to:
- Track patient engagement throughout the survey lifecycle
- Identify abandonment patterns and improvement opportunities
- Optimize survey design based on device type and completion time
- Monitor campaign effectiveness with real-time metrics
- Make data-driven decisions to improve patient experience
The implementation is performant, maintainable, and provides a solid foundation for future enhancements in Phases 2 and 3.