HH/docs/SURVEY_ANALYTICS_FRONTEND.md
2026-01-24 15:27:30 +03:00


Survey Analytics Frontend Implementation

Overview

This document describes the Phase 1 implementation of survey tracking analytics in the frontend, providing administrators with comprehensive visibility into patient survey engagement metrics.

Implementation Summary

Phase 1: Quick Wins - Enhanced existing survey instance list page with tracking analytics

What Was Added

1. New Stat Cards (8 total)

Primary Statistics Row

  • Total Surveys - Overall count of surveys
  • Opened - Number of surveys that were opened at least once (NEW)
  • Completed - Number of completed surveys with response rate
  • Negative - Number of negative surveys requiring attention

Secondary Statistics Row (NEW)

  • In Progress - Surveys started but not completed
  • Viewed - Surveys opened but not started
  • Abandoned - Surveys left incomplete for >24 hours
  • Avg Completion Time - Average time in seconds to complete surveys

2. New Charts (6 total)

Primary Charts Row (NEW)

  1. Engagement Funnel Chart

    • Visualizes: Sent → Opened → In Progress → Completed
    • Shows conversion rates at each stage
    • Identifies where patients drop off
    • Horizontal bar chart with percentages
  2. Completion Time Distribution

    • Categories: < 1 min, 1-5 min, 5-10 min, 10-20 min, 20+ min
    • Shows how long patients take to complete surveys
    • Helps identify optimal survey length
    • Vertical bar chart
  3. Device Type Distribution

    • Breakdown: Mobile, Tablet, Desktop
    • Shows what devices patients use
    • Helps optimize survey design for devices
    • Donut chart with percentages

Secondary Charts Row (Existing)

  1. Score Distribution - Distribution of survey scores (1-2, 2-3, 3-4, 4-5)
  2. Survey Types - Breakdown by survey type (Journey Stage, Complaint Resolution, General, NPS)
  3. 30-Day Trend - Line chart showing sent vs completed over time

Files Modified

Backend

  • apps/surveys/ui_views.py
    • Added tracking statistics calculation
    • Added engagement funnel data
    • Added completion time distribution
    • Added device type distribution
    • Extended survey_instance_list view context

Frontend

  • templates/surveys/instance_list.html
    • Added 4 new stat cards
    • Added 3 new chart containers
    • Added ApexCharts configurations for new charts
    • Maintained existing charts and functionality

Data Flow

SurveyInstance Model (tracking fields)
    ↓
survey_instance_list view
    ↓
Calculate statistics:
  - open_count > 0 → opened_count
  - status='in_progress' → in_progress_count
  - status='abandoned' → abandoned_count
  - status='viewed' → viewed_count
  - Avg(time_spent_seconds) → avg_completion_time
  - Avg(opened_at - sent_at) → avg_time_to_open
    ↓
Generate visualization data:
  - Engagement funnel
  - Completion time distribution
  - Device type distribution
    ↓
Template context
    ↓
ApexCharts render
    ↓
Visual analytics dashboard
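The statistics step above can be sketched as a plain-Python aggregation. In the real view these are Django ORM aggregates over SurveyInstance; the dict keys below mirror the tracking fields named in the flow (`open_count`, `status`, `time_spent_seconds`) and are illustrative, not a copy of the actual code:

```python
from statistics import mean

def compute_tracking_stats(surveys):
    """Aggregate tracking statistics from survey records.

    `surveys` is a list of dicts with keys mirroring the tracking
    fields in the data flow above. The real view performs these
    counts with Django ORM filters and Avg() aggregates instead.
    """
    completed_times = [
        s["time_spent_seconds"]
        for s in surveys
        if s["status"] == "completed" and s.get("time_spent_seconds")
    ]
    return {
        "total": len(surveys),
        "opened": sum(1 for s in surveys if s["open_count"] > 0),
        "in_progress": sum(1 for s in surveys if s["status"] == "in_progress"),
        "viewed": sum(1 for s in surveys if s["status"] == "viewed"),
        "abandoned": sum(1 for s in surveys if s["status"] == "abandoned"),
        "avg_completion_time": int(mean(completed_times)) if completed_times else 0,
    }
```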

Key Metrics Explained

Open Rate

Open Rate = (Opened / Sent) × 100
  • Measures how many patients open the survey link
  • Typical benchmark: 30-50%
  • Low rate may indicate: email delivery issues, unclear subject lines, timing

Response Rate

Response Rate = (Completed / Total) × 100
  • Measures overall completion rate
  • Typical benchmark: 20-40%
  • Low rate may indicate: survey too long, poor UX, inconvenient timing

Completion Rate

Completion Rate = (Completed / Opened) × 100
  • Measures conversion from opened to completed
  • Typical benchmark: 60-80%
  • Low rate may indicate: confusing questions, technical issues, survey abandonment
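The three rates can be computed with a shared zero-division guard. This is a sketch of the formulas above; the actual view's rounding and return types may differ:

```python
def safe_rate(numerator, denominator):
    """Percentage with a zero-denominator guard."""
    return round(100.0 * numerator / denominator, 1) if denominator else 0.0

def survey_rates(sent, opened, completed, total):
    """Apply the three metric formulas defined above."""
    return {
        "open_rate": safe_rate(opened, sent),             # Opened / Sent
        "response_rate": safe_rate(completed, total),     # Completed / Total
        "completion_rate": safe_rate(completed, opened),  # Completed / Opened
    }
```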

Engagement Funnel Analysis

Sent → Opened (Open Rate)

  • < 30%: Review email delivery, subject lines, sending time
  • 30-50%: Good performance
  • > 50%: Excellent engagement

Opened → In Progress (Start Rate)

  • < 50%: Landing page issues, unclear instructions
  • 50-70%: Good performance
  • > 70%: Excellent first impression

In Progress → Completed (Completion Rate)

  • < 60%: Survey too long, complex questions, technical issues
  • 60-80%: Good performance
  • > 80%: Excellent survey design
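Building the funnel data for the chart can be sketched as follows. This sketch computes each stage's percentage relative to Sent, matching the stage labels used by the chart; the actual view may instead report stage-to-stage conversion:

```python
def engagement_funnel(sent, opened, in_progress, completed):
    """Build funnel stages with percentages relative to Sent."""
    stages = [
        ("Sent", sent),
        ("Opened", opened),
        ("In Progress", in_progress),
        ("Completed", completed),
    ]
    return [
        {
            "stage": name,
            "count": count,
            "percentage": round(100.0 * count / sent, 1) if sent else 0.0,
        }
        for name, count in stages
    ]
```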

Completion Time Analysis

< 1 min

  • May indicate rushed responses
  • Low-quality feedback
  • Consider requiring minimum time or adding attention checks

1-5 min

  • Optimal range for most surveys
  • Balanced engagement
  • High-quality responses

5-10 min

  • Acceptable for detailed surveys
  • Higher abandonment risk
  • Ensure questions are clear and organized

10-20 min

  • High abandonment risk
  • Consider splitting into multiple surveys
  • Add progress indicators

20+ min

  • Very high abandonment risk
  • Too long for single session
  • Break into multiple parts
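Bucketing completion times into these five ranges can be sketched as a plain function over the recorded `time_spent_seconds` values (the real view would derive the same counts with ORM filters):

```python
def completion_time_distribution(times_seconds):
    """Bucket completion times into the ranges used by the chart."""
    buckets = [
        ("< 1 min", 0, 60),
        ("1-5 min", 60, 300),
        ("5-10 min", 300, 600),
        ("10-20 min", 600, 1200),
        ("20+ min", 1200, float("inf")),
    ]
    total = len(times_seconds)
    return [
        {
            "range": label,
            "count": (count := sum(1 for t in times_seconds if lo <= t < hi)),
            "percentage": round(100.0 * count / total, 1) if total else 0.0,
        }
        for label, lo, hi in buckets
    ]
```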

Device Type Implications

Mobile (>50%)

  • Survey must be mobile-optimized
  • Shorter, simpler questions
  • Avoid complex layouts
  • Touch-friendly interfaces

Tablet (10-20%)

  • Good middle ground
  • Can handle moderate complexity
  • Still need responsive design

Desktop (<40%)

  • Can support more complex surveys
  • Better for longer surveys
  • Consider device-specific layouts
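Device classification presumably happens when a tracking visit is recorded. A rough, purely illustrative heuristic over the User-Agent string is shown below; a dedicated parser library would be more robust in practice, and the actual implementation may differ:

```python
def classify_device(user_agent):
    """Rough device bucketing from a User-Agent string (heuristic only).

    Tablets are checked first because many tablet UAs also contain
    tokens that would otherwise match the mobile branch.
    """
    ua = user_agent.lower()
    if "ipad" in ua or "tablet" in ua:
        return "tablet"
    if "mobi" in ua or "android" in ua or "iphone" in ua:
        return "mobile"
    return "desktop"
```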

Usage Examples

Monitor Survey Performance

  1. Navigate to /surveys/instances/
  2. Review stat cards for overview
  3. Check engagement funnel for drop-off points
  4. Analyze completion time distribution
  5. Identify improvement opportunities

Identify Abandonment Issues

  1. Look at "Abandoned" stat card
  2. Check "Viewed" vs "In Progress" counts
  3. Review engagement funnel conversion rates
  4. Examine completion time distribution for outliers
  5. Optimize survey design based on findings

Optimize for Mobile Users

  1. Check device type distribution
  2. If mobile > 50%, ensure mobile optimization
  3. Review completion time by device type (future enhancement)
  4. Test survey on mobile devices
  5. Simplify questions and layouts

Track Campaign Effectiveness

  1. Use date filters to isolate specific campaigns
  2. Compare open rates across campaigns
  3. Analyze response rates by survey type
  4. Review score distribution changes
  5. Identify best practices

Technical Details

Performance Considerations

  • All statistics are calculated server-side using Django ORM
  • Queries are optimized with select_related and prefetch_related
  • Pagination prevents loading too much data
  • Charts render client-side using ApexCharts
  • No API calls needed for basic analytics

Data Freshness

  • Statistics are calculated in real-time on page load
  • No caching currently implemented
  • For large datasets (>10,000 surveys), consider caching
  • Scheduled aggregation jobs could improve performance
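If caching is added later, the stats computation could be wrapped in a TTL cache. In Django this would typically be `cache.get_or_set(key, compute, timeout=ttl)` from `django.core.cache`; the sketch below shows the same idea as a plain in-process memo:

```python
import time

_cache = {}

def cached(key, ttl_seconds, compute):
    """Return a cached value, recomputing only after `ttl_seconds`.

    In the Django view this role would be played by
    cache.get_or_set(key, compute, timeout=ttl_seconds).
    """
    now = time.monotonic()
    hit = _cache.get(key)
    if hit and now - hit[0] < ttl_seconds:
        return hit[1]
    value = compute()
    _cache[key] = (now, value)
    return value
```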

Browser Compatibility

  • ApexCharts supports all modern browsers
  • Requires JavaScript enabled
  • Responsive design works on all devices
  • Charts adapt to screen size

Future Enhancements

Phase 2: Patient-Level Details

  • Timeline view for individual patient surveys
  • Detailed tracking events table
  • Device/browser info per visit
  • Time spent per question
  • Completion metrics breakdown

Phase 3: Comprehensive Dashboard

  • Dedicated analytics page at /surveys/analytics/
  • Hourly activity heatmap
  • Top 10 fastest/slowest completions
  • Patient timeline view
  • Export to CSV/Excel
  • Print-friendly layout
  • Advanced filtering and drill-down

Advanced Analytics

  • Time of day analysis
  • Day of week patterns
  • Seasonal trends
  • A/B testing comparison
  • Predictive modeling for completion likelihood
  • NPS trends over time
  • Correlation with patient satisfaction scores

Troubleshooting

Charts Not Rendering

  1. Check browser console for JavaScript errors
  2. Verify ApexCharts library is loaded
  3. Ensure data is being passed to template
  4. Check for JavaScript syntax errors

Incorrect Statistics

  1. Verify survey status transitions are working
  2. Check that tracking background jobs are running
  3. Ensure abandonment detection is active
  4. Review database for tracking data integrity
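Abandonment detection (step 3 above) can be sketched as a periodic job that flags surveys opened but not completed within 24 hours. The field and status names below mirror those used elsewhere in this document; the real job would be a Django management command or scheduled task updating the model:

```python
from datetime import datetime, timedelta, timezone

def find_abandoned(surveys, now=None, cutoff_hours=24):
    """Return surveys that should be flagged as abandoned.

    A survey qualifies when it was opened (status 'viewed' or
    'in_progress') but not completed within `cutoff_hours`.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(hours=cutoff_hours)
    return [
        s for s in surveys
        if s["status"] in ("viewed", "in_progress") and s["opened_at"] < cutoff
    ]
```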

Performance Issues

  1. Reduce date range for statistics
  2. Add database indexes on tracking fields
  3. Implement caching for statistics
  4. Use aggregation tables for large datasets

Related Documentation

  • docs/SURVEY_TRACKING_GUIDE.md - Tracking system overview
  • docs/SURVEY_TRACKING_IMPLEMENTATION.md - Backend implementation
  • docs/SURVEY_MULTIPLE_ACCESS_FIX.md - Multiple access fix
  • docs/SURVEY_TRACKING_FINAL_SUMMARY.md - Complete tracking system

API Reference

View: survey_instance_list

URL: /surveys/instances/
Method: GET
Query Parameters:

  • status - Filter by status (sent, completed, pending, etc.)
  • survey_type - Filter by survey type
  • hospital - Filter by hospital ID
  • is_negative - Filter negative surveys only (true/false)
  • date_from - Start date filter
  • date_to - End date filter
  • search - Search by MRN, name, or encounter
  • page - Page number
  • page_size - Results per page (default: 25)
  • order_by - Sort order (default: -created_at)
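For reference, a filtered URL can be assembled from these parameters with standard query-string encoding:

```python
from urllib.parse import urlencode

def instance_list_url(**filters):
    """Build a filtered URL for the survey instance list view."""
    base = "/surveys/instances/"
    return f"{base}?{urlencode(filters)}" if filters else base
```

For example, `instance_list_url(status="completed", page_size=50)` yields a URL filtered to completed surveys with 50 results per page.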

Context Variables:

{
    'page_obj': Page object (from Django's paginator),
    'surveys': List of SurveyInstance objects,
    'stats': {
        'total': int,
        'sent': int,
        'completed': int,
        'negative': int,
        'response_rate': float,
        'opened': int,          # NEW
        'open_rate': float,       # NEW
        'in_progress': int,       # NEW
        'abandoned': int,         # NEW
        'viewed': int,            # NEW
        'avg_completion_time': int,# NEW
        'avg_time_to_open': int,   # NEW
    },
    'engagement_funnel': [       # NEW
        {'stage': 'Sent', 'count': int, 'percentage': float},
        {'stage': 'Opened', 'count': int, 'percentage': float},
        {'stage': 'In Progress', 'count': int, 'percentage': float},
        {'stage': 'Completed', 'count': int, 'percentage': float},
    ],
    'completion_time_distribution': [  # NEW
        {'range': str, 'count': int, 'percentage': float},
        # ...
    ],
    'device_distribution': [        # NEW
        {'type': str, 'name': str, 'count': int, 'percentage': float},
        # ...
    ],
    # ... existing data
}

Conclusion

Phase 1 implementation provides immediate visibility into survey engagement metrics without requiring significant changes to the existing infrastructure. The new stat cards and charts enable administrators to:

  • Track patient engagement throughout the survey lifecycle
  • Identify abandonment patterns and improvement opportunities
  • Optimize survey design based on device type and completion time
  • Monitor campaign effectiveness with real-time metrics
  • Make data-driven decisions to improve patient experience

The implementation is performant, maintainable, and provides a solid foundation for future enhancements in Phases 2 and 3.