# ATS Load Testing Framework

This directory contains a comprehensive load testing framework for the ATS (Applicant Tracking System) application, built on Locust. The framework provides realistic user simulation, performance monitoring, and detailed reporting.

## 📋 Table of Contents

- [Overview](#overview)
- [Installation](#installation)
- [Quick Start](#quick-start)
- [Test Scenarios](#test-scenarios)
- [Configuration](#configuration)
- [Test Data Generation](#test-data-generation)
- [Performance Monitoring](#performance-monitoring)
- [Reports](#reports)
- [Distributed Testing](#distributed-testing)
- [Best Practices](#best-practices)
- [Troubleshooting](#troubleshooting)

## 🎯 Overview

The ATS load testing framework includes:

- **Multiple User Types**: Public users, authenticated users, API clients, file uploaders
- **Realistic Scenarios**: Job browsing, application submission, dashboard access, API calls
- **Performance Monitoring**: System metrics, database performance, response times
- **Comprehensive Reporting**: HTML reports, JSON data, performance charts
- **Test Data Generation**: Automated creation of realistic test data
- **Distributed Testing**: Master-worker setup for large-scale tests

## 🚀 Installation

### Prerequisites

```bash
# Python 3.8+ required
python --version

# Install required packages
pip install locust faker psutil matplotlib pandas requests

# Optional: for enhanced reporting
pip install jupyter notebook seaborn
```

### Setup

1. Clone the repository and navigate to the project root.
2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   pip install locust faker psutil matplotlib pandas
   ```

3. Set up environment variables:

   ```bash
   export ATS_HOST="http://localhost:8000"
   export TEST_USERNAME="your_test_user"
   export TEST_PASSWORD="your_test_password"
   ```

## ⚡ Quick Start

### 1. List Available Scenarios

```bash
python load_tests/run_load_tests.py list
```

### 2. Run a Smoke Test

```bash
# Interactive mode with web UI
python load_tests/run_load_tests.py run smoke_test

# Headless mode (no web UI)
python load_tests/run_load_tests.py headless smoke_test
```

### 3. Generate Test Data

```bash
python load_tests/run_load_tests.py generate-data --jobs 100 --users 50 --applications 500
```

### 4. View Results

After running tests, check the `load_tests/results/` directory for:

- HTML reports
- CSV statistics
- Performance charts
- JSON data

## 📊 Test Scenarios

### Available Scenarios

| Scenario | Users | Duration | Description |
|----------|-------|----------|-------------|
| `smoke_test` | 5 | 2m | Quick sanity check |
| `light_load` | 20 | 5m | Normal daytime traffic |
| `moderate_load` | 50 | 10m | Peak traffic periods |
| `heavy_load` | 100 | 15m | Stress testing |
| `api_focus` | 30 | 10m | API endpoint testing |
| `file_upload_test` | 15 | 8m | File upload performance |
| `authenticated_test` | 25 | 8m | Authenticated user workflows |
| `endurance_test` | 30 | 1h | Long-running stability |

### User Types

1. **PublicUser**: Anonymous users browsing jobs and careers pages
2. **AuthenticatedUser**: Logged-in users with full access
3. **APIUser**: REST API clients
4. **FileUploadUser**: Users uploading resumes and documents

### Common Workflows

- Job listing browsing
- Job detail viewing
- Application form access
- Application submission
- Dashboard navigation
- Message viewing
- File uploads
- API endpoint calls

## ⚙️ Configuration

### Environment Variables

```bash
# Target application host
export ATS_HOST="http://localhost:8000"

# Test user credentials (for authenticated tests)
export TEST_USERNAME="testuser"
export TEST_PASSWORD="testpass123"

# Database connection (for monitoring)
export DATABASE_URL="postgresql://user:pass@localhost/kaauh_ats"
```

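Inside the framework, these variables are presumably read with `os.environ`; a minimal stand-in, using the same defaults as the examples above:

```python
# Sketch of reading the test configuration from the environment, with the
# same fallback defaults shown in the examples above.
import os

ATS_HOST = os.environ.get("ATS_HOST", "http://localhost:8000")
TEST_USERNAME = os.environ.get("TEST_USERNAME", "testuser")
TEST_PASSWORD = os.environ.get("TEST_PASSWORD", "testpass123")
DATABASE_URL = os.environ.get("DATABASE_URL", "")  # empty disables DB monitoring
```
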
### Custom Scenarios

Create custom scenarios by adding entries to `load_tests/config.py`:

```python
"custom_scenario": TestScenario(
    name="Custom Load Test",
    description="Your custom test description",
    users=75,
    spawn_rate=15,
    run_time="20m",
    host="http://your-host.com",
    user_classes=["PublicUser", "AuthenticatedUser"],
    tags=["custom", "specific"]
)
```

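The `TestScenario` container used above is assumed to look roughly like the following dataclass. The field names are taken from the example; the defaults are illustrative, and the authoritative definition is in `load_tests/config.py`.

```python
# Assumed shape of the TestScenario container referenced in config.py.
# Field names come from the example above; defaults are illustrative.
from dataclasses import dataclass, field
from typing import List


@dataclass
class TestScenario:
    name: str
    description: str
    users: int            # peak number of simulated users
    spawn_rate: int       # users started per second during ramp-up
    run_time: str         # Locust-style duration string, e.g. "20m"
    host: str
    user_classes: List[str] = field(default_factory=list)
    tags: List[str] = field(default_factory=list)
```
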
### Performance Thresholds

Adjust performance thresholds in `load_tests/config.py`:

```python
PERFORMANCE_THRESHOLDS = {
    "response_time_p95": 2000,  # 95th percentile under 2s (ms)
    "response_time_avg": 1000,  # average under 1s (ms)
    "error_rate": 0.05,         # error rate under 5%
    "rps_minimum": 10,          # at least 10 requests per second
}
```

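A sketch of how an aggregated stats summary can be checked against these thresholds. The shape of the `stats` dict here is an assumption for illustration, not the framework's actual report format:

```python
# Hypothetical sketch: compare an aggregated stats summary against
# PERFORMANCE_THRESHOLDS and collect human-readable violations.
PERFORMANCE_THRESHOLDS = {
    "response_time_p95": 2000,  # ms
    "response_time_avg": 1000,  # ms
    "error_rate": 0.05,
    "rps_minimum": 10,
}


def check_thresholds(stats):
    """Return a list of violation messages (an empty list means the run passed)."""
    violations = []
    if stats["response_time_p95"] > PERFORMANCE_THRESHOLDS["response_time_p95"]:
        violations.append(f"p95 response time {stats['response_time_p95']}ms exceeds limit")
    if stats["response_time_avg"] > PERFORMANCE_THRESHOLDS["response_time_avg"]:
        violations.append(f"avg response time {stats['response_time_avg']}ms exceeds limit")
    if stats["error_rate"] > PERFORMANCE_THRESHOLDS["error_rate"]:
        violations.append(f"error rate {stats['error_rate']:.1%} exceeds limit")
    if stats["rps"] < PERFORMANCE_THRESHOLDS["rps_minimum"]:
        violations.append(f"throughput {stats['rps']} RPS below minimum")
    return violations
```

A non-empty return value can be used to fail a CI job or flag a regression in a report.
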
## 📝 Test Data Generation

### Generate Realistic Data

```bash
# Default configuration
python load_tests/run_load_tests.py generate-data

# Custom configuration
python load_tests/run_load_tests.py generate-data \
    --jobs 200 \
    --users 100 \
    --applications 1000
```

### Generated Data Types

- **Jobs**: Realistic job postings with descriptions, qualifications, and benefits
- **Users**: User profiles with contact information and social links
- **Applications**: Complete application records with cover letters
- **Interviews**: Scheduled interviews with various types and statuses
- **Messages**: User communications and notifications

### Test Files

Test files for upload testing are generated automatically:

- Text files with realistic content
- Various sizes (configurable)
- Stored in `load_tests/test_files/`

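Generating a fixed-size text fixture can be sketched like this (the path handling and naming are assumptions; the framework's own generator may differ):

```python
# Illustrative sketch: write a text file padded to an exact size in bytes,
# for use as an upload-test fixture. Naming and location are assumptions.
from pathlib import Path


def make_test_file(directory, name, size_bytes):
    """Create a text file of exactly size_bytes and return its path."""
    path = Path(directory) / name
    line = "Sample resume content for load testing.\n"
    data = (line * (size_bytes // len(line) + 1))[:size_bytes]
    path.write_text(data)
    return path
```
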
## 📈 Performance Monitoring

### System Metrics

- **CPU Usage**: Percentage utilization
- **Memory Usage**: RAM consumption and usage percentage
- **Disk I/O**: Read/write operations
- **Network I/O**: Bytes sent/received, packet counts
- **Active Connections**: Number of network connections

### Database Metrics

- **Active Connections**: Current database connections
- **Query Count**: Total queries executed
- **Average Query Time**: Mean query execution time
- **Slow Queries**: Count of slow-running queries
- **Cache Hit Ratio**: Database cache effectiveness

### Real-time Monitoring

During tests, the framework monitors:

- Response times (average, median, 95th, and 99th percentiles)
- Request rates (current and peak)
- Error rates and types
- System resource utilization

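System metrics like those above can be sampled with psutil, one of the installed dependencies. A minimal snapshot function — the dict keys are chosen for illustration; the framework's own collector lives in `load_tests/monitoring.py`:

```python
# Minimal sketch: sample system metrics with psutil. Key names are
# illustrative; see load_tests/monitoring.py for the real collector.
import psutil


def sample_system_metrics():
    mem = psutil.virtual_memory()
    disk = psutil.disk_io_counters()  # may be None in some containers
    net = psutil.net_io_counters()
    return {
        "cpu_percent": psutil.cpu_percent(interval=0.1),
        "memory_percent": mem.percent,
        "disk_read_bytes": disk.read_bytes if disk else 0,
        "disk_write_bytes": disk.write_bytes if disk else 0,
        "net_bytes_sent": net.bytes_sent,
        "net_bytes_recv": net.bytes_recv,
    }
```

Calling this on a timer (e.g. once per second) during a run produces the time series behind the system-performance charts.
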
## 📋 Reports

### HTML Reports

Comprehensive web-based reports including:

- Executive summary
- Performance metrics
- Response time distributions
- Error analysis
- System performance graphs
- Recommendations

### JSON Reports

Machine-readable reports for:

- CI/CD integration
- Automated analysis
- Historical comparison
- Custom processing

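For CI/CD integration, a small gate script can load the JSON report and fail the build on a regression. The sketch below assumes the report exposes a `summary` object with p95 and error-rate fields; the actual report schema may differ:

```python
# Hypothetical CI gate: read a JSON performance report and return a non-zero
# exit code when limits are breached. The "summary" schema is an assumption.
import json
import sys


def gate(report_path, max_p95_ms=2000, max_error_rate=0.05):
    with open(report_path) as f:
        summary = json.load(f)["summary"]
    failed = (
        summary["response_time_p95"] > max_p95_ms
        or summary["error_rate"] > max_error_rate
    )
    return 1 if failed else 0


if __name__ == "__main__":
    sys.exit(gate(sys.argv[1]))
```
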
### Performance Charts

Visual representations of:

- Response time trends
- System resource usage
- Request rate variations
- Error rate patterns

### Report Locations

```
load_tests/
├── reports/
│   ├── performance_report_20231207_143022.html
│   ├── performance_report_20231207_143022.json
│   └── system_metrics_20231207_143022.png
└── results/
    ├── report_Smoke Test_20231207_143022.html
    ├── stats_Smoke Test_20231207_143022_stats.csv
    └── stats_Smoke Test_20231207_143022_failures.csv
```

## 🌐 Distributed Testing

### Master-Worker Setup

For large-scale tests, use distributed testing:

#### Start Master Node

```bash
python load_tests/run_load_tests.py master moderate_load --workers 4
```

#### Start Worker Nodes

```bash
# On each worker machine
python load_tests/run_load_tests.py worker
```

### Configuration

- **Master**: Coordinates test execution and aggregates results
- **Workers**: Execute the user simulations and report back to the master
- **Network**: Ensure all nodes can reach the master on port 5557 (Locust's default master port)

### Best Practices

1. **Network**: Use a low-latency network between nodes
2. **Resources**: Ensure each worker has sufficient CPU and memory
3. **Synchronization**: Start workers before the master
4. **Monitoring**: Monitor each node individually

## 🎯 Best Practices

### Test Planning

1. **Start Small**: Begin with smoke tests
2. **Gradual Increase**: Progressively increase load
3. **Realistic Scenarios**: Simulate actual user behavior
4. **Baseline Testing**: Establish performance baselines
5. **Regular Testing**: Schedule periodic load tests

### Test Execution

1. **Warm-up**: Allow the system to stabilize before measuring
2. **Duration**: Run tests long enough to reach a steady state
3. **Monitoring**: Watch system resources during tests
4. **Documentation**: Record test conditions and results
5. **Validation**: Verify application functionality post-test

### Performance Optimization

1. **Bottlenecks**: Identify and address performance bottlenecks
2. **Caching**: Implement effective caching strategies
3. **Database**: Optimize queries and indexing
4. **CDN**: Use content delivery networks for static assets
5. **Load Balancing**: Distribute traffic effectively

### CI/CD Integration

```yaml
# Example GitHub Actions step
- name: Run Load Tests
  run: |
    python load_tests/run_load_tests.py headless smoke_test
# Upload results as artifacts in a subsequent step
```

## 🔧 Troubleshooting

### Common Issues

#### 1. Connection Refused

```
Error: Connection refused
```

**Solution**: Ensure the ATS application is running and accessible:

```bash
# Check whether the application is responding
curl http://localhost:8000/

# Start the application
python manage.py runserver
```

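The same check can be scripted as a pre-test guard, so a run aborts early instead of flooding the results with connection errors. A stdlib-only sketch:

```python
# Stdlib-only sketch: verify the target host is reachable before starting a
# load test, instead of letting Locust fail with connection errors mid-run.
import urllib.error
import urllib.request


def host_is_up(url, timeout=5):
    """Return True if the host answers any HTTP response, False otherwise."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # the server answered, even if with an error status
    except (urllib.error.URLError, OSError):
        return False  # connection refused, DNS failure, timeout, etc.
```
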
#### 2. Import Errors

```
ModuleNotFoundError: No module named 'locust'
```

**Solution**: Install the missing dependencies:

```bash
pip install locust faker psutil matplotlib pandas
```

#### 3. High Memory Usage

**Symptoms**: The system becomes slow during tests

**Solutions**:

- Reduce the number of concurrent users
- Increase system RAM
- Optimize test data generation
- Use distributed testing

#### 4. Database Connection Issues

```
OperationalError: too many connections
```

**Solutions**:

- Increase the database connection limit
- Use connection pooling
- Reduce concurrent database users
- Add database read replicas

### Debug Mode

Enable debug logging:

```bash
export LOCUST_DEBUG=1
python load_tests/run_load_tests.py run smoke_test
```

### Performance Issues

#### Slow Response Times

1. **Check System Resources**: Monitor CPU, memory, and disk I/O
2. **Database Performance**: Analyze slow queries
3. **Network Latency**: Check network connectivity
4. **Application Code**: Profile application performance

#### High Error Rates

1. **Application Logs**: Check for errors in the application logs
2. **Database Constraints**: Verify database integrity
3. **Resource Limits**: Check system resource limits
4. **Load Balancer**: Verify the load balancer configuration

### Getting Help

1. **Check Logs**: Review Locust and application logs
2. **Reduce Load**: Start with smaller user counts
3. **Isolate Issues**: Test individual components
4. **Monitor System**: Use system monitoring tools

## 📚 Additional Resources

- [Locust Documentation](https://docs.locust.io/)
- [Performance Testing Best Practices](https://docs.locust.io/en/stable/testing.html)
- [Django Performance Tips](https://docs.djangoproject.com/en/stable/topics/performance/)
- [PostgreSQL Performance](https://www.postgresql.org/docs/current/performance-tips.html)

## 🤝 Contributing

To contribute to the load testing framework:

1. **Add Scenarios**: Create new test scenarios in `config.py`
2. **Enhance Users**: Improve user behavior in `locustfile.py`
3. **Better Monitoring**: Add new metrics to `monitoring.py`
4. **Improve Reports**: Enhance report generation
5. **Documentation**: Update this README

## 📄 License

This load testing framework is part of the ATS project and follows the same license terms.

---

**Happy Testing! 🚀**

For questions or issues, please contact the development team or create an issue in the project repository.