Project: Christmas Data Advent 2025 - PROJECT 3E
Author: Ricardo (SaaS Analytics)
Industry: SaaS / Technology
Orchestration Pattern: Event-driven webhook processing with Redis buffer
This project demonstrates a real-time event processing pipeline for SaaS analytics. It receives webhook events from a SaaS application (user signups, subscriptions, usage tracking), queues them in Redis for reliability, and processes them in micro-batches to update analytics dashboards.
Ricardo works as a SaaS analytics specialist and needs to process user events in near-real-time to power analytics dashboards. The system must acknowledge events without blocking the sender, deduplicate retried deliveries, and route failed events to a dead letter queue.
[SaaS App] → [Webhook Endpoint] → [Redis Queue] → [Batch Consumer] → [Analytics Dashboard]
                   │                   │                 │
             Non-blocking         Buffer Layer     Idempotency Check
             202 Response         (Reliability)    (Deduplication)
                                       │
                                       ▼
                              [Dead Letter Queue]
                                 (Failed Events)
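The non-blocking acknowledgement step in the diagram can be sketched as a small pure function: validate, enqueue, and return 202 immediately, leaving all real work to the consumer. This is a sketch with hypothetical names (`handle_webhook`, a `deque` standing in for the Redis list), not the project's actual Flask handler.

```python
import json
from collections import deque

REQUIRED_FIELDS = {"event_id", "event_type", "timestamp"}

def handle_webhook(raw_body: str, queue: deque):
    """Validate the payload and enqueue it, acknowledging with 202
    before any processing happens (the batch consumer does the work)."""
    try:
        event = json.loads(raw_body)
    except json.JSONDecodeError:
        return 400, {"error": "invalid JSON"}
    if not isinstance(event, dict):
        return 400, {"error": "payload must be a JSON object"}
    missing = REQUIRED_FIELDS - event.keys()
    if missing:
        return 400, {"error": f"missing fields: {sorted(missing)}"}
    queue.append(event)  # stand-in for Redis LPUSH
    return 202, {"status": "accepted", "event_id": event["event_id"]}
```

Because the handler never touches the dashboard or Redis processing path, the sender sees a fast 202 even when downstream processing is slow.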
# Navigate to day15 directory
cd day15
# Install dependencies
pip install -r day15_requirements.txt
# (Optional) Start Redis if using real Redis
# redis-server
The system is configured via config/.env. Key settings:
# Use mock mode (no Redis required) - default for testing
DAY15_USE_MOCK_REDIS=true
# Or use real Redis
DAY15_USE_MOCK_REDIS=false
DAY15_REDIS_HOST=localhost
DAY15_REDIS_PORT=6379
# Webhook settings
DAY15_WEBHOOK_PORT=5015
# Processing settings
DAY15_BATCH_SIZE=10
DAY15_BATCH_TIMEOUT_SECONDS=5
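These DAY15_* settings could be read into typed Python values along the following lines. The variable names match the ones above; the helper functions themselves are hypothetical, not the project's actual config module.

```python
import os

def env_str(name: str, default: str) -> str:
    return os.environ.get(name, default)

def env_int(name: str, default: int) -> int:
    return int(os.environ.get(name, default))

def env_bool(name: str, default: bool) -> bool:
    raw = os.environ.get(name)
    return default if raw is None else raw.strip().lower() in {"1", "true", "yes"}

# Defaults mirror the reference table in this README
CONFIG = {
    "use_mock_redis": env_bool("DAY15_USE_MOCK_REDIS", True),
    "redis_host": env_str("DAY15_REDIS_HOST", "localhost"),
    "redis_port": env_int("DAY15_REDIS_PORT", 6379),
    "webhook_port": env_int("DAY15_WEBHOOK_PORT", 5015),
    "batch_size": env_int("DAY15_BATCH_SIZE", 10),
    "batch_timeout_seconds": env_int("DAY15_BATCH_TIMEOUT_SECONDS", 5),
}
```

Reading everything once at startup into a plain dict keeps the webhook and consumer processes in agreement about the same settings.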
# Start both webhook and consumer
python3 day15_ORCHESTRATOR_main.py
You should see:
================================================================================
Day 15 - Real-Time Analytics Orchestrator
Author: Ricardo (SaaS Analytics)
================================================================================
Webhook endpoint: http://localhost:5015/webhook/events
Health check: http://localhost:5015/health
================================================================================
In another terminal:
# Run basic test scenario (3 events)
python3 day15_TEST_event_generator.py basic
# Run all test scenarios
python3 day15_TEST_event_generator.py
# Test idempotency (duplicate detection)
python3 day15_TEST_event_generator.py idempotency
# Test burst load (50 events)
python3 day15_TEST_event_generator.py burst
# Test batch endpoint (20 events at once)
python3 day15_TEST_event_generator.py batch
# Health check
curl http://localhost:5015/health
# Send single event
curl -X POST http://localhost:5015/webhook/events \
-H "Content-Type: application/json" \
-d '{
"event_id": "evt_123456",
"event_type": "user_signup",
"user_id": "user_abc123",
"timestamp": "2025-12-16T10:00:00Z",
"metadata": {
"source": "organic",
"email": "test@example.com"
}
}'
# Send batch of events
curl -X POST http://localhost:5015/webhook/batch \
-H "Content-Type: application/json" \
-d '{
"events": [
{ "event_id": "evt_001", "event_type": "user_signup", "timestamp": "2025-12-16T10:00:00Z" },
{ "event_id": "evt_002", "event_type": "subscription_created", "timestamp": "2025-12-16T10:01:00Z" }
]
}'
All activity is logged to day15/logs/day15_pipeline.log:
# Watch logs in real-time
tail -f day15/logs/day15_pipeline.log
# View processing statistics
grep "STATISTICS" day15/logs/day15_pipeline.log
Latency is measured from received_at to processing completion:
2025-12-16 19:30:15,234 - INFO - Event received: user_signup (evt_a1b2c3) - latency: 12.45ms
2025-12-16 19:30:15,235 - INFO - Event queued (mock): evt_a1b2c3, queue size: 1
2025-12-16 19:30:20,567 - INFO - Processing batch of 3 events
2025-12-16 19:30:20,578 - INFO - User signup: user_abc123 from organic
2025-12-16 19:30:20,579 - INFO - Processed user_signup (evt_a1b2c3) - latency: 5343.40ms
2025-12-16 19:30:20,601 - INFO - Batch complete: 3 success, 0 failed
| Variable | Default | Description |
|---|---|---|
| DAY15_REDIS_HOST | localhost | Redis server hostname |
| DAY15_REDIS_PORT | 6379 | Redis server port |
| DAY15_REDIS_PASSWORD | "" | Redis password (if required) |
| DAY15_WEBHOOK_PORT | 5015 | Webhook server port |
| DAY15_BATCH_SIZE | 10 | Events per batch |
| DAY15_BATCH_TIMEOUT_SECONDS | 5 | Wait time before processing a partial batch |
| DAY15_MAX_RETRIES | 3 | Retry attempts for failed events |
| DAY15_IDEMPOTENCY_TTL_HOURS | 24 | How long to remember processed event IDs |
| DAY15_USE_MOCK_REDIS | true | Use in-memory queue instead of Redis |
| DAY15_DRY_RUN | false | Log events without processing |
| DAY15_DASHBOARD_UPDATE_ENABLED | false | Enable dashboard API updates |
Tests one event of each type (signup, subscription, usage):
python3 day15_TEST_event_generator.py basic
Sends same event 3 times to verify duplicate detection:
python3 day15_TEST_event_generator.py idempotency
Expected: 1 accepted, 2 ignored as duplicates
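The duplicate detection this scenario exercises amounts to a TTL-bounded seen-set keyed by event_id. In Redis this is typically done with SET NX plus an expiry; the sketch below uses a plain dict as a stand-in and is not the project's actual implementation.

```python
import time

class IdempotencyGuard:
    """Remembers processed event IDs for ttl_seconds; an in-memory
    mock of a Redis-backed check (SET key NX EX ttl)."""
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._seen = {}  # event_id -> expiry timestamp

    def first_time(self, event_id: str) -> bool:
        now = time.monotonic()
        expiry = self._seen.get(event_id)
        if expiry is not None and expiry > now:
            return False  # duplicate within the TTL window
        self._seen[event_id] = now + self.ttl
        return True
```

Sending the same event three times gives one True and two False results, matching the expected "1 accepted, 2 ignored" outcome above.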
Sends 50 random events to test throughput:
python3 day15_TEST_event_generator.py burst
Sends 20 events in a single batch request:
python3 day15_TEST_event_generator.py batch
day15/
├── day15_CONFIG_redis.py              # Configuration management
├── day15_WEBHOOK_receiver.py          # Flask webhook server
├── day15_CONSUMER_batch_processor.py  # Event processing consumer
├── day15_ORCHESTRATOR_main.py         # Process orchestration
├── day15_TEST_event_generator.py      # Synthetic event generator
├── day15_requirements.txt             # Python dependencies
├── .env.example                       # Environment variable template
├── logs/
│   └── day15_pipeline.log             # Execution logs
├── data/
│   └── day15_sample_events.json       # Sample event data
└── README.md                          # Full documentation
Problem: Connection refused when testing
Solution: Check if port 5015 is available:
lsof -i :5015
# If in use, change DAY15_WEBHOOK_PORT in config/.env
Problem: Redis connection failed error
Solution: Use mock mode for testing:
# In config/.env
DAY15_USE_MOCK_REDIS=true
Or start Redis server:
redis-server
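Mock mode works because both backends can expose the same tiny queue interface, so the webhook and consumer code stay identical either way. Below is a sketch of what the in-memory stand-in could look like; the real class presumably wraps redis-py's list commands (lpush/rpop) instead.

```python
from collections import deque

class MockRedisQueue:
    """In-memory substitute for a Redis list, letting the pipeline
    run without a Redis server (DAY15_USE_MOCK_REDIS=true)."""
    def __init__(self):
        self._items = deque()

    def lpush(self, value: str) -> int:
        """Push onto the head of the list, like Redis LPUSH."""
        self._items.appendleft(value)
        return len(self._items)

    def rpop(self):
        """Pop from the tail (FIFO with lpush), like Redis RPOP.
        Returns None when the queue is empty."""
        return self._items.pop() if self._items else None

    def llen(self) -> int:
        return len(self._items)
```

Swapping between this class and a real Redis client then becomes a pure configuration decision rather than a code change.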
Problem: Events queued but not processed
Solution: Check consumer logs:
grep "consumer" day15/logs/day15_pipeline.log
Ensure both webhook and consumer processes are running.
Problem: Processing latency > 10 seconds
Solution:
- Reduce DAY15_BATCH_SIZE for faster processing
- Or increase DAY15_BATCH_TIMEOUT_SECONDS to wait for fuller batches

This project demonstrates:
- Event-driven webhook processing with non-blocking acknowledgement
- Micro-batch consumption with idempotency checks and a dead letter queue
- Consistent file naming with the day15_ pattern

For questions or issues:
- Check the logs: day15/logs/day15_pipeline.log
- Review the configuration: config/.env
- Try mock mode: DAY15_USE_MOCK_REDIS=true

Time to Deliver: 3 hours
Status: ✅ Complete
Portfolio Ready: Yes