@@ -16,30 +16,40 @@ A high-performance Redis load testing application built with Python and redis-py
 
 ### Prerequisites
 
-You'll need the **Redis Metrics Stack** running to collect and visualize metrics:
+- **Python 3.8+** with pip
+- **Redis server** running (localhost:6379 by default)
+- **Redis Metrics Stack** (optional, for observability - separate repository)
+
+For metrics collection and visualization, you'll need the separate metrics stack:
 
 ```bash
-# Clone and start the metrics stack (separate repository)
-git clone <redis-metrics-stack-repo>
+# Clone and start the metrics stack (optional - separate repository)
+git clone <redis-metrics-stack-repo-url>
 cd redis-metrics-stack
 make start
 ```
 
 ### 🛠️ Local Development (Recommended)
 
-**Fast iteration with external metrics stack:**
+**Quick setup and testing:**
 
 ```bash
 # Install dependencies
 make install-deps
 
-# Run tests repeatedly (fast iteration)
+# Test Redis connection
+make test-connection
+
+# Run basic test (60 seconds)
+make test
+
+# Run custom tests
 python main.py run --workload-profile basic_rw --duration 30
 python main.py run --workload-profile high_throughput --duration 60
 python main.py run --workload-profile basic_rw # unlimited test run
 ```
 
-**Access your dashboards (from metrics stack):**
+**With metrics stack (optional):**
 - **📊 Grafana**: http://localhost:3000 (admin/admin) - **Redis Test Dashboard**
 - **📈 Prometheus**: http://localhost:9090 - Raw metrics
 
@@ -51,10 +61,14 @@ python main.py run --workload-profile basic_rw # unlimited test run
 # Build the application
 make build
 
-# Run with Docker (requires external metrics stack)
+# Run with Docker (standalone)
+docker run redis-py-test-app:latest \
+  python main.py run --workload-profile basic_rw --duration 60
+
+# Run with Docker (with metrics stack)
 docker run --network redis-test-network \
   -e OTEL_EXPORTER_OTLP_ENDPOINT=http://otel-collector:4317 \
-  redis-py-test-app \
+  redis-py-test-app:latest \
   python main.py run --workload-profile basic_rw --duration 60
 ```
 
@@ -71,10 +85,21 @@ cp .env.example .env
 # Edit .env with your Redis configuration
 
 # Test Redis connection
-python main.py test-connection
+make test-connection
+```
+
+### Available Make Commands
+
+```bash
+make help            # Show all available commands
+make install-deps    # Install Python dependencies
+make test-connection # Test Redis connection
+make test            # Run basic test (60 seconds)
+make build           # Build Docker image
+make clean           # Clean up Python cache and virtual environment
 ```
 
-### Access Services (from metrics stack)
+### Access Services (with metrics stack)
 - **📊 Grafana**: http://localhost:3000 (admin/admin) - **Redis Test Dashboard**
 - **📈 Prometheus**: http://localhost:9090 - Raw metrics and queries
 - **📡 Redis**: localhost:6379 - Database connection
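For reference, connectivity to the Redis endpoint listed above can also be checked directly from redis-py, independent of the app's `make test-connection` target. A minimal sketch, assuming the default `localhost:6379` with no authentication:

```python
import redis

# Standalone connectivity check; host/port are the README defaults,
# not values read from the application's .env configuration.
client = redis.Redis(host="localhost", port=6379, decode_responses=True)
print(client.ping())  # True when the server is reachable
```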
@@ -103,56 +128,59 @@ python main.py run --workload-profile high_throughput
 
 #### **Quick Tests:**
 ```bash
+# Test Redis connection
+make test-connection
+
 # 60-second basic test
 make test
 
-# Test Redis connection
-make test-connection
+# Build Docker image
+make build
 ```
 
 #### **Custom Test Examples:**
 
 **Basic Read/Write Operations:**
 ```bash
 # 30-second test
-./venv/bin/python main.py run --workload-profile basic_rw --duration 30
+python main.py run --workload-profile basic_rw --duration 30
 
 # Unlimited duration (until Ctrl+C)
-./venv/bin/python main.py run --workload-profile basic_rw
+python main.py run --workload-profile basic_rw
 ```
 
 **High Throughput Testing:**
 ```bash
 # 60-second high-load test
-./venv/bin/python main.py run --workload-profile high_throughput --duration 60 --threads-per-client 4
+python main.py run --workload-profile high_throughput --duration 60 --threads-per-client 4
 
 # Unlimited high-load test (for sustained performance monitoring)
-./venv/bin/python main.py run --workload-profile high_throughput --threads-per-client 4
+python main.py run --workload-profile high_throughput --threads-per-client 4
 ```
 
 **List Operations:**
 ```bash
 # 45-second list operations test
-./venv/bin/python main.py run --workload-profile list_operations --duration 45
+python main.py run --workload-profile list_operations --duration 45
 
 # Unlimited list operations (for memory usage monitoring)
-./venv/bin/python main.py run --workload-profile list_operations
+python main.py run --workload-profile list_operations
 ```
 
 **Async Mixed Workload:**
 ```bash
 # 2-minute mixed workload test
-./venv/bin/python main.py run --workload-profile async_mixed --duration 120
+python main.py run --workload-profile async_mixed --duration 120
 
 # Unlimited mixed workload (for long-term stability testing)
-./venv/bin/python main.py run --workload-profile async_mixed
+python main.py run --workload-profile async_mixed
 ```
 
 **Custom Configuration:**
 ```bash
 # Multiple clients with custom threading
 # Note: app-name will be concatenated with workload profile (python-dev-basic_rw)
-./venv/bin/python main.py run \
+python main.py run \
   --workload-profile basic_rw \
   --duration 300 \
   --client-instances 2 \
@@ -168,7 +196,7 @@ make test-connection
 export TEST_DURATION=180
 export TEST_CLIENT_INSTANCES=3
 export APP_NAME=python-custom
-./venv/bin/python main.py run --workload-profile high_throughput
+python main.py run --workload-profile high_throughput
 ```
 
 ### Available Make Commands
@@ -197,14 +225,14 @@ make clean # Clean up everything
 **Unlimited Test Monitoring:**
 ```bash
 # Start long-running test in background
-./venv/bin/python main.py run --workload-profile high_throughput &
+python main.py run --workload-profile high_throughput &
 
 # Monitor in real-time
 echo "Monitor at: http://localhost:3000"
 echo "Stop test with: kill %1"
 
 # Or run in foreground and stop with Ctrl+C
-./venv/bin/python main.py run --workload-profile basic_rw
+python main.py run --workload-profile basic_rw
 # Press Ctrl+C when ready to stop
 ```
 
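If the Grafana stack isn't running, server-side throughput can also be watched directly with redis-py while a test is in progress. An illustrative sketch, not part of the application, assuming the default `localhost:6379`:

```python
import time

import redis

# Poll Redis' own stats while a load test runs in another terminal.
client = redis.Redis(host="localhost", port=6379)
for _ in range(12):  # roughly one minute at 5-second intervals
    stats = client.info("stats")
    print("instantaneous_ops_per_sec:", stats.get("instantaneous_ops_per_sec"))
    time.sleep(5)
```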
@@ -272,13 +300,16 @@ VERSION=v1.0.0
 # OpenTelemetry
 OTEL_ENABLED=true
 OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
+
+# Output
+OUTPUT_FILE=test_results.json # If specified, final summary goes to file; otherwise stdout
 ```
 
 ### Performance Tuning Examples
 
 **Low Resource Testing:**
 ```bash
-./venv/bin/python main.py run \
+python main.py run \
   --workload-profile basic_rw \
   --duration 60 \
   --client-instances 1 \
@@ -288,7 +319,7 @@ OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
 
 **High Load Testing:**
 ```bash
-./venv/bin/python main.py run \
+python main.py run \
   --workload-profile high_throughput \
   --duration 300 \
   --client-instances 3 \
@@ -298,21 +329,60 @@ OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
 
 **Memory Usage Testing:**
 ```bash
-./venv/bin/python main.py run \
+python main.py run \
   --workload-profile list_operations \
   --duration 600 \
   --client-instances 2 \
   --connections-per-client 8 \
   --threads-per-client 3
 ```
 
+### Final Test Summary Output
+
+The application outputs a standardized final test summary at the end of each test run. This summary includes key metrics and connection statistics.
+
+**Output to stdout (default):**
+```bash
+python main.py run \
+  --workload-profile basic_rw \
+  --duration 60
+```
+
+**Output to file:**
+```bash
+python main.py run \
+  --workload-profile high_throughput \
+  --duration 300 \
+  --output-file results.json
+```
+
+**Example Final Summary Output:**
+```json
+{
+  "test_duration": "60.0s",
+  "workload_name": "basic_rw",
+  "total_commands_count": 1234567,
+  "successful_commands_count": 1234565,
+  "failed_commands_count": 2,
+  "success_rate": "99.84%",
+  "overall_throughput": 20576,
+  "connection_attempts": 5,
+  "connection_failures": 0,
+  "connection_success_rate": "100.00%",
+  "reconnection_count": 0,
+  "avg_reconnection_duration_ms": 0
+}
+```
+
 ## Manual Installation
 
 ### Prerequisites & Dependencies
 
 1. Install dependencies:
 ```bash
-pip install -r requirements.txt
+make install-deps
+# OR manually:
+# pip install -r requirements.txt
 ```
 
 2. Copy environment configuration (optional):
@@ -325,7 +395,9 @@ cp .env.example .env
 
 Test Redis connection:
 ```bash
-python main.py test-connection
+make test-connection
+# OR manually:
+# python main.py test-connection
 ```
 
 Run a basic load test:
@@ -417,11 +489,41 @@ python main.py run \
 
 ## Metrics & Monitoring
 
-- **Prometheus metrics** available at `http://localhost:8000/metrics`
+The application provides comprehensive metrics and monitoring capabilities:
+
+- **Final test summary** output to stdout or JSON file with `--output-file`
 - **Real-time statistics** during test execution
-- **JSON export** for custom analysis with `--output-file`
+- **OpenTelemetry metrics** sent to external metrics stack (optional)
 - **Comprehensive logging** with configurable levels
 
+### Final Test Summary
+
+```bash
+# Print summary to stdout (default)
+python main.py run --workload-profile basic_rw --duration 60
+
+# Save summary to JSON file
+python main.py run --workload-profile basic_rw --duration 60 --output-file results.json
+```
+
+Example output:
+```json
+{
+  "test_duration": "60.0s",
+  "workload_name": "basic_rw",
+  "total_commands_count": 1234567,
+  "successful_commands_count": 1234565,
+  "failed_commands_count": 2,
+  "success_rate": "99.84%",
+  "overall_throughput": 20576,
+  "connection_attempts": 5,
+  "connection_failures": 0,
+  "connection_success_rate": "100.00%",
+  "reconnection_count": 0,
+  "avg_reconnection_duration_ms": 0
+}
+```
+
 ## 📊 Metrics and Observability
 
 This application sends metrics to an external **Redis Metrics Stack** using OpenTelemetry.
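For context on what that wiring typically looks like, here is a minimal sketch of OTLP metric export with the standard opentelemetry-python SDK; the instrument name and attributes are illustrative assumptions, not the application's actual metrics (those live in `metrics.py`):

```python
from opentelemetry import metrics
from opentelemetry.exporter.otlp.proto.grpc.metric_exporter import OTLPMetricExporter
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk.metrics.export import PeriodicExportingMetricReader

# Export metrics over OTLP/gRPC to the collector endpoint used elsewhere in the README.
exporter = OTLPMetricExporter(endpoint="http://localhost:4317", insecure=True)
reader = PeriodicExportingMetricReader(exporter)
metrics.set_meter_provider(MeterProvider(metric_readers=[reader]))

meter = metrics.get_meter("redis-py-test-app")
# Hypothetical counter, incremented once per executed Redis command.
commands_total = meter.create_counter(
    "redis_commands_total", description="Total Redis commands executed"
)
commands_total.add(1, {"workload": "basic_rw", "status": "success"})
```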
@@ -478,18 +580,20 @@ The application is designed for cloud deployment with:
 - `redis_client.py` - Redis connection management with resilience
 - `workloads.py` - Workload implementations
 - `test_runner.py` - Main test execution engine
-- `metrics.py` - OpenTelemetry metrics collection
+- `metrics.py` - Metrics collection and final test summaries
 - `logger.py` - Logging configuration
 - `requirements.txt` - Python dependencies
 - `.env.example` - Environment variables template
 - `Makefile` - Development and build commands
 - `Dockerfile` - Container image definition
-- `setup.sh` - Quick setup script
 
 ## Related Projects
 
 - **`redis-metrics-stack`** - Observability infrastructure (separate repository)
-  - OpenTelemetry Collector
-  - Prometheus metrics storage
-  - Grafana dashboards
+  - OpenTelemetry Collector for metrics collection
+  - Prometheus for metrics storage
+  - Grafana dashboards for visualization
   - Redis database for testing
+  - Complete monitoring stack for Redis applications
+
+**Note**: This Redis test application is standalone and doesn't require the metrics stack to function. The metrics stack provides additional observability features for advanced monitoring and visualization.