Status: Open
Labels: documentation (Improvements or additions to documentation), enhancement (New feature or request)
Description
Summary
Add benchmark results and performance metrics to the README and documentation site to showcase Rustrak's performance characteristics.
Motivation
Rustrak already has a comprehensive benchmarking suite (packages/benchmarks/) that produces detailed performance metrics. However, these results are not visible to potential users evaluating the project. Publishing real benchmark data would:
- Demonstrate the "lightweight" and "fast" claims with concrete numbers
- Help users understand expected performance for capacity planning
- Build trust through transparency
- Differentiate Rustrak from heavier alternatives
Proposed Solution
1. Add Performance section to README.md
Add a "Performance" section after "Why Rustrak?" with key metrics:
```markdown
## Performance

Benchmarked on [hardware specs], [Rustrak version]:

| Metric | Value |
|--------|-------|
| Throughput | X,XXX events/second |
| P50 Latency | X.X ms |
| P99 Latency | XX.X ms |
| Memory (idle) | XX MB |
| Memory (peak @ 1k rps) | XXX MB |
| Docker Image | ~20 MB |
```
2. Add detailed benchmark page to documentation
Create a dedicated benchmarks page in apps/docs/ with:
- Methodology (scenarios, hardware, configuration)
- Detailed results for each scenario (baseline, burst, sustained, stress)
- Comparison charts
- Instructions to reproduce benchmarks
3. Consider CI automation (optional)
- Run benchmarks on release tags
- Auto-update benchmark results in docs
- Track performance regressions
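The optional CI step could look something like the following sketch, assuming GitHub Actions; the workflow name, the `npm run bench` script, and the `results/` output path are illustrative assumptions, not the repo's actual scripts:

```yaml
# Hypothetical workflow: run the benchmark suite on release tags
# and keep the JSON results as a build artifact.
name: benchmarks
on:
  push:
    tags: ["v*"]
jobs:
  bench:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Assumed script name; adapt to the actual benchmark entry point.
      - name: Run benchmark suite
        run: npm run bench --workspace packages/benchmarks
      # Assumed output path for JSON results.
      - name: Upload results
        uses: actions/upload-artifact@v4
        with:
          name: benchmark-results
          path: packages/benchmarks/results/
```

A follow-up job (or a scheduled workflow) could then diff the uploaded JSON against a stored baseline to flag regressions.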
Files to Update
- README.md - Add performance summary section
- apps/docs/ - Create detailed benchmark documentation page
- Potentially packages/benchmarks/results/ - Add reference baseline results
Additional Context
The benchmark suite already supports:
- Multiple scenarios (baseline, burst, sustained, stress)
- Latency histograms (P50/P95/P99)
- Memory and CPU metrics via Docker
- JSON output for comparison
- Result comparison tooling
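For readers unfamiliar with the P50/P95/P99 metrics the suite reports, here is a minimal, self-contained sketch of nearest-rank percentile computation over latency samples. This is an illustrative helper, not Rustrak's actual implementation:

```rust
/// Nearest-rank percentile of a sorted latency sample (in ms).
/// `p` is in (0, 100]. Hypothetical helper for illustration only.
fn percentile(sorted_ms: &[f64], p: f64) -> f64 {
    assert!(!sorted_ms.is_empty());
    // Rank of the p-th percentile, 1-based, rounded up.
    let rank = ((p / 100.0) * sorted_ms.len() as f64).ceil() as usize;
    // Convert to a 0-based index, clamped to the slice bounds.
    sorted_ms[rank.saturating_sub(1).min(sorted_ms.len() - 1)]
}

fn main() {
    // 100 evenly spaced samples: 1.0 ms .. 100.0 ms.
    let mut samples: Vec<f64> = (1..=100).map(|n| n as f64).collect();
    samples.sort_by(|a, b| a.partial_cmp(b).unwrap());
    println!("P50 = {} ms", percentile(&samples, 50.0)); // 50 ms
    println!("P99 = {} ms", percentile(&samples, 99.0)); // 99 ms
}
```

P50 (the median) describes typical latency, while P99 captures tail behavior, which is usually what matters for capacity planning.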
See packages/benchmarks/README.md for full benchmark documentation.