A PostgreSQL performance testing framework that benchmarks indexing and partitioning strategies. Helps DBAs and developers measure the real-world impact of optimization techniques through automated scenarios.
An example report showcasing the output format and performance insights is available under `media/example_reports/`.
- Comprehensive Performance Scenarios: 13 scenarios covering PostgreSQL indexing, partitioning, and caching techniques
- Automated Benchmarking: Time-based performance measurements with detailed reporting
- Docker Integration: Easy setup with PostgreSQL 15 container
- Rich CLI Interface: Beautiful terminal output with progress tracking and HTML reports
- Realistic Data Generation: 1M+ synthetic user records for meaningful performance testing
- Clean Architecture: Modular design with separate setup and execution phases
- S01: Single Column Indexes - Basic B-tree index performance
- S02: Composite Indexes - Multi-column index optimization
- S03: Covering Indexes - Include-only queries for maximum performance
- S04: Unique Indexes - Constraint enforcement performance
- S05: Partial Indexes - Conditional indexing strategies
- S06: GIN Indexes - Full-text search and array operations
- S07: Foreign Key Indexes - Referential integrity performance
- S08: Bulk Loading Strategies - Data import optimization
- S09: Expression Indexes - Computed column indexing
- S10: Index Bloat - Maintenance and optimization
- S11: Range Partitioning - Date-based table partitioning
- S12: List Partitioning - Categorical data partitioning
- S13: Materialized Views - Pre-computed result caching
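As an illustration of what a scenario such as S01 compares, the sketch below pairs a lookup query with the index that speeds it up. The table and column names are hypothetical; the actual SQL lives in each scenario's directory under `src/scenarios/`.

```python
# Hypothetical SQL for an S01-style comparison: the same lookup is timed
# before and after a single-column B-tree index exists.
BASELINE_QUERY = "SELECT * FROM users WHERE email = %s;"
CREATE_INDEX = "CREATE INDEX idx_users_email ON users (email);"
DROP_INDEX = "DROP INDEX IF EXISTS idx_users_email;"  # cleanup phase
```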
- Python 3.8 or higher
- Docker and Docker Compose
- Git
- Clone the repository

  ```bash
  git clone <repository-url>
  cd postgres-indexing-partitioning-performance-optimization
  ```

- Start the PostgreSQL container

  ```bash
  docker-compose up -d
  ```

- Install Python dependencies

  ```bash
  cd src
  pip install -r requirements.txt
  ```

- Verify the setup

  ```bash
  python cli.py list
  ```
List the available scenarios:

```bash
python cli.py list
```

Run a single scenario:

```bash
python cli.py run S01_single_column_index
```

Run all scenarios:

```bash
python cli.py run-all
```

```
postgres-indexing-partitioning-performance-optimization/
├── docker-compose.yml          # PostgreSQL container configuration
├── LICENSE                     # MIT License
├── README.md                   # This file
├── media/
│   └── example_reports/        # Generated HTML reports
└── src/
    ├── cli.py                  # Command-line interface
    ├── requirements.txt        # Python dependencies
    ├── scenario_manager.py     # Scenario orchestration
    ├── core/                   # Core functionality
    │   ├── base.py             # Abstract base classes
    │   ├── benchmark.py        # Performance timing utilities
    │   ├── database.py         # Database connection management
    │   └── logger.py           # Rich console output
    ├── datasets/
    │   └── generator.py        # Synthetic data generation
    └── scenarios/              # Performance test scenarios
        ├── base_data_setup.py  # Common data setup
        ├── S01_single_column_index/
        ├── S02_composite_indexes/
        ├── S03_covering_indexes/
        ├── S04_unique_indexes/
        ├── S05_partial_indexes/
        ├── S06_gin_indexes/
        ├── S07_foreign_key_indexes/
        ├── S08_bulk_loading_strategies/
        ├── S09_expression_indexes/
        ├── S10_index_bloat/
        ├── S11_range_partitioning/
        ├── S12_list_partitioning/
        └── S13_materialized_views/
```
The tool connects to PostgreSQL using these default settings:
- Host: localhost
- Port: 5432
- Database: master_db
- Username: service_user
- Password: service_password
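The defaults above can be assembled into a libpq-style connection string, as the sketch below shows. Which driver the project actually uses (psycopg2, asyncpg, etc.) is not specified here, so this is only a format illustration.

```python
# Default connection settings used by the tool, rendered as a
# libpq-style DSN string. The driver choice is an assumption.
DEFAULTS = {
    "host": "localhost",
    "port": 5432,
    "dbname": "master_db",
    "user": "service_user",
    "password": "service_password",
}

# libpq accepts "key=value" pairs separated by spaces
dsn = " ".join(f"{key}={value}" for key, value in DEFAULTS.items())
print(dsn)
```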
- Default dataset: 1,000,000 user records
- Data types: Users with demographics, timestamps, and location data
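A minimal sketch of the kind of record the generator produces, given the description above (demographics, timestamps, and location data). The field names and value ranges here are hypothetical; the real schema lives in `src/datasets/generator.py`.

```python
import random
from datetime import datetime, timedelta

def make_user(user_id: int) -> dict:
    """Build one hypothetical synthetic user record."""
    signup = datetime(2020, 1, 1) + timedelta(days=random.randrange(1500))
    return {
        "id": user_id,
        "age": random.randint(18, 90),                       # demographics
        "country": random.choice(["US", "DE", "IN", "BR"]),  # location data
        "created_at": signup,                                # timestamp
    }

# The real tool generates 1M+ rows; 1,000 keeps this sketch fast.
users = [make_user(i) for i in range(1000)]
```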
- Setup Phase: Create tables and populate with synthetic data
- Execution Phase: Measure baseline performance, apply the optimization technique, then re-measure
- Cleanup Phase: Remove data, indexes, and temporary tables created during the setup phase
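The baseline-then-optimized measurement flow can be sketched with a simple timing helper. The names below are hypothetical, not the project's actual API (see `src/core/benchmark.py` for the real utilities), and the workloads are stand-ins for SQL queries.

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label: str, results: dict):
    """Record the wall-clock duration of the wrapped block under `label`."""
    start = time.perf_counter()
    yield
    results[label] = time.perf_counter() - start

results = {}
with timed("baseline", results):
    sum(i * i for i in range(100_000))  # stand-in for the unoptimized query
with timed("optimized", results):
    sum(i * i for i in range(10_000))   # stand-in for the indexed query

speedup = results["baseline"] / results["optimized"]
```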
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
Author: Siva Sai Krishna