# PostgreSQL: Setup & Example Usage

This page provides a complete guide for analyzing PostgreSQL queries with IQToolkit Analyzer.
## Table of Contents

- 🔥 NEW: Database-Direct EXPLAIN Analysis
- 1. Enable Slow Query Logging in PostgreSQL
- 2. Find Your Log Files
- 3. Generate Example Slow Queries
- 4. Collect and Analyze Logs
- 5. Supported Log Formats
- 6. Example Output
- 7. Troubleshooting
## 🔥 NEW: Database-Direct EXPLAIN Analysis

The fastest way to analyze PostgreSQL queries: no log files needed!
Run EXPLAIN directly against your database using the IQToolkit config file:
### Quick Start

- Create a config file with your database connection details:

  ```yaml
  # ~/.iqtoolkit/config.yaml
  default_provider: openai

  providers:
    openai:
      api_key: ${OPENAI_API_KEY}
      model: gpt-4o-mini

  databases:
    my_database:
      type: postgres
      host: localhost
      port: 5432
      database: production_db
      user: postgres
      password: ${DB_PASSWORD}

    rds_demo:
      type: postgres
      host: my-db.us-east-1.rds.amazonaws.com
      port: 5432
      database: demo
      user: admin
      password: ${RDS_PASSWORD}
  ```
- Run EXPLAIN on any query:

  ```bash
  # Analyze an inline query
  poetry run python -m iqtoolkit_analyzer \
    --config ~/.iqtoolkit/config.yaml \
    postgresql \
    --db-name my_database \
    --sql "SELECT * FROM orders WHERE customer_email = 'user@example.com' AND order_date > '2024-01-01'" \
    --output analysis.md

  # Analyze a query from a file
  poetry run python -m iqtoolkit_analyzer \
    --config ~/.iqtoolkit/config.yaml \
    postgresql \
    --db-name rds_demo \
    --query-file slow_query.sql \
    --output analysis.md
  ```
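
If you keep queries in files for `--query-file`, a heredoc is a quick way to create one. This is just a sketch; the table and column names below are placeholders for your own schema:

```bash
# Save a query to a file for use with --query-file
# (the query is a placeholder; substitute your own)
cat > slow_query.sql <<'SQL'
SELECT o.id, o.order_date, c.email
FROM orders o
JOIN customers c ON c.id = o.customer_id
WHERE o.order_date > '2024-01-01';
SQL
```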
### What You Get

- Automatic `EXPLAIN ANALYZE` execution against your database
- Full execution plan with Index Scan, Seq Scan, buffer usage, etc.
- AI-powered recommendations based on actual execution metrics
- Contextual performance assessment (fast vs. slow, based on actual timing)
- No log file parsing; the analyzer queries the database directly
### Benefits vs Log Analysis

| Feature | Database-Direct | Log File |
|---|---|---|
| Setup required | Config file only | Enable logging + wait for queries |
| Analysis speed | Instant | Requires historical data |
| EXPLAIN context | Full execution plan | Limited to logged data |
| Query flexibility | Any query, anytime | Only logged slow queries |
| Production impact | Single query | Continuous logging overhead |
### Requirements

- IQToolkit config file (`~/.iqtoolkit/config.yaml`) with database connection details
- Database credentials with SELECT privileges
- Python package `psycopg` (v3, installed automatically)
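
To confirm the driver is actually importable in your environment (a quick sanity check, not an IQToolkit command):

```bash
# Verify that psycopg v3 is available in the project environment
poetry run python -c "import psycopg; print(psycopg.__version__)"
```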
### Environment Variables

Use environment variables for sensitive credentials:

```bash
export OPENAI_API_KEY="sk-..."
export DB_PASSWORD="your_password"
export RDS_PASSWORD="your_rds_password"
```

The config file automatically expands `${VARIABLE_NAME}` references.
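
Because the config references these variables, it can help to fail fast when one is unset. A minimal shell guard, assuming the variable names from the example config above:

```bash
# Fail fast if any credential referenced in the config is missing
# (uses bash indirect expansion, so run under bash)
for var in OPENAI_API_KEY DB_PASSWORD RDS_PASSWORD; do
  if [ -z "${!var:-}" ]; then
    echo "Missing required environment variable: $var" >&2
    exit 1
  fi
done
```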
## 1. Enable Slow Query Logging in PostgreSQL

Edit your `postgresql.conf` (commonly at `/etc/postgresql/*/main/postgresql.conf` or `/usr/local/var/postgres/postgresql.conf`):

```ini
logging_collector = on
log_directory = 'log'                 # or an absolute path
log_filename = 'postgresql-%Y-%m-%d_%H%M%S.log'
log_min_duration_statement = 1000     # log queries slower than 1 second (adjust as needed)
log_statement = 'none'
log_duration = off
log_line_prefix = '%m [%p] %u@%d '
log_destination = 'stderr'
```
Restart PostgreSQL after making changes:

```bash
sudo systemctl restart postgresql
# or
pg_ctl restart
```
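
After restarting, it is worth confirming the settings took effect. One way, assuming you can connect as a superuser:

```bash
# Confirm the logging settings are active
psql -U postgres -c "SHOW logging_collector;"
psql -U postgres -c "SHOW log_min_duration_statement;"
```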
### Enable for Current Session (optional)

Setting `log_min_duration_statement` per session typically requires superuser privileges:

```sql
SET log_min_duration_statement = 1000;
```
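
If you prefer not to edit `postgresql.conf` by hand, `ALTER SYSTEM` writes the setting to `postgresql.auto.conf`, and a configuration reload applies it; unlike `logging_collector`, this particular setting does not need a restart:

```bash
# Persist the threshold and reload the configuration
psql -U postgres -c "ALTER SYSTEM SET log_min_duration_statement = 1000;"
psql -U postgres -c "SELECT pg_reload_conf();"
```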
## 2. Find Your Log Files

Check the `log_directory` you set above. Common locations:

- /var/log/postgresql/
- /usr/local/var/postgres/log/
- /opt/homebrew/var/postgresql@*/log/
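
If you are unsure, the server can report the location itself. `pg_current_logfile()` is available on PostgreSQL 10 and later:

```bash
# Ask the server where it is writing logs
psql -U postgres -c "SHOW log_directory;"
psql -U postgres -c "SELECT pg_current_logfile();"  # PostgreSQL 10+
```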
## 3. Generate Example Slow Queries

Run slow queries in your database to generate log entries. For example:

```sql
SELECT * FROM employees ORDER BY random() LIMIT 10000;
SELECT COUNT(*) FROM sales WHERE amount > 1000;
```
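
If you want a guaranteed log entry regardless of your schema, `pg_sleep()` will exceed any one-second threshold:

```bash
# Guaranteed to exceed a 1000 ms log_min_duration_statement threshold
psql -U postgres -c "SELECT pg_sleep(2);"
```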
## 4. Collect and Analyze Logs

Copy the relevant log file to your project directory's `sample_logs/` folder (create it if it doesn't exist). For example:

```bash
mkdir -p sample_logs
cp /var/log/postgresql/postgresql-2025-10-31_*.log ./sample_logs/
```

Sample log file:

- `sample_logs/postgresql-2025-10-31_122408.log.txt` (plain text; multi-line queries supported)
### Analyze with IQToolkit Analyzer

```bash
# With Poetry (recommended)
poetry run python -m iqtoolkit_analyzer sample_logs/postgresql-2025-10-31_122408.log.txt

# Traditional approach
python -m iqtoolkit_analyzer sample_logs/postgresql-2025-10-31_122408.log.txt
```
### With Verbose Output

```bash
# With Poetry (recommended)
poetry run python -m iqtoolkit_analyzer sample_logs/postgresql-2025-10-31_122408.log.txt --verbose

# Traditional approach
python -m iqtoolkit_analyzer sample_logs/postgresql-2025-10-31_122408.log.txt --verbose
```
### Specify Output Report Path

```bash
# With Poetry (recommended)
poetry run python -m iqtoolkit_analyzer sample_logs/postgresql-2025-10-31_122408.log.txt --output reports/my_report.md

# Traditional approach
python -m iqtoolkit_analyzer sample_logs/postgresql-2025-10-31_122408.log.txt --output reports/my_report.md
```
### Analyze a Standalone EXPLAIN (ANALYZE, FORMAT JSON) File

If you already have an EXPLAIN JSON file (e.g., exported from psql or a saved plan), you can skip log parsing and analyze it directly:
```bash
# With Poetry
poetry run python -m iqtoolkit_analyzer postgresql --plan slow_query_plan.json --output reports/explain_report.md

# Traditional approach
python -m iqtoolkit_analyzer postgresql --plan slow_query_plan.json --output reports/explain_report.md
```
Requirements:

- The file must be produced with `EXPLAIN (ANALYZE, FORMAT JSON)` and include a top-level `Plan` key.
- `Execution Time` (preferred) or `Actual Total Time` is used for the duration shown in the report.
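
To produce such a file from `psql`, unaligned, tuples-only output writes clean JSON. The database name and query below are placeholders:

```bash
# Export an execution plan as JSON (-A unaligned, -t tuples only, -q quiet)
psql -U postgres -d production_db -Atq \
  -c "EXPLAIN (ANALYZE, FORMAT JSON) SELECT * FROM orders WHERE order_date > '2024-01-01'" \
  > slow_query_plan.json
```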
## 5. Supported Log Formats

IQToolkit Analyzer currently supports the following PostgreSQL log formats:

- Plain text PostgreSQL logs (default)
- CSV logs (`log_destination = 'csvlog'`)
- JSON logs (extensions and custom setups)
Ensure your logs include durations and statements for accurate analysis.
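
With the settings from step 1, each slow statement is logged with a `duration:` prefix, so a quick grep confirms a file has usable entries:

```bash
# Confirm the log contains duration entries before analyzing it
grep -m 5 "duration:" sample_logs/postgresql-2025-10-31_122408.log.txt
```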
## 6. Example Output

You can generate a Markdown report summarizing top slow queries, their durations, frequency, and AI-powered recommendations. See the main README for a detailed sample report.
## 7. Troubleshooting

- Ensure `log_min_duration_statement` is set appropriately (e.g., `1000` for 1 s)
- Confirm the log file path and permissions
- Use `--verbose` to get more details during analysis
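
Log files are often readable only by the `postgres` user. If the analyzer cannot open them, check ownership and copy with elevated privileges; the paths below match the earlier examples:

```bash
# Check who can read the logs, then copy with elevated privileges if needed
ls -l /var/log/postgresql/
sudo cp /var/log/postgresql/postgresql-2025-10-31_*.log ./sample_logs/
sudo chown "$USER" sample_logs/*.log
```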