This page provides a complete guide for analyzing PostgreSQL queries with IQToolkit Analyzer.
The fastest way to analyze PostgreSQL queries - no log files needed!
Run EXPLAIN directly against your database using the IQToolkit config file:
```yaml
# ~/.iqtoolkit/config.yaml
default_provider: openai

providers:
  openai:
    api_key: ${OPENAI_API_KEY}
    model: gpt-4o-mini

databases:
  my_database:
    type: postgres
    host: localhost
    port: 5432
    database: production_db
    user: postgres
    password: ${DB_PASSWORD}

  rds_demo:
    type: postgres
    host: my-db.us-east-1.rds.amazonaws.com
    port: 5432
    database: demo
    user: admin
    password: ${RDS_PASSWORD}
```
```bash
# Analyze an inline query
poetry run python -m iqtoolkit_analyzer \
  --config ~/.iqtoolkit/config.yaml \
  postgresql \
  --db-name my_database \
  --sql "SELECT * FROM orders WHERE customer_email = 'user@example.com' AND order_date > '2024-01-01'" \
  --output analysis.md

# Analyze a query from a file
poetry run python -m iqtoolkit_analyzer \
  --config ~/.iqtoolkit/config.yaml \
  postgresql \
  --db-name rds_demo \
  --query-file slow_query.sql \
  --output analysis.md
```
| Feature | Database-Direct | Log File |
|---|---|---|
| Setup required | Config file only | Enable logging + wait for queries |
| Analysis speed | Instant | Requires historical data |
| EXPLAIN context | Full execution plan | Limited to logged data |
| Query flexibility | Any query, anytime | Only logged slow queries |
| Production impact | Single query | Continuous logging overhead |
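Conceptually, database-direct analysis amounts to wrapping your SQL in an `EXPLAIN` statement and running it once against the target database. A minimal sketch of that wrapping step (the exact `EXPLAIN` options the tool uses are an assumption here):

```python
def explain_statement(sql: str, analyze: bool = True) -> str:
    """Wrap a query in EXPLAIN so PostgreSQL returns a JSON plan.

    ANALYZE actually executes the query, so use it carefully against
    production data (e.g. avoid it for destructive statements).
    """
    options = ["ANALYZE"] if analyze else []
    options.append("FORMAT JSON")
    return f"EXPLAIN ({', '.join(options)}) {sql}"

print(explain_statement("SELECT count(*) FROM sales WHERE amount > 1000"))
# EXPLAIN (ANALYZE, FORMAT JSON) SELECT count(*) FROM sales WHERE amount > 1000
```

This is the "single query" impact noted in the table: one `EXPLAIN` round-trip, rather than continuous logging.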
Requirements:

- A config file (`~/.iqtoolkit/config.yaml`) with database connection details
- `psycopg` (v3, installed automatically)

Use environment variables for sensitive credentials:

```bash
export OPENAI_API_KEY="sk-..."
export DB_PASSWORD="your_password"
export RDS_PASSWORD="your_rds_password"
```
The config file will automatically expand `${VARIABLE_NAME}` references.
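For illustration, `${VARIABLE_NAME}` expansion can be sketched like this (a minimal example, not the tool's actual implementation):

```python
import os
import re

def expand_env_vars(text: str) -> str:
    """Replace ${VAR} references with values from the environment.

    Unset variables are left untouched so missing credentials are
    easy to spot in the rendered config.
    """
    pattern = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}")
    return pattern.sub(lambda m: os.environ.get(m.group(1), m.group(0)), text)

os.environ["DB_PASSWORD"] = "s3cret"
print(expand_env_vars("password: ${DB_PASSWORD}"))  # password: s3cret
```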
Edit your `postgresql.conf` (commonly at `/etc/postgresql/*/main/postgresql.conf` or `/usr/local/var/postgres/postgresql.conf`):

```ini
logging_collector = on
log_directory = 'log'                  # or an absolute path
log_filename = 'postgresql-%Y-%m-%d_%H%M%S.log'
log_min_duration_statement = 1000      # log queries slower than 1 second (adjust as needed)
log_statement = 'none'
log_duration = off
log_line_prefix = '%m [%p] %u@%d '
log_destination = 'stderr'
```
Restart PostgreSQL after making changes:
```bash
sudo systemctl restart postgresql
# or
pg_ctl restart
```

Alternatively, change the threshold for the current session without a restart:

```sql
SET log_min_duration_statement = 1000;
```
Check the `log_directory` you set above. Common locations:

- `/var/log/postgresql/`
- `/usr/local/var/postgres/log/`
- `/opt/homebrew/var/postgresql@*/log/`

Run slow queries in your database to generate log entries. For example:

```sql
SELECT * FROM employees ORDER BY random() LIMIT 10000;
SELECT COUNT(*) FROM sales WHERE amount > 1000;
```
Copy the relevant log file to your project directory's `sample_logs/` folder (create it if it doesn't exist). For example:

```bash
mkdir -p sample_logs
cp /var/log/postgresql/postgresql-2025-10-31_*.log ./sample_logs/
```
Sample log file: `sample_logs/postgresql-2025-10-31_122408.log.txt` (plain text, multi-line queries supported)

```bash
# With Poetry (recommended)
poetry run python -m iqtoolkit_analyzer sample_logs/postgresql-2025-10-31_122408.log.txt

# Traditional approach
python -m iqtoolkit_analyzer sample_logs/postgresql-2025-10-31_122408.log.txt
```

For more detail during analysis, add `--verbose`:

```bash
# With Poetry (recommended)
poetry run python -m iqtoolkit_analyzer sample_logs/postgresql-2025-10-31_122408.log.txt --verbose

# Traditional approach
python -m iqtoolkit_analyzer sample_logs/postgresql-2025-10-31_122408.log.txt --verbose
```

To write the report to a specific file, add `--output`:

```bash
# With Poetry (recommended)
poetry run python -m iqtoolkit_analyzer sample_logs/postgresql-2025-10-31_122408.log.txt --output reports/my_report.md

# Traditional approach
python -m iqtoolkit_analyzer sample_logs/postgresql-2025-10-31_122408.log.txt --output reports/my_report.md
```
If you already have an EXPLAIN JSON file (e.g., exported from psql or a saved plan), you can skip log parsing and analyze it directly:
```bash
# With Poetry
poetry run python -m iqtoolkit_analyzer postgresql --plan slow_query_plan.json --output reports/explain_report.md

# Traditional approach
python -m iqtoolkit_analyzer postgresql --plan slow_query_plan.json --output reports/explain_report.md
```
Requirements:

- The plan must be generated with `EXPLAIN (ANALYZE, FORMAT JSON)` and include a top-level `Plan` key.
- `Execution Time` (preferred) or `Actual Total Time` will be used for the duration shown in the report.

IQToolkit Analyzer currently supports the following PostgreSQL log formats:

- Plain-text logs (`log_destination = 'stderr'`)
- CSV logs (`log_destination = 'csvlog'`)

Ensure your logs include durations and statements for accurate analysis.
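For a sense of what a plain-text duration entry looks like with the `log_line_prefix` shown earlier (`'%m [%p] %u@%d '`), here is an illustrative regex — a sketch, not the tool's actual parser:

```python
import re

# Matches duration lines produced with log_line_prefix = '%m [%p] %u@%d '
LOG_RE = re.compile(
    r"^(?P<ts>\S+ \S+ \S+) \[(?P<pid>\d+)\] (?P<user>\w+)@(?P<db>\w+) "
    r"LOG:\s+duration: (?P<ms>[\d.]+) ms\s+statement: (?P<sql>.*)$"
)

line = ("2025-10-31 12:24:08.123 UTC [12345] postgres@demo "
        "LOG:  duration: 1523.456 ms  statement: SELECT count(*) FROM sales;")
m = LOG_RE.match(line)
print(m["ms"], m["db"])  # 1523.456 demo
```

Real logs also contain multi-line statements and non-duration entries, which is why a full parser has to track continuation lines rather than matching line by line.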
You can generate a Markdown report summarizing top slow queries, their durations, frequency, and AI-powered recommendations. See the main README for a detailed sample report.
- Make sure `log_min_duration_statement` is set appropriately (e.g., `1000` for 1 second).
- Use `--verbose` to get more details during analysis.