# OracleTrace — Python Performance Profiler
<table><tr> <td><img src="https://raw.githubusercontent.com/KaykCaputo/oracletrace/master/oracletracecat.png" alt="OracleTrace Logo"/></td> <td>Detect Python performance regressions and compare execution traces with a lightweight call graph profiler.
OracleTrace is a lightweight Python performance analysis tool designed to help developers detect performance regressions, compare execution traces, and visualize call graphs in a simple and readable way.
</td> </tr></table>

It is ideal for:
- Detecting performance regressions between script versions
- Comparing execution time across runs
- Visualizing function call graphs
- Lightweight profiling without heavy instrumentation
- CI performance validation
Documentation: https://kaykcaputo.github.io/oracletrace/
## Why OracleTrace?
Performance regressions in Python projects are often hard to detect early.
Traditional profilers focus on deep performance analysis, but they are not optimized for quick regression comparison between two executions.
OracleTrace solves this by allowing you to:
- Run a script and generate an execution trace
- Export results to JSON
- Compare two trace files
- Identify performance differences
- Detect new or removed functions
- Measure execution time deltas
## Key Features

### Performance Regression Detection
Compare two JSON trace files and instantly see:
- Slower functions
- Faster functions
- New functions
- Removed functions
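A comparison like this can also be scripted directly against two exported traces. The sketch below is an illustration only: the flat `{function: {"total_time": seconds, "calls": n}}` schema and the `compare_traces` helper are assumptions, not OracleTrace's documented export format.

```python
import json

def compare_traces(baseline_path, new_path, threshold=0.10):
    """Bucket functions into slower/faster/new/removed between two traces.

    Assumes each trace is a JSON object mapping function names to
    {"total_time": seconds, "calls": n} (illustrative schema).
    """
    with open(baseline_path) as f:
        baseline = json.load(f)
    with open(new_path) as f:
        new = json.load(f)

    report = {"slower": [], "faster": [], "new": [], "removed": []}
    for name, stats in new.items():
        if name not in baseline:
            report["new"].append(name)
            continue
        delta = stats["total_time"] - baseline[name]["total_time"]
        # Only flag changes larger than `threshold` (10% by default)
        # relative to the baseline, to ignore timing noise.
        if delta > threshold * baseline[name]["total_time"]:
            report["slower"].append((name, delta))
        elif delta < -threshold * baseline[name]["total_time"]:
            report["faster"].append((name, delta))
    report["removed"] = [n for n in baseline if n not in new]
    return report
```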
### Execution Trace Analysis
- Total execution time per function
- Average time per call
- Call counts
- Caller → callee relationships
### Call Graph Visualization
Visual tree structure of your program’s execution flow.
### JSON Export
Export trace results for:
- CI performance checks
- Historical comparison
- Automation pipelines
### Clean Output
Filters internal Python calls to focus only on your project code.
## Installation

```bash
pip install oracletrace
```
## Quick Example
### Step 1 — Create a script

```python
import time


def process_data():
    time.sleep(0.1)
    calculate_results()


def calculate_results():
    time.sleep(0.2)


def main():
    for _ in range(2):
        process_data()


if __name__ == "__main__":
    main()
```
### Step 2 — Run OracleTrace

```bash
oracletrace my_app.py
```
### Export trace to JSON

```bash
oracletrace my_app.py --json baseline.json
```
### Compare with a new version

```bash
oracletrace my_app.py --json new.json --compare baseline.json
```
This allows you to detect performance regressions between two executions.
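In a pipeline, that comparison can be turned into a pass/fail gate. A minimal sketch, assuming the same illustrative flat `{function: {"total_time": seconds}}` trace schema (not OracleTrace's documented format) and a hypothetical `ci_gate` helper:

```python
import json

def ci_gate(baseline_path, new_path, max_growth=0.20):
    """Fail the build when total traced time grows more than max_growth.

    Trace files are assumed (for illustration) to map function names
    to {"total_time": seconds}.
    """
    with open(baseline_path) as f:
        base_total = sum(s["total_time"] for s in json.load(f).values())
    with open(new_path) as f:
        new_total = sum(s["total_time"] for s in json.load(f).values())
    growth = (new_total - base_total) / base_total
    if growth > max_growth:
        # Non-zero exit code fails the CI job.
        raise SystemExit(f"Regression: total time grew {growth:.0%}")
    return growth
```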
## How It Works
OracleTrace uses Python’s built-in `sys.setprofile()` hook to intercept `call` and `return` events.
It measures execution time per function and records caller → callee relationships.
By filtering out functions outside your project directory, the output focuses only on relevant application code.
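The mechanism can be sketched in a few lines. This is a minimal illustration of `sys.setprofile()`, not OracleTrace's actual implementation; the `traced` helper and the variable names are invented for the example.

```python
import sys
import time
from collections import defaultdict

totals = defaultdict(float)  # "file:function" -> cumulative seconds
counts = defaultdict(int)    # "file:function" -> number of calls
_stack = []                  # open frames: (name, start timestamp)

def _profiler(frame, event, arg):
    # sys.setprofile delivers "call"/"return" events for Python frames
    # (and "c_call"/"c_return" for C functions, ignored here).
    if event == "call":
        name = f"{frame.f_code.co_filename}:{frame.f_code.co_name}"
        _stack.append((name, time.perf_counter()))
        counts[name] += 1
    elif event == "return" and _stack:
        name, start = _stack.pop()
        totals[name] += time.perf_counter() - start

def traced(func, *args, **kwargs):
    """Run func with the profiler installed, then remove it."""
    sys.setprofile(_profiler)
    try:
        return func(*args, **kwargs)
    finally:
        sys.setprofile(None)
```

Filtering then amounts to dropping any key whose file path lies outside the project directory.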
## Example Output
Summary table showing top functions by total execution time and average time per call.
Call graph visualization displaying execution flow hierarchy.
```text
Starting application...
Iteration 1:
> Processing data...
> Calculating results...
Iteration 2:
> Processing data...
> Calculating results...
Application finished.

Summary:

Top functions by Total Time
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━┳━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━┓
┃ Function                     ┃ Total Time (s) ┃ Calls ┃ Avg. Time/Call (ms) ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━╇━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━┩
│ my_app.py:main               │         0.6025 │     1 │             602.510 │
│ my_app.py:process_data       │         0.6021 │     2 │             301.050 │
│ my_app.py:calculate_results  │         0.4015 │     2 │             200.750 │
└──────────────────────────────┴────────────────┴───────┴─────────────────────┘

Logic Flow:
<module>
└── my_app.py:main (1x, 0.6025s)
    └── my_app.py:process_data (2x, 0.6021s)
        └── my_app.py:calculate_results (2x, 0.4015s)
```
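A tree like the one above can be rendered from plain caller → callee edges. A small sketch (the `render_tree` helper and the edge data are illustrative, not OracleTrace internals):

```python
def render_tree(edges, root, depth=0, lines=None):
    """Render caller -> callee edges as an indented tree, one node per line."""
    lines = [] if lines is None else lines
    # The root has no connector; each deeper level gains one indent step.
    prefix = "" if depth == 0 else "    " * (depth - 1) + "└── "
    lines.append(prefix + root)
    for child in edges.get(root, []):
        render_tree(edges, child, depth + 1, lines)
    return lines
```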
## Use Cases
- Detect Python performance regressions in development
- Compare execution time between versions
- Lightweight alternative to heavy profilers
- CI/CD performance monitoring
- Educational demonstration of call graphs
## Requirements
- Python >= 3.10
- rich
## Contributing
Contributions are welcome.
If you have ideas for improving regression detection, trace comparison, or visualization features, feel free to open an issue or submit a pull request.
## Contributors
Thanks to all the people who have contributed to this project:
<a href="https://github.com/KaykCaputo/oracletrace/graphs/contributors"> <img src="https://contrib.rocks/image?repo=KaykCaputo/oracletrace" /> </a>

## Recognition
- Included in awesome-debugger
## ⭐ Support the Project
If you find OracleTrace useful, give it a ⭐ on GitHub:
👉 https://github.com/KaykCaputo/oracletrace
Your support helps improve the project and makes it more visible to others.