Decision Mastery · Learn & Calibrate

Confidence Calibration

Say this

How often am I right when I feel this sure?

Do this now · 5 min

Review your last 5 predictions or major expectations (from your forecast log or decision journal). For each, note whether the outcome matched your confidence level. If your '80% confident' calls are only right 50% of the time, you have a calibration problem — and now you know its size.

Use when

You've accumulated enough tracked predictions to spot patterns — monthly is ideal.

Avoid when

You haven't been tracking predictions yet. Start with Forecast Log first.


Why it works

Unmeasured confidence drifts away from accuracy. Tracking predictions exposes the gap between how sure you feel and how often you are right.

Calibrated confidence means your stated odds match reality: 80% confidence should be right roughly 80% of the time, and 50% should behave like a coin flip. Confidence is often too high because the gap is invisible without records. The answer is not vague humility. Measure the gap and adjust. Write the prediction, assign odds, and check the result later. Over time you learn what 60%, 75%, and 90% actually feel like in your own judgement instead of treating confidence as a mood.
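The check described above is easy to automate once your forecast log is in a machine-readable form. Below is a minimal sketch, assuming a hypothetical log of (stated confidence, outcome) pairs; it groups calls by confidence level and reports the gap between how sure you said you were and how often you were right.

```python
from collections import defaultdict

# Hypothetical forecast log: (stated confidence, did it come true?).
# In practice, these pairs come from your own decision journal.
log = [
    (0.8, True), (0.8, True), (0.8, False), (0.8, False), (0.8, True),
    (0.6, True), (0.6, False), (0.6, False),
    (0.9, True), (0.9, True),
]

# Group outcomes by the confidence level you stated at prediction time.
buckets = defaultdict(list)
for confidence, correct in log:
    buckets[confidence].append(correct)

# For each confidence level, compare stated odds with the actual hit rate.
for confidence in sorted(buckets):
    outcomes = buckets[confidence]
    hit_rate = sum(outcomes) / len(outcomes)
    gap = hit_rate - confidence
    print(f"stated {confidence:.0%}: right {hit_rate:.0%} "
          f"of {len(outcomes)} calls (gap {gap:+.0%})")
```

A positive gap means you were underconfident at that level, a negative gap overconfident; with a log this small the gaps are noisy, which is why the card suggests monthly review once enough predictions have accumulated.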


Go deeper · 8 min read
Calibration: The Meta-Skill Behind Every Decision Skill
You can't fix overconfidence by willing yourself to be less confident. You fix it by measuring the gap between what you believe and what's true — and that requires data most people have never collected.