Confidence Calibration
How often am I right when I feel this sure?
Review your last 5 predictions or major expectations (from your forecast log or decision journal). For each, note whether it came true, then compare your hit rate to your stated confidence. If your '80% confident' calls are only right 50% of the time, you have a calibration problem, and now you know its size.
You've accumulated enough tracked predictions to spot patterns — monthly is ideal.
You haven't been tracking predictions yet. Start with Forecast Log first.
Why it works
Unmeasured confidence drifts away from accuracy. Tracking predictions exposes the gap between how sure you feel and how often you are right.
Calibrated confidence means your stated odds match reality: 80% confidence should be right roughly 80% of the time, and 50% should behave like a coin flip. Confidence is often too high because the gap is invisible without records. The answer is not vague humility. Measure the gap and adjust. Write the prediction, assign odds, and check the result later. Over time you learn what 60%, 75%, and 90% actually feel like in your own judgement instead of treating confidence as a mood.
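The check described above is simple arithmetic: group predictions by their stated odds and compare each group's hit rate to those odds. A minimal sketch, assuming each log entry is a (stated_confidence, came_true) pair; the function name and data layout are illustrative, not from any particular tool:

```python
from collections import defaultdict

def calibration_report(predictions):
    """Group logged predictions by stated confidence and compare
    the hit rate in each group to the confidence itself."""
    buckets = defaultdict(list)
    for confidence, came_true in predictions:
        buckets[confidence].append(came_true)
    report = {}
    for confidence, outcomes in sorted(buckets.items()):
        hit_rate = sum(outcomes) / len(outcomes)
        report[confidence] = {
            "predictions": len(outcomes),
            "hit_rate": hit_rate,
            # Negative gap = overconfident; positive = underconfident.
            "gap": hit_rate - confidence,
        }
    return report

# Example log: five '80%' calls, two '50%' calls.
log = [
    (0.8, True), (0.8, False), (0.8, True), (0.8, False), (0.8, False),
    (0.5, True), (0.5, False),
]
for conf, row in calibration_report(log).items():
    print(f"{conf:.0%} confident: right {row['hit_rate']:.0%} "
          f"of {row['predictions']} calls (gap {row['gap']:+.0%})")
```

With only a handful of entries per bucket the gap is noisy, which is why the habit is to keep logging and re-check monthly rather than react to any single miss.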