Tripwires and Gradual Drift: Why You Need Pre-Set Triggers to Catch Slow Failure
No single day looks like a crisis. Each small compromise feels reasonable. The drift from the original plan to the current reality happens so gradually that by the time you notice, you're months past the point where action was cheap.
The project was supposed to hit break-even by month nine. At month three, revenue was slightly below projection — 8% under, nothing alarming. At month six, it was 15% under, but costs were also lower than expected, so the gap felt manageable. At month nine, break-even was nowhere in sight, and the conversation shifted from “when will we break even?” to “how do we justify continuing?” Nobody made a bad decision on any given day. The problem was that nobody made a decision at all. The drift was gradual, each increment felt like noise, and the point at which the plan needed revisiting slid past without anyone noticing.
This is the pattern that tripwires are designed to interrupt. Not the dramatic failure — those are hard to miss. The slow, incremental deviation that accumulates invisibly until the original plan and the current reality are in completely different places.
The research
William Samuelson and Richard Zeckhauser documented the underlying bias in a 1988 paper in the Journal of Risk and Uncertainty. They called it status quo bias — the tendency to treat the current state of affairs as the default, requiring active effort to change. In a series of experiments, they showed that people disproportionately favour whatever option is presented as the status quo, even when alternative options are objectively superior. The bias operates through multiple channels: loss aversion (changing risks losing what you have), regret avoidance (active choices that fail feel worse than passive failures), and cognitive ease (not deciding is less effortful than deciding).
In the context of ongoing projects and commitments, status quo bias means that continuation is the default. Stopping, pivoting, or fundamentally restructuring requires an active decision — and active decisions require cognitive energy, social capital, and a willingness to acknowledge that the current path isn’t working. The result is that the threshold for continuing is almost zero (just keep doing what you’re doing), while the threshold for changing is substantial (gather evidence, build a case, persuade affected people, absorb the emotional cost of admitting the plan has failed).
Daniel Simons and Christopher Chabris demonstrated a complementary phenomenon in their famous 1999 study on inattentional blindness, published in Perception. Participants watching a video of people passing a basketball failed to notice a person in a gorilla suit walking through the scene. The finding generalised well beyond basketball videos: when attention is focused on a primary task — executing a plan, meeting daily targets, managing operations — unexpected or gradual changes in the environment become functionally invisible. You don’t see what you’re not looking for, even when it’s right in front of you.
Max Bazerman and Michael Watkins extended this analysis to organisational failures in Predictable Surprises (2004). They studied disasters — financial crises, regulatory failures, industrial accidents — and found a consistent pattern: the warning signs were present, often for months or years, and were seen by individuals within the system. The information existed. The problem was that no mechanism existed to force action on it. Without a trigger — a pre-set condition that compels re-evaluation — warning signs are absorbed into the background noise of normal operations.
The mechanism
Chip and Dan Heath, in Decisive (2013), articulated why tripwires are necessary as a structural intervention rather than a behavioural one. They argued that the human tendency to avoid re-evaluation isn’t primarily a failure of attention or diligence — it’s a consequence of how decisions are structured. Most decisions are made once and then executed indefinitely. The plan has a start date but no built-in review date. The commitment is open-ended by default. Without a pre-set moment that forces the question “should we still be doing this?”, the answer is always “yes” — not because the evidence supports it, but because nobody asked.
The tripwire mechanism works because it externalises the decision to re-evaluate. Instead of relying on someone to notice a problem, raise it, build a case for change, and absorb the social cost of challenging the status quo, the tripwire creates an automatic decision point. When revenue drops below X, when the timeline slips past Y, when the team loses Z — the tripwire fires, and the conversation shifts from “should we even discuss this?” to “the trigger has been hit — what do we do?”
The specificity is essential. A vague intention to “check in if things aren’t going well” isn’t a tripwire — it’s a wish. Vague conditions are subject to the same rationalisation that allows drift in the first place. Each small decline can be explained away. A specific, quantitative threshold — £50,000 in revenue by March 31, three consecutive sprints with scope reduction, a Net Promoter Score below 30 — creates a binary condition that can’t be argued with. It either fired or it didn’t.
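The difference between a wish and a tripwire can be made concrete in a few lines of code. This is a minimal sketch, not a prescribed implementation; the function names, thresholds, and dates are hypothetical, chosen to mirror the examples above:

```python
from datetime import date

# A tripwire is a fixed, binary condition: it either fired or it didn't.
# The thresholds below are illustrative, matching the examples in the text.

def revenue_tripwire(revenue_gbp: float, today: date) -> bool:
    """Fires if revenue is below £50,000 on or after 31 March."""
    deadline = date(2025, 3, 31)  # hypothetical year
    return today >= deadline and revenue_gbp < 50_000

def nps_tripwire(nps: int) -> bool:
    """Fires if Net Promoter Score drops below 30."""
    return nps < 30

# The evaluation is mechanical — there is no room for rationalisation:
print(revenue_tripwire(48_000, date(2025, 4, 1)))  # True: fired
print(nps_tripwire(34))                            # False: did not fire
```

The point of the sketch is that each condition reduces to a boolean. “Things aren’t going well” cannot be expressed this way, which is exactly why it isn’t a tripwire.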
Raymond Nickerson and Marilyn Adams demonstrated in a 1979 study in Cognitive Psychology that people have remarkably poor memory for the details of familiar objects — even objects they interact with daily (their test object was the face of an ordinary penny). The gradual evolution of a familiar situation is processed as continuity, not change. Your project today feels like your project last month, even if the metrics, team morale, and market conditions have all shifted. The tripwire bypasses this perceptual limitation by comparing the current state to a fixed, external standard rather than to your subjective sense of “how things are going.”
A tripwire doesn’t rely on you noticing that something has changed. It fires whether you’ve noticed or not — and that’s precisely the point.
The practical implications
Set the tripwire at the point of commitment, not the point of concern. The time to define what would constitute failure is when you’re clear-headed, optimistic, and uncommitted to a particular narrative about how things will unfold. Once you’re months into execution, the same metrics that would have alarmed you at the outset have been normalised by gradual exposure. The tripwire preserves the judgement of your pre-commitment self and applies it at the moment it’s most needed.
One condition and one deadline is sufficient for most decisions. Complex tripwires with multiple conditions and weighted criteria are harder to track, easier to rationalise around, and less likely to be honoured. A single clear condition — “if we haven’t signed the partnership agreement by April 15” — is easy to evaluate, hard to negotiate with, and forces a binary decision: continue or re-evaluate. Simplicity is a feature, not a limitation.
The hardest part is not setting the tripwire — it’s honouring it. When the condition is met, the natural response is to explain it away: the market shifted, the timing was unusual, we need one more quarter. The tripwire doesn’t mean you must stop. It means you must consciously, deliberately decide to continue — with updated information, revised projections, and an honest assessment of whether the path ahead justifies further investment. Removing the option to sleepwalk past it is the entire mechanism.
The bigger picture
Organisations are littered with commitments that no one explicitly decided to continue. Projects that outlived their rationale. Strategies that stopped making sense but were never formally reviewed. Investments that accumulated costs quarter after quarter because the default was continuation and no one set a trigger for re-evaluation.
The cost of this drift is enormous but invisible, because it manifests as the slow accumulation of resources directed at the wrong things rather than as a single dramatic failure. No alarm sounds. No crisis meeting is called. The organisation simply becomes gradually less effective, its energy spread across commitments that were never tested against the reality they now occupy.
A tripwire is a small act of intellectual honesty committed in advance. It says: “I believe this will work, and here’s the specific condition under which I’ll admit it isn’t working.” It’s the decision-making equivalent of a smoke detector — not something you hope will fire, but something you’re grateful for when it does.
References
- Heath, C., & Heath, D. (2013). Decisive: How to Make Better Choices in Life and Work. Crown Business.
- Samuelson, W., & Zeckhauser, R. (1988). Status quo bias in decision making. Journal of Risk and Uncertainty, 1(1), 7–59.
- Simons, D. J., & Chabris, C. F. (1999). Gorillas in our midst: Sustained inattentional blindness for dynamic events. Perception, 28(9), 1059–1074.
- Bazerman, M. H., & Watkins, M. D. (2004). Predictable Surprises: The Disasters You Should Have Seen Coming, and How to Prevent Them. Harvard Business School Press.
- Nickerson, R. S., & Adams, M. J. (1979). Long-term memory for a common object. Cognitive Psychology, 11(3), 287–307.