Sharing incident reports feels like transparency, but it has a peculiar side effect. It compresses complex, layered situations into a sequence of facts that look cleaner than they ever were in reality.

An incident report reads as if the event unfolded along a single, understandable line:

Time stamps.
Actions.
Conditions.
Outcomes.

But the actual event was never like that.

It was parallel.

Multiple small degradations, each tolerable on its own, aligning in time.

A valve slightly slow.
An indication slightly off.
A procedure interpreted one way instead of another.
A decision made under a different mental model than the designer assumed.

None of these would justify a report. Together, they do.

The problem is that when we share reports, we tend to extract lessons as if there had been a clear decision point where things could have been “done right.” We look for the hinge—the moment where the outcome could have flipped.

But often there is no hinge. Only accumulation.

And accumulation is hard to teach.

Reports are still valuable. They carry the texture of real systems under stress. They expose how work is actually done, not how it is described. They remind us that procedures are interpreted, not executed, and that calculations are themselves human actions, not neutral ground.

But they are incomplete in a particular way.

They rarely convey how normal everything felt until it didn’t. They rarely capture the absence of signal—the fact that nothing clearly announced that the system had crossed a line.

So when we share incident reports, the task is not just to extract recommendations.

We should try to convey the ambiguity.

To resist the urge to clean the story into something that looks preventable in hindsight. To show how easily ordinary conditions can align into something else.

Otherwise, we teach the wrong lesson.

We teach that accidents come from identifiable mistakes.

When more often, they come from alignment.

***

In "The China Syndrome", there is that small, frustrating exchange at the post-incident hearing.

Shift supervisor Jack Godell is asked why the operators did not look at the other meter.

He answers, simply: “I don’t know. I didn’t either.”

It lands badly, especially if you have ever sat in a control room.

Because the question itself is wrong.

It assumes that safe operation is a matter of vigilance—of people scanning, cross-checking, second-guessing every indication in real time. That if only someone had looked one meter further, the chain would have broken.

But that is not the job.

An operator is not there to hunt inconsistencies across a wall of instruments. If that is required, the system has already failed in its design. Cross-checking is not a human fallback—it is a system function. It belongs in logic, in validation, in alarms that present disagreement as a single, unmissable condition.

Otherwise, you are asking a person to solve a combinatorial problem under time pressure, with no clear stopping rule.

And they will not. No one would.
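
In design terms, the idea is almost mundane. Here is a minimal sketch in Python, with invented channel names, readings, and tolerance, nothing drawn from any real plant: the comparison lives in logic, and disagreement arrives as one named condition.

```python
# Cross-checking as a system function: the system compares redundant
# channels and presents disagreement as a single, unmissable condition.
# All names, values, and thresholds here are illustrative.

from dataclasses import dataclass


@dataclass
class Reading:
    channel: str
    value: float
    valid: bool  # e.g., the channel's own self-diagnostics passed


def cross_check(a: Reading, b: Reading, tolerance: float) -> str | None:
    """Return one named alarm condition if the channels disagree."""
    if not a.valid:
        return f"CHANNEL FAULT: {a.channel}"
    if not b.valid:
        return f"CHANNEL FAULT: {b.channel}"
    if abs(a.value - b.value) > tolerance:
        return f"DISAGREEMENT: {a.channel}={a.value} vs {b.channel}={b.value}"
    return None  # agreement is the quiet default, not something to infer


# Roughly the film's situation: two level indications that should agree.
alarm = cross_check(
    Reading("level_A", 92.0, True),
    Reading("level_B", 61.5, True),
    tolerance=5.0,
)
if alarm:
    print(alarm)
```

The operator never scans for the mismatch. The mismatch announces itself.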

The frustration comes from that mismatch: the outsider’s belief that more attention would have saved the situation, and the operator’s lived reality that attention is already saturated just keeping the plant in a coherent state.

We have not suffered from a lack of data in a long time.

We suffer from too much of it, presented without structure.

Clutter does not just obscure—it creates false choices. It invites you to look somewhere else, just when you most need to trust what is already in front of you. And once you start chasing, you are no longer controlling. You are searching.

Good systems do not ask for heroics.

They reduce the space of doubt.

If two measurements matter, their agreement should be the signal—not something the operator has to infer. If a deviation matters, it should arrive already interpreted, already prioritized, already tied to action.

Otherwise, we end up blaming the person for not doing what the system never made possible.
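
A sketch of the same idea for deviations, again with invented thresholds, wording, and actions: the alarm carries its interpretation, its priority, and the response it is tied to, so what reaches the operator is a decision aid, not a raw number.

```python
# A deviation that arrives already interpreted, prioritized, and tied
# to action. Thresholds, wording, and actions are invented for
# illustration; they do not describe any real procedure.

from dataclasses import dataclass
from enum import Enum


class Priority(Enum):
    ADVISORY = 1
    WARNING = 2
    ACT_NOW = 3


@dataclass
class Alarm:
    meaning: str        # the deviation, stated in operational terms
    priority: Priority  # already ranked against everything else
    action: str         # the response it is already tied to


def interpret(level_pct: float) -> Alarm | None:
    if level_pct < 20.0:
        return Alarm("Vessel level critically low", Priority.ACT_NOW,
                     "Start auxiliary feed; verify valve lineup")
    if level_pct < 35.0:
        return Alarm("Vessel level trending low", Priority.WARNING,
                     "Check feed flow against demand")
    return None  # no deviation worth the operator's attention


alarm = interpret(18.0)
if alarm:
    print(f"[{alarm.priority.name}] {alarm.meaning} -> {alarm.action}")
```

Nothing sophisticated. The interpretation has simply moved out of the operator's head and into the system, ahead of the moment of stress.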

The line in the film endures because it is honest.

Not careless—honest.

And perhaps the quiet hope is that we have learned something since then.

Not to collect more data.

But to decide, with discipline, what deserves to be seen at all.