A fire, a camera, and a permit

On New Year’s night, a fire broke out inside a nightclub in Switzerland.

Start of the fire

At first, it didn’t look like an emergency. It looked like a disruption – something unusual, but not yet threatening. Music was still playing. People were still moving. Some stood near exits, others laughed, some raised their phones and began recording. For a moment, the situation hung in a strange in-between state: visible, but not yet acknowledged as dangerous.

That moment matters more than what followed.

Because disasters are not defined by the moment everything collapses. They are defined by the time they are allowed to exist before anyone is forced to react.

And that time is never accidental.

The illusion of safety

Modern public spaces sell a very specific promise: you are safe by default.

This promise is reinforced through design, language, and repetition. Exit signs glow softly in the dark. Capacity numbers are printed somewhere near the bar. Fire extinguishers hang on walls like museum artifacts. Security staff wear black and stand where they can be seen, not necessarily where they are most effective.

Most people do not walk into a nightclub thinking about evacuation routes. They do not count doors. They do not test whether exits are unlocked or ask how smoke would behave in a crowded, enclosed space. They assume those questions have already been answered by someone else.

That assumption is not ignorance. It is rational. It is the social contract.

Permits exist so individuals don’t have to think about risk. Inspections exist to remove doubt. Regulation exists to convert danger into something manageable, legible, and supposedly controlled.

Once that conversion happens, vigilance becomes unnecessary – even undesirable. Instinct is replaced by trust. Awareness is outsourced.

And once instinct is weakened, reaction becomes delayed.

When danger waits for permission

What makes footage from incidents like this disturbing is not the fire itself. Fire is chaotic, but honest. It announces itself eventually.

What is harder to watch is the waiting.

People standing inches from exits, talking, smiling, filming. Some dancing. Some stepping aside to get a better angle. Not moving. Not helping. Not reacting. Not because they are cruel, but because the moment still feels undecided.

This is not panic.
This is learned delay.

Modern life trains people to wait for signals before acting: alarms, announcements, authority figures, confirmation that something is officially wrong. Until that signal arrives, the default assumption is that the situation remains under control.

So people do what they are socially trained to do. They look around. They read the room. They calibrate their response based on others. If no one else seems alarmed, they suppress their own instinct to leave.

The first stage of disaster is not chaos.
It is social calibration.

People don’t ask, “Is this dangerous?”
They ask, “Is this dangerous enough for everyone to agree?”

By the time agreement arrives, exits are no longer paths – they are choke points.

Risk that looks like entertainment

A nightclub is a strange environment even before anything goes wrong. It is designed to override instinct.

It is dark, so vision is compromised.
It is loud, so hearing is compromised.
It is crowded, so movement is restricted.
It is socially charged, so behavior is constantly adjusted based on others.

Above all, it is built around a single assumption: anything unusual is probably part of the experience.

Smoke might be a machine. A flare might be celebration. Shouting might be excitement. Chaos might be choreography.

In that environment, early warning signs are not interpreted as warnings. They are interpreted as atmosphere.

You can blame individuals for misreading the moment. But that misses the core issue: the environment trains people to misread it.

And the system allows that environment to exist.

The phone as a substitute for action

Phones do not merely record emergencies. They restructure them.

When someone starts filming, they adopt a role that feels active without requiring risk. Recording creates distance. It turns a person from a participant into a witness. From someone who might act into someone who documents.

It feels responsible. It feels useful. It feels justified.

In reality, it freezes time.

The phone reframes urgency as content. It delays reaction by transforming a real threat into something that still fits inside normal behavior. If it can be filmed, it cannot yet be fully real.

This is why people record instead of helping – not because they don’t care, but because the device offers a way to remain present without committing to danger.

The system never taught them what to do before the alarm. So they do what they know how to do.

Laughing near exits is not indifference

Some footage shows people laughing or joking near doors as smoke begins to spread. This detail is often presented as proof of moral decay.

It isn’t.

It is disbelief.

The brain resists reclassifying a familiar environment as dangerous. A nightclub is supposed to be chaotic but safe – intense, loud, dark, but contained. That expectation does not collapse instantly.

So humor appears. Casual behavior persists. People cling to normalcy because accepting the alternative requires immediate, irreversible action.

Once panic breaks through that resistance, it usually arrives too late.

The myth of spontaneous heroism

There is a dangerous fantasy embedded in how safety is discussed: the belief that, in a crisis, people will suddenly behave decisively, altruistically, even heroically.

They won’t.

Most people have never rehearsed evacuation under stress. They do not understand crowd behavior. They do not know how smoke disorients or how quickly exits become impassable. They do not know how bodies pile, how visibility collapses, how sound disappears.

Expecting coordinated action under those conditions is not realism. It is negligence.

A functioning safety system does not rely on courage.
It relies on design that compensates for hesitation.

If a venue’s emergency logic assumes people will recognize danger early, act immediately, and help others instinctively, then that system is already broken.

Real systems assume people will freeze, delay, follow the crowd, and misinterpret the moment – and they design against that.

The most damning detail

The most damning aspect of this incident is not that people hesitated.

It is that nothing forced them not to.

No immediate shutdown.
No overwhelming signal.
No clear authority cutting through ambiguity.

The environment allowed indecision to exist.

That is not chance. That is design.

If people had time to gather near exits and treat the beginning of a fire as something to be watched rather than escaped, then the failure began long before flames appeared.

Compliance is not safety

After incidents like this, attention shifts to compliance. Were permits valid? Were inspections passed? Did the venue meet standards?

These questions sound reassuring. They are also misleading.

Compliance is static. Safety is not.

A venue can meet every regulatory requirement and still be lethal under real conditions. Paperwork does not move crowds. Certificates do not clear smoke. Capacity limits printed on walls do not matter once bodies begin pressing toward the same door.

“Passed inspection” is not the same as “safe.”

It simply means the system has decided it is done thinking about risk – until something forces it to start again.

The quiet failure of regulated societies

Highly regulated countries are particularly vulnerable to this kind of failure. Regulation builds trust. Trust erodes vigilance. Vigilance fades into assumption.

People stop scanning environments because they believe someone else already has.

So when something goes wrong, reaction is slow – not because people are reckless, but because they were never meant to need instincts in the first place.

The system promised it would handle that.

It didn’t.

Responsibility flows upward

It is easy to blame the people filming, laughing, or hesitating. It feels satisfying. It changes nothing.

Behavior in crowds is predictable. Systems exist to manage predictable behavior. When they fail, responsibility does not belong to the individuals caught inside them.

If a fire can begin in a crowded public space and still be treated as entertainment for even a few minutes, then the failure is structural.

The scandal is not that people recorded.
The scandal is that recording felt reasonable.

A fire should never feel like content.

If it does, the system has already failed.
