When Design Deviates
Behavior is often shaped less by ethics than by the defaults, friction, and constraints embedded in systems.
The Pilates studio I attend recently changed how lockers are used.
Until then, when I entered the studio, I would greet the instructors, scan my membership card, and pick up a locker key from the desk in front of them. It was a small ritual—brief, visible, and mildly inconvenient in a way that felt normal.
The new system involved only a subtle change. Now I simply scan my card and walk straight into the locker room, where the keys are already inserted into the lockers. I can choose any locker I want.
At first, this didn’t seem to make any difference at all. It saved a bit of attention and labor—one less step between the desk and the locker room. It felt like a reasonable improvement.
But not long after, I noticed something unexpected. Under this new system, some people began using two lockers instead of one. I had never imagined this would happen, but it was clearly not prohibited. The structure simply allowed it.
My first reaction was judgment. I remember thinking that this wasn’t a very considerate way to use a shared space, and that if a system permits it, people will sometimes act in the worst possible way.
Then winter arrived. Coats became thicker, bags bulkier, and one day I found myself struggling to fit everything into a single locker. For the first time, I felt the impulse to use two.
That was the moment the problem became clear to me. The issue was not a sudden decline in individual morality. The lockers themselves were simply too small. What looked like selfish behavior was, in fact, a predictable response to a structure that no longer matched people’s needs.
The failure was not in the people, but in the system.
This kind of system failure is not uncommon. I see it often in subway stations. At one station I use regularly, an escalator leads directly onto the platform, but a pillar stands immediately at the exit. People rushing to board or get off the train cannot see each other in time, and collisions happen constantly unless everyone makes a conscious effort to avoid them.
The cause is obvious. The pillar is almost certainly there because of structural constraints, not because anyone intended to create danger. Still, the design repeatedly produces the same risky behavior. People blame carelessness or impatience, but the problem is embedded in the layout itself. The system keeps manufacturing collisions.
In both cases, the instinct to judge comes quickly. Someone is inconsiderate. Someone is careless. But when I found myself wanting a second locker, or instinctively bracing for impact at the end of an escalator, the explanation felt much simpler. People adapt. They respond to what a system quietly allows.
It is striking how easily we mistake these outcomes for human nature. Small structural decisions—locker size, pillar placement, default settings—shape behavior long before morality enters the picture.
People do not suddenly change. Systems do. And we live with the consequences of that difference.