By Jennifer Puccio
As Americans pore over the latest release of Epstein-related documents – pages detailing exploitation, abuse and institutional failure – millions of people are consuming the same disturbing material.
This shared attention feels unusual, even unsettling. The last time the country truly watched or read the same thing together was decades ago, when there were three television channels and a nightly news anchor.
Then, the shared experience often unified. Today, it exhausts.
This isn’t only about content. It’s about synchronization. When large populations deal with emotionally charged material simultaneously, emotional responses align. Stress signals multiply. Nervous systems built for episodic threat and recovery are pushed into sustained activation to remain engaged. What emerges is something many people struggle to articulate but can feel: a heavier emotional atmosphere, a low-grade sense of unease, a narrowing of patience and trust.
This phenomenon isn’t mystical.
It’s biological.
Modern neuroscience, psychology, and sociology all point to the same mechanisms: vicarious trauma from observing the suffering of others, emotional contagion from content designed to influence, and moral injury, the lasting distress of witnessing acts that violate deeply held moral beliefs.
When people are repeatedly exposed to narratives of harm, empathy systems activate alongside threat systems. Without containment or closure, those responses begin to fire even when nothing directly triggers them, and over time the culture's baseline of what feels normal drifts downward.
Some people notice this shift earlier than others. Often, they are people with experience in caregiving, community work, or systems thinking, accustomed to tracking morale and cohesion rather than headlines. They recognize the signature of eroded trust and shared exhaustion because they’ve seen it at smaller scales.
What makes this moment distinct is not only what we are consuming but how. Declassified CIA documents from the 1970s and '80s describe studies of biofeedback, stress regulation, and meditation, pursued for strategic reasons. The question was how human systems stabilize under pressure and, implicitly, how they destabilize.
Emotional synchronization works both ways. Calm reproduces itself and spreads through groups. Fear does too.
For much of the Cold War, psychological destabilization techniques were aimed outward, targeting foreign populations. Ethical firewalls, at least on paper, kept these tools from being turned inward.
But with the rise of the algorithmic platform, particularly on social media, the distinction between foreign and domestic influence has collapsed. Attention has become monetized, outrage a growth strategy.
And destabilizing dynamics no longer require a central actor but are produced by the system itself.
Late last month, in the same week, a New Mexico jury found Meta (Facebook) liable for creating a “breeding ground” for child predators, and a California jury found that Meta and YouTube had knowingly designed addictive platforms, failed to warn parents and users of the risks, and harmed a young woman’s mental health. Both companies face hundreds more cases. Meta and YouTube plan to appeal.
As Tristan Harris, former Google design ethicist and co-founder of the Center for Humane Technology, noted on The Diary of a CEO podcast, a small design choice – the decision to ping phones for every new Gmail message – marked the beginning of a new era. Attention stopped being something humans offered and became something technology took. Interruption became the default. Silence disappeared.
Then, the information environment accelerated faster than the human nervous system could adapt. We entered a world without mental health safety standards, the cognitive equivalent of the early industrial era before labor laws or weekends existed.
Every farmer, mechanic, nurse, and teacher understands that safety standards exist because human beings have limits. The information environment should be no different.
For a long time, we’ve lacked plain language for the harm we’re experiencing; terms like “doomscrolling” and “attention overload” only gesture at it. The burden fell on individuals, who blamed themselves for feeling overwhelmed by systems designed to overwhelm them.
The result is a population stuck in partial threat mode: reactive, polarized, morally exhausted, and increasingly convinced that the problem lies in one another rather than in the environment shaping everyone’s behavior.
None of this requires a grand conspiracy. Algorithms don’t need ideology to divide us; they only need incentives aligned with arousal. And we do not evolve on quarterly earnings cycles. Our brains are ancient, relational and rhythm-dependent.
Industrial reform succeeded by redesigning environments to fit human limitations, and the same approach applies now. National public awareness campaigns, healthier default settings on platforms, and education about nervous system regulation would be a start.
Most importantly, the cultural narrative must shift from “stay informed at all costs” to “a regulated society is a more competent society.”
We are not broken. We are overloaded.
And recognizing the mismatch between ancient biology and modern velocity is not pessimism. It’s the first step every reform movement has taken before meaningful change became possible.
Sometimes, progress begins not with answers, but with a collective pause and a shared sentence:
This pace is not sustainable.
– Jennifer Puccio lives in Sweet Home, has a degree in International Relations and currently studies environmental science at Linn-Benton Community College. She notes that she used AI to edit this piece to demonstrate how “AI is a tool that can be used in everyday life, and people can see what that looks like in practice.”