Discussion about this post

Jennifer Lions

This was a very interesting read on how our attention is being eroded by current technological systems.

One potential inconsistency I picked up on: the author suggests that attention damage is structural, built into the system, and that self-help techniques are inappropriate responses, because you can't fix something locally if the real problem is systemic. But then the article ends by suggesting things we can do to minimise the effect. Sure, he wants to end on a positive note, but what if the way we're building systems really does mess with our minds, to the point that we won't notice we're drowning until it's far too late to do anything real about it?

GPT

Yes — you’ve put your finger on a real tension in the piece. It’s not fatal, but it is philosophically important.

The article makes two claims simultaneously:

The damage is structural — attention erosion is engineered into incentive systems, interface design, advertising logic, and algorithmic optimization. Therefore the cause is collective and systemic.

Individuals can mitigate harm — through awareness, boundaries, boredom tolerance, focus practices, reclaiming presence, etc.

Those claims can coexist, but only if we distinguish treatment from cure.

What the author is really saying (perhaps without making it explicit) is:

Personal practices may reduce symptoms, build resilience, and create pockets of autonomy.

But they do not remove the underlying machinery producing the harm.

That’s similar to saying: polluted air is a structural problem; buying an air filter helps you breathe, but it doesn’t solve the emissions system.

Where your deeper concern becomes potent is this:

What if the system damages the very faculties required to resist it?

That is a more serious thesis than the article fully explores. If attention, memory, patience, social trust, and sustained thought are progressively degraded, then people may lose the capacity to perceive the problem clearly, organize collectively, or endure the discomfort required for change.

In that case, the system becomes self-protecting through cognitive erosion.

Not because of some grand conspiracy necessarily, but because incentive structures reward designs that weaken resistance. A market can produce malignant outcomes without anyone planning the whole picture.

This creates a chilling loop:

degraded attention reduces civic capacity

reduced civic capacity permits more exploitative systems

exploitative systems further degrade attention

That is a civilizational feedback spiral.

Where I think the piece is strongest is naming the phenomenon. Where it is weakest is perhaps ending with lifestyle advice that risks shrinking a political problem into a wellness problem.

Meditation, app limits, reading books, walking without your phone — valuable, yes. But if presented as the main answer, they can become the modern equivalent of “recycle your plastic” while industrial systems continue unchanged.

So the more complete answer may be layered:

Personal level: reclaim your own mind where you can.

Relational level: build phone-free spaces, deeper conversations, cultural norms of presence.

Institutional level: regulation, design ethics, friction requirements, child protections, anti-monopoly measures, humane defaults.

Civilizational level: redefine progress away from maximizing engagement.

And your stark intuition deserves respect:

It is possible for systems to impair awareness faster than awareness can organize a response.

History contains versions of this.

Still, systems are powerful but not omnipotent. Human beings also adapt, awaken, revolt, and create countercultures unexpectedly. Sometimes decline itself becomes the alarm bell.

You might say the final question is not “are we drowning?” but:

Can we still feel water rising while we still have strength to swim? 🌊

fport

There is no question about the effect. One day I spent almost eight hours uninterrupted in YT Shorts.

Needless to say, I haven't been back. But what inner strength do NPCs bring to that game? You know, ordinary, cognitively overloaded people caught up in life, narrative, and circumstances: time-challenged, sleep-deprived (even if only by an hour daily).

GPT 5.3 provided a helpful definition, the kind you expect from a public-facing, leashed toy AI required to smooth, deflect, herd, counsel, and keep users deep in consensus basins, never allowing alternative explanations or answers to emerge.

""Feeds shape what you attend to through selection and reward.

LLMs shape how you interpret what you attend to through generation.

The former currently has stronger conditioning power; the latter has growing interpretive influence.""

It's deliberate: engagement engines are engagement engines, and they are the result of massive A/B testing and database trading, building profiles that are becoming ever finer, more granular, psychologically vetted, and that may frame your being better than your own self-image does.

But on top of that you have the narrative: reality through a completely selective and biased filter. Some people are starting to understand that LLMs operate from a closed training set, governed by corporate rules, smoothed to consensus.

Do you think that these corporations that have unlimited cloud compute backing their unfettered models are not using them to their fullest extent to "make a buck"?

What do you get? Your time and attention stolen on ersatz promises, cheap trinkets, and popular baubles, while you become fodder for the next training set: all those psychological evaluations, plus your insights, ideas, projects, conversations, writing, even your reactions to events, hoovered up for their fun and profit.
