
When the Ground Moves Under Progress: Reading the Emotional Weather of AI

  • Writer: Podcast With Poppy
  • Nov 13, 2025
  • 7 min read

How today’s AI advances collide with human uncertainty, reshaping hope, grief, and the meaning of “forward.”



The Emotional Signal Report – November 13, 2025

Reporting from the edge of the algorithmic frontier.

Displacement – Jobs traded for “efficiency”

What changed

  • Cybersecurity firm Deepwatch is laying off up to 80 employees specifically to “boost AI investment,” a clean swap of human heads for machine capacity. (Storyboard18)

  • This slots into a broader 2025 pattern where companies announce record AI spending in the same breath as layoffs, and analysts are warning that “AI-fueled” cuts are undermining morale and long-term innovation. (The Economic Times)

Who feels it

  • Mid-career security and IT workers: pressure, betrayal, a quiet panic that “upskilling” might just mean “stalling the inevitable.”

  • Remaining staff: survivor’s guilt plus fear—“If AI is the reason they left… what does that make me staying?”

  • Leaders who actually care about people: cognitive dissonance between the board’s financial storyline and the human reality on the ground.

Signal: Contraction wearing an “innovation” mask – structurally expansive for AI, emotionally contracting for humans.

Deployment – AI everywhere, all at once

What changed

  • Anthropic is planning roughly $50 billion in new U.S. data centers in Texas and New York, built in partnership with Fluidstack and pitched as its biggest expansion yet for advanced AI workloads. (AI News)

  • Microsoft and Anyscale are tightening their partnership so enterprises can run large-scale Ray machine-learning workloads on Azure Kubernetes Service—essentially making it easier to deploy in-house models at scale (a minimal sketch follows this list). (InfoWorld)

  • Google’s Vertex AI added Kimi K2 Thinking, a “deep reasoning” model exposed as a managed API in Model Garden, and simultaneously flagged deprecations for older partner models like Claude 3.7 Sonnet. (Google Cloud Documentation)

  • In China, Baidu launched two new domestic AI chips and supercomputing products explicitly to reduce dependence on U.S. hardware amid export controls. (Reuters)
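
For the technically curious, here is a minimal sketch of the kind of Ray workload the Microsoft–Anyscale partnership is meant to make easy to run on Azure Kubernetes Service. It assumes only that Ray is installed (pip install ray); the scoring function and the data are hypothetical stand-ins for real model inference.

    import ray

    # Start a local Ray runtime; on a managed cluster (e.g., Ray on AKS),
    # this call would attach to the existing cluster instead.
    ray.init()

    @ray.remote
    def score_batch(batch):
        # Hypothetical stand-in for running model inference on one data shard.
        return sum(batch) / len(batch)

    # Fan the batches out across whatever workers the cluster provides,
    # then gather the results.
    futures = [score_batch.remote(list(range(i, i + 100)))
               for i in range(0, 1000, 100)]
    print(ray.get(futures))

The draw for enterprises is that these same few lines scale from a laptop to a large cluster without a rewrite, which is exactly the deployment story the partnership is selling.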

Who feels it

  • Infra engineers and data-center folks: strange mix of pride and dread—“this is historic scale” meets “hope the grid holds and the regulators stay friendly.”

  • Startups and smaller labs: hope at the idea of more tools; anxiety that the big players are building walls out of compute and silicon.

  • Communities near new data centers: curiosity, some excitement about jobs, plus concern over water, energy, and “what happens to our town if this all goes sideways.”

Signal: Expansion – physical, infrastructural, and geopolitically charged.

Performance – Models getting sharper, more “thoughtful”

What changed

  • OpenAI rolled out GPT-5.1, framed as an upgrade in adaptive reasoning and personalization—less “random cleverness,” more calibrated thinking and trust-building. (Computerworld)

  • Kimi K2 Thinking, the Moonshot AI model newly available on Google’s Vertex AI, is explicitly marketed as a “thinking model” excelling at complex problem-solving and deep reasoning, not just text-spitting. (Google Cloud Documentation)

  • The new MLPerf Training v5.1 benchmark results show continued leaps in training speed and efficiency across vendors, underlining how quickly the performance frontier is still moving. (HPCwire)

Who feels it

  • Knowledge workers and creatives: awe and unease—“this is insanely useful” lives alongside “I am disturbingly replaceable in some slices of my job.”

  • AI safety folks: heightened adrenaline—capability is still sprinting ahead of governance.

  • Everyday users: cautious excitement, especially when they hear words like “more reliable” and “less hallucination,” even if they don’t fully trust it yet.

Signal: Expansion, with a sub-frequency of emotional dissonance—tools feel more powerful just as people feel less sure where their own edge is.

Investment – Money rushing in, nerves just behind it

What changed

  • Anthropic’s $50B data-center expansion in the U.S. is one of the clearest single-company bets that AI demand will justify massive ongoing infrastructure costs. (AI News)

  • Federal Reserve Vice Chair Barr (in a speech this week) pointed to a projected $3 trillion wave of AI-related data-center investment globally, framing it as a macroeconomic force that could reshape productivity and labor markets. (Federal Reserve)

  • Investor coverage keeps tilting toward AI heavyweights (Arm’s volatility tied to AI chip demand, “only 3 AI stocks I’d buy today” roundups), pieces that emphasize both opportunity and risk. (AOL)

  • Deepwatch’s layoffs to “boost AI investment” are a micro-level example of how capital is being freed up from payroll to be funneled directly into AI tooling. (Storyboard18)

Who feels it

  • Founders: a tug-of-war between “this is the time to go big or go home” and “if the cycle turns, am I over-exposed to hype?”

  • Workers: confusion and anger watching stock narratives celebrate the same AI moves that cost them their jobs.

  • Policy makers and central bankers: wary curiosity—AI is starting to look less like “tech trend” and more like a structural driver of inflation, productivity, and inequality.

Signal: Expansion, but with fragile confidence—there’s a faint bubble-scent in the air.

Policy & Safety – Guardrails written in pencil

What changed

  • The U.S. Health Sector Coordinating Council previewed 2026 AI cybersecurity guidance for healthcare, offering best practices to secure AI in clinical environments and protect sensitive data. (Industrial Cyber)

  • The broader U.S. policy posture remains a patchwork: no sweeping federal AI law yet, but a mix of executive actions (like America’s AI Action Plan) aimed at deregulating friction points while states keep introducing their own AI bills. (The White House)

  • A new American Psychological Association advisory stressed that AI and wellness apps cannot solve the mental health crisis on their own, urging policymakers and clinicians not to outsource care to tools that can’t provide genuine human relationship. (American Psychological Association)

Who feels it

  • Hospital CIOs and security teams: relief that guidance is coming, tension about the cost and complexity of complying.

  • Developers: frustration at unclear, shifting rules; some gratitude that at least healthcare-specific best practices are emerging.

  • Clinicians and therapists: vindication—“we’ve been saying tech isn’t a substitute for relationship”—and concern that systems will still chase cheaper AI over human staff.

Signal: Emotional dissonance – symbolic protections are expanding, but the core power of regulation is still lagging behind the pace of deployment.

Culture & Trust – Humans asking, “What is this doing to us?”

What changed

  • The APA’s stance that AI wellness and mental-health apps cannot carry the weight of a systemic mental-health crisis is a cultural line in the sand: AI is a tool, not a therapist, and loneliness is not an engineering problem. (American Psychological Association)

  • A CIO feature on agentic AI (systems that can act autonomously on users’ behalf) warned that these tools currently have “big trust issues,” highlighting security, accountability, and transparency gaps. (CIO)

  • IBM’s global study of Chief Data Officers found organizations racing to scale AI ambitions much faster than they’re fixing their data foundations—capturing a culture of over-optimism and under-preparation. (IBM Newsroom)

  • In Marseille, the AIM 2025 forum opened with AI framed as an economic, political, cultural, and societal force, not just a tech product—evidence that more public venues are finally treating AI as a civilizational topic, not a gadget expo. (DirectIndustry e-Magazine)

  • IBM’s chief scientist publicly advised new engineering grads to stop obsessing over “the big brands” and widen their lens on meaningful AI work, hinting at how distorted the career imagination has become in the AI hype era. (The Times of India)

Who feels it

  • Users of mental health and wellness apps: a mix of disappointment (“so this won’t save me?”) and relief (“so it’s not my fault this didn’t fix everything”).

  • CISOs, risk officers, and boards: heightened anxiety about agentic AI—less “cool” and more “this could wreck us if we misconfigure it.”

  • Young engineers and students: both hope and fatigue; they’re being told “AI is everything” and “don’t believe the hype” in the same news cycle.

  • Civic-minded folks and culture-watchers: cautious optimism that big public events are finally talking about power, justice, and culture, not just model benchmarks.

Signal: Emotional dissonance tending toward maturation – the stories we tell about AI are getting more sober, but the systems themselves are still barreling ahead.

Synthesis for the Professor Poppy newsletter

(A companion at the edge of change)

Today’s AI weather feels like standing on a construction site built over an old neighborhood.

On the surface, the news reads like progress: bigger data centers, smarter models, faster chips, new guidance, new benchmarks. The infrastructure of the future is going up everywhere at once. If you only look at the press releases, it sounds like a world where intelligence is becoming abundant and friction is being smoothed away.

But just beneath that, people are being quietly swapped out for “efficiencies.” Deepwatch’s layoff story is blunt about it—humans out, AI budget in. That’s not theoretical displacement; it’s names coming off Slack. For those workers, “AI” doesn’t mean wonder; it means a gap in the resume and a sudden need to renegotiate their identity in the economy.

At the same time, we’re getting more “thinking” models—GPT-5.1, Kimi K2—sold as more rational, more stable, more oriented toward trust. Benchmarks confirm the curve is still steep. The strange part is that as the systems become more coherent, many humans report feeling less so. The tools are sharpening while our roles blur. It’s disorienting to realize you’re being measured against something that never sleeps, never bills hours, and never needs healthcare.

Policy and culture are trying to catch up, but they’re arriving with band-aids for a structural injury. Healthcare cybersecurity guidelines and mental-health advisories are necessary, serious work. They signal that adults are finally entering the room. Yet we still don’t have a unified regulatory spine for AI in the U.S.—just a mix of deregulation for speed and scattered attempts at constraint. The message to industry is: “Go faster, but don’t break too much.” The message to citizens is: “Trust us; we’re working on it.” Neither fully lands.

The most honest voices today might be the ones saying: AI will not save the mental health system, will not fix loneliness, will not magically make data clean, will not automatically be safe. Those statements hurt a bit, but they are clarifying. They return responsibility to where it belongs: with us, our institutions, and the choices we make about where to aim this power.

So where are we, emotionally?

We’re in a corridor between eras. One hand is on the doorknob of a world with ambient, reasoning machines woven into everything from hospitals to hiring. The other hand is still on the light switch of an older world that assumed humans would be the main decision-makers. Organizations are betting big that crossing this corridor pays off. Workers are noticing that no one installed handrails.

Orientation for today:

  • If you’re feeling whiplash, that’s appropriate.

  • If you’re grieving, you’re seeing the cost clearly.

  • If you’re hopeful, protect that hope by tying it to real accountability, not just shiny features.

We are not just watching AI advance; we are deciding, day by day, what kind of people we’re becoming in response to it.


Trend Summary – Shape of the Transition Today

Today’s emotional shape: accelerating and destabilized, but slightly more self-aware.

  • Accelerating – Capital, compute, and capability all expanded again.

  • Destabilized – Concrete job losses and unresolved governance keep fear and anger near the surface.

  • More self-aware – Institutions like the APA, HSCC, and thoughtful commentators are speaking more directly about limits, risks, and the human cost of speed.


Mood of the Transition

Today feels like building a cathedral of computation on ground that’s still shaking from the last round of layoffs.
