This Is Not A Crisis
AI is coming for our jobs, apparently
Throughout my PR career I’ve learned that if you want to know what’s going on in the world, read. At university, a lecturer suggested we buy a newspaper and read it every day. Very soon I started to see which “news” articles were actually from PR sources.
In my first junior PR role, I was taught that if I’m going to succeed I need to be reading what journalists are writing.
This is a foundational skill for any PR person. We learn early on that if you want to pitch a journalist, read what they write (and let it be known to them that you have, because flattery works). It genuinely helps.
If you need to pitch a client or story to a journalist, you’re much better placed when you know which journalists would actually cover the story.
It’s this very skill I’m using to write this piece. An increasing number of journalists and subject matter experts are ‘sounding the alarm’ about what’s coming with artificial intelligence. There’s growing consensus that AI can now do much of the work of junior roles across industries, in particular PR and comms. The researching, reading and writing skills of those roles are already functions of existing LLMs.
This piece was sparked by a post from Matt Shumer that has now reached over 100M views.
“This is different from every previous wave of automation. AI isn’t replacing one specific skill. It’s a general substitute for cognitive work. It gets better at everything simultaneously. When factories automated, a displaced worker could retrain as an office worker. When the internet disrupted retail, workers moved into logistics or services. But AI doesn’t leave a convenient gap to move into. Whatever you retrain for, it’s improving at that too.”
Matt Shumer - https://www.linkedin.com/pulse/something-big-happening-matt-shumer-so5he/
What’s Actually Happening?
Whether it’s an opinion piece or two from people leaving the likes of OpenAI or Anthropic, the views shared on social media by AI founders and investors, or the news about data centres, investments, and the impacts on the stock market - what all these different threads have in common is that they signal that the work being done today will have profound impacts we don’t yet fully understand.
As Steve Yegge puts it, writing about what he observed after speaking with nearly 40 people at Anthropic:
“2026 is going to be a year that just about breaks a lot of companies, and many don’t see it coming. Anthropic is trying to warn everyone, and it’s like yelling about an offshore earthquake to villages that haven’t seen a tidal wave in a century.” - The Anthropic Hive Mind
Taiwan’s former minister of digital affairs, Audrey Tang, said in an interview last year: “The biggest risk is allowing a small group to unilaterally decide how AI is built and deployed. Only a handful of nations have the power to compete for dominance, while the other 200+ countries are effectively in a race to safety, as they have little control over AI development.”
I’ve said it for a while, but I genuinely think it’s a race to the bottom with AI. As with so many innovations, a handful of companies are racing to develop new features and products and capture users, sacrificing profitability for scale. For now.
Money
Historically, software and internet companies would usually spend 10-15% of their revenue on infrastructure. This is one of the reasons tech stocks are usually valued so high; they are profit machines that don’t need heavy investments to grow.
They are now, in essence, acting more like utility companies, which are far more capital intensive. To sell more electricity, you have to build massive power plants, lay heavy cables and maintain physical things. Utilities typically spend around 40-50% of revenue on infrastructure.
This is what the hyperscalers are moving to do: for every dollar that these companies make, they’re spending around half of it on building the infrastructure to support AI.
The comparison to utilities in particular is important. LLMs require massive amounts of compute (and thus energy) to run. Microsoft, Google, Amazon, Meta, Oracle, and Nvidia are striking landmark deals to restart decommissioned reactors, build new modular ones, and partner with advanced nuclear startups.
Safety
CNN reported this month on a wave of AI researchers and executives publicly ringing the alarm bell on the way out. Mrinank Sharma, the head of Anthropic’s Safeguards Research team, posted a cryptic letter announcing his decision to leave the company and warning that “the world is in peril.”
Zoe Hitzig, a researcher at OpenAI, announced her resignation in a New York Times essay, citing “deep reservations” about OpenAI’s emerging advertising strategy.
“I once believed I could help the people building artificial intelligence get ahead of the problems it would create. This week confirmed my slow realization that OpenAI seems to have stopped asking the questions I’d joined to help answer.” - Zoe Hitzig, New York Times
What followed was a report in Platformer that OpenAI had disbanded its “mission alignment” team, created in 2024 to promote the company’s goal of ensuring that all of humanity benefits from the pursuit of artificial general intelligence.
As a skeptic, I find this uncomfortable. I acutely remember the Cambridge Analytica saga at Facebook, so trusting that a company such as OpenAI would put users before profit is hard. But none of this is taking our jobs.
Combined, the money, the safety concerns, and the race to the bottom are having an impact in a number of ways.
The New York Times details communities that are affected by power cuts, weeks-long water outages, school closures, and the spread of stomach bugs. Similarly, Computer Weekly shares details of the downside of data centres - noise pollution, deflated property values, energy issues and increased utility prices.
When Anthropic launched Claude Cowork in February 2026, the Fear, Uncertainty, and Doubt (FUD) it caused wiped roughly $300 billion in market value from the software sector.
“I’ve never seen it this shit”
Mark Ritson wrote a piece in The Drum in February 2026 about something he calls The Great Stay. His argument is the job market in marketing and comms isn’t crashing in some dramatic way. It’s tightening quietly, year on year, and most people don’t notice until they actually try to move jobs.
When someone leaves a team in 2026, the role often doesn’t get replaced. It just... goes. One person with the right tools can now do the research that used to need four people. Content that needs a writer, editor and designer gets done - passably, at least - by one senior person who knows how to use AI. Social scheduling, analytics, report writing, first-draft copy are all getting absorbed into fewer hands.
Senior people are - for now - relatively protected. They’ve got the judgement, the relationships, the strategic sense that you still need a human for. But at junior and mid level, roles are being hollowed out. One agency creative director told The Drum: “For many young people, AI now performs the entirety of what used to be a year-one role.”
Yet when you see that the PRCA has actually updated its definition of public relations, you’d be forgiven for thinking that PR and comms are becoming more important than ever.
“As artificial intelligence increasingly mediates how information is discovered and consumed, public relations plays a critical role in ensuring individuals and organisations are represented accurately and authoritatively in AI-generated outputs.” - PRCA
And while AI is seemingly squeezing jobs at one end, the WSJ reports that companies are desperate for storytellers. Stuart Bruce, who writes about the future of PR, observed that these new “storytellers” aren’t really doing anything radical. They’re doing what PR people were always supposed to do - build narrative, understand audiences, earn attention rather than buy it.
What I think we can say for certain is that the future is being reshaped by AI, and it’s having an impact on the stock market, on where we live, and on how and where we work. Just how much of an impact it will have is still very much in question.

