Not on Autopilot: The Human Role in Your System
- Victory
- May 1
Updated: May 29

SXSW 2025 made one thing clear: we are not on autopilot. Someone needs to be in charge, connecting facts with projections and making decisions that integrate—not displace—human purpose. AI, surveillance, and a rolling economic crisis are not isolated events; they are overlapping signals in a system transforming faster than most companies, or people, are prepared for.
It’s not just about machines. It’s about asking: what do we want humans to still do? Because if the only goal is to eliminate jobs, we will destabilize the social structure entirely. Leaders must not only pursue efficiencies but also unlock intellectual capital, protect mental health, and assign value to human insight—not just reduce payrolls.
AI and the Future of Work: Between Opportunity and Existential Risk
AI is no longer about someday. It’s now. Across industries, people are shifting from hands-on creators to supervisors of intelligent systems. At SXSW, Amy Webb warned of a creeping homogeneity in language caused by machine-to-machine communication—protocols optimized for efficiency, not for nuance.
Microsoft even patented a language between AI agents. Sound familiar? It’s Her, but real. The question isn’t whether humans will adapt, but how. And, crucially: who do we leave behind when we speed up?
The designer’s role is shifting, but so is that of every professional. The goal shouldn’t be just reducing labor costs. Leaders need to integrate humans with intention: freeing time for what truly matters, transforming expertise into value—not displacing it entirely.
The 2025 Design in Tech Report echoes this shift: from UX to AX (Artificial Execution), from prototyping to prompting. The designer becomes a curator, trainer, and ethical compass—setting the coordinates, not just pushing pixels.
Surveillance and Privacy: The Hidden Cost of Productivity
Workplaces are no longer just optimized—they are observed. Meredith Whittaker, from Signal, reminded the audience that mass data collection doesn’t just erode trust—it creates systemic risk.
Imagine this: beyond security cameras, companies start tracking employee heart rates to determine when to trigger incentives. That isn’t science fiction. That was a real use case discussed. This is not about complying with regulation. It’s about deciding how intrusive we allow systems to become. If companies chase productivity by compromising dignity, they’ll lose something more valuable: human commitment.
Economic Crisis: And the Human at the Center of It All
Scott Galloway didn’t just diagnose the present—he fired warning shots about the future. He called for “reverse engineering” our systems to make it viable for people in their 20s and 30s to start families.
Because how do you build a future if housing is unreachable, wages stagnate, and AI is absorbing more cognitive labor each year? If young adults can’t afford families, purpose starts to evaporate.
He also offered bold predictions:
- OpenVidia – a possible merger between OpenAI and NVIDIA.
- SaaS becomes SwaS – Software-with-AI-as-a-Service, deeply embedded and self-evolving.
- Nuclear resurgence – not a tech trend, but a survival imperative.
- The podcast bubble – still growing, still spreading.
- M&A waves in tech, retail, and energy – consolidation as a reflex to uncertainty.
But beyond all that, he reminded us: purpose is not a trend. It’s the center of the system.
We Are Not on Autopilot: The Era of Agents and the Need for Governance
The Age of Agents is here. Machines now execute, not just recommend. That applies to creative industries—and to decision-making itself.
The risk? Delegating without discernment. Letting systems operate without context or ethical nuance. That’s why governance isn’t a luxury; it’s a responsibility. And governance starts with real humans asking hard questions—not just about what can be done, but about what should be done.
Learning to Speak Machine
As a final call to action, John Maeda’s How to Speak Machine reemerged in 2025 with updated clarity. Understanding the fundamental principles of computation is no longer just for engineers; it’s a new literacy (a short sketch after the list below makes the first two principles concrete):
- Machines run LOOPS – Repetition is power.
- Machines get LARGE – They scale without friction.
- Machines are LIVING – They adapt and evolve.
- Machines are INCOMPLETE – They miss context and need guidance.
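To make the first two principles concrete, here is a minimal, illustrative Python sketch; it is my own example, not code from the book or the talk. One rule, written once, runs unchanged over ten items or a million.

```python
# Sketch of Maeda's first two principles: machines run LOOPS,
# and loops are what make machines LARGE.

def apply_rule(items, rule):
    """Apply the same rule to every item: repetition is the machine's power."""
    return [rule(x) for x in items]

# The same line of code scales without friction; only the input size changes.
small_batch = apply_rule(range(10), lambda x: x * 2)
large_batch = apply_rule(range(1_000_000), lambda x: x * 2)

print(len(small_batch), len(large_batch))  # -> 10 1000000

# The fourth principle, INCOMPLETE, is the human's cue: nothing in this loop
# knows whether doubling these numbers was the right thing to do.
```

The last comment is the point: the loop supplies scale, but the judgment about which rule to run, and why, stays with us.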
To lead in this era, you don’t just need skills—you need a mindset. One that blends code and conscience.
Key Takeaways
- AI doesn’t eliminate work—it reframes it. The shift is from doing to directing, from input to orchestration.
- Mental health is not a footnote. 69% of employees say their leader impacts their well-being as much as their partner does. That’s not a stat; that’s a mandate.
- Governance is non-negotiable. Without it, automation becomes chaos with an interface.
- Design is now curation. From prototyping to prompting, from UX to AX—designers shape interactions with systems that think back.
- Purpose is the North Star. Technology must serve the human—not the reverse.
This is not about slowing the machine down. It’s about deciding who gets to stay in the driver's seat.