
As a regular flyer, I'm a devoted viewer of Swedish airline pilot Petter Hörnfeldt and his YouTube channel, Mentour Pilot. Petter dives deep into the inner workings of commercial aviation, especially the systems and protocols that keep millions of people safe in the air every day.
One of his most memorable explanations isn’t about technology at all – it’s about how pilots decide who’s actually flying the plane.
In a modern commercial cockpit, there are two pilots: a captain and a first officer. Before every flight, they decide who will be the “pilot flying” (the one with hands on the controls) and who will be the “pilot monitoring” (the one who manages supporting tasks, checks systems, and keeps an eye on the pilot flying).
Crucially, both pilots are fully briefed and capable of performing every part of the flight plan. If the pilot flying becomes incapacitated, the pilot monitoring can seamlessly take control. That’s the essence of co-piloting – two fully trained, fully capable humans working in tandem, ready to back each other up when things get difficult.
That’s not a job for AI.
When Microsoft and others use “Copilot” to describe AI, they’re unintentionally assigning it a role it was never designed to play. AI is much closer to an autopilot system:
- Great at handling routine tasks
- Helpful for reducing workload during normal operations
- Able to assist during certain critical phases of flight
But the autopilot does not step in when things go wrong. It doesn’t take over from the captain in an emergency. In fact, in turbulent or unexpected scenarios, pilots often switch it off. It’s the pilot monitoring — a fully capable human — who steps up. Not the autopilot.
The aviation industry is built on layered safety systems and shared responsibility. The co-pilot metaphor implies equivalence, or at least shared accountability. AI doesn’t meet that bar. AI can:
- Accelerate routine work
- Spot patterns humans might miss
- Improve decision-support in stable conditions
But it cannot:
- Fully understand context
- Manage edge cases outside its training
- Take accountability when the unexpected happens
We’ve already seen what happens when automation is trusted too far. The recent AWS outages were a reminder: when the system encounters something outside its model, everything stops, and humans scramble to fix it.
The lesson from aviation is simple: Use AI as autopilot. Keep people as co-pilots.
Never use AI as an excuse not to train your people to the fullest extent necessary. Invest in their ability to understand, monitor, and override AI when needed. Don’t let the “copilot” marketing metaphor lull you into delegating responsibility to a system that isn’t designed to carry it.
AI can make the flight smoother. But when the storm hits, you’ll want a human in the left seat.