Autonomous technology is the future of transport. Soon we will have cars that drive themselves, boats that sail without a crew, and planes that fly without a pilot.

While many have been waiting for this for years, including the owners of airline companies who stand to save potentially billions in pilot wages, critics of autonomous planes cite concerns ranging from how an AI would handle emergency scenarios to how it would cope with severe weather without any human oversight.

Why Autonomous Planes Are A Bad Idea

The basis for the technology already exists, and many planes already have a degree of autonomous technology active during take-off and landing. Currently most planes are required to have two pilots on board, and if this number were to be reduced, additional automated systems would need to be implemented, as well as a substantial redesign of the cockpit. According to some schools of thought, dropping the number of pilots down to just one would not bode well.

A NASA study published in September of 2017 found that the workload of flying a Boeing 737 with a single pilot was “unacceptable” even in normal flight conditions, let alone when something went wrong.

In such an event, the plane’s automated systems would assist the pilot, but in a fully autonomous plane the AI would be in control before and during the emergency. The AI compiles data from flights and can learn how to pilot the plane, as well as understand the best course of action to take in specific circumstances like a storm or an engine stall.

In theory, an autonomous plane would be safer than operating under the control of a human pilot, with sensors more intricate and precise than human eyes constantly scanning the plane’s surroundings and correcting its course.

How An Autonomous Plane Might React In An Emergency

However, there are a host of things an autonomous plane cannot do as well as a human, such as reacting quickly to change. This ability is absolutely vital in an emergency, and one an AI may not be able to replicate. If something unexpected happens, a human pilot can react at a split second’s notice thanks to years of training and experience.

An AI would have to be fed new data, analyse that data, and then decide on a new course of action. If there isn’t an action the AI has been programmed to take, it has to choose one based on what’s going on around it, which may not be simple or easy for the system. An AI system can only make choices based on logic determined by set parameters and accumulated data, and that approach may break down in fast-paced situations.

No Override

If something were to go wrong, the plane and everyone on board would be entirely reliant on the AI to navigate and land safely. In the future, if all planes are designed to be fully autonomous, there may not even be controls for anyone with flight experience to take over, effectively locking out any chance of human intervention.

Earl Wiener, a prominent figure in aviation safety, coined what are known as Wiener’s Laws of aviation and human error. One of them was: ‘Digital devices tune out small errors while creating opportunities for large errors.’ Only time will tell if this proves too pessimistic, or sadly prophetic.

Read more about what autonomous planes mean for international travel, or learn how plane design has changed.