
Ethiopian Airlines Black Box Gradually Being Decoded: The Crash Is Directly Linked to Fully Automated Flight Software


By Super Neuro

Scenario description: Autopilot is a standard feature of aviation technology. It can effectively assist pilots with long-distance flights and routine in-flight operations, but many problems still remain to be solved.

Keywords: civil aviation, autopilot, automation

Automatic flight control for aircraft appeared as early as the 1910s and matured in the 1930s. Engineers connected the aircraft's elevators, ailerons and rudder to gyroscopes and altimeters, allowing the aircraft to hold a set heading and altitude.

Today, the autopilot systems of aircraft, ships, missiles and spacecraft remain relatively simple: they assist operators with simple, repetitive tasks. This is different from the self-driving of cars, which requires more complex capabilities such as reading the road surface and planning a path in real time.

To this day, this structure has not changed in essence. An aircraft autopilot simply keeps the aircraft on a preset trajectory and speed. In other words, it replaces the manual routine of "watching the instrument pointer and applying a fixed correction whenever the pointer deviates from the set value."

The autopilot greatly reduces the burden on pilots, allowing them to focus on other work such as monitoring the aircraft's status and the weather. When complex and precise handling is required, such as takeoff, landing, ground taxiing, or responding to collision warnings, the pilot intervenes in a timely manner.
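To make the idea concrete, here is a minimal sketch of that correction loop in Python. It is purely illustrative: the function name, the gain value, and the altitude figures are hypothetical and are not taken from any real avionics software.

```python
# A minimal sketch of the correction loop described above; all names and
# numbers are illustrative, not drawn from any real autopilot.

def autopilot_step(setpoint: float, reading: float, gain: float = 0.1) -> float:
    """Return one corrective control input for a single instrument.

    The autopilot's job, reduced to its essence: measure how far the
    "pointer" has drifted from the set value and apply a fixed,
    proportional correction.
    """
    error = setpoint - reading   # deviation of the pointer from the set value
    return gain * error          # fixed corrective action, scaled by the error


# Example: the aircraft is set to hold 10,000 ft but has drifted to 9,940 ft.
correction = autopilot_step(setpoint=10_000.0, reading=9_940.0)
print(f"pitch-trim correction: {correction:+.1f} (arbitrary units)")
```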

It all started with Boeing

In October 2018, an Indonesian Lion Air passenger plane crashed into the sea. Not long after, on March 10 of this year, an Ethiopian Airlines passenger plane crashed. Within five months, the Boeing 737 MAX 8 (part of the 737 MAX series) had suffered two major air crashes, claiming 346 lives.

Looking further back, the past three years alone have seen several other serious air crashes besides these two.

In May 2018, a Cubana de Aviación plane crashed, leaving only one of the 113 people on board alive; in June 2017, a military plane flying from Myeik, Myanmar, to Yangon crashed, killing all 122 people on board; and in November 2016, a LaMia flight carrying players of the Chapecoense football club crashed, leaving only two of the 73 people on board alive.

Although each aviation accident causes heavy losses and is deeply shocking, the data show that flying is far safer than driving. According to statistics from the U.S. Department of Transportation, from 2007 to 2016 commercial airlines averaged only 11 deaths per trillion miles traveled, compared with 7,864 deaths per trillion miles on highways.

However, two tragic accidents involving the same Boeing model within just a few months strongly suggested that something was indeed wrong with the aircraft, and the 737 MAX was soon grounded worldwide.

On March 17, The Seattle Times, the local newspaper of Boeing's home city, published an article titled "Flawed analysis, failed oversight: How Boeing, FAA certified the suspect 737 MAX flight control system". Interviews with several current and former FAA engineers revealed improper practices in how the 737 MAX passed its flight-safety assessment.

According to the report, the disaster ultimately stemmed from negligence on both sides, and one of the key factors was a flawed system.

Boeing's new autopilot software may have a fatal flaw

Although the black-box data from this accident is still being analyzed, many details that have emerged since last year's crash in Indonesia point to the same flaw in Boeing's aircraft, and judging from what has been released so far, the two accidents are strikingly similar.

The Boeing 737 was launched in 1968. It is a highly mature design and the world's best-selling airliner, with more than 10,000 sold since its introduction. The 737 MAX is its latest flagship series.

To remain competitive, the MAX 8 uses new LEAP engines. To compensate, Boeing also introduced MCAS, the Maneuvering Characteristics Augmentation System, which is meant to help stabilize the aircraft. The business motivation behind MCAS is clear: it is a software patch that attempts to fix a physical flaw in the aircraft.

MCAS is software that runs in the background. If the nose pitches up too far, the system automatically adjusts the horizontal stabilizer in the tail and pulls the nose back down to a safe attitude, often without the pilot even noticing the intervention.

MCAS workflow diagram
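To illustrate the behavior just described, here is a deliberately simplified sketch of such background trim logic in Python. It is not Boeing's implementation: the activation conditions echo those reported for MCAS later in this article (manual, flaps-up flight and a high angle-of-attack reading), while the threshold, the trim increment, and all names are hypothetical.

```python
# A toy illustration of background trim logic, NOT Boeing's MCAS code.
# Threshold, increment, and field names are hypothetical.

from dataclasses import dataclass


@dataclass
class FlightState:
    angle_of_attack_deg: float   # value reported by a single AoA sensor
    flaps_up: bool               # MCAS reportedly acts only with flaps retracted
    autopilot_engaged: bool      # ...and only during manual flight


def mcas_trim_command(state: FlightState, aoa_limit_deg: float = 14.0) -> float:
    """Return a nose-down stabilizer trim increment, or 0.0 if the system stays idle.

    The key point of the article: this logic runs in the background and can
    command trim with no pilot input, based on a sensor reading.
    """
    if (state.flaps_up and not state.autopilot_engaged
            and state.angle_of_attack_deg > aoa_limit_deg):
        return -0.5   # nose-down trim step (illustrative magnitude and units)
    return 0.0


# A faulty sensor reporting an absurdly high angle of attack is enough to
# trigger a nose-down command in this toy model.
print(mcas_trim_command(FlightState(angle_of_attack_deg=40.0,
                                    flaps_up=True,
                                    autopilot_engaged=False)))   # prints -0.5
```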

Using software to compensate for aircraft instability is not new. Many advanced fighter jets are deliberately designed to be unstable to achieve greater maneuverability, and fighter pilots are trained to anticipate their aircraft's particular flight characteristics. The troubling part is that many MAX 8 pilots were never told that MCAS existed. As one of them put it:

"The introductory manual is 1,400 pages long, and only one page mentions the so-called MCAS... But the manual doesn't explain what it is..." 

Perhaps Boeing thought pilots did not need to know about the system, because the purpose of MCAS is to make the 737 MAX 8 deliver the same "handling experience" as the previous model, the 737 NG. This was also one of Boeing's selling points at the time: buying the new aircraft would require no additional pilot training.

The Law of Leaky Abstractions

Why would a system that is meant to ensure safety become a "killer"?

There is a law in software development called the "Law of Leaky Abstractions", which states that "All non-trivial abstractions, to some degree, are leaky."

MCAS may be just such a leaky abstraction: it attempts to create a virtual equivalent of a conventional 737 NG, one without the LEAP engines, in order to correct the imbalance the aircraft would otherwise experience. But abstracting a virtual machine is one thing; trying to abstract physical reality is another thing entirely. In both cases, something eventually leaks.

So how does MCAS behave when the thing it is trying to abstract fails? Here’s what the pilots reported: 

"On the NG and MAX, when you have a tendency to lose control, you can temporarily stop it by pulling the control column in the opposite direction. But when MCAS is activated, it can only be stopped by cutting the power." 

The 737 MAX 8 cockpit

A pilot's response to a hole in the abstraction can be very different from their response to the real situation the abstraction is trying to hide. Faced with a faulty sensor, a pilot can switch it off and rely on their understanding of the situation and of the aircraft to make the right decision.

But when the pilot's understanding of the aircraft's behavior is virtual rather than real, there is no way back to reality: reality lies outside the pilot's comprehension and becomes the source of wrong decisions. One thing that makes the real world very different from the virtual one is that, much of the time, there is no undo button.

This leak appears when the aircraft starts demonstrating intentions of its own:

"EFS never acts autonomously, but, in certain circumstances, such as what happened on Flight 610, MCAS can activate autonomously." 

And this: "MCAS activated without pilot input and operated only in manual, flaps-up flight." 

If MCAS is turned off, pilots will find themselves flying a completely different aircraft.

MCAS's degree of control already falls within the scope of automated driving, and technically it can even be rated as Level 5.

Will automation be the one to foot the final bill?

Because the black box of the crashed plane is still being investigated, it cannot yet be concluded that MCAS was entirely at fault. So what is the connection between MCAS and automated flight?

The Society of Automotive Engineers (SAE) maintains an international standard, SAE J3016, that defines six levels of driving automation. The framework below adapts it to classify automation in domains beyond the automotive world, adding a seventh, self-optimizing level (a short code restatement follows the list):

Level 0 (manual process) 

There is no automation whatsoever. 

Level 1 (participating process) 

The user is aware of the initiation and completion of the performance of each automated task. The user can undo a task in the event of incorrect execution. However, the user is responsible for the correct sequencing of tasks. 

Level 2 (participating in multiple processes) 

The user is aware of the initiation and completion of the task combination. However, the user is not responsible for the correct sequencing of the tasks.

Level 3 (unattended process) 

The user is notified only in exceptional circumstances and is required to take over the work under those conditions.

Level 4 (intelligent process) 

The user is responsible for defining the end goal of the automation, however, all aspects of process execution and handling of exceptional conditions in-flight are handled by automation.

Level 5 (fully automated process) 

This is the end state most people imagine, where humans are no longer required in the process. It may not actually be the final level, though, since it does not assume the process can optimize itself to improve. 

Level 6 (self-optimizing process) 

It’s completely automated, requires no human involvement, and is able to self-improve over time.
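
As a reading aid, the levels above can be restated as a small Python enum. The wording follows this article's adaptation of the SAE framework rather than the official J3016 definitions, and the member names are my own shorthand.

```python
# The levels listed above, restated as an enum purely as a reading aid.

from enum import IntEnum


class AutomationLevel(IntEnum):
    MANUAL = 0            # no automation whatsoever
    PARTICIPATING = 1     # user sees each task start and finish, orders the tasks
    MULTI_PROCESS = 2     # user sees task combinations, no longer orders them
    UNATTENDED = 3        # user is notified only in exceptional situations
    INTELLIGENT = 4       # user sets the goal; automation executes and handles exceptions
    FULLY_AUTOMATED = 5   # no human is required in the process
    SELF_OPTIMIZING = 6   # fully automated and able to improve itself over time


# The article's argument below, in one comparison: a conventional autopilot that
# hands control back on error sits around UNATTENDED or INTELLIGENT, while MCAS
# behaves more like FULLY_AUTOMATED, deciding on its own when to act.
print(AutomationLevel.FULLY_AUTOMATED > AutomationLevel.INTELLIGENT)   # True
```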

Typically, when an error occurs, the autopilot disengages and returns control to the pilot. This is Level 3 (unattended process) automation, in which the boundaries of what the automation handles are clear: the pilot is made aware of the abnormal situation and takes manual control of the aircraft. 

At Level 4 (intelligent process), the pilot can identify unusual situations and specify when automation is applicable. Today's self-driving cars, for example, can park themselves and drive autonomously on the highway in good weather, but in each case it is the driver who decides whether to engage the automation.

The aircraft autopilot is also Level 4 automation, capable of engaging in low-complexity environments. 

The MCAS on Boeing's 737 MAX 8 is closer to Level 5 automation: a fully automated process with the authority to decide for itself in what scenarios to operate.

Fully automated processes, like the electronics that control engine performance, usually do not cause problems. But when it comes to piloting (that is, steering the aircraft), the question of who is in control arises.

Level 5 automation requires an intelligence that can recognize which sensors are faulty and work around problems using partial, even unobserved, information. The current state of technology simply cannot deliver AI at that level.

Does the responsibility lie with the technology, or with those who designed it?

The advance of automation itself is not the main cause of these disasters. What matters is whether the way a system's automation is developed actually makes human operation and control safer and smarter.

In short, Boeing may be facing a situation in which its technology has not kept pace with its ambition, because not all software complexity is created equal.

This is not simply a matter of insufficient testing failing to uncover logic errors in the software, nor simply a matter of testing for and handling sensor and device failures. It is an attempt at an ambitious task that produced a dangerous solution.

Regardless, introducing a software patch to virtualize physical behavior can lead to unintended consequences. Pilots will still be flying, and the hope is that they can handle the unexpected situations that automation cannot. But MCAS acts like an illusion, hampering pilots' ability to distinguish the real from the simulated.

A Boeing 737 MAX, called the Spirit of Renton, takes off for the first time from Renton Municipal Airport on January 29, 2016

One can expect regulators, in future reviews, to handle and test systems like MCAS differently from other automated systems.

Such control systems should be treated as Level 5 automation and subjected to more rigorous review standards. Only then can further bloodshed and tears be avoided.

In 1803, the British engineer Richard Trevithick invented the steam locomotive, which could run on rails and haul far more cargo than a horse-drawn carriage.

However, this steam locomotive had numerous minor problems and frequent malfunctions: it had to stop for repairs after traveling only a short distance, and it could even overturn, causing serious casualties. As a result, most people did not accept it.

Carriage owners felt their position was being challenged and banded together, finding all manner of reasons to resist the spread of the railway.

Yet two centuries on, horse-drawn carriages have long since exited the stage of history, and the train has become one of the most important means of long-distance transport.

Being patient and cautious with technological development is our only option.
