‘Too cozy.’ Boeing crashes raise doubts over FAA certification.
For Boeing engineers, the crashes of two new 737 Max jets within five months of each other pose a software puzzle. How did a sensor apparently go awry and make the airliners unmanageable within minutes of takeoff, resulting in the deaths of all aboard?
For Congress and the public, the twin accidents raise larger issues. Did Boeing unreasonably rush the plane’s design in the face of mounting competition? Did federal regulators have the necessary independence and resources to oversee the plane’s certification? In an era when automation is transforming the man-machine interface, can today’s regulatory system keep up with change?
On Wednesday, the Senate Commerce Committee’s aviation panel will give Congress its first crack at seeking answers to some of these questions.
These issues reach beyond the flying public: car drivers and other consumers face many of the same questions about adequate oversight as artificial intelligence spreads.
“Increasingly, regulating these products is regulating software,” says Daniel Carpenter, a professor of government at Harvard University in Cambridge, Massachusetts. “That’s not only true with respect to planes, [it’s true] with respect to drugs, with respect to financial products, with respect to medical devices…. I think we’re going to need to look carefully at the way that these software programs are created and managed and how we simulate them.”
Faulty sensors and a loss of control
Officially, investigators have yet to determine what caused the crash of Indonesia’s Lion Air Flight 610 in October and Ethiopian Airlines Flight 302 in the morning hours of March 10 shortly after takeoff. Both planes were versions of the Boeing 737 Max, the manufacturer’s bestselling update of its workhorse plane. After the Ethiopian Airlines crash, the planes were grounded worldwide, pending investigation. The Federal Aviation Administration has said there are similarities between the two crashes.
In the earlier Lion Air accident, reports suggest a faulty sensor caused the automated flight-control system to push the nose down repeatedly shortly after takeoff. The day before the crash, the plane’s crew, with the help of an off-duty pilot, managed to wrest control of the aircraft from the automated system. The following day, the crew was unable to regain control. The crash prompted Boeing engineers to begin work on a software fix.
It also caused federal prosecutors in the United States to launch a highly unusual criminal investigation into the safety procedures and certification of the 737 Max. After the Ethiopian Airlines crash, the inspector general of the Transportation Department also began looking into the Max certification.
On the face of things, the certification process looks problematic, earning the ire of everyone from consumer advocate Ralph Nader to Capt. Chesley “Sully” Sullenberger, a safety expert and retired pilot who famously landed a plane in the Hudson River after a bird strike disabled its engines. “There is too cozy a relationship between the industry and the regulators,” he wrote in a recent op-ed. “And in too many cases, FAA employees who rightly called for stricter compliance with safety standards and more rigorous design choices have been overruled by FAA management, often under corporate or political pressure.”
Rising safety, but new risks?
Increasingly, the FAA has relied on Boeing employees to act as FAA inspectors to certify their own planes. A 2013 Government Accountability Office report found that more than 90 percent of the certification tasks were carried out by these FAA-approved private employees.
Under that deregulatory regime, starting in 2010, domestic airlines recorded eight straight years without a single fatal accident in the United States – a remarkable record. But some critics say that both Boeing and regulators grew complacent about the mounting challenges of the deregulated certification process.
“The regulator went to sleep,” says James Hall, a safety consultant and former chairman of the National Transportation Safety Board. “What we have now is a system that essentially provides Boeing with a system of self-certification.”
Another potential compounding factor: Boeing’s rush to build the 737 Max to compete with the Airbus A320neo. “It was go, go, go,” one engineer told The New York Times about the expedited process to design, build, and certify the plane. None of the Boeing staff interviewed by the Times said the speedup compromised safety.
Outside the company, however, there’s skepticism.
“No one wants to build a bad plane,” says Mike Perrone, union president of Professional Aviation Safety Specialists, based in Washington, D.C. “But when there are other pressures, like to get the plane approved, to get it out so you could get competing with your competitors, certain things can take a back seat. And maybe some things don’t get tested as much as they should.”
That’s especially true as manufacturers pack in more and more automation.
A compromised design?
To accommodate larger and more fuel-efficient engines on the older 737 airframe, Boeing made modifications that left the plane less stable during a steep climb. To compensate, it created the Maneuvering Characteristics Augmentation System (MCAS), software that would automatically push the nose down if the plane was in danger of stalling. But if the angle-of-attack sensor that watches for an impending stall was picking up erroneous data, as apparently happened in the earlier Lion Air crash, then MCAS could push the plane into a nosedive unless the crew disabled it.
The company’s initial safety analysis didn’t account for the fact that MCAS could repeatedly reset itself after the pilot pulled the nose upward, according to an investigation by The Seattle Times. Also, it underestimated by a factor of four the distance MCAS could move the tail. Meanwhile, pilots complained that the plane’s documentation was inadequate to explain what was going on.
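The failure mode described above can be illustrated with a highly simplified, hypothetical sketch: an automated trim system that trusts a single sensor and re-evaluates the same bad reading after every pilot correction. All names, threshold values, and trim increments here are invented for illustration; this is not Boeing’s actual MCAS logic.

```python
# Hypothetical sketch of the failure mode: an automated trim system that
# trusts one faulty sensor and re-activates after each pilot correction.
# Threshold and step values are illustrative, not Boeing's.

STALL_THRESHOLD_DEG = 15.0   # assumed angle-of-attack limit (illustrative)
TRIM_STEP = 2.5              # assumed nose-down trim per activation (illustrative)

def automated_trim(sensor_aoa_deg, current_trim, system_enabled=True):
    """Return the new trim setting after one control cycle.

    A stuck sensor reporting a high angle of attack triggers a nose-down
    command every cycle -- even after the pilot trims the nose back up --
    because the system resets and re-evaluates the same bad reading.
    """
    if system_enabled and sensor_aoa_deg > STALL_THRESHOLD_DEG:
        return current_trim - TRIM_STEP  # command nose-down trim
    return current_trim

# A faulty sensor stuck at 25 degrees: each cycle the system commands
# nose-down again, fighting the pilot's partial corrections.
trim = 0.0
for cycle in range(3):
    trim = automated_trim(sensor_aoa_deg=25.0, current_trim=trim)
    trim += 2.0  # pilot pulls the nose back up, but not all the way

# Net effect: trim drifts steadily nose-down unless the system is disabled.
trim_after_disable = automated_trim(25.0, trim, system_enabled=False)
```

The point of the sketch is the reset behavior: because the system re-arms after each pilot input, a single bad reading produces a tug-of-war the automation slowly wins, which matches the repeated-reset flaw the Seattle Times investigation described.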
In addition to a software fix, Boeing has announced it will make a safety feature standard rather than optional: a “disagree light” that warns pilots when the plane’s two angle-of-attack sensors give conflicting readings.
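The idea behind such a warning is simple to sketch: compare the two redundant sensor readings and light the indicator when they differ by more than some tolerance. The function name and the tolerance value below are assumptions for illustration, not Boeing’s specification.

```python
# Hypothetical sketch of a sensor "disagree" check: warn when two
# redundant angle-of-attack sensors differ by more than a tolerance.
# The tolerance value is illustrative, not Boeing's.

DISAGREE_TOLERANCE_DEG = 5.5  # assumed tolerance (illustrative)

def disagree_light(left_aoa_deg, right_aoa_deg):
    """Return True (light on) when the two readings conflict."""
    return abs(left_aoa_deg - right_aoa_deg) > DISAGREE_TOLERANCE_DEG

print(disagree_light(4.0, 4.5))   # healthy sensors: difference is small
print(disagree_light(4.0, 25.0))  # one stuck sensor: readings conflict
```

A check like this does not fix a bad sensor; it only tells the crew that the automation may be acting on unreliable data, which is why it matters whether the feature is standard equipment or a paid option.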
More-complex products, less money for oversight
Some experts say the company’s scramble also signals the need for a new approach to regulation. That’s the case even though software and artificial intelligence systems are designed to make planes and other vehicles safer.
“In many instances, AI can fly an airplane safer or drive a car safer than you or I,” says Sid Shapiro, an expert on regulatory policy at Wake Forest University’s law school. “So there’s certainly a promise there of increased safety. On the other hand, you’ve got to get it right for that to happen. And it’s not clear that the National Highway Traffic Safety Administration or, for that matter, the FAA is really up to this job and it has the sophistication to do it.”
Funding for many regulatory agencies, including NHTSA, has been declining for the past quarter century when adjusted for inflation, he says. “Making those kinds of technological transitions can be very resource intensive and they don’t appear to have the resources.”
Compared with the FAA, NHTSA is even more underfunded, says Jason Levine, executive director of the Center for Auto Safety, a consumer watchdog in Washington, D.C. The agency has issued only voluntary guidelines for the various automated safety systems now going into cars.
“It’s impossible for consumers, or even safety professionals, to know which of these work, which of these work well, which of these work well in different circumstances, because there’s no required standards,” he says. “When used correctly and when operating as designed, yeah, [the technology] is amazing. But there’s a lot of ifs.”