‘Too cozy.’ Boeing crashes raise doubts over FAA certification.

An aerial photo shows Boeing 737 Max jets at the Boeing factory in Renton, Washington, on March 21. The company is working on a software fix to address concerns related to two recent crashes. (Lindsey Wasson/Reuters)

Why We Wrote This

Recent crashes have left Boeing’s top-selling jetliner grounded. They also could signal the need – in aviation and beyond – to rethink regulation in an era of rising reliance on software.

For Boeing engineers, the crash of two new 737 Max jets within five months of each other is a software puzzle. How did a sensor apparently go awry and make the airliners unmanageable within minutes of takeoff, resulting in the deaths of all aboard?

For Congress and the public, the twin accidents raise larger issues. Did Boeing unreasonably rush the plane’s design in the face of mounting competition? Did federal regulators have the necessary independence and resources to oversee the plane’s certification? In an era when automation is transforming the man-machine interface, can today’s regulatory system keep up with change?

On Wednesday, the Senate Commerce Committee’s aviation panel will give Congress its first crack at seeking answers to some of these questions.

These issues affect more than the flying public. Car drivers and other consumers face many of the same questions about whether oversight can keep pace as artificial intelligence spreads into everyday products.

“Increasingly, regulating these products is regulating software,” says Daniel Carpenter, a professor of government at Harvard University in Cambridge, Massachusetts. “That’s not only true with respect to planes, [it’s true] with respect to drugs, with respect to financial products, with respect to medical devices…. I think we’re going to need to look carefully at the way that these software programs are created and managed and how we simulate them.”

Faulty sensors and a loss of control

Officially, investigators have yet to determine what caused the crashes of Indonesia’s Lion Air Flight 610 in October and Ethiopian Airlines Flight 302 on the morning of March 10, both shortly after takeoff. Both planes were versions of the Boeing 737 Max, the manufacturer’s bestselling update of its workhorse jet. After the Ethiopian Airlines crash, the model was grounded worldwide, pending investigation. The Federal Aviation Administration has said there are similarities between the two crashes.

In the earlier Lion Air accident, reports suggest a faulty sensor caused the automated flight-control system to push the plane’s nose down repeatedly shortly after takeoff. The day before the crash, a different crew flying the same aircraft, with the help of an off-duty pilot riding along, managed to wrest control back from the automated system. The crew on the day of the crash could not. The accident prompted Boeing engineers to begin work on a software fix.

It also prompted federal prosecutors in the United States, reportedly assisted by the FBI, to open a highly unusual criminal investigation into the safety procedures and certification of the 737 Max. After the Ethiopian Airlines crash, the Transportation Department’s inspector general began examining the Max’s certification as well.

On the face of things, the certification process looks problematic, earning the ire of everyone from consumer advocate Ralph Nader to Capt. Chesley “Sully” Sullenberger, a safety expert and retired pilot who famously landed a plane in the Hudson River after a bird strike disabled its engines. “There is too cozy a relationship between the industry and the regulators,” he wrote in a recent op-ed. “And in too many cases, FAA employees who rightly called for stricter compliance with safety standards and more rigorous design choices have been overruled by FAA management, often under corporate or political pressure.”

Rising safety, but new risks?

Increasingly, the FAA has relied on Boeing employees, designated to act on the agency’s behalf, to certify the company’s own planes. A 2013 Government Accountability Office report found that more than 90 percent of certification tasks were carried out by these FAA-approved private employees.

Under that arrangement, domestic airlines recorded eight straight years, starting in 2010, without a single fatal accident in the United States – a remarkable record. But some critics say that both Boeing and its regulator grew complacent about the risks of a certification process that leans so heavily on the manufacturer.

“The regulator went to sleep,” says James Hall, a safety consultant and former chairman of the National Transportation Safety Board. “What we have now is a system that essentially provides Boeing with a system of self-certification.”

Another potential compounding factor: Boeing’s rush to build the 737 Max to compete with the Airbus A320neo. “It was go, go, go,” one engineer told The New York Times about the expedited process to design, build, and certify the plane. None of the Boeing staff interviewed by the Times said the speedup compromised safety.

Outside the company, however, there’s skepticism.

“No one wants to build a bad plane,” says Mike Perrone, union president of Professional Aviation Safety Specialists, based in Washington, D.C. “But when there are other pressures, like to get the plane approved, to get it out so you could get competing with your competitors, certain things can take a back seat. And maybe some things don’t get tested as much as they should.”

That’s especially true as manufacturers pack in more and more automation.

A compromised design?

To fit larger, more fuel-efficient engines on the older 737 airframe, Boeing mounted them farther forward and higher on the wing – a change that can push the nose up during a steep climb and bring the plane closer to a stall. To compensate, it created the Maneuvering Characteristics Augmentation System (MCAS), software that would automatically nose the plane down if it sensed a stall was imminent. But if the angle-of-attack sensor feeding that judgment picked up erroneous data, as apparently happened in the earlier Lion Air crash, MCAS could push the plane into a nosedive unless the crew disabled it.

The company’s initial safety analysis didn’t account for the fact that MCAS could reset and activate again each time the pilot pulled the nose back up, according to an investigation by The Seattle Times. It also understated, by a factor of four, how far the system could move the horizontal tail. Meanwhile, pilots complained that the plane’s documentation gave them little explanation of what the new system was doing.

In addition to a software fix, Boeing has announced it will make a safety feature standard rather than an optional extra – a “disagree” alert that warns pilots when the plane’s two angle-of-attack sensors give conflicting readings.
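A minimal, hypothetical sketch in Python helps make the failure mode concrete. It is not Boeing’s actual software – the thresholds, names, and numbers here are invented – but it shows why a trim system that trusts a single angle-of-attack sensor and re-arms after every pilot input is dangerous, and how a simple cross-check of two sensors changes the behavior.

```python
# Illustrative sketch only: a toy model of an automated nose-down trim loop.
# This is NOT Boeing's MCAS code; thresholds and names below are hypothetical.

STALL_THRESHOLD_DEG = 15.0   # hypothetical angle of attack (AoA) that triggers trim
TRIM_STEP_DEG = 2.5          # hypothetical nose-down trim applied per activation


def single_sensor_trim(aoa_left_deg: float) -> float:
    """Trim command that trusts one AoA vane.

    If that vane is stuck or miscalibrated, the system commands nose-down
    trim on every cycle, re-activating even after the pilot counters it --
    the repeating behavior reported in the Lion Air accident.
    """
    return TRIM_STEP_DEG if aoa_left_deg > STALL_THRESHOLD_DEG else 0.0


def cross_checked_trim(aoa_left_deg: float, aoa_right_deg: float) -> float:
    """Same logic, but it stands down when the two vanes disagree --
    the condition a "disagree" alert is meant to flag for the crew."""
    if abs(aoa_left_deg - aoa_right_deg) > 5.0:   # hypothetical disagreement limit
        return 0.0                                # conflicting data: do not trim
    if min(aoa_left_deg, aoa_right_deg) > STALL_THRESHOLD_DEG:
        return TRIM_STEP_DEG
    return 0.0


if __name__ == "__main__":
    # A stuck left vane reads 22 degrees while the right vane reads a normal 5.
    for cycle in range(1, 4):
        print(f"cycle {cycle}: single sensor trims {single_sensor_trim(22.0)} deg; "
              f"cross-checked version trims {cross_checked_trim(22.0, 5.0)} deg")
```

Run repeatedly, the single-sensor version keeps commanding nose-down trim for as long as the bad reading persists, while the cross-checked version refuses to act on conflicting data – the kind of gap Boeing’s software fix and the newly standard alert are meant to close.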

More-complex products, less money for oversight 

Some experts say the company’s scramble also signals the need for a new approach to regulation. That’s the case even though software and artificial intelligence systems are designed to make planes and other vehicles safer.

“In many instances, AI can fly an airplane safer or drive a car safer than you or I,” says Sid Shapiro, an expert on regulatory policy at Wake Forest University’s law school. “So there’s certainly a promise there of increased safety. On the other hand, you’ve got to get it right for that to happen. And it’s not clear that the National Highway Traffic Safety Administration or, for that matter, the FAA is really up to this job and it has the sophistication to do it.”

Funding for many regulatory agencies, including NHTSA, has been declining for the past quarter century when adjusted for inflation, he says. “Making those kinds of technological transitions can be very resource intensive and they don’t appear to have the resources.”

Compared with the FAA, NHTSA is even more underfunded, says Jason Levine, executive director of the Center for Auto Safety, a consumer watchdog in Washington, D.C. The agency has issued only voluntary guidelines for the various automated safety systems now going into cars.

“It’s impossible for consumers, or even safety professionals, to know which of these work, which of these work well, which of these work well in different circumstances, because there’s no required standards,” he says. “When used correctly and when operating as designed, yeah, [the technology] is amazing. But there’s a lot of ifs.”
