Airline safety: The next generation of smart planes
New ways of handling air traffic, including 3-D terrain and infrared views up front, may cut delays and improve airline safety.
The holidays are coming. You’ve decided to buck the economy and head over the river and through the clouds to Grandmother’s house.
Airline boarding pass in hand, you sit at the gate, only to watch the clock tick past your departure time. Your winged steed is late. Once airborne, you breathe a sigh of relief, only to learn that air-traffic jams are adding more delays. It’s starting to look as if cold turkey sandwiches will be your first meal at Grandma’s.
But help is on the way. Federal transportation officials are working with researchers and air carriers across the country to find new high-tech ways of handling air traffic, from takeoff to landing, that are expected to reduce delays, save fuel (thereby cutting greenhouse-gas emissions), and improve safety. All this while allowing for a significant increase in airline traffic.
The broad program is called NextGen. And for all the improvements planners say it will bring, it is destined “to change the way the system operates in ways we can’t envision today,” according to Vicky Cox, senior vice president for the project at the Federal Aviation Administration, who shared her insights on the program with a group of FAA employees during a “lunch and learn” session earlier this year.
Demands on the nation’s air-traffic system are expected to rise to between two and three times today’s level by 2025, she explains, although the current economic slump will affect that estimate. Globally, air traffic is expected to rise by about the same factor by 2027, according to the International Civil Aviation Organization, in Montreal.
To cope with that growth, planners are turning to Global Positioning System (GPS) satellites; increasingly automated flight-control systems on airliners; and rapid-fire communication between controllers, flight crews, and airliners in the sky. In essence, much of the decisionmaking on altitude, route, and speed will shift from controllers on the ground peering at radar screens to pilots scanning display screens in the cockpit.
“There is no doubt that the number of things that a pilot has to keep track of will increase,” says Christopher Wickens, professor emeritus of psychology at the University of Illinois at Urbana-Champaign and a consultant to companies designing the flight deck of the future.
Massachusetts Institute of Technology aerospace engineering professor John Hansman says the upcoming system will rest on three broad elements: first, the pinpoint precision of GPS to “thread the needle,” especially at airports nestled in tricky terrain or plagued by chronic bad weather; second, precise in-flight tracking; and third, a kind of airborne Internet that delivers all this information to other aircraft and ground crews faster than today’s system can manage.
He notes that when controllers track aircraft with radar, they get updates on planes close to the airport every 4-1/2 seconds and long-range radar checks about every 12 seconds. It takes that long for the antenna to make one circular sweep of the sky. Several sweeps are needed to pick up changes in an airplane’s course and speed. The uncertainty regarding a plane’s precise location, altitude, and speed between radar sweeps – as short as that period seems – is one factor limiting how tightly air-traffic controllers are willing to bunch aircraft together, especially in the skies around airports.
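A back-of-the-envelope calculation shows why those sweep intervals matter. The speeds below are illustrative assumptions, not figures from the article, but they give a feel for how far an airliner travels between radar updates:

```python
# How far does an airliner move between radar position updates?
# The speeds used here are illustrative assumptions.

KNOTS_TO_MPS = 0.514444  # one knot in meters per second

def distance_between_sweeps(speed_knots: float, sweep_seconds: float) -> float:
    """Meters traveled between two consecutive radar sweeps."""
    return speed_knots * KNOTS_TO_MPS * sweep_seconds

# Terminal-area radar: an update roughly every 4.5 seconds
approach = distance_between_sweeps(180, 4.5)   # assumed approach speed
# Long-range radar: an update roughly every 12 seconds
cruise = distance_between_sweeps(480, 12.0)    # assumed cruise speed

print(f"~{approach:.0f} m between terminal-radar sweeps")   # ~417 m
print(f"~{cruise:.0f} m between long-range sweeps")         # ~2963 m
```

Even near the airport, a plane covers hundreds of meters between fixes, and several fixes are needed to detect a turn or a speed change, which is why controllers leave generous buffers.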
NextGen technologies already are working their way into the cockpit. For a lesson on using GPS to thread the needle, for instance, look to Alaska Airlines. Its pilots must fly into and out of airports in some of the most rugged terrain on the planet under fickle weather conditions.
Juneau, the state capital, is a case in point. The airport was built on a tidal flat nestled against a mountainside. The field hosts one runway. Approach from the north or south, and you descend amid narrow inlets hemmed by mountains.
In 1992, the airline began working with the FAA on using GPS to guide landings there, says Ken Williams, the airline’s fleet captain. At the time, the airport had a navigation aid at only one end of the runway. To land, cloud ceilings could be no lower than 1,000 feet and you had to be able to see at least three miles ahead; conditions are often much worse than that. If you missed your approach, you had to fly out and try again.
“If the nav aid went out, you had to wait for the FAA to send up a flight-check aircraft to revalidate the nav aid,” he says. That could take two or three days.
Once the airline linked GPS data into aircraft flight-management computers, its planes could land with clouds only 330 feet from the ground and with visibility of one mile – no ground systems needed. If the airport adds approach lights to that end of the runway, the airline could operate with ceilings of 250 feet and visibility of half a mile.
Couple this with visualization tools developed by the National Aeronautics and Space Administration (NASA), Captain Williams says, and any airline with the right in-cockpit gear could drop that ceiling level to 50 feet, with 300-foot visibility without the expensive land-based navigation aids currently needed to support air traffic under bad weather conditions.
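The minimums in the Juneau example amount to a simple go/no-go rule: an approach is flyable only if both the cloud ceiling and the visibility meet its thresholds. A minimal sketch, using the article’s figures (the rule itself is simplified for illustration):

```python
# Go/no-go check against approach minimums, a simplification of the
# decision rule behind the Juneau example in the text.

# (ceiling_ft, visibility_mi) minimums from the article:
MINIMUMS = {
    "ground nav aid":        (1000, 3.0),  # the old single nav aid
    "GPS approach":          (330, 1.0),   # GPS in the flight computer
    "GPS + approach lights": (250, 0.5),   # if lights are added
}

def flyable(ceiling_ft: float, visibility_mi: float, approach: str) -> bool:
    """True if both ceiling and visibility meet the approach's minimums."""
    min_ceiling, min_vis = MINIMUMS[approach]
    return ceiling_ft >= min_ceiling and visibility_mi >= min_vis

# Clouds at 400 feet with two miles' visibility: too low for the old
# procedure, fine with GPS guidance.
print(flyable(400, 2.0, "ground nav aid"))  # False
print(flyable(400, 2.0, "GPS approach"))    # True
```

The payoff is the gap between the rows of that table: weather that once meant diverting now permits a landing, with no ground equipment to maintain.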
The visualization tools include an infrared video camera in the nose. Its images appear on a screen, along with data from the aircraft’s flight computer, to give pilots the ability to “see” through all but a Martha’s Vineyard-type fog. If visibility gets that bad, flight crews could pull up a 3-D image of the terrain around them from a vast database of maps. The plane’s location within this virtual terrain is continuously updated from GPS data.
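The terrain-database idea can be sketched in miniature: the display software takes the latest GPS fix and indexes into a gridded elevation map to find the terrain around the aircraft. The grid, coordinates, and resolution below are invented for illustration:

```python
# Minimal sketch of a synthetic-vision terrain lookup: index a gridded
# elevation database by GPS position. The grid values, origin, and
# resolution are invented for illustration.

# Elevation grid in feet; rows step through latitude, columns longitude.
ELEVATION_FT = [
    [   0,   50,  900, 2400],
    [   0,  120, 1500, 3100],
    [  30,  600, 2200, 3600],
]
GRID_ORIGIN = (58.30, -134.60)   # lat/lon of grid cell [0][0]
CELL_DEG = 0.05                  # grid resolution in degrees

def terrain_under(lat: float, lon: float) -> int:
    """Elevation (ft) of the grid cell containing the given GPS fix."""
    row = int((lat - GRID_ORIGIN[0]) / CELL_DEG)
    col = int((lon - GRID_ORIGIN[1]) / CELL_DEG)
    return ELEVATION_FT[row][col]

# Each new GPS fix re-positions the aircraft within the virtual terrain:
fix = (58.36, -134.47)
print(terrain_under(*fix))  # → 1500
```

A real system renders a full 3-D scene from such a database rather than a single cell, but the core operation is the same: continuous GPS fixes driving lookups into stored terrain.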
Researchers talk of eventually incorporating some of these visualization tools into helmetlike displays, says Steven Young, a researcher at NASA’s Langley Research Center in Hampton, Va. This would allow pilots to scan their windows without losing track of vital flight information. At the moment, so-called heads-up displays “have a very narrow field of view,” Mr. Young says. “Some people have called it like looking through a straw” because pilots can focus on looking through the display so intently that they in effect tune out their peripheral vision.
Meanwhile, in Louisville, Ky., United Parcel Service is experimenting with another NextGen tool, known in geekspeak as automatic dependent surveillance-broadcast (ADS-B). It’s a way for aircraft to share location, altitude, speed, direction, and “intent” information with ground stations or other aircraft.
Longer-term, ADS-B is envisioned to form the backbone of an airborne Internet, adds NASA’s Young. Pilots will have onboard displays showing the course of other aircraft in their vicinity. Cockpit computers will adjust course and speed to ensure the plane reaches preset waypoints on time without fear of more tightly spaced planes drifting too close to one another.
Pilots can intervene or chart a new course based on weather ahead, with the ADS-B system merging them into the new traffic flow. Air-traffic controllers, whom the FAA expects to maintain at today’s staffing levels, will manage the system, rather than direct each individual flight.
One key issue is designing the system in ways that don’t make it easy for flight crews’ minds to wander – or to nod off, as some pilots suspect happened with Northwest Airlines Flight 188, which overshot the Minneapolis airport by some 150 miles before the flight crew realized what had happened. Of perhaps greater concern is the need to design the system in ways that don’t overload pilots with information, undercutting their “situation awareness” by giving them too much to track.
“That’s a huge issue – how to make sure the pilots focus on the right things,” says Greg Abjerg, national director of airspace planning for HNTB, an architecture and engineering firm based in Overland Park, Kan. “You can get so focused you forget to look out the window.”