Think voice control for pilots, 3D-printed parts and displays to help the crew see what they might otherwise miss. All these technologies could make aviation safer than ever.
I'm sitting in the jump seat of an airplane perched at the runway threshold at Deer Valley Airport in Phoenix. The cramped cockpit puts me so close to the action that I can almost reach out and grab the throttle.
Over my headset I hear air traffic control give us clearance to take off. We begin to roar down the runway and seconds later, we're in the air.
It's a clear evening and the Arizona desert stretches in all directions, but I'm not here to admire the view. The Dassault Falcon that I'm flying in isn't an ordinary aircraft. Full of glowing displays and experimental warning systems, it tests technologies that may change the way we fly. Soon, advancements like voice assistants for pilots, augmented reality displays, and 3D-printed components could find their way into cockpits, making air travel safer and more efficient for both pilots and passengers.
One of the screens in the Falcon's cockpit demonstrates a technology called synthetic vision. Unlike a conventional primary flight display that shows only the sky and land in 2D separated by a line, synthetic vision shows a 3D rendering of the world outside with terrain, runways and obstacles.
Bryan Weaver, the Honeywell Aerospace lead test pilot managing our flight, is eager to show me how this technology can help the crew see things that they might otherwise miss, both in the sky and on the ground.
"We're trying to make it look like you're flying around in good weather all the time," he said as we cruise over the McDowell Mountains at dusk. I glance at the display and I'm genuinely surprised: it looks as if we're flying on a bright, sunny day, even though it's almost completely dark outside.
NASA developed synthetic vision in the 1970s and '80s to help avoid a phenomenon called controlled flight into terrain. That's when a plane under a pilot's control is accidentally flown into the ground, a mountain or another obstacle.
The technology merges GPS data, aeronautical information and terrain maps to show where the plane is in relation to its environment. The display updates in real time to show pilots where they are and what's around them, such as a mountain in the airplane's path.
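At its core, the terrain-awareness side of that fusion can be illustrated with a toy sketch. This is an illustration only, not Honeywell's or NASA's actual avionics logic: compare the aircraft's GPS altitude against terrain elevations sampled along the projected flight path, and flag any point that comes within some minimum clearance.

```python
# Toy illustration of terrain alerting -- not real avionics code.
# Given the aircraft's altitude and terrain elevations sampled along the
# projected path (both in feet), flag any point with too little clearance.

def check_terrain_ahead(altitude_ft, elevations_ahead_ft, min_clearance_ft=500):
    """Return the indices of path points that violate the minimum clearance."""
    return [i for i, elevation in enumerate(elevations_ahead_ft)
            if altitude_ft - elevation < min_clearance_ft]

# Aircraft at 6,000 ft; the terrain ahead rises to a 5,800 ft ridge.
conflicts = check_terrain_ahead(6000, [1200, 2500, 4800, 5800, 3000])
print(conflicts)  # [3] -- the ridge leaves only 200 ft of clearance
```

A real system does far more, of course: it blends aeronautical databases, obstacle data and the aircraft's predicted trajectory, and it renders the result as 3D imagery rather than a list of warnings.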
Kyle Ellis, an aerospace research engineer at NASA, says it's a more intuitive and familiar way of displaying position information because pilots don't have to cross-reference and interpret multiple 2D displays. "Ever since we've been born we've been looking at the world around us in that 3D way," he said.
In late 2017, NASA partnered with Boeing to test synthetic vision with junior pilots from Colombian airline Avianca on 787 simulators. At the helm with Ellis was Daniel Kiggins, a 34-year captain with American Airlines and a former NASA research pilot, who selected the Avianca pilots because they fly through mountains on a daily basis.
They took to the technology quickly. "All the people we used for the experiment had never been exposed to synthetic vision," Kiggins said. "But when we put them in [the simulator] it was a duck to water."
Synthetic vision helped the Avianca pilots recover from unusual attitudes -- when the plane isn't straight and level -- more gracefully than when they used traditional displays. When flying low near terrain, for example, they could see where the plane was in relation to a mountain, so they knew the aircraft would clear it in time.
But it isn't just about making a pilot's job easier; synthetic vision also has a tangible benefit for passengers.
Kiggins says these displays will make flights smoother, because pilots will know so much more about the environment around them. But they'll also help flight crews recover from extreme situations like a bird strike or flying through a volcanic ash cloud. "They'll know the taxiways, they'll see where they're going, you'll smooth out the turns," he said.
Ultimately, the NASA team's goal is to reduce deaths caused by loss of control in flight and improve safety for passengers. "It's so they don't think of Air France 447 when it gets bumpy in turbulence," Ellis said, referring to a 2009 crash that killed everyone aboard an Airbus A330 that flew into a severe storm over the Atlantic. "And they don't think of [Chesley] Sullenberger" -- the US Airways captain who put a plane down on the Hudson River -- "when they're taking off and see birds flying around," he said.
An infrared camera mounted on the front of the Falcon fills in additional terrain detail over the synthetic vision display, showing objects the pilot might be unable to see in the dark. Those could include other planes or even a coyote on the ground. It's like augmented reality in the cockpit.
The system has a real advantage for passengers, too. If you've ever been on a plane that couldn't land due to low visibility and was forced to perform a go-around, you'll know how frustrating and even nerve-wracking missed approaches can be.
Delayed or diverted flights are also an expensive exercise for airlines. With an infrared overlay, however, pilots can see heat given off by the lights when approaching an airport, so they can land even if it's foggy.
Synthetic vision displays can also assist pilots by displaying helpful symbols such as a roll recovery arrow. I get a demonstration when I hear an audio alert in the cockpit that says "autopilot off" and, before I know it, Weaver starts to roll the plane into a steep, 45-degree turn. My stomach doesn't feel good, but I've been told where the air sickness bags are.
As soon as he starts to bank too far, an arrow pops up on the primary flight display to show Weaver what angle he needs to take to recover. It's a small thing, but can make a big difference if the pilot doesn't recognize how that roll might affect the plane and how he or she needs to correct it.
"If [pilots] roll the wrong direction, push instead of pull, you can be overspeeding the airplane, over G-ing the airplane faster than you anticipate," Weaver said as we level off again. (My stomach also recovers.)
Co-pilot Sandy Wyatt from Honeywell Aerospace says these types of indicators can be beneficial to pilots in high-stress situations, when they're more prone to making a mistake. While planes already have ground proximity warning systems that tell the crew when they're flying too close to terrain, Wyatt says by the time that alert goes off, the pilot's situational awareness is already reduced.
"[In a high-stress moment] your amount of excess processing power just goes down to zero," he said. "You're human."
We're getting used to setting alarms or locking our front door with our voice. Pilots may soon get similar voice control on the flight deck.
Of course, this pilot assistant is much more focused than Siri or Alexa. It responds to specific cockpit control language that was developed to minimize misunderstandings, so you can't set a reminder to pick up milk on the way home or find out who won the Super Bowl.
Voice control in the cockpit can help decrease the workload by letting pilots speak rather than manually enter commands. Honeywell has already tested voice control on one of its Embraer 170 planes, allowing pilots to navigate to different pages in the flight manual. (Rockwell Collins is also bringing voice control to pilots.)
To make it work, a pilot speaks into his or her headset and the command is sent to a neural network for interpretation. Then, an intent processor decides what action to take, such as pulling up a flight checklist, zooming into a map or checking the state of the engines.
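In spirit, the intent-processing stage can be sketched as a lookup from recognized phrases to cockpit actions. The phrases and action names below are invented for illustration; the real system's command vocabulary isn't public in this detail.

```python
# Toy sketch of an intent processor: the speech recognizer's transcript is
# matched against a fixed cockpit vocabulary and routed to an action.
# All phrases and action names here are hypothetical.

INTENTS = {
    "show checklist": "display_flight_checklist",
    "zoom map": "zoom_moving_map",
    "engine status": "show_engine_parameters",
}

def route_command(transcript):
    """Return the action for a recognized command, or None if unrecognized."""
    return INTENTS.get(transcript.strip().lower())

print(route_command("Zoom map"))      # zoom_moving_map
print(route_command("pick up milk"))  # None -- outside the cockpit vocabulary
```

Restricting the vocabulary this way is the point: a command either maps to a defined cockpit action or is rejected outright, which is why you can't ask it who won the Super Bowl.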
I take a seat inside a sound booth where the assistant is set up on two screens. Loud engine noise is piped in to simulate the sound of a real cockpit. I speak my command to get a map of the Phoenix airport and it works seamlessly, even understanding my Australian accent.
Research groups around the world are making voice models for different accents, which will be used to create a dialect database. Honeywell Aerospace senior engineer Roger Burgin says that accuracy is a top priority. "As we build larger and larger vocabularies and put [them] into the neural nets, we can get up into maybe 10 percent word error rate, which is 90 percent accuracy," he said.
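Word error rate is the standard speech recognition metric behind that figure: the number of word-level substitutions, insertions and deletions needed to turn the recognizer's transcript into the reference, divided by the reference length. A minimal sketch of the computation, with an invented radio-style phrase as the example:

```python
def word_error_rate(reference, hypothesis):
    """Word error rate: word-level edit distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Classic dynamic-programming edit distance over words
    # (substitutions, insertions and deletions each cost 1).
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution or match
    return d[len(ref)][len(hyp)] / len(ref)

# One wrong word out of ten: 10 percent WER, roughly 90 percent word accuracy.
wer = word_error_rate("request taxi to runway two five left via alpha bravo",
                      "request taxi to runway two five right via alpha bravo")
print(wer)  # 0.1
```

Note that word accuracy and word error rate only sum neatly to 100 percent when there are no insertions, which is why the "90 percent accuracy" framing is an approximation.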
Vertical takeoff and landing (VTOL) vehicles could remove the need for a runway altogether. There are more than a dozen flying car concepts from companies like Uber Elevate and Bell Helicopter, while Airbus SE has tested an autonomous passenger drone called Vahana.
One technology that could power VTOL is a megawatt generator.
"Helicopters can do [VTOL] fairly efficiently. They've got very large rotors," said Torey Davis, director of advanced technology at Honeywell Aerospace. But with the smaller rotors that will be used for VTOL, you need enough power to be able to lift off and land safely.
Honeywell's megawatt generator weighs 280 pounds and can generate enough power for 100 houses, but its intended use is for hybrid electric propulsion in the Aurora LightningStrike X-Plane. The company expects this technology to be mature enough for production within five to ten years.
You might not be able to see the components made at Honeywell's 3D printing lab on your next flight, but these aircraft parts are lighter than ever before. Saving weight is important for any aircraft as lighter planes use less fuel and can carry more passengers and cargo.
But don't worry, these parts aren't plastic and they aren't printed on a machine you would buy for your home. These industrial machines, some at least 6 feet high, can print materials like titanium or nickel alloys with ease.
To feel some of the weight savings additive manufacturing can bring, I pick up two versions of the same rear engine mount made from a nickel-based alloy. One's been cast at a foundry by pouring the alloy into a mold, while the other has been printed.
They don't look all that different, but I can tell the printed version is much lighter than the cast one. It's actually 62 percent lighter, engineering fellow Don Godfrey tells me. (Honeywell didn't disclose the specific weight of each part.)
Producing a 3D-printed part is faster, as well. A part that might take six months to get from a traditional process can take as little as two weeks when printed.
"If we can print parts rapidly instead of waiting months to order them and receive them, that's where we are going to make the biggest bang for the buck," Godfrey said. That time saving could translate to reduced downtime if an airplane needs a replacement component, and could allow for parts to be made on-demand.
3D printing can also improve aircraft reliability by streamlining the assembly process for complicated parts. Godfrey shows me a thermal anti-ice valve that was printed as one entire piece, rather than lots of different parts bolted together in the original design.
"We removed 52 parts," he said. "A lot of those parts were nuts, bolts, washers, o-rings. But an o-ring is never going to leak if it's not there."
Other manufacturers like Airbus and Boeing are also using 3D-printed components. Airbus installed a titanium-printed bracket on an A350 while Boeing is printing parts for 787 Dreamliners.
All these technologies have the potential to make air travel safer for passengers. Voice controls and enhanced displays have a lot of software under the hood making them work, but the pilot just sees a more intuitive interface. By reducing pilot workload, these systems free pilots to make more accurate, timely decisions in high-stress situations. Which means a safer flight for everyone.