Will the Head-Up Display of the future be a wearable device like the Google Glass phenomenon? Rick Adams looks at developments in simulator and aviation training technology, including visionic systems, tablet-enabled instructor stations, and a “second-gen” motion system.

“Right in front of your eyes” is an expression we’ve all heard over the years, meaning the information we’re looking for should be obvious.

Now, the critical information a civil pilot needs to fly the aircraft need not be on a paper checklist, a cockpit display monitor, or even a monochrome projected head-up display. It is literally right in front of their eyes, displayed on the lenses of special eyeglasses.

Aero Glass, a start-up company with roots in Hungary and Southern California, was the buzz of the Oshkosh air show in Wisconsin this past summer with what may be the first viable “wearable augmented reality” device for civil aviation. Not only did their exhibit attract thousands of interested pilots and a couple hundred product beta testers, they’ve been talking with commercial airlines, training companies, regulators, and the military as well.

What’s the big deal? Using a pair of off-the-shelf US$699.99 Epson Moverio BT-200 binocular, transparent “smart glasses” equipped with a motion tracker, Aero Glass overlays the out-the-windscreen real-world view the pilot sees with an array of essential data and even helpful synthetic vision graphics. For example, during a pre-flight checklist, the glasses will draw colored circles around the referenced instruments or switches. In flight, restricted airspace will have an impenetrable-looking red graphic wall to discourage the pilot from entering the area. During takeoff, climb-out, and approach to landing, colored target boxes depict the desired flight path. The runway may be highlighted in bright green. On the ground, moving arrows show the direction you should taxi. Other air or ground traffic is displayed at its proper altitude. Wherever the pilot turns his or her head, information and visualization about the terrain, traffic, navigation, weather, and so forth is front and center in the field of vision synchronized with the outside world.
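The head-tracking piece of that synchronization can be sketched in a few lines: given a target's real-world bearing and elevation and the current head pose from the motion tracker, the software decides where (and whether) to draw a symbol on the lens. The sketch below is illustrative only, not Aero Glass's actual rendering pipeline; it assumes a simple small-angle mapping, the BT-200's roughly 23-degree field of view, and its 960×540 per-eye resolution.

```python
def overlay_position(target_bearing_deg, target_elev_deg,
                     head_yaw_deg, head_pitch_deg,
                     fov_h_deg=23.0, fov_v_deg=13.0,
                     width_px=960, height_px=540):
    """Map a world-referenced target direction to display pixels for
    the current head pose; return None if it falls outside the lens."""
    # Angular offset of the target relative to where the head points,
    # wrapped to [-180, 180) so north-crossing bearings behave.
    dyaw = ((target_bearing_deg - head_yaw_deg + 180.0) % 360.0) - 180.0
    dpitch = target_elev_deg - head_pitch_deg
    if abs(dyaw) > fov_h_deg / 2 or abs(dpitch) > fov_v_deg / 2:
        return None  # outside the field of view: don't draw the symbol
    # Simple small-angle (linear) degrees-to-pixels mapping.
    x = width_px / 2 + dyaw * (width_px / fov_h_deg)
    y = height_px / 2 - dpitch * (height_px / fov_v_deg)
    return (x, y)
```

A target dead ahead of the pilot's gaze lands at the center of the display; one 90 degrees off to the side returns None and is simply not drawn until the pilot turns toward it.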

Aero Glass is “working out the kinks” in the first version of the software, targeting the first quarter of 2015 for the Version 1.0 release for the general aviation market, Chief Operating Officer and VP Business Development Cameron Clarke told me, possibly with demonstrations at the trend-setting Consumer Electronics Show in January. They have also been entertaining partnerships, including using Osterhout Design Group (ODG) glasses running Qualcomm Vuforia software for higher-end applications.

Unlike Google Glass, which features a single small monitor near one eye, the Epson and ODG glasses offer “full binocular 3D vision,” Clarke notes.

Embry-Riddle Aeronautical University professor Timothy A. Sestak, an Aero Glass advisor, remarks, “This technology is exactly what the general aviation pilot needs: a way to make ‘situation awareness’ easier and to simplify the interpretation and integration of the myriad points of information that a pilot must deal with before, during, and after flight. Professional pilots will also benefit from the simple direct cues available. And with the tightened clearances and tolerances of NextGen, the ability to precisely navigate in all four dimensions will become even more critical. The integrated ‘picture’ of what is happening around them and the ability to emphasize the most time-critical parameters in an integrated total picture are what pilots will need to make the right decisions at the right time.”

Aero Glass was conceived in Hungary by Akos Maroy, whose interests range from an Air Transport Pilot License to architecture to investigative journalism. Befitting a technology start-up, Maroy reached out to San Diego geospatial software engineer Jeffrey Johnson via an online tech forum. Fittingly, I connected with COO Clarke through Skype tablet-to-tablet video conferencing.

But lest we get too excited about the real-world applications of the new smart glass technology, keep in mind another ambitious start-up, Aerocross Systems of McKinney, Texas, which announced its intent to debut “Brilliant Eyes” as “the world’s first augmented reality head-mounted display” at Oshkosh a year earlier … and has been oddly quiet since.

NASA’s Langley Research Center in Virginia worked on a crude “miniature head-up display” as early as 2006, using a single eyepiece much like Google Glass. A Boeing 747 captain who assisted with the system’s testing complained, though, that he became so engrossed in the synthetic view inside the small display that he completely forgot to look at the big picture outside the cockpit!

Among the likely applications for smart glasses is aviation maintenance (and therefore maintenance training): hands-free access to technical manuals while working on the aircraft, as well as interaction with maintenance information systems for updating job status, ordering parts, etc. An Indian company, Ramco Systems, has created a tool that allows an airline engineering crew with Google Glass devices and smart watches to retrieve a list of to-do items and order replacement parts “even before passengers disembark.” Austin, Texas-based Pristine says its secure, enterprise-grade software empowers senior training staff to stream secure, first-person video through Google Glass to aviation support personnel anywhere in the world. Engineers at GE Aviation’s training facility in Cincinnati, Ohio have been testing Glass on jet engine inspections, a task where workers find it difficult to stop and check information on a computer.

Another potential new training technology is Oculus Rift, sort of a swimmer’s facemask with a smartphone-style screen as the display. Rift is now owned by Facebook (so perhaps anything you look at during an immersive training experience will be recorded for marketing purposes). Danish company Aviation eLearning had an Oculus Rift virtual reality display in its booth at the European Airline Training Symposium (EATS) in Berlin in late October, so I gave it a spin. Despite removing my eyeglasses – Oculus has not yet come up with an adequate solution for those of us without perfect eyesight – the demonstration of an aircraft walk-around was compelling. Using a game controller to steer the binocular view, it felt a bit like I was navigating a Segway two-wheeler. Nonetheless, I was able to move up close to “inspect” an engine, even “climb” the airstairs. As I approached an active runway, I could hear the increasing volume from an aircraft about to perform a touch-and-go.

In the traditional flight simulator visual system realm, Frasca has added animated shorelines, moving vegetation, light points, and luminance maps to its TruVision Global image generator, enhancing training realism. The base map already included over 10,000 runways, coastlines, representative terrain, rivers, and roads around the world.

RSI Visual Systems recently doubled its manufacturing space, moving to a new headquarters in the Dallas, Texas area. RSI offers an “airport currency service” for its XT image generator, which is used by Textron’s TRU Simulation + Training and other customers. RSI assumes responsibility for maintaining accurate and current airport models from industry, government, airline, and military sources, and makes updates available for FTP direct download. Fees are based on the number of airports actually “in training.”

Have Tablet, Will Travel

Flight simulator manufacturers CAE, FlightSafety International, and Frasca have recently focused their design efforts on a long-overlooked part of the sim: the instructor-operator station (IOS). A few years ago, the major IOS advancement was touch-screen monitor technology. Today it’s tablets.

Frasca International’s new IOS is designed for Windows 8-based touch screens, and an instructor can control the simulation from the tablet device because the instructional control software is “native,” not a remote connection. The tablet can be mounted on the arm of the instructor’s chair, or he/she can use traditional screens in either a forward- or sideways-facing mode. Sikorsky S-76 rotary-wing instructors at Bristow Helicopters in Scotland are the first to use the new Frasca IOS design.

FlightSafety’s next-gen IOS is “a totally independent island” with “maximum flexibility to position monitors at different angles – it’s like setting a rear-view mirror,” according to John Van Maren, Vice President, Simulation. Overall, the instructor area is also considerably more spacious. “The entire back end was redesigned from the customer perspective.”

CAE’s IOS is movable. Bruno Cacciola, Director of Product Strategy and Marketing, calls it “a chair with displays.” Instructors can bring their personal Apple iPad or iPhone into the simulator as sort of a third display monitor, and they can synchronize their device with the instructor station software. CAE’s enhanced IOS also features an automatic event capture system, much like a flight data recorder in an aircraft, an adaptation of CAE’s simulator operations quality assurance (SOQA) capability but now with the option of immediate debrief in the sim instead of waiting until the end of the hours-long session.

Level D as in Dutch

Since the shift from hydraulic to electric systems over the past several years, there seems to have been relatively little movement, so to speak, in flight simulator motion system technology. But now a Dutch company, E2M Technologies, has developed a “second generation” Level D-qualified six-degree-of-freedom (6DOF) motion platform which claims improved cueing, reliability, and diagnostics. E2M stands for “Electric to Move.”

Leveraging experience gained across 20 years with the former Fokker Control Systems (FCS), their engineers “did extensive research to make the actuator as simple as possible,” E2M international account manager Ton Stam told CAT. They worked with both a screw manufacturer and a motor manufacturer, whom they declined to identify, to refine the ball screws and other elements of the system. Then E2M applied its own patented Direct Workspace Management (DWM) software algorithms to achieve a “larger workspace” for motion cueing.
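The geometry underlying any six-actuator (“hexapod”) motion base is the same regardless of drive technology: for each commanded platform pose, the controller solves the inverse kinematics, i.e. the length every actuator must take so the platform joints land where the pose demands. A minimal sketch of that standard calculation follows; it is not E2M’s proprietary software, and the joint layouts are left to the caller.

```python
import math

def rotation_matrix(roll, pitch, yaw):
    """Z-Y-X Euler rotation matrix (angles in radians)."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy*cp, cy*sp*sr - sy*cr, cy*sp*cr + sy*sr],
        [sy*cp, sy*sp*sr + cy*cr, sy*sp*cr - cy*sr],
        [-sp,   cp*sr,            cp*cr],
    ]

def actuator_lengths(pose, base_pts, plat_pts):
    """Inverse kinematics of a 6DOF hexapod: each actuator's length is
    the distance from its base joint to its platform joint after the
    platform is rotated and translated into the commanded pose."""
    x, y, z, roll, pitch, yaw = pose
    R = rotation_matrix(roll, pitch, yaw)
    lengths = []
    for b, p in zip(base_pts, plat_pts):
        # Platform joint expressed in the base frame: translation + R @ p
        wx = x + R[0][0]*p[0] + R[0][1]*p[1] + R[0][2]*p[2]
        wy = y + R[1][0]*p[0] + R[1][1]*p[1] + R[1][2]*p[2]
        wz = z + R[2][0]*p[0] + R[2][1]*p[1] + R[2][2]*p[2]
        lengths.append(math.dist((wx, wy, wz), b))
    return lengths
```

With the platform level and centered above the base, every leg comes out the same length; rolling or pitching the pose immediately lengthens some legs and shortens others, which is exactly the command stream the ball-screw actuators must track.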

The DWM uses model predictive control for “washout” (a motion system’s ever-present push towards the center). It adapts the intensity and direction of the washout cue depending on the present motion cues, the available workspace, and the present and future predicted position of the motion system relative to the workspace boundaries. “As the motion system moves, the shape and size of the workspace continuously change,” says Stam. “It's like walking in a room where all the walls, ceiling, and floor are constantly moving toward and away from you. DWM is about avoiding any contact with these ‘walls’ while moving through the room.”
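E2M’s DWM algorithms are patented and unpublished, but the washout idea itself is standard: high-pass filter the simulated accelerations so that onset cues reach the pilot while sustained inputs decay, letting the platform drift back toward center within its limited workspace. A minimal first-order sketch of that classical (non-predictive) washout, for illustration only:

```python
import math

class WashoutFilter:
    """First-order high-pass washout: onset accelerations pass through
    to the motion platform, but sustained input is 'washed out' so the
    platform drifts back toward its centered, neutral position."""

    def __init__(self, cutoff_hz, dt):
        rc = 1.0 / (2.0 * math.pi * cutoff_hz)
        self.alpha = rc / (rc + dt)  # filter coefficient, 0 < alpha < 1
        self.prev_in = 0.0
        self.prev_out = 0.0

    def step(self, accel_in):
        # Standard discrete first-order high-pass filter update.
        out = self.alpha * (self.prev_out + accel_in - self.prev_in)
        self.prev_in = accel_in
        self.prev_out = out
        return out
```

Feed it a step input, such as a sustained takeoff acceleration, and the first samples come through almost unattenuated (the onset cue the pilot feels), while the output then decays toward zero as the cue is washed out. DWM’s contribution, per Stam, is adapting that washout continuously to the changing shape of the remaining workspace rather than using fixed filter settings.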

An evolution of DWM to further improve the realism of flight simulators will be available mid-2015.

E2M CEO Jan van Bekkum touts “a record level of smoothness for electric systems” and the “lowest turn-around bump,” though comparing competitor motion systems is often an apples-to-oranges exercise. Van Bekkum also said E2M offers the “only washout technology working completely at the platform level.”

E2M’s design uses a “unique upper joint concept for a lower center of gravity and eye reference point,” as well as a non-mechanical concept for safety. All of the commercial off-the-shelf (COTS) electronics used in the E2M 1400-14000 Level D system are self-diagnosing. Stam says this means faster troubleshooting and reduced downtime.

E2M collaborated with CAE, who incorporated the new motion system into a prototype simulator, further refining the technology. CAE subsequently placed an order for “a large number of systems,” and the first Level D simulator with E2M’s motion base is expected to be installed in January 2015. Other E2M motion system buyers have included Lockheed Martin-owned Sim Industries, Austria’s Axis Systems, and an undisclosed military simulator supplier.

E2M has also delivered EM3-ROT-1500-HF systems for Level D helicopter simulators. The 3DOF systems can simulate rotor-related failures which a pilot often detects through a change in perceived, seat-of-the-pants vibration.

FlightSafety has also been doing some motion systems research, using a tool and new software algorithms on their FS1000 Level D device “to optimize cues to match aircraft data much more closely,” Van Maren said, calling the enhanced results “shocking” when compared to the standard physics-based legacy algorithm.

CAE is using a new “one of a kind” electrical vibration system in their 3000 Series helicopter simulators, claiming the electric motion and vibration saves operators as much as US$20,000-30,000 per year compared with hydraulic motion systems.