Forward march

Special Report: Rockwell Collins Avionics

Aircraft and avionics technology has made a steady and, in the last few years, dramatic progression from the electro-mechanical instrumentation that was all-pervasive 20 years ago to electronic flight information systems (EFIS). As Craig Peterson, Rockwell Collins Director, Avionics and Marketing, notes, “Very swiftly, over the course of relatively few years, we moved from the initial step, which involved putting all the instrumentation onto CRT-based displays driven by computer systems, to the HSI [horizontal situation indicator] and, ultimately, to moving maps. The latter completely transformed the pilot’s positional awareness as to where the aircraft was relative to the final destination.”

He points out that the first commercial airliner to be equipped with these new systems was the Boeing 767, in the mid-1980s. In business aviation, the Beechcraft Starship was the innovator. The Starship was a twin-turboprop, six-to-eight passenger business aircraft which made its first flight in February 1986. From there, EFIS evolved to the point where CRTs started growing in size and functionality, adding information that went beyond the basics of airspeed, attitude and navigation, such as pressure and temperature, as well as data on the aircraft’s pneumatic and power systems. “We started to integrate these display feeds into systems state-awareness displays that helped the crew in the running and oversight of multiple systems in the aircraft,” he says.

Through the late 1980s and early 1990s the underlying technology shifted from CRT-based displays to LCDs, which brought real benefits in terms of lower weight, lower cost and lower power, along with the ability to go up in size without a massive increase in weight and cost. Today’s cockpit displays can be upwards of 18-19 inches, and the orientation has switched from portrait to a more natural landscape format. The sheer number of data sets on these displays has exploded, so they now have the ability to switch back and forth between all manner of visual aids to assist the crew in monitoring various elements of the airframe and the aircraft’s progress.

Today’s systems are a kind of synthesis of the external world and the databases holding the various geographical and informational data sets. “The power of today’s computer processors and their internal bus structures allows an almost virtual reality-like depiction of the outside world, showing the geophysical features of the terrain in ways that are intuitively easy for the pilot to grasp. The data sets make it possible to show political boundaries laid over the geographic terrain, the point where you would be crossing into another country’s airspace and so on.” Moreover, as Peterson points out, all this information is available regardless of the external weather conditions. The aircraft could be travelling through heavy cloud and the pilot would still have all the terrain features displayed.

Alongside the development and transformation of the cockpit displays, the biggest change has been the migration of heads-up displays, or HUDs, from commercial airliners to business aviation jets. “The truly intuitive thing about a HUD is that the pilot is seeing this synthesis of the outside world and geometry, plus key flight data, while looking through the HUD at the outside world,” he explains. HUDs have a long history and are still evolving, with newer display technologies emerging all the time. However, one of the early and still crucial features of a HUD is collimation, an optical process that takes the projected image and makes its light rays parallel. This might sound technical, but what it does is remove the need for the pilot’s eyes to refocus when switching between viewing the HUD and scanning the view outside the cockpit window. Because the light rays are parallel, the focus is at infinity, which is where the eyes focus when looking into the distance through a window. So the projected image seems to be “out there”, and both the image and the scene beyond the window remain in sharp focus for the pilot. The HUD display is very precisely aligned with the three axes of the aircraft, so projected runway lights, for example, align very precisely with the real runway lights as the aircraft approaches.
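The principle behind collimation can be illustrated with the thin-lens equation, 1/f = 1/d_obj + 1/d_img: as the projected image is moved towards the focal plane of the collimating optics, the image distance runs off to infinity and the emerging rays become parallel. A minimal sketch of that relationship (the focal length and distances here are purely illustrative, not taken from any real HUD design):

```python
# Thin-lens equation: 1/f = 1/d_obj + 1/d_img.
# As the object (the projected HUD image) approaches the focal plane,
# the image distance grows without bound, i.e. the rays leave parallel
# and the virtual image appears focused "at infinity".

def image_distance(f, d_obj):
    """Image distance for focal length f and object distance d_obj (same units)."""
    return 1.0 / (1.0 / f - 1.0 / d_obj)

f = 100.0  # focal length in mm (illustrative)
for d_obj in (200.0, 150.0, 110.0, 101.0, 100.1):
    print(f"object at {d_obj:6.1f} mm -> image at {image_distance(f, d_obj):10.1f} mm")
```

Running the sketch shows the image distance climbing steeply as the object distance closes on the focal length, which is why the display source in a collimated HUD sits at the focal plane of its optics.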

The HUD concept was developed and commercialised by a subdivision of Rockwell Collins, originally called Flight Dynamics, which started life as an independent company in the 1980s. The company called its HUD the Heads-up Guidance System (HGS®). It was certified on the Boeing 727 in 1985, received CAT III operational approval in 1987 and was put into operational use that same year. “What the HGS system initially did for the industry was to fill a void for airplanes of an older generation that wanted the advantages of being certified for Category III (CAT III) automatic landing system minima.” The HGS introduced by Flight Dynamics was recognised by the FAA as enabling hand-flown CAT III approaches, since it provides the pilot with the necessary electronic guidance to land the airplane in the absence of true outside visual references. The HGS uses guidance cues from the airport’s ILS, and the crew have sufficient warning, if a safe landing is in doubt, to abort and go around. This ability to get to CAT III made the HUD very popular with ‘feeder’ airlines, and it saw a rapid take-up from regional jet manufacturers. One or two business jet OEMs also saw the benefits and were early users.

Dean Schwab, one of Rockwell Collins’ experts on HUD technology, points out that the company now has some 4,500 HUD systems in active use on about 42 different aircraft models. “The primary benefit to the crews is safety. The pilot can see very quickly what the state of the aircraft is, and can maintain the aircraft’s stability and adherence to the correct flight path on the approach to the runway. In particular, it has added a huge margin of safety to so-called ‘black-hole’ approaches, where there is very little by way of visual cues to help the pilot find the runway.” A long, straight-in approach at night over featureless terrain to a brightly lit runway is the classic instance of the black-hole approach. Pilots know that they cannot rely upon their physical senses when approaching with minimal visual cues. Whenever the human eye is travelling faster than the conditions it evolved for, i.e. a walking pace of three to four miles per hour at ground level, and all the more so when the vantage point is high in the air, visual miscues abound. The HUD eliminates the problem by ‘cueing’ the pilot with flared dashes highlighting the correct approach to an accurate, electronic graphic of the runway, while simultaneously displaying speed and altitude.

The huge danger in ‘black-hole’ landings is that instead of following a normal three-degree approach path, and allowing the visual angle to the runway to steepen as it naturally does, the pilot has a compelling tendency to keep that visual angle constant. Do this and you actually fly the descending arc of a circle whose end point is a few miles short of the runway, with very unpleasant consequences. If this description puzzles you, try drawing a diagram: make two sides of a triangle the three-degree approach path and the runway, then draw lines from the end of the runway to various heights on the glide path, and you will see the angle steepening through the descent. HUDs eliminate the problem by providing a different set of visual cues, chiefly the glideslope reference line, which keep the pilot flying a proper flight path and oriented to the real end of the runway. From a safety perspective it is hard to overemphasise the difference, which can literally mean the difference between life and death for everyone on board. (For an excellent article on the black-hole phenomenon, see Linda Pendleton (April 2000) on AVweb: www.avweb.com/news/airman/182402-1.html)
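The diagram can also be worked through numerically. The sketch below assumes a pilot who holds the runway’s apparent (subtended) angle constant at the value seen 10 nm out, instead of flying the three-degree glideslope; the runway length and distances are illustrative, and a simple bisection search recovers how far below the glideslope that habit places the aircraft:

```python
import math

GS_DEG = 3.0   # nominal glideslope angle, degrees
L = 1.5        # runway length in nm (illustrative)

def glideslope_alt(d):
    """Altitude (nm) on the three-degree path, d nm from the threshold."""
    return d * math.tan(math.radians(GS_DEG))

def subtended(h, d):
    """Angle (rad) the runway subtends, seen from altitude h at distance d."""
    return math.atan2(h, d) - math.atan2(h, d + L)

def const_angle_alt(theta0, d):
    """Bisect for the altitude at which the runway subtends exactly theta0."""
    lo, hi = 0.0, glideslope_alt(d)   # the answer lies below the glideslope
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if subtended(mid, d) < theta0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

theta0 = subtended(glideslope_alt(10.0), 10.0)  # apparent angle 10 nm out
for d in (8.0, 6.0, 4.0, 3.0):
    print(f"{d:4.1f} nm out: glideslope {glideslope_alt(d) * 6076:5.0f} ft, "
          f"constant-angle path {const_angle_alt(theta0, d) * 6076:5.0f} ft")
```

The constant-angle path ends up hundreds of feet below the glideslope well before the threshold, which is exactly the dip the HUD’s glideslope reference line is there to prevent.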


For operators and owners of business aircraft the HGS can be very important, since it is the only system that allows aircraft to take off in less than 500 ft of runway visual range (RVR). “For scheduled airlines and regional airlines, the impact of not being able to fly because you do not have a system that will let you take off when visibility drops below an RVR of 500 ft can be huge. You can get ripple effects in your schedule that run on for days if you cannot get the aircraft away,” he comments.

Rockwell Collins sees HGS moving rapidly over the next few years from a nice-to-have option on the flight deck to an absolutely integral part of the aircraft’s avionics. The technology, too, is changing and developing: Rockwell now has a substrate-guided wave system for its HGS, in which the image is guided through the glass itself, eliminating the need for a projector.

The future of avionics as a whole is for more and more information to be synthesised onto the display in an entirely intuitive fashion. “The continual march of information and data, and now synthetic vision with input from external infrared cameras, for example, is simply part of avionics. Hi-res maps will increase in fidelity and granularity. We are also seeing technologies beginning to help flight crews in decision-making: computers that analyse traffic in the vicinity, that look at environmental threats and hazards – there is so much that could make things easier for pilots,” Peterson says. Ultimately, he suggests, the march is towards utilising unmanned aerial vehicle technologies to enhance the overall safety of the aircraft and to bring further refinements to pilot situational awareness, to the point where flight crew sizes may well reduce.
