Flight simulation is the artificial re-creation of aircraft flight and various aspects of the flight environment. This includes the equations that govern how aircraft fly, how they react to applications of flight controls and other aircraft systems, and how they respond to the external environment, such as air density, turbulence, cloud, precipitation, etc. Flight simulation is used for a variety of purposes, including flight training (mainly of pilots), the design and development of the aircraft itself, and research into aircraft characteristics, control handling qualities, and so forth.
Flight simulations have varying degrees of hardware, modelling detail and realism that depend on their purpose. They can range from PC laptop-based models of aircraft systems, to simple replica cockpits for familiarisation purposes, to more complex cockpit simulations with some working controls and systems, to highly detailed cockpit replications with all controls and aircraft systems and wide-field outside-world visual systems, all mounted on six degree-of-freedom (DOF) motion platforms which move in response to pilot control movements and external aerodynamic factors.
The early visual systems used a small physical terrain model, normally called a "model board". The model board was illuminated, typically by an array of fluorescent light tubes (to avoid shadowing), and a miniature camera was moved over the model terrain in accordance with the pilot's control movements. The resultant image was then displayed to the pilot. Only limited geographical areas could be simulated in this manner, and in civil flight simulators these were usually limited to the immediate vicinity of an airport or airports. In military flight simulators, model boards were produced not only for airfields but also for larger areas that included terrain for practicing low flying and attacking targets. During the Cold War between NATO and the Warsaw Pact powers, some model boards covering large areas of real terrain were produced before being superseded by digital image generation systems.
The motion system in the 1929 Link Trainer design gave movements in pitch, roll and yaw, but the payload (the weight of the replica cockpit) was limited. For flight simulators with heavier cockpits, the Link Division of General Precision Inc. (later part of Singer Corporation and now part of L-3 Communications) developed a system in 1954 in which the cockpit was housed within a metal framework that provided three degrees of angular displacement in pitch, roll, and yaw. By 1964, improved versions of this system provided displacements of up to 10 degrees.
It was found that six jacks in the appropriate layout could produce all six degrees of freedom possible for a freely moving body: the three angular rotations pitch, roll and yaw, and the three linear movements heave (up and down), sway (side to side) and surge (fore and aft). The design of such a 6-jack (hexapod) platform was first used by Eric Gough in 1954 in the automotive industry and was further refined by Stewart in a 1966 paper to the UK Institution of Mechanical Engineers (see Stewart platform).
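To illustrate the geometry (this sketch is not from the source), the required length of each of the six jacks for a commanded platform position and attitude can be found by straightforward inverse kinematics; the joint layout used in the example is a placeholder chosen for illustration only.

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Platform attitude as a combined rotation matrix (Z-Y-X convention)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def jack_lengths(base_pts, platform_pts, translation, roll, pitch, yaw):
    """Hexapod inverse kinematics: length of each jack for the commanded
    translation (surge, sway, heave) and rotation (roll, pitch, yaw).

    base_pts and platform_pts are (6, 3) arrays of joint positions in the
    base and platform frames respectively.
    """
    R = rotation_matrix(roll, pitch, yaw)
    # Each leg runs from its base joint to the rotated-and-translated platform joint.
    legs = (platform_pts @ R.T) + translation - base_pts
    return np.linalg.norm(legs, axis=1)

# Example: small pitch-up attitude with the platform raised, using notional joint layouts.
base = np.array([[2, 0.5, 0], [2, -0.5, 0], [-0.6, -2, 0],
                 [-1.4, -1.5, 0], [-1.4, 1.5, 0], [-0.6, 2, 0]], dtype=float)
plat = 0.6 * base
print(jack_lengths(base, plat, np.array([0.0, 0.0, 1.8]),
                   roll=0.0, pitch=np.radians(5), yaw=0.0))
```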
From about 1977, aircraft simulators for Commercial Air Transport (CAT) aircraft were designed with ancillaries such as the Instructor Operating Station (IOS), computers, etc., being placed on the motion platform along with the replica cockpit, rather than being located off the motion platform.
In 1972, the Singer-Link company, headquartered in Binghamton, New York, developed a display unit that produced an image at a distant focus. It took the image from a TV screen and displayed it through a collimating optical arrangement consisting of a curved mirror and a beamsplitter. The focal distance seen by the user was set by the amount of vertical curvature of the mirror. These collimated display systems improved realism and "depth perception" for visual scenes that included distant objects.
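As a general optics relation (not a detail given in the source), a concave spherical mirror of radius of curvature $R$ has focal length

$$f = \frac{R}{2},$$

so placing the displayed image at or just inside this focal distance makes the reflected light nearly collimated and the virtual image appear at a large, adjustable distance from the viewer.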
Optical infinity

This was achieved by adjusting the focal distance so that it lay at or beyond what is sometimes referred to as "optical infinity", generally taken as about 30 feet or 9 meters. In this context, "optical infinity" is the distance at which, for the average adult, the angle of view of an object is effectively the same from the left and right eyes. For objects closer than this, the angle of view is different for each eye, allowing the brain to process scenes stereoscopically, that is, in three dimensions. The inference is that, in simulation display technology, for scenes with objects that in the real world are more than about 9 meters / 30 feet away, there is little advantage in using two-channel imagery and stereoscopic display systems.
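As a rough worked example (assuming a typical adult interpupillary distance of about $p = 65$ mm, a figure not given in the source), the vergence angle between the two eyes' lines of sight to an object at distance $d$ is

$$\theta = 2\arctan\!\left(\frac{p}{2d}\right) \approx 2\arctan\!\left(\frac{0.065}{2 \times 9}\right) \approx 0.4^{\circ}$$

at $d = 9$ m, i.e. the two lines of sight are already very nearly parallel at that range.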
Collimated monitor design

The 1972 Singer-Link collimated monitors had a horizontal field of view (FoV) of about 28 degrees. In 1976, wider-angle units with a 35-degree horizontal FoV were introduced; these were called 'WAC windows', standing for 'Wide Angle Collimated', and the term became widely used. Several WAC Window units would be installed in a simulator to provide an adequate field of view for flight training. Single-pilot trainers would typically have three display units (center, left and right), giving a FoV of about 100 degrees horizontally and between 25 and 30 degrees vertically.
Viewing volume and the user's eye-point

For all of these collimated monitor units, the region from which the user had a correct view of the scene (the "viewing volume" around the user's "eye-point") was quite small. This was no problem in single-seat trainers, because the monitors could be positioned correctly for the pilot's average eye-point. However, in multi-crew aircraft with pilots seated side by side, each pilot could see the correct outside-world scene only through the collimated monitors positioned for that pilot's own eye-point. If a pilot looked across the cockpit towards the other pilot's display monitors, the viewing angle fell so far outside the viewing volume of those monitors that distortions or even "black holes" were seen. Clearly, for simulators with side-by-side crew, a system that gave correct cross-cockpit viewing was required.
Engineering flight simulators are used by aerospace manufacturers for such tasks as:
- Development and testing of flight hardware. Simulation (emulation) and stimulation techniques can be used; in the latter, real hardware is fed artificially generated or real signals ("stimulated") in order to make it work. Such signals can be electrical, RF, sonar, etc., depending on the equipment to be tested (a minimal sketch of the stimulation approach follows this list).
- Development and testing of flight software. It is much safer to develop and test critical flight software on simulators, or using simulation techniques, than on an actual aircraft in flight.
- Development and testing of aircraft systems. For electrical, hydraulic, and flight control systems, full-size engineering rigs sometimes called 'Iron Birds' are used during the development of the aircraft and its systems.
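As a minimal sketch of the stimulation approach mentioned above (not a description of any particular manufacturer's rig; the port name, data rate, frame format and signal profile are all assumptions for illustration), an artificially generated sensor value might be streamed to a unit under test over a serial link:

```python
import struct
import time
import numpy as np
import serial  # pyserial

PORT = "/dev/ttyUSB0"  # hypothetical serial link to the unit under test

def synthetic_airspeed(t):
    """Artificially generated airspeed profile (knots): steady cruise plus light turbulence."""
    return 250.0 + 5.0 * np.sin(0.2 * t) + np.random.normal(0.0, 0.5)

def stimulate(duration_s=10.0, rate_hz=20.0):
    """Stream generated values to the hardware at a fixed rate."""
    with serial.Serial(PORT, 115200, timeout=1) as link:
        t0 = time.time()
        while (t := time.time() - t0) < duration_s:
            # Pack the value in a simple little-endian float frame (assumed format).
            link.write(struct.pack("<f", synthetic_airspeed(t)))
            time.sleep(1.0 / rate_hz)

if __name__ == "__main__":
    stimulate()
```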
A full flight simulator (FFS) duplicates relevant aspects of the aircraft and its environment, including motion. This is typically accomplished by placing a replica cockpit and visual system on a motion platform. A six degree-of-freedom (DOF) motion platform using six jacks is the modern standard, and is required for the so-called Level D flight simulator standard of civil aviation regulatory authorities such as the FAA in the USA and EASA in Europe. Since the travel of the motion system is limited, a principle called 'acceleration onset cueing' is used. This simulates initial accelerations well, and then returns the motion system to a neutral position at a rate below the pilot's sensory threshold in order to prevent the motion system from reaching its limits of travel.
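A minimal sketch of the idea behind acceleration onset cueing is given below. It uses a simple first-order "washout" high-pass filter rather than the classical multi-channel washout algorithms found in real Level D simulators, and the time constant and sample rate are illustrative assumptions.

```python
import numpy as np

def washout_highpass(accel, dt, tau=2.0):
    """First-order high-pass ('washout') filter on commanded acceleration.

    The initial onset of an acceleration is passed to the motion platform,
    then the commanded cue decays toward zero so the platform can be eased
    back to neutral within its limited travel. Real washout algorithms also
    limit the return rate to stay below the pilot's sensory threshold.
    """
    alpha = tau / (tau + dt)
    filtered = np.zeros_like(accel)
    for k in range(1, len(accel)):
        filtered[k] = alpha * (filtered[k - 1] + accel[k] - accel[k - 1])
    return filtered

# Example: a sustained 2 m/s^2 surge is cued at onset and then washed out.
dt = 0.01
t = np.arange(0.0, 10.0, dt)
aircraft_accel = np.where(t > 1.0, 2.0, 0.0)   # step input at t = 1 s
platform_accel = washout_highpass(aircraft_accel, dt)
```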
Flight simulator motion platforms formerly used hydraulic jacks, but electric jacks are now being adopted. These do not need hydraulic motor rooms and the other complications of hydraulic systems, and can be designed to give lower latencies (transport delays). Level D flight simulators are used at training centers such as those provided by Airbus, FlightSafety International, CAE, Boeing Training and Flight Services (ex-Alteon) and at the training centers of the larger airlines. In the military, motion platforms are commonly used for large multi-engined aircraft and also for helicopters, except where a training device is designed for rapid deployment to another training base or to a combat zone.
Statistically significant assessments of training transfer from simulator to aircraft are difficult to make, particularly where motion cues are concerned. Large sample sizes of pilot opinion are required, and many subjective opinions tend to be aired, particularly by pilots not used to making objective assessments and responding to a structured test schedule. However, it is generally agreed that motion-based simulation gives the pilot higher fidelity of flight control operation and of aircraft responses to control inputs and external forces. This is described as "handling fidelity", which can be assessed by test flight standards such as the numerical Cooper-Harper rating scale for handling qualities. Generally, motion-based aircraft simulation feels like being in an aircraft rather than in a static procedural trainer. In a restructuring of civil flight training device characteristics and terminology due to take place in about 2012, the Level D full flight simulator will be re-designated an ICAO Type 7 device and will have improved specifications for both motion and visual systems. This is the result of a rationalisation of worldwide civil flight training devices through which 27 previous categories have been reduced to seven.
Most simulators have Instructor Operating Stations (IOS). At the IOS, an instructor can quickly create any normal and abnormal condition in the simulated aircraft or in the simulated external environment. This can range from engine fires, malfunctioning landing gear, electrical faults, storms, downbursts, lightning, oncoming aircraft, slippery runways, navigational system failures and countless other problems which the crew need to be familiar with and act upon.
Many simulators allow the instructor to control the simulation from within the cockpit, either from a console behind the pilots' seats or, in some simulators, from the co-pilot's seat on sorties where a co-pilot is not being trained. Some simulators are equipped with PDA-like devices with which the instructor can sit in the co-pilot's seat and control the events of the simulation without interfering with the lesson.
Flight simulators are an essential element in individual pilot as well as flight crew training. They save time, money and lives. The cost of operating even an expensive Level D full flight simulator is many times less than that of training on the aircraft itself; a cost ratio of some 1:40 has been reported for Level D simulator training compared with training in a real Boeing 747.
A specific field of flight simulation concerns radio-controlled (RC) models, mostly airplanes and helicopters, that are steered from the ground by a transmitter. Here, a major part of the pilot's learning curve is the coordination between finger movements, typically on two two-axis joysticks (one commonly controlling throttle and yaw, the other pitch and roll), and the model's reaction. RC flight simulator software therefore takes its input, usually via a USB cable, from the same transmitter that is used for the actual model. With this setup the pilot can learn basic flying procedures without the risk of damaging the model. Current products use photographic textures from real flying fields and reproduce the model's behaviour and the weather with detailed physics models.
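As an illustration of the transmitter-as-USB-input idea (not taken from any particular product; the axis numbering and polling rate are assumptions and vary between transmitters), a simulator could poll the transmitter as an ordinary game controller:

```python
import pygame

# Read stick positions from an RC transmitter that enumerates as a USB game controller.
pygame.init()
pygame.joystick.init()
tx = pygame.joystick.Joystick(0)   # first connected controller
tx.init()

try:
    while True:
        pygame.event.pump()                 # refresh device state
        throttle = tx.get_axis(0)           # axis indices are assumptions;
        rudder = tx.get_axis(1)             # they differ between transmitters
        elevator = tx.get_axis(2)
        aileron = tx.get_axis(3)
        print(f"thr {throttle:+.2f}  rud {rudder:+.2f}  "
              f"ele {elevator:+.2f}  ail {aileron:+.2f}")
        pygame.time.wait(50)                # ~20 Hz polling
finally:
    pygame.quit()
```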