Very Large Format Stereoscopic Head-Up Display for the Airport Tower

Stephen Peterson, EUROCONTROL Experimental Centre, Brétigny-sur-Orge, France (stephen.peterson@eurocontrol.int)
Magnus Axholt, EUROCONTROL Experimental Centre, Brétigny-sur-Orge, France (magnus.axholt@eurocontrol.int)
Stephen R. Ellis, NASA Ames Research Center, Moffett Field, California, United States (sellis@mail.arc.nasa.gov)

Abstract—Head-up displays typically used in aircraft and automobiles have limited fields of view, generally less than 30°. Such systems can provide large, essentially unlimited fields of regard if they are head-mounted. A new alternative for provision of large fields of regard without requiring a head mount is a large format holographic optical element that can serve as a see-through, polarization-preserving diffusion screen. It may be adaptable to very large formats (2 m). The see-through displays it can support will not provide accommodative relief but will avoid cumbersome head-mounted optics. Some of the psychophysical aspects of this display technology, e.g., luminance, distortion, visual resolution loss, and depth rendering biases, have been investigated and are reported below as part of a project to design a practical see-through display that may be used in an airport tower.

I. INTRODUCTION

Head-up displays (HUDs) may prove a valuable addition to or substitute for current information systems in the control tower. The typical HUD as found in cockpits is a see-through display that superimposes information about the aircraft and flight status onto the forward out-of-the-window view, minimizing the number of head-down movements needed to obtain information. Some displayed elements of aircraft HUDs are conformal imagery, meaning that the superimposed symbols appear attached to real objects in the window view. See-through display systems have been proposed for a range of other application domains, emerging into the field of Augmented Reality (AR). An AR display and an aircraft HUD are similar in many ways, but usually differ in the fact that the information presented to the user of an AR display system is adjusted to the user's personal perspective rather than that of the aircraft. User motion tracking systems feed the AR display system with data that allows for spatially registered (i.e., conformal) graphics regardless of user position and orientation. Most AR displays are head-mounted (HMDs), just like a helmet-mounted aircraft HUD.

The idea of HUDs in the airport tower is not novel in itself. The first known reference to the concept was made by Hitchcock et al. in the late 1980s at the FAA [1]. Potential benefits of such a display system include better display integration/placement, improved low visibility operations, reduced controller memory load, and a kind of x-ray vision in which controllers can see through occluding structures [2], [3]. Human performance experiments have been performed with transparent projection screens for various visual search tasks under both monoscopic [4] and stereoscopic [5] display conditions. The effects of limited field of view in AR display systems have been widely researched; Ellis et al. [6] provide a review of relevant literature related to field of view constraints.

We propose the use of another AR display format, which we refer to as a spatial AR display. It uses a polarization-preserving transparent projection screen. This format is more similar to the panel-mounted aircraft HUD in that the image generating source is not placed on the user's head but rigidly mounted at a distance in front of the user.
Because of its potentially large form factor, however, it can provide a considerably larger field of view than a typical aircraft HUD.

Fig. 1. The transparent projection screen demonstrating a simple overlay (left) and during the refraction experiment (right).

Our proposed transparent projection screen (Figure 1) consists of two sheets of glass enclosing a Holographic Optical Element (HOE), through which projected light is directed towards the user principally through diffractive effects. As the HOE is polarization-preserving, passive stereo techniques can be used to produce artificial depth in the rendered images. The only equipment needed by the user for a stereoscopic display is a pair of light-weight polarizing glasses. This is substantially less intrusive than a traditional HMD, where the user is encumbered by signal and power cables and device weights commonly over 1 kg.

Another advantage of a rigid, externally mounted display is that the position of the screen can be precisely measured. It is therefore possible to perform calibration and registration more accurately than in the case of an HMD, since there is no unpredictable or difficult-to-measure equipment slippage of the kind usually associated with HMDs.
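Conformal rendering on a fixed, precisely measured screen is commonly implemented with an off-axis (asymmetric-frustum) projection computed from the tracked eye position and the screen corners, in the style of Kooima's generalized perspective projection. The minimal sketch below illustrates the idea only; the screen corner coordinates, eye position and clipping distances are hypothetical, and this is not the implementation used for the display described in this paper.

```python
# Illustrative sketch: off-axis perspective projection for a fixed, measured screen
# and a tracked eye position (Kooima-style generalized perspective projection).
# All numeric values below are hypothetical examples.
import numpy as np

def off_axis_projection(pa, pb, pc, pe, near, far):
    """pa, pb, pc: lower-left, lower-right, upper-left screen corners (world, metres).
    pe: eye position (world). Returns a 4x4 OpenGL-style projection * view matrix."""
    # Orthonormal basis of the screen plane
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal (towards the eye)

    # Vectors from the eye to the corners, and eye-to-screen-plane distance
    va, vb, vc = pa - pe, pb - pe, pc - pe
    d = -np.dot(va, vn)

    # Frustum extents on the near plane
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    P = np.array([[2*near/(r-l), 0.0, (r+l)/(r-l), 0.0],
                  [0.0, 2*near/(t-b), (t+b)/(t-b), 0.0],
                  [0.0, 0.0, -(far+near)/(far-near), -2*far*near/(far-near)],
                  [0.0, 0.0, -1.0, 0.0]])
    # Rotate into screen space and translate the eye to the origin
    M = np.eye(4); M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn
    T = np.eye(4); T[:3, 3] = -pe
    return P @ M @ T

# Example: a 2 m x 1.5 m screen, left eye 2 m in front of it and offset to the left.
# For passive stereo the same computation is repeated with the right-eye position.
pa = np.array([-1.0, -0.75, 0.0]); pb = np.array([1.0, -0.75, 0.0]); pc = np.array([-1.0, 0.75, 0.0])
left_eye = np.array([-0.33, 0.0, 2.0])
print(off_axis_projection(pa, pb, pc, left_eye, near=0.1, far=100.0))
```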

Moreover, multiple viewers could theoretically be accommodated with a single projection screen if they are individually tracked and if multiplexing techniques are implemented for separating the distinct display viewpoints.

However, there are some design issues with this type of system that need to be addressed. (a) The screen is sensitive to high ambient light, as the HOE refracts, diffracts and diffuses some light coming from directions other than the projectors, decreasing overall contrast. (b) The HOE also shows slight inhomogeneity and refraction effects at certain positions and angles. (c) The reduced contrast due to optical imperfections could affect human visual acuity. (d) Depth rendering using the screen may show some biases that would interfere with operational use.

This paper reports the results of preliminary investigations of the impact of the enumerated issues. Issues (a) and (b) are addressed in the following Screen Properties section. Issues (c) and (d) are addressed in the Human Performance section.

Fig. 2. Screen luminance values, measured at a perpendicular angle.
Fig. 3. Screen luminance values, measured at various azimuthal angles.

II. SCREEN PROPERTIES

A. Luminance and Contrast

The luminance levels were measured using the light sensor of a Canon 350D digital SLR camera. The method of luminance measurement was validated against a known luminance on a laptop and determined to be accurate to approximately 13%. See the Appendix for details on the luminance and contrast calculations.

1) Setup: The screen was located in a darkened room 2 m from a pair of projectors mounted below at an angle of 38°. The projectors have a maximum luminance of 6500 ANSI lumens, XGA resolution and a 1300:1 contrast ratio. The projector images were adjusted and keystoned to fill the entire screen. The camera was mounted on a tripod 50 cm from the screen on the opposite side. Luminance values were measured in three different projector conditions: (a) projecting a white image, (b) projecting a black image, and (c) projectors switched off, to measure ambient conditions. In each condition nine measurements were made in a 3x3 matrix covering the entire screen. The optical axis of the camera was always perpendicular to the screen; only the tripod position and height were adjusted. In condition (a) additional measurements were made from varying azimuthal angles to the screen, with the camera always pointing towards the center of the screen at a distance of 76 cm. For each measurement the camera's recommended aperture and exposure time were recorded for calculating luminance values.

2) Results: The results of the luminance measurements are shown in Figure 2. The numbers in white font show the luminance values when projecting a full white image, numbers in black bold font when projecting a black image, and numbers in thin black font when the projectors are turned off. Figure 3 shows the measurements when projecting a white image but varying the azimuthal angle from the 90° normal to the screen. The contrast values were calculated from the full white and full black values with the ambient light values from Figure 2 subtracted. Figure 4 shows the contrast values calculated with the Michelson formula.

3) Discussion: The luminance values when projecting a white image are higher in the lower three measurements (Figure 2).
Since the projectors are mounted below, the throw distance is shorter and the incident angle is higher in the lower part of the screen, which are likely causes of the higher luminance values.

The ambient luminance values increase in the upper measurements. This is likely because the room (a basement) has windows close to the ceiling. They were covered with blinds, but the blinds did not block 100% of the outside light. As the luminance was measured in the upper parts of the screen, more of the windows were visible in the camera frame, resulting in the higher luminance values.

Fig. 4. Contrast values (Michelson).
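A minimal sketch of the kind of calculation involved is given below: luminance estimated from the camera's recommended exposure settings using a standard reflected-light-meter calibration constant, and Michelson contrast computed with the ambient term removed. The constant K and the sample exposure readings are assumptions for illustration, not measurements or formulas taken from the paper's appendix.

```python
# Illustrative sketch: luminance from a reflected-light exposure reading and
# Michelson contrast with ambient light subtracted. Sample values are hypothetical.
K = 12.5  # typical reflected-light meter calibration constant (cd/m^2 * s * ISO / N^2)

def luminance(f_number, exposure_time_s, iso):
    """Approximate scene luminance (cd/m^2) from aperture, exposure time and ISO."""
    return K * f_number**2 / (exposure_time_s * iso)

def michelson_contrast(l_white, l_black, l_ambient=0.0):
    """Michelson contrast C = (Lmax - Lmin) / (Lmax + Lmin), after removing ambient light."""
    lmax, lmin = l_white - l_ambient, l_black - l_ambient
    return (lmax - lmin) / (lmax + lmin)

# Hypothetical readings for one of the 3x3 measurement positions
l_white = luminance(f_number=8.0, exposure_time_s=1/60, iso=100)   # full-white image
l_black = luminance(f_number=2.8, exposure_time_s=1/8, iso=100)    # full-black image
l_amb   = luminance(f_number=2.8, exposure_time_s=1/4, iso=100)    # projectors off
print(f"white {l_white:.1f} cd/m^2, black {l_black:.1f} cd/m^2, "
      f"contrast {michelson_contrast(l_white, l_black, l_amb):.3f}")
```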

When using this screen technology in the normal daylight conditions of a control tower, the ambient light could approach 30,000 cd/m², and contrast will thus be reduced. Contrast levels with the system we used would be approximately 1/3 of what is needed in comparison with aircraft HUDs, but this could easily be compensated for with tinted window films that reduce the amount of incoming daylight or through the use of brighter projectors.

TABLE I. OFFSET BETWEEN PROJECTED LASER POINT AND REFERENCE POINT

Point  Screen location (cm)  Offset (mm)  Visual angle (arcmin)
  1             5               -37              4.58
  2            15               -12              1.49
  3            25                -6              0.74
  4            35                -9              1.11
  5            45                -6              0.74
  6            55                -9              1.11
  7            65                -6              0.74
  8            75                -9              1.11
  9            85                -9              1.11
 10            95                12              1.49

B. Inhomogeneity

As previously described, the screen consists of two sheets of glass and a HOE. The HOE is constructed to affect the light in ways similar to a prism. The HOE is made of dichromated gelatin (DCG). The purpose of the DCG is to modulate the refractive index so that the HOE diffracts incoming light into wave fronts which, by constructive interference, produce parallel rays, so that the HOE in effect refracts the rays without spectrally separating the light. There are at least two theories on how the DCG simulates refraction. A complete description is still missing and much of the production process is adjusted by trial and error [7].

Since DCG is sensitive to humidity and changes in temperature, we were curious to see how homogeneous the DCG is across the HOE. Distortions in the lens and screen of display equipment are significant sources of registration error [8], and inhomogeneities in the HOE would disturb the spatial registration used in an AR system.

1) Setup: To investigate the DCG we mounted a laser pointer on a stand and studied the projected laser point on a sheet of paper on a perpendicular wall at 27.75 m distance. We used the sheet of paper to sketch the outline of the projected laser point. First we sketched the projected laser point when the laser did not intersect the screen; this registration was used as a reference point. Then the screen was moved laterally into the laser beam. The screen was successively moved so that the laser beam would pass through the screen at 10 discrete points approximately 10 cm apart along a horizontal line across the screen. At each point the outline of the projected laser point was sketched on the sheet of paper.

2) Results: As the screen was moved laterally, the projected laser point moved significantly to the left compared to the reference point. However, approximately 5 cm from the edge of the screen the projected laser point moved back and assumed roughly the same location as the reference point. Not until the laser point reached the other edge of the screen did an offset on the sketch paper become noticeable again. In short, the first and the last projected laser points were clearly offset from the rest of the projection points, as seen in Table I. This leads us to believe that the properties of the DCG differ at the edges of the screen compared to the center portion of the screen.

3) Discussion: The path traced stretched laterally from one edge of the screen to the other. It was only at the edges that inhomogeneities were clearly visible. We speculate that this is because the substrate is exposed to air humidity at the edges in its current setup, or has been exposed to uncontrolled changes in temperature and humidity during the production process.
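The visual angles in Table I (and Table II below) follow from expressing an offset measured on the wall, 27.75 m away, in arcminutes. A minimal sketch of that conversion, applied to the Table I offsets, is given below; no values beyond those stated above are assumed.

```python
# Minimal sketch of the offset-to-visual-angle conversion used in Tables I and II:
# an offset on the far wall, 27.75 m from the screen, expressed in arcminutes.
import math

def visual_angle_arcmin(offset_mm, distance_m=27.75):
    return math.degrees(math.atan2(offset_mm / 1000.0, distance_m)) * 60.0

offsets_mm = [-37, -12, -6, -9, -6, -9, -6, -9, -9, 12]              # Table I offsets
print([round(abs(visual_angle_arcmin(o)), 2) for o in offsets_mm])   # 4.58, 1.49, 0.74, ...
```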
We could immediately use this result in the next experiment on refraction (see the section below), as it showed that the point where the laser beam intersects the screen can itself influence the measurement; the intersection point must therefore remain fixed or it will bias the results.

In general terms, the results show that normal usage through the center of the screen is not a problem, as registration errors around or slightly above 1 arcmin are hardly detectable. However, the inhomogeneities at the fringes of the screen could be problematic when tiling several screens for a larger field of view, as the error approaches 5 arcmin and the inhomogeneity is not continuous across the edge. A more controlled experiment, preferably over several screens, is needed to further describe this phenomenon.

C. Refraction

A ray of light entering from air, passing through glass and exiting to air should only exhibit a slight parallel displacement that grows with the incident angle. Normally this would not be a problem, but since the data is overlaid on one side of the screen, one might experience a registration error between the data layer and the real world due to refraction. Deering has previously described how Snell's law can cause positional errors in displays with thick glass surfaces [9]. The greater the angle, the more significant the displacement. Moreover, the effects of the HOE seen from an oblique viewpoint are not known.

1) Setup: To further study the refraction in the azimuthal plane of the DCG we made use of the same laser, mounted on a stand, projecting a point on a sheet of paper on a perpendicular wall at 27.75 m distance, as visible in Figure 1. The sheet of paper had a 0.5 cm reference grid. As in the previous experiment, the screen was initially set aside so that the unobstructed laser point could be recorded as a reference. Nine azimuthal angles were marked on the floor. Plumb lines from the screen ensured that the screen could be positioned at these angles, varying from 0° to 80°, where 0° meant a perpendicular intersection between laser beam and screen. The screen was rotated around the center of the radial so that the beam always intersected the same point on the screen, avoiding interference from the aforementioned inhomogeneities. For each angle the location of the projected laser point was recorded.
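For comparison with the measurements reported below, the following sketch computes the lateral displacement that Snell's law alone predicts for a plane-parallel glass slab. The slab thickness and refractive index are assumptions, since the paper does not state them; the sketch only illustrates the expected order of magnitude, not the behaviour of the HOE itself.

```python
# Illustrative sketch: lateral displacement of a ray traversing a plane-parallel glass
# slab at a given incidence angle, from Snell's law. Thickness and index are assumed.
import math

def slab_displacement_mm(theta_deg, thickness_m=0.01, n_glass=1.5):
    """Parallel displacement (mm) of a ray through a slab, air on both sides (n_air = 1)."""
    ti = math.radians(theta_deg)
    tr = math.asin(math.sin(ti) / n_glass)          # refraction angle inside the slab
    return 1000.0 * thickness_m * math.sin(ti - tr) / math.cos(tr)

for theta in (0, 10, 20, 30, 40, 50, 60, 70, 80):   # the incident angles used below
    print(f"{theta:2d} deg: expected displacement {slab_displacement_mm(theta):5.2f} mm")
```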

TABLE II. OFFSET AND VISUAL ANGLE AS A FUNCTION OF INCIDENT ANGLE

Point  Incident angle (°)  Offset (mm)  Visual angle (arcmin)
  1            0               0.0             0.00
  2           10               0.0             0.00
  3           20               0.0             0.00
  4           30               5.0             0.62
  5           40               5.0             0.62
  6           50               5.0             0.62
  7           60              10.0             1.24
  8           70              17.5             2.17
  9           80              37.5             4.65

2) Results: As the screen was inserted into the beam, the laser projection point remained at the reference point. This indicated that the portion of the screen where the laser intersected was homogeneous. As the screen was rotated clockwise, the laser projection point did not move until the screen reached 30°, where the point was displaced 5 mm to the right compared to the reference point. It remained at a 5 mm offset until the screen was oriented at a 60° angle, where the offset increased to 10 mm.

3) Discussion: The resulting refraction distortion is 0.62 arcmin at 30°–50°. At 60° the distortion, 1.24 arcmin, should start to become noticeable, and it approaches 5 arcmin at 80°. If the user stays within 50° of the normal, the distortion should not contribute significantly to the resulting registration error. In this region the distortion is less than 1 arcmin of visual angle, which is normally defined as human visual acuity.

III. HUMAN PERFORMANCE

A. Visual Acuity

An experiment was performed to measure the visual acuity loss in the HUD. This would give figures on the extent to which the HOE (without any projected graphics) actually degrades human vision. The task was to determine visual acuity by having subjects read lines of letters on Snellen eye charts in various display conditions until two errors on the same line were reported.

1) Setup: The experiment was performed as a mixed design, with two independent and one dependent variable. The independent variables were display condition (within subjects) and ambient light level (between subjects). The display conditions were "no screen" (the subject viewed the eye charts directly), "screen" (the subject viewed the eye chart through the screen), and "glasses" (as "screen" but with the addition of polarized glasses). Ambient light was either on or off (meaning the ceiling lights, which provided normal indoor fluorescent office illumination, were switched on or off).

The eye charts were illuminated at all times and had a luminance of 156 cd/m² measured from the subject viewpoint. The addition of the screen reduced the luminance level to 74 or 62 cd/m², depending on whether the ambient light was on or off respectively. Thus the ambient light increased the luminance of the screen by 12 cd/m². The addition of glasses reduced the luminance to 30 or 25 cd/m². Thus the addition of the screen alone decreased eye chart luminance by about 55%, and the screen combined with glasses by about 80%. (Luminance was measured in the same way as in the Luminance and Contrast section.)

Twelve subjects performed the experiment, 6 with and 6 without ambient light. Subjects were nested within the lighting condition and crossed with all other independent variables. The display condition was counterbalanced throughout the experiment.
The subject was placed on a chair 4 m from the eye charts (designed for a 4 m viewing distance; the 6/6 line measured 5.8 mm in height), with the chin on a chin rest. Polarizing glasses were added to the chin rest in the "glasses" condition, but the eye height remained constant in all trials. The subjects were told to read from line 4 and down, and were stopped after two errors were reported on the same line. The previous line was recorded as their acuity. If one error was made on the line before the line with two errors, acuity was considered to be half-way between the line with one error and the previously passed line.

2) Results: The resulting line number is quantized and based on a logarithmic scale. Therefore the line numbers were converted to a linear scale, visual resolution in arcmin, more suitable for averaging. The mean visual resolution values are shown in Figure 5, where the impact of the independent variables is visualized. Further statistical analysis using ANOVA showed that the display condition had a significant effect on the results (F(2,22) = 4.264, p = .029). The "no screen" condition showed a 0.15 arcmin better resolution than the "screen" condition, corresponding to approximately 1 line on the eye chart. There was no significant difference between the "screen" and "glasses" conditions. Ambient light variation showed no significant effect.

3) Discussion: It was surprising that ambient light showed no significant effect on the results. Possibly this was due to the high contrast in the eye charts, combined with the rather low screen luminance increase with ambient light (12 cd/m²). A future experiment will involve much greater, outdoor luminance conditions, similar to those that would be experienced in a control tower.

B. Depth Matching

An experiment was performed to test the user's depth matching ability with stereoscopically rendered objects seen in the HUD. The purpose was to test user performance in matching the depth of real to virtual (rendered) and virtual to real objects in various conditions at a range of 3–10 m. This experiment would thus indicate whether there are any overall judgment biases associated with depth rendering on the HOE screen. Even though the distances covered in this experiment are much shorter than the distances observed from a control tower, they would give preliminary measures of the possibilities and limitations of depth rendering in this type of large format HUD.
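As a rough illustration of the rendering geometry involved, the sketch below computes the on-screen parallax required to place a stereoscopically rendered point at a given distance from the viewer. The viewer-to-screen distance and interpupillary distance used here are assumptions for illustration, not values reported in the paper.

```python
# Illustrative sketch (not the authors' rendering code): on-screen parallax needed to
# render a point at distance d_object for a viewer at distance d_screen from the screen.
IPD = 0.065       # metres, typical interpupillary distance (assumed)
D_SCREEN = 1.5    # metres, assumed viewer-to-screen distance

def screen_parallax(d_object, d_screen=D_SCREEN, ipd=IPD):
    """Horizontal separation (m) between left- and right-eye images on the screen.
    Positive: uncrossed disparity (point behind the screen); negative: crossed (in front)."""
    return ipd * (d_object - d_screen) / d_object

for d in (3.0, 5.0, 10.0):   # the 3-10 m range used in the depth matching experiment
    print(f"object at {d:4.1f} m -> parallax {screen_parallax(d)*1000:.1f} mm")
```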

Fig. 5. Mean visual resolution values per display condition.
Fig. 6. The setup for the depth matching experiment showing the real object on rails (left) and the virtual object on the screen (right).

1) Setup: The experiment was performed as a within-subjects design, with three independent

