Effects of Augmented Reality Head-up Display Graphics’ Perceptual Form on Driver Spatial Knowledge Acquisition


Effects of Augmented Reality Head-up Display Graphics’ Perceptual Form on Driver Spatial Knowledge Acquisition

NAYARA DE OLIVEIRA FARIA

Thesis submitted to the faculty of the Virginia Polytechnic Institute and State University in partial fulfillment of the requirements for the degree of

Master of Science
in
Industrial and Systems Engineering

Joseph L. Gabbard, Chair
Sheila G. Klauer
Martha Irene Smith

December 16, 2019
Blacksburg, VA

Keywords: spatial knowledge; augmented reality; driving; head-up display

Effects of Augmented Reality Head-up Display Graphics’ Perceptual Form on Driver Spatial Knowledge Acquisition

NAYARA DE OLIVEIRA FARIA

ABSTRACT

In this study, we investigated whether modifying augmented reality head-up display (AR HUD) graphics’ perceptual form influences spatial learning of the environment. We employed a 2x2 between-subjects design in which twenty-four participants were counterbalanced by gender. We used a fixed-base, medium-fidelity driving simulator at the COGENT lab at Virginia Tech. Two different navigation cue systems were compared: world-relative and screen-relative. The world-relative condition placed an artificial post sign at the corner of an approaching intersection containing a real landmark. The screen-relative condition displayed turn directions using a screen-fixed traditional arrow located directly ahead of the participant on the right or left side of the HUD. We captured empirical data regarding changes in driving behaviors, glance behaviors, spatial knowledge acquisition (measured in terms of landmark and route knowledge), reported workload, and usability of the interface.

Results showed that both screen-relative and world-relative AR head-up display interfaces had a similar impact on the levels of spatial knowledge acquired, suggesting that world-relative AR graphics may be used for navigation with no comparative reduction in spatial knowledge acquisition. Although our initial assumption that the conformal AR HUD interface would draw drivers’ attention to a specific part of the display was correct, this type of interface did not help increase spatial knowledge acquisition. This finding contrasts with a common perspective in the AR community that conformal, world-relative graphics are inherently more effective than screen-relative graphics. We suggest that simple, screen-fixed designs may indeed be effective in certain contexts.

Finally, eye-tracking analyses showed fundamental differences in the way participants visually interacted with the different AR HUD interfaces, with conformal graphics demanding more visual attention from drivers. With respect to the distribution of visual attention, the world-relative condition was typically associated with fewer glances in total, but glances of longer duration.
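The glance-behavior comparison above (fewer but longer glances in the world-relative condition) rests on two per-participant measures: the number of glances at the HUD-graphic area of interest (AOI) and their mean duration. The snippet below is a minimal sketch of how such measures could be tabulated, assuming AOI-coded glance records in a pandas DataFrame; the column names and values are illustrative assumptions, not the study’s actual data or processing pipeline.

```python
import pandas as pd

# Hypothetical AOI-coded glance records: one row per glance at the HUD graphic.
# Values and column names are made up for illustration only; in the study each
# participant saw only one condition (between-subjects design).
glances = pd.DataFrame({
    "participant": [1, 1, 2, 2, 3, 3, 3, 4, 4, 4],
    "condition": ["world-relative"] * 4 + ["screen-relative"] * 6,
    "aoi": ["HUD graphic"] * 10,
    "duration_s": [1.8, 2.1, 1.6, 1.9, 0.6, 0.7, 0.9, 0.5, 0.8, 0.6],
})

# Per participant and condition: total number of glances at the HUD graphic AOI
# and their mean duration, the two quantities compared between conditions above.
summary = (
    glances[glances["aoi"] == "HUD graphic"]
    .groupby(["condition", "participant"])["duration_s"]
    .agg(total_glances="count", mean_duration_s="mean")
    .reset_index()
)
print(summary)
```

These per-participant values would then feed a between-subjects comparison of the two interface conditions.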

Effects of Augmented Reality Head-up Display Graphics’ Perceptual Form on Driver Spatial Knowledge Acquisition

NAYARA DE OLIVEIRA FARIA

GENERAL AUDIENCE ABSTRACT

As humans, we develop mental representations of our surroundings as we move through and learn about our environment. When navigating by car, developing robust mental representations (spatial knowledge) of the environment is crucial in situations where technology fails, or where we need to find locations not included in a navigation system’s database. Over-reliance on traditional in-vehicle navigation devices has been shown to negatively impact our ability to navigate based on our own internal knowledge. Recently, the automotive industry has been developing new in-vehicle devices that have the potential to promote more active navigation and potentially enhance spatial knowledge acquisition. Vehicles with augmented reality (AR) graphics delivered via head-up displays (HUDs) present navigation information directly within drivers’ forward field of view, allowing drivers to gather the information they need without looking away from the road. While this AR navigation technology is promising, the nuances of interface design and its impacts on drivers must be further understood before AR can be widely and safely incorporated into vehicles. In this work, we present a user study that examines how screen-relative and world-relative AR HUD interface designs affect drivers’ spatial knowledge acquisition.

Results showed that both screen-relative and world-relative AR head-up display interfaces had a similar impact on the levels of spatial knowledge acquired, suggesting that world-relative AR graphics may be used for navigation with no comparative reduction in spatial knowledge acquisition. However, eye-tracking analyses showed fundamental differences in the way participants visually interacted with the different AR HUD interfaces, with conformal graphics demanding more visual attention from drivers.

DEDICATION

To the Almighty God for giving me the miracle of life and for opening my path to opportunities that allowed me to be here today. Thank you for your guidance, protection, and strength in all the moments I felt alone, far from my beloved ones.

To my mother Maria Aparecida de Oliveira de Faria and to my father Donizete Aparecido Ribeiro de Faria for their unconditional love and patience. You are the best parents one could ever ask for.

To my brother Daniel de Oliveira Faria and to my sister Poliana Aparecida de Oliveira Faria for always being present in my life.

To Malu Faria for being the little human being that brings joy to all in our family. I am also grateful for Scooby, Sheik, and my first little one, Bidu Faria.

To Jadiel Hernandez Rivera for being my ideal partner during my graduate school journey. I really love, respect, and appreciate you. Thank you for your unconditional support throughout my graduate life’s challenges and for always being here when I need you.

TABLE OF CONTENTS

LIST OF FIGURES
LIST OF TABLES
LIST OF EQUATIONS
1. INTRODUCTION
1.1 MOTIVATION
1.2 RESEARCH PROBLEM
1.3 RESEARCH OBJECTIVES
1.3.1 MAIN OBJECTIVE
1.3.2 SECONDARY OBJECTIVES
2. BACKGROUND AND RELATED WORK
2.1 BACKGROUND AND DEFINITIONS
2.1.1 AUGMENTED REALITY
2.1.2 THE AUGMENTED REALITY HEAD-UP DISPLAY (AR HUD)
2.1.2.1 AR HUD PERCEPTUAL FORMS
2.1.3 AR HUD CHALLENGES
2.1.3.1 AR HUD PERCEPTUAL FORMS
2.1.3.2 AR HUD LIMITED FIELD OF VIEW
2.1.3.3 INATTENTIONAL BLINDNESS
2.1.3.4 COGNITIVE TUNNELING
2.1.3.5 OCCLUSION
2.1.4 DRIVING BEHAVIOR
2.1.5 GLANCE BEHAVIOR
2.1.6 SPATIAL KNOWLEDGE
2.1.6.1 MEASURING LANDMARK AND ROUTE KNOWLEDGE
2.2 RELATED WORK
2.2.1 NAVIGATION STRATEGIES
2.2.2 SPATIAL KNOWLEDGE AND THE HIPPOCAMPUS
2.2.3 NAVIGATION SYSTEMS & SPATIAL KNOWLEDGE
2.2.3.1 UNDERLOAD
2.2.3.2 ENCOURAGING ACTIVE AWARENESS AND ENGAGEMENT
2.2.3.3 HUD STUDIES ON SPATIAL KNOWLEDGE
3. METHODS
3.1 PARTICIPANTS
3.1.1 PARTICIPANT RECRUITMENT
3.1.2 PARTICIPANT DEMOGRAPHICS
3.2 EQUIPMENT
3.3 DRIVING SCENARIO
3.4 EXPERIMENTAL DESIGN
3.4.1 INDEPENDENT VARIABLES
3.4.1.1 WORLD-RELATIVE POST SIGN
3.4.1.2 SCREEN-RELATIVE TRADITIONAL ARROW
3.4.2 DEPENDENT MEASURES
3.5 PROCEDURES
4. DATA ANALYSIS
4.1 SURVEY DATA
4.1.1 WORKLOAD
4.1.2 SITUATION AWARENESS
4.1.3 USABILITY QUESTIONS
4.2 QUALITATIVE FEEDBACK
4.3 DRIVING MEASURES
4.4 SPATIAL KNOWLEDGE DATA
4.4.1 BRIEF OVERVIEW OF SIGNAL DETECTION THEORY
4.4.2 LANDMARK KNOWLEDGE TEST / SDT DATA CODIFICATION
4.4.3 LANDMARK KNOWLEDGE ANALYSIS
4.4.4 LANDMARK KNOWLEDGE REGRESSION ANALYSIS
4.4.5 LANDMARK KNOWLEDGE HEAT MAP
4.5 GAZE BEHAVIOR
4.5.1 GAZE BEHAVIOR DATA PROCESSING
5. RESULTS
5.1 SURVEY DATA
5.1.1 WORKLOAD NASA-TLX
5.1.1.1 MENTAL DEMAND
5.1.1.2 PHYSICAL DEMAND
5.1.1.3 TEMPORAL DEMAND
5.1.1.4 EFFORT
5.1.1.5 FRUSTRATION
5.1.1.6 PERFORMANCE
5.1.1.7 OVERALL WORKLOAD
5.1.2 SITUATION AWARENESS
5.1.3 USABILITY QUESTIONS
5.1.3.1 EASY TO NAVIGATE
5.1.3.2 EASY TO VIEW
5.1.3.3 TRUSTWORTHY INFORMATION
5.1.3.4 POSITIVE IMPACT ON DRIVING
5.1.3.5 DISTRACTION
5.2 QUALITATIVE FEEDBACK
5.2.1 SCREEN-RELATIVE QUALITATIVE FEEDBACK
5.2.2 WORLD-RELATIVE QUALITATIVE FEEDBACK
5.3 DRIVING MEASURES
5.3.1 MISSED TURNS
5.3.2 STANDARD DEVIATION OF LANE POSITION (SDLP)
5.3.3 STANDARD DEVIATION OF SPEED
5.4 SPATIAL KNOWLEDGE
5.4.1 LANDMARK KNOWLEDGE
5.4.1.1 INITIAL ANALYSIS
5.4.1.2 SIGNAL DETECTION THEORY
5.4.1.3 HEAT MAPS
5.4.2 ROUTE KNOWLEDGE
5.5 GAZE BEHAVIOR
5.5.1 MAXIMUM HUD GRAPHIC GLANCE DURATION
5.5.2 AVERAGE HUD GRAPHIC GLANCE DURATION
5.5.3 PERCENTAGE OF TIME LOOKING AT HUD GRAPHIC
5.5.4 PERCENTAGE OF TIME LOOKING AROUND HUD GRAPHIC
5.5.5 PERCENTAGE OF TIME LOOKING AROUND HUD GRAPHIC VS HUD GRAPHIC
5.5.6 TOTAL NUMBER OF GLANCES AT THE HUD GRAPHIC
6. DISCUSSION
6.1 SURVEYS
6.2 QUALITATIVE FEEDBACK
6.3 DRIVING MEASURES
6.4 SPATIAL KNOWLEDGE
6.4.1 LANDMARK KNOWLEDGE
6.4.2 ROUTE KNOWLEDGE
6.5 GAZE BEHAVIOR
7. CONCLUSIONS
8. LIMITATIONS
REFERENCES
APPENDIX
A. DOCUMENTS
A1. IRB APPROVAL
A2. INFORMED CONSENT FORM
A3. STUDY ADVERTISEMENT FLYER
B. SURVEYS
B1. DEMOGRAPHICS SURVEY
B2. NASA-TASK LOAD INDEX (NASA-TLX)
B3. USABILITY QUESTIONNAIRE
B4. SITUATION AWARENESS RATING TECHNIQUE - SART
B5. LANDMARK KNOWLEDGE SURVEY IMAGES AND AOI
C. DATA COLLECTION: RAW DATA
C1. NASA TLX RAW DATA
C2. SART RAW DATA
C3. USABILITY RAW DATA
C4. LANDMARK KNOWLEDGE RAW DATA
C5. ROUTE KNOWLEDGE RAW DATA
C6. DRIVING MEASURES
C7. SDT PARAMETERS
C8. LANDMARK KNOWLEDGE HEAT MAPS

LIST OF FIGURES

Figure 1 - Milgram’s Reality-Virtuality Continuum
Figure 2 - An example of a head-down display located in the center of the dashboard of a vehicle
Figure 3 - An example of a head-up display in a vehicle in which information is displayed within drivers’ forward field of view
Figure 4 - Examples of screen-relative (left) and world-relative (right) AR HUD perceptual forms
Figure 5 - Example of current AR HUD field of view
Figure 6 - COGENT Lab’s AR HUD can be conformal to the simulated world (top left). Top view (right) and side view (bottom left) of COGENT’s driving simulator
Figure 7 - SensoMotoric Instruments (SMI) eye-tracking glasses used within this study
Figure 8 - Driving scenario used within this study
Figure 9 - World-relative post sign interface indicating both left turn (left) and right turn (right)
Figure 10 - World-relative post sign interface at a distance from a “keep going straight” point of the drive (left) and approaching said “keep going straight” point (right)
Figure 11 - Screen-relative traditional arrow as drivers approached the turn
Figure 12 - AR HUD graphics overlaid onto calibration scenario
Figure 13 - Lab space used for route knowledge test
Figure 14 - Example of residuals assumptions analysis
Figure 15 - Example of probability plot of residuals with AD test
Figure 16 - Graphic representation of driving data analyzed
Figure 17 - Driving measures txt output example
Figure 18 - Representation of the decision space for SDT
Figure 19 - Examples of signal and noise probabilities that are discriminable because of (a) separation between the distributions or (b) low variance of the distributions
Figure 20 - A given decision criterion can either present bias towards saying “yes” (bottom) or “no” (top)
Figure 21 - SDT response codification scheme
Figure 22 - Heat map example from landmark knowledge test
Figure 23 - HUD graphics AOI
Figure 24 - Interval plot of mental demand by gender and condition
Figure 25 - Interval plot of mental demand by condition
Figure 26 - Interval plot of physical demand by gender and condition
Figure 27 - Interval plot of physical demand by condition
Figure 28 - Interval plot of temporal demand by condition and gender
Figure 29 - Interval plot of temporal demand by condition
Figure 30 - Interval plot of effort by condition and gender
Figure 31 - Interval plot of effort by condition
Figure 32 - Interval plot of frustration by condition and gender
Figure 33 - Interval plot of frustration by condition
Figure 34 - Interval plot of performance by condition and gender
Figure 35 - Interval plot of performance by condition
Figure 36 - Interval plot of average workload by condition and gender
Figure 37 - Interval plot of average workload by condition
Figure 38 - Interval plot of SART categories by condition
Figure 39 - Interval plot of overall situation awareness
Figure 40 - Interval plot of “easy to navigate” subscale
Figure 41 - Interaction plot for “easy to view” subscale
Figure 42 - Interval plot of “positive impact on driving” subscale
Figure 43 - Interval plot of “distraction” subscale
Figure 44 - 95% confidence interval plot of SD lane position by maneuver direction
Figure 45 - 95% confidence interval plot of SD speed by maneuver direction
Figure 46 - Interval bar plot of the total number of images correctly placed into piles (95% CI for the mean)
Figure 47 - Interval plot of the number of correctly recognized scenes by condition and on/off-route scenes (95% CI for the mean)
Figure 48 - Interval plot of SDT parameter rates for all participants by condition (95% CI for the mean)
Figure 49 - Main effects plot for HIT by turning direction
Figure 50 - Interaction plot for sensitivity
Figure 51 - Bar plot of number of clicks
Figure 52 - Pie chart of the number of clicks by selected area
Figure 53 - Tree map of clicked landmarks
Figure 54 - Interval plot of proportion of landmark and route knowledge acquisition by condition
Figure 55 - Interval plot of deviation from optimal position of route knowledge by condition and gender
Figure 56 - Line chart of average HUD glance duration by condition and event
Figure 57 - Boxplot of number of glances at the HUD graphic by condition
Figure 58 - Boxplot of number of glances around the HUD graphic by condition
