Augmented Reality Interfaces - Masaryk University


PA198 Augmented Reality Interfaces
Lecture 2: AR Solutions
Fotis Liarokapis, liarokap@fi.muni.cz
5th October 2015

Augmented Reality Software
- In the past, only a few tools existed
- Nowadays there are loads of solutions:
  - Open source
  - Proprietary
  - End-to-end branded app solutions

ARToolKit
- Open-source C library
- Dual-licensed: GPL and commercial
- Ported to different languages and platforms, e.g. Android, Flash, Silverlight
- More on it later in this lecture

Argon
- AR browser by Georgia Tech's GVU Center
- Uses a mix of KML and HTML/JavaScript/CSS for developing AR applications
- Any web content can be converted into AR content, given appropriate metadata and proper formatting

ArUco
- Minimal library for AR applications
- Trivial integration with OpenGL and OGRE
- OpenCV implementation; up to 1024 different markers
- BSD license; runs on Linux and Windows
- As of November 2011, available for iPhone

JavaCV
- Java/Android interface to OpenCV
- License: GPLv2

ATOMIC Authoring Tool
- Front end (graphic interface) for using ARToolKit without having to know programming
- Written in Processing
- Multi-platform authoring for creating AR applications: Microsoft Windows, Linux and Mac OS X

Goblin XNA
- Platform for researching 3D user interfaces emphasizing games, mobile augmented reality and virtual reality
- Implemented in C#; based on Microsoft XNA Game Studio 4.0
- BSD license
- https://goblinxna.codeplex.com/

GRATF
- GRATF (Glyph Recognition And Tracking Framework)
- Performs localization, recognition and pose estimation of optical glyphs in still images and video

Mixare
- Mixare (mix Augmented Reality Engine) AR browser
- Published under the GPLv3
- Available for Android and iPhone
- http://www.mixare.org/

PTAM
- PTAM (Parallel Tracking and Mapping) is a camera tracking system for AR
- Requires no markers, pre-made maps, known templates or inertial sensors
- Non-commercial use only
- http://www.robots.ox.ac.uk/~gk/PTAM/

DroidAR
- AR framework for Android
- Features location-based and marker-based AR
- Open source (dual-licensed: GPLv3 or a commercial license)

GeoAR
- Open source (Apache 2.0 License) browser for Android
- Features location-based AR and a flexible data-source framework

BeyondAR
- Open source (Apache 2.0 License) AR framework based on geo-localisation for Android
- http://beyondar.com/

Proprietary

Kudan AR Engine
- AR SDK for iOS and Android devices
- Powerful rendering: multi-million-polygon 3D models
- Advanced tracking: markerless

Vuforia
- Augmented reality SDK supporting different types of targets, both 2D and 3D:
  - multi-target configurations
  - cylinder targets, to track images on a cylindrical surface
  - markerless image targets
  - frame markers
  - cloud recognition, to match against databases on the order of a million targets
- SDK supports both iOS and Android

Vuforia Features
- Fast local detection of targets, with capacity to track 5 targets simultaneously
- Efficient tracking in low-light conditions, even when the target is partially covered
- Extended tracking: the app keeps tracking targets and maintains a consistent reference for augmentations even when the targets are no longer visible in the live camera view

Vuforia Video
- https://www.youtube.com/watch?v=GrP_xwrCtB0

Metaio SDK
- Modular framework consisting of different components: rendering, capturing, tracking and the sensor interface
- Compatible with all major platforms: Android, iOS, Unity3D and Windows
- SDK contains:
  - marker and markerless 2D and 3D tracking
  - POI tracking
  - support for QR code and barcode reading
  - built-in 3D renderer
  - optimizations for mobile chipsets
  - etc.
- https://www.metaio.com/

Metaio Features
- Provides a high level of abstraction
- Powerful 3D rendering engine and highly advanced tracking
- Support for obj, fbx and md2 model formats for 3D objects
- Direct loading of 3D models is supported, with abstraction of OpenGL calls
- Includes the AREL scripting language: AR experiences based on common web technologies, i.e. XML, HTML5 and JavaScript

Metaio Video
- https://www.youtube.com/watch?v=2HRW-yDgzA8

End-to-End Branded App Solutions

Alive app
- Developed by Times Internet Ltd
- Recognises pre-trained images and overlays an image, video or 3D content
- Available on Android, iOS, Windows, Blackberry, Symbian and Java
- http://www.alivear.com/

Aurasma
- HP Autonomy's AR platform
- Available as an SDK or as a free app for iOS and Android
- Recognizes real-world images and then overlays graphics
- Created in Cambridge
- https://www.aurasma.com/

Blippar
- Visual browsing app using image-recognition and AR technologies
- Focused on smartphones, tablets and wearables

Junaio
- AR browser designed for 3G and 4G mobile devices
- First AR browser to overcome the accuracy limitations of GPS, through LLA markers (latitude, longitude, altitude markers)

XARMEX
- XARMEX (eXtended Augmented Reality for Military EXercise)
- AR-aided close-quarters combat simulation system, combining motion-detection hardware with image overlay/stabilization software to create realistic military simulation environments and computer games
- https://en.wikipedia.org/wiki/XARMEX

Layar
- Mobile AR browser
- Can enhance flyers, postcards, packaging or any other item with interactive content, including video messages, web and social links, and photo slideshows
- https://www.layar.com/

Wikitude
- Mobile AR technology provider
  - Based in Salzburg, Austria; founded in 2008
- Initially focused on providing location-based AR experiences through the Wikitude World Browser app
- In 2012, introduced the Wikitude SDK
  - Uses image recognition and tracking
  - Geo-location technologies
- https://www.wikitude.com/

Wikitude (cont.)
- Content in the Wikitude World Browser is mostly user-generated
- Content can be added via a web interface, i.e. KML and ARML
- Web services are available to register the delivery of dynamic data
- Wikitude is a W3C member and OGC member, and is working to develop ARML further as part of a W3C ARML project

Wikitude Video
- https://www.youtube.com/watch?v=JnemoNQTa_w

D'Fusion
- Integrates real-time interactive 3D graphics content on a live video stream
- SDK available on different platforms, i.e. desktop, mobile and a special Flash plug-in

D'Fusion Studio
- More UI-based: D'Fusion Studio, D'Fusion CV
- Enables building the whole scenario through the GUI
- One bundled scenario will work on both Android and iPhone
- Supports multi-tag tracking

D'Fusion Video
- https://www.youtube.com/watch?v=6NKT6eUGJDE

ARmedia 3D
- Uses a 3D model tracking approach: recognizes planar images and complex 3D objects independently of their size and geometry
- SDK architecture consists of a renderer to render the 3D model, a tracker to track the target, a capture component for capturing frames from the device camera, and an interface to native Android and iOS
- Cross-platform, implemented in C/C++

ARmedia Features
- Provides 3D tracking of real-world objects in real time under changing lighting conditions
- Provides modularity:
  - can integrate different tracking algorithms, 3D engines and interfaces
  - different tracker parameters can be fixed to fine-tune the results
  - easily integrates with any other AR platform
  - 3D target creation and management services are available in the cloud
- http://www.armedia.it/index.php

ARToolKitPlus
- Mobile version of ARToolKit
  - First approach; development stopped June 2006
- Succeeded by Studierstube Tracker, a marker tracking library

Comparison of AR SDKs
- Based on free, open-source and popular SDKs
- Compared: Metaio, Vuforia, Wikitude, D'Fusion, ARmedia, ARToolKit

Comparison slides (tables from Amin, D., Govilkar, S., "Comparative Study of Augmented Reality SDK's", Int'l Journal on Computational Sciences & Applications, 5(1): 11-26, February 2015):
- Based on license type
- Based on marker generation
- Based on overlaying capability
- Based on platform supported
- Based on tracking
- Benefits of different AR SDKs

Limitations of Different AR SDKs
- (table from Amin, D., Govilkar, S., "Comparative Study of Augmented Reality SDK's", Int'l Journal on Computational Sciences & Applications, 5(1): 11-26, February 2015)

ARToolKit Basics

How Does ARToolKit Work
- The camera captures video of the real world and sends it to the computer
- Software on the computer searches through each video frame for any square shapes
- If a square is found, the software uses some mathematics to calculate the position of the camera relative to the black square
- Once the position of the camera is known, a computer graphics model is drawn from that same position
- This model is drawn on top of the video of the real world and so appears stuck on the square marker
- The final output is shown back in the handheld display, so when the user looks through the display they see graphics overlaid on the real world
- http://www.hitl.washington.edu/artoolkit/documentation/

Computer Vision Algorithm
- Based on a basic corner detection approach with a fast pose estimation algorithm:
  a. Original image
  b. Thresholded image
  c. Connected components
  d. Contours
  e. Extracted marker edges and corners
  f. Fitted square
- http://www.hitl.washington.edu/artoolkit/documentation/vision.htm
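The six steps above correspond roughly to the following calls in the ARToolKit C API. This is a pseudocode sketch based on the simpleTest sample; error handling and initialization are omitted:

```
loop:
    dataPtr = arVideoGetImage()                      // 1. grab a video frame
    argDispImage(dataPtr, 0, 0)                      //    draw it as the background
    arDetectMarker(dataPtr, thresh,
                   &marker_info, &marker_num)        // 2. find square shapes
    k = best detection with marker_info[k].id == patt_id
    arGetTransMat(&marker_info[k], center,
                  width, patt_trans)                 // 3. camera pose relative to the square
    argConvGlpara(patt_trans, gl_para)               // 4. load pose as the OpenGL modelview
    glLoadMatrixd(gl_para)
    draw 3D model                                    // 5. model appears stuck on the marker
    argSwapBuffers()                                 // 6. show the composed frame
```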

ARToolKit Coordinate Systems
- ARToolKit defines different coordinate systems, mainly used by the computer vision algorithm and the rendering
- You need to understand precisely the relations between them to prevent a reversed image or a badly positioned object
- http://www.hitl.washington.edu/artoolkit/documentation/cs.htm

Computer Vision Coordinate System
- arGetTransMat gives the position of the marker in the camera coordinate system
  - Not the reverse

Rendering Coordinate System
- When you use ARToolKit with OpenGL, remember that OpenGL is a right-handed coordinate system, with Z coming towards you
  - i.e. the camera is facing in the direction of -Z
- If you want the position of the camera in the marker coordinate system, you need to invert this transformation

Rendering Coordinate System (cont.)
- ARToolKit uses a calibrated camera perspective that results in an off-axis projection matrix for OpenGL
  - It cannot be created by calling gluPerspective; it requires the extra parameters of glFrustum
- Rather than decomposing ARToolKit's projection into parameters to pass to glFrustum, directly load the OpenGL projection matrix by:
  - setting glMatrixMode(GL_PROJECTION)
  - calling glLoadMatrixd()

Introduction to the Framework

ARToolKit Framework
- ARToolKit is a software toolkit, like GLUT
- It furnishes predefined functions that need to be called in a specific order to develop an AR application
  - Different parts of the toolkit can also be used separately
- ARToolKit supports multiple platforms, while attempting to minimise library dependencies without sacrificing efficiency
- http://www.hitl.washington.edu/artoolkit/documentation/devframework.htm

ARToolKit Architecture
- ARToolKit uses:
  - OpenGL for the rendering part
  - GLUT for the window/event-handler aspect
  - a hardware-dependent video library
  - a standard API on each platform
- The API is in C, and a lot of samples provide good starting points for creating new applications (skeletons)

Hierarchical Structure of ARToolKit
- The ARToolKit library consists of four modules:
  - AR module: core module with marker tracking routines, calibration and parameter collection
  - Video module: a collection of video routines for capturing the video input frames
  - Gsub module: a collection of graphic routines based on the OpenGL and GLUT libraries
  - Gsub_Lite module: replaces Gsub with a more efficient collection of graphics routines, independent of any particular windowing toolkit
- Hierarchical structure of ARToolKit using the Gsub module [diagram]

Hierarchical Structure of ARToolKit (cont.)
- Hierarchical structure of ARToolKit using the Gsub_Lite module [diagram]

ARToolKit Pipeline
- The modules respect a global pipeline metaphor: video -> tracking -> display
- The user can easily replace any module with another, e.g. gsub with Open Inventor

Data Types
- ARToolKit manipulates a lot of different kinds of variables
- Internally, it uses global variables, which restrict the re-entrant parts of the code
- Otherwise, standard multi-argument interfaces are used, based on a data-flow approach
- ARToolKit uses different image formats on different platforms

ARToolKit Data Flow
- Information about the detected markers is contained in the ARMarkerInfo structure, defined in ar.h in the include directory

ARToolKit Naming Conventions
- All ARToolKit functions generally begin with the prefix ar, with a specific prefix for each module:
  - arVideo for video
  - arg for graphics
  - ar for the main core

ARToolKit Hardware
- More information: http://www.hitl.washington.edu/artoolkit/documentation/hardware.htm

ARToolKit Camera
- The choice of camera is the most important decision
  - Resolution, frame rate and optics distortion are some of the main parameters for selecting your camera
- Supported platforms and driver parameters (like auto-contrast) can also influence your choice
- Some cameras enable auto-contrast by default, which decreases performance, or offer a 25 Hz frame rate at the cost of image quality

ARToolKit Camera (cont.)
- The most efficient solution remains a video capture card with a PAL or NTSC camera, which offers a large choice and really good camera quality
  - But a PAL camera can be really expensive, depending on its characteristics
- The signal delivered is an RGB image (colour palette), which reduces hardware or software palette conversion cost, i.e. distortion and latency

ARToolKit Camera (cont.)
- The traditional choice is a USB or FireWire camera
- Note that frame rate and colour palette (RGB, YUV, YUV compressed) depend mainly on the bandwidth of the technology
- USB cameras generally use compressed transmission formats like YUV 4:2:2 or YUV 4:1:1, i.e. lossy compression
- FireWire cameras offer a better solution, but cameras with a full RGB colour palette are generally expensive

Bus bandwidths (Table 1):
  USB 1.1                    max 1.5 MByte/s   most low-cost solutions
  USB 2.0                    max 50 MByte/s    badly supported on Linux
  IEEE 1394a                 60 MByte/s        good solution with a standardized camera protocol
  IEEE 1394b                 100 MByte/s       few cameras support this protocol
  PCI 32-bit 1.0 (33 MHz)    125.89 MByte/s
  PCI 64-bit 2.0 (33 MHz)    251.77 MByte/s
  PCI 32-bit 2.1 (66 MHz)    251.77 MByte/s
  PCI 64-bit 2.1 (66 MHz)    503.54 MByte/s
- A compressed format in VGA remains a good choice

Video bandwidths (Table 2):
  SIF RGB 15 fps            27 Mbit/s (3.37 MByte/s)
  SIF RGB 30 fps            55 Mbit/s (6.87 MByte/s)
  SIF YUV 4:1:1 15 fps      13 Mbit/s (1.62 MByte/s)
  SIF YUV 4:1:1 30 fps      27 Mbit/s (3.37 MByte/s)
  VGA RGB 15 fps            106 Mbit/s (13.25 MByte/s)
  VGA RGB 30 fps            221 Mbit/s (26.37 MByte/s)
  VGA YUV 4:1:1 15 fps      53 Mbit/s (6.63 MByte/s)
  VGA YUV 4:1:1 30 fps      106 Mbit/s (13.25 MByte/s)

HMDs and ARToolKit
- ARToolKit uses computer vision techniques for image-based recognition and tracking
- Since it uses only a single camera, a self-contained head tracking system can be developed if this camera is fixed to an HMD
- In this way, AR applications can be developed which use HMDs
- http://www.hitl.washington.edu/artoolkit/documentation/hardware.htm

Non-HMD and ARToolKit
- It is not necessary to have a head-mounted display to use ARToolKit
  - A camera connected to a computer is the only requirement
- Without an HMD, ARToolKit applications can be viewed on a computer monitor
- With an HMD, a more immersive experience can be created

ARToolKit Limitations and Benchmarks

General Limitations
- There are some limitations to purely computer-vision-based AR systems
- Naturally, the virtual objects will only appear when the tracking marks are in view
  - This may limit the size or movement of the virtual objects
- It also means that if users cover up part of the pattern with their hands or other objects, the virtual object will disappear
- http://www.hitl.washington.edu/artoolkit/documentation/cs.htm

Limitations: Pattern Complexity
- The simpler the pattern, the better: patterns with large black and white regions (i.e. low-frequency patterns) are the most effective
- Replacing a 4.25 inch square pattern with a pattern of the same size but much more complexity will reduce the tracking range from 34 to 15 inches

Limitations: Range
- The larger the physical pattern, the further away the pattern can be detected, and so the greater the volume in which the user can be tracked
- [Table: pattern size (inches) vs. usable range]

Limitations: Marker Orientation
- Tracking is also affected by the marker orientation relative to the camera
- As the markers become more tilted and horizontal, less and less of the centre patterns are visible, and so the recognition becomes more unreliable

Limitations: Lighting
- Overhead lights may create reflections and glare spots on a paper marker and so make it more difficult to find the marker square
- To reduce glare, patterns can be made from more non-reflective material
  - For example, by gluing black velvet fabric to a white base
  - The 'fuzzy' velvet paper available at craft shops also works very well

Benchmarks
- Range parameter: tracking error with range [plot]
- Pattern size parameter: tracking error with pattern size [plot]
- Angle parameter: tracking error with angle [plot]
- http://www.hitl.washington.edu/artoolkit/documentation/benchmark.htm

First Step: First ARToolKit Example
- Once ARToolKit has been installed, there is a sample program, simpleTest or simple depending on your ARToolKit version, in the bin directory that can be run to show the capabilities of ARToolKit
- In order to run the code you need to print out the hiroPatt.pdf paper fiducial marker contained in the patterns directory
- Best performance is achieved if this is glued to a piece of cardboard to keep it flat
- http://www.hitl.washington.edu/artoolkit/documentation/userstartup.htm

Running ARToolKit
- On each platform you generally have two choices: click on the program from your OS explorer, or start it from the command line
  - The latter is better, since it gives you the error and standard output streams (and ARToolKit uses them a lot)
- Each platform offers a dialog box to set up the video before starting the main AR loop

Linux
- On Linux, double-click on the simple icon in the bin directory from your GNOME or KDE explorer
  - Note that you will not see the error or output streams
- Otherwise, start a terminal, go to the bin directory and run ./simple
  - If you have V4L, the video setup dialog will be displayed

Windows
- On a Windows PC, double-click on the simple.exe icon in the bin directory from your Windows explorer
- A DOS console window will open, and when the camera is detected the video setup dialog will open

MacOS
- On MacOS X, double-click on the simple icon in the bin directory from your Mac explorer
- A console window will open, and when the camera is detected the video setup dialog will open
- Otherwise, start the Terminal program, go to the bin directory and run ./simple

simpleTest Output
- For the virtual object to appear, the entire black square border of the marker must be in view, and the pattern contained within the square border must be visible at all times
- If the virtual image does not appear, or flickers in and out of view, it may be because of lighting conditions
  - This can often be fixed by changing the lighting threshold value used by the image processing routines

simpleTest Output (cont.)
- If you hit the 't' key on the keyboard, you will be prompted to enter a new threshold value
  - This should be a value between 0 and 255; the default is 100
- Hitting the 'd' key will show the thresholded video image below the main video window
  - Possible tracking patterns found in the input image are marked by a red square in the thresholded image
  - This will help you check the effect of lighting and threshold values on the video input
- http://www.hitl.washington.edu/artoolkit/documentation/userstartup.htm

Camera and Marker Relationships
- ARToolKit gives the position of the marker in the camera coordinate system, and uses the OpenGL matrix system for the position of the virtual object
- These different elements and their relationships are explained in the ARToolKit tutorials
- http://www.hitl.washington.edu/artoolkit/documentation/tutorialcamera.htm

'exview' Example
- The example demonstrates:
  - the camera position in the marker coordinate system (in 3D and as text)
  - a 3D model of the camera and the marker

'relationTest' Example
- Computes the relative transformation between two markers
- Important for having different markers in one scene

Multi-Marker Tracking
- ARToolKit can give you the position of multiple markers as a function of the camera coordinate system
- You can also have a set of markers that define only one position
  - e.g. glued to a cardboard plane
- ARToolKit can do this with a specific set of functions based on a multiMarker configuration file
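A multiMarker configuration file lists each pattern together with its pose in the shared coordinate frame. The sketch below follows the layout of the sample marker.dat shipped with ARToolKit (pattern file, marker width in mm, centre offset, then a 3x4 transform); the pattern names and values here are illustrative only:

```
# the number of patterns to be recognized
2

# marker 1
Data/patt.hiro
80.0
0.0 0.0
1.0000  0.0000  0.0000    0.0
0.0000  1.0000  0.0000    0.0
0.0000  0.0000  1.0000    0.0

# marker 2, 200 mm to the right of marker 1 on the same board
Data/patt.kanji
80.0
0.0 0.0
1.0000  0.0000  0.0000  200.0
0.0000  1.0000  0.0000    0.0
0.0000  0.0000  1.0000    0.0
```

Such a file is loaded with arMultiReadConfigFile(), and arMultiGetTransMat() then combines all visible markers into one camera transform, so tracking survives as long as any marker of the set is in view.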

Using Multiple Markers
- Print pattMulti.pdf
- Run multiTest

ARToolKit Code: Video Initialization
- Configure the video input
  - vconf = <video configuration string>
- In init(), open the video:
  - arVideoOpen( vconf );
  - arVideoInqSize( &xsize, &ysize );
- Start video capture:
  - arVideoCapStart();
- When finished, close the video path:
  - arVideoCapStop();
  - arVideoClose();

Changing Image Size
- For input capture:
  - vconf = "videoWidth=320,videoHeight=240";
  - Note: the camera must support this image size
- For display, the second parameter of argInit is the zoom ratio of the display image size relative to the input image:
  - argInit( &cparam, 1.5, 0, 0, 0, 0 );

Graphics Handling: libARgsub
- Set up and clean up the graphics window:
  - void argInit( ARParam *cparam, double zoom, int fullFlag, int xwin, int ywin, int hmd_flag );
  - void argCleanup( void );
- Parameters:
  - cparam: camera parameters
  - zoom: zoom ratio
  - fullFlag: 0 = normal, 1 = full-screen mode
  - xwin, ywin: create a small window for debugging
  - hmd_flag: 0 = normal, 1 = optical see-through mode

Main Function
    main()
    {
        init();
        argMainLoop( mouseEvent, keyEvent, mainLoop );
    }

Graphics Handling: libARgsub
- Go into the iterative cycle:
    void argMainLoop(
        void (*mouseFunc)(int btn, int state, int x, int y),
        void (*keyFunc)(unsigned char key, int x, int y),
        void (*mainFunc)(void) );
- Swap buffers:
  - void argSwapBuffers( void );

mainLoop Function
    if( (dataPtr = (ARUint8 *)arVideoGetImage()) == NULL ) return;
    argDispImage( dataPtr, 0, 0 );
    arVideoCapNext();
    argSwapBuffers();

Graphics Handling: libARgsub (cont.)
- Set the window for 2D drawing:
  - void argDrawMode2D( void );
- Set the window for 3D drawing:
  - void argDrawMode3D( void );
  - void argDraw3dCamera( int xwin, int ywin );
- Display an image:
  - void argDispImage( ARUint8 *image, int xwin, int ywin );

Image Capture: libARvideo
- Return the pointer to the captured image:
  - ARUint8 *arVideoGetImage( void );
- Pixel format and byte size are defined in config.h:
  - #define AR_PIX_FORMAT_BGR
  - #define AR_PIX_SIZE 3

Detecting a Marker
- Key points:
  - threshold value
  - important external variables:
    - arDebug: keep the thresholded image
    - arImage: pointer to the thresholded image
    - arImageProcMode: use a 50% image for image processing (AR_IMAGE_PROC_IN_FULL / AR_IMAGE_PROC_IN_HALF)

Marker Detection
    /* detect the markers in the video frame */
    if( arDetectMarker( dataPtr, thresh, &marker_info, &marker_num ) < 0 ) {
        cleanup();
        exit(0);
    }
    for( i = 0; i < marker_num; i++ ) {
        argDrawSquare( marker_info[i].vertex, 0, 0 );
    }

Using a Pattern
- Key points:
  - pattern file loading
  - structure of the marker information:
    - region features
    - pattern id, direction
    - certainty factor
  - marker identification

Pattern File Loading
    int   patt_id;
    char *patt_name = "Data/kanjiPatt";

    /* load pattern file */
    if( (patt_id = arLoadPatt( patt_name )) < 0 ) {
        printf( "Pattern file load error !!\n" );
        exit(0);
    }

Marker Structure
    typedef struct {
        int    area;
        int    id;
        int    dir;
        double cf;
        double pos[2];
        double line[4][3];
        double vertex[4][2];
    } ARMarkerInfo;

Making a Pattern Template
- Use the utility program mk_patt.exe
- Show the pattern to the camera
- Put the corner of the red line segments on the top-left vertex of the marker
- The pattern is stored as a template in a file
- A 1:2:1 ratio determines the pattern region used

Checking for Known Patterns
    /* check for known patterns */
    k = -1;
    for( i = 0; i < marker_num; i++ ) {
        if( marker_info[i].id == patt_id ) {
            /* you've found a pattern */
            printf( "Found pattern: %d\n", patt_id );
            if( k == -1 ) k = i;
            /* make sure you have the best pattern (highest confidence factor) */
            else if( marker_info[k].cf < marker_info[i].cf ) k = i;
        }
    }

Getting 3D Information
- Key points:
  - definition of a real marker
  - transformation matrix:
    - rotation component
    - translation component

Finding the Camera Position
    double marker_center[2] = { 0.0, 0.0 };
    double marker_width     = 80.0;
    double marker_trans[3][4];

    arGetTransMat( &marker_info[k], marker_center,
                   marker_width, marker_trans );

ARToolKit Coordinate Frame
- This function sets the transformation matrix from the marker to the camera into marker_trans[3][4]
- You can read the position information from the values of marker_trans:
  - Xpos = marker_trans[0][3];
  - Ypos = marker_trans[1][3];
  - Zpos = marker_trans[2][3];

Virtual Object Display
- Key points:
  - OpenGL parameter setting
  - setup of the projection matrix
  - setup of the modelview matrix

Appending Your Own OpenGL Code
- Set the camera parameters into the OpenGL projection matrix:
  - argDrawMode3D();
  - argDraw3dCamera( 0, 0 );
- Set the transformation matrix from the marker to the camera into the OpenGL modelview matrix:
  - argConvGlpara( marker_trans, gl_para );
  - glMatrixMode( GL_MODELVIEW );
  - glLoadMatrixd( gl_para );
- After calling these functions, your OpenGL objects are drawn in the real marker coordinates

3D Model Loading
- ARToolKit does not have a function to handle 3D CG models
- Third-party rendering software should be employed:
  - OpenVRML
  - OpenSceneGraph
  - 3D parsers
  - Unity
  - etc.

Finding Multiple Transforms
- Create an object list:
  - ObjectData_T *object;
- Read in the objects, in init():
  - read_ObjData( char *name, int *objectnum );
- Find the transforms, in mainLoop():
    for( i = 0; i < objectnum; i++ ) {
        /* check patterns */
        /* find transforms for each marker */
    }

Loading Multiple Patterns: Object Structure
    typedef struct {
        char   name[256];
        int    id;
        int    visible;
        double marker_coord[4][2];
        double trans[3][4];
        double marker_width;
        double marker_center[2];
    } ObjectData_T;

