UbiBeam: An Interactive Projector-Camera System for Domestic Deployment

Jan Gugenheimer, Pascal Knierim, Julian Seifert, Enrico Rukzio
Institute of Media Informatics, Ulm University, 89069 Ulm, Germany
{jan.gugenheimer, pascal.knierim, julian.seifert3, enrico.rukzio}@uni-ulm.de

In Proceedings of the Ninth ACM International Conference on Interactive Tabletops and Surfaces (ITS '14). ACM, New York, NY, USA, 305-310. DOI: http://dx.doi.org/10.1145/2669485.2669537

Abstract
Previous research on projector-camera systems has long focused on interaction inside a lab environment. Currently there is no insight into how people would interact with and use such a device in their everyday lives. We conducted an in-situ user study by visiting 22 households and exploring specific use cases and ideas of portable projector-camera systems in a domestic environment. Using a grounded theory approach, we identified several categories such as interaction techniques, presentation space, placement and use cases. Based on our observations, we designed and implemented UbiBeam, a domestically deployable projector-camera system. The system comprises a projector, a depth camera and two servomotors to transform every ordinary surface into a touch-sensitive information display.

Author Keywords
Steerable projection; projector-camera system; domestic deployment; ubiquitous computing

ACM Classification Keywords
H.5.2 [Information Interfaces and Presentation (e.g. HCI)]: User Interfaces

Figure 1: The UbiBeam system, a compact and steerable projector-camera system

Figure 2: Possible scenarios for the usage of projector-camera systems in a domestic environment

Introduction
Public displays, smartphones, and tablets are devices that aim to constantly provide information to users in ubiquitous usage contexts. They can all be regarded as initial steps towards ubiquitous and everywhere displays as envisioned by Weiser [2]. However, such physical devices still cannot fully achieve the ubiquity and omnipresence of Weiser's vision, as they do not fully blend into the environment.

Recent research aims to achieve this ubiquity by simulating omnipresent screens with a projector-camera system (e.g. [5, 3, 1, 6, 7]). The main focus of these projects was to research the interaction with projected user interfaces. While previous work provides valuable insights into either interaction techniques or technical implementations, most of these projects focused on instrumented laboratory environments. Very few of them researched the use of and interaction with projector-camera systems outside the laboratory. Therefore, the interaction space is limited to interaction with the content and does not cover deployment or domestic scenarios for the user.

In this work, we introduce UbiBeam (figure 1), a small and portable projector-camera system which is designed based on an in-situ study in the homes of 22 people. We envision a future where such devices will be sold in hardware stores. They could be available in different form factors, either as a replacement for light bulbs or as a simple small box which can be placed in several ways inside the users' environments (figure 2). The design of these devices will not only focus on the interaction with the content but also on aspects such as deployment and portability. This work is a first step towards developing projector-camera systems for end users as it provides system and design requirements derived from an in-situ study.

Design Process
We conducted an exploratory field study to investigate the requirements and to gain a deeper understanding of how projector-camera systems can be used in domestic environments. To collect data, we visited 22 households (10 female, 12 male participants) between 22 and 58 years of age (M = 29) and conducted semi-structured interviews. The participants were provided with a mock-up which consisted of an AIPTEK Pocket Cinema V60 projector inside a card box mounted on a Joby Gorillapod. This low-fidelity mock-up was used to stimulate the creativity of the participants.

The interviews consisted of a questionnaire about the use of a projector-camera system and the creation of a potential set-up using the mock-up (figure 3). To analyze the data, we selected a grounded theory approach. The data gathering was conducted using semi-structured interviews, notes, pictures and video recordings of several sessions. Two of the authors coded the data using an open, axial and selective coding approach. The initial research question was: "How would people use a small and easily deployable projector-camera system in their daily lives? When and how would they interact with such a device, and how would they integrate it into their home?".

During the process we discovered that the four main categories the participants were focusing on when they handled the projector-camera system were:

- Projector-camera system placement: Where was the projector-camera system mounted inside the room?
- Projection surface: What projection surfaces did the participant choose?
- Interaction modalities: What modalities were used for the input and why?
- Projected content/use cases: What content did the participant want to project for each specific room and why?

Content and Use Cases
The exact use cases depended on which room the participants were referring to. However, two larger concepts could be derived from the set-ups the participants created: information widgets and entertainment widgets. We consider information widgets as use cases in which the participant almost exclusively wants to aggregate data. Most of these use cases served as an aid in finishing a specific task characteristic of the room. Entertainment use cases were mostly created in the living room, bedroom and bathroom. Here the focus was on enhancing the free time one spends in these rooms and making the stay more enjoyable.

Figure 3: Users building and explaining their set-ups

Placement of the Projector-Camera System
Similar to the use cases, the placement can be divided into two higher-level concepts: placing the device in reach and out of reach. Participants placed the devices in the bedroom, bathroom and kitchen mostly within their reach; each time, the device was mounted at waist or shoulder height. In the living room, working room and corridor, participants preferred a mounting above body height. These were also rooms where participants could imagine a permanent mounting. For this reason, the device was placed in a way that it could project on most of the surfaces and was "not in the way" (P19).

Orientation and Type of Surface
For every interface, participants preferred flat and planar surfaces. In the introduction to the study, it was explained to each participant that it is technically possible to project onto non-planar surfaces without distortion. Nevertheless, only one participant wanted to project onto a couch. All others created flat and planar interfaces: "I prefer flat surfaces even if they are undistorted" (P1). Therefore, the only classification which could be made for the projection surfaces was whether they were horizontal, like tables, or vertical, like walls. Both types of surfaces were used almost evenly in the kitchen, bedroom, working room and living room. However, in the corridor and the bathroom mostly vertical surfaces were used due to the lack of large horizontal spaces. The projection surface was mostly used to support the use case and was influenced by the room.

Interaction Modalities
The main interaction modalities participants requested were speech recognition, touch or a remote control. Other techniques such as gesture recognition, shadow interaction or a laser pointer were mentioned rarely. The interaction modality was highly influenced by the room and the primary task in it. The location of the surface also strongly influenced the interaction: if the surface was a table, touch was preferred; if the surface was a wall, the remote control was used. One participant explained that his choices are mostly driven by convenience: "You see, I am lazy and I don't want to leave my bed to interact with something" (P22).

Derived Requirements for Prototype
After analyzing the data from the semi-structured interviews, we combined the results with the questionnaires and derived several requirements for our prototype of a domestically deployed projector-camera system. Participants always wanted more than only one fixed surface in every room. Considering the placement out of reach, we concluded that the projector-camera system must be steerable. Furthermore, due to the high number of requests, the interaction with the device itself must be mediated through a remote control. However, the interaction with the projected interface should be implemented with touch to be able to create interactive tabletops.

Figure 4: Implementation of the UbiBeam

The form factor was mostly dictated by the projector used. We analyzed the participants' set-ups and found that the distance between device and surface ranged from 40 cm to 350 cm (Mdn = 200 cm). The projected surface sizes varied from a cupboard door to a whole wall. Therefore, the projector must be an ultra-compact DLP that provides high brightness at the required distance while keeping a small form factor. Since participants wanted to carry the device into several rooms for different use cases, the mount must offer quick and easy deployment. A last issue which came up several times was the focus of the projector: participants did not want to adjust the focus every time they deployed the device in a new location. Therefore, an auto focus must be realized.

Implementation

Figure 5: Hardware construction for the pan-tilt unit and the auto focus

Hardware Architecture
UbiBeam (figure 4) uses the ODROID-XU, a powerful eight-core single-board computer (SBC), as its processing unit. A WiFi dongle and a wireless keyboard are also connected to the SBC. The Carmine 1.08 from PrimeSense is used as depth camera; it offers a wider sensing range than smaller Time-of-Flight cameras and is well supported by the OpenNI framework. As projector we opted for the ultra-compact LED projector ML550 by OPTOMA, a 550-lumen DLP projector combined with an LED light source. It measures only 105 mm x 106 mm x 39 mm and weighs 380 g; its projection distance is between 0.55 m and 3.23 m. For pan and tilt of the system, two HS-785HB servo motors by HiTEC are used; these quarter-scale servos offer a torque of 132 N cm. To provide an auto focus, we attached, similar to [6], a SPMSH2040L linear servo to the focusing unit of the projector. An Arduino Pro Mini controls the actuators.

Autofocus. The focus of the Optoma ML550 is manually adjusted via a small lever. To realise automatic adjustment of the focus, the movement of the lever is controlled with the servo, which is glued to the designed servo mount as shown in figure 5. To determine the required servo position for a given distance, a calibration task was conducted which yielded a formula mapping a particular distance to a PWM signal with a maximum error of less than 40 µs.

The final hardware construction measures 10.5 cm x 12.2 cm x 22.5 cm including the pan-tilt unit and weighs 996 g. To be able to easily mount the device on a variety of surfaces, we adapted it to fit a Manfrotto Magic Arm. The hardware components can be bought and assembled for less than 1000 USD.

Software Implementation
Building a stand-alone projector-camera system requires lightweight, resource-saving software. Therefore, we used Ubuntu 12.04 on the ODROID. For reading RGB and depth images, OpenNI version 2.2 for ARM is used. Image processing is done with OpenCV version 2.4.6. Visualisation of widgets is accomplished with Qt (version 4.8.2), a library for UI development using C++ and QML. Based on the results of the qualitative study, we designed the interaction with UbiBeam following a simple concept: after starting our software, the projection becomes a touch-sensitive interaction space. The user creates widgets on this space (e.g. calendar, digital image frame) and interacts with them via touch (figure 6). The orientation of the device itself is controlled with an Android application sending pan and tilt commands.
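As a rough illustration of this control path, the sketch below shows how the Arduino Pro Mini might translate incoming serial commands into servo pulses, including the calibrated distance-to-PWM mapping for the focus servo. The pin assignments, command format and calibration coefficients are assumptions for illustration; the paper only reports that a calibrated formula maps distance to a PWM signal with a maximum error below 40 µs.

```cpp
#include <Servo.h>

Servo pan, tilt, focus;

// Hypothetical calibration fit: the paper only states that a calibrated
// formula maps projection distance to a PWM pulse width (max error < 40 µs);
// the linear coefficients below are placeholders, not the measured values.
int focusPulseForDistance(long distanceCm) {
  const float a = -2.1f;    // µs per cm (assumed slope)
  const float b = 2100.0f;  // µs offset (assumed intercept)
  return (int)(a * distanceCm + b);
}

void setup() {
  Serial.begin(9600);
  pan.attach(9);    // pin assignments are assumptions
  tilt.attach(10);
  focus.attach(11);
}

void loop() {
  // Assumed one-letter command protocol, e.g. "P1500", "T1200", "F180":
  // pan/tilt carry a pulse width in µs, focus carries a distance in cm.
  if (Serial.available()) {
    char cmd = Serial.read();
    long value = Serial.parseInt();
    if (cmd == 'P')      pan.writeMicroseconds(value);
    else if (cmd == 'T') tilt.writeMicroseconds(value);
    else if (cmd == 'F') focus.writeMicroseconds(focusPulseForDistance(value));
  }
}
```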
After moving the device to a new space, the auto focus and touch detection recalibrate automatically and create a new interaction space.

Touch Algorithm. The touch detection was implemented based on an algorithm presented in [4]. A key feature is that touch is detected on any physical object without user-driven calibration tasks. The developed touch detection can be separated into four parts. First, the scenery is analyzed and a spatial image, the ground truth, is generated by temporal filtering of 30 single depth images. This obtained image is filtered for noise and used to calculate a binary contact image while touch detection is running. The contact image is filtered, and simple blob detection finds contact points. In a last step, contact points are tracked over time to classify them into different touch events (touch down, long touch, move, touch release), which finally trigger the actions intended by the user.
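A minimal sketch of this four-stage pipeline, assuming 16-bit depth frames in millimetres (as delivered by OpenNI) and illustrative thresholds rather than UbiBeam's calibrated values; the final tracking stage is only indicated:

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Stage 1: temporal filtering of 30 raw depth images into a ground truth.
cv::Mat buildGroundTruth(const std::vector<cv::Mat>& depthFrames) {
    cv::Mat acc = cv::Mat::zeros(depthFrames[0].size(), CV_32F);
    for (const cv::Mat& f : depthFrames) {
        cv::Mat f32;
        f.convertTo(f32, CV_32F);
        acc += f32;
    }
    return acc / (float)depthFrames.size();
}

std::vector<cv::Point2f> detectTouches(const cv::Mat& depth16,
                                       const cv::Mat& groundTruth) {
    cv::Mat depth32;
    depth16.convertTo(depth32, CV_32F);

    // Stage 2: binary contact image. A pixel is a contact candidate when it
    // lies slightly above the surface: closer than the ground truth, but
    // within a thin band (5-25 mm here, an assumed range).
    cv::Mat diff = groundTruth - depth32;
    cv::Mat contact;
    cv::inRange(diff, cv::Scalar(5.0), cv::Scalar(25.0), contact);

    // Stage 3: noise filtering, then simple blob detection via contours.
    cv::morphologyEx(contact, contact, cv::MORPH_OPEN,
                     cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(5, 5)));
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(contact, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    std::vector<cv::Point2f> points;
    for (const auto& c : contours) {
        if (cv::contourArea(c) > 50.0) {          // assumed minimal blob area
            cv::Moments m = cv::moments(c);
            points.emplace_back(m.m10 / m.m00, m.m01 / m.m00);
        }
    }
    // Stage 4 (not shown): track points over frames and classify them into
    // touch down, long touch, move and touch release events.
    return points;
}
```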

Figure 6: Deployment of UbiBeam inside a kitchen

Picture Distortion. To be able to project distortion-free content onto surfaces not perpendicular to the device, the projected content has to be pre-warped. First, a plane detection on the depth map is executed, following the concepts of Yoo et al. [8]. This enables us to find possible projection surfaces. Then four points situated on one of the detected planes, spanning a rectangle of the desired size, are determined. Finally, the affine transformation which maps the widget to the determined points is calculated and applied to render a corrected representation of the widget (see the sketch at the end of this subsection).

Developing Widgets. The developed framework allows dynamic loading of widgets. All the complexity of the spatially aware projection, dynamic touch detection and movement of the projector-camera system is encapsulated and hidden from the widget. This enables straightforward widget development. Two different possibilities are supported to create a new widget: developers can implement a provided interface to create more desktop-like widgets (a hypothetical example follows below), or they can implement widgets using the Qt User Interface Creation Kit (Qt Quick), which uses QML to describe modern-looking, fluid UIs in a declarative manner.
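Returning to the pre-warping step described above, the following is a minimal sketch of a four-point warp in OpenCV. Note one substitution: the paper names an affine transformation, but four arbitrary point pairs are what OpenCV's perspective transform takes, so that is used here; the plane detection itself is omitted.

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Map an axis-aligned widget image onto the four projector-space points that
// span the desired rectangle on a detected plane. When the returned frame is
// projected, the widget appears undistorted on the surface.
cv::Mat prewarpWidget(const cv::Mat& widget,
                      const std::vector<cv::Point2f>& target,  // 4 corner points
                      const cv::Size& projectorRes) {
    std::vector<cv::Point2f> src = {
        {0.0f, 0.0f},
        {(float)widget.cols, 0.0f},
        {(float)widget.cols, (float)widget.rows},
        {0.0f, (float)widget.rows}
    };
    cv::Mat H = cv::getPerspectiveTransform(src, target);
    cv::Mat framebuffer = cv::Mat::zeros(projectorRes, widget.type());
    // Render the warped widget into the projector framebuffer; pixels outside
    // the widget are left untouched.
    cv::warpPerspective(widget, framebuffer, H, projectorRes,
                        cv::INTER_LINEAR, cv::BORDER_TRANSPARENT);
    return framebuffer;
}
```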
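As a hypothetical illustration of the first route, the provided-interface approach might look like the sketch below. The actual UbiBeam interface is not published in the paper, so the class names and signatures are assumptions.

```cpp
#include <QPainter>
#include <QPointF>

// Hypothetical widget interface: the framework hides touch detection,
// pre-warping and projector movement, so a widget only draws itself in
// widget-local pixel coordinates and reacts to touch points in that space.
class Widget {
public:
    virtual ~Widget() {}
    virtual void paint(QPainter& p) = 0;           // render the widget
    virtual void onTouch(const QPointF& pos) = 0;  // handle a touch event
};

// Minimal example widget: a clock face that could toggle its view on touch.
class ClockWidget : public Widget {
public:
    void paint(QPainter& p) {
        p.drawText(10, 20, "12:45");  // placeholder time string
    }
    void onTouch(const QPointF& /*pos*/) {
        // e.g. switch between clock and calendar view
    }
};
```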

Discussion and Future Work
As mentioned in the introduction, we envisioned small and deployable camera-projector systems which are designed for domestic use. In current set-ups, aspects like portability, deployment, domestic use cases and projection surfaces were not taken into account. Therefore, this work provides valuable insights into the domestic use of projector-camera systems. In a next step, we would like to deploy the system and collect qualitative feedback over a longer time period. The design of the system is suitable for conducting a long-term study, which would provide insights not only into the use of the system but also into how often it is used.

Conclusion
In this work we provided an insight into how people would use a projector-camera system inside their homes. We conducted a qualitative study using grounded theory that discovered and analyzed four important categories a domestically deployed projector-camera system must focus on: use cases/content, placement of the projector-camera system, projection surface, and interaction modalities. Furthermore, the results from the qualitative study showed relationships between these categories. We showed that users differentiated between basic information aggregation to support a specific task in a room and entertainment to enhance free time. Based on these results, we derived requirements (steerable, remote control interaction, touch input interaction, fast deployment, auto focus) for a first prototype, and explored different form factors. In a final step, we implemented UbiBeam, a steerable camera-projector system designed based on the requirements we derived from the study.

Acknowledgments
The authors would like to thank all study participants. This work was conducted within the Transregional Collaborative Research Centre SFB/TRR 62 "Companion-Technology of Cognitive Technical Systems" funded by the German Research Foundation (DFG).

References
[1] Harrison, C., Benko, H., and Wilson, A. D. Omnitouch: Wearable multitouch interaction everywhere. In Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology, UIST '11, ACM (New York, NY, USA, 2011), 441-450.
[2] Weiser, M. The computer for the 21st century. In Human-Computer Interaction, Morgan Kaufmann Publishers Inc., San Francisco, CA, USA, 1995, 933-940.
[3] Wilson, A., Benko, H., Izadi, S., and Hilliges, O. Steerable augmented reality with the Beamatron. In Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology, UIST '12, ACM (New York, NY, USA, 2012), 413-422.
[4] Wilson, A. D. Using a depth camera as a touch sensor. In ACM International Conference on Interactive Tabletops and Surfaces, ITS '10, ACM (New York, NY, USA, 2010), 69-72.
[5] Wilson, A. D., and Benko, H. Combining multiple depth cameras and projectors for interactions on, above and between surfaces. In Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology, UIST '10, ACM (New York, NY, USA, 2010), 273-282.
[6] Winkler, C., Seifert, J., Dobbelstein, D., and Rukzio, E. Pervasive information through constant personal projection: The ambient mobile pervasive display (AMP-D). In Proceedings of the 32nd Annual ACM Conference on Human Factors in Computing Systems, CHI '14, ACM (New York, NY, USA, 2014), 4117-4126.
[7] Xiao, R., Harrison, C., and Hudson, S. E. WorldKit: Rapid and easy creation of ad-hoc interactive applications on everyday surfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '13, ACM (New York, NY, USA, 2013), 879-888.
[8] Yoo, H. W., Kim, W. H., Park, J. W., Lee, W. H., and Chung, M. J. Real-time plane detection based on depth map from Kinect. In Robotics (ISR), 2013 44th International Symposium on (Oct 2013), 1-4.
