2019 JETIR March 2019, Volume 6, Issue 3 www.jetir.org (ISSN-2349-5162)

Implementation of Virtual Eye Gaze Based Interaction

1Prof. Mangesh Kakade, 2Namrata Ghagre, 3Neeraj Choudhary, 4Raghav Dapke, 5Akash Badnag
1Asst. Professor, 2,3,4,5Scholar
Department of Electronics and Telecommunication, GNIET, Nagpur, India

Abstract: Eye gaze tracking is a way of accessing a computer or communication aid by controlling a mouse pointer with your eyes. Tobii systems follow the eyes with high accuracy to determine where the user is looking on the screen. The user can then select the item being looked at by dwelling (staring at it for a length of time), blinking, or clicking with a switch. Eye tracking technology has been studied in the field of Human-Computer Interaction both to understand the user's point of regard when analyzing user interface designs and as an interaction device in its own right. While most prior research used eye tracking sensors for interacting with desktop monitors, recent advances in head-mounted displays (HMDs) for Virtual Reality have also driven the development of head-worn eye trackers. VR HMDs with eye tracking, such as the FOVE HMD, are becoming more accessible. Using an HMD with this capability, a computer can observe and learn user attention. Well-designed eye gaze-based interaction could offer more natural and implicit interaction that significantly impacts the VR experience. Early investigation of eye tracking for interaction in an HMD-based VR environment has shown performance benefits compared to pointing with fingers [3]. The interaction method used was selection based on eye-fixation time, which has been widely adopted for 2D interfaces to solve the Midas touch problem. Fixation or dwell time is a standard delimiter for indicating a user's intention to select an object through eye gaze alone. Dwell time typically ranges from 450 ms to 1 second for novices, but can improve with practice to around 300 ms in the case of gaze typing. However, this time constraint can negatively impact the user experience.

Index Terms – HMD, VR

I. INTRODUCTION
When the required dwell time is too short, it puts pressure on the user to look away to avoid accidental selection, but if it is too long, it results in longer wait times. While there are various approaches for developing novel eye gaze-based interaction, forcing unnatural eye movements could quickly cause fatigue or eye strain. If the method is too complex, it could overwhelm the user and require long training times. To prevent such problems, we need to understand natural eye movements and design interactions based on them. Prior research identified four primary types of natural eye movements: (1) saccade, a quick eye movement with a fixed end target; (2) smooth pursuit, a smooth eye movement following a moving target; (3) vestibulo-ocular reflex (VOR), an automatic eye movement that counters head movement when fixating on a target; and (4) vergence, converging or diverging the eyes to look at targets at different distances.
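Recognizing which of these movement types the eye is currently making is the usual first step for any gaze-based technique. As a concrete illustration, the following minimal sketch classifies gaze samples as fixations or saccades with a simple velocity threshold (the I-VT approach); the 60 Hz sampling rate and the 30°/s threshold are assumed values for illustration, not parameters taken from this work.

```python
import numpy as np

def classify_samples(gaze_deg, sample_rate_hz=60.0, saccade_thresh_dps=30.0):
    """Label each gaze sample as 'fixation' or 'saccade' using a simple
    velocity threshold (I-VT). gaze_deg is an (N, 2) array of gaze angles
    in degrees; sample rate and threshold are illustrative assumptions."""
    gaze_deg = np.asarray(gaze_deg, dtype=float)
    # Angular displacement between consecutive samples.
    deltas = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1)
    velocity = deltas * sample_rate_hz            # deg/s
    labels = ["fixation"]                         # first sample has no velocity
    labels += ["saccade" if v > saccade_thresh_dps else "fixation"
               for v in velocity]
    return labels
```

Smooth pursuit falls between these two regimes and is usually detected separately, for example by correlating the gaze path with a moving target's trajectory.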
Previous research explored various interaction methods based on natural eye movements, such as detecting head gestures based on VOR, leveraging smooth pursuit for automatic calibration, spontaneous interaction on public displays, and interacting with 2D GUI controls. However, these were mainly designed for 2D interfaces on desktop monitors or large-screen displays. In this paper, we report on our explorations into designing novel eye-gaze-based interaction techniques leveraging natural eye movements for immersive VR experienced in an HMD.
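Before introducing the new techniques, it helps to make the dwell-based baseline discussed above concrete. The sketch below is a minimal, illustrative dwell selector, not the implementation used in this work; the 450 ms default corresponds to the low end of the novice range cited earlier, and the DwellSelector class and its callback interface are hypothetical names of our own.

```python
import time

class DwellSelector:
    """Minimal dwell-time selection: fire a callback once gaze has rested
    on the same target for `dwell_s` seconds. Structure and values are
    illustrative, not the system described in this paper."""

    def __init__(self, dwell_s=0.45):
        self.dwell_s = dwell_s        # 450 ms, low end of the novice range
        self._target = None
        self._since = None
        self._fired = False

    def update(self, target_id, on_select):
        now = time.monotonic()
        if target_id != self._target:             # gaze moved: restart the timer
            self._target, self._since, self._fired = target_id, now, False
            return
        if (target_id is not None and not self._fired
                and now - self._since >= self.dwell_s):
            self._fired = True                    # select once per dwell
            on_select(target_id)
```

Called once per frame with the currently gazed-at target (or None), it fires a selection exactly once per continuous dwell, which is the behaviour the Midas touch discussion above assumes.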
We introduce three novel interaction techniques based on saccade, smooth pursuit, and VOR. We also report on our initial user study and discuss the relative strengths and weaknesses of the techniques.
Fig. 1.1: Block diagram of the 3D virtual environment
Radial Pursuit (RP) is a novel eye-gaze-based selection method for VR using smooth pursuit, the natural eye movement that occurs when our eyes lock onto a moving object. RP is useful when a small target needs to be selected, or when the target sits among cluttered objects in a small volume where disambiguation is important. Long dwelling is unnatural for our eyes, which normally saccade several times a second, so selection with the gaze-dwell technique alone can be very difficult. To overcome this problem we leverage smooth pursuit. Previous research has shown that interaction techniques based on smooth pursuit can be versatile and robust; nevertheless, we could not find any work applying this technique in immersive VR. RP expands cluttered objects away from each other, reducing ambiguity and enabling the user to gaze clearly at an object of interest.
The model will create a forum that brings these researchers together to present their ideas and to discuss techniques and applications that go beyond classical eye tracking and stationary eye-based interaction. Specifically, we want to encourage these communities to think about the implications of pervasive eye tracking for context-aware computing, i.e., the ability to track eye movements not just for a couple of hours inside the laboratory but continuously for days, weeks, or even months in people's everyday lives. The workshop aims to identify the key research challenges in pervasive eye tracking and mobile eye-based interaction and to discuss the technological and algorithmic methods required to address them.
Fig. 1.2: Eye movement according to the object
This project converts the PoG output from the Mobile Eye into a virtual-world gaze vector: a vector starting at the user's eye position and heading off in the direction of their line of sight. Within the VE, this vector can be used to indicate potential areas of visual interest or as an advanced method of controlling the environment. Being glasses-mounted, the Mobile Eye's frame of reference is that of the head tracker, offset by the distance from the tracker to the eye. This relationship provides a method of converting the (x, y) PoG coordinate output into the 3D virtual-world gaze vector.
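The conversion just described can be sketched in a few lines. Assuming the PoG arrives as normalized (x, y) coordinates on a head-fixed plane, and the head tracker provides a world-space position and rotation matrix, a minimal version looks as follows; the plane size, plane distance, and eye offset are illustrative assumptions, since the paper does not specify them.

```python
import numpy as np

def pog_to_world_ray(pog_xy, head_pos, head_rot, plane_dist=1.0,
                     plane_w=1.0, plane_h=0.75, eye_offset=(0.0, 0.0, 0.0)):
    """Turn a 2D point-of-gaze into a world-space gaze ray.
    pog_xy: PoG in [0,1]^2 on a plane fixed `plane_dist` metres in front
    of the head. head_rot: 3x3 head-tracker rotation; head_pos: 3-vector.
    eye_offset: tracker-to-eye offset in head coordinates. All plane
    dimensions are assumed values, not taken from the paper."""
    u, v = pog_xy
    # Point on the head-fixed plane, in head coordinates (x right, y up, z forward).
    p_head = np.array([(u - 0.5) * plane_w, (0.5 - v) * plane_h, plane_dist])
    eye_head = np.asarray(eye_offset, dtype=float)
    # Transform the eye position and the gazed point into world coordinates.
    origin = np.asarray(head_pos, dtype=float) + head_rot @ eye_head
    target = np.asarray(head_pos, dtype=float) + head_rot @ p_head
    direction = target - origin
    return origin, direction / np.linalg.norm(direction)
```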
Eye Tracking Control Module
Serial Capture, Alignment and Decoding. The Mobile Eye streams the encoded tracking information as consecutive 10-byte blocks of serial data. This component locks onto the stream to locate the start of each block and then decodes the data into a structure containing the PoG coordinates in the video stream. If the tracker fails to compute the eye position (e.g., because the user blinks or removes the glasses), a status byte within this structure indicates an error condition.
Mapping PoG Coordinates onto a Virtual World Plane. The PoG coordinates can be considered the (x, y) coordinates on a plane held at a constant distance from, and perpendicular to, the user's head. A similar plane can be created in virtual space, maintaining a fixed position relative to the user's head-tracked location. A relationship between the real and virtual gaze positions can then be obtained by having the user fixate on a known point on the virtual plane while reading the PoG coordinates streamed from the Mobile Eye.
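This fixation-based calibration amounts to fitting a mapping between measured PoG coordinates and known virtual-plane coordinates. The sketch below assumes an affine model fitted by least squares over the collected calibration pairs; the paper does not state the actual form of the mapping, so this is an illustrative choice.

```python
import numpy as np

def fit_pog_to_plane(pog_samples, plane_points):
    """Fit an affine map from Mobile Eye PoG coordinates to virtual-plane
    coordinates, using calibration pairs collected while the user fixates
    known points. The affine form is an assumption; at least three
    non-collinear pairs are needed."""
    P = np.asarray(pog_samples, dtype=float)      # (N, 2) raw PoG
    Q = np.asarray(plane_points, dtype=float)     # (N, 2) known plane coords
    A = np.hstack([P, np.ones((len(P), 1))])      # [x, y, 1] design matrix
    M, *_ = np.linalg.lstsq(A, Q, rcond=None)     # (3, 2) affine parameters
    return M

def apply_mapping(M, pog_xy):
    """Map a single PoG sample onto the virtual plane."""
    x, y = pog_xy
    return np.array([x, y, 1.0]) @ M
```

With the fitted parameters, each incoming PoG sample can be mapped onto the plane and then lifted to a world-space gaze ray as in the earlier sketch.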
II. RESEARCH METHODOLOGY
2.1 Eye Gaze Tracking Analysis
This section reviews the importance of eye gaze tracking systems in different application areas, the technology behind such systems, and the state of the art of existing technologies.
2.1.1 Introduction
Eye gaze tracking means approximating the direction of a user's gaze. Most of the time, approximating the gaze direction means identifying which object the user is looking at; with a computer screen, this means finding the screen coordinates the user is looking at. Depending on the tracking method, the device has different degrees of freedom. The simplest eye tracking systems give the direction of the eyes relative to the head (head-mounted systems) or relative to a fixed position of the eyeball (systems with head fixation). More sophisticated systems allow the user to move their head in front of a fixed device; these systems implicitly track the head. Wearable eye gaze trackers meant for use in the 3D world have to track the gaze direction not only relative to the head but also in space; in this case the term "eye and head tracker" would be more suitable than "eye gaze tracker" [6].
Video-based eye gaze trackers provide not only the gaze direction but also the size of the pupil. When the pupil widens or narrows, it can indicate an emotional response to the scene being looked at, which is interesting for research; however, a change in pupil size can also be caused by a change in lighting. If a large amount of light enters the eye, the pupil narrows; conversely, if the pupil does not receive enough light, it widens. Work of this kind focuses more on the size of the pupil than on the gaze direction and is another interpretation of the term eye gaze tracking [6].
The term eye gaze tracking also covers work that tracks the whole eye, focuses on the shape of the eye (even when it is closed), and includes the eyebrows in the tracking. Consequently, the term represents many different types of eye tracking [6].
2.1.2 Eye Gaze and Communication
Gaze is one of the main keys to communication. Most of the time, when we ask something, the object of the question does not need to be mentioned if we are looking at it while asking. For example, if we ask "What is that?" while looking at a specific object, there is no need to specify the object, assuming the person asked is aware of what we are looking at [6].
A pioneer of research on gaze and gaze-based interaction, Bolt, wrote in [2] about the importance of communication through gaze. He explained it with this example: "Consider the case of asking the question 'What is your favorite sport?' in the presence of several people, but looking at Mary, say, not at Frank, Judy, or Dave. Then you utter the selfsame words, but now looking at Dave. The question is a different question; the difference lies not in its verbal component, but in its intended addressee given by eyes." In this example we can observe that simply by looking at a person, communication is already established. Most of the time, people start by looking at a person to address him or her. Conversely, staring at a person for too long and for no reason is considered impolite. These examples show that our eyes are not only for vision but also for communication, and it is our responsibility to control what we are looking at [6].
If we compare human eyes with animals' eyes, the main difference is the absence of white around the pupil in animal eyes. No white eyeball is visible in animal eyes, particularly in mammals, even though human and mammalian eyes work in a similar way. Because of this absence of white, it is more difficult to tell what an animal is looking at. Based on that, we may wonder whether the ability to read gaze direction has been helpful in human development and evolution [6].
Studies have been made; in one of them, Vertegaal et al. found that in a discussion there is a high probability that the person being looked at is the person listening (88%) or the person being spoken to (77%) [29]. In [8], Eibl-Eibesfeldt notes that a communication channel can be created just by looking at someone, but that depending on the culture this can be interpreted as an act of aggression. Cultural background also seems to play a role in the movement of the eyes. In [3], a study was conducted to find differences in scene perception between Americans and Chinese; the result was that the Americans looked at foreground objects sooner and for longer, while the Chinese looked more at the background. A person's education also influences eye movements: people reading texts in different languages show different eye responses depending on whether they speak the language they are reading, are learning it, or do not speak it at all. Eye motions involve high levels of cognition and reflect a person's intent. Even if there are universal biological aspects, eye motions depend heavily on personal characteristics; every person sees the world with their own eyes [6].
2.1.3 Eye Gaze and Human-Computer Interfaces
Eye gaze interfaces are not widely used unless they are truly necessary; the main users of this technology are people with disabilities who cannot move anything other than their eyes. These interfaces are built to control the computer with gaze alone, and such systems work well. For a person without a disability, however, eye gaze interaction seems less efficient than other modalities (keyboard, mouse, speech), even though eye motions are fast and looking is an easy task [6]. The following two sections list the advantages and disadvantages of eye gaze interfaces.
2.1.4 Advantages
This section presents the advantages of using eye gaze tracking for HCI.
Ease of use. Eye gaze interaction frees the hands and consequently does not overstress the hand muscles. It also adds no extra load to the eye muscles, because the eyes move anyway without any interaction constraint. As a simple example, when using the mouse to click on a button, the eyes follow the movement of the cursor on the screen in any case [6].
Faster interaction. Eye movements are fast, and using them to interact should be fast as well. Today, however, most eye gaze interaction, such as eye typing, is slower than ordinary input (a keyboard, for example). Combining eye gaze with another modality could speed up the interaction [6].
No maintenance. Video-based eye tracking systems need no physical contact, so maintenance is minimal. Unlike mice and keyboards, eye tracking devices do not need to be cleaned, which is a real problem for mice and keyboards. For example, glass or a strong transparent material can be placed in front of the camera, making the device vandal-proof, which is not possible for keyboards and mice [6].
Hygienic interaction. Another advantage of eye gaze tracking systems is their hygienic interface. Because no contact is needed, such systems fit perfectly in environments with strict hygiene requirements.
For example, in a surgical operating room, eye gaze systems meet the requirements, since no touch interaction with the system is needed. They can also be useful for public interfaces when there is an epidemic threat and hygienic interaction is required [6].
Remote control. With current technology, remote control with eye gaze tracking systems is possible. High-resolution camera lenses now make it possible to detect the eyes over several metres, and even low-cost eye gaze tracking systems can detect the eyes at one metre [6].
Certified interaction. Eye gaze interaction certifies both the presence of a user in front of the camera and that user's attention. For example, with a cell phone operated by eye gaze, there is no fear of calling someone by accident while the phone is in a pocket. Eye gaze interaction can also require specific user behaviour; for instance, the user can be asked to read a warning text before proceeding to further features [6].
Detailed user activity. The eyes reveal a lot about someone's activities, and tracking them gives useful information about what the user is doing. Even without further data analysis, an eye gaze tracking system reports what the eyes are looking at, which already has great potential for context-aware systems.
With simple data analysis, it is possible to detect whether the user is reading or doing other activities, for example. With further data analysis, it is possible to detect the emotional or physical condition of the user, such as their level of literacy or their age [6].
2.1.5 Disadvantages
This section presents the disadvantages of using eye gaze tracking for HCI.
Eye control. Because the eyes are our organ of vision, they commonly make unconscious movements, which can confuse a system that uses the eyes as input. How well we can control our eye motions, and to what level, is not well understood. By controlling eye motions we mean suppressing the unconscious movements of the eyes and keeping only the intended ones. What we do know is that we can control where we look, because that is a requirement of social life; what is unknown is whether we can train our eye muscles as well as we train our fingers, for example to play the guitar [6].
Midas touch. Because we use our eyes for vision, using them for computer interaction can cause conflicts, known as the Midas touch problem. The core difficulty is knowing whether the eyes are just looking around to gather information about the environment or are invoking an action. If these two cases cannot be distinguished, unwanted actions result; the interface has to separate intentional eye movements from natural ones. Blinking and moving objects can cause conflicts as well: a blinking pop-up on a web page, for example, can disturb our eye movements. This remains an open research topic [6].
Overstressed eye muscles. Heavy use of muscles for specific actions causes damage known as repetitive strain injury (RSI), and the fear that the same could happen to the eye muscles is understandable. However, because the eyes move without any break, even during sleep, this does not appear to be a problem [6].
2.1.6 Eye Gaze and Other Applications
Historically, eye gaze tracking was used for interface design applications (as seen previously). Knowing what a person is looking at is very important when designing cars, devices, cockpits, and more. Studies in these fields rest on the hypothesis that what a user is looking at reflects the thought at the top of the stack of cognitive processes. This is very useful for designing and improving interfaces, because the meaningfulness, visibility, and placement of interface elements can be estimated. Eye gaze tracking is used widely in HCI research, but only a few consumer applications have been made, because of the Midas touch problem: the application must react only at the right time in appropriate situations, and not every time the direction of our gaze changes.
III. RESULTS AND DISCUSSION
3.1 Experimental Result
The Eyegaze Edge is an eye-operated communication and control system that empowers people with disabilities to communicate and interact with the world. By looking at control keys or cells displayed on a screen, a user can generate speech either by typing a message or selecting pre-programmed phrases. Eyegaze Edge systems are being used to write books, attend school, and enhance the quality of life of people with disabilities all over the world. Different experiments were conducted to evaluate the proposed method on given input images of the eye.
A specialized video camera mounted below the Eyegaze Edge screen observes one of the user's eyes. Sophisticated image-processing software in the system analyses the camera's images 60 times each second and determines where the user is looking on the screen. Nothing is attached to the user's head or body.
3.2 Implementation of the Proposed Methodology for Eye Gaze
The following are the steps used in this project work:
3.2.1 Face detection and tracking
A 15-second calibration procedure is required to set up the system for a particular user. To calibrate, the user looks at a small circle as it moves around the screen. There is no need to recalibrate if the user moves away from the screen and returns later.
Figure 3.2.1: Face detection and tracking window
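The commercial system's image-processing software is proprietary, but the face detection and tracking step can be illustrated with a stock OpenCV Haar cascade. The sketch below is a stand-in, not the Eyegaze Edge pipeline; camera index 0 and the per-face bounding box are assumptions for demonstration.

```python
import cv2

# Minimal face detection loop using OpenCV's bundled Haar cascade.
# Illustrates the detection step only; the commercial pipeline differs.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)                        # any webcam as a stand-in
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:                   # draw the tracking window
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("Face detection and tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):        # press 'q' to quit
        break
cap.release()
cv2.destroyAllWindows()
```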
3.2.2 Implementation of the proposed methodology for the inverted image
Figure 3.2.2: Inverted image output
3.2.3 Implementation of extracting the gray-scale image
A user operates the system by looking at rectangular "keys" or cells displayed on the control screen. To "press" a key, the user looks at it for a specified period of time. The gray-scale image is then extracted from the inverted image.
Figure 3.2.3: Output of gray-scale image
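A minimal OpenCV sketch of these two steps, in the order described above (invert first, then gray-scale extraction), might look as follows; the input and output file names are placeholders, not files from this project.

```python
import cv2

# Invert the captured frame, then extract a gray-scale image from the
# inverted result, matching the order described above.
frame = cv2.imread("eye_frame.png")                 # placeholder input image
inverted = cv2.bitwise_not(frame)                   # inverted image (Fig. 3.2.2)
gray = cv2.cvtColor(inverted, cv2.COLOR_BGR2GRAY)   # gray-scale (Fig. 3.2.3)
cv2.imwrite("gray_output.png", gray)                # placeholder output file
```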
3.3 Implementation of Canny edge detection
The Canny edge detector is an edge detection operator that uses a multi-stage algorithm to detect a wide range of edges in images. Canny also produced a computational theory of edge detection explaining why the technique works. Canny edge detection extracts useful structural information from different vision objects and dramatically reduces the amount of data to be processed. It has been widely applied in various computer vision systems. Canny found that the requirements for edge detection across diverse vision systems are relatively similar; thus, a single edge detection solution can be applied in a wide range of situations. The general criteria for edge detection include:
1. Detection of edges with a low error rate: the detector should accurately catch as many of the edges in the image as possible.
2. The detected edge point should accurately localize on the center of the edge.
3. A given edge in the image should only be marked once, and where possible, image noise should not create false edges.
Figure 3.3: Output of Canny edge detection
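Applied to the gray-scale output of the previous step, a minimal OpenCV version of this stage could read as follows; the Gaussian pre-blur and the 50/150 hysteresis thresholds are conventional illustrative choices, not values reported in this work.

```python
import cv2

# Canny edge detection on the gray-scale image produced above. Thresholds
# are illustrative; in practice they are tuned per camera and lighting,
# often keeping roughly a 1:2 or 1:3 low:high ratio.
gray = cv2.imread("gray_output.png", cv2.IMREAD_GRAYSCALE)
blurred = cv2.GaussianBlur(gray, (5, 5), 0)      # suppress noise first
edges = cv2.Canny(blurred, 50, 150)              # low/high hysteresis thresholds
cv2.imwrite("canny_output.png", edges)           # placeholder output file
```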
Virtual environment menu
With a virtual environment and eye tracking from Tobii Pro, the only step left is to analyze the actual data. Using data enabled by the Pro VR Integration, Tobii Pro VR Analytics provides instant access to an individual's attention and behaviour patterns while they are completely immersed in VR. Eye tracking metrics, interaction measures, and automated visualizations are easy to understand and to customize for our project's needs.
Figure 3.3.1: Output showing the virtual environment
OUTPUT OF DESIGN
Here we show the output of our project. Eye tracking is the key to truly immersive research: to gain objective and reliable insights into human behavior, high-quality eye tracking data must be used in the analysis, which is why we bring eye tracking knowledge into VR.
Figure 3.3.2: Output design
When working with eye tracking in VR, you can leverage the benefits of both technologies. VR allows you to create any type of simulated environment, where visual stimuli and scenarios can be quickly switched or easily repeated, while eye tracking gives you insight into where the participant's visual attention is at each moment of the experience and which visual elements trigger certain responses and behaviors.
Figure 3.3.3: Output design
IV. REFERENCES
[1] R.J.K. Jacob. What you look at is what you get: eye movement-based interaction techniques. In: Proc. of CHI 1990, 1990.
[2] FOVE VR. https://www.getfove.com, accessed 2016-11-28.
[3] V. Tanriverdi and R.J. Jacob. Interacting with eye movements in virtual environments. In: Proc. of CHI 2000, pp. 265-272, 2000.
[4] P. Majaranta, U. Ahola, and O. Špakov. Fast gaze typing with an adjustable dwell time. In: Proc. of CHI 2009, pp. 357-360, 2009.
[5] K. Rayner. Eye movements in reading and information processing: 20 years of research. Psychological Bulletin, 124(3), p. 372, 1998.
[6] O. Špakov and P. Majaranta. Enhanced gaze interaction using simple head gestures. In: Proc. of UbiComp 2012, pp. 705-710, 2012.
[7] D. Mardanbegi, D.W. Hansen, and T. Pederson. Eye-based head gestures. In: Proc. of ETRA 2012, pp. 139-146, 2012.
[8] K. Pfeuffer, M. Vidal, J. Turner, A. Bulling, and H. Gellersen. Pursuit calibration: making gaze calibration less tedious and more flexible. In: Proc. of UIST 2013, pp. 261-270, 2013.
[9] M. Vidal, A. Bulling, and H. Gellersen. Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets. In: Proc. of UbiComp 2013, pp. 439-448, 2013.
[10] J. Kangas, O. Špakov, P. Isokoski, D. Akkil, J. Rantala, and R. Raisamo. Feedback for smooth pursuit gaze tracking based control. In: Proc. of AH 2016, pp. 1-8, 2016.
[11] M. Dhuliawala, J. Lee, J. Shimizu, A. Bulling, K. Kunze, T. Starner, and W. Woo. Smooth eye movement interaction using EOG glasses. In: Proc. of MUI 2016, pp. 307-311, 2016.
[12] S. Jalaliniya and D. Mardanbegi. EyeGrip: detecting targets in a series of uni-directional moving objects using optokinetic nystagmus eye movements. In: Proc. of CHI 2016, pp. 5801-5811, 2016.
[13] O. Špakov, P. Isokoski, J. Kangas, D. Akkil, and P. Majaranta. PursuitAdjuster: an exploration into the design space of smooth pursuit-based widgets. In: Proc. of ETRA 2016, pp. 287-290, 2016.
[14] Pupil Labs. https://pupil-labs.com, accessed 2016-11-28.