Artificial Passenger

The AP is an artificial-intelligence-based companion that resides in software and chips embedded in the automobile dashboard. The heart of the system is a conversation planner that holds a profile of you, including details of your interests and profession, and uses it to engage you in conversation. A microphone picks up your answers, and speech-recognition software breaks them down into separate words. A camera built into the dashboard also tracks your lip movements to improve the accuracy of the speech recognition. A voice analyzer then looks for signs of tiredness by checking whether each answer matches your profile: slow responses and a lack of intonation are signs of fatigue.
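The two fatigue cues named above (slow responses, flat intonation) can be sketched as a simple scoring function. The thresholds, weights, and parameter names here are illustrative assumptions, not part of the actual Artificial Passenger design:

```python
# Toy sketch of the voice analyzer's fatigue check. All numeric
# thresholds and weights are assumptions for illustration only.

def fatigue_score(response_delay_s, pitch_variance, expected_delay_s=1.0):
    """Return a score in [0, 1]; higher means more signs of fatigue."""
    # Slow responses: delay beyond the driver's usual profiled value.
    slowness = max(0.0, response_delay_s - expected_delay_s) / expected_delay_s
    # Lack of intonation: low pitch variance suggests a flat, tired voice.
    flatness = 1.0 / (1.0 + pitch_variance)
    # Combine the two cues with equal weight, capped at 1.0.
    return min(1.0, 0.5 * min(slowness, 1.0) + 0.5 * flatness)

print(fatigue_score(3.0, 0.1))  # slow, monotone answer -> high score
print(fatigue_score(0.8, 5.0))  # quick, lively answer -> low score
```

A real system would derive `expected_delay_s` from the stored driver profile rather than a constant.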

This research suggests that we can predict various aspects of driver performance from the movements of a driver's eyes, and that a system can eventually be developed to capture this data and use it to alert people when their driving has become significantly impaired by fatigue.

What is an Artificial Passenger?

  • Natural-language e-companion.
  • Sleep-preventive device in cars to overcome drowsiness.
  • Life-safety system.

What does it do?

  • Detects alarm conditions through sensors.
  • Broadcasts pre-stored voice messages over the speakers.
  • Captures images of the driver.
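The detect-and-respond behavior in the list above can be sketched as a small loop that maps sensor readings to alarm conditions and plays the matching pre-stored message. The sensor fields, thresholds, and message texts are hypothetical:

```python
# Minimal sketch of the detect-and-respond cycle. Sensor names,
# thresholds, and messages are assumptions for illustration.

PRESTORED_MESSAGES = {
    "eyes_closed": "Wake up! Your eyes have been closed too long.",
    "drowsiness": "You seem drowsy. Please pull over and rest.",
}

def check_alarms(sensor_readings):
    """Map raw sensor readings to alarm conditions (assumed thresholds)."""
    alarms = []
    if sensor_readings.get("blink_duration_ms", 0) > 500:
        alarms.append("eyes_closed")
    if sensor_readings.get("head_nod_angle_deg", 0) > 20:
        alarms.append("drowsiness")
    return alarms

def respond(alarms, play=print):
    # Broadcast the matching pre-stored voice message over the speakers.
    for alarm in alarms:
        play(PRESTORED_MESSAGES[alarm])

respond(check_alarms({"blink_duration_ms": 700, "head_nod_angle_deg": 25}))
```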

Head-mounted and remote systems

The difference between the head-mounted and remote eye systems is how the eye tracker collects eye movement data. Head-mounted systems, since they are fixed on a user's head and therefore allow for head movement, use multiple data points to record eye movement. To differentiate eye movement from head movement, these systems measure the pupil glint from multiple angles. Since the unit is attached to the head, a person can move about when operating a car or flying a plane, for example.

For instance, human factors researchers have used head-mounted eye-tracking systems to study pilots' eye movements as they used cockpit controls and instruments to land airplanes (Fitts, Jones, and Milton 1950). These findings led to cockpit redesigns that improved usability and significantly reduced the likelihood of incidents caused by human error. More recently, head-mounted eye-tracking systems have been used by technical communicators to study the visual relationship between personal digital assistant (PDA) screen layout and eye movement.

Remote systems, by contrast, measure the orientation of the eye relative to a fixed unit such as a camera mounted underneath a computer monitor. Because remote units do not measure the pupil glint from multiple angles, a person's head must remain almost motionless during task performance. Although head restriction may seem like a significant hurdle to overcome, Jacob and Karn (2003) attribute the popularity of remote systems in usability to their relatively low cost and high durability compared with head-mounted systems.

Since remote systems are usually fixed to a computer screen, they are often used for studying onscreen eye motion. For example, cognitive psychologists have used remote eye-tracking systems to study the relationship between cognitive scanning styles and search strategies (Crosby and Peterson 1991). Such eye-tracking studies have been used to develop and test existing visual search cognitive models. More recently, human-computer interaction (HCI) researchers have used remote systems to study computer and Web interface usability.

Through recent advances in remote eye-tracking equipment, a range of head movement can now be accommodated. For instance, eye-tracking hardware manufacturer Tobii Technology now offers a remote system that uses several smaller fixed sensors placed in the computer monitor frame so that the glint underneath the pupil is measured from multiple angles. This advance eliminates the need for participants in eye-tracking studies to remain perfectly still during testing, making it possible to conduct longer studies using remote systems.

Software: Data collection, analysis, and representation

Data collection and analysis are handled by eye-tracking software. Although some software packages are more sophisticated than others, all share common features. The software catalogs eye-tracking data in one of two ways. In the first, data are stored in video format: ERICA's Eye Gaze[TM] software, for instance, uses a small red x to represent eye movement, which is useful for observing that movement in relation to external factors such as user verbalizations. In the second, data are stored as a series of x/y coordinates related to specific grid points on the computer screen.

Data can be organized in various ways (by task or participant, for example) and broken down into fixations and saccades that can be visually represented onscreen. Fixations, which typically last between 250 and 500 milliseconds, occur when the eye is focused on a particular point on the screen. Fixations are most commonly measured by duration and frequency: if, for instance, a banner ad on a Web page receives lengthy and numerous fixations, it is reasonable to conclude that the ad is successful in attracting attention. Saccades, which usually last between 25 and 100 milliseconds, move the eye from one fixation to the next. When saccades and fixations are sequentially organized, they produce scanpaths. If, for example, a company would like to know why people are not clicking an important link in what it considers a prominent part of the page, a scanpath analysis would show how people visually progress through the page. Such an analysis might show that the link is poorly placed because it sits on a part of the screen that receives little eye traffic.
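The grouping of raw x/y gaze samples into fixations and a scanpath can be illustrated with a toy dispersion-based filter. The pixel threshold and minimum sample count here are assumptions; commercial eye-tracking software uses considerably more sophisticated classification:

```python
# Toy illustration: group consecutive gaze samples that stay close
# together into fixations; the jumps between groups are saccades.
# The 25-pixel dispersion threshold is an assumed value.
import math

def detect_fixations(samples, max_dispersion=25.0, min_samples=3):
    """Return the scanpath: fixation centroids in the order they occur."""
    scanpath, current = [], []
    for x, y in samples:
        if current:
            cx = sum(p[0] for p in current) / len(current)
            cy = sum(p[1] for p in current) / len(current)
            if math.hypot(x - cx, y - cy) > max_dispersion:
                # The eye jumped: a saccade ends the current fixation.
                if len(current) >= min_samples:
                    scanpath.append((round(cx), round(cy)))
                current = []
        current.append((x, y))
    if len(current) >= min_samples:
        cx = sum(p[0] for p in current) / len(current)
        cy = sum(p[1] for p in current) / len(current)
        scanpath.append((round(cx), round(cy)))
    return scanpath

samples = [(100, 100), (102, 101), (99, 103),   # dwell near (100, 101)
           (400, 300), (401, 302), (399, 299)]  # dwell near (400, 300)
print(detect_fixations(samples))  # -> [(100, 101), (400, 300)]
```

Fixation duration and frequency, the measures named above, then fall out of counting samples per group and groups per region.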

Detailed description of preferred embodiments

Before explaining the disclosed embodiment of the present invention in detail, it is to be understood that the invention is not limited in its application to the details of the particular arrangement shown, since the invention is capable of other embodiments. Also, the terminology used herein is for the purpose of description and not of limitation.

The novel invention can analyze video sequences of a driver to determine when the driver is not paying adequate attention to the road. The invention collects data with a single camera that can be placed on the car dashboard. The system focuses on rotation of the head and eye blinking, two important cues for determining driver alertness, to assess the driver's vigilance level. The head tracker follows the lip corners, eye centers, and sides of the face. Automatic initialization of all features is achieved using color predicates and a connected-components algorithm. A connected-components algorithm is one in which every element in a component has a given property, and each element is adjacent to another element in the component by being to its left, right, above, or below.

Other types of connectivity can also be allowed. An example illustrates the idea: given various land masses, each land mass is a connected component because the water separates them. If a bridge were built between two land masses, however, the bridge would join them into a single land mass. A connected component, then, is one in which every element is reachable from any other element in the component.
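The land-mass example above can be made concrete with a standard 4-connected component count over a binary grid, where 1-cells are "land" and cells touching on the left, right, above, or below belong to the same component. This is a generic textbook sketch, not the feature-initialization code of the invention itself:

```python
# Count 4-connected components of 1-cells in a binary grid by
# flood-filling each unvisited "land" cell.
from collections import deque

def connected_components(grid):
    """Return the number of 4-connected groups of 1-cells in grid."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and not seen[r][c]:
                count += 1
                queue = deque([(r, c)])  # flood-fill one "land mass"
                seen[r][c] = True
                while queue:
                    cr, cc = queue.popleft()
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = cr + dr, cc + dc
                        if (0 <= nr < rows and 0 <= nc < cols
                                and grid[nr][nc] == 1 and not seen[nr][nc]):
                            seen[nr][nc] = True
                            queue.append((nr, nc))
    return count

# Two separate "land masses"...
islands = [[1, 1, 0, 0],
           [1, 0, 0, 1],
           [0, 0, 1, 1]]
print(connected_components(islands))  # -> 2
```

Setting the cells between the two groups to 1 acts as the "bridge" in the analogy and merges them into a single component. Diagonal (8-connected) adjacency is one of the "other types of connectivity" mentioned above.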





Copyright © V2computers 2007 through 2018