Public Roads
This magazine is an archived publication and may contain dated technical, contact, and link information.
Publication Number: FHWA-HRT-16-006
Issue No: Vol. 80 No. 2
Date: September/October 2016

 

Where Were You Looking?

by Ashley A. Stafford Sewall, William A. Perez, and C.Y. David Yang

FHWA equips field research vehicles with tracking technology to gather data on drivers’ eye glances to recommend practices and countermeasures to improve transportation safety.

An FHWA research team instrumented this field research vehicle with a camera-based eye-tracking system to determine where motorists are looking while navigating the driving environment.

Driving is primarily a visual task requiring constant visual and cognitive attention to navigate an ever-changing environment successfully. A loss of vehicle control can occur when drivers divert their eyes at a critical moment.

Crash statistics suggest that distracted and drowsy driving might be common occurrences. In a 2015 report, Traffic Safety Facts: Distracted Driving 2013, the National Highway Traffic Safety Administration (NHTSA) found that 10 percent of all fatal crashes and 18 percent of injury crashes in the United States were distraction-affected. In addition, NHTSA recorded 846 fatalities related to drowsy and fatigued driving in 2014.

“Considering the nature of the driving task and the potentially severe consequences of drivers taking their eyes off the road, it is important to examine where motorists are looking while driving,” says Monique Evans, director of the Federal Highway Administration’s Office of Safety Research and Development. “This task can be accomplished unobtrusively and in real time with camera-based eye-tracking technologies.”

The Human Factors Laboratory at FHWA’s Turner-Fairbank Highway Research Center uses driving simulators and field research vehicles to gain a deeper understanding of the behavior and performance of road users, including how drivers perceive, process, and respond to changing elements in the driving environment.

The Technology’s History

Given that the driving task requires visual attention, there are many benefits to using eye-tracking technology to examine drivers’ scan and search patterns. Researchers have attempted to track eye movements since the late 1800s. However, early eye-tracking systems were not practical for naturalistic investigations because they usually involved heavy equipment and typically required a participant’s head to remain in a fixed position throughout the data collection. According to R.J. Jacob and K.S. Karn in a chapter of The Mind’s Eye: Cognitive and Applied Aspects of Eye Movement Research titled “Eye Tracking in Human-Computer Interaction and Usability Research: Ready to Deliver the Promises,” many of the earliest systems were extremely invasive and required direct mechanical contact with the participant’s eye (that is, the cornea). In the decades that followed, researchers developed noninvasive methods using light reflection, photographic, and motion picture techniques.

Timeline of the Development of Eye-Tracking Technology

Figure. The timeline presents a brief history of the development of these systems, from the earliest observations of eye movements in 1879 to the present, in which precise, accurate, and unobtrusive eye trackers are commercially available:
1879: Louis Émile Javal makes the earliest observations of eye movements during reading and discovers that the eyes move in fixations and saccades.
1901: Dodge and Cline develop the first nonintrusive eye-tracking technique, using light reflected from the cornea.
1947: Fitts, Jones, and Milton use motion picture cameras to study pilots’ eye movements when using cockpit controls and instruments.
1948: Hartridge and Thompson build the first head-mounted eye tracker.
1950s and 1960s: Alfred L. Yarbus discovers that the task given to the observer affects the movement of the eyes.
1970s: Significant advancements in eye-tracking technologies; psychological theory is used to link eye-tracking data to cognitive processes.
1980s: Real-time eye tracking allows for the exploration of human-computer interaction.
1990s: Eye trackers are used to examine the usability of products and technologies (such as Web sites and email).
2000s: A variety of unobtrusive eye trackers are available for commercial use and operate in real time with precision and accuracy.
Advancements in eye-tracking technology enable researchers to investigate important problems via real-time, camera-based systems that are nonintrusive and accurate. This timeline presents a brief history of the development of these systems.

The 1970s brought advances in eye-tracking technologies and, during this period, researchers made efforts to link psychological theory and cognitive processes to eye-tracking data. In the 1980s, various research teams developed the first devices to track eye movements in real time (rather than post-processing) as a means of examining how humans interact with computers.

The eye-tracking technology in FHWA’s new field research vehicle enables researchers to observe drivers’ eye movements in real time while the participant is driving under real-world conditions. On a computer monitor mounted on the back of the passenger seat, an in-vehicle researcher can see the view of the driver’s face captured by the dashboard-mounted cameras in addition to a close-up of the driver’s pupil.

In recent years, researchers and engineers have developed precise remote eye trackers that do not require participants to be constrained, enabling researchers to broaden the scope of their investigations. These systems allow the participant to move freely, within certain boundaries, and boast high accuracy in tracking eye movements (typically within 1 degree or better). Most relevant here, the emergence of remote eye-tracking systems has made it possible for researchers to investigate questions in the naturalistic setting of driving an instrumented vehicle or a driving simulator.

Eye-Tracking Research at FHWA

The primary goal of FHWA’s Human Factors Program is to create a safer transportation environment for all road users. Addressing areas such as traffic control devices, intersections, pedestrians and bicyclists, and intelligent transportation systems, the Human Factors team conducts empirical research and develops strategies to improve transportation safety.

In particular, the Human Factors Laboratory uses vehicles equipped with remote eye-tracking technology to investigate important research questions about driving safety. Recently, the laboratory added a 2011 sedan to the tools available for eye-tracking research. The system installed in this vehicle consists of three infrared cameras mounted on the dashboard and a digital forward-scene camera (mounted to the right of the rearview mirror) to record the visual scene as viewed by the driver. The camera and an updated data acquisition system represent a technological upgrade from the field research vehicle FHWA used in previous eye-tracking studies.

In an effort to measure the overall performance of this new field research vehicle and eye-tracking system, the FHWA team designed a study to collect data from a variety of drivers on various roadways. The study enabled the researchers to discover and resolve potential problems encountered while using the vehicle. In addition, the research provided a look at the eye-tracking and video data that could be obtained using the new vehicle.

Eye-Tracking Instruments and Methods

Twelve Federal employees participated in the study. Of the 12 drivers, the researchers obtained usable data from 11 (the eye-tracking software malfunctioned during one drive). Six males and five females, averaging 46 years of age (ranging from 33 to 67 years), drove the field vehicle. In addition, two drivers wore eyeglasses and one wore sunglasses during their drives. Traditionally, glasses have presented a challenge for eye-tracking systems that use infrared illumination because of reflections. However, the research team collected usable data from these participants despite their optical corrections.

The research vehicle is equipped with the manufacturer’s standard safety features, such as electronic stability control, airbags, and four-wheel antilock brakes. Also, the team installed custom research tools to equip the vehicle for experimental tasks. The equipment includes an infrared eye-movement measuring system that consists of two strobe lights and three cameras aimed at the driver’s face. The strobes and cameras are mounted on the vehicle’s dashboard.

Photo. Interior of the field research vehicle with labels pointing to three cameras mounted on the dashboard and two infrared flashers, also mounted on the dashboard.
This interior photograph of the field research vehicle shows the three cameras and two infrared (IR) flashers that recorded the driver’s face during the study. Video footage captured during the drive was used to determine where the driver was looking while driving the test route.

According to Dana Duke, senior electronics technician at Leidos, the company that installed the eye-tracking system, “The infrared strobes emit no visible light and are of a sufficiently low intensity and short duration to pose no harm to a driver’s eyes. The information from the strobes and cameras is fed into image-processing equipment located in the rear of the vehicle. This image processing produces vectors for the duration of gaze of each eye, and this gaze vector information is subsequently combined with video from the scene camera so that the gaze vector information can be overlaid on video of the forward scene viewed by the driver.”
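
As a rough illustration of the final step Duke describes, the following sketch overlays a gaze point on each frame of the forward-scene video, assuming the gaze vectors have already been mapped into scene-camera pixel coordinates. This is a minimal Python/OpenCV example; the file names, the one-sample-per-frame format, and the drawing parameters are illustrative assumptions, not the vehicle’s actual image-processing pipeline.

```python
# Hypothetical sketch: overlaying a gaze point ("pipper") on forward-scene
# video, assuming gaze has already been mapped into scene-camera pixel
# coordinates, one (x, y) sample per frame.
import cv2

def overlay_gaze(video_in: str, video_out: str,
                 gaze_samples: list[tuple[int, int]]) -> None:
    """Draw one gaze point per frame as a green circle on the scene video."""
    cap = cv2.VideoCapture(video_in)
    fps = cap.get(cv2.CAP_PROP_FPS)
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    writer = cv2.VideoWriter(video_out, cv2.VideoWriter_fourcc(*"mp4v"),
                             fps, (w, h))
    for x, y in gaze_samples:  # assumed frame-aligned gaze samples
        ok, frame = cap.read()
        if not ok:
            break
        cv2.circle(frame, (x, y), radius=12, color=(0, 255, 0), thickness=3)
        writer.write(frame)
    cap.release()
    writer.release()
```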

To examine the field of view of the scene camera and to investigate the technology under a variety of roadway conditions, the team designed a test route that featured three distinct sections: a two-lane road, a four-lane undivided highway, and a four-lane divided freeway. The positioning of the forward-scene camera (mounted to the right of the rearview mirror) affects how much of the roadway is captured by the video camera. With a narrower field of view, it might be that a driver looks at a sign or object along the roadway, but the forward-scene camera does not capture the object of interest (especially as the vehicle moves closer to the object).

Given the importance of knowing what the driver is looking at, one of the goals of this project was to assess what the scene camera could capture given different roadway geometries. The three types of roadway presented unique opportunities to have drivers look at signs to the right, left, and overhead to ensure that the camera’s range was sufficient to capture relevant signage.

After the researchers obtained informed consent, they provided participants with detailed instructions about the driving task, including an overview of the test route. At the start of the route, the participants adjusted the driver’s seat and the side and rearview mirrors. They then completed a calibration procedure (typically lasting 5 minutes) in which they looked, one at a time, at nine red dots arranged in a 3x3 grid (three rows by three columns). The grid measured 6 feet (1.8 meters) tall by 13 feet (3.9 meters) wide and was positioned on a wall in front of the parked research vehicle. This gaze calibration ensures that the eye tracker is accurately tracking the driver’s eye movements to these known points within the grid. Participants continued this task until the calibration error was less than 2 degrees for both the right and left eyes, meaning that the eye tracker tracked eye movements to within 2 degrees of visual angle.
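
For readers interested in how such a pass/fail criterion can be computed, here is a minimal sketch, assuming the calibration error for each dot is the angle between the measured gaze vector and the known vector from the eye to that dot. The array formats and the use of mean error per eye are assumptions for illustration; the vendor’s actual procedure may differ.

```python
# Hedged sketch of a calibration check: the error for each target dot is
# the angle between the measured gaze vector and the known eye-to-dot
# vector, and calibration passes when both eyes average under 2 degrees.
import numpy as np

def angular_error_deg(measured: np.ndarray, true: np.ndarray) -> np.ndarray:
    """Angle, in degrees, between paired gaze vectors (N x 3 arrays)."""
    m = measured / np.linalg.norm(measured, axis=1, keepdims=True)
    t = true / np.linalg.norm(true, axis=1, keepdims=True)
    cos = np.clip(np.sum(m * t, axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(cos))

def calibration_passes(errors_left: np.ndarray, errors_right: np.ndarray,
                       threshold_deg: float = 2.0) -> bool:
    """Accept calibration only when both eyes average under the threshold."""
    return (np.mean(errors_left) < threshold_deg
            and np.mean(errors_right) < threshold_deg)
```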

Photo. Shown is the dashboard of the field research vehicle and a two-lane road. Superimposed on the road is a red-lined rectangle labeled “Road Ahead.” Above that is a rectangular area labeled “Overhead,” and to the left and right on the sides of the road are areas labeled “Left” and “Right.”
Researchers use analysis software to define sections of the forward scene of the roadway. In this study, they defined four static regions of interest: the road ahead, the right side of the road, the left side of the road, and overhead. The green circle shown superimposed on the video indicates where the driver is looking. In this screen capture, the driver is currently looking at the “road ahead” region of interest.

Next, the participants completed a 12.5-mile (20-kilometer) trip during which an in-vehicle researcher provided the driver with turn-by-turn instructions. Approximately 1 mile (1.6 kilometers) before each turn, the researcher gave the driver a warning and then reminded the driver again as the vehicle approached the turn. Each drive took 25 minutes on average.

The Findings

Although the researchers collected usable eye-tracking data from 11 of the 12 drivers in the study, the team could not upload video data from 2 of those drivers to the analysis software, Multi-modal Analysis of Psychophysical and Performance Signals (MAPPS™), because the calibration data were corrupted. Given this limitation, the researchers analyzed the data from only 9 of the drivers. Using a drawing tool within the MAPPS software, they split the video of the forward driving scene into four static regions of interest (ROIs): the road ahead, the right side of the road, the left side of the road, and overhead. The team analyzed the ROIs across the entirety of each participant’s drive.

Once the researchers had defined the static ROIs, they examined the drivers’ eye glances to these regions to determine where participants looked. On average, they looked at the road ahead for 76 percent of the total driving time, which accounted for the largest percentage. For 13 percent of the total drive time, participants looked at the overhead ROI. Finally, for 4 percent and 2 percent of the time, they looked at the right and left regions, respectively.
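
As an illustration of this kind of analysis, the sketch below classifies frame-by-frame gaze samples into rectangular regions and reports the percentage of drive time spent in each. The rectangle coordinates are placeholders, not the study’s actual MAPPS region definitions.

```python
# Illustrative sketch of the ROI analysis: classify each gaze sample into
# one of the four static regions and report percent of total drive time.
# Rectangle coordinates (scene-camera pixels) are placeholder values.
from collections import Counter

ROIS = {  # (x_min, y_min, x_max, y_max) -- placeholders, not study values
    "overhead":   (0, 0, 1280, 200),
    "left":       (0, 200, 300, 720),
    "road_ahead": (300, 200, 980, 720),
    "right":      (980, 200, 1280, 720),
}

def classify(x: float, y: float) -> str:
    """Return the name of the ROI containing the gaze point, if any."""
    for name, (x0, y0, x1, y1) in ROIS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return "off_scene"

def dwell_percentages(samples: list[tuple[float, float]]) -> dict[str, float]:
    """Percent of samples per ROI (proportional to time at a fixed rate)."""
    counts = Counter(classify(x, y) for x, y in samples)
    total = max(len(samples), 1)
    return {name: 100.0 * n / total for name, n in counts.items()}
```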

Upon closer examination, the percentage of time spent looking at the overhead ROI seemed unusually high. The team discovered that some of the glances to this ROI were potentially a product of a malfunction of the eye-tracking technology. Specifically, three of the drivers had a very high number of glances to the overhead ROI (up to 42 percent of the total driving time). When the team removed those participants from the analysis, the average percentage of time spent looking at the overhead region was only 3 percent instead of 13 percent.

The team visually inspected video data for those unusual cases. That investigation showed several instances in which the pipper (the green circle superimposed on the video to display eye movements) got “stuck” in the upper left-hand portion of the screen, in the overhead ROI. The researchers concluded that those instances of malfunction artificially inflated the average percentage of time that participants spent looking at the overhead ROI.
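
The study team found these cases through visual inspection of the video. Purely as an illustration, an automated screen for this kind of artifact could flag runs of gaze samples that barely move for an implausibly long stretch; the thresholds below are assumptions, not values from the study.

```python
# Minimal sketch of one way to flag the "stuck pipper" artifact: runs of
# gaze samples that stay nearly motionless for an implausibly long time.
import numpy as np

def flag_stuck_samples(xy: np.ndarray, sample_rate_hz: float,
                       max_jitter_px: float = 2.0,
                       min_stuck_s: float = 3.0) -> np.ndarray:
    """Return a boolean mask marking samples inside suspiciously static runs."""
    step = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    static = np.concatenate([[False], step < max_jitter_px])
    mask = np.zeros(len(xy), dtype=bool)
    run_start = None
    for i, s in enumerate(static):
        if s and run_start is None:
            run_start = i - 1  # include the sample that began the run
        elif not s and run_start is not None:
            if (i - run_start) / sample_rate_hz >= min_stuck_s:
                mask[run_start:i] = True
            run_start = None
    if run_start is not None and \
            (len(xy) - run_start) / sample_rate_hz >= min_stuck_s:
        mask[run_start:] = True
    return mask
```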

Figure. Diagram of the four static regions of interest, labeled with average glance percentages: “Overhead 13%” on top; “Left 2%,” “Road Ahead 76%,” and “Right 4%” below.
This figure shows the four static regions of interest and the percentage of time, on average, that drivers spent looking at these regions throughout the study route.

The tracking also recorded eye glances when drivers looked at instruments within or around the vehicle (for example, the instrument panel or the rearview mirror). For 1.5 percent of the total trip time, participants looked at the rearview mirror. Overall, on average, participants did not spend much time looking at the instruments, mirrors, or the radio within the vehicle (0.12 percent for the instrument panel, 0.02 percent for the right mirror, 0.24 percent for the left mirror, and 0.05 percent for the radio). Drivers might have felt the need to concentrate a majority of their visual attention on the roadway ahead, given the nature of the driving task in this study, limiting their glances to objects within the vehicle.

The research team also examined potential differences in drivers’ eye glances to static ROIs when driving on the three roadway types. Of the 12 comparisons analyzed (3 roadway types and 4 static ROIs), 3 were statistically significant. On average, the percentage of time spent looking at the left-side ROI was almost 2 percentage points higher when participants were driving on the four-lane section of the test route than when driving on the freeway. In addition, the percentage of time spent looking at the left-side ROI was, on average, 1.5 percentage points higher when driving on the four-lane section of roadway than when driving on the two-lane section. Finally, the percentage of time spent looking at the road-ahead ROI was 10 percentage points higher when driving on the four-lane roadway than when driving on the freeway.
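
The article does not state which statistical test the team used; a paired comparison of per-driver dwell percentages is one plausible approach. The sketch below, with an assumed data layout and alpha level, shows how such a comparison could be run for one ROI and one pair of roadway types.

```python
# Hedged sketch of a pairwise comparison like those reported above: for a
# given static ROI, compare per-driver dwell percentages between two
# roadway types with a paired t-test. The DataFrame layout, the choice of
# test, and the alpha level are assumptions, not the study's method.
import pandas as pd
from scipy.stats import ttest_rel

def compare_roadways(df: pd.DataFrame, roi: str, road_a: str, road_b: str,
                     alpha: float = 0.05) -> tuple[float, bool]:
    """df: one row per (driver, roadway_type), one dwell-percent column per ROI."""
    a = df[df.roadway_type == road_a].set_index("driver")[roi]
    b = df[df.roadway_type == road_b].set_index("driver")[roi]
    a, b = a.align(b, join="inner")  # keep only drivers with both drives
    stat, p = ttest_rel(a, b)
    return p, p < alpha
```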

Gaze Direction Quality

Figure. Two gaze quality graphs, the one on the left labeled “Stable tracking and high quality” and the one on the right labeled “Inconsistent tracking and periods of high and low quality.” The stable graph shows an almost level line around 0.9 with only a few dips to 0.6. The graph on the right varies constantly, with lows at 0.0 and highs ranging from 0.6 to 0.8.
Depending on the circumstances of the drive, there might be fluctuations in the quality of the eye-tracking measures. The vertical axis of these graphs shows Gaze Direction Quality values from 0.0 to 1.0, in which 1.0 represents the best quality possible. The horizontal axis shows elapsed drive time as a percentage of the total drive, from 0 to 100. The graph on the left shows stable, high-quality eye tracking throughout the drive, while the graph on the right shows significant variability in eye-tracking quality with only brief periods of high gaze quality.

Further, the team assessed changes in eye-tracking quality throughout the drive. Although momentary fluctuations in overall gaze quality might occur, it is critical that gaze quality remains relatively high and consistent throughout the drive to ensure the accuracy of the eye-tracking data.

Gaze direction quality is a measure of how well the eye tracker was able to track the driver’s gaze throughout the drive. This value ranges from 0.0 (poor quality, the system is not tracking) to 1.0 (good quality, the system is tracking) and varies based on three main factors: the strength and visibility of the pupil-iris edge, the elliptical shape of the pupil in the image plane, and the consensus of the three cameras throughout the measurement process.
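
Given such a 0.0-to-1.0 quality signal, a simple screening step can summarize each drive and flag those whose tracking was low or unstable. The cutoff values in this sketch are assumptions, not thresholds from the study.

```python
# Illustrative sketch of screening a drive by gaze direction quality:
# summarize the 0.0-1.0 quality signal and flag drives whose quality is
# low on average or highly variable. Cutoffs are assumed, not the study's.
import numpy as np

def quality_summary(quality: np.ndarray,
                    min_mean: float = 0.7,
                    max_std: float = 0.15) -> dict:
    """Flag a drive whose gaze quality is low on average or unstable."""
    mean_q, std_q = float(np.mean(quality)), float(np.std(quality))
    return {
        "mean": mean_q,
        "std": std_q,
        "usable": mean_q >= min_mean and std_q <= max_std,
    }
```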

Participants who wore glasses or sunglasses showed the largest variations in this measure, with gaze direction quality continuously seesawing between high and low. Yet the effect of glasses and sunglasses on gaze quality was not consistent, and not all of the participants who wore glasses had poor gaze direction quality. It might be that certain lenses affect gaze quality more than others do; however, the team did not collect data on the type of glasses worn by the participants. Although eye-tracking data can be collected from drivers wearing glasses and sunglasses, these data should be interpreted cautiously because gaze direction quality may vary significantly.

What Does All This Mean?

Field research vehicles equipped with eye-tracking technology provide researchers with insight into drivers’ eye glances when driving under real-world conditions. Working with the software developers, the researchers remedied the technical malfunctions with the eye tracker and with importing the video data into the analysis software. These adjustments can help ensure that the problems do not recur during future studies, where smooth system performance is critical.

In addition, the study provided a better understanding of the limitations of tracking drivers who wear glasses or sunglasses. The eye tracker in the new field research vehicle can track those drivers’ eye movements successfully, yet the gaze direction quality values show that eye tracking with participants wearing glasses or sunglasses might not always be consistent.

FHWA’s new vehicle enables researchers to examine on-road driving and accurately track drivers’ eye movements to regions throughout the visual scene and to signs located above and along the right and left sides of the roadway.

“Tracking drivers’ glances with real-time, on-road driving is crucial to understanding and reducing driving behaviors that result in drivers looking away from the roadway at critical moments,” says Associate Administrator Michael F. Trentacoste, head of the Office of Research, Development, and Technology at FHWA. “With the recent addition of the new field research vehicle to the tools available for research at the Turner-Fairbank Highway Research Center, the Human Factors team can continue to make important strides in safety for drivers and other road users.”


Ashley A. Stafford Sewall, Ph.D., is a research psychologist with the Leidos human factors support team in FHWA’s Office of Safety R&D. She received her B.S. from Erskine College and her M.S. and Ph.D. in human factors from Clemson University.

William A. Perez, Ph.D., is the Leidos program manager on the human factors contract supporting FHWA’s Office of Safety R&D. He earned a B.A. in psychology from the University of Cincinnati and his M.A. and Ph.D. in experimental psychology from Miami University in Oxford, OH.

C. Y. David Yang, Ph.D., is the leader of the Human Factors team in FHWA’s Office of Safety R&D. He joined FHWA in 2008. Yang is the chair of the Transportation Research Board’s Users Performance Section and serves on the editorial boards of the Journal of Intelligent Transportation Systems and International Journal of Transportation Science and Technology. He received his B.S., M.S., and Ph.D. degrees in civil engineering from Purdue University. His doctoral dissertation used principles of human information processing and human factors to develop design recommendations for advanced traveler information systems.

For more information, contact David Yang at 202–493–3284 or david.yang@dot.gov.

 

 
