
Federal Highway Administration Research and Technology
Coordinating, Developing, and Delivering Highway Transportation Innovations

 

 

Breakthrough in Computer Vision for Highway Transportation Research

Figure shows four photos in a two-by-two arrangement. The top left photo is a street scene with homes and parked cars on the right and a crosswalk with a pedestrian crossing the street on the right of the photo. A grey arrow points to the right between the left and right photos. The top right photo is the same street scene, but the three parked cars are colored bright green, red, and blue. The bottom left and right photos are all black, with the outlines of the three parked cars reversed out in white. The photos demonstrate the ability to identify and separate cars automatically in an image.

Figure caption 1: Computer vision approaches identify and separate vehicles in a street scene automatically for highway safety analysis.
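As a rough illustration of the kind of automatic vehicle identification shown in figure 1, the short sketch below applies a pretrained, off-the-shelf instance-segmentation model. The choice of torchvision's Mask R-CNN and the image file name street_scene.jpg are assumptions for illustration only; this is not the method used by any of the sponsored projects.

# Minimal instance-segmentation sketch (assumption: torchvision's pretrained
# Mask R-CNN; not the specific approach used by the EAR-sponsored projects).
import torch
import torchvision
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

# Load a COCO-pretrained detector; COCO label 3 corresponds to "car".
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

# Hypothetical street-scene frame, converted to a float tensor in [0, 1].
image = convert_image_dtype(read_image("street_scene.jpg"), torch.float)

with torch.no_grad():
    output = model([image])[0]  # dict with "boxes", "labels", "scores", "masks"

# Keep confident car detections and their binary masks, mirroring the
# separated-vehicle outlines shown in figure 1.
CAR_LABEL, SCORE_THRESHOLD = 3, 0.7
keep = (output["labels"] == CAR_LABEL) & (output["scores"] > SCORE_THRESHOLD)
car_masks = output["masks"][keep] > 0.5
print(f"Found {int(keep.sum())} cars; mask tensor shape: {tuple(car_masks.shape)}")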

 

Highway transportation researchers are collecting and analyzing an increasing amount of video data from new and enhanced visual and other sensors to conduct research in areas such as system planning, operations, safety, and infrastructure condition assessment. While the research community is fortunate to be able to collect more and better data, the volume of data threatens to overwhelm the capacity to assess it with current methods. Automating data extraction from video files is expected to dramatically reduce the cost of using these data, making them accessible to the widest possible pool of researchers. FHWA’s Exploratory Advanced Research (EAR) Program, working with the Office of Safety R&D, is sponsoring six research projects that will explore breakthroughs in machine learning to automate the extraction of safety data from naturalistic driving studies (NDS) such as the second Strategic Highway Research Program (SHRP2). More information about four of the projects can be found through the links at the end of this article.

 

Figure shows a red car with a passenger viewed from above, with part of the car cut away to show the entire body of the driver. The figure illustrates the possible relationship between the driver and the roadway context outside the vehicle. To the left of the car, a bicyclist rides in the adjacent lane, with a text box listing the contextual features (passengers, pedestrians, bicycles, vehicles, brake lights, signals, traffic condition, weather conditions, cell phones, and sat. nav.). To the right of the car, two pedestrians stand on the double yellow lines waiting to cross the street, with a text box listing the driver features (head pose, gaze, eye blinks, mouth movement, facial expressions, hands, gestures, and actions).

Figure caption 2: Video data include cabin and roadway views. This figure illustrates possible relationships between observed driver behavior and the roadway context outside the vehicle, which includes other vehicles, cyclists, pedestrians, and lane markings.
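The driver and contextual features listed in figure 2 can be thought of as two linked records extracted for each video frame. The sketch below is only an illustrative data layout in Python; the field names echo the features in the figure and do not represent a published schema from the NDS or the sponsored projects.

# Illustrative schema only (an assumption, not a published format): pairs
# driver-facing features with roadway-context features for one video frame.
from dataclasses import dataclass

@dataclass
class DriverFeatures:
    head_pose: tuple        # (yaw, pitch, roll) in degrees
    gaze_direction: tuple   # e.g., a unit vector or (azimuth, elevation)
    eyes_closed: bool
    hands_on_wheel: int     # 0, 1, or 2 hands detected
    facial_expression: str

@dataclass
class RoadwayContext:
    vehicles: int
    pedestrians: int
    bicycles: int
    brake_lights_ahead: bool
    traffic_condition: str
    weather: str

@dataclass
class FrameAnnotation:
    timestamp_s: float      # time offset into the trip video
    driver: DriverFeatures
    context: RoadwayContext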

 

 

Figure shows two side-by-side head-and-shoulder shots of a driver. The left image is the actual driver with his critical features (i.e., eyes, nose, and mouth) outlined in red dots. The right image is an avatar of the driver’s face that obscures his identity while still showing his head pose, eye gaze direction, and facial expressions.

Figure caption 3: Privacy can be a barrier to sharing video data of drivers among researchers. New methods for obscuring facial features while preserving critical features such as head pose, eye gaze direction, and facial expressions can allow greater access.
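As a simplified illustration of the de-identification idea in figure 3, the sketch below detects the driver’s face and blurs it, assuming OpenCV’s bundled Haar cascade face detector and a hypothetical frame file named cabin_frame.jpg. The sponsored research goes further, replacing the face with an avatar that preserves head pose, eye gaze direction, and facial expressions rather than simply blurring it.

# Minimal de-identification sketch (assumption: OpenCV Haar cascade + blur).
# The research described above replaces the face with an avatar that keeps
# pose, gaze, and expression; this sketch only hides identity.
import cv2

frame = cv2.imread("cabin_frame.jpg")  # hypothetical in-cabin video frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    # Blur the face region in place; pose and gaze would be estimated upstream
    # and stored as numeric features, so nothing identifiable remains in video.
    frame[y:y + h, x:x + w] = cv2.GaussianBlur(frame[y:y + h, x:x + w], (51, 51), 0)

cv2.imwrite("cabin_frame_deidentified.jpg", frame)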

 

 

The SHRP2 safety area includes the types of data described above in its NDS. The amount of data can be orders of magnitude larger than what highway researchers have worked with previously, with the resulting data set expected to exceed one petabyte. The NDS involved collecting data on the daily travels of 3,150 volunteer drivers, whose vehicles were heavily instrumented for the study. Those drivers traveled 49.5 million miles during the study period, resulting in over 1.2 million hours of video and vehicle data. See http://www.trb.org/StrategicHighwayResearchProgram2SHRP2/Public/Pages/Safety_153.aspx for more information on the SHRP2 safety area.
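For a rough sense of scale, the back-of-the-envelope calculation below assumes the data set is about one petabyte, the lower bound noted above; the driver, mileage, and video-hour figures come directly from the study.

# Back-of-the-envelope scale check (assumption: total data set of ~1 petabyte,
# the lower bound mentioned above; the other figures are from the study).
PETABYTE_GB = 1_000_000        # decimal gigabytes in one petabyte
total_gb = 1 * PETABYTE_GB
drivers = 3_150
miles = 49_500_000
video_hours = 1_200_000

print(f"~{total_gb / video_hours:.2f} GB per recorded hour")   # about 0.83 GB/hour
print(f"~{total_gb / miles * 1000:.0f} MB per mile driven")    # about 20 MB/mile
print(f"~{total_gb / drivers:.0f} GB per volunteer driver")    # about 317 GB/driver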

FHWA is also working with experts at Oak Ridge National Laboratory to test the research results and develop methods and test data sets for future researchers to use in comparing and validating the performance of automated data extraction algorithms.

Current methods for assessing the data rely on a mix of automated and manual, frame-by-frame coding that cannot keep pace with massive data streams and produces results that are not sufficiently consistent or error free. Large data sets in their present form are too time consuming and expensive to analyze using traditional data extraction methods, yet they are essential for understanding the context around rare events such as crashes.
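One widely used way to reduce the manual coding burden is to let simple kinematic triggers flag candidate safety-critical events, such as hard braking, so that coders review only those short video segments. The sketch below is a generic illustration of that idea, assuming a longitudinal acceleration trace is available per trip; it is not an algorithm from any of the sponsored projects, and the -0.45 g threshold is an assumed example value.

# Generic trigger-based event flagging sketch (assumption: a sampled
# longitudinal acceleration trace in g; not a sponsored project's algorithm).
from typing import List, Tuple

def flag_hard_braking(accel_g: List[float], sample_hz: float,
                      threshold_g: float = -0.45) -> List[Tuple[float, float]]:
    """Return (start_s, end_s) windows where deceleration exceeds the threshold,
    so only those video segments need manual frame-by-frame review."""
    events, start = [], None
    for i, a in enumerate(accel_g):
        if a <= threshold_g and start is None:
            start = i
        elif a > threshold_g and start is not None:
            events.append((start / sample_hz, i / sample_hz))
            start = None
    if start is not None:
        events.append((start / sample_hz, len(accel_g) / sample_hz))
    return events

# Example: a 10 Hz trace with one braking episode between 0.4 s and 0.7 s.
trace = [0.0, 0.0, -0.1, -0.2, -0.5, -0.6, -0.5, -0.2, 0.0, 0.0]
print(flag_hard_braking(trace, sample_hz=10.0))  # [(0.4, 0.7)]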

 

Figure shows a user interface screen with three photo images side by side. The first photo shows the driver’s face, the second photo shows the driver’s hands, and the last photo shows the roadway in front of the vehicle with numbers appearing at the bottom left of the image. Below the images are four control buttons on the left and an engagement meter on the right showing even numbers from 0 (devoid of concentration) to 10 (complete concentration), with a “Go to Next” button at the very bottom.

Figure caption 4: This graphic shows a user interface for viewing data streams, including in-cabin video and roadway video.

 

ITS America conducted a technology scan for the USDOT Intelligent Transportation Systems (ITS) Joint Program Office. The scan report, "Connected Vehicles: Trends in Computer Vision," is located at http://www.itsa.org/knowledgecenter/technologyscan. As the technology moves from stand-alone to integrated systems, and from systems that provide driver or operator warnings to systems that directly control vehicle and infrastructure functions (e.g., steering, acceleration, braking, and traffic signals), there is a critical need for compatible systems architecture and common conditions against which multiple manufacturers can test their equipment and software.

More information on four of the projects is available by clicking on the links listed below:

The other two projects are:

 

 

Federal Highway Administration | 1200 New Jersey Avenue, SE | Washington, DC 20590 | 202-366-4000
Turner-Fairbank Highway Research Center | 6300 Georgetown Pike | McLean, VA 22101