This report is an archived publication and may contain dated technical, contact, and link information
Publication Number: FHWA-HRT-13-052
Date: May 2013

 

The Exploratory Advanced Research Program

Vehicle Positioning, Navigation, and Timing: Leveraging Results From EAR Program-Sponsored Research

WORKSHOP SUMMARY REPORT   •   November 2012

PART ONE: PRESENTATIONS

Next Generation Vehicle Positioning
David Bevly, Auburn University

David Bevly, along with three graduate students from Auburn University (Jordan Britt, Chris Rose, and Scott Martin), presented a talk on the status of Auburn's next generation vehicle positioning work. The objective of the project was to provide ubiquitous precise positioning for vehicle safety and automation in the presence of GPS degradation. Auburn partnered with Kapsch TrafficCom, Penn State University, and SRI International (SRI) on this project. The project scope was to assess diverse positioning and data-fusion techniques, characterize achievable accuracy and robustness, and test and demonstrate capabilities in test track and roadway scenarios. The technical approach of this study was to fuse the outputs of the various technologies in an extended Kalman filter, which smooths multiple measurements, exploits each sensor's accuracy, and mitigates faults, as shown in figure 1.

A diagram provides an overview of a range of technologies with arrows connecting them to a Fusion Algorithm circle in the center. From left to right the technologies shown are labeled: camera—lateral position, DSRC—ranging, limited GPS, visual odometry—longitudinal position, lidar—lateral position.

Figure 1. Diagram. The study fuses outputs from various technologies.
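The fusion step can be sketched in a few lines of Python. The fragment below is a minimal, loosely coupled Kalman filter loop over a simplified four-state model; the state vector, noise values, and measurement matrices are illustrative assumptions, not Auburn's actual implementation.

```python
import numpy as np

# Minimal loosely coupled fusion sketch (illustrative only): a 2D
# constant-velocity state [x, y, vx, vy] propagated at IMU rate and
# corrected by whichever sensor reports next.
x = np.zeros(4)                 # state estimate
P = np.eye(4)                   # state covariance
Q = 0.01 * np.eye(4)            # process noise (assumed)

def predict(dt):
    global x, P
    F = np.eye(4)
    F[0, 2] = F[1, 3] = dt      # constant-velocity transition
    x = F @ x
    P = F @ P @ F.T + Q

def update(z, H, R):
    """Standard Kalman measurement update for one sensor's output."""
    global x, P
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P

# A GPS fix updates position; a camera lane measurement updates lateral
# position only. Each sensor contributes through its own H and R.
H_gps = np.array([[1, 0, 0, 0], [0, 1, 0, 0]])
H_cam = np.array([[0, 1, 0, 0]])
predict(0.01)
update(np.array([1.0, 2.0]), H_gps, 0.5 * np.eye(2))
update(np.array([1.9]), H_cam, np.array([[0.1]]))
```

Because each sensor enters only through its own measurement model, a faulty or unavailable sensor can simply be skipped, which is what makes this structure attractive for fault mitigation.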

Each technology was analyzed based on cost, availability, six degrees-of-freedom (DOF) positioning, three DOF positioning, drift, environmental influences, and requirements for infrastructure, a map, and central processing unit (CPU) resources. A chart showing the results of this analysis for each examined technology is included in figure 2.

A diagram in the form of a table lists technologies on the rows and capabilities along the columns. The row titles are GPS, INS, Wheel Speed, PSU-Road Fingerprinting, AU-LDW, LIDAR, Camera, SRI-Visual Odometry, Kapsch-Gantry. The column titles are Cost, Current Availability, Six DOF Position, Three DOF Position, Drifting Solution, Infrastructure Requirement, Map Requirement, CPU Requirement, Environmental Influences. The key indicates that a green tick represents “no concern, current system capabilities not affected by criterion”; a yellow tick represents “some concern, criterion may limit implementation or capability”; and a red cross represents “criterion cannot be overcome without additional subsystems.”

Figure 2. Diagram. The subsystem capability analysis matrix.

(NOTE: DOF = degrees of freedom, CPU = central processing unit, PSU = Penn State University,
AU-LDW = Auburn lane-departure warning, LIDAR = light detection and ranging, SRI = SRI International.)

The first two navigation technologies presented were GPS and an inertial navigation system (INS). GPS fared well in availability, three DOF positioning, and drift-free operation, but it did not provide a six DOF position. INS satisfied the cost criterion, provided both six and three DOF positioning, and performed well in GPS-denied environments, although its solution drifts. GPS/INS integration can satisfy all of the listed capabilities.

Jordan Britt presented Penn State University's road fingerprinting concept, shown in figure 3. To use road fingerprinting, the road must first be surveyed, which is accomplished by driving it with a high-grade inertial measurement unit (IMU) and real-time kinematic (RTK) GPS. The survey produces a map of the road's pitch signal; fingerprinting is then accomplished using a pitch gyro, wheel odometry, and the previously generated map. The required hardware is not a problem, because pitch gyros and wheel encoders already exist on most automobiles. The capability analysis profile of this method is positive for cost and current availability; it has no environmental influences and requires no specific infrastructure to be in place. It does not provide a six DOF solution.

A chart plots longitude on the x-axis, ranging from -85.308 to -85.298, against latitude on the y-axis, ranging from 32.5965 to 32.5968. The key indicates that a red line represents outside lane markings; a green line represents inside lane markings; a broken blue line represents middle lane markings; a blue dot represents Penn State results; and a red circle represents standalone GPS.

Figure 3. Chart. Penn State University's road fingerprinting concept.
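The core of the fingerprinting idea is a correlation search. The sketch below is illustrative only (the pitch profile and indices are invented stand-ins): a short window of live pitch-gyro samples is matched against the surveyed pitch map to recover along-track position.

```python
import numpy as np

# Illustrative fingerprinting sketch (profile and numbers are invented):
# correlate a window of live pitch samples against a surveyed pitch map.
rng = np.random.default_rng(0)
survey_pitch = rng.standard_normal(2000)    # surveyed pitch map (stand-in)
true_index = 731
window = survey_pitch[true_index:true_index + 200]  # "live" pitch samples

# Slide the window along the survey and pick the best match; in practice
# wheel odometry would narrow the search to a small neighborhood.
scores = [float(window @ survey_pitch[i:i + 200])
          for i in range(len(survey_pitch) - 200)]
print(int(np.argmax(scores)))   # 731: position along the surveyed road
```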

Jordan also presented Auburn's research into the use of light detection and ranging (LIDAR) in lane-detection warning systems. Researchers conducted LIDAR tests by measuring the reflectivity of road markings and by detecting road edges, as shown in figure 4 and figure 5. They tested lane detection under various weather and road conditions, and the results were filtered and averaged. Except under rain conditions, the results were very good. For road edge detection, they used both distance and an estimate of reflectivity.

A view of a highway with the lane detection represented by red lines superimposed on the outer edges and the numbers 1–4 highlighting other elements of the lane.

Figure 4. Photo. LIDAR-based lane detection.

Data were bounded and filtered. The researchers conducted the tests during both day and night on country roads that had no outside lane markings. After postprocessing, they found the final results to be quite accurate. Overall, the LIDAR approach did not have a drifting problem and did not require in-place infrastructure, but it does not provide a six DOF solution.

A chart plots horizontal angle in degrees along the x-axis against echo width along the y-axis. Two vertical red lines represent the road edge on the left at approximately -20 degrees and the right at 40 degrees. Blue lines represent the echo width. Within the red lines are the numbers 1–4.

Figure 5. Chart. LIDAR-based lane detection.
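The reflectivity cue behind figures 4 and 5 can be illustrated with a simulated scan. In the hedged sketch below, all data and thresholds are made up: painted markings return a stronger echo than asphalt, so peaks in reflectivity across the scan angles flag marking candidates.

```python
import numpy as np

# Simulated reflectivity-based marking detection (data and thresholds
# are illustrative, not Auburn's processing chain).
rng = np.random.default_rng(2)
angles = np.linspace(-40, 40, 161)                 # scan angles (degrees)
reflectivity = 0.2 + 0.02 * rng.random(161)        # asphalt baseline
reflectivity[40:44] += 0.6                         # simulated left marking
reflectivity[120:124] += 0.6                       # simulated right marking

# Threshold relative to the scan's own statistics, then report angles.
threshold = reflectivity.mean() + 2 * reflectivity.std()
print(angles[reflectivity > threshold])            # angles of likely markings
```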

Next was Chris Rose's presentation on the use of cameras in lane detection. Cameras already exist in newer vehicles. Where roads have no side markings, color is used to determine the edge, which is difficult at night or in shade. Edge detection is performed using methods such as the Hough transform, least-squares interpolation, Kalman filtering, and polynomial bounds. Researchers performed the tests with a webcam at low resolution, with a road width measurement taken far down the road, in day and night scenarios, and with induced error sources, including shadows, headlights, and road intersections. They compared the results of these tests with the physical conditions. Overall, camera road edge detection proved to be a low-cost solution using currently available technology, with no drift or infrastructure requirements, but it did not provide a six DOF solution.
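A common way to realize the Hough-transform step is sketched below with OpenCV. The image file name and all parameters are hypothetical, and the study's exact processing chain is not reproduced here; the sketch only shows the edge-map-then-line-extraction pattern the presentation describes.

```python
import cv2
import numpy as np

# Camera lane-edge sketch (parameters illustrative): edge detect, then
# extract line segments with a probabilistic Hough transform.
frame = cv2.imread("road_frame.jpg")             # hypothetical input image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 50, 150)                 # gradient-based edge map

# HoughLinesP returns segments as (x1, y1, x2, y2); filtering segments
# by slope and image region isolates plausible lane edges.
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=50,
                        minLineLength=40, maxLineGap=20)
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        cv2.line(frame, (x1, y1), (x2, y2), (0, 0, 255), 2)
```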

Chris went on to present results from testing of the SRI visual odometry system, shown in figure 6. This process tracks features from image to image and extracts egomotion, providing local odometry without GPS initialization. The sensor package includes two cameras with Ethernet interfaces, two lenses, an IMU operating at 100 Hz, a Netgear Ethernet hub, a computer, cabling, and connectors. The cameras are rear mounted to minimize glare problems. Researchers tested the system in inclement weather; feature tracking and positioning remained functional, and water droplets covering the lenses cleared once the vehicle reached higher speeds. The researchers determined that hoods overhanging the lenses might be sufficient to reduce the effects of sun glare and water droplets. Overall, the SRI visual odometry system provided a three DOF solution with no map or specific infrastructure requirements, and there were no critical problems with this solution.

A diagram shows an illustration of four vehicles in the foreground with red and black lines bouncing off trees and buildings in the background.

Figure 6. Diagram. SRI International's visual odometry concept.
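The feature-tracking-to-egomotion step can be sketched with standard OpenCV building blocks. This is not SRI's implementation; the camera intrinsics and image files below are invented, and the sketch only illustrates the general two-frame pipeline.

```python
import cv2
import numpy as np

# Two-frame visual odometry sketch (intrinsics K and images hypothetical):
# track features between frames, then recover relative rotation R and
# translation direction t (egomotion).
K = np.array([[700.0, 0, 640], [0, 700.0, 360], [0, 0, 1]])  # assumed intrinsics
img1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame2.png", cv2.IMREAD_GRAYSCALE)

# Detect corners in the first image and track them into the second.
p1 = cv2.goodFeaturesToTrack(img1, maxCorners=500, qualityLevel=0.01,
                             minDistance=7)
p2, status, _ = cv2.calcOpticalFlowPyrLK(img1, img2, p1, None)
good1, good2 = p1[status == 1], p2[status == 1]

# Egomotion from the essential matrix; translation is only recovered up
# to scale, so speed must come from another source (e.g., wheel odometry).
E, mask = cv2.findEssentialMat(good1, good2, K, method=cv2.RANSAC)
_, R, t, _ = cv2.recoverPose(E, good1, good2, K, mask=mask)
```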

The final system presented was the Kapsch TrafficCom gantry, a dedicated short-range communications (DSRC) system. The initial plan was to estimate range from the turnaround time between unsynchronized clocks. The project hardware proved not capable of lane-level precision; however, the sensor may still provide some information if nothing else is available. Researchers collected time-of-flight data between the Kapsch radio base station and the Auburn test vehicle at the National Center for Asphalt Technology (NCAT) test track, and the variation in the time-of-flight measurements proved insufficient for lane-level accuracy. Overall, the Kapsch TrafficCom system provided a three DOF solution, had no drifting problem, required no map or CPU, and was free of environmental issues, but it did not provide a six DOF position.
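The arithmetic behind turnaround-time ranging also explains the lane-level limitation. The numbers below are illustrative, not measured values from the project; they show how unforgiving the speed of light is of timing error.

```python
# Back-of-envelope round-trip ranging (values illustrative): with
# unsynchronized clocks, range comes from the measured turnaround time.
C = 299_792_458.0            # speed of light, m/s

def rtt_range(t_round_trip_s, t_processing_s):
    """Range from one round trip, removing the responder's processing delay."""
    return C * (t_round_trip_s - t_processing_s) / 2.0

print(rtt_range(2.5e-6, 1.0e-6))   # ~224.8 m nominal range
# A 1-microsecond timing error alone corresponds to ~150 m of range error,
# which is why lane-level precision was out of reach for the hardware.
print(rtt_range(3.5e-6, 1.0e-6) - rtt_range(2.5e-6, 1.0e-6))
```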

Scott Martin then presented results of integrating the various systems. The INS served as the base system, with data from the other sensors fused in an extended Kalman filter implementation. Eighteen states were propagated using nonlinear relationships and the IMU measurements. The outputs of the various subsystems are detailed in table 1.

Table 1. Outputs of various systems.

Subsystem                    Outputs
INS navigation processor     Navigation frame acceleration and angle rates (bias corrected)
GPS processor                Range/range rates; positions/velocities
Camera LDW processor         Lateral lane position
LIDAR LDW processor          Lateral lane position
Fingerprint processor        Navigation frame position
Visual odometry processor    Navigation frame position

(NOTE: INS = inertial navigation system, GPS = global positioning system, LDW = lane-departure warning, LIDAR = light detection and ranging.)
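The sketch below suggests how table 1's asynchronous outputs could enter a single filter. The measurement models and numbers are placeholders, not the study's 18-state design: each subsystem supplies a measurement, a measurement matrix, and a noise covariance, and the filter applies whichever measurement arrives next, at that sensor's own rate.

```python
import numpy as np

# Toy four-state estimate; the real filter propagated 18 states.
x, P = np.zeros(4), np.eye(4)

def update(z, H, R):
    global x, P
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P

# Camera/LIDAR LDW processors observe lateral lane position only;
# fingerprint and visual odometry processors observe navigation frame position.
lateral = (np.array([1.8]), np.array([[0.0, 1.0, 0.0, 0.0]]), np.array([[0.05]]))
position = (np.array([10.2, 1.9]),
            np.array([[1.0, 0, 0, 0], [0, 1.0, 0, 0]]), 0.2 * np.eye(2))

for z, H, R in [lateral, position]:   # arrival order, at each sensor's rate
    update(z, H, R)
print(x)
```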

Integration testing in Detroit, MI, was then discussed. Honda developed the test route to match the road-use class proportions of vehicle travel in the United States.1 Environments along the route included trees, tree canopies, overpasses, buildings, urban canyons, and tunnels. Testing scenarios covered various combinations of sensors in the extended Kalman filter implementation. The results showed that GPS/INS integration improved results in heavy foliage and urban canyons, and vision updates helped where the lane of travel could be assumed. Further observations showed that, although subsystem integration improved positioning accuracy, it was limited by map and survey accuracy and availability. Limitations were identified in road fingerprinting and visual odometry, and a new lane detection algorithm is needed that leverages new road edge detection methods and inertial information.

Testing at NCAT was also discussed. The facility has a 1.7-mi (2.7-km) oval track, shown in figure 7, that is surveyed for lane markings and lane centers, and an RTK base station supports wireless communications. Researchers collected four data sets of several laps each. The results showed that the full system of sensors performed best, followed by the GPS sensor. The full system included vision and fingerprint aiding, which improved lane-level accuracy; the GPS/INS integrated solution trailed as a result of memory limits.

An aerial view of an oval test track.

Figure 7. Photo. The National Center for Asphalt Technology's oval track.

Rose discussed the results of testing on driveways at the Turner-Fairbank Highway Research Center (TFHRC). A NovAtel base station provided RTK corrections, and satellite visibility was degraded in some areas. The results showed limited precision in the fingerprint survey; lane-level accuracy was best with GPS/INS integration because of error correlation.

The conclusions of the Auburn studies were that the subsystems help improve lane-level accuracy and that continued testing is needed to assess system robustness. During a question-and-answer session, one workshop participant asked whether testing had been conducted with Jersey barriers. The response was that it had not, because Jersey barriers are not common near rural Auburn.

1 U.S. Department of Transportation, Federal Highway Administration, "Annual Vehicle-Miles of Travel 1980-2007, By Functional System, National Summary (Table VM-202, summary for 2007)," Jan. 2009, https://www.fhwa.dot.gov/policyinformation/statistics/vm02_summary.cfm.

 

Innovative Approaches for Next Generation Vehicle Positioning
Jay Farrell, University of California at Riverside

Jay Farrell started his presentation with a discussion of real-time positioning and precision mapping. Many intelligent transportation applications require positioning that is highly accurate, sampled at a high rate, functional in diverse environments, and low cost. A solution to these requirements is to fuse high-rate sensors, such as an encoder or IMU, with lower rate sensors, such as GPS, cameras, LIDAR, radio detection and ranging (RADAR), DSRC, and signals of opportunity. A chart showing the capabilities of the aiding sensor categories can be found in figure 8.

A diagram in the form of a table categorizes the capabilities of various sensor categories. The columns along the top are titled Technology, Principle, Range Accuracy, Veh. Cost, Rdwy. Cost, and Req'd Advances. The rows are titled Terrestrial Radio Navigation and Feature Based. Technologies include GPS, DGPS, CPDGPS, Cell Phone, TV—digital, Radio AM Analog, Radio FM Analog, Radio Digital, Packet Radios, Vision, Radar, and Lidar.

Figure 8. Diagram. The capabilities of aiding sensor categories.

(NOTE: GPS = global positioning system, DGPS = differential global positioning system,
CPDGPS = carrier-phase differential global positioning system, TOA = time of arrival, TDOA = time difference of arrival,
AOA = angle of arrival, GNSS = global navigation satellite system, TRN = terrain reference navigation, FB = feature based.)
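The high-rate/low-rate pairing Farrell described can be shown with a toy dead-reckoning example. All numbers below are made up: biased wheel-encoder odometry runs at 100 Hz and drifts, while a 1 Hz GPS-like fix periodically pulls the estimate back, bounding the error.

```python
import numpy as np

# Toy illustration of fusing a high-rate sensor with a low-rate one
# (all values invented; a real system would use a proper filter).
rng = np.random.default_rng(1)
true_x, est_x = 0.0, 0.0
for step in range(500):                      # 5 s at 100 Hz
    v = 10.0                                 # true speed, m/s
    true_x += v * 0.01
    est_x += (v + 0.2 + rng.normal(0, 0.05)) * 0.01   # biased encoder speed
    if step % 100 == 99:                     # 1 Hz GPS-like fix
        gps = true_x + rng.normal(0, 0.5)
        est_x = 0.8 * est_x + 0.2 * gps      # simple blend toward the fix
print(abs(est_x - true_x))                   # bounded error despite drift
```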

These aiding sensors have characteristics that help reduce position uncertainty. For example, in an urban canyon with only one GPS satellite available, the car's sensor system can only see ahead; with the use of traffic signals, the system can make distance adjustments as the car travels perpendicular to the signal, as shown in figure 9. With LIDAR, there is access to raw data; with RADAR, raw data are inaccessible, and only position data (range and angle) are available. Therefore, RADAR needs mapped features on which to plot these positions for the data to be useful.

Two diagrams provide a bird's-eye view of a vehicle approaching an intersection. The diagram on the left shows three satellites, two of which are blocked by buildings and trees. The diagram on the right shows traffic signals, one with an unobstructed line of sight and the other apparently blocked by a tree.

Figure 9. Diagram. Positioning uncertainty reduction.
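The geometry behind feature-based aiding is simple to state. In the sketch below, all coordinates and measurements are invented: given a mapped feature at a known location and a measured range and bearing to it, the vehicle's position follows directly, which is why RADAR's range/angle outputs are useful only against mapped features.

```python
import numpy as np

# Feature-based position fix (coordinates invented for illustration).
feature = np.array([105.0, 48.0])     # surveyed signal location (m, map frame)
range_m = 37.2                        # measured range to the feature (m)
bearing_rad = np.deg2rad(25.0)        # measured bearing relative to the car
heading_rad = np.deg2rad(90.0)        # vehicle heading (e.g., from INS)

# The feature lies at range_m along (heading + bearing) from the car, so
# subtracting that displacement from the feature locates the vehicle.
direction = heading_rad + bearing_rad
vehicle = feature - range_m * np.array([np.cos(direction), np.sin(direction)])
print(vehicle)
```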

Farrell then focused on precise roadway feature maps. These features include road and lane edges, sign types and locations, street lights, traffic signals, and stop bars. High-precision (sub-decimeter) positioning is needed for next generation applications, which could include lane-departure warnings, curve over-speed warnings, signal phase and timing by lane, intersection management, and collision avoidance.

The mapping process was then discussed. The first step is data acquisition from the vision systems, GPS/INS, and LIDAR as the vehicle, shown in figure 10, is driven. The data are then smoothed to provide a continuously smooth trajectory, and features are identified and extracted from the raw data. In the final step, the relative map features are combined with the absolute trajectory and placed in a geographic information system (GIS) database in world coordinates.

A front-three-quarters view of a research vehicle with equipment mounted on the roof.

Figure 10. Photo. Vehicle equipped with sensor platform on roof.
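The final step of the pipeline, placing relative features into world coordinates, is a rigid transform. The sketch below is illustrative (poses and measurements are invented): a feature observed in the vehicle frame is rotated by the smoothed trajectory pose and translated into world coordinates for the GIS database.

```python
import numpy as np

# Vehicle-frame feature to world coordinates (values invented).
def feature_to_world(pose_xy, pose_yaw, feature_vehicle_frame):
    c, s = np.cos(pose_yaw), np.sin(pose_yaw)
    R = np.array([[c, -s], [s, c]])          # vehicle-to-world rotation
    return pose_xy + R @ feature_vehicle_frame

pose_xy = np.array([4021.7, 1188.3])         # smoothed trajectory position (m)
stop_bar = feature_to_world(pose_xy, np.deg2rad(12.0), np.array([15.0, -1.8]))
print(stop_bar)                              # world coordinates for the GIS layer
```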

No single independent sensor technology is capable of simultaneously attaining the accuracy, integrity, and availability specifications for lane-level positioning in the expected diverse environments. Integrated positioning, which fuses asynchronous data from diverse sensors, is the best approach to reliably and accurately estimate vehicle position. Inertial navigation systems and encoder navigation systems provide positioning solutions continuously, in all environments, and at high rates; however, their accuracy drifts over time without aiding.

Global navigation satellite systems provide high accuracy in open areas where satellite signals can be received; however, performance degrades in dense urban areas. Feature-based navigation aiding using a camera, LIDAR, or RADAR can be successful when mapped features can be reliably detected and tracked. Several forms of ground-based radio communication systems that can offer potentially useful position information have been designed to penetrate the urban infrastructure; they have the added advantage that their performance characteristics can still be influenced by the engineering community interested in roadway applications.

Farrell concluded the presentation by presenting ideas for future work in three areas:

Positioning
Mapping
Positioning and Mapping

 

