U.S. Department of Transportation
Federal Highway Administration
1200 New Jersey Avenue, SE
Washington, DC 20590
Federal Highway Administration Research and Technology
Coordinating, Developing, and Delivering Highway Transportation Innovations
This report is an archived publication and may contain dated technical, contact, and link information.
Publication Number: FHWA-HRT-13-052
Date: May 2013
David Bevly and three Auburn University graduate students (Jordan Britt, Chris Rose, and Scott Martin) presented a talk on the status of Auburn's next-generation vehicle positioning project. The objective of the project was to provide ubiquitous, precise positioning for vehicle safety and automation in the presence of GPS degradation. Auburn partnered with Kapsch TrafficCom, Penn State University, and SRI International (SRI) on this project. The project scope was to assess diverse positioning and data-fusion techniques, characterize achievable accuracy and robustness, and test and demonstrate capabilities in test-track and roadway scenarios. The technical approach of this study was to fuse the outputs of the various technologies in an extended Kalman filter, which smooths multiple measurements, exploits the accuracy of each sensor, and mitigates faults, as shown in figure 1.
Figure 1. Diagram. The study fuses outputs from various technologies.
Each technology was analyzed based on cost, availability, six degrees-of-freedom (DOF) positioning, three-DOF positioning, drift, environmental influences, and requirements for infrastructure, a map, or a central processing unit (CPU). A chart showing the results of this analysis for each examined technology is included in figure 2.
Figure 2. Diagram. The subsystem capability analysis matrix.
(NOTE: DOF = degrees of freedom, CPU = central processing unit, PSU = power supply unit,
AU-LDW = Auburn lane-departure warning, LIDAR = light detection and ranging, SRI = SRI International.)
The first two navigation technologies presented were GPS and an inertial navigation system (INS). GPS fared well in availability, three-DOF positioning, and freedom from drift, but it does not provide a six-DOF position. INS satisfied the cost and six- and three-DOF positioning criteria and performed well in GPS-denied environments, although its solution drifts over time. GPS/INS integration can satisfy all of the listed capabilities.
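The complementary nature of GPS and INS described above can be illustrated with a deliberately simplified 1-D sketch (the study itself used a full extended Kalman filter; all numbers and the blending gain below are hypothetical). The INS dead-reckons at a high rate and drifts because of accelerometer bias; an infrequent GPS fix pulls the estimate back toward truth.

```python
# Hypothetical 1-D GPS/INS integration sketch (not the study's implementation).
# The INS integrates a biased accelerometer and drifts; a 1 Hz GPS fix
# corrects the accumulated error.

def ins_predict(pos, vel, accel, dt):
    """Dead-reckon position/velocity from a (biased) accelerometer reading."""
    vel += accel * dt
    pos += vel * dt
    return pos, vel

def gps_correct(pos, gps_pos, gain=0.8):
    """Blend in a GPS fix; gain near 1 trusts GPS, near 0 trusts the INS."""
    return pos + gain * (gps_pos - pos)

pos, vel = 0.0, 10.0          # start at the origin, driving 10 m/s
bias = 0.05                   # small accelerometer bias (m/s^2) causes drift
dt = 0.01                     # 100 Hz IMU
for step in range(1, 1001):   # 10 s of driving at constant true speed
    pos, vel = ins_predict(pos, vel, 0.0 + bias, dt)
    if step % 100 == 0:       # 1 Hz GPS fix at the true position
        pos = gps_correct(pos, gps_pos=10.0 * step * dt)

print(round(pos, 2))          # stays near the true 100 m despite the drift
```

Without the `gps_correct` calls, the bias-induced drift grows without bound, which is the "drifting problem" noted for the stand-alone INS.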
Jordan Britt presented Penn State University's road fingerprinting concept, shown in figure 3. To use road fingerprinting, the road must first be surveyed, which is accomplished by driving it with a high-grade inertial measurement unit (IMU) and a real-time kinematic (RTK) GPS. The survey produces a map of the pitch signal, and road fingerprinting is then performed by matching measurements from a pitch gyro and wheel odometry against the previously generated map. Required hardware is not a problem, because pitch gyros and wheel encoders already exist on most automobiles. In the capability analysis, this method fares well on cost and current availability, is unaffected by environmental influences, and requires no specific infrastructure; however, it does not provide a six-DOF solution.
Figure 3. Chart. Penn State University's road fingerprinting concept.
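The matching step at the heart of fingerprinting can be sketched as follows: slide the live pitch-gyro trace (indexed by wheel-odometry distance) along the surveyed pitch map and take the offset with the smallest error. The pitch profile, distances, and function names below are invented for illustration; a real implementation would also handle noise and odometry scale error.

```python
# Hypothetical road-fingerprinting sketch: localize along the road by
# matching a live pitch trace against a surveyed pitch-versus-distance map.
import math

def pitch_map(s):
    """Surveyed pitch profile (radians) as a function of distance s (m)."""
    return 0.02 * math.sin(0.05 * s) + 0.01 * math.sin(0.013 * s)

surveyed = [pitch_map(s) for s in range(0, 2000)]        # 1 m spacing

true_start = 730                                          # vehicle's true position
window = [pitch_map(true_start + d) for d in range(50)]   # 50 m of live data

def locate(surveyed, window):
    """Return the map offset minimizing the sum-of-squared pitch error."""
    best, best_err = None, float("inf")
    for off in range(len(surveyed) - len(window)):
        err = sum((surveyed[off + i] - w) ** 2 for i, w in enumerate(window))
        if err < best_err:
            best, best_err = off, err
    return best

print(locate(surveyed, window))   # → 730, the along-road position
```

Because the pitch profile here is noise-free, the match is exact; the survey quality therefore bounds the achievable accuracy, consistent with the survey-precision limits noted later in the testing results.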
Jordan also presented Auburn's research into the use of light detection and ranging (LIDAR) in lane-detection warning systems. Researchers conducted LIDAR tests by measuring the reflectivity of road markings and detecting road edges, as shown in figures 4 and 5. They tested lane detection under various weather and road conditions, and the results were filtered and averaged. Except under rain conditions, the results were very good. For road edge detection, they used both distance and estimated reflectivity.
Figure 4. Photo. LIDAR-based lane detection.
Data were bounded and filtered. The researchers conducted the tests during both day and night on country roads that had no outside lane markings. After post-processing, they found that the final results were quite accurate. Overall, the LIDAR solution exhibited no drift and required no in-place infrastructure, although it does not provide a six-DOF solution.
Figure 5. Chart. LIDAR-based lane detection.
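The reflectivity-based detection idea can be sketched in a few lines: retroreflective paint returns much higher intensity than asphalt, so points above a threshold in a lateral scan are grouped into marker candidates. The scan data, threshold, and lane geometry below are synthetic.

```python
# Illustrative reflectivity-threshold lane-marker detection (synthetic data).

def find_markers(scan, threshold=0.6):
    """scan: list of (lateral_offset_m, reflectivity in [0, 1]).
    Returns the mean lateral offset of each contiguous high-reflectivity run."""
    markers, run = [], []
    for offset, refl in scan:
        if refl >= threshold:
            run.append(offset)
        elif run:
            markers.append(sum(run) / len(run))
            run = []
    if run:
        markers.append(sum(run) / len(run))
    return markers

# Synthetic scan: asphalt ~0.2 reflectivity, painted lines near +/-1.85 m.
scan = [(x / 10.0, 0.2) for x in range(-30, 31)]
for i, (x, _) in enumerate(scan):
    if abs(abs(x) - 1.85) < 0.06:
        scan[i] = (x, 0.9)

print(find_markers(scan))   # two markers, near -1.85 m and +1.85 m
```

Rain degrades exactly this step: water films lower the reflectivity contrast between paint and asphalt, which is consistent with the poorer results the researchers observed under rain conditions.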
Next was Chris Rose's presentation on the use of cameras in lane detection. Cameras already exist in newer vehicles. Where roads have no side markings, color is used to determine the edge, which is difficult at night or in shade. Edges are detected using methods such as the Hough transform, least-squares interpolation, Kalman filtering, and polynomial bounds. Researchers performed the tests with a Webcam at low resolution, with road-width measurements taken far down the road, in day and night scenarios, and with induced error sources, including shadows, headlights, and road intersections. They compared the results of these tests with the physical conditions. Overall, camera road edge detection is a low-cost solution using currently available technology, with no drift or infrastructure requirements, but it does not provide a six-DOF solution.
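The Hough-transform step mentioned above can be sketched as follows. This is a minimal pure-Python illustration on synthetic edge points (a real system would run it on edge pixels extracted from camera images, e.g. with an image-processing library): each edge point votes for the (theta, rho) line parameters passing through it, and the strongest bin is taken as the lane edge.

```python
# Minimal Hough transform for line detection on synthetic edge points.
import math

def hough_lines(points, n_theta=180, rho_res=1.0):
    """Accumulate votes in (theta, rho) space; return the strongest line."""
    acc = {}
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = round((x * math.cos(theta) + y * math.sin(theta)) / rho_res)
            acc[(t, rho)] = acc.get((t, rho), 0) + 1
    (t_best, rho_best), votes = max(acc.items(), key=lambda kv: kv[1])
    return math.pi * t_best / n_theta, rho_best * rho_res, votes

# Synthetic edge points along the vertical line x = 42 (a lane edge seen
# head-on in image coordinates), plus a little clutter.
points = [(42, y) for y in range(0, 50)] + [(7, 3), (19, 31), (33, 8)]
theta, rho, votes = hough_lines(points)
print(round(theta, 3), rho, votes)   # → 0.0 42.0 50
```

The voting scheme is what makes the method robust to the induced error sources listed above: isolated clutter points (shadows, headlight reflections) spread their votes thinly and rarely outvote a true lane edge.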
Chris went on to present results from testing of the SRI visual odometry system, shown in figure 6. This process tracks features from image to image and extracts egomotion, providing local odometry without GPS initialization. The sensor package includes two cameras with Ethernet interfaces, two lenses, an IMU operating at 100 Hz, a Netgear Ethernet hub, a computer, cabling, and connectors. The cameras are rear-mounted to minimize glare problems. Researchers tested the system in inclement weather, during which feature tracking and positioning remained functional, and lenses covered with water droplets cleared once the vehicle reached higher speeds. The researchers determined that hoods overhanging the lenses might be sufficient to reduce the effects of sun glare and water droplets. Overall, the SRI visual odometry system provided a three-DOF solution with no map or infrastructure requirements, and testing revealed no critical problems.
Figure 6. Diagram. Stanford Research Institute's visual odometry concept.
The final system presented was the Kapsch TrafficCom gantry, a dedicated short-range communications (DSRC) system. The initial plan was to estimate range based on the turnaround time for unsynchronized clocks. It was found that the project hardware was not capable of lane-level precision; however, the sensor may still provide some information if nothing else is available. Researchers at the National Center for Asphalt Technology (NCAT) test track collected time-of-flight data between the Kapsch radio base station and the Auburn test vehicle, and the variation in the time-of-flight measurements proved too large for lane-level measurements. Overall, the Kapsch TrafficCom system provided a three-DOF solution, had no drift problem, required no map or CPU, and was free of environmental issues; it did not provide a six-DOF position.
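A back-of-envelope calculation shows why turnaround-time ranging struggled to reach lane-level precision: at the speed of light, even tens of nanoseconds of timing jitter map to many meters of range error. The ranges, turnaround time, and jitter figures below are illustrative, not the project's measured values.

```python
# Why DSRC time-of-flight ranging is hard at lane-level precision.

C = 299_792_458.0   # speed of light, m/s

def tof_range(t_round_s, t_turnaround_s):
    """Range from a two-way exchange: half the flight portion of the
    round-trip time, times the speed of light."""
    return C * (t_round_s - t_turnaround_s) / 2.0

# With perfect timing, a 150 m true range is recovered exactly...
true_range = 150.0
t_turn = 10e-6                            # radio processing turnaround
t_round = 2 * true_range / C + t_turn
print(round(tof_range(t_round, t_turn), 3))

# ...but a mere 50 ns of timing jitter corresponds to ~7.5 m of range
# error, which is coarser than a typical ~3.6 m lane width.
jitter = 50e-9
print(round(C * jitter / 2.0, 2))
```

This is consistent with the finding that the hardware's time-of-flight variation was too large for lane-level use while still offering coarse information when nothing else is available.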
Scott Martin then presented results of the integration of the various systems. The INS was used as the base system, with data from the other sensors fused in an extended Kalman filter implementation. Eighteen states were propagated using nonlinear relationships and IMU measurements. The outputs of the various systems are detailed in table 1.
|Subsystem||Output|
|INS navigation processor||Navigation frame acceleration and angle rates (bias corrected)|
|GPS processor||Range/range rates|
|Camera LDW processor||Lateral lane position|
|LIDAR LDW processor||Lateral lane position|
|Fingerprint processor||Navigation frame position|
|Visual odometry processor||Navigation frame position|
(NOTE: INS = inertial navigation system, GPS = global positioning system, LDW = lane-departure warning, LIDAR = light detection and ranging.)
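Each output in table 1 enters the filter as a measurement update against the INS-propagated state. As a simplified illustration of that fusion step (the actual filter propagates 18 states with nonlinear models), here is a scalar Kalman measurement update applied to a lateral-lane-position measurement such as the camera or LIDAR LDW processors produce; all numbers are invented.

```python
# Scalar Kalman measurement update (illustrative; the study used a full
# 18-state extended Kalman filter).

def kalman_update(x, P, z, R):
    """x, P: prior state estimate and variance;
    z, R: measurement and its variance. Returns the posterior (x, P)."""
    K = P / (P + R)          # Kalman gain: how much to trust the measurement
    x = x + K * (z - x)      # corrected state
    P = (1 - K) * P          # uncertainty always shrinks after an update
    return x, P

# Prior lateral position from INS propagation: 0.40 m, variance 0.25 m^2.
x, P = 0.40, 0.25
# Camera LDW reports 0.10 m with variance 0.05 m^2 (more trusted here).
x, P = kalman_update(x, P, z=0.10, R=0.05)
print(round(x, 3), round(P, 4))   # → 0.15 0.0417
```

The posterior lands between the prior and the measurement, weighted toward the lower-variance source, and the posterior variance is smaller than either input, which is why fusing several subsystems improves lane-level accuracy.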
Integration testing in Detroit, MI, was then discussed. Honda developed the test route to match the road-use class proportions of vehicle travel in the United States.1 Environments along this route included trees, tree canopies, overpasses, buildings, urban canyons, and tunnels. Testing scenarios covered various combinations of sensors in an extended Kalman filter implementation. The results showed that GPS/INS integration improved results in heavy foliage and urban canyons, with vision updates provided where the lane of travel was assumed. Further observations showed that, although subsystem integration improved positioning accuracy, it was limited by map quality, survey accuracy, and availability. Limitations were identified in road fingerprinting and visual odometry, and a new lane-detection algorithm is needed that leverages new road edge detection methods and inertial information.
Testing at NCAT was discussed. The facility has a 1.7-mi (2.7 km) oval track, shown in figure 7, and is surveyed for lane markings and centers. The RTK base station supports wireless communications. Researchers collected four data sets of several laps. The results showed that a full system of sensors performed best, followed by a GPS sensor. The full system contained vision and fingerprint aiding, which improved lane-level accuracy. The GPS/INS integrated solution trailed as a result of memory limits.
Figure 7. Photo. The National Center for Asphalt Technology's oval track.
Rose discussed the results of testing on driveways at TFHRC. A Novatel base station provided RTK corrections, and satellite visibility was degraded in some areas. The results showed limited precision in the fingerprint survey; lane-level accuracy was best with GPS/INS integration because of error correlation.
Conclusions of the Auburn studies were that the subsystems help improve lane-level accuracy and that continued testing is needed to assess system robustness. During a question-and-answer session, one workshop participant asked whether testing had been conducted with Jersey barriers. The response was that it had not, because Jersey barriers are not common near rural Auburn.
1 U.S. Department of Transportation, Federal Highway Administration, "Annual Vehicle-Miles of Travel 1980-2007, By Functional System, National Summary (Table VM-202, summary for 2007)," Jan. 2009, http://www.fhwa.dot.gov/policyinformation/statistics/vm02_summary.cfm.
Jay Farrell started his presentation with a discussion of real-time positioning and precision mapping. Many intelligent transportation applications require positioning with high accuracy and a high sample rate that works in diverse environments at low cost. A solution to these requirements is to fuse high-rate sensors, such as an encoder or IMU, with lower-rate sensors, such as GPS, cameras, LIDAR, radio detection and ranging (RADAR), DSRC, and signals of opportunity. A chart showing the capabilities of the aiding sensors can be found in figure 8.
Figure 8. Diagram. The capabilities of aiding sensor categories.
(NOTE: GPS = global positioning system, DGPS = differential global positioning system,
CPDGPS = carrier-phase differential global positioning system, TOA = time of arrival, TDOA = time difference of arrival,
AOA = angle of arrival, GNSS = global navigation satellite system, TRN = terrain reference navigation, FB = feature based.)
These aiding sensors have characteristics that help reduce position uncertainty. For example, in an urban canyon with one GPS satellite available, the car's sensor system can only see ahead. With the use of traffic signals, the system can make distance adjustments as the car travels perpendicular to the signal, as shown in figure 9. With LIDAR, raw data are accessible; with RADAR, only position data (range and angle) are available. Therefore, RADAR needs mapped features on which to plot these positions for the data to be useful.
Figure 9. Diagram. Positioning uncertainty reduction.
Farrell then focused on precise roadway feature maps. These features include road and lane edges, sign types and locations, street lights, traffic signals, and stop bars. High-precision (sub-decimeter) positioning is needed for next-generation applications, which could include lane-departure warnings, curve speed warnings, signal phase and timing by lane, intersection management, and collision avoidance.
The mapping process was then discussed. The first step is data acquisition from the vehicle's GPS/INS and LIDAR systems; the data are gathered as the vehicle, shown in figure 10, is driven. The data are then smoothed to produce a continuously smooth trajectory, and features are identified and extracted from the raw data. In the final step, relative map features are combined with the absolute trajectory and placed in a geographic information system (GIS) database in world coordinates.
Figure 10. Photo. Vehicle equipped with sensor platform on roof.
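The final mapping step, combining relative features with the absolute trajectory, amounts to a pose transform: rotate and translate each body-frame feature by the smoothed vehicle pose to get world coordinates before storing it in the GIS database. The sketch below is 2-D with invented numbers; a real pipeline would work in 3-D with geodetic coordinates.

```python
# 2-D sketch of placing a vehicle-relative feature in world coordinates.
import math

def to_world(vehicle_x, vehicle_y, heading_rad, rel_x, rel_y):
    """Transform a body-frame feature (rel_x forward, rel_y left)
    into world coordinates using the vehicle pose."""
    c, s = math.cos(heading_rad), math.sin(heading_rad)
    wx = vehicle_x + c * rel_x - s * rel_y
    wy = vehicle_y + s * rel_x + c * rel_y
    return wx, wy

# Vehicle at (100, 50) heading 90 degrees (due "north" in this frame);
# a stop bar detected 10 m ahead and 2 m to the left.
wx, wy = to_world(100.0, 50.0, math.radians(90.0), 10.0, 2.0)
print(round(wx, 6), round(wy, 6))   # → 98.0 60.0
```

Because the feature's world position inherits the error of the smoothed trajectory, the accuracy of the GPS/INS solution during the mapping drive directly bounds the accuracy of the resulting map.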
No single independent sensor technology can simultaneously attain the accuracy, integrity, and availability specifications for lane-level positioning in the expected diverse environments. Integrated positioning, which fuses asynchronous data from diverse sensors, is the best approach to estimating vehicle position reliably and accurately. Inertial and encoder-based navigation systems provide positioning solutions continuously, at high rates, in all environments; however, their accuracy drifts over time without aiding.
Global navigation satellite systems provide high accuracy in open areas where satellite signals can be received; however, performance degrades in dense urban areas. Feature-based navigation aiding using a camera, LIDAR, or RADAR can succeed when mapped features can be reliably detected and tracked. Several forms of ground-based radio communication systems offer potentially useful position information; they are designed to penetrate the urban infrastructure and have the added advantage that their performance characteristics can still be influenced by the engineering community interested in roadway applications.
Farrell concluded the presentation by presenting ideas for future work in positioning and mapping.