Public Roads
Featuring developments in Federal highway policies, programs, and research and technology.
This magazine is an archived publication and may contain dated technical, contact, and link information.

Spring 1997
Vol. 60, No. 4

ATMS Human Factors Experiments Produce Guidelines

by Nazemeh Sobhi and Michael J. Kelly

Public transit control center in Amsterdam, Netherlands.

With the increasing availability of more capable computers and the development of automated support systems for traffic management, many designers and managers of new transportation management systems and traffic management centers would like to fully automate their traffic management operations. Nevertheless, human operators are now, and will remain for the foreseeable future, crucial components of advanced traffic management systems (ATMS).

Detailed studies of ATMS functions have found that automation may replace humans in many routine sensing, communication, data processing, and decision-making operations. For example, automated congestion detection and incident detection systems and automated response plans are part of many newer systems. The majority of ATMS functions, however, will still require actions, interventions, or supervision by humans. Automation will not eliminate the many problems that are often attributed to human operators, but it will probably change them into different forms.

Designing ATMS to Fit the Operator

We generally think of well-trained human operators as very flexible and able to adapt to new jobs, new workplaces, and new tools. As a result, operators' characteristics and requirements have rarely been examined in detail during ATMS design. Yet, the design of the ATMS concept of operations and the design of the operator-system interfaces, such as system controls and computer displays, can have a major impact on the efficiency of the operators. To help ensure that the operators perform their tasks effectively with a minimum of errors, their characteristics, capabilities, and limitations need to be carefully considered in the design of the system and in concepts and plans for its operation.

The military services have long recognized the importance of human factors issues in the design of complex systems. Airplanes, ships, control rooms, weapons, and personal items have all been made more serviceable through a standardized process that emphasizes the user or operator from the initial stages of the design process through the final test and evaluation. Human factors standards and design procedures for such systems and equipment are well-documented in numerous formal publications.

The concept of human factors design for ATMS is simple: the system needs to be designed and built to fit the operator. When tailoring a new suit, the tailor must know a great deal about the wearer: the sleeve length, the inseam, the neck size, the shoulder breadth, and many other dimensions. When designing a traffic management center, the engineer must know even more about the operator: eye height above the floor, color discrimination ability, ability to understand information on various displays, maximum (and minimum) workload, and scores of other measures.

Defining Human Factors Guidelines in the Laboratory

Under the sponsorship of the Federal Highway Administration, the Georgia Institute of Technology is conducting a series of experiments that will provide human factors design guidelines for future traffic management centers (TMC). The experiments are being conducted in a high-fidelity simulator of an advanced TMC. The simulator can duplicate the functions and operator workstations of real-world centers, including user-computer interfaces, automated support systems, and remote television cameras.

Based on a network of 13 computers, a large-screen projection television, and four 330-millimeter (13-inch) closed-circuit television (CCTV) monitors, the simulator can be configured into as many as four operator work consoles in one large room, with an experimenter's console in an adjacent observation room. Each operator workstation contains a Silicon Graphics monitor with a touchscreen, keyboard, and mouse. It also contains a monitor with a touchscreen that is configured as a communication control panel.

The projection television is in the front of the control room and typically displays a traffic situation map. The CCTV monitors can display realistic traffic scenes for any of 60 locations in the simulated roadway system. The simulator software includes a number of automated support systems that can be used by the operator to help detect and manage congestion and incidents.

Interfaces for Selecting and Controlling Remote Cameras

The first of a series of controlled laboratory experiments tested various control devices for selecting cameras and controlling their pan and zoom functions. The control devices included a joystick for camera control, a touchscreen on the system monitor, a mouse interface, and numerous keyboard interfaces. Results of the experiment led to rejection of the touchscreen and the joystick interfaces. Camera selection could be done by clicking on a camera icon on the map display with the mouse or by keying in a camera designation number on a keyboard. Pan, tilt, and zoom could effectively be controlled by using the keyboard cursor keys.

An additional finding was that the camera-control process was more efficient when the cameras had preset pointing angles rather than full manual control. With these cameras, a small number of pointing angles, aimed toward the areas most likely to have incidents, were selected for each camera. Pushing a button or key would step the camera through these selected views. The best interface provided the operator with both preset and manual control, allowing the use of preset controls to quickly survey the viewing area and manual control to fine-tune the pointing angle for more detailed examination.
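The preset-plus-manual scheme described above can be sketched in a few lines of code. This is an illustrative model only; the class and method names are hypothetical and not taken from the actual simulator software.

```python
# Hypothetical sketch of a camera interface combining preset views
# with manual fine-tuning, as the experiment's best interface did.

class PanTiltZoomCamera:
    """A remote CCTV camera with preset pointing angles and manual trim."""

    def __init__(self, camera_id, presets):
        # presets: list of (pan, tilt, zoom) tuples aimed at the
        # locations most likely to have incidents for this camera
        self.camera_id = camera_id
        self.presets = presets
        self.index = 0
        self.pan, self.tilt, self.zoom = presets[0]

    def next_preset(self):
        """One key press steps the camera to the next stored view."""
        self.index = (self.index + 1) % len(self.presets)
        self.pan, self.tilt, self.zoom = self.presets[self.index]

    def nudge(self, d_pan=0.0, d_tilt=0.0, d_zoom=0.0):
        """Cursor keys fine-tune the pointing angle around a preset."""
        self.pan += d_pan
        self.tilt += d_tilt
        self.zoom = max(1.0, self.zoom + d_zoom)


cam = PanTiltZoomCamera("CAM-12", [(0, 0, 1), (45, -5, 2), (90, 0, 4)])
cam.next_preset()        # jump to the second likely-incident view
cam.nudge(d_pan=3.0)     # then trim manually for a closer look
print(cam.pan, cam.tilt, cam.zoom)   # 48.0 -5 2
```

The operator surveys quickly with preset steps and reserves slower manual control for detailed examination, which is the division of labor the experiment found most efficient.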

Incident Detection Using Manual and Automated Systems

A second study investigated the potential advantage that might be achieved by using an automated incident detection system, either in place of or in addition to a color-coded traffic-flow map, for detecting and verifying roadway incidents. A well-designed incident detection system was found to significantly improve operator performance in incident detection and verification.

The nature of the traffic-flow algorithms to be used in the incident detection system was also explored in this study. These algorithms, even when customized for local conditions, are imperfect. Undetected incidents and false reports of nonexistent incidents are common. Frequently, jurisdictions will "tune" the algorithms to reduce the number of false alarms even though this increases the number of missed incidents and/or the time to detect an incident. Studies in the simulator found that accepting an increased false alarm rate in order to minimize the number of missed incidents and to minimize detection time resulted in the best system performance and operator acceptance.
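The threshold trade-off described above can be illustrated with a toy detector. This is not one of the algorithms tested in the study; it is a minimal sketch, assuming a simple occupancy-threshold rule and synthetic sensor readings, to show why lowering the threshold trades false alarms for fewer missed incidents.

```python
# Illustrative only: a detector flags a possible incident whenever
# lane occupancy exceeds a threshold. A high threshold misses real
# incidents; a low one catches them all at the cost of false alarms.

def detect(occupancy_readings, threshold):
    """Return indices of readings flagged as possible incidents."""
    return [i for i, occ in enumerate(occupancy_readings) if occ > threshold]

# Synthetic occupancy data (percent); readings 3 and 7 are real incidents.
readings = [12, 15, 14, 38, 16, 22, 13, 31, 18, 11]
true_incidents = {3, 7}

for threshold in (35, 20):
    flagged = set(detect(readings, threshold))
    misses = true_incidents - flagged
    false_alarms = flagged - true_incidents
    print(threshold, sorted(flagged), sorted(misses), sorted(false_alarms))

# threshold 35 flags only reading 3: one missed incident, no false alarms.
# threshold 20 flags 3, 5, and 7: no misses, one false alarm (reading 5).
```

The simulator finding favors the second setting: accept the extra false alarm, provided the operator can verify and dismiss it quickly.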

This result was contingent on the operator having a means to quickly check a reported incident and, if it is a false alarm, to dispose of it. The support system that was developed and tested in this study automatically identified and selected the most appropriate remote camera. If no incident was found at the reported location, the operator could eliminate the report from the screen and the incident log by simply "clicking" the computer mouse on a reject button on the monitor.
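The verify-and-dismiss workflow above can be sketched as follows. The camera-selection rule and data structures here are hypothetical stand-ins, assuming the system simply picks the camera nearest the reported location; the actual support system's logic is not described in that detail.

```python
# Hypothetical sketch: an alarm automatically selects the nearest
# camera for verification, and a single "reject" action clears a
# false alarm from both the screen and the incident log.

import math

# Assumed camera positions on a simple (x, y) roadway coordinate grid.
CAMERAS = {"CAM-1": (0.0, 0.0), "CAM-2": (2.0, 1.0), "CAM-3": (5.0, 3.0)}

def nearest_camera(location):
    """Pick the camera closest to the reported incident location."""
    return min(CAMERAS, key=lambda cam: math.dist(CAMERAS[cam], location))

incident_log = [{"id": 101, "location": (1.8, 1.2)}]

# An alarm arrives: the system selects the camera for the operator.
report = incident_log[0]
camera = nearest_camera(report["location"])
print(camera)  # CAM-2

# The operator sees no incident on camera and clicks "reject":
incident_log = [r for r in incident_log if r["id"] != report["id"]]
print(incident_log)  # []
```

The point of the design is the low cost of disposing of a false alarm: one automatic camera cue and one click, which is what made the low-threshold tuning acceptable to operators.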

Automated Message Posting

A third experiment tested several different kinds of automation for systems that use variable message signs (VMS) to report congestion and incidents at specified locations. A response plan with patterns of VMS messages was developed. An important finding was that the operators needed to see the entire pattern of messages posted as a result of congestion or incidents. The ability to examine one message at a time did not provide full awareness of the situation, probably due to the operators' memory limitations.

Another important finding was that human operators are not good at finding errors in computer-generated response plans. Under automation schemes in which operators had to review and approve such message patterns, operators rarely found the subtle errors. The conclusion was that if the computer-generated response plans are generally adequate, little is added by having the operators check message content before messages are posted.

Publishing the Guidelines

To date, experiments have been conducted and guidelines have been developed for:

  • Displays and controls for remote CCTV camera networks.
  • Applications of large-screen situation displays.
  • User interfaces for automated incident detection and management systems.
  • User interfaces for variable message sign management support systems.
  • User interfaces for managing cellular phone calls.
  • Data fusion displays for traffic-flow sensor systems.
  • Location of CCTV monitors.
  • Appropriate division of functions among center operators.

A preliminary edition of human factors design guidelines for TMC design was based on existing standards and guidelines from other applications. This edition was distributed for review in October 1995 and is currently being revised to include the results of the Georgia Tech experiments. The second edition of the Human Factors Handbook for Advanced TMC Design will be published within the next few months.

Nazemeh Sobhi is a highway research engineer in the Office of Safety and Traffic Operations Research and Development of the Federal Highway Administration. Her expertise is in human factors aspects of intelligent transportation systems. She received a bachelor's degree in computer science from Radford University in 1987 and a master's degree in transportation engineering from the Virginia Polytechnic Institute and State University in 1989. Currently, she is a doctoral candidate in civil engineering at the University of Maryland.

Michael J. Kelly is a principal research scientist and head of the Human Factors Branch at the Georgia Tech Research Institute. He is the principal investigator on the study to develop a handbook of human factors design guidelines for advanced transportation management centers based on simulator research and on lessons learned by existing centers. He received his doctorate in engineering psychology from The Johns Hopkins University in 1975.
