U.S. Department of Transportation
Federal Highway Administration
1200 New Jersey Avenue, SE
Washington, DC 20590
202-366-4000
Federal Highway Administration Research and Technology
Coordinating, Developing, and Delivering Highway Transportation Innovations
This report is an archived publication and may contain dated technical, contact, and link information.
Publication Number: FHWA-RD-95-197
Date: December 1996
Development of Human Factors Guidelines for Advanced Traveler Information Systems and Commercial Vehicle Operations: Comparable Systems Analysis
CHAPTER 1. INTRODUCTION AND OBJECTIVES
BACKGROUND
ATIS SUBSYSTEMS
RATIONALE FOR A COMPARABLE SYSTEMS ANALYSIS
DEFINING LESSONS LEARNED
TYPES OF LESSONS AND GUIDELINES SOUGHT
LEVELS OF ANALYSIS PERFORMED
BACKGROUND

The fundamental premise underlying the U.S. Department of Transportation's initiative in ITS (Intelligent Transportation Systems) is that innovative application of advanced technology is vital to meeting the Nation's transportation requirements for the 21st century. Three major programs are currently under way to assess user requirements and human factors issues in the ITS subareas of Automated Highway Systems, Advanced Traffic Management Systems, and Advanced Traveler Information Systems/Commercial Vehicle Operations (ATIS/CVO). The present work is being performed under Task D of the ATIS/CVO contract. The objective of the comparable systems analysis was to glean "lessons learned" in the areas of in-vehicle information display and the effectiveness of existing guidelines for the design of ATIS/CVO user interfaces.

ATIS systems acquire, analyze, communicate, and present information to assist surface-transportation travelers in moving from an origin to a desired destination (IVHS America, 1992). The major objective of ATIS systems is to provide the driver of a vehicle with various types of information that enhance driving performance and safety. To accomplish this objective, several user services were identified and grouped into functional categories or subsystems. At the time Task D was performed, ATIS user services were referred to as functions and were grouped into the four subsystems described in the next section.
ATIS SUBSYSTEMS

The proposed subsystems for the ATIS component were:

In-Vehicle Routing and Navigation Systems (IRANS)
In-Vehicle Motorist Services Information Systems (IMSIS)
In-Vehicle Signing Information Systems (ISIS)
In-Vehicle Safety Advisory and Warning Systems (IVSAWS)

The functional characteristics of these subsystems were identified in Task C and are listed in table 1. Most of the ATIS functions are also applicable to CVO systems; however, an additional subset of functions is CVO-specific and is usually not addressed separately. The functional characteristics of these four subsystems, and those specific to CVO, are briefly described in the following paragraphs. For a detailed description of the functions, refer to Task C.
RATIONALE FOR A COMPARABLE SYSTEMS ANALYSIS

Although the ITS program is relatively new in the United States, several ATIS systems have been under development for nearly 10 years, particularly in Europe and Japan. Because some manufacturers, developers, and designers have already fielded ATIS applications, they may have learned valuable lessons in the process, and these discoveries can provide insight into the future development of this technology. The task of developing human factors design guidelines can benefit from investigating existing ATIS and related systems and learning from past mistakes and successes in the development, design, and deployment of those systems.

The objective of this comparable systems analysis was to compile lessons learned from existing ATIS applications in the United States and to produce preliminary guidelines for guiding empirical research to improve the design of future systems. The contract Statement of Work (SOW) specified that at least five systems were to be analyzed as part of Task D, Comparable Systems Analysis. Two systems were mandatory: the TravTek system and the University of Michigan Transportation Research Institute (UMTRI) system. One Commercial Vehicle Operations (CVO) system also had to be included. It was further specified that at least two of the systems selected for analysis should represent application domains outside of highway transportation (e.g., aviation).
DEFINING LESSONS LEARNED

By performing a comparable systems analysis, we hoped to discover various types of lessons learned by designers, developers, manufacturers, and users of existing ATIS and CVO systems and related non-highway systems. We anticipated that some of the lessons would involve unexpected or surprising applications or outcomes; such lessons would be especially valuable in the guideline development process. However, we also included confirmations of unsurprising, well-known human factors design principles, because they may not be known to some readers and reconfirmation may be a positive contribution.

One objective of Task D was to make as many field observations as possible, since the literature contains insufficient data on driver use of in-vehicle displays. Any insights, assumptions, and inferences made by various manufacturers were also important lessons to learn, for if they are erroneous, they can be corrected in future ATIS designs. Therefore, certain manufacturers' policies, user preferences, designer experience and opinions, and experimental results were also expected to foster important lessons learned.

At the end of each system analysis, statements of lessons learned from that analysis will be listed along with a brief background of each lesson. Implications for future ATIS/CVO system designs also will be provided where they are not explicitly evident from the statement and description.
TYPES OF LESSONS AND GUIDELINES SOUGHT

Whether the system was actually an ATIS system or a comparable system (e.g., a non-highway application), this analysis looked for lessons learned in similar categories across the systems. Preliminary human factors guidelines were also expected to address these categories. The lessons sought included the following categories and issues:
LEVELS OF ANALYSIS PERFORMED

The candidate systems selected for this analysis (described in Chapter 2) varied considerably in application, level of implementation, and accessibility. To promote consistent analytic techniques, a structured set of surveys, checklists, worksheets, and interview items was created for each analyst performing the evaluation. In total, five analysts were involved in Task D, and each was instructed to perform the entire structured analysis, if possible. It was apparent that proprietary information, restricted accessibility, and lack of documentation could limit the level of analysis performed. For example, the usability of each system would be best evaluated by a targeted user in a usability analysis; however, some systems were not available for such an evaluation. Furthermore, it was anticipated that analysts would not be able to interact personally with the non-highway and CVO systems. The following outline describes the levels and procedures that were attempted in completing a full evaluation of selected ATIS/CVO units:

Documentation Analysis

System documentation was obtained and reviewed whenever possible. Examples included technical manuals, specifications, schematics, training videos/programs, and user manuals.

"Heuristic" Evaluation

This analysis was intended to apply the human factors expertise of the analyst to collect information regarding the performance and usability of the system. These data were to be based on the reviewer's observation of and experience with the system. The appropriateness of the various aspects of the system's interface was rated by the reviewer. Because this part of the evaluation involved the analyst's judgment based on personal "hands-on" experience with the unit being analyzed, we referred to this type of analysis as a Heuristic Evaluation (Jeffries and Desurvire, 1992).
Target User Evaluation

Whenever possible, volunteer participants were asked to use the system to perform tasks as instructed by the analyst. Users were asked to provide a verbal protocol while performing a defined list of pre-drive and en route (i.e., "drive") tasks built around scenarios developed in Task C.

Design Team Member Interviews

These interviews were intended to collect information regarding the design team's rationale and criteria for the human factors design of the ATIS/CVO unit. This information was to be obtained through interview sessions with system designers, developers, and/or evaluators. Specific items were developed so that analysts would conduct a structured interview and ensure that certain issues were addressed. The purpose was to discover the guidelines (human factors and other), design criteria, decision-making processes, and design constraints involved in the design of the system being analyzed. If the analyst had personal experience with the functions of the system, he/she was instructed to use that knowledge to ask pertinent questions not listed among the interview items.