Long-Term Pavement Performance Compliance With Department of Transportation Information Dissemination Quality Guidelines
CHAPTER 5. EVALUATING INFORMATION QUALITY
The methods used by LTPP to evaluate and improve its data include the development of QC systems for data collection and processing, regular assessments of data quality, and special studies of key aspects of the data and the data collection processes.
Data Quality Assessments
LTPP has developed two approaches to quality assessments. The first approach was a major program assessment conducted in 1996-1998. The second approach was the development of a quality control/quality assurance system based on ISO 9001 quality management principles.
The objective of the 1996 assessment was to evaluate the program goals, objectives, and future direction. This was accomplished by evaluating the impacts of deviations from the program's plans, the number of test sections, data collection deficiencies, and resources. The ultimate objective of the assessment was to develop a revised strategic plan that focused on high-payoff product objectives that met States' needs, improved program efficiency, and provided better quality data for product development. Some aspects of this assessment that relate to the Federal data quality guidelines include the following:
- The assessment was conducted by a team composed of expert contractor staff and FHWA staff, in concert with a special peer review subcommittee made up of State DOT, TRB, and AASHTO representatives.
- Through the TRB advisory mechanism, stakeholders in State agencies were contacted to obtain input on agency needs as related to the program goals and objectives established 10 years earlier.
- One member of the assessment team was an FHWA staff member on loan from Canada who was knowledgeable about data quality but did not participate in preparing data system information or public dissemination.
- The findings from this assessment resulted in a highly publicized "Campaign for Program Improvement," which involved meetings with each participating State and Canadian provincial highway agency to resolve missing data issues and obtain a written agreement on their intended level of support for the program on a test section basis. Program priorities were changed to focus on high-payoff areas of research in the program, and resource allocations were reduced in areas judged to provide limited impacts. In addition to numerous presentations at national, regional, and local meetings, a report was prepared on the results of the assessment.
With the letting of the four regional LTPP data collection contracts in 2001, FHWA required the preparation of formal documentation of the data collection and processing QC systems. After central review by a contractor with ISO 9001 certification, these QC documents were transformed into quality management documents based on ISO 9001 principles. Some of the relevant features of this management process, as related to this portion of the Federal data quality guidelines, include the following:
- Designation of a regional data QC manager on each regional contractor staff. The regional data QC manager is responsible for the following:
- Conducting regularly scheduled and impromptu internal audits of compliance with quality control and data collection guidelines and procedures.
- Documenting internal audits conducted and their results.
- Documenting corrective actions resulting from both internal and external audit findings.
- Conducting annual or more frequent reviews and updates of the QC and management procedures.
- The central TSSC established a quality assurance audit team and process to assess the data collection contractors' compliance with their data quality control and management guidelines and with LTPP program requirements. The quality assurance audits include the following:
- Announced office visits to review designated sections of the quality control and management plans. The different parts of the plans are rotated so that each part is reviewed on a 2-year cycle, if the budget permits. Prior to an audit, a data review is conducted to identify data issues of concern to be investigated during the audit.
- Unannounced audits of primarily field data collection personnel. Each regional data collection contractor is required to maintain a data collection schedule posted on the Internet. Auditors arrive unannounced at data collection sites to observe activities and compliance with both the contractor's internal requirements and the program requirements. Negative findings from these inspections are discussed with field personnel and then reported to management.
- One example is an unannounced field audit conducted in Hawaii. This audit was conducted because data collection on sites not located on the mainland requires an alternate set of procedures and tools not common to other test sections.
- All audit results are documented in an audit report that includes a description of audit activities, items reviewed, positive findings, corrective action requests, and improvement recommendations. All corrective action requests and improvement recommendations are discussed with the data collection management contractor in order to reach agreement on the corrective actions to be taken. On each audit visit, all previously agreed-upon corrective actions and improvement requests are reviewed.
Quality assurance audits are also performed on the highway agency-operated FWD reference calibration facilities used by LTPP. These facilities were developed under cooperative agreements with select highway agencies; they use LTPP-provided equipment and follow LTPP test protocols. The operators of these facilities are audited annually for conformance to the test protocols. Audit results are documented, and certificates of compliance are issued.
Evaluation Studies
LTPP has used evaluation studies to analyze data quality issues that cannot be identified by mere inspection of the data. Evaluation studies have also been performed during the development, refinement, or implementation of new or advanced data collection systems. The following are some of the evaluation studies conducted by the LTPP program:
- Since the methods used to rate pavement distresses rely on subjective interpretation by trained personnel, an evaluation study was conducted to examine the variability between raters and between rating methods. The study was conducted by expert engineering and statistical consultants and peer reviewed by a TRB ETG on pavement distress monitoring. The study was based on a statistical sample of data obtained from distress rater accreditation workshops. In addition to documenting probable ranges of uncertainty in these measurements, recommendations on improvements to the rating methods resulted from this work. The results were published in the report, Study of LTPP Distress Data Variability, FHWA-RD-99-075.
- An evaluation study was performed on the resilient modulus test for AC in indirect tension, developed by the LTPP program when it was managed by the National Academy of Sciences as part of the SHRP. Due to the severity of the problems found with the data from these tests and the uncertainty associated with test results that appear reasonable, the data were archived and removed from the database distributed to the public.
- As a part of the QC system, evaluation studies are routinely performed on advanced field data collection equipment, which includes FWDs and pavement profilers. These evaluation studies typically consist of a statistically designed experiment that allows an analysis of variance approach to be used to evaluate the results of side-by-side equipment comparisons (see the sketch following this list). These evaluation procedures have also been used to evaluate equipment during the procurement process.
- The State-sponsored pooled fund study, TPF-5(039), managed by LTPP FHWA staff, was established in 2004 to investigate improvements to the LTPP-developed FWD calibration protocols. The contractor is charged with evaluating current methods, procedures, and instrumentation to develop an improved system compatible with current computer technology.
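To illustrate the analysis of variance approach mentioned in the equipment-comparison bullet above, the following is a minimal sketch in Python. The device labels, deflection values, and 0.05 significance threshold are illustrative assumptions, not LTPP data or requirements.

```python
# Minimal sketch of a one-way analysis of variance (ANOVA) on hypothetical
# side-by-side peak-deflection readings from three FWD units tested on the
# same points. Values are illustrative only, not LTPP data.
from scipy.stats import f_oneway

# Peak deflections (microns) reported by each device.
device_a = [412.3, 415.1, 410.8, 413.9, 411.5]
device_b = [418.0, 416.4, 419.2, 417.1, 418.8]
device_c = [413.2, 412.7, 414.0, 411.9, 413.5]

# The F-test asks whether the between-device variation is large relative to
# the within-device (repeatability) variation.
f_stat, p_value = f_oneway(device_a, device_b, device_c)

print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Device means differ significantly; investigate calibration.")
else:
    print("No significant between-device difference detected.")
```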
LTPP has established the basis for future evaluation studies that may not be possible to conduct with current funds. For important data elements based on measurements from instrumentation, equipment calibration factors are stored in the database when possible. This permits evaluation of the effect of changes in the calibration factors over time.
Quality Control Systems
LTPP has invested significant resources in developing data QC systems to address the variety of data sources and measurement technologies employed in the program. The goal of these QC systems is to identify and prevent errors before data are entered into the database. Within the LTPP program, data quality control is defined as the processes and procedures used to inspect data and data collection equipment prior to entry of the data into the database.
The major categories of QC systems developed by LTPP include equipment calibration procedures, equipment calibration checks, operator training and certification, post data collection reviews, data screening from external program sources, and formal quality control management procedures.
Equipment Calibration
In the development of the data collection plan, it was determined that the LTPP program needed to own and operate specialized data collection equipment judged by panels of experts to be critical to its success. While the LTPP program tried to rely on existing technology, in some cases it had to develop its own measurement technology. Similarly, LTPP had to develop calibration procedures for some of its specialized equipment and, in other cases, used existing procedures. Some highlights of LTPP equipment calibration procedures include the following:
- LTPP developed the first reference calibration procedure for the FWD in the United States. This provides a calibration method independent of the equipment supply contractor. Federal, State, and international highway agencies have adopted the LTPP FWD reference calibration procedure. Changes to sensor calibration factors are stored in the database to permit analytical evaluation of the potential impact of these changes by data users (see the sketch following this list).
- LTPP developed cutting-edge equipment calibration procedures for the very sensitive laboratory tests of the elastic response of pavement materials. These checks, informally called start-up procedures, involve expert instrumentation engineers who check the internal functions of advanced laboratory electronic measurement equipment to identify sources of bias and error that cannot be reliably detected by inspection of the output data. The checks measure the output from a controlled electrical input into the system. The objective is to discern whether the instrumentation is correctly calibrated to the manufacturer's specifications.
- Calibrations of distance measurement instruments critical to vehicle-mounted pavement measurements are codified in LTPP directives.
- It is LTPP policy to use equipment whose calibration can be traced back to the National Institute of Standards and Technology, when possible.
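Because changes to sensor calibration factors are retained in the database, a data user can gauge the effect of a recalibration by rescaling raw readings under the old and new factors. The sketch below is an illustration only: the gain factors, raw readings, and the simple multiplicative correction are assumptions for demonstration, not the LTPP calibration procedure itself.

```python
# Minimal sketch of how a data user might gauge the effect of a change in an
# FWD sensor calibration (gain) factor. Factors and raw readings are
# hypothetical; LTPP stores the factor history so comparisons like this are
# possible.
old_gain = 0.994   # gain factor before recalibration (assumed)
new_gain = 1.003   # gain factor after recalibration (assumed)

raw_readings = [402.1, 388.6, 415.3]  # uncorrected sensor output, microns

old_deflections = [r * old_gain for r in raw_readings]
new_deflections = [r * new_gain for r in raw_readings]

for old, new in zip(old_deflections, new_deflections):
    change_pct = 100.0 * (new - old) / old
    print(f"{old:.1f} -> {new:.1f} microns ({change_pct:+.2f} percent)")
```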
In those cases where it is not possible to directly calibrate a device, equipment calibration checks are used to ensure proper function. For example, temperature sensors are checked against items of known temperature, such as ice and boiling water; sensors found to be outside an established range are either returned to the manufacturer for adjustment or replaced.
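A minimal sketch of such a calibration check follows, assuming a two-point comparison against an ice bath (0 °C) and boiling water (100 °C at sea level); the ±0.5 °C tolerance and the pass/fail handling are illustrative assumptions rather than LTPP acceptance criteria.

```python
# Minimal sketch of a calibration check against known reference temperatures.
# The tolerance is an assumed value for illustration, not an LTPP requirement.
REFERENCE_POINTS_C = {"ice_bath": 0.0, "boiling_water": 100.0}
TOLERANCE_C = 0.5  # assumed acceptance range

def sensor_passes_check(readings_c: dict) -> bool:
    """Return True if every reading is within tolerance of its reference."""
    for point, reference in REFERENCE_POINTS_C.items():
        if abs(readings_c[point] - reference) > TOLERANCE_C:
            return False
    return True

# Example: a sensor reading 0.2 degC in the ice bath and 99.6 degC in boiling water.
readings = {"ice_bath": 0.2, "boiling_water": 99.6}
if sensor_passes_check(readings):
    print("Sensor within tolerance.")
else:
    print("Sensor out of range: return to manufacturer for adjustment or replace.")
```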
It is LTPP policy to identify data collection equipment operators or data collectors in the database to allow evaluation of operator biases and errors.
Data Collection Operator Training and Certification
Due to the complexity and subjectivity of many of the LTPP data collection functions, LTPP has established formal data collection training classes and certification evaluations.
LTPP requires that collection and interpretation of pavement distress data be performed by someone who holds an active certification from an LTPP distress rater accreditation workshop. Raters must meet minimum experience and time-based recertification requirements in order to maintain their certification.
The LTPP regional data collection contractors are required to train and certify operators of equipment used to collect LTPP data as a part of their formal data QC management plan. Regional equipment operator training is documented, as required. Evaluation of the performance of new personnel by the regional data collection management staff is also documented, as required.
To promote consistency among regional data collection contractors, national meetings of regional data collection operators have been conducted on each major data collection topic. When the program began, annual meetings of data collectors were held. Due to program budget cuts, national meetings were scheduled on a priority basis, and the use of teleconferences was increased.
When new data collection technologies are being implemented, the LTPP program has used the following process:
- Development of draft guidelines.
- Review and comment on draft guidelines.
- National meeting to develop final guidelines.
- Field pilot activities to test and refine the guidelines.
- Issuance of final guidelines.
- Use of problem reports to document and request guideline changes.
- Updates to guidelines, as appropriate.
Screening of Data from External Program Sources
The data collection plan relies upon multiple data sources. These data are screened prior to entry into the database. For data submitted on paper forms, the first level of screening is for completeness and logic checks on the provided information. Like all other data, these data are also screened after entry into the database using automated methods.
A large amount of data is received in electronic format. Two of the largest modules of data from other agencies are traffic monitoring data and climate data. The following screening methods demonstrate the steps used by LTPP for these data:
- Traffic monitoring data are supplied by participating highway agencies in the standard FHWA card formats used for HPMS. The first step in the screening process is to determine whether the data are in the correct format and will load into the traffic quality control software. After the data are loaded, diagnostic checks are performed; many of these are graphs of the data used to identify common errors. A data review package containing graphs, a summary of data problems, and proposed actions to deal with problem data is prepared and submitted to the highway agency for review and comment. After agency comments are received, the data are processed, as appropriate.
- Climate data are obtained from the NCDC and CCC. Due to the large amount of data, these data are loaded directly into database tables. Automated checks on completeness, range, and logical statistics are performed on these data, and flags are set in the records to store the results (see the sketch below). Records not passing the checks are excluded from the analysis process that creates temporal summary climate statistics for each test site.
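The following is a minimal sketch of the kind of automated completeness and range screening described in the climate data bullet above; the field names, range limits, and flag codes are hypothetical and do not reflect the actual LTPP climate tables.

```python
# Minimal sketch of automated completeness and range checks on daily climate
# records. Field names, range limits, and flag values are hypothetical.
RANGE_LIMITS = {
    "mean_temp_c": (-60.0, 60.0),
    "precip_mm": (0.0, 500.0),
}

def screen_record(record: dict) -> str:
    """Return a flag: 'P' (pass), 'M' (missing data), or 'R' (range failure)."""
    for field, (low, high) in RANGE_LIMITS.items():
        value = record.get(field)
        if value is None:
            return "M"
        if not low <= value <= high:
            return "R"
    return "P"

records = [
    {"mean_temp_c": 21.4, "precip_mm": 3.2},
    {"mean_temp_c": None, "precip_mm": 0.0},     # incomplete record
    {"mean_temp_c": 18.0, "precip_mm": 812.0},   # out-of-range precipitation
]

for rec in records:
    rec["qc_flag"] = screen_record(rec)

# Only records flagged 'P' would feed the temporal summary statistics.
passing = [rec for rec in records if rec["qc_flag"] == "P"]
print(f"{len(passing)} of {len(records)} records pass screening.")
```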
Data Error Correction
The LTPP program has established standard, formal methods for data error correction. It has been LTPP's policy not to load known "bad" data into the database. Although steps are taken to prevent entry of erroneous data, there have been numerous instances when data in the database were found to contain errors or when facts unknown at the time of entry were later found to invalidate previously entered data.
When data errors are found, the standard mechanism is to correct the error if possible or remove the data from the database if the error cannot be corrected.
Error correction procedures are contained in formal documents, issued by directive, for each data type. An example of an error correction policy is contained in the following excerpts from LTPP Directive I-85 on Manual Upgrades to QC Checks. This discussion describes the steps to be taken when data fail an automated check.
When a record does not pass a QC check, the first action that should be taken is to determine the cause, examine the data in the record or other related records, and try to rectify the situation if possible. Some types of possible errors that can be corrected include the following:
- Transcription errors. Transcription errors are an inherent problem with any manual data entry system. All data entry should be double checked for this type of error prior to saving a record to the database. When a record fails a QC check, this should be one of the first errors investigated.
- Improper referential data entry in another record. Because LTPP data are obtained from multiple sources, it is possible that a field used for referential links between tables will not have been properly recorded. LAYER_NO is a prime example of this type of correctable problem. There are times when a material testing laboratory may be assigned a LAYER_NO that is later changed in the database due to factors unknown to the laboratory contractor. This can cause a mismatch of material types in the layer tables. This type of error can be easily corrected by assigning the correct LAYER_NO in the mismatched record.
- Improper data acquisition or interpretation. In some cases, the supplier of the data may not have understood the intent or basis for the needed data element. These types of errors are usually associated with level-D range check errors. In these cases, the only recourse is to contact the data source and search for the correct value. For example, the percentage of longitudinal reinforcement steel in PCC pavements should never exceed 1 percent (see the range check in the sketch following this list). When an agency has reported numbers in excess of this value, Regional Support Contract staff members should discuss the issue with agency contacts to decide whether the correct value can be determined from the available records. In some cases, it may also be possible to resolve issues with photographs or direct field measurements.
- Errors, oversights, and blunders with interpreted data. There are instances where it is possible to reinterpret data from the raw measurements. Distress data from photograph-based measurements are an example of a potentially correctable interpretation error, since the photographs can be reinterpreted. When errors or problems are discovered in transverse profile measurements or distress measurements, the apparent errors should be referred to the data collector for possible correction.
- Potentially rectifiable data. The longitudinal and transverse profile data provide opportunities where erroneous data in the Information Management System (IMS) might be rectified. For longitudinal profile data, runs that contain spikes or other apparent data collection equipment errors may be replaced with other runs performed on the same day that do not contain such errors. Alternatively, on SPS projects, subsectioning of the raw data files can be corrected for apparent Distance Measuring Instrument (DMI) drift. Manually collected transverse profile data in which the measurement width was varied along a section may be salvaged through reinterpretation of the raw data.
- Two directly linked fields in a record are in conflict. For example, if a value is provided for the amount of admixture, then the corresponding code indicating the type of admixture should not be null or indicate no admixture.
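As an illustration of two of the correctable-error types listed above, the sketch below applies a range check on the percentage of longitudinal reinforcement steel and a consistency check between linked admixture fields. The record layout and field names are hypothetical; as the directive describes, the checks only flag problems for manual follow-up with the data source.

```python
# Minimal sketch of two check types from the list above: a range check on the
# percentage of longitudinal reinforcement steel and a consistency check
# between two linked admixture fields. The 1 percent limit comes from the
# example above; the field names and record layout are hypothetical.
def check_percent_steel(record: dict) -> list:
    """Flag records reporting more than 1 percent longitudinal steel."""
    errors = []
    value = record.get("percent_long_steel")
    if value is not None and value > 1.0:
        errors.append("percent_long_steel exceeds 1 percent; confirm with agency")
    return errors

def check_admixture_consistency(record: dict) -> list:
    """Flag records with an admixture amount but no corresponding type code."""
    errors = []
    amount = record.get("admixture_amount")
    type_code = record.get("admixture_type_code")
    if amount and type_code in (None, "NONE"):
        errors.append("admixture amount reported but type code is null or 'no admixture'")
    return errors

record = {"percent_long_steel": 6.0, "admixture_amount": 2.5, "admixture_type_code": None}
for problem in check_percent_steel(record) + check_admixture_consistency(record):
    print("QC failure:", problem)
```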
During the QC error resolution process, it is also important to identify errors that are not possible to rectify. Some examples include the following:
- Equipment measurement errors. When a record failing a QC check can be traced to an identifiable equipment measurement error, manual upgrades should not be employed to elevate an erroneous data element to a higher status. When equipment malfunction can be determined, the errant data element should be deleted from the IMS. In records with multiple measurement fields, the "bad" data element should be set to null. In cases where all of the measurement data elements in a record are linked to the same equipment malfunction, complete removal of the record is the most appropriate action (see the sketch following this list). In situations where a record contains multiple measurements from different sensors and removal of the erroneous data causes the record to fail a QC check, a manual upgrade may be appropriate.
- Required data not available. Circumstances can develop where critically required data are no longer available. There are instances when a required data element was not collected, was collected improperly, or is no longer possible to obtain or measure. These circumstances can potentially lead to a test section being removed from the LTPP study (taken out of study) or being recognized as unable to provide the required data element.
- Indeterminable problem that requires investigation. When new tables are added to the IMS or new QC programs are issued, some records failing a QC check require further investigation to determine the cause. There are instances when it cannot be immediately determined if the error is a result of equipment malfunction, abnormal phenomena, or program error. In these instances, manual upgrades should not be performed until the exact cause for the problem can be determined. Some of these problems are resolved through the Software Performance Report (SPR) process. In general, SPRs should only be issued after it has been determined that the problem is not related to other issues.
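The following sketch illustrates the disposition rule described in the equipment measurement errors bullet above: null the erroneous sensor elements when only some readings are affected, and drop the record entirely when all of them are. The field names and sentinel value are hypothetical.

```python
# Minimal sketch of the disposition logic for a record with multiple sensor
# fields when some readings are traced to an equipment malfunction. Field
# names and the -9999.0 sentinel are hypothetical.
def dispose_record(record: dict, bad_fields: set):
    """Null out bad sensor fields, or drop the record if all are bad."""
    sensor_fields = [k for k in record if k.startswith("deflection_sensor_")]
    if set(sensor_fields) <= bad_fields:
        return None  # every measurement is affected: remove the whole record
    cleaned = dict(record)
    for field in bad_fields:
        cleaned[field] = None  # keep the record, null only the erroneous elements
    return cleaned

record = {"deflection_sensor_1": 410.2, "deflection_sensor_2": -9999.0,
          "deflection_sensor_3": 398.7}
print(dispose_record(record, {"deflection_sensor_2"}))
```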
The above policy and guidelines apply to data after entry into the LTPP database. Error correction guidelines are also contained in the data collection and processing documents for FWD measurements, profile measurements, seasonal monitoring measurements, AWS measurements, and traffic data.