Quality Assurance Stewardship Review - Summary Report for Fiscal Years 2003 Through 2005

Background

A revision of FHWA's sampling and testing regulations, titled "Quality Assurance Procedures for Construction," was published on June 29, 1995, as Title 23, Code of Federal Regulations, Part 637 (23 CFR 637). The regulations require each State agency to have in place an approved Quality Assurance (QA) Program for materials used in Federal-aid highway construction projects. Provided certain checks and balances are in place, the regulations allow flexibility in sampling and testing, including the use of contractor test results in the overall agency acceptance decision. In addition, consultants may be used to perform Dispute Resolution or Independent Assurance (IA) if their laboratories are AASHTO accredited. The States may also use a system approach to IA instead of establishing frequencies based on individual project quantities.

The regulations also added several requirements: (1) each State agency's central laboratory was required to be accredited by the AASHTO Accreditation Program by June 30, 1997, and (2) all testing personnel and laboratories were required to be qualified under State procedures by June 29, 2000.

This review is part of the Federal Highway Administration's overall stewardship activities for State agency QA Programs.

Scope

The objective of this activity is to review the State agencies' QA Program practices and procedures and to ascertain the status of the States' implementation of the 23 CFR 637 QA regulation.

The reviews looked at the entire QA Program in each State. Prior to the start of the reviews in FY 2004, some concern was expressed over the use of contractor-supplied test results in the acceptance decision. Thirty-three States currently allow the use of contractor testing in the acceptance decision. Because of the number of States using contractor test results and the concerns over the implementation of that provision, the reviews conducted during FYs 2004 and 2005 emphasized the use of contractor-supplied data in the agency acceptance decision.

[Figure: Map of States using contractor test results in the acceptance decision. The States not using contractor results are ME, NH, VT, NJ, MI, IN, TN, LA, MT, WY, WA, NV, AZ, AK, and HI.]

The assessments were a joint effort involving the State agency and FHWA Headquarters, Resource Center, and Division personnel. Material practices involving the regulation were examined at the State headquarters, region/district, and construction project levels.

Four stewardship reviews were completed in FY 2003: Maine, Missouri, Colorado, and Oklahoma. Three of those States (Missouri, Colorado, and Oklahoma) use contractor test results in the acceptance decision. Four stewardship reviews were completed in FY 2004: California, Georgia, North Carolina, and New York. Four more were completed in FY 2005: Maryland, Oregon, Minnesota, and Connecticut. All of the States reviewed in FYs 2004 and 2005 use contractor test results in the acceptance decision.

[Figure: Map of States reviewed to date, as listed above.]

Assessment Procedures

The stewardship reviews included (1) interviews with State agency Headquarters, Region/District and field office personnel and FHWA personnel, (2) review of State agency implementation strategies including policy and procedure documents and office records where applicable, (3) visits to construction projects to assess field practices as appropriate, and (4) identification of best practices.

Entrance conferences were held, as appropriate, with top FHWA Division and State agency personnel to explain the assessment intent and process. Closeout meetings were held with the Division and State agency offices to share information obtained from the assessments.

Organization of Report

This is a "state of the practice" report that covers the reviews completed during FY 2003 through 2005. The remainder of the report will cover the positive findings and opportunities for improvement that were found during the reviews and additional resources that are available.

Findings

Positive Findings

The positive findings are discussed in two broad categories: general items observed across a significant number of States, and specific items attributable to one or two States. Only significant positive findings that can be used by others are reported.

  1. General

    In most States, except as noted below, the Independent Assurance (IA) programs are well designed and understood. The programs are being implemented properly and the programs are yielding the desired results. IA is a program of split sampling and testing or reference sample testing to help ensure that the testing is being performed correctly on properly calibrated equipment.

    In all States the Technician Qualification programs for project-produced materials have been understood, designed, and implemented properly. In most States, except as noted below, the Laboratory Qualification Programs have been designed well and implemented properly.

  2. Specific

    1. Electronic Materials Management Systems. States are making progress in electronic management of materials data. Several States are using the AASHTO SiteManager software, while a couple of States have either an in-house developed system or a third-party developed system. The creation of databases has allowed the States to examine their specification limits more easily and ultimately will allow analysis of data to create performance-related specifications.
    2. Sample Control System. One State allows contractors to transport asphalt density cores and asphalt box samples; however, it uses security tape on the box samples that indicates tampering if removed, and the cores are shipped in plastic totes with numbered security tags to prevent tampering.
    3. Joint biannual reviews of the IA program. One State and its FHWA Division Office perform a biannual review of the IA program. The review covers all aspects of the IA program, including implementation at the project and regional office levels.
    4. Materials reference sample program. A couple of States have developed internal materials reference sample programs to verify the qualification of laboratories and/or technicians.
    5. Participation in and use of the National Transportation Product Evaluation Program (NTPEP). States are moving toward further use of the NTPEP program. In particular, States are requiring in their specifications that materials be tested by NTPEP before the material is considered for approval. The NTPEP is an AASHTO program that tests select manufactured materials. The results of the program can be used as part of a State's approved products list program and can provide increased assurance of product quality.
    6. Meetings with field materials personnel. Most States hold monthly or quarterly meetings with district or regional materials engineers to discuss materials testing issues. This is a good forum for ensuring consistency of interpretation of specifications, test procedures and policy. These meetings also ensure that problems with specifications are identified in a timely manner.
    7. Multiple layers of materials review by the State. Several States have different groups of highly qualified personnel that observe the operations at the plant. This process allows for identifying issues before they become serious problems.
    8. Reduced variability through split sample testing prior to production. One State's program calls for split sample testing of flexural samples by all contractor and State testing personnel prior to the beginning of production. This testing has reduced the variability between contractor and State test results.
    9. Accreditation of District/Region Laboratories. One State requires its district laboratories (in addition to the central laboratory) to be accredited by the AASHTO Accreditation Program. Requiring additional qualification of the State's laboratories reduces the chance of having test results successfully questioned in disputes.
    10. Improved specifications for PCC. One State developed an incentive/disincentive program for PCC paving that includes water-cement ratio, aggregate quality, and gradation.
    11. Calibration of smoothness measurement devices. One State calibrates its smoothness measurement devices at a test facility. The calibration process includes recording the filter settings used at the time of certification. Many States specify incentives and disincentives on pavement smoothness; calibrated smoothness measurement devices are needed to ensure proper and equitable payment for smoothness.
    12. Sampling Hot Mix Asphalt (HMA). Several States sample loose HMA from behind the paver, where the final in-place properties are evaluated. This ensures that the sample includes the as-placed material and reduces the potential for plant personnel to be notified of sampling times.
    13. Verification of contractor test data. Several States are using an appropriate statistical process to verify contractor test results.
    14. Calibration of angle of gyration on gyratory compactors. Several States have procedures to ensure that the angle of gyration is being checked.

Opportunities for Improvement

The comments in this section are discussed in four areas: use of contractor test results, IA issues, other items, and fraudulent activities.

  1. Use of Contractor test results

    In most States it was found that the validation system needed to be strengthened. The following items were noted:

    1. not using independent samples for State verification samples,
    2. no statistical comparison of contractor and State data,
    3. low State-to-contractor test comparison ratios of 1 to 10, and one vs. one comparisons of test results for acceptance,
    4. lack of control of contractor-supplied data,
    5. lack of a defined time frame for comparing test results,
    6. not increasing testing frequencies when test results do not compare,
    7. States not controlling the sampling location and timing,
    8. States allowing biased retesting provisions, and
    9. inadequate security of samples.

    The following is a further explanation of each of the areas noted above:

    1. Use of independent samples. It was noted in several States that verification was being performed based on split samples taken by the contractor. Verification of test data needs to be based on independent samples taken by the State. Split samples are an important part of the overall system and can help identify problems associated with sampling and testing procedures and with equipment; that is why IA testing is required. However, split samples taken by the contractor will not detect fraudulent activity by the contractor, which may consist of fabricating samples, switching samples, or taking samples from biased locations.
    2. Use of a statistical comparison. It was noted in some States that a statistical comparison was not being performed between the contractor's results and the State's results. The comparison was being based solely on a one vs. one comparison of results. This method of verification is very weak and will only detect severe problems with contractor test results.
    3. Number of independent samples being compared for validation. It was noted in some States that comparison ratios of State to contractor results were one to five or one to ten. When the F-test and t-test are used for comparing test results, a minimum of seven to ten State test results and a maximum of 20 to 30 contractor test results should be used for a reasonable comparison (a minimal computational sketch follows this list). It is suggested that a rolling comparison be incorporated to solve this problem. States can also increase their sampling frequency at the beginning of projects in order to start the comparison earlier. It was noted in some States that there is no limit to the number of test results included in the comparison procedure. It is recommended that the number of contractor test results be limited to a maximum of 20 to 30, because large numbers of tests in the comparison can mask problems in individual test results. It is also recommended that when a non-comparison occurs, a new comparison process begin with the next test results.
    4. Control of contractor-supplied data. A need to control the documentation for contractor-supplied test results was noted in some States. In some cases the State was not receiving the documentation until three days after paving. Some States also do not require the contractors to retain the source documentation for the required 3 years and are not periodically reviewing the records. The States should review source documentation, require proper retention of documents, and require submission of test results by the next day, before the State supplies its results.
    5. Defined time frame for comparing test results. It was noted that in some cases there were no limits on the time allowed for validating contractor test results against the State verification test results. Validation of test results should occur as soon as possible because of the risk to both the State and the contractor that material being supplied and incorporated into the project does not meet specifications.
    6. Increasing test frequency. When contractor and State tests do not validate, the State should increase its frequency of testing. This will increase the ability of the validation process to detect differences and also reduce risks for both parties if the State's results are ultimately used for payment.
    7. Control of sampling location. It was noted in several States that the time and/or location of sampling was being telegraphed to the contractor. In one case, separation paper was being placed on the existing hot mix asphalt mat before placement of the next lift; the paper marked the area where cores would be taken before the lift was placed and compacted. In other cases, the random numbers for the sample locations were given to the contractor for the entire project at the beginning of the project, or for the entire day at the beginning of the day. The State must control the sampling location and timing, limit pre-notification of sampling, and limit the contractor's ability to modify sampling locations. Sampling behind the paver can avoid telegraphing sampling times to the plant operators, and saws can be provided to separate the layers of cores.
    8. Biased retesting provisions. It was noted that some States allow retesting of material any time a failing test result occurs and replace the failing result with the new result. This practice is highly biased toward the contractor. Under no circumstances should a test result be thrown out unless the sample is known to be flawed, i.e., a poor or damaged sample or poor test procedures. If additional tests are taken, the analysis process needs to be modified to take the additional number of test results into consideration.
    9. Security of samples. Concerns were noted where the contractor retains possession of third-party samples. The Agency should take possession of retained third-party samples and dispute resolution/backup samples immediately; manipulation of the samples or replacement with known passing material could occur while the contractor has possession of them.
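
    A minimal computational sketch of the validation approach discussed in items 2 and 3 above follows. It is an illustration only, not a prescribed FHWA procedure: the 0.01 significance level, the 20-result rolling cap on contractor data, and the function and variable names are assumptions made for the example.

```python
# Minimal sketch: validate contractor results against independent State
# verification results with an F-test (variances) then a t-test (means).
# The alpha level and the 20-result rolling cap are illustrative choices.
import numpy as np
from scipy import stats

def validate(state_results, contractor_results, alpha=0.01, cap=20):
    """Return True when the contractor data are validated by State data."""
    state = np.asarray(state_results, dtype=float)
    # Keep only the most recent results (a rolling window) so a long
    # history of contractor tests cannot mask recent problems.
    contractor = np.asarray(contractor_results, dtype=float)[-cap:]

    # F-test: ratio of the larger sample variance to the smaller one,
    # compared against the upper critical value for a two-tailed test.
    v_s, v_c = state.var(ddof=1), contractor.var(ddof=1)
    if v_s >= v_c:
        f, dfn, dfd = v_s / v_c, len(state) - 1, len(contractor) - 1
    else:
        f, dfn, dfd = v_c / v_s, len(contractor) - 1, len(state) - 1
    if f > stats.f.ppf(1 - alpha / 2, dfn, dfd):
        return False  # variances do not compare; validation fails

    # t-test on means with pooled variances (the F-test passed above).
    _, p_value = stats.ttest_ind(state, contractor, equal_var=True)
    return p_value >= alpha
```

    With fewer than roughly seven State results the comparison has little statistical power, which is why increasing State sampling frequency early in a project (item 3) allows validation to begin sooner.
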
  2. Independent Assurance (IA) Program

    1. Many States should review their test result comparison tolerances. In some instances the tolerances were developed in the early 1970s and have not been thoroughly examined since. In many cases testing variability has improved due to certification programs and improvements to test procedures; therefore, the tolerances may be too large. (A simple split-sample tolerance check is sketched after this list.)
    2. In some States, IA inspectors were taking independent samples. IA should consist of a program of split sampling and testing or reference sample testing to help ensure that the testing is being performed correctly on properly calibrated equipment. Independent samples do not efficiently isolate testing and equipment issues; unless large numbers of independent samples are taken, they cannot detect problems with sampling and test procedures or with equipment.
    3. IA forms referred to specification compliance. IA is specifically for determining testing competence and not specification compliance.
    4. Gyratory compaction was not included in the IA program. The IA program should cover all test procedures that are used in the acceptance decision.
    5. The IA program did not cover technicians in the QC Laboratories. All technicians including State personnel, contractor personnel or consultant personnel that are performing testing that is used in the acceptance decision must be qualified.
    6. Methods need to be developed to standardize comparison of IA and acceptance test results.
    7. Timely resolution of discrepancies in IA, specification compliance, and validation needs to be documented and included in the project files.
    8. A goal of 90 percent coverage of the active testing personnel per year should be established when the system approach is used for IA.
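
    Administering the split-sample comparisons discussed in items 1 and 7 above typically amounts to flagging any pair of results whose difference exceeds the published tolerance for that test method. The following is a minimal sketch in Python; the tolerance values, test method keys, and function names are illustrative assumptions, not the limits of any particular AASHTO procedure.

```python
# Minimal sketch of an IA split-sample tolerance check. The tolerances
# below are placeholders; actual limits come from each State's IA
# procedures (often derived from a test method's precision statement).
IA_TOLERANCES = {
    "asphalt_content_pct": 0.3,   # placeholder tolerance
    "in_place_density_pct": 1.0,  # placeholder tolerance
}

def check_split_sample(test_method, ia_result, tech_result):
    """Return (within_tolerance, difference) for one split-sample pair."""
    difference = abs(ia_result - tech_result)
    return difference <= IA_TOLERANCES[test_method], difference

# Flag a failed comparison so it can be resolved in a timely manner and
# documented in the project files (see item 7 above).
ok, diff = check_split_sample("asphalt_content_pct", 5.21, 5.60)
if not ok:
    print(f"IA comparison failed: difference {diff:.2f} exceeds tolerance")
```
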
  3. Other Items Noted

    1. Testers' names were not indicated on reports. All test reports should indicate the person who was responsible for sampling and testing the material. The reports should also include the tester's certification number.
    2. A few States need to improve the use of electronic means to track data. A good system of data collection will allow future analysis of data to continually improve specifications.
    3. Some States that are gathering electronic data are not using the data to continuously analyze and improve their specifications based on the actual results obtained within the State.
    4. The dispute resolution system was not formally established and documented. When contractor test results are used for acceptance the State must establish a dispute resolution system to address the resolution of discrepancies occurring between the verification sampling and testing and the contractor sampling and testing.
    5. Qualification programs for all contractor and State laboratories used in the QA program were not established.
    6. Absolute Average Deviation and other inefficient quality measures are being used for acceptance and pay adjustments. The States should move toward a statistically sound Percent Within Limits (PWL) type of specification (a minimal PWL computation is sketched after this list).
    7. Several States did not include volumetric properties as part of the acceptance decision for asphalt mix and should move toward including them.
    8. Many States allow multiple options in their test procedures and sampling locations. This provides an increased variability in sampling and testing. A single test method and sampling location will reduce the overall variability of the test results.
    9. A specified periodic re-evaluation should be established for each product that appears on the Qualified Products List. Higher risk products may need additional validation and higher test frequencies.
    10. The smoothness specification requirements should be reevaluated to ensure the present parameters are acceptable.
      • Reduce the number of exclusion areas to improve ride quality in those areas.
      • Ride bonus - ensure a bonus is only provided for superior ride quality and not allow the majority of contractors (regardless of quality) to achieve the full ride bonus.
      • Balance the material quality and payment with the smoothness quality and payment so contractors won't only put an emphasis on one or the other during production and laydown.
    11. The State is encouraged to have the Central Office assess the QA program's consistent implementation statewide.
    12. We recommend the State require QC technicians to certify each of their test results. For example, most States require a certification statement above the technician's signature indicating that the report reflects the actual test results obtained.
    13. The States should develop density specifications which include the entire longitudinal joint in the evaluation, i.e., eliminate the different requirements for mainline vs. edge of pavement and confined vs. unconfined edge.
    14. The State testing procedures should be more accessible either in hardcopy or electronic format.
    15. When nuclear gages are used for acceptance of asphalt density the State should develop and implement an effective procedure to correlate gages using cores.
    16. The State should consider requiring qualification for contractor/supplier personnel that are performing mix designs.
    17. We have a concern over the availability of staff, when contractor test results are used in the acceptance decision, to:
      • monitor asphalt field operations,
      • analyze the data on a daily basis as part of the validation program, and
      • support the development and implementation of the overall asphalt QA program to ensure timely compliance with the regulation.
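
    Percent Within Limits, recommended in item 6 above, estimates the percentage of a lot falling inside the specification limits from the lot's sample mean and standard deviation by way of quality indices. The following is a minimal sketch in Python of a beta-based estimator of the kind used to build published PWL tables; the specification limits and sample data are invented for illustration, and an agency specification would rely on its own published tables and lot definitions.

```python
# Minimal sketch of a Percent Within Limits (PWL) estimate for one lot.
# The quality index Q for each specification limit is mapped to an
# estimated percent defective through a symmetric beta distribution.
# The limits and sample data below are invented for illustration.
import numpy as np
from scipy import stats

def pwl(samples, lower=None, upper=None):
    """Estimate PWL for a lot against lower/upper specification limits."""
    x = np.asarray(samples, dtype=float)
    n, mean, s = len(x), x.mean(), x.std(ddof=1)

    def percent_defective(q):
        # Map quality index Q to the estimated percent beyond one limit.
        u = 0.5 * (1.0 - q * np.sqrt(n) / (n - 1))
        u = min(max(u, 0.0), 1.0)  # clamp to a valid CDF argument
        return 100.0 * stats.beta.cdf(u, n / 2 - 1, n / 2 - 1)

    total = 100.0
    if lower is not None:
        total -= percent_defective((mean - lower) / s)  # below lower limit
    if upper is not None:
        total -= percent_defective((upper - mean) / s)  # above upper limit
    return total

# Example: an air voids lot with invented limits of 3.0 to 5.0 percent.
print(round(pwl([4.1, 3.2, 4.8, 4.5, 3.4], lower=3.0, upper=5.0), 1))
```

    A pay adjustment schedule can then tie the lot's pay factor to the estimated PWL; the PWL workshops listed under Available Resources below cover this risk-based approach.
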
  4. Fraudulent Activities

    No fraudulent activities were discovered during the reviews.

    However, questions concerning fraudulent activities were asked during the reviews of the eight States assessed during FY 2004 and FY 2005 (California, Georgia, North Carolina, New York, Maryland, Oregon, Minnesota, and Connecticut). In response, two States indicated ongoing investigations and two States indicated that they had revoked technician certifications due to fraudulent activities. Questions concerning fraudulent activities or the revocation of technician certifications were not asked during the reviews performed in FY 2003.

Available Resources

The following resources are currently available for assistance in dealing with issues raised in this report:

  1. The guideline for these reviews is available on the FHWA Office of Pavement Technology web site. It is available at https://www.fhwa.dot.gov/pavement/materials_notebook/qareview.htm

  2. "23 CFR Part 637," Subpart B - Quality Assurance Procedures for Construction, Federal Highway Administration, Federal Register, Washington, DC, April 2003, http://www.access.gpo.gov/nara/cfr/waisidx_03/23cfr637_03.html

  3. Non regulatory supplement for 23 CFR Part 637, Subpart B - Quality Assurance Procedures for Construction, Federal Highway Administration, https://www.fhwa.dot.gov/legsregs/directives/fapg/0637bsup.htm

  4. Technical Advisory 6120.3, "Use of Contractor Test Results in the Acceptance Decision, Recommended Quality Measures, and the Identification of Contractor/Department Risks", Federal Highway Administration, August 2004. It is available at https://www.fhwa.dot.gov/legsregs/directives/techadvs/t61203.htm

  5. Frequently Asked Questions on the Quality Assurance Regulation, https://www.fhwa.dot.gov/pavement/materials/matnote11.cfm#qaa

  6. A contract has been awarded for the delivery of NHI Course 134042, "Materials Control and Acceptance - Quality Assurance." The course is four days long and covers the basic essentials of QA. A two-day version of the course is also available.

  7. A 1-day workshop titled "PWL Basic" will be offered by the FHWA Office of Pavement Technology starting in the spring of 2006.

  8. A 1 day workshop titled "PWL Specifications: A Risk Analysis Approach" will be offered by the FHWA Office of Pavement Technology starting in the fall of 2006.

  9. "Optimal Procedures for Quality Assurance Specifications", Publication No. FHWA-RD-02-095, Federal Highway Administration, Washington, DC, April 2003, http://www.tfhrc.gov/pavement/pccp/pubs/02095/.

  10. "Evaluation of Procedures for Quality Assurance Specifications", Publication No. FHWA-HRT-04-046, Federal Highway Administration, Washington, DC, October 2004

  11. The rewrite of the AASHTO Standard Recommended Practice R 9-05, "Acceptance Sampling Plans for Highway Construction" has been published in the 2005 AASHTO Standards. This guide will assist the States in developing specifications.

Status of other Quality Assurance Activities

The following resources are being developed to address issues that are not being covered by existing resources:

  1. A software package is being developed by FHWA as a tool to help analyze risks associated with Percent Within Limit (PWL) specifications. The software will be completed in the summer of 2007.

  2. The Quality Assurance Technologist Course that was developed by the New England Transportation Technician Certification Program (NETTCP) has been finalized and will be available shortly through the Transportation Curriculum Coordination Council (TCCC).

  3. A contract for developing NHI Course 134059 - "Quality Assurance Specification Development and Validation Course" is expected to be awarded during the summer of 2007. The course is expected to be available by the end of 2008. The course will use the software that is currently being developed to assist the States in developing and validating the risks associated with QA specifications.

  4. A task order is being developed with the National Center for Asphalt Technology (NCAT) to explore innovative methods for the acceptance of materials.

Conclusion

The stewardship reviews will continue next year and beyond, along with the continued development and updating of resources, in order to continuously improve the QA program.
