U.S. Department of Transportation
Federal Highway Administration
1200 New Jersey Avenue, SE
Washington, DC 20590
202-366-4000



Quality Assurance Stewardship Review - Summary Report for Fiscal Years 2003 Through 2006

Background

A revision of FHWA's sampling and testing regulations, titled "Quality Assurance Procedures for Construction," was published on June 29, 1995, as Title 23, Code of Federal Regulations, Part 637 (23 CFR 637). The regulations require each State agency to have in place an approved Quality Assurance (QA) Program for materials used in Federal-aid highway construction projects. Provided certain checks and balances are in place, the regulations allow flexibility in sampling and testing, permitting the use of contractor test results in the overall Agency acceptance decision. In addition, consultants may be used to perform Dispute Resolution or Independent Assurance (IA) if their laboratories are AASHTO accredited. States may also use a system approach to IA instead of establishing frequencies based on individual project quantities.

The regulations also include several additional requirements: (1) the State agency's central laboratory was required to become accredited by the AASHTO Accreditation Program by June 30, 1997, and (2) all testing personnel and laboratories must be qualified using State procedures by June 29, 2000.

During fiscal year (FY) 2006, a National Program Review (NPR) on Quality Assurance was also conducted by the Office of Professional and Corporate Development. The NPR covered the actions taken by the Division Offices in approving and reviewing the States' QA Programs. This report does not discuss the results of the NPR; it summarizes only the reviews conducted by the Office of Infrastructure.

Scope

The objective of this activity is to review the State agencies' QA Program practices and procedures, and to ascertain the status of the States' implementation of the 23 CFR 637 QA regulation. These reviews are conducted by the Office of Infrastructure as part of the Federal Highway Administration's overall stewardship activities for State agency QA Programs.

The reviews examined the entire QA Program in each State. Before the reviews began in FY 2004, some concern had been expressed over the use of contractor-supplied test results in the acceptance decision; thirty-three States currently allow the use of contractor testing in the acceptance decision. Because of the number of States using contractor test results and the concerns over implementation of that provision, the reviews conducted during FYs 2004 and 2005 emphasized the use of contractor-supplied data in the Agency acceptance decision. In FY 2006, three States that used contractor test results in the acceptance decision were reviewed, along with one State that did not.

A map of the United States showing the 19 States and agencies that do not use contractor test results in the acceptance decision: Alaska, Arizona, Delaware, District of Columbia, Hawaii, Indiana, Louisiana, Maine, Michigan, Montana, Nevada, New Jersey, New Hampshire, Puerto Rico, Rhode Island, Tennessee, Vermont, Washington, and Wyoming.

The assessments were a joint effort involving the State agency and FHWA Headquarters, Resource Center, and Division Office personnel. Material practices involving the regulation were examined at the State's headquarters, Region/District, and construction project level.

Four stewardship reviews were completed in each fiscal year: Maine, Missouri, Colorado, and Oklahoma in FY 2003; California, Georgia, North Carolina, and New York in FY 2004; Maryland, Oregon, Minnesota, and Connecticut in FY 2005; and Virginia, Wisconsin, Nebraska, and Nevada in FY 2006. Of the States reviewed to date, all but Maine and Nevada use contractor test results in the acceptance decision.

The 16 States reviewed through FY 2006: Maine, Missouri, Colorado, Oklahoma, California, Georgia, North Carolina, New York, Maryland, Oregon, Minnesota, Connecticut, Virginia, Wisconsin, Nebraska, and Nevada.

Assessment Procedures

The stewardship reviews included (1) interviews with State agency headquarters, Region/District and field office personnel and FHWA personnel, (2) review of State agency implementation strategies including policy and procedure documents and office records where applicable, (3) visits to construction projects to assess field practices as appropriate, and (4) identification of best practices.

Entrance conferences were held, as appropriate, with top FHWA Division Office and State agency personnel to explain the assessment intent and process. Closeout meetings were held with the Division and State agency offices to share information obtained from the assessment.

Organization of Report

This is a "state of the practice" report that covers the reviews completed during FY 2003 through FY 2006. The report will cover positive findings and opportunities for improvement that were found during the reviews and additional resources that are available.

Findings

Positive Findings

The positive findings are discussed in two broad categories: general items observed across a significant number of States, and specific items attributable to one or two States. Only significant positive findings that can be used by others are reported.

  1. General

    In most States, except as noted below, the Independent Assurance (IA) programs are well designed and understood. IA is a program of split sampling and testing or reference sample testing to help ensure that the testing is being performed correctly on properly calibrated equipment. The programs are being implemented properly and are yielding the desired results.

    In most States, except as noted below, the Technician Qualification Programs for project-produced materials have been understood, designed, and implemented properly. In most States, except as noted below, the Laboratory Qualification Programs have been designed well and implemented properly.

  2. Specific

    1. Electronic Materials Management Systems. States are making progress in electronic management of materials data. Several States are using the AASHTO SiteManager software, while some States have either an in-house developed system or a third-party developed system. The creation of databases has allowed the States to examine their specification limits more easily and ultimately will allow analysis of data to create performance related specifications.

      One State is customizing an off-the-shelf PROLOG program to store construction and materials data. As part of the system the State is also using PC tablets in the field to collect the data.

    2. Sample Control System. Several States allow contractors to transport cores for asphalt density and asphalt box samples. However, they use security tape on the box samples that indicates tampering if removed. The cores are shipped in plastic totes or coolers that have numbered security tags to prevent tampering.

    3. Joint biannual reviews of the IA program. Although it is not a requirement, one State and Division Office perform a biannual review of the IA program. The review covers all aspects of the IA program including implementation at the project and the regional office.

    4. Materials reference sample program. Several States have developed internal materials reference sample programs to verify qualification of laboratories and/or technicians. Some States also use the proficiency samples prepared by the AASHTO Materials Reference Laboratory.

    5. Participation in and use of the National Transportation Product Evaluation Program (NTPEP). NTPEP is an AASHTO program that tests select manufactured materials. States are moving toward further use of NTPEP; in particular, some States require in their specifications that materials be tested by NTPEP before being considered for approval. The results of the program can be used as part of a State's approved products list program, increasing assurance of product quality. Some States are also scheduling an NTPEP peer review to assist them in using NTPEP data.

    6. Meetings with field materials personnel. Most States hold monthly or quarterly meetings with district or regional materials engineers to discuss materials testing issues. This is a good forum for ensuring consistency of interpretation of specifications, test procedures and policy. These meetings also ensure that problems with specifications are identified in a timely manner.

    7. Multiple layers of materials review by the State. Several States have different groups of highly qualified personnel that observe the operations at the plant. This process allows for identifying issues before they become serious problems.

    8. Reduce variability by performing split sample testing prior to production on PCC Paving. State programs require split sample testing of flexural strength samples by all contractor and State testing personnel prior to the beginning of production. This testing has reduced the amount of variability between contractor and State test results.

    9. Accreditation of district/region laboratories. Some States are requiring their district/region laboratories (in addition to the central laboratory) to be accredited by the AASHTO Accreditation Program. Requiring additional qualification of the State's laboratories reduces the chances of having test results successfully questioned in disputes.

    10. Improved specifications for PCC. A State developed an incentive/disincentive program for PCC paving which includes water to cement ratio, aggregate quality and gradation.

    11. Calibration of smoothness measurement devices. A State is calibrating their smoothness measurement devices at a test facility. The calibration process includes the recordation of the filter settings used at the time of certification. Many States are specifying incentives and disincentives on pavement smoothness. States need to calibrate smoothness measurement devices to ensure proper and equitable payment for smoothness.

    12. Sampling Hot Mix Asphalt (HMA). Several States sample loose HMA from behind the paver, where the final in-place properties are evaluated. This ensures that the sample includes the as-placed material and reduces the potential for advance notification of plant personnel about sampling times.

    13. Verification of contractor test data. Several States are using the F-test and t-test to verify contractor test results.

    14. Calibration of angle of gyration on gyratory compactors. Several States have procedures to ensure that the angle of gyration is being checked.

    15. Several States are taking proactive actions to mitigate Alkali Silica Reactivity including the use of fly ash, ground granulated blast furnace slag and blended cement.
    16. One State is statistically analyzing State and Contractor data in an innovative manner to accomplish both verification and IA.

      The contractor samples and tests at a rate of 4 samples per lot. The State takes verification samples beginning at the start of production; a minimum of 4 samples are taken during the first week of production and at least 1 per lot thereafter. The State's verification samples are taken at the plant by contractor personnel under the direction of State personnel. The verification samples are split, and one split is given to the contractor. Analysis is performed in two ways. First, for IA, the split results are compared using IA comparison tolerances; in the figure below, IA1 is compared to the contractor split of that sample (sample 4 of lot 1). Second, for validation, the State verification samples are made independent by removing the corresponding contractor splits; in the figure below, samples 1, 2, 3 from lot 1; samples 1, 2, 4 from lot 2; samples 1, 2, 3 from lot 3; and samples 1, 3, 4 from lot 4 are compared to the State's IA1, IA2, IA3, and IA4 with the F and t tests.

      Figure: quality assurance sampling and comparison scheme, as described in the paragraph above.

    17. Several States are specifying low permeability Portland Cement Concrete mixes to increase durability on bridge decks.

    18. Several States provide a list of required samples and tests to the project office based on the testing requirements and estimated quantities of the bid items on the project.

    19. Several States have developed comprehensive annual IA reports, which include the number of certified technicians, the number of active technicians, the number of technicians covered by the IA program, the number of IA reports with deficiencies, and an analysis of the deficiencies along with potential systemic solutions to recurring deficiencies.
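As an illustration of the scheme in item 16 above, the exclusion step (making the State's verification samples independent by dropping the corresponding contractor splits before running the F and t tests) can be sketched in Python. The data structure, lot numbers, and test values below are hypothetical; only the exclusion pattern follows the figure text (lot 1 drops sample 4, lot 2 drops sample 3, lot 3 drops sample 4, lot 4 drops sample 2).

```python
def independent_contractor_results(lots, verification_splits):
    """Drop the contractor split that corresponds to each State
    verification sample, so the remaining contractor results are
    independent of the State's and can be compared with F and t tests."""
    independent = []
    for lot_id, results in sorted(lots.items()):
        # 1-based number of the sample that was split with the State
        split_no = verification_splits.get(lot_id)
        independent.extend(r for n, r in enumerate(results, start=1)
                           if n != split_no)
    return independent

# Hypothetical contractor results (4 samples per lot, e.g. percent density)
lots = {1: [92.1, 92.5, 91.8, 92.9],
        2: [92.3, 92.0, 92.7, 92.4],
        3: [91.9, 92.6, 92.2, 92.8],
        4: [92.4, 92.1, 92.5, 92.0]}
# Per the figure text: lot 1 drops sample 4, lot 2 drops 3, lot 3 drops 4, lot 4 drops 2
splits = {1: 4, 2: 3, 3: 4, 4: 2}
remaining = independent_contractor_results(lots, splits)  # 12 of 16 results remain
```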

Opportunities for Improvement

The comments in this section are discussed in five areas: use of contractor test results, IA Program issues, asphalt, other items, and fraudulent activities.

  1. Use of Contractor Test Results

    In most States it was found that the States' validation system needed to be strengthened. The following items were noted:

    1. Not using independent samples for State verification samples,
    2. No statistical comparison of contractor and State data,
    3. Low State-to-contractor test comparison ratios (e.g., one vs. 10 results), and one-vs.-one comparisons of test results for validation,
    4. Lack of control of contractor supplied data,
    5. Lack of a defined time for comparing test results,
    6. Not increasing testing frequencies when test results don't compare,
    7. States are not controlling the sampling location and timing,
    8. States are allowing biased retesting provisions, and
    9. Lack of security for samples.

    The following is a further explanation of each of the areas noted above:

    1. Use of independent samples. It was noted in several States that verification testing was being performed based on split samples taken by the contractor. Verification of test data needs to be based on independent samples taken by the State. Split samples are an important part of the overall system and can help determine problems associated with sampling and testing procedures and equipment problems. That is why IA testing is required. However, split samples taken by the contractor will not detect fraudulent activity by the contractor which may consist of fabricating samples, switching samples, or taking samples from biased locations.

    2. Use of a statistical comparison. It was noted in some States that a statistical comparison was not being performed between the contractor's results and the State's results. The comparison was being based solely on a one vs. one comparison of results. This method of verification is very weak and will only detect severe problems with contractor test results.

    3. Number of independent samples being compared for validation. It was noted in some States that comparison ratios of State to contractor results were one vs. five or one vs. 10. When the F-test and t-test are used for comparing test results, a minimum of seven to 10 State test results and a maximum of 20 to 30 contractor test results should be used for a reasonable comparison. It is suggested that a method of rolling comparison be incorporated to solve this problem. The number or size of the lots for pay does not have to be the same as the lots used in the comparison. States can also increase their sampling frequency at the beginning of projects in order to accumulate test results and start the comparison earlier. It was noted in some States that there is no limit to the number of contractor test results included in the comparison procedure. It is recommended that the number of contractor test results be limited to a maximum of 20 to 30, because a large number of tests in the comparison can mask problems in individual test results. It is also recommended that when a non-comparison occurs, a new comparison process begin with the next test result.

    4. Control of contractor supplied data. A need to control the documentation for contractor-supplied test results was noted in some States. In some cases the State is not receiving the documentation until three days after the paving. Some States also do not require the contractors to retain the source documentation for the required 3 years and are not periodically reviewing the records. The States should review source documentation, require proper retention of documents, and require submission of test results by the next day, before the State supplies its results.

    5. Defined time frame for comparing test results. It was noted that in some cases there were no limits on the time allowed for validating contractor test results against the State verification test results. Validation of test results should occur as soon as possible, because both the State and the contractor bear the risk that material being supplied and incorporated into the project does not meet specifications.

    6. Increasing test frequency. When contractor and State tests do not validate, the State should increase its frequency of testing. This increases the ability of the validation process to detect differences and also reduces risks for both parties if the State's results are ultimately used for payment.

    7. Control of sampling location. It was noted in several States that the time and/or location of sampling was being telegraphed to the contractor. In one case, separation paper was being placed on the existing hot mix asphalt mat before placement of the next lift; the paper marked the area where cores would be taken before the lift was placed and compacted. In other cases, the random numbers for the sample locations are given to the contractors for the entire project at the beginning of the project, or for the entire day at the beginning of the day. The State must control the sampling location and timing, limit pre-notification of sampling, and limit the ability of the contractor to modify sampling locations. Also, sampling behind the paver can avoid telegraphing sampling times to the plant operators, and saws can be provided to separate the layers of cores.

    8. Biased retesting provisions. It was noted that some States allow the retesting of material any time a failing test result occurs and replace the failing result with the new result. This practice is highly biased toward the contractor. Under no circumstances should a test result be thrown out unless it is known that the sample or test is flawed (e.g., a poor or damaged sample, or poor test procedures). If additional tests are taken, the analysis process needs to be modified to account for the additional number of test results.

    9. Security of samples. There have been issues with the security of retained (i.e., third party) samples left in the possession of the contractor. The State should take possession of retained, third party, or dispute resolution/backup samples immediately. When the contractor retains possession, samples could be manipulated or replaced with known passing material.
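A minimal Python sketch of the F-test and t-test validation discussed in items 2 and 3 above, using SciPy. The alpha level and the pooled-vs.-Welch t-test choice are illustrative assumptions, not the regulation's prescription; agencies should follow the full procedure described in Technical Advisory 6120.3.

```python
import numpy as np
from scipy import stats

def f_and_t_validate(contractor, state, alpha=0.01):
    """Validate contractor test results against State verification
    results: F-test on variances, then t-test on means."""
    c = np.asarray(contractor, dtype=float)
    s = np.asarray(state, dtype=float)
    # F-test: put the larger sample variance in the numerator
    vc, vs = c.var(ddof=1), s.var(ddof=1)
    if vc >= vs:
        F, dfn, dfd = vc / vs, len(c) - 1, len(s) - 1
    else:
        F, dfn, dfd = vs / vc, len(s) - 1, len(c) - 1
    p_f = 2.0 * stats.f.sf(F, dfn, dfd)      # two-sided p-value
    variances_ok = p_f >= alpha
    # t-test: pooled if the variances validated, Welch's otherwise
    t_stat, p_t = stats.ttest_ind(c, s, equal_var=bool(variances_ok))
    means_ok = p_t >= alpha
    return bool(variances_ok and means_ok)
```

With, say, 20 contractor results and 7 State results drawn from the same production, the function would return True; a systematic shift in one data set fails the t-test and returns False.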

  2. Independent Assurance (IA) Program Issues

    1. Many States should review their test result comparison tolerances. In some instances tolerances were developed in the early 1970s and have not been thoroughly examined since then. In many cases the testing variability has improved due to certification programs and improvements to test procedures. Therefore, the tolerances may be too large.

    2. The IA inspectors taking independent samples. IA should consist of a program of split sampling and testing or reference sample testing to help ensure that the testing is being performed correctly on properly calibrated equipment. Independent samples do not efficiently isolate issues or detect problems associated with sampling, testing and equipment, unless large numbers of independent samples are taken.

    3. IA forms refer to specification compliance. IA is specifically intended for determining testing competence, not specification compliance.

    4. Gyratory compaction not included in the IA program. The IA program should cover all test procedures that are used in the acceptance decision.

    5. The IA program did not cover technicians in the QC laboratories. All technicians including State personnel, contractor personnel or consultant personnel that are performing testing that is used in the acceptance decision must be qualified.

    6. Methods need to be developed to standardize comparison of IA and acceptance test results including having both test results on the IA form.

    7. Timely resolution of discrepancies in IA, specification compliance, and validation needs to be documented and included in the project files.

    8. A goal of 90 percent coverage of the active testing personnel per year should be established when the system approach is used for IA.
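The split-sample comparison underlying items 1 and 6 above can be sketched as follows: both the IA result and the acceptance tester's result are recorded together on the form, and the pair passes when their difference falls within the tolerance for that test. The tolerance values and test names below are hypothetical placeholders, not published FHWA or AASHTO tolerances.

```python
# Hypothetical IA comparison tolerances, keyed by test
# (units are the units of the test result)
TOLERANCES = {"asphalt_content_pct": 0.3,
              "density_pct": 1.0,
              "gradation_no200_pct": 0.8}

def ia_compare(test, ia_result, acceptance_result):
    """Compare an IA split/reference sample result to the acceptance
    tester's result. Both values are recorded (cf. item 6) so the
    comparison can be reviewed; the pair is within tolerance when the
    absolute difference does not exceed the tolerance for that test."""
    diff = abs(ia_result - acceptance_result)
    return {"test": test,
            "ia": ia_result,
            "acceptance": acceptance_result,
            "difference": diff,
            "within_tolerance": diff <= TOLERANCES[test]}
```

Note that this checks testing competence only; per item 3, the IA form should not be read as a statement of specification compliance.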

  3. Asphalt

    1. Several States did not include volumetric properties as part of the acceptance decision for asphalt mix and should move toward including them.

    2. States should consider using a percent of the maximum theoretical density as a target for roadway density instead of percent of control strip.

    3. The maximum theoretical density of the mixture needs to be verified at the start of production and throughout production. The values can change with differences in gradation and asphalt content.

    4. When nuclear gages are used for acceptance of asphalt density the State should develop and implement an effective procedure to correlate gages using cores.

    5. States should develop density specifications which include the entire longitudinal joint in the evaluation, i.e., eliminate the different requirements for mainline vs. edge of pavement and confined vs. unconfined edge.

  4. Other Items

    1. Testers' names were not indicated on reports. All test reports should indicate the person that was responsible for sampling and testing the material. The reports should also include the testers' certification number.

    2. Some States that are gathering electronic data are not using the data to continuously analyze and improve their specifications based on the actual results obtained within the State.

    3. In some States the dispute resolution system is not formally established and documented. When contractor test results are used in the acceptance decision, the State must establish a dispute resolution system to address the resolution of discrepancies occurring between the verification sampling and testing and the contractor sampling and testing.

    4. In some States the qualification programs for all contractor and State laboratories used in the QA program were not established.

    5. Absolute Average Deviation or other inefficient quality measures are being used for acceptance and pay adjustments. States should move toward a more rigorous statistical system such as Percent Within Limits (PWL) for specifications.

    6. Many States allow multiple options in their test procedures and sampling locations. This increases variability in sampling and testing; a single test method and sampling location will reduce the overall variability of the test results.

    7. Specified periodic re-evaluation for each product that appears on the Qualified Products List should be established. Higher risk products may need to include additional validation and higher test frequencies.

    8. The smoothness specification requirements should be reevaluated to ensure the present parameters are acceptable.

      • Reduce the amount of exclusion areas to improve ride in those areas.
      • Ride bonus - ensure a bonus is only provided for superior ride quality and not allow the majority of contractors (regardless of quality) to achieve the full ride bonus.
      • Balance the material quality and payment, with the smoothness quality and payment so contractors won't only put an emphasis on one or the other during production and laydown.
    9. States are encouraged to have the Central Office assess the QA programs to assure consistent implementation statewide.

    10. States should require the QC technicians to certify each of their test results. For example, most States require a certification statement with the technician's signature indicating that the report reflects the actual test results obtained.

    11. State testing procedures should be more accessible either in hardcopy or electronic format.

    12. States should consider requiring qualification for contractor/supplier personnel that are performing mix designs.

    13. The technician qualification programs should be the same for both the State personnel and contractor personnel.

    14. Qualification programs for technicians need to be developed for all technicians that are sampling or testing, including testing soils, sampling asphalt mixtures and sampling cores.

    15. States should move away from a stepped incentive/disincentive specification to a continuous pay adjustment specification. With step specifications there may be a significant change in pay at the step which may encourage some fraud to occur.

    16. When contractor test results are used in the acceptance decision, there are concerns about whether sufficient staff is available to:

      • Monitor asphalt field operations;
      • Analyze the data on a daily basis as part of the validation program; and
      • Support the staff in the development and implementation of the overall QA program in asphalt to ensure timely development of a QA system that is in compliance with the regulation.
  5. Fraudulent Activities

    No fraudulent activities were discovered during the reviews.

    However, questions concerning fraudulent activities were asked during the reviews in the twelve States that were assessed during FY 2004, FY 2005, and FY 2006 (California, Georgia, North Carolina, New York, Maryland, Oregon, Minnesota, Connecticut, Virginia, Nevada, Wisconsin, and Nebraska). In response to those questions, two States indicated current ongoing investigations and two States indicated that they had revoked technician certifications due to fraudulent activities. Questions concerning fraudulent activities or the revocation of technician certifications were not asked during the reviews performed in FY 2003.
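The Percent Within Limits (PWL) measure and continuous pay adjustment recommended in items 5 and 15 of Other Items can be illustrated with a simplified sketch. This uses a normal approximation for clarity; AASHTO R 9 estimates PWL from the quality index using a small-sample (beta-distribution) method, and the specification limits, data, and pay equation below are purely illustrative.

```python
import numpy as np
from scipy.stats import norm

def pwl_normal_approx(data, lsl=None, usl=None):
    """Estimate Percent Within Limits from the sample mean and standard
    deviation, using a normal approximation (illustration only)."""
    x = np.asarray(data, dtype=float)
    mean, sd = x.mean(), x.std(ddof=1)
    pwl = 100.0
    if usl is not None:                       # subtract percent above USL
        pwl -= 100.0 * norm.sf((usl - mean) / sd)
    if lsl is not None:                       # subtract percent below LSL
        pwl -= 100.0 * norm.sf((mean - lsl) / sd)
    return pwl

# Hypothetical in-place density results (percent of maximum theoretical density)
densities = [92.0, 93.1, 92.5, 93.4, 91.8, 92.7, 92.3]
pwl = pwl_normal_approx(densities, lsl=90.0)
# A continuous pay equation (illustrative) avoids the pay jumps of a
# stepped schedule criticized in item 15:
pay_factor = 55.0 + 0.5 * pwl
```

Because the pay factor varies smoothly with PWL, a small change in quality produces a small change in pay, removing the incentive problems that arise at the steps of a stepped incentive/disincentive schedule.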

Available Resources

The following resources are currently available for assistance in dealing with issues raised in this report:

  1. The guideline for these reviews is available on the FHWA Pavements web site at https://www.fhwa.dot.gov/pavement/materials/qareview.cfm

  2. "23 CFR Part 637," Subpart B - Quality Assurance Procedures for Construction, Federal Highway Administration, Federal Register, Washington, DC, April 2003, http://www.access.gpo.gov/nara/cfr/waisidx_03/23cfr637_03.html

  3. Non-regulatory supplement for 23 CFR Part 637, Subpart B - Quality Assurance Procedures for Construction, Federal Highway Administration. The non-regulatory supplement was updated on July 19, 2006. https://www.fhwa.dot.gov/legsregs/directives/fapg/0637bsup.htm

  4. Technical Advisory 6120.3, "Use of Contractor Test Results in the Acceptance Decision, Recommended Quality Measures, and the Identification of Contractor/Department Risks", Federal Highway Administration, August 2004. https://www.fhwa.dot.gov/legsregs/directives/techadvs/t61203.htm

  5. Frequently asked questions (FAQ) on the Quality Assurance Regulation. The FAQs were updated on November 26, 2006. https://www.fhwa.dot.gov/pavement/materials/matnote11.cfm#qaa

  6. A contract has been awarded for the delivery of NHI Course 134042, "Materials Control and Acceptance - Quality Assurance." The course is four days long and covers the basic essentials of QA. A two-day version of the course is also available. http://www.nhi.fhwa.dot.gov/training/brows_catalog.aspx

  7. A 1-day workshop titled "PWL Basic" is being offered by the FHWA. https://www.fhwa.dot.gov/pavement/pwl/basic_pwl.cfm

  8. "Optimal Procedures for Quality Assurance Specifications", Publication No. FHWA-RD-02-095, Federal Highway Administration, Washington, DC, April 2003, https://www.fhwa.dot.gov/pavement/pub_details.cfm?id=89

  9. "Evaluation of Procedures for Quality Assurance Specifications", Publication No. FHWA-HRT-04-046, Federal Highway Administration, Washington, DC, October 2004, https://www.fhwa.dot.gov/pavement/pub_details.cfm?id=367

  10. The rewrite of the AASHTO Standard Recommended Practice R 9-05, "Acceptance Sampling Plans for Highway Construction" has been published in the 2005 AASHTO Standards. This guide will assist the States in developing specifications.

Status of other Quality Assurance Activities

The following resources are being developed to address issues that are not being covered by existing resources:

  1. A software package SPECRISK is being developed by FHWA as a tool to help analyze risks associated with Percent Within Limit (PWL) specifications. The software will be completed in the summer of 2007.

  2. The Quality Assurance Technologist Course that was developed by the New England Transportation Technician Certification Program (NETTCP) has been finalized and will be available at the end of calendar year 2007 as NHI Course 134064.

  3. A contract for developing NHI Course 134059 - "Quality Assurance Specification Development and Validation Course" is expected to be awarded during the fall of 2007. The course is expected to be available by the end of 2008. The course will use the software that is currently being developed to assist the States in developing and validating the risks associated with QA specifications.

Conclusion

The stewardship reviews will continue next year and beyond along with the continued development and updating of resources in order to continuously improve the QA program.

Updated: 06/27/2017