TMIP Peer Review Program Assessment and Evaluation Report

5.0 Effectiveness of TMIP Peer Review Program

TMIP peer reviews yield many valuable insights for the host agency. The preceding sections of this report described and quantified the major themes discussed at each of the peer reviews. This evaluation and assessment also examined the overall effectiveness of the TMIP peer review program and whether it is advancing and promoting the overall TMIP goals.

The effectiveness of the TMIP peer review program was evaluated using two different sources:

  1. The past TMIP evaluation and synthesis reports, and
  2. Direct feedback from recent host agency participants.

5.1 Past TMIP Evaluation and Synthesis Reports

As described in Section 4.1 of this report, three synthesis reports have been prepared since the TMIP peer review program was inaugurated in 2003. Table 5 below summarizes the peer reviews conducted from 2003 to 2007 that were examined and summarized in these synthesis reports.

  1. The TMIP Peer Review Program Synthesis Report, dated November 2004, was prepared by the Volpe National Transportation Systems Center (Volpe, 2004). It summarized the first seven peer reviews, conducted between 2003 and 2004.
  2. The TMIP Peer Review Program Synthesis Report 2, dated September 2005, was prepared by the Texas Transportation Institute (TTI, 2005). It summarized the next five peer reviews, conducted between 2004 and 2005.
  3. The TMIP Peer Review Program Evaluation Report, dated April 2009, was again prepared by the Volpe National Transportation Systems Center (Volpe, 2009). It summarized the four peer reviews conducted between 2005 and 2007. In addition, the authors interviewed participants from four past host agencies (DRCOG, SCAG, SEMCOG, BMC) to assess the program's overall effectiveness.

Figure 18: Past TMIP Synthesis & Evaluation Reports (covers of the three reports, dated November 10, 2004; September 10, 2005; and April 27, 2009)

Table 5: Past TMIP Peer Reviews (2003-2007)
City | State | Agency | Year(s) | Synthesis/Evaluation Report(s)
Louisville | Kentucky | OKI | 2003 | Volpe (2004)
Anchorage | Alaska | AMATS | 2004 | Volpe (2004)
Atlanta | Georgia | ARC | 2004 | Volpe (2004)
- | Iowa | IaDOT | 2004 | Volpe (2004)
- | North Carolina | NCDOT | 2004 | Volpe (2004)
Denver | Colorado | DRCOG | 2003, 2004 | Volpe (2004), Volpe (2009)
Los Angeles | California | SCAG | 2003, 2004, 2006 | Volpe (2004), Volpe (2009)
San Francisco | California | MTC | 2004 | TTI (2005)
Colorado Springs | Colorado | PPACG | 2005 | TTI (2005)
Memphis | Tennessee | MATA | 2004, 2006 | TTI (2005)
Detroit | Michigan | SEMCOG | 2004 | TTI (2005), Volpe (2009)
Baltimore | Maryland | BMC | 2004, 2005 | TTI (2005), Volpe (2009)
Newark | New Jersey | NJTPA | 2005 | Volpe (2009)
San Diego | California | SANDAG | 2005 | Volpe (2009)
St. Louis | Missouri | EWGCG | 2006 | Volpe (2009)
Boise | Idaho | COMPASS | 2007 | Volpe (2009)

The reader is encouraged to review the past TMIP peer review synthesis and evaluation reports for detailed summaries of the peer reviews conducted during this period. The past reports document the technical recommendations, propose improvements to the peer review program, and provide feedback collected from a subset of agency participants. Those findings will not be reiterated here. However, it is worth emphasizing some of the major themes expressed in all three synthesis/evaluation reports, especially as they relate to the effectiveness of the TMIP peer review program.

As documented in the past synthesis/evaluation reports, the overall effectiveness of the TMIP peer review program centered on a few primary elements. Effectiveness was judged based on:

  1. Planning the peer review,
  2. Participant satisfaction, and
  3. Panel and host agency recommendations.

5.1.1 Planning the Peer Review

The authors emphasized the importance of properly planning the peer review. Host agencies were encouraged to allow sufficient lead time, since planning, preparing for, and then convening the peer review takes a substantial amount of time and effort. The host agency must develop a charge to the peer panel, set the meeting schedule and agenda, select panel members, provide background material, develop presentation materials, and plan and coordinate all of the multi-day meeting logistics.

The authors also stressed that, well in advance of the peer review meetings, the host agency must provide the peer panel with specific information on the details of the model (documentation) and the objectives of the meeting (the charge to the panel). This can also be a time-consuming exercise. However, panel members must arrive at the meeting with a good understanding of the technical details so that valuable meeting time is not spent explaining routine or mundane aspects of the travel model system. To fully engage and take advantage of the peer panel, good model and model development documentation is critical for an effective peer review.

The authors also observed that the most effective peer reviews often began with discussions among senior staff, policy-makers, and other model stakeholders on the status of the current modeling tools and procedures as well as initiatives to be explored in the near and long term. A peer review can help make senior managers and policy-makers more aware of the strengths and weaknesses associated with an agency's modeling tools, ultimately enabling them to make better informed decisions about allocating resources to travel model improvement initiatives.

5.1.2 Participant Satisfaction

Both the TTI (2005) synthesis report and the Volpe (2009) evaluation report present findings and feedback elicited from previous peer review host agency participants. Overwhelmingly, agency satisfaction with the program is very high. Almost universally, the peer reviews met or exceeded the agencies' expectations, and the peer panels were characterized as collegial, technically skilled, and generally interested in helping the host agency while promoting improved techniques and methods.

The primary motivation for an agency to participate in a TMIP peer review is to obtain an independent assessment of its travel modeling tools and procedures from a group of respected technical leaders. That said, the professional and peer networking opportunities made available and fostered through the TMIP peer review program are just as important as the actual technical assessments. The information, skills, and expertise shared and transferred among practitioners during TMIP peer reviews are tremendously valuable, providing an opportunity to exchange different viewpoints and different solutions to complex behavioral and computational problems. This knowledge sharing is critically important as advanced tools and techniques become more widely adopted and agency staff are simultaneously asked to do more with less.

5.1.3 Panel and Agency Recommendations

The authors of the past TMIP synthesis reports devote significant attention to whether or not panel recommendations were actually implemented by the host agency. Many factors influence whether an agency implements a list of short- and long-term panel recommendations: available staff time and resources, the agency's current priorities, its planning objectives and responsibilities, and whether the agency was in full agreement with the peer panel, to name a few. In our judgment, the percentage of recommendations implemented is therefore not a reliable yardstick for assessing the effectiveness of the program.

Host agency participants interviewed during this synthesis and evaluation effort often recommended more guidance, technical assistance, and involvement from TMIP staff in all phases of the peer review process, both before and after the peer review meeting. As described earlier, hosting a peer review requires a good deal of effort on the part of the host agency, especially for medium- and small-sized agencies with few or no dedicated modeling staff. The Volpe (2009) evaluation report in particular documented this need and proposed more active involvement by TMIP staff.

5.2 Recent Host Agency Participant Feedback

A number of TMIP peer reviews have been conducted since Volpe prepared the last evaluation report in 2009: twelve additional peer reviews have been convened throughout the U.S. since 2008. Table 6 below presents the peer reviews conducted since the preparation of the last evaluation report.

Table 6: Recent TMIP Peer Reviews (2008-2011)
City | State | Agency | Year
Logan | Utah | CMPO | 2008
Davenport | Iowa | BRC | 2008
St. George | Utah | DMPO | 2008
Dubuque | Iowa | ECIA | 2008
Sacramento | California | SACOG | 2008
Austin | Texas | CAMPO | 2009
Philadelphia | Pennsylvania | DVRPC | 2009
Omaha | Nebraska | MAPA | 2010
Burlington | Vermont | CCMPO | 2011
Chattanooga | Tennessee | CHCNGA-TPO | 2011
Monterey | California | AMBAG | 2011
New York | New York | NYMTC | 2011

A set of ten questions was submitted to the modeling contact person at each of the recent host agencies identified in Table 6 above. The interview questions are identical to those developed by Volpe in 2009, with one exception. Using the same questions makes it possible to draw out commonalities across all agency responses rather than only among the reviews conducted since 2008. Appendix B contains the list of questions submitted to each agency.

Responses were received from six of the recent host agencies that were contacted. Table 7 lists the agencies whose staff responded with feedback on their TMIP peer review experience.

Table 7: Agency Respondents
City | State | Agency | Year of Review
Logan | Utah | CMPO | 2008
Davenport | Iowa | BRC | 2008
St. George | Utah | DMPO | 2008
Austin | Texas | CAMPO | 2009
Philadelphia | Pennsylvania | DVRPC | 2009
New York | New York | NYMTC | 2011

5.2.1 Recent Host Agency Participant Satisfaction

All six agencies that responded to the interview questions expressed very high satisfaction with the TMIP peer review program and indicated they would recommend participation to other agencies. Some respondents expressed an interest in participating in another peer review once their agency has had an opportunity to implement some of the recommended advancements identified in the model improvement roadmaps developed by the peer panels. In addition, one individual expressed an interest in serving on a peer review panel in the future. These facts alone testify to the program's effectiveness and to participants' satisfaction with it.

Interestingly, one participant said,

"I would highly recommend involvement in the program. Small agencies with limited technical capabilities are likely to greatly benefit from the program."

However, another respondent indicated that,

"It is recommended that any large agency/MPO with a complex region take advantage of this program. The larger MPOs with complex and diverse transportation regions will benefit most from this program."

Clearly, agency staff who have participated in recent TMIP peer reviews feel the reviews are valuable to agencies of all sizes.

5.2.2 Recent Host Agency Recommendations

The recent host agency respondents provided a number of recommendations to improve the TMIP peer review program going forward. Some of the recommendations that were common among the recent host agency responses are presented below:

  1. The final report took too long to produce and, in some cases, was delivered months after the peer review meeting.
  2. An in-depth review of the "guts-of-the-model" before the actual TMIP peer review by an outside group of consultants proved to be very valuable as it enabled the assembled experts to focus on model improvement recommendations.
  3. The vast range of experience among the peer panels is the program's greatest asset. The TMIP peer review program should continue to select a good mix of practitioners with diverse backgrounds and avoid an over-reliance on consultants as peer panel members.
  4. The host agencies expected a different level of engagement from TMIP staff than was provided. TMIP primarily funded travel reimbursements, while agency staff did most of the heavy lifting in planning and conducting the review. Agencies desired technical assistance with structuring the review agenda, developing the information needed to prepare for the review, and following up after the review.

It is notable that these recommendations, elicited from host agencies that participated in reviews conducted between 2008 and 2011, echo the sentiments of past agency participants: efforts conducted before the review help to ensure a more productive review; the composition of the peer panel is critically important to the success of the review; and host agencies desire a greater level of involvement by TMIP staff before, during, and after the in-person review.
