Office of Planning, Environment, & Realty (HEP)

TMIP Peer Review Program Assessment and Evaluation Report

4.0 TMIP Peer Reviews - Trends & Themes

TMIP peer reviews yield many interesting and valuable insights for the host agency. Some of the findings are agency specific, while many are common and can be generalized. A review of the twenty-eight peer reviews was performed to draw out the salient lessons, observed model limitations, suggested recommendations, as well as general policy and modeling trends. This section of the report presents these themes.

Major trends and themes can be culled from the peer review final reports by getting a sense for what was discussed at each of the individual peer review meetings. There are two important elements available in almost all of the peer review final reports that can be used for this purpose. The peer review final reports typically include:

  1. The host agency technical questions posed to the peer panel, and
  2. The panel recommendations delivered to the host agency

The host agency technical questions and the panel recommendations can then be organized into major topic areas to draw out commonalities that exist across the industry and across planning agencies of different sizes, and to show how the major topics have changed over time.

4.1 Review of Host Agency Technical Questions

As part of the TMIP peer review application process, the host agency is required to develop a "charge to the peer review panel." This charge to the peer review panel is often conveyed as a list of topic areas the host agency is interested in and most commonly as a list of ten to fifteen specific technical questions. The technical questions posed to the peer review panel can therefore provide a clear picture of the concerns, challenges and modeling issues from the perspective of a host agency.

The format and structure of the TMIP peer review final reports have evolved over time partly as a result of varied authorship over the years (Volpe 2003-2004, TTI 2004-2005, Volpe 2005-2007, RSG 2010-current). The specific technical questions posed by the host agency have only recently been explicitly included in the peer review final reports as an appendix. However, the technical questions that were likely posed can often be found and extracted from other sections in the older peer review final reports.

The specific technical questions posed by the host agency are well documented for thirteen of the twenty-eight total peer reviews, and these thirteen reviews still represent a good mix of large, medium, and small sized agencies. Table 3 below identifies the agency peer reviews for which technical question documentation is included in the final report. Table 4 characterizes the thirteen reviews that identified the technical questions using the large, medium, and small agency categorization.

Table 3: TMIP Peer Reviews - Documentation of Technical Questions
Technical Questions Posed Count Agencies
Documented in Final Report 13 NYMTC, NCDOT, MTC, DVRPC, IaDOT, BMC, SACOG, CAMPO, MAPA, AMBAG, CHCNGA-TPO, AMATS, CCMPO
Not Documented in Final Report 15 SCAG, NJTPA, SEMCOG, ARC, SANDAG, DRCOG, EWGCG, OKI, MATA, PPACG, COMPASS, BRC, DMPO, CMPO, ECIA
Total 28

Table 4: TMIP Peer Reviews with Well-Documented Technical Questions
Agency Size Count Agencies
Large 5 NYMTC, NCDOT, MTC, DVRPC, IaDOT
Medium 5 BMC, SACOG, CAMPO, MAPA, AMBAG
Small 3 CHCNGA-TPO, AMATS, CCMPO
Total 13  

4.1.1 Technical Questions - Major Topic Areas

Approximately two hundred specific technical questions were posed to the peer review panels in the thirteen host agency reviews where this information is well documented in the final reports. This broad and diverse set of technical questions was categorized using twenty-one generalized major topic areas. The process by which the technical questions were collected and grouped into these topic areas is somewhat subjective. A sample technical question and the resulting topic area assignment, along with the host agency and peer review date, are presented below.

"How accurate is the travel model in capturing intrastate and interstate freight movements?" (AMATS, 2004)

Topic Area → Freight Modeling

Some judgment is required as this particular question posed by staff during the Anchorage, Alaska peer review could have been categorized into other major topic areas as well. The intent was to categorize the technical questions in a straightforward way without attributing the same question to multiple topic areas, although there were a few cases when this was done. Appendix A provides detailed descriptions of the twenty-one generalized major topic areas along with an example of a specific host agency technical question that was attributed to the topic area.

Figure 8 illustrates the percent share of technical questions by topic area posed by host agencies, sorted from smallest to largest. For example, eleven percent of all the technical questions posed by host agencies were related to calibration and validation. Figure 9 disaggregates the data presented in Figure 8 by agency size. For example, thirteen percent of all technical questions posed by large-size host agencies were related to calibration and validation, compared with ten percent of the questions posed by medium-size host agencies and almost none of the questions posed by small-size host agencies. Finally, Figure 10 disaggregates the data presented in Figure 8 by year. The yearly data was grouped into three ranges: 2004-2005, 2008-2010, and 2011. This was done to eliminate years in which few (or no) technical questions were documented in the peer review final reports (e.g., 2003, 2006-2007). Note that the same axis category order and axis scaling are applied to each figure to facilitate comparisons down the page.
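The percent shares underlying Figures 8 through 10 amount to a simple tally of categorized questions. A minimal sketch of that calculation in Python is shown below; the question records here are hypothetical stand-ins, as the actual topic assignments come from the thirteen well-documented final reports.

```python
from collections import Counter

# Hypothetical sample of categorized technical questions: (agency, year, topic area).
# The real assignments were drawn from the thirteen peer review final reports.
questions = [
    ("AMATS", 2004, "Freight Modeling"),
    ("MTC", 2004, "Calibration / Validation"),
    ("SACOG", 2008, "Calibration / Validation"),
    ("NYMTC", 2010, "Activity-Based Modeling"),
    ("CCMPO", 2011, "Non-Motorized Modeling"),
]

def share_by_topic(records):
    """Percent share of questions by topic area, sorted from smallest to largest."""
    counts = Counter(topic for _, _, topic in records)
    total = sum(counts.values())
    return sorted(
        ((topic, 100.0 * n / total) for topic, n in counts.items()),
        key=lambda pair: pair[1],
    )

for topic, pct in share_by_topic(questions):
    print(f"{topic}: {pct:.0f}%")
```

The same tally, filtered by agency size or grouped by year range, produces the disaggregated views shown in Figures 9 and 10.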


Figure 8 Share of Technical Questions by Topic Area


Figure 9 Technical Questions by Topic Area by Agency Size


Figure 10 Technical Questions by Topic Area by Calendar Year

A few key findings emerge based on this review of the technical questions posed by the host agencies:

  1. Most of the technical questions center on modeling guidelines, data collection/preparation, calibration/validation, and observing best practices, which is to be expected since the peer reviews are primarily model assessment exercises.
  2. Medium and large size agencies tend to ask more targeted and specific technical questions for the peer review panels to address than do the smaller sized host agencies.
  3. In general, host agencies regardless of size seem to be asking the same kinds of questions. However, some topic area questions are more prevalent in certain sized agencies. For example, there are many more questions at small-sized agencies regarding the integration of land-use and transportation planning, likely because medium and large sized agencies have already made some progress towards this goal. The large size agencies also tend to have fewer questions relating to the development of land use forecasts than do small and medium size agencies. In addition, small agencies have more questions about non-motorized modeling than do medium and large agencies.
  4. In general, the same kinds of questions have been asked over the years the program has been in existence, with no clear chronological trends. Modeling concepts important in 2004 are still important and frequently discussed today. However, some topic area questions have become more prevalent over time. For instance, questions pertaining to activity-based modeling have been more frequent in recent years, which is to be expected as the industry continues to adopt new advanced methods. Questions pertaining to fuel pricing have also been on the rise, which is likewise to be expected given the price volatility that has been observed.

4.2 Review of Peer Panel Recommendations

Each TMIP peer review culminates in a list of recommendations which the peer panel presents to the agency staff. Peer panel recommendations are typically delivered as short-term and long-term priorities the agency should consider to improve its travel modeling tools and procedures. As with the technical questions submitted to the panel, a list of about ten to fifteen panel recommendations is presented to the agency staff in the final session of the multi-day meeting, which concludes the peer review. These recommendations are especially valuable given the make-up of the peer review panels. As described earlier in this report, the panelists are prominent practitioners and nationally recognized technical leaders in the industry. A review of their specific recommendations can therefore provide a clear picture of the concerns, challenges, and issues, as well as solutions for addressing them, from the perspective of the peer panel experts.

All twenty-eight peer review final reports have good documentation on the recommendations presented by the peer panel at the conclusion of the review.

4.2.1 Panel Recommendations - Major Topic Areas

Approximately 175 different specific panel recommendations were presented to the host agencies in the twenty-eight peer review final reports. This broad and diverse set of panel recommendations was categorized using the same twenty-one generalized major topic areas used to categorize the host agency technical questions in the preceding section. The process by which the panel recommendations were collected and grouped into the generalized major topic areas is somewhat subjective. A sample panel recommendation and the resulting topic area assignment along with the host agency and peer review date are presented below.

"MTC should consider developing a finer-grained regional zone system." (MTC, 2004)

Topic Area → Zones & Networks

Some judgment is again required, as was the case in assigning the host agency technical questions to generalized topic areas. The intent was to categorize the panel recommendations in the most straightforward way possible without attributing the same recommendation to many different topic areas. Appendix A provides detailed descriptions of the twenty-one generalized major topic areas. Peer panel recommendations limited to very specific agency implementation issues (e.g., remove bridge penalties) were not considered for this assessment and evaluation.

Figure 11 illustrates the percent share of panel recommendations by topic area, sorted from smallest to largest. For example, thirteen percent of all the peer panel recommendations were related to administrative items. Figure 12 disaggregates the data presented in Figure 11 by agency size. For example, just over fourteen percent of all the peer panel recommendations made during large-size agency reviews were related to the administrative topic area, compared with eight percent of the recommendations made during medium-size agency reviews and slightly more than sixteen percent of those made during small-size agency reviews. Figure 13 disaggregates the data presented in Figure 11 by year. The yearly data was grouped into three ranges: 2003-2005, 2006-2008, and 2009-2011. Grouping the recommendations into consecutive three-year ranges is possible because the panel recommendations are well documented in all the peer review final reports, which was not the case with the host agency technical questions. Finally, Figure 14 presents the panel recommendations based on the panel's prioritization (e.g., long-term, short-term). Note that the same axis category order and axis scaling are applied to each figure to facilitate comparisons down the page.


Figure 11 Panel Recommendations by Topic Area


Figure 12 Panel Recommendations by Topic Area by Agency Size


Figure 13 Panel Recommendations by Topic Area by Calendar Year


Figure 14 Panel Recommendations by Topic Area by Priority

A few key findings emerge based on this review of the recommendations delivered to the host agency by the peer review panels:

  1. Most of the panel recommendations are centered on increasing the detail of the existing travel modeling tools (e.g. geographic, market segmentation, time of day, land use types, mode choice sets, etc.)
  2. In general the peer panels seem to be recommending the same kinds of improvements regardless of agency size. However, some topic area recommendations are more prevalent in certain sized agencies. For example, a higher share of the recommendations made during small-size agency reviews pertain to administrative items, assignment techniques, and spatial input data (zones & networks) than at medium and large agencies. In addition, recommendations related to multi-scale modeling (subarea, microscale, simulation) are more common at medium and large size agencies than small size agencies. This is not surprising given that small agencies typically lack the staff and resources for multi-scale and multi-resolution modeling.
  3. In general, the same kinds of recommendations have been made over the years the program has been active with no clear chronological trends. Modeling concepts important almost ten years ago are still important and frequently discussed today. However, some topic area recommendations have become more prevalent over time. For example, recommendations pertaining to activity-based modeling, DTA, and microsimulation have been more frequent since 2009 which is to be expected since these advanced techniques are becoming more widely adopted.
  4. More of the panel recommendations tend to be identified as shorter-term priorities and often involve increasing the detail of the existing modeling tools and procedures (spatially, temporally, more detailed input data, etc.)
  5. The inclusion of freight/commercial modeling, transitioning to activity-based demand modeling, dynamic traffic assignment (DTA), microsimulation and land-use modeling frequently appear as long-term panel recommendations.

4.3 Summary of TMIP Peer Reviews

As described in the preceding sections, the same set of generalized topic areas was used to categorize both the technical questions posed to the panel by host agencies and the model improvement recommendations presented to the host agency by the peer review panel. In many cases the peer panels respond directly to certain questions posed by the agency in their final recommendations. To provide an overall summary, the technical questions and the panel recommendations were merged and evaluated together.

4.3.1 Major Topic Areas - Questions & Recommendations

This section of the report identifies which major topic areas were discussed across all twenty-eight peer reviews. A major topic area is assumed to have been discussed if at least one technical question was posed or at least one panel recommendation was made on the topic during the review.

Figure 15 illustrates the percent share of questions and recommendations by topic area, sorted from smallest to largest. Figure 16 disaggregates the data presented in Figure 15 by agency size. Figure 17 disaggregates the data presented in Figure 15 by year. The yearly data was again grouped into three ranges: 2003-2005, 2006-2008, and 2009-2011. Note that the same axis category order and axis scaling are applied to each figure to facilitate comparisons down the page.

Figure 15, Figure 16, and Figure 17 help visualize and emphasize two important findings from the earlier technical question and panel recommendation summaries:

  1. The kinds of questions and recommendations discussed during the TMIP peer reviews are germane to agencies of all sizes - large, medium, and small agencies alike. For example, adding detail to geographic input data such as traffic analysis zone structures, as well as roadway and transit networks was identified in agency reviews of all three sizes.
  2. The kinds of questions and recommendations discussed during TMIP peer reviews have remained somewhat constant since the program's inauguration. For example, time of day modeling was an important topic in the reviews conducted in 2004 and 2005 and was equally important in 2008 through 2011.

Continued tracking of the TMIP peer review program trends and themes along the dimensions presented in this report will be very beneficial moving forward. TMIP is developing, and should continue to develop, tools that can streamline the assessment and evaluation of the peer review program.


Figure 15 Share of Questions and Recommendations by Topic Area


Figure 16 Share of Questions and Recommendations by Topic Area by Agency Size


Figure 17 Share of Questions and Recommendations by Topic Area by Calendar Year

Updated: 5/23/2017
Federal Highway Administration | 1200 New Jersey Avenue, SE | Washington, DC 20590 | 202-366-4000