TMIP Peer Review Program Assessment and Evaluation Report
6.0 Conclusions
This assessment and evaluation took a wide-angle view of the TMIP Peer Review Program by reviewing all twenty-eight peer reviews conducted since the program's inception. To summarize the program's achievements, the reviews conducted between 2003 and 2011 were categorized along several dimensions: year, geography, agency size, peer panel composition, and motivating factors. This categorization reveals that the program has achieved a well-balanced mix of reviews among large, medium, and small agencies and has done a good job of attracting panelists with diverse backgrounds and varied expertise.
The assessment also examined the peer reviews conducted to date to draw out salient generalized lessons, observed model limitations, and suggested recommendations, as well as general policy and modeling trends. The report illustrates the broad patterns in the practice of travel modeling that emerged from examining all twenty-eight peer reviews.
The major trends and themes were identified by isolating the specific technical questions posed by each host agency to the peer panel, as well as the prioritized model improvement recommendations presented to each host agency by the peer panel. The technical questions and recommendations were grouped into major topic areas in order to quantify which topics have been discussed most often over the years.
A few key findings emerged from this in-depth look at the TMIP peer reviews:
- Most of the technical questions center on modeling guidelines, data collection/preparation, and observed best practices, which is to be expected since the peer reviews are primarily model assessment exercises.
- Large and medium-sized agencies typically submit more technical questions for the panel to address directly than do smaller agencies.
- Host agencies generally ask the same kinds of questions regardless of agency size, with no clear chronological trends.
- Most of the panel recommendations center on increasing the detail of the existing travel modeling tools (e.g., geographic detail, market segmentation, time of day, land use types, mode choice sets).
- The peer panels generally recommend the same kinds of improvements regardless of agency size, with no clear chronological trends.
- More of the panel recommendations tend to be identified as shorter-term priorities, and these often involve increasing the detail of the existing modeling tools and procedures (e.g., spatial and temporal resolution, more detailed input data).
- The inclusion of freight/commercial modeling, the transition to activity-based demand modeling, dynamic traffic assignment (DTA), microsimulation, and land use modeling frequently appear as long-term panel recommendations.
Finally, the assessment examined past TMIP synthesis reports and elicited feedback from recent host agency participants to evaluate the effectiveness of the TMIP peer review program. Participant satisfaction is very high, and agency staff and participants overwhelmingly report that they benefited from participating in the program. There is near-universal agreement that participation has helped advance the modeling tools and procedures used by the host agencies.
The following recommendations for improving the TMIP peer review program are based on a comprehensive, in-depth review of the twenty-eight peer reviews conducted since 2003, a review of past TMIP synthesis/evaluation reports, and 'user-experience' feedback elicited directly from past agency participants. Note that the recommendations below are not in priority order.
- Convene three to four peer reviews per year and more actively promote the program during years when interest appears low.
- Promote the program in parts of the country where TMIP peer review program participation has not yet occurred.
- Continue to attract peer panelists with a diverse set of backgrounds and varied expertise, without becoming too reliant on representatives from particular industry sectors or on particular individuals. The peer networking and knowledge sharing offered by the program are perhaps more important than the technical assessments.
- Continue promoting equal participation among large, medium, and small-sized agencies.
- Consider ways to make peer review meeting materials available to a broader audience beyond just the final report.
  - Record and make available the peer review meeting sessions themselves (via web-conferencing tools, if agencies agree).
  - Post meeting materials such as PowerPoint presentation slides, meeting agendas, model documentation, and panel recommendations along with the final report on the TMIP website.
- Consider having the TMIP program play a more active role before, during and after the peer review if budget permits.
  - Provide technical assistance with structuring the review agenda, developing the information needed to prepare for the review, and following up after the review.
  - Develop templates that host agencies can use to streamline the TMIP peer review application, planning, and preparation processes.
  - Review and help develop the peer review agenda so that it is clear and can reasonably be covered in the allotted meeting time.
- Consider having the TMIP program review and comment on published documentation to ensure it is reasonably thorough, prior to agreeing to sponsor a peer review.
- If budget permits, consider incorporating a "Preliminary Model Assessment" as the first phase of the TMIP peer review process, conducted in advance of the meetings with the peer panel. The preliminary assessment would provide the "independent eye" often cited by host agencies and could identify whether the model is appropriate for the host agency's intended applications. It would also help highlight the elements of the travel modeling tools and procedures most deserving of time and discussion during the formal in-person review.