November 28-29, 2017
Utah Department of Transportation
The Highway Safety Improvement Program (HSIP) comprises three components: planning, implementation, and evaluation. While planning and implementing projects are essential to addressing opportunities for safety improvement, evaluation is critical to understanding project and program effectiveness. Employing more consistent and reliable evaluation methods will support future HSIP decisions, optimize the return on investment of safety funding, and increase the effectiveness of projects and programs.
State transportation agencies continue to establish and enhance HSIP evaluation practices. Many States are tracking at least basic project information, evaluating projects, employing more advanced methods, and reporting results to stakeholders. While States are making progress in enhancing HSIP evaluation practices, there is still a great deal of variation among States. Agencies need specific guidance to track and evaluate the effectiveness of projects, countermeasures, and programs. There is also an opportunity for many to learn from the successes and challenges of others.
To help advance evaluation practices, the FHWA Office of Safety hosted an HSIP Evaluation Peer Exchange on November 28-29, 2017 at the Utah Department of Transportation headquarters in Taylorsville, UT. The purpose of the peer exchange was to facilitate the exchange of noteworthy practices and lessons learned among the States. The peer exchange also served as an opportunity to promote the recent release of the FHWA HSIP Evaluation Guide.
The peer exchange was organized around six main topics related to HSIP evaluation:
- Project Tracking
- Project Evaluation
- Countermeasure Evaluation
- Program Evaluation
- Using Evaluation Results
- Preparing for HSIP Evaluation
For each topic, peer States led with presentations or key remarks, followed by a roundtable discussion. At the end of the peer exchange, the participants divided into four breakout groups. The breakout groups provided an opportunity to reflect and share thoughts on strengths, weaknesses, opportunities, and threats related to implementing and advancing HSIP evaluation efforts. Each State also identified key takeaways from the peer exchange. Attachment A includes the detailed peer exchange agenda.
The following States attended the peer exchange. Attachment B includes a full list of attendees.
- New Jersey
- South Carolina
Topic Area 1: Project Tracking
This topic discussion focused on how States maintain an inventory of completed HSIP (and non-HSIP) projects. Virginia DOT and Idaho DOT presented their efforts to lead off discussion of the topic, followed by a brief overview by Frank Gross of examples from other States.
Virginia uses tracking tools developed in-house to assist project managers with the tracking of their projects. These tracking tools include dashboards and maps that help present information in an intuitive manner. VDOT uses its recently developed Smart Portal as an intake process for project submittal and readiness to prioritize HSIP funding and feeds those projects to its Integrated Six Year Plan and project tracking tools. VDOT HSIP funds are mainly available for two types of projects: traditional location-specific projects and systemic low-cost projects. Systemic low-cost safety projects include the FHWA proven safety countermeasures: flashing yellow arrows, retroreflective backplates, high-intensity activated crosswalk beacons, pedestrian refuge islands, rumble strips, curve delineations, road diets, and safety edge.
Idaho uses the Office of Transportation Investment system to track all its transportation projects. Idaho staff have made improvements in recent years to enhance the amount of information available in its tracking systems, including detailed scopes, dates of the projects, and start and ends points on the roadway. Idaho intends to use the enhanced information to set up protocols to pull needed information for HSIP evaluation and for countermeasure evaluation.
Frank Gross briefly described Alaska’s use of spreadsheet-based tools for project tracking. He also showed an example spreadsheet used by Massachusetts DOT.
After the presentations, meeting participants engaged in an open discussion on the topic.
California attendees indicated that there are instances in their State where the completion date for a project is tied to the date recorded in FHWA’s Fiscal Management Information System (FMIS). California and Utah use the substantial completion date because construction completion can linger. New Jersey tracks its project construction dates.
The attendees discussed how many years of data are needed after the completion of a project to evaluate it. Most of the participating States indicated they use three years of data, which is consistent with the state of the practice. As discussed later, no more than five years is recommended due to changing conditions.
Utah continues to improve its crash data so that this information can help with the evaluation of specific locations, such as curves. New Jersey indicated that it continues to use paper crash reports and that there may be over-confidence with data associated with low-volume roads. Minnesota is applying a linear reference system to its crash information, which will help with the automation of its HSIP application process, particularly with an interface for point-and-click selection on State maps.
Participants identified the following opportunities related to project tracking:
- Some States are able to identify and extract safety elements for non-HSIP projects. Other States are interested in how this is accomplished.
- There is a need to track projects with multiple countermeasures as well as projects with multiple sites. There are challenges to both and States are interested in the most efficient ways to track this information.
Topic Area 2: Project Evaluation
This topic discussion focused on the tools States use to evaluate their HSIP projects, with South Carolina featured during the discussion.
South Carolina showed its use of crash diagrams and supporting spreadsheet tables to evaluate its projects. The State also set up its evaluation processes for uniformity with benefit-cost reporting. South Carolina also discussed its use of a color-coded spreadsheet for project tracking. While there is a desire to enhance current project tracking capabilities and move toward something like Virginia’s Smart Portal, the color-coded spreadsheet is relatively simple and easy to implement until other tools are available.
Frank Gross provided examples of project evaluation in other States. He described the use of tables that are used in the different districts of Colorado DOT. He also showed the use of crash diagrams in North Carolina and Wisconsin and the web-based access to project evaluation information at North Carolina DOT.
After the presentations, attendees continued their discussion on project evaluation approaches.
The attendees discussed the use of generic costs to analyze types of projects. Colorado indicated that HSIP is viewed as supplemental funding to State projects and that it is preferable not to have all funds dedicated to one project. For example, Colorado never uses HSIP funds to cover the total cost of a roundabout.
The attendees then discussed project evaluation timeframes. Minnesota’s project evaluation approach removes any calendar years that have construction activity. Arkansas works with its public information office to acquire supplemental descriptive information for its projects. California expressed a desire to have more than three or five years of evaluation data with the additional comment that data collection and evaluation should be continuous. Meeting attendees indicated that ten years of evaluation data for a project may be too long of a period because the built environment around the project may have changed significantly during that time. Minnesota uses an Empirical Bayes approach for its evaluation. Idaho currently uses a simple before and after approach, but expressed a desire to use an Empirical Bayes approach.
Participants identified the following opportunities related to project evaluation:
- There is a good deal of variability among States with respect to the use of average crash costs. While there is a need to apply crash costs consistently within a State or agency, there was a discussion on the impact of the magnitude of costs. Some felt the actual costs do not matter as long as the agency applies costs consistently among projects. Others noted the potential impact of artificially higher or lower costs (e.g., higher crash costs will lead to more projects that appear cost-beneficial). There is an opportunity to assess the impact of the magnitude of crash costs in justifying projects and programs.
Topic Area 3: Countermeasure Evaluation
This topic discussion focused on approaches and methods States are using to evaluate countermeasures and develop Crash Modification Factors, including evaluations of multiple project sites and combinations of countermeasures. Minnesota, Kentucky, and Arkansas described their approaches.
Minnesota described its approaches to evaluating systemic projects, namely signing, pavement markings, and intersection lighting. The State has found these evaluations difficult, especially when accounting for risk factors. Minnesota uses control sections where possible, but noted occasional difficulties accounting for projects with limited mileage (i.e., small samples). They also noted that countermeasure costs are increasing and the total mileage of “treated” roadway is decreasing annually. The State is investigating the use of assistance outside of MnDOT, such as its university system, to develop better approaches for systemic evaluations.
Minnesota uses traffic volume and roadway characteristics for its project selection criteria. MnDOT is also decentralized, therefore, the central office provides criteria to the districts so that projects are managed on a district basis. Minnesota could develop rumble strip policy based on some of the evaluations. Although Minnesota is applying nationally proven countermeasures, the State wants to evaluate their actual effectiveness in the State. The effectiveness of a countermeasure is also dependent on location in the State. Minnesota is also trying to identify the number of miles affected annually by HSIP funds.
There is no safety division within the Kentucky Transportation Cabinet (KYTC). The KYTC has four central office HSIP staff and one HSIP coordinator in each of its districts. Kentucky’s annual HSIP reports focus on money spent, so they do not capture benefits for non-HSIP funded projects. The KYTC partners with the University of Kentucky to support the HSIP program, including the evaluation of countermeasures and the development of annual HSIP reports. The University of Kentucky uses the Wilcoxon signed-rank test and Empirical Bayes approaches to evaluate projects.
Kentucky has an HSIP Investment Plan. As Kentucky treats more sites with high friction surface treatments, the benefit-cost ratios are decreasing. KYTC has a standard procedure to use rumble strips; therefore, HSIP funds are no longer used for this countermeasure. The same goes for safety edge. The State also refers to safety edge as “durable pavement edge”. HSIP funding helps to change perceptions of safety investment by decision-makers. Kentucky is selective with its high friction surface treatments and restricts the use of specific materials.
Kentucky involves maintenance and administration staff to educate all parties on the benefits of specific countermeasure design and application. Nevada also has a similar approach and this also helps educate contractors. South Carolina trains its inspectors on the application of high friction surface treatments.
Arkansas presented an overview of its systemic and project evaluations. Systemic evaluations were for projects that included cable median barriers, shoulder rumble strips, and centerline rumble strips. Project evaluations focused on treatments including ultra-thin bonded wearing course, roundabouts, passing lanes, and raised medians. The State found a significant reduction in crashes through the installation of raised medians. The State installed cable median barriers as part of a freeway reconstruction bond. A successful rumble strip pilot project helped to justify a second round of rumble strip installations. The State also installed urban roundabouts; however, these were not funded by HSIP during the pilot phase.
After the presentations, attendees continued their discussion on countermeasure evaluation approaches.
Minnesota indicated they use sinusoidal rumble strips only in certain situations, as this treatment costs twice as much as traditional rumble strips. The results from a study of sinusoidal rumble strips are documented in Report 2016-23, Sinusoidal Rumble Strip Design Optimization Study.
Frank Gross provided a quick demonstration of FHWA’s Roadway Safety Data and Analysis Toolbox to help identify tools related to evaluation. He suggested that attendees review the following publications:
- Reliability of Safety Management Methods: Systemic Safety Programs demonstrates the value of a balanced approach to safety management (i.e., combination of hot-spot and systemic treatments) and provides a method to balance funding between hot-spot and systemic projects and programs.
- Reliability of Safety Management Methods: Safety Effectiveness Evaluation demonstrates the value of using more reliable methods in project and countermeasure evaluations, highlighting the benefits of the Empirical Bayes before-after method compared to the simple before-after method.
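The difference between the two methods can be illustrated with a short sketch. The function below computes a simplified Empirical Bayes estimate of the crashes expected in the after period had no treatment been applied; the site counts, SPF (safety performance function) predictions, and overdispersion parameter are hypothetical, and the variance-based correction term used in a full EB evaluation is omitted for brevity.

```python
def eb_expected_after(obs_before, spf_before, spf_after, k):
    """Empirical Bayes estimate of expected after-period crashes had no
    treatment been applied (simplified sketch; assumes a known SPF
    prediction and overdispersion parameter k)."""
    # Weight between the SPF prediction and the observed count.
    w = 1.0 / (1.0 + k * spf_before)
    # EB estimate for the before period.
    eb_before = w * spf_before + (1.0 - w) * obs_before
    # Scale to the after period using the ratio of SPF predictions.
    return eb_before * (spf_after / spf_before)

# Hypothetical site: 8 crashes observed before treatment; the SPF
# predicts 4 crashes in both periods; overdispersion k = 0.5.
expected = eb_expected_after(8, 4.0, 4.0, 0.5)   # ~6.67 crashes

# 5 crashes observed after treatment. A simple before-after comparison
# assumes 8 expected crashes; EB pulls the estimate toward the SPF
# prediction, reducing regression-to-the-mean bias in the CMF.
cmf_simple = 5 / 8          # 0.625
cmf_eb = 5 / expected       # ~0.75, a smaller estimated effect
```

The point of the sketch is that the simple method attributes all of the drop from 8 to 5 crashes to the treatment, while EB recognizes that a site selected for its high crash count would likely have regressed toward its long-run mean anyway.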
Participants identified the following opportunities related to countermeasure evaluation:
- The need for better approaches to evaluate systemic countermeasures.
Topic Area 4: Program Evaluation
This topic discussion focused on how States evaluate their overall HSIP program. Minnesota led off the discussion by presenting how public attitudes are measured. Utah then described its benefit-cost evaluation approaches.
Minnesota’s SHSP includes an emphasis area to improve traffic safety culture. More accurately, safety culture is at the center of the State SHSP. Minnesota applied the Integrated Behavior Model, which has been used in many industries, to predict intentions to engage in certain behaviors (e.g., driving after drinking or wearing a seat belt). The State conducted a baseline survey that can be repeated to measure changing roadway safety attitudes. If other States are interested in conducting their own measure of traffic safety culture, Minnesota suggests partnering with researchers who are familiar with traffic safety culture and its underlying behavior models, and have the capacity to develop and implement a high-quality survey.
Utah uses three methods to evaluate benefit-cost: three-year crash history, usRAP, and a Bayesian/Predictive method. The State compiles an actual three-year before crash history and a three-year after crash history to calculate the benefit of a project. All individual safety project costs and benefits are added together to calculate the overall program benefit-cost ratio.
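That aggregation can be sketched in a few lines (the project figures below are entirely hypothetical). Note that the program-level ratio is the sum of all project benefits divided by the sum of all project costs, which is not the same as the average of the individual project ratios:

```python
# Hypothetical projects: benefit = observed crash reduction valued at an
# assumed average crash cost; cost = total project cost.
projects = [
    {"benefit": 500_000, "cost": 200_000},   # individual BCR = 2.5
    {"benefit": 150_000, "cost": 300_000},   # individual BCR = 0.5
]

program_bcr = (sum(p["benefit"] for p in projects)
               / sum(p["cost"] for p in projects))
# 650,000 / 500,000 = 1.3, whereas the mean of the individual BCRs is 1.5.
```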
During project selection, Utah does not count the effect of multiple treatments at a site; only the treatment with the highest benefit is accounted for, so as not to overestimate the impact of treatments. Utah is interested in the calculation of the benefits of new technology and the potential to apply HSIP funds to these future projects. Utah wants to advance their efforts to perform program level evaluations using the Bayesian/Predictive method instead of the simple before-after method. They would also like national guidance on methods for evaluating systemic safety projects. Utah is considering the evaluation of individual safety elements on non-HSIP program/STIP projects.
The attendees then engaged in a discussion about local involvement in safety program data and project selection:
- Utah – local agencies use the same procedures as the State to identify projects.
- Michigan – local agencies have their own local safety program that has a call for HSIP projects on a two-year basis whereas the call for HSIP projects on the State system is on a five-year cycle. Michigan Technological University assists local agencies through the LTAP program, which includes distributing Michigan crash data through the Roadsoft software program. Michigan State Police owns the crash data. Both local agencies and MDOT use Roadsoft for their crash analysis tool.
- New Jersey – local agencies have information on projects three years before and after, MPOs pull the crash data.
- Idaho – the webCARS program is used by larger local agencies in the State, and the local technical highway advisory council creates associated maps. The State also has a training program to train law enforcement on data quality.
- Colorado – local crash data is not available from a single source; the DMV holds the State data.
- Nevada – all crash data is held at the Department of Public Safety.
- Kentucky – State police owns crash data, and there are memorandums of understanding in place to allow consultants to use the data to support projects.
- Virginia – uses Tableau crash tool and provides access to select users.
- California – crash data is held by the highway patrol; fatality data from trauma centers is missing or delayed, and bicycle and pedestrian data are not fully captured, including unreported minor injury/no-injury crashes on all roadway types.
Participants identified the following opportunities related to program evaluation:
- People will follow positive examples, so there is an opportunity to create dialogue and frame positive messages for marketing safety.
- There is an opportunity to change personal choice through workplace policy.
- There are opportunities to transition from the use of simple before-after methods to more reliable evaluations using Bayesian/Predictive methods.
- There is a need for better guidance on methods for evaluating systemic safety projects because this is currently not included in the Highway Safety Manual.
Topic Area 5: Using Evaluation Results
This topic discussion focused on how States are using HSIP evaluation results to inform policy and safety investment decisions. Michigan and Minnesota presented their approaches.
Michigan presented four parts of its HSIP evaluation process: calculating benefit-cost ratios, assigning crash costs and weights, supporting future decisions, and conducting a multi-objective decision analysis. The State HSIP project application process includes the use of a Time of Return (TOR) form; there are separate forms for the State and local agencies. Funding is allocated by region at a target amount. Multiple countermeasures and crash types may be input into the TOR form. Each crash type and corresponding crash reduction factor are placed on separate lines. Unit crash costs are based on figures published by the National Safety Council. It is important to note that the magnitude of the crash cost matters less than applying the same costs uniformly across applications. The form also considers interest rate and AADT. The overall calculation of the time of return in years includes fatalities and serious injuries that occurred over a five-year period. Overall project costs include right-of-way, construction, and preliminary engineering, but do not include maintenance. Michigan has a set of approved countermeasures including systemic treatments of rumble strips, cable median barrier, and retroreflective post-mounted sheeting.
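A stripped-down version of the time-of-return arithmetic can be sketched as follows. The crash frequencies, reduction factors, and unit costs below are hypothetical, and Michigan's actual TOR form also factors in interest rate and AADT, which this sketch omits.

```python
def time_of_return(project_cost, crash_lines):
    """Years for estimated crash-reduction benefits to repay project cost.

    crash_lines: list of (annual crashes of a type, crash reduction
    factor, unit crash cost) -- one line per crash type, mirroring the
    separate lines on the TOR form. Simplified sketch: ignores interest
    rate and traffic growth.
    """
    annual_benefit = sum(n * crf * unit_cost
                         for n, crf, unit_cost in crash_lines)
    return project_cost / annual_benefit

# Hypothetical project: $300,000 cost; one crash type averaging 4
# crashes per year, a 25% reduction factor, and a $150,000 unit crash
# cost, giving an annual benefit of $150,000.
tor_years = time_of_return(300_000, [(4, 0.25, 150_000)])  # 2.0 years
```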
Michigan communicates its results across MDOT, its partner local agencies, and via public outreach through various mechanisms. The State SHSP has eleven emphasis areas, and these groups are informed of the results. A MDOT listserv publishes weekly fatality statistics. County engineers across the State have an annual workshop. The MDOT design division has helped with the funding of graphics since HSIP funds cannot be used for such activities. More information on Michigan’s Toward Zero Deaths effort can be found at http://www.michigan.gov/ZeroDeaths. The State is working on updated guidelines to focus on systemic projects and investigating combining HSIP and roadside funding.
Minnesota discussed its vision for using evaluation results to inform policy decisions such as safety strategies that should be incorporated into other (non-HSIP) projects. The vision is a circular process where the agency implements projects, evaluates project and countermeasure effectiveness, and influences investment decisions and agency policy based on the results of project and countermeasure evaluations.
To influence policy, there is a need to perform rigorous and reliable evaluations. There have been instances where policy makers hold up implementation because there is not a state-specific CMF. For example, the State was asked to develop a reliable state-specific CMF for restricted crossing U-turns (RCUTs); however, it was difficult to identify suitable reference or comparison sites to perform a rigorous evaluation.
In response to Michigan’s presentation, California expressed that its systemic projects on State highways are expensive and have a long project development period. The State also has a policy that roundabouts need to be considered when intersections are analyzed. California also indicated their project application form allows up to three countermeasures per location and their effects are multiplied.
New Jersey has attempted to develop systemic projects that do not trigger environmental impacts, but as projects are completed, the State is running out of additional project options and opportunities.
Nevada’s shoulder widening efforts included looking five years ahead and working with design teams. There is also coordination with 3R projects and the use of PE and federal funds. Nevada would also like to have a better understanding of true operating costs.
New Jersey reminded participants about accounting for electricity operating costs for traffic signals.
South Carolina and Connecticut set a benefit-cost ratio of one for projects, and then determine a breakeven cost for a project to achieve the ratio. This helps them to quickly decide if project costs are likely to overrun the breakeven cost.
When discussing the opportunity to estimate lives saved and injuries prevented as a measure of program effectiveness, Utah expressed concern about the methods to compute these estimates. UDOT was quick to use the Highway Safety Manual as it represents national state-of-the-practice. When they receive questions about how or why they are using certain methods (e.g., average crash costs), they reference the Highway Safety Manual. The first edition of the Highway Safety Manual does not discuss lives saved or injuries prevented as a measure of program effectiveness.
New Jersey indicated there may be interest in computing lives saved by a project or program.
Frank Gross highlighted additional States and their efforts to communicate results. He pointed out that Florida DOT meets periodically with executives and districts, New York State DOT develops progress reports at its central office for distribution to regions, and Colorado DOT provides similar progress reports to its regions.
Participants identified the following opportunities related to using evaluation results:
- There is an opportunity to develop a software application to support countermeasure and project evaluation.
- There is a need for better information on project costs, particularly at the planning level for initial benefit-cost analysis, when project designs are conceptual and final costs are still unknown.
- There is a need for better guidance on how to estimate lives saved and/or injuries prevented so the States can reference a document when questioned on the methods used.
Topic Area 6: Preparing for HSIP Evaluation
This topic discussion focused on the considerations States are making to prepare for HSIP evaluation efforts. Frank Gross presented information about staff and management support in North Carolina, right-sizing evaluations in Montana, and the application of results in Wisconsin.
Kentucky provided information about the relationship between the Kentucky Transportation Cabinet and the University of Kentucky and how the latter supports the State HSIP program. An executive order by the Governor of Kentucky moved transportation research functions from KYTC to the University. Design immunity is not an issue with the University as KYTC continues to serve as the final signing authority for projects. A challenge for the researchers is not having AADT information for local roads, making it difficult to prioritize these locations.
Utah shared its relationship with its colleges and its in-house consultants. UDOT has partnerships with Brigham Young University (for Bayesian analytics) and the University of Utah (for the crash database). This provides an opportunity for students to get exposed to highway safety. Intergovernmental agreements are set up with the colleges. BYU is a private institution; therefore, it can accept only a limited amount of Federal funding. UDOT has consultants serving as support staff with on-site employees. This initiative supports local business and does not grow government. UDOT currently has 1,700 employees, which is 500 fewer than 20 years ago. Fifteen percent of HSIP funding is dedicated to non-engineering tasks, such as data improvements and analysis. UDOT, not the consultants, has the final signing authority for all projects.
Minnesota described efforts to develop a safety evaluation position. MnDOT wanted to employ an analytics expert to replace a retiring position. The position is evaluated on the number of reports produced, which is reported back to management. This position opens a door to candidates who do not have engineering skills; however, the ability to advance in the organization is limited, as there are few non-engineering jobs or positions. This presents a motivation challenge. Minnesota also has a program that places junior engineers in different roles for 3-6 months at a time over two years. Nevada indicated it just hired its first non-engineer on staff.
Participants did not specifically identify opportunities related to preparing for evaluation.
Breakout Discussion and Wrap-Up
Frank Gross provided a summary of the topics and allowed attendees to comment. The following is a summary of discussion points and comments by topic area:
- Project Tracking
- Think of partners to leverage resources.
- Consider central spreadsheet and other tools; Kentucky and New Jersey are trying to use Excel sheets to develop their tracking report.
- Manual entry can be error-prone; standardized templates should be developed; electronic templates are ideal.
- Project Evaluation
- Simple level for single projects – don’t use just one data point to develop a CMF!
- Focus on target crashes; understanding the reasons for implementing the project is critical.
- Don’t use partial years of data as there are seasonal fluctuations and anomalies.
- Countermeasure Evaluation
- The more projects the better for a countermeasure evaluation.
- In California, the FHWA HSIP submittal was mainly focused on the program level; the evaluation of countermeasures is not well addressed in the spreadsheet; an application is needed to help with the analysis. AASHTOWare Safety Analyst™ could be used.
- Performance measures – Benefit-Cost, ROI, Public Attitudes.
- Other Comments
- New Jersey – how are states going to react to not making their stated targets?
- Minnesota – funds are already set 4-5 years out. Target setting is too short.
- Idaho – remember to consider the contributions of behavior.
- Kentucky – target setting had strong interest from leadership at KYTC.
- Frank Gross reminded everyone that communicating the results is important.
Each State then presented the key takeaways from the peer exchange that they would like to explore in greater detail in the future. The following is a summary of key takeaways by State:
- Improve the evaluation of benefits.
- Celebrate achievements.
- Address portion of funds taken from HSIP to fund other projects.
- Better communicate the benefits of safety investments.
- Work with the University of Alabama (data gathering).
- Evaluate all safety-related projects, not just HSIP.
- Conduct more evaluations and have better setup with countermeasure evaluation.
- Seek assistance from consultants and universities.
- Maintain consistency across the agency.
- Improve project tracking tools.
- Maintain good documentation; inform new staff on procedures.
- Develop an evaluation plan; perhaps as a separate document.
- Consider future application of Wilcoxon test.
- California – Local-level
- Increase the number of local agencies applying for funding; half of the local agencies in the State have never applied for HSIP funding.
- Improve delivery of projects.
- Conduct post-construction evaluations.
- Start data collection efforts; a good plan is needed to properly set up the process.
- Introduce performance measures for different programs.
- California – State-level
- Involve federal partners to engage State to invest in evaluation programs.
- Acquire evaluation software or tools; or engage a consultant.
- Build a reputation that the State is doing enough for safety.
- Communicate results with safety stakeholders.
- Develop a new HSIP manual.
- Improve evaluations of countermeasures.
- Reach out to other states.
- Implement Safety Circuit Rider.
- Conduct Program Level review.
- Right-size its evaluations.
- Focus on university and consultant support.
- Evaluate entire program.
- Share evaluation results with internal and external stakeholders.
- Collect as much information as possible upfront to support the backend evaluation.
- Update project tracking database for comprehensiveness.
- Develop implementation plan.
- Document all program and evaluation activities.
- Enhance tracking tool for evaluation.
- Assemble a plan for an evaluation program.
- Select countermeasures.
- Focus on re-gearing for evaluation.
- Recognize that there is no longer a budget surplus; funding is now simply an allotment.
- Move past low-hanging, easy projects.
- Use evaluation to optimize project applications as ROI continues to shrink.
- Use context from other States to supplement Kentucky efforts.
- Augment information in project tracking and existing tools.
- Work with regions during scoping process to gather requisite information; lessen the need to scour for information in the future.
- Adopt one-page project cheat sheets like those in North Carolina.
- Improve communication.
- Improve HSIP manual.
- Recognize the importance of tracking, and the ability to backtrack.
- Update documentation appropriately.
- Develop a plan to have an evaluation plan; target the audience and investments.
- Develop new crash report form.
- Coordinate with new evaluation plan.
- Continue to use universities and consultants.
- Update the HSIP manual and use it to educate others and change safety culture.
- Improve data sharing.
- Promote safety culture within Nevada DOT.
- Improve the quality and use of data.
- Merge data from different groups.
- Develop CMF list and standardize it for the state.
- Use consultants and university for countermeasure evaluations.
- Conduct project and program level evaluation.
- Use evaluation results to advertise the HSIP program.
- Improve the vision of evaluation at New Jersey DOT.
- Admit having a problem and work with partners to develop solutions; find help from partners.
- Improve data.
- Celebrate successes.
- Publicize annual evaluation.
- Promote the benefits of the typical countermeasures that are used; provide a Utah context for these projects (e.g., the last sites to receive treatment and their results).
- Push for systemic designs as part of standard designs.
- Recognize current achievements of strong upper management support and an established HSIP manual and crash database.
- Improve before-after, traffic volume, and Empirical Bayes evaluations.
- Flag projects for evaluation eligibility.
- Improve communication.
- Improve systemic project tracking and tools.
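Several of the takeaways above reference before-after and Empirical Bayes (EB) evaluations. As an illustration only, the sketch below shows the common single-site EB before-after formulation (per Hauer and the Highway Safety Manual); the crash counts, safety performance function (SPF) predictions, and overdispersion parameter are hypothetical and do not reflect any State's actual procedure.

```python
# Illustrative Empirical Bayes (EB) before-after evaluation for a single site,
# following the common Hauer/HSM formulation. All numbers are hypothetical.

def eb_before_after(obs_before, obs_after, spf_before, spf_after, overdispersion):
    """Return the estimated CMF (index of effectiveness) for one treated site.

    obs_before / obs_after : observed crash counts in each period
    spf_before / spf_after : SPF-predicted crashes for each period
    overdispersion         : SPF overdispersion parameter (k)
    """
    # EB weight: how much to trust the SPF prediction vs. the site's own count
    w = 1.0 / (1.0 + overdispersion * spf_before)
    # EB-adjusted expected crashes in the before period
    eb_before = w * spf_before + (1.0 - w) * obs_before
    # Ratio adjusting for changes in traffic volume and period duration
    r = spf_after / spf_before
    # Expected after-period crashes had no treatment been applied
    expected_after = eb_before * r
    var_expected = expected_after * r * (1.0 - w)
    # Odds-ratio estimate with bias correction; values below 1.0 indicate
    # an estimated crash reduction
    cmf = (obs_after / expected_after) / (1.0 + var_expected / expected_after**2)
    return cmf

cmf = eb_before_after(obs_before=10, obs_after=5,
                      spf_before=6.0, spf_after=6.5, overdispersion=0.4)
print(f"Estimated CMF: {cmf:.3f}")
```

In practice, agencies aggregate these quantities across many treated sites before computing the effectiveness index and its standard error; this sketch only conveys the core weighting and bias-correction logic.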
FHWA closed the peer exchange with a discussion of next steps to advance HSIP evaluation practices. Following the peer exchange, FHWA will conduct an HSIP evaluation webinar to share noteworthy practices from the peer exchange and others.
Attachment A: Peer Exchange Agenda
Tuesday, November 28, 2017
- Welcome, Carlos Braceras, UDOT Executive Director, and Ivan Marrero, FHWA Utah Division, Division Administrator
- Overview and Objectives, Karen Scurry (FHWA)
- Introductions, All
HSIP Evaluation Guide
- Benefits of HSIP Evaluation and Overview of Guide, Frank Gross (VHB)
Project Tracking
- Virginia’s SmartPortal and Tableau Tools, Deepak Koirala (Virginia DOT)
- Idaho’s Project Tracking and Prioritization Tool, Kelly Campbell (Idaho Transportation Department)
- Roundtable Discussion
Project Evaluation
- Colorado’s Project Evaluation Template and Online Reports, Frank Gross (VHB) on behalf of Colorado DOT
- Additional State Input:
- South Carolina’s Project Evaluation Spreadsheet, Joey Riddle (South Carolina DOT)
- Roundtable Discussion
Countermeasure Evaluation
- Minnesota’s Systemic Countermeasure Evaluations, Brad Estochen (Minnesota DOT)
- Kentucky’s Experience Using the Shift of Proportions, David Durman (Kentucky Transportation Cabinet)
- Additional State Input:
- Arkansas’ Countermeasure Evaluation Experience, Adnan Qazi (Arkansas DOT)
- Roundtable Discussion
Program Evaluation
- Minnesota’s Evaluation of Public Attitudes, Katie Fleming (Minnesota DOT)
- Additional State Input:
- Experiences Estimating the BCR for HSIP Projects, Multiple
- Experiences Evaluating Specific Programs (Systemic, Local), Multiple
- Roundtable Discussion
Wednesday, November 29, 2017
Day 1 Recap
Using Evaluation Results
- Michigan’s Experience Communicating Evaluation Results, Heidi Spangler (Michigan DOT)
- Additional State Input:
- Minnesota’s Vision for Using Evaluation Results to Inform Policy, Brad Estochen (Minnesota DOT)
- Experiences Using Evaluation Results to Inform Policy, Multiple
- Experiences Using Lives Saved to Report Progress and Justify Funding, Multiple
- Roundtable Discussion
Preparing for HSIP Evaluation
- Considerations in Preparing for HSIP Evaluation, Frank Gross (VHB)
- Kentucky’s University Partnership, David Durman (Kentucky Transportation Cabinet)
- Additional State Input:
- Minnesota’s Experience Creating Position for HSIP Evaluation, Brad Estochen (Minnesota DOT)
- S.W.O.T. Analysis of HSIP Evaluation, All
- Roundtable Discussion on Challenges, Opportunities, and Key Takeaways, All
- Review of Objectives and Discussion, Frank Gross (VHB)
- Resources, Frank Gross (VHB)
- Final Questions/Comments, All
Closing Remarks, Karen Scurry (FHWA) 11:55 am
Adjourn/Safe Travels! 12:00 pm
Attachment B: Participant List
The following is a list of attendees at the HSIP Evaluation Peer Exchange.
|State|Name|Agency|
|---|---|---|
|Alabama|Kim Biddick|Alabama Department of Transportation|
|Arkansas|Adnan Qazi|Arkansas State Highway & Transportation Department|
|California|Richard Ke|California Department of Transportation|
|California|Howard Giang|California Department of Transportation|
|Connecticut|Joe Ouellette|Connecticut Department of Transportation|
|Idaho|Kelly Campbell|Idaho Transportation Department|
|Kentucky|David Durman|Kentucky Transportation Cabinet|
|Kentucky|Tim Tharpe|Kentucky Transportation Cabinet|
|Michigan|Heidi Spangler|Michigan Department of Transportation|
|Michigan|Mark Bott|Michigan Department of Transportation|
|Minnesota|Brad Estochen|Minnesota Department of Transportation|
|Minnesota|Katie Fleming|Minnesota Department of Transportation|
|Minnesota|Mao Yang|Minnesota Department of Transportation|
|Nevada|Lori Campbell|Nevada Department of Transportation|
|New Jersey|Angela Quevedo|New Jersey Department of Transportation|
|South Carolina|Brett Harrelson|South Carolina Department of Transportation|
|South Carolina|Joey Riddle|South Carolina Department of Transportation|
|Virginia|Deepak Koirala|Virginia Department of Transportation|
The following is a list of attendees from the host agency, Utah Department of Transportation.
- Scott Jones
- Robert Miles
- Anne Ogden
- Rudy Alder
- Brian Phillips
- Tyler Laing
- Glenn Blackwelder
- Clancy Black
- Jesse Sweeten
- Dallas Wall
FHWA staff in attendance at the HSIP Evaluation Peer Exchange included:
- Karen Scurry, Office of Safety
- Danielle Betkey, Office of Safety
- Joe Heflin, Arkansas Division Office
- Dahir Egal, Colorado Division Office
- Lance Johnson, Idaho Division Office
- Jake Waclaw, Nevada Division Office
- Caroline Trueman, New Jersey Division Office
- Daniel Hinton, South Carolina Division Office
- Roland Stanger, Utah Division Office