Alliance for Innovation and Infrastructure | DIRT Blog Update

NEWS

DIRT Blog Update

08 Jun 2017, Posted in Blog Posts

Last summer, we published “Improving Upon Our Dig Laws: How State Legislatures Can Help Us Dig Safer,” an extensive review of the incident data in the Common Ground Alliance’s (CGA) 2014 DIRT report (released in Fall 2015). In it, we offered recommendations states could incorporate into their damage prevention programs to improve excavation safety, along with recommendations CGA could implement to make the data more user-friendly.

Specifically, we recommended that states:

1) Implement mandatory incident reporting requirements in state laws and regulations;
2) Increase enforcement efforts to encourage stronger adherence to laws, regulations, and best practices; and
3) Improve utility locating practices to remedy the disparity between the accuracy of jobs performed by locate firms and those performed by in-house utility locators.

To make the data compiled in the DIRT reports more useful, we also recommended:

1) DIRT should organize data by state instead of region; and
2) DIRT should look at increasing the number of root cause categories to capture narrower incident causes, and should require more specific cause descriptions in submissions.

While the 2015 DIRT report (released in Fall 2016) did not specifically incorporate either of our recommendations, CGA did take two concrete steps to make the data more accurate and accessible, which can “lead to more targeted corrective actions.” First, CGA worked with a data scientist to better identify reports stemming from the same event. This consolidates the data set and should lead to a more accurate assessment of what caused each incident. Second, CGA introduced a new interactive dashboard allowing “users to filter the data more granularly by factors contributing to damages.”

The first change will make it far more difficult to compare this year’s data with previous years’ and to assess incident trends, but it may set a new, more accurate baseline that makes it easier to identify and track trends going forward. The interactive dashboard, on the other hand, is immediately helpful, and it serves the same ends as the recommendations we made last year: making it possible to identify incident causes more specifically.

Unfortunately, incident counts seem to be heading in the wrong direction, with 363,176 events submitted in 2015 compared to 273,599 in 2014. Aside from the increase in the number of events recorded, the 2015 report claims improvements in a number of areas.

First, the report estimates that there were 349,000 incidents in 2014 and only 317,000 in 2015. Essentially, DIRT assumes that even though only 273,599 events were reported in 2014, 349,000 incidents actually occurred that year, and that even though 363,176 events were reported in 2015, only 317,000 actually occurred. Operating under this assumption, CGA estimates that damages per 1,000 transmissions fell from 1.6 to 1.54. If true, this is a promising data point, but the assumption is difficult to wrap our heads around. As we explained in last year’s report:

The predictive value of DIRT’s damage estimation methodology is low. Because the data is not all inclusive, DIRT uses linear regression modeling, including factors such as building permits, construction spending, land area, population, and population density, that draws information from states that “appear to have a substantial number of damages reported to DIRT” to formulate an estimate of event occurrences in a given year.
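The rate CGA reports reduces to simple arithmetic. Here is a minimal back-of-envelope sketch: the estimated incident totals (349,000 and 317,000) and the rates (1.6 and 1.54) come from the report, but the transmission volumes are back-solved from those figures by us, not published numbers.

```python
# Back-of-envelope check of the DIRT damage-rate arithmetic. Incident
# estimates and rates are from the report; the transmission volumes
# below are implied by back-solving, not published figures.

def damages_per_1000(damages: float, transmissions: float) -> float:
    """The DIRT metric: damages per 1,000 one-call transmissions."""
    return damages / transmissions * 1000

# Implied transmission volumes: estimated incidents / rate * 1,000
implied_2014 = 349_000 / 1.6 * 1000    # ~218.1 million transmissions
implied_2015 = 317_000 / 1.54 * 1000   # ~205.8 million transmissions

print(round(damages_per_1000(349_000, implied_2014), 2))  # 1.6
print(round(damages_per_1000(317_000, implied_2015), 2))  # 1.54

# Using the raw 2015 submissions instead of the estimate tells a very
# different story, which is why the estimation assumption matters:
print(round(damages_per_1000(363_176, implied_2015), 2))  # 1.76
```

The last line is the crux: whether the rate improved or worsened depends entirely on accepting the modeled incident totals over the raw submission counts.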

The low predictive value of these estimates is not CGA’s fault, however. Without mandatory reporting laws in place, it is impossible to get reliable data on actual incident occurrences. We continue to strongly recommend that all states update their laws to require mandatory reporting. We have also petitioned the U.S. Department of Transportation’s Pipeline and Hazardous Materials Safety Administration (PHMSA) to update its regulations to make mandatory reporting a prerequisite for certifying a state damage prevention program as compliant.
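To make the estimation approach quoted above concrete, here is a rough sketch of that kind of model: fit a linear regression on states that report substantially, then predict event counts elsewhere. Every number below is invented purely for illustration; only the idea of using factors such as building permits, construction spending, and population comes from DIRT’s description of its methodology.

```python
# Illustrative sketch of regression-based event estimation, in the
# spirit of what DIRT describes. All data here is invented; the real
# model's factors and coefficients are not public in this form.
import numpy as np

# Hypothetical predictors for four "well-reporting" states:
# columns = building permits (thousands), construction spending ($B),
# population (millions)
X_train = np.array([
    [90.0, 45.0, 10.0],
    [60.0, 30.0,  7.0],
    [30.0, 18.0,  4.0],
    [15.0,  9.0,  2.0],
])
y_train = np.array([21000.0, 14000.0, 7200.0, 3500.0])  # damages reported

# Add an intercept column and solve ordinary least squares.
A = np.column_stack([np.ones(len(X_train)), X_train])
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

# Predict for a state with sparse reporting (invented predictors).
x_new = np.array([1.0, 45.0, 24.0, 5.5])
estimate = float(x_new @ coef)
print(f"estimated damages: {estimate:,.0f}")  # ~10,600
```

The weakness we flagged last year falls out of this structure: the prediction is only as good as the assumption that sparse-reporting states behave like the well-reporting states the model was fit on.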

In 2015, the share of damages by root cause group was almost identical to 2014, with one exception: damages attributed to “Excavation Practices Not Sufficient” dipped five percentage points, from 50 percent to 45 percent, while damages attributed to “Notification Not Made” jumped five percentage points, from 25 percent to 30 percent. This is at once encouraging and troubling, but as we pointed out last year, incidents attributed to “Excavation Practices Not Sufficient” are difficult to categorize:

Excavators are responsible for the largest share of damages, but due to vague reporting the causes are nearly impossible to discern. There are a number of reasons an event could be identified as resulting from “Excavation Practices Not Sufficient.” In this year’s report, 84 percent of events attributed to this root cause are not categorized, making it difficult or impossible to determine how to best prevent these incidents. It also calls into question whether these events were mischaracterized to begin with.

In short, if CGA’s assumptions are accurate, damage prevention practices are improving. Unfortunately, the raw data does not support that conclusion. On the bright side, this year’s report took important steps toward more accurately detailing root causes for reported incidents, and it created a new, more accurate baseline against which to compare future years. Until the raw data is complete and no longer needs to be run through a regression model, however, it will be impossible to accurately track incident trends. We are also disappointed that CGA did not organize the report’s incident data by state, which would allow incident rates to be compared against the quality (or lack thereof) of state damage prevention programs. We will continue to work toward a mandatory reporting requirement in all states. Hopefully, when we report back around this time next year, we will be able to say that we succeeded, and that a comprehensive data set lets us definitively determine whether incident trends are heading in the right direction and why. Only then can we target next steps in driving incident rates down, state by state.