Hurricanes, Quakes and Forest Fires
Hurricanes Irma and Maria and the recent quake in Mexico City have shown once again that we are at the mercy of Nature’s wrath. Power companies are scrambling in Houston, Puerto Rico and Florida to re-establish service with the help of reciprocal agreements they have put in place with their Northern brethren. Many areas serving millions of customers had service restored within hours, some within days, and some will likely have to wait weeks or months.
Similarly, the annual fire season in California, Wyoming and Montana reminds us how risky it can be to build within a few feet of thick brush and forest.
A large portion of the damage is due to “tree fall” or “tree arcing” (i.e. sparks flying when a tree hits a power line), a common phenomenon for utilities even outside of major disaster events. Trees typically fall due to pests, natural rot, wind and soil degradation caused by heavy rainfall and erosion, and they are the leading cause of outages for utilities. Arcing causes 3% of California wildfires, and there are between 3,000 and 5,000 such fires annually, as reported by cdfdata.fire.ca.gov. On average, a utility will experience 4 such outages per month, affecting between 2 and 57 million people. Central Electric Cooperative, an Oregon utility serving three states, projects an annual cost of $128 million for maintenance, dealing with 4,300 events every year.
To give you a sense of scale: the US grid alone consists of 200,000 miles of transmission lines and another 5.5 million miles of local distribution lines. This vast network of overhead lines drives $5.4 billion in annual maintenance spend, with the largest portion going to vegetation management.
Vegetation management programs try to boost reliability and customer service and reduce outage/disruption risk while adhering to increasingly stringent environmental and safety guidelines (NERC FAC-003-3). Every utility runs these programs, from the massive regional operators serving millions of customers, like PG&E, Dominion and Duke, to local co-ops serving a few tens or hundreds of thousands. Indianapolis Power & Light Co. earmarked close to $7 million in 2016, and “only” $1 million of it was related to storm damage; the rest went to “routine” line clearing services. As a reliability measure, tree-related outages accounted for 37.85% of total SAIFI (System Average Interruption Frequency Index). As you can imagine, the largest utilities’ programs run into the tens of millions of dollars given a similar event frequency.
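To make the SAIFI figure concrete, here is a minimal sketch of how tree-related outages can be expressed as a share of SAIFI. The outage records and customer count below are illustrative placeholders, not data from any real utility.

```python
# Sketch: SAIFI and the tree-related share of it.
# SAIFI = total customer interruptions / total customers served.

def saifi(outages, customers_served):
    """System Average Interruption Frequency Index for a set of outages."""
    return sum(o["customers_interrupted"] for o in outages) / customers_served

# Illustrative outage log (hypothetical numbers)
outages = [
    {"cause": "tree",      "customers_interrupted": 12_000},
    {"cause": "equipment", "customers_interrupted": 8_000},
    {"cause": "tree",      "customers_interrupted": 5_000},
    {"cause": "weather",   "customers_interrupted": 7_000},
]
customers = 250_000  # hypothetical service territory

total_saifi = saifi(outages, customers)
tree_saifi = saifi([o for o in outages if o["cause"] == "tree"], customers)

print(f"SAIFI: {total_saifi:.3f}")
print(f"Tree-related share of SAIFI: {tree_saifi / total_saifi:.1%}")
```

The same ratio computed over a real outage management system's records is what yields figures like the 37.85% cited above.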
Previously, utilities planned their budgets based on historical outage and network-expansion data, scoped the program accordingly and set treatment plans in line with those budgets. This was followed by a rigorous audit, and by the time this was done for the entire network, three to four years had passed.
Risk- and cost/benefit-based prioritization of key consumption, distribution or vegetation-growth-driven areas was not possible on a total-network basis.
The problem with this approach is that it predicts the future purely from historical facts; it is neither real-time nor finely tuned to current conditions on the ground. More importantly, interval-driven visual inspections of every line mile by contractors reduced productivity. Relying on customer reports to augment surveys is necessary but not comprehensive enough to boost reliability significantly.
By leveraging and integrating newly available and relevant data, such as line/transformer locations, customers (meters) served, customers with wind/solar backup, load profiles, high-resolution photography, satellite imagery or LiDAR, along with more cost-effective collection options like drones, experts project $1 billion in annual savings potential for North American utilities. As a rule, utilities should be able to save upwards of $700 per brush mile by deploying increasingly sophisticated tools and data to keep vertical and horizontal corridors cleared. For example, Coastal Electric Cooperative, a utility with 1,500 line miles, could maintain its $500,000 annual vegetation management budget for a decade through improved herbicide deployment.
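A back-of-the-envelope sketch of the per-brush-mile figure; the $700 comes from the projection above, while the brush-mile fraction is a purely hypothetical assumption, since only part of a utility's line miles actually run through brush.

```python
# Back-of-the-envelope savings sketch.
PER_BRUSH_MILE_SAVINGS = 700  # USD/year, per the projection cited above

def projected_savings(line_miles, brush_fraction=0.4):
    """Annual savings if `brush_fraction` of line miles run through brush.
    `brush_fraction` is an illustrative assumption, not a utility statistic."""
    return line_miles * brush_fraction * PER_BRUSH_MILE_SAVINGS

# A 1,500-line-mile co-op under this assumption:
print(f"${projected_savings(1_500):,.0f} per year")  # $420,000 per year
```

The point is not the exact number but that even a modest brush-mile fraction puts the savings in the same order of magnitude as a small co-op's entire vegetation budget.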
Targeting high-risk (safety and/or commercial) or less resilient distribution areas for mitigation first assumes that all core data elements describing the grid are accurate. However, this is not true for most utilities. Given more-or-less laissez-faire governance, meters often end up connected to different transformers in application A vs. application B. Behind-the-meter infrastructure, like residential solar panels, may not be captured if the owner opted against feeding excess power back into the grid. Fixing core data before the significant big-data integration effort involving imagery is just as important as fixing your legacy data before seeding your new ERP system. More about that in my next post.
In summary, savings come from revenue assurance through uninterrupted service and reduced spend on maintenance crews, equipment and herbicides/pesticides. Other benefits include reduced call-center expense for outbound and inbound outage notifications, lower legal costs to resolve personal and property damage from “arced” fires, and easier resolution of right-of-way challenges with adjacent landowners.
To make this a tangible reality, vast amounts of relational data need to be collected in various latency modes from operational and historian applications, then cleaned, aggregated and conflated with visual formats (2D or 3D) to present a heat-map, bird’s-eye view of thousands of miles of line across multiple elevations. Such a view can also show difficulty of access due to terrain steepness and overgrown access roads, helping schedule the right amount of time and equipment for the assigned crew. Every hour saved amounts to about $250 for a utility, and these hours accumulate fast.
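A minimal sketch of the span-level prioritization just described. The field names, weights and sample spans are illustrative assumptions, not any vendor's schema; only the $250-per-hour figure comes from the text above.

```python
# Sketch: rank line spans by a blended risk score, then cost the crew visit.

def risk_score(span):
    """Weighted blend of vegetation encroachment, load served, and
    historical outage count; the weights are arbitrary placeholders."""
    return (0.5 * span["vegetation_density"]          # 0..1, from LiDAR/imagery
            + 0.3 * span["customers_served"] / 10_000  # normalized load proxy
            + 0.2 * span["outages_last_5y"] / 10)      # normalized history

def crew_hours(span, base_hours=4.0):
    """Terrain steepness and overgrown access roads inflate crew time."""
    return base_hours * (1 + span["terrain_steepness"]) * (2 if span["access_overgrown"] else 1)

# Two hypothetical spans
spans = [
    {"id": "SPAN-101", "vegetation_density": 0.9, "customers_served": 8_000,
     "outages_last_5y": 6, "terrain_steepness": 0.5, "access_overgrown": True},
    {"id": "SPAN-102", "vegetation_density": 0.3, "customers_served": 2_500,
     "outages_last_5y": 1, "terrain_steepness": 0.1, "access_overgrown": False},
]

for span in sorted(spans, key=risk_score, reverse=True):
    cost = crew_hours(span) * 250  # $250 per crew hour, per the text
    print(f"{span['id']}: risk={risk_score(span):.2f}, crew cost=${cost:,.0f}")
```

In practice the inputs would come from the conflated imagery, LiDAR and meter data described above, but the ranking-then-costing shape of the problem stays the same.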
If you work for a utility or are an IT contractor exposed to these issues, I would love to hear what data and survey methods you are using today to deal with vegetation encroachment.