Dr. Michael Flaxman
Jun 24, 2021

Modeling & Monitoring Power Line Vegetation Management Risk at Scale


NOTE: OMNISCI IS NOW HEAVY.AI

Utilities have long spent considerable time, attention, and treasure on vegetation management, with results that have often been disappointing or even catastrophic. 

It's a global issue that has been felt most notably in Mediterranean climates. California utilities, for example, have been responsible for several deadly and costly fire events (1), despite spending more than $1 billion per year on utility vegetation management (2).

Unfortunately, while this issue has received widespread public and utility industry consideration, there are still very few approaches available that are scientifically accurate, up-to-date, and scalable.

This post shows how OmniSci enables utility organizations to apply big geospatial data to advanced vegetation management and efficient risk mitigation.


Figure 1: Tree mortality (red polygons) relative to power transmission lines (purple) and tree height (blue = low, yellow = high).


The technical challenge for legacy vegetation management software platforms is integrating and computing on the required data, which is both extremely detailed and vast in spatial scope. As a result, various approaches are built on data collected or aggregated at 30m-1km resolution, often using dated information.

The objects of interest in this process are living things - trees. You might, on average, have hundreds of healthy trees in a hectare. Just one sick or dead tree within striking distance of a powerline constitutes a risk.    

Similarly, a tree that was doing well last year might be doing poorly this year for many reasons.  And that tree might be in the most inaccessible location possible, for example, in the middle of a remote wilderness area on a steep slope.


Figure 2: Relationship between Canopy Water Content and Eventual Tree Mortality.  Stanford Carnegie Lab. (3)


Fortunately, we have new data sources from the field of "remote sensing" that reliably and accurately measure tree health from space, such as the European Space Agency's Sentinel-2 satellites. These satellites produce open public data at 10m resolution globally, with updates every 2-5 days and "multispectral" capabilities purpose-built for measuring vegetation cover and health.  

They record information outside the range of normal human vision to measure vegetation stress and canopy moisture levels directly. Scientific research has shown that these measures can detect health issues up to two weeks before they are visible in the field and can reliably predict and detect tree mortality (4).
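To make the "multispectral" point concrete, the sketch below computes two standard Sentinel-2 indices: NDVI for general vegetation vigor and NDMI for canopy moisture, the quantity Figure 2 relates to eventual mortality. The band file names are placeholders, and this is only a minimal illustration, not the exact pipeline behind the analyses described later.

```python
# A minimal sketch of computing vegetation indices from Sentinel-2 bands.
# File names are illustrative placeholders, not a real product layout.
import numpy as np
import rasterio

def read_band(path):
    """Read a single-band GeoTIFF into a float array."""
    with rasterio.open(path) as src:
        return src.read(1).astype("float32")

# Sentinel-2 bands: B04 = red (10m), B08 = near-infrared (10m), B11 = shortwave infrared (20m).
red = read_band("B04.tif")
nir = read_band("B08.tif")
swir = read_band("B11_resampled_10m.tif")  # assumes B11 was resampled to 10m upstream

eps = 1e-6  # avoid division by zero

# NDVI: general vegetation vigor.
ndvi = (nir - red) / (nir + red + eps)

# NDMI: sensitive to canopy water content, the measure Figure 2 links to mortality.
ndmi = (nir - swir) / (nir + swir + eps)

print("mean NDVI:", float(np.nanmean(ndvi)), "mean NDMI:", float(np.nanmean(ndmi)))
```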

Figure 3: Historic powerline strike data (2014-2016). Vegetation strikes (green) account for about half of the annotated records, but roughly 50% of records lack attribution, and no additional data have been released since 2016. Data source: PG&E


The ability to reliably measure tree mortality turns out to be critical. Unfortunately, most utilities do not publish regular, geospecific information about vegetation strikes on their powerlines. For example, PG&E has not published such data since 2016. Decisions like this have prevented scientists from applying their standard techniques to this critical policy issue. Ultimately, we need new vegetation management regulations requiring that this information be disclosed in a timely and accurate form.

In the meantime, what can we do to improve powerline vegetation control?

I found one answer by using modern data and compute techniques to scale a decades-old scientific approach developed by Guggenmoos (5) for smaller areas. Guggenmoos made the point that "baseline" tree mortality rates are both highly significant to powerline strikes and well studied ecologically. On the question of significance, he reviewed many prior studies of the causes of powerline vegetation disruptions and found that about 85% of them were caused not by "grow-in" but rather by "fall-in" events. This points to a significant failure in powerline vegetation management regulation and strategy, as existing regulations focus primarily on 15% of the overall risk.

In terms of risk modeling at scale, this simplifies data requirements considerably.  Rather than needing to measure "grow-in" processes at the scale of centimeters, which is quite challenging from aerial sources and especially from satellites, it is more important to measure "fall-in" within a variable-width distance of tens of meters and to focus on vegetation health measures.
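To make that "fall-in" geometry concrete, here is a minimal GeoPandas sketch that buffers each powerline span by the height of the tallest nearby tree rather than by a fixed clearance width. The layer and column names (tree_canopy_points, height_m) are hypothetical, and the 60 m search radius is an assumption, not a standard.

```python
# A minimal sketch of a variable-width "fall-in" exposure zone.
# Layer names, column names, and the search radius are assumptions for illustration.
import geopandas as gpd

lines = gpd.read_file("transmission_lines.geojson").to_crs(epsg=32610)  # metric CRS (UTM 10N)
trees = gpd.read_file("tree_canopy_points.geojson").to_crs(epsg=32610)

# Attach the tallest tree height found within 60 m of each span.
joined = gpd.sjoin_nearest(trees, lines, how="inner", max_distance=60.0,
                           distance_col="dist_to_line")
max_height = joined.groupby("index_right")["height_m"].max()

lines["fall_in_width_m"] = max_height.reindex(lines.index).fillna(0.0)

# A tree can only fall onto the conductor if it stands closer than its own height,
# so the exposure polygon is the line buffered by that height.
lines["fall_in_zone"] = lines.geometry.buffer(lines["fall_in_width_m"])
```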

There are at least three significant aspects to forest health in this context:

  1. All forest stands experience baseline mortality as trees compete with each other for resources over time (amounting to dozens to hundreds of individuals per hectare); a back-of-the-envelope sketch of what this implies for a powerline corridor follows this list.
  2. Due to both fire suppression and global climate change, forest health is declining in many regions. Studies in California going back to the 1950s, for example, have shown roughly a tripling in mortality rates, with no evidence of a leveling off (6).
  3. Related to the second point, droughts and insect infestation events are increasingly common and have spatially distinct mortality patterns.
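To see why even baseline mortality matters at powerline scale, here is a back-of-the-envelope calculation. Every number in it is an illustrative assumption, not a measured California value.

```python
# A back-of-the-envelope sketch of Guggenmoos-style baseline "fall-in" exposure.
# All numbers below are illustrative assumptions, not measured values.
corridor_width_m = 60.0        # assumed total fall-in corridor width (both sides)
line_length_km = 100.0         # length of line being assessed
trees_per_hectare = 200.0      # assumed stand density
annual_mortality_rate = 0.01   # assumed baseline mortality (1% per year)

corridor_area_ha = (corridor_width_m * line_length_km * 1000.0) / 10_000.0
exposed_trees = corridor_area_ha * trees_per_hectare
new_hazard_trees_per_year = exposed_trees * annual_mortality_rate

print(f"{corridor_area_ha:.0f} ha of corridor, "
      f"{exposed_trees:,.0f} exposed trees, "
      f"~{new_hazard_trees_per_year:,.0f} new hazard trees per year")
```

Under these assumed numbers, a single 100 km line accumulates on the order of a thousand new potential hazard trees per year from baseline mortality alone.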

From a practical perspective, despite poor and declining forest health conditions, we are fortunate in California to have excellent monitoring data. First, we have annual aerial surveys of forest health published by the USFS (see red polygons in Figure 1 above). Second, several studies directly correlate canopy moisture levels measurable from remote sensing with tree mortality rates (Figure 2). Finally, working with Tesselo LLC of Lisbon, Portugal, we were able to take analysis-ready Sentinel-2 data for the transmission lines of California and map forest health at scale. This gave us approximately four years of forest health time-series observations for each 10m power line segment.

Figure 4: Monitoring vegetation health from space and summarizing by powerline spans (data and analyses courtesy of Tesselo LLC)
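Once each 10m span has its own time series, simple screening becomes possible. The sketch below assumes a hypothetical table of (segment_id, date, ndmi) observations and flags spans whose latest canopy-moisture reading sits well below their own history; it is a stand-in for, not a description of, the models actually used in this work.

```python
# A minimal sketch of summarizing a canopy-moisture time series per powerline span.
# The table layout (segment_id, date, ndmi) and file name are assumptions for illustration.
import pandas as pd

obs = pd.read_parquet("segment_ndmi_observations.parquet")  # columns: segment_id, date, ndmi
obs["date"] = pd.to_datetime(obs["date"])

# Weekly median per 10m segment smooths out cloud and view-angle noise.
weekly = (obs.set_index("date")
             .groupby("segment_id")["ndmi"]
             .resample("W")
             .median()
             .reset_index())

# Flag segments whose latest moisture reading sits well below their own history,
# a simple anomaly rule standing in for a proper mortality model.
stats = weekly.groupby("segment_id")["ndmi"].agg(["mean", "std"])
latest = weekly.sort_values("date").groupby("segment_id").tail(1).set_index("segment_id")
z = (latest["ndmi"] - stats["mean"]) / stats["std"]
at_risk_segments = z[z < -2.0].index.tolist()
```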


The work above shows the value of addressing forest health's temporal and spatial components using high-spatial-resolution imagery and weekly health measurements. But of course, the world of remote sensing offers several additional sensor types. Some are higher cadence, such as geosynchronous fire-detection satellites providing 15-minute updates. Others offer much higher resolution, but with only annual updates.

In addition to using vegetation health data like Tesselo's, it is also possible to use a simple map overlay to combine synthesized risk data from other models with powerline corridors. For example, the dashboard below combines a 30m-resolution risk model from USGS (440 million samples across California) with public data on transmission powerlines. It shows the probability of flame lengths greater than 8 feet under 97th-percentile weather conditions. Typically, flame lengths of this magnitude can only be stopped with aerial intervention.
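Conceptually, that overlay is a zonal summary of a risk raster within powerline corridors. Below is a minimal sketch using rasterstats, with placeholder file names, a fixed 100 m corridor, and an arbitrary threshold rather than the dashboard's actual parameters.

```python
# A minimal sketch of the overlay: summarize a 30m flame-length-probability raster
# within a fixed corridor around each transmission line. File names, the corridor
# width, the nodata value, and the threshold are all assumptions for illustration.
import geopandas as gpd
from rasterstats import zonal_stats

lines = gpd.read_file("ca_transmission_lines.geojson").to_crs(epsg=3310)  # CA Albers, meters
corridors = lines.copy()
corridors["geometry"] = corridors.geometry.buffer(100.0)  # assumed 100 m corridor

# Mean and max probability of >8 ft flame lengths within each corridor polygon.
stats = zonal_stats(corridors.geometry, "usgs_flame_length_gt8ft_prob.tif",
                    stats=["mean", "max"], nodata=-9999)

corridors["risk_mean"] = [s["mean"] for s in stats]
corridors["risk_max"] = [s["max"] for s in stats]
high_risk_spans = corridors[corridors["risk_max"] > 0.5]  # illustrative threshold
```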



LIDAR data is worth calling out since it can provide a branch-level measurement of 3D vegetation structure. There are both public and private sources now available. The USGS is well underway with a project to map the entire US with moderate resolution LIDAR, and these data are freely available today for ⅔ of the country (7).

Figure 5: Measuring physical strike potential using LIDAR heights. Shorter trees (blue) cannot hit the lines, while taller trees (orange) have strike potential.
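The logic in Figure 5 reduces to a simple comparison: a tree can only reach the conductor if its height exceeds its horizontal distance to the line. A minimal sketch, assuming a hypothetical canopy-height point layer (with a height_m column) already derived from the LIDAR upstream:

```python
# A minimal sketch of the Figure 5 idea: flag trees whose height exceeds their
# horizontal distance to the nearest powerline span. The canopy-height points and
# file names are assumptions; heights come from a LIDAR canopy height model upstream.
import geopandas as gpd

canopy = gpd.read_file("lidar_canopy_height_points.geojson").to_crs(epsg=32610)
lines = gpd.read_file("transmission_lines.geojson").to_crs(epsg=32610)

# Distance from each canopy point to the nearest powerline span.
nearest = gpd.sjoin_nearest(canopy, lines, how="left", distance_col="dist_to_line_m")

# Strike potential: tall enough to reach the line if it falls toward it.
nearest["can_strike"] = nearest["height_m"] > nearest["dist_to_line_m"]
strike_trees = nearest[nearest["can_strike"]]
print(f"{len(strike_trees)} of {len(nearest)} canopy points have strike potential")
```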

 

Conclusions

As mentioned above, the cost of powerline vegetation management is high - one of the major outlays for most utilities. The toll in lost lives and disrupted economic activity can also be staggering. This represents at least "reputational risk" and has already led to a utility bankruptcy in California.

Yet appropriately characterizing vegetation powerline risk is a challenge. It requires new approaches, leveraging datasets such as those described in this post. But it also requires new computational tools and techniques. The work above would not be possible, for example, without GPU-accelerated analytics. You probably wouldn't want to analyze the 440-million-sample USGS 30m California dataset in a traditional GIS system, much less the multi-billion-point LIDAR dataset shown above.

The good news is that most of the data and tools to do this are already available. So I'd encourage anyone interested to experiment with the datasets above (using OmniSci Free if you'd like). Also, you might want to encourage your local utility or state regulator to require public release of "tree strike" and vegetation management data. The lack of transparency in those particular datasets limits the ability of researchers to improve risk models further. But as another potentially severe fire season approaches, we must do what we can to improve these critical analyses.




Dr. Michael Flaxman

Dr. Michael Flaxman is HEAVY.AI's Spatial Data Science Practice Lead. At HEAVY.AI, Dr. Flaxman’s team is focused on the combination of geographic analysis with machine learning, or “geoML.” He has served on the faculties of MIT, Harvard and the University of Oregon. Dr. Flaxman has participated in GIS projects in 17 countries. He has been a Fulbright fellow, and served as an advisor to the Interamerican Development Bank, the World Bank and the National Science Foundation. Dr. Flaxman previously served as industry manager for Architecture, Engineering and Construction at ESRI, the world’s largest developer of GIS technology. Dr. Flaxman received his doctorate in design from Harvard University in 2001 and holds a master’s in Community and Regional Planning from the University of Oregon and a bachelor’s in biology from Reed College.