Grid Resilience Has an Alignment Problem

Category:

Student Voices

Featured

News

Author:

Katherine Cunningham

Evening sky with power lines

Eighty-three percent of U.S. power outages are caused by extreme weather, yet adaptation and resilience have received roughly $8 billion in federal investment, compared with $650 billion for mitigation and decarbonization. That asymmetry was hard to ignore at the inaugural Power Resilience Forum hosted by The Ad Hoc Group and Latitude Media in Houston.

Utilities, regulators, investors, and technology providers spent two days wrestling with a shared problem: everyone agrees the grid needs to be more resilient, but substantial work remains on how to measure risk and the benefits of avoided risk, how to allocate capital, and how to govern the decisions that would make resilience happen.

In a follow-up conversation, Katherine Cunningham, MS/MBA'21, whose work at The Ad Hoc Group focuses on helping energy and climate technology companies scale within the utility sector, returned to the same three friction points: the challenge of measuring risk and quantifying the benefits of avoided risk, misaligned incentives, and regulatory bodies asked to approve increasingly complex plans without the staff, data, or expertise to do so confidently.

Resilience planning starts with risk assessment, but many utilities still categorize risk as low, medium, or high: static buckets applied to a dynamic problem. For wildfire risk specifically, exposure can shift hour by hour based on wind speed and vegetation moisture, meaning utilities without real-time, granular risk data cannot make informed operational decisions about where to act first. Accurate risk assessment is the foundation, but it only answers half the question. The other half is whether a given investment is actually worth it, and that requires cost-benefit analysis.
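To make the contrast concrete, here is a minimal sketch of how a dynamic, conditions-based risk score differs from static buckets. The thresholds, weights, and saturation points below are illustrative assumptions, not any utility's actual methodology:

```python
# Hypothetical sketch: dynamic risk scoring vs. static buckets.
# All thresholds and weights are illustrative assumptions.

def static_bucket(score: float) -> str:
    """Coarse low/medium/high classification applied to a score in [0, 1]."""
    if score < 0.33:
        return "low"
    if score < 0.66:
        return "medium"
    return "high"

def dynamic_risk(wind_speed_mph: float, fuel_moisture_pct: float) -> float:
    """Combine hour-by-hour wind and vegetation moisture into a 0-1 score.
    Higher wind and drier fuels push the score up."""
    wind_factor = min(wind_speed_mph / 60.0, 1.0)              # saturates at 60 mph
    dryness_factor = max(1.0 - fuel_moisture_pct / 30.0, 0.0)  # 30% moisture = no dryness risk
    return 0.6 * wind_factor + 0.4 * dryness_factor

# The same corridor can cross buckets within a single day:
morning = dynamic_risk(wind_speed_mph=10, fuel_moisture_pct=25)
afternoon = dynamic_risk(wind_speed_mph=45, fuel_moisture_pct=8)
print(static_bucket(morning), static_bucket(afternoon))  # prints: low high
```

The point of the sketch is the last two lines: a corridor labeled "low" in an annual assessment can be "high" by mid-afternoon, which is exactly the information a static bucket throws away.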

California has been on the leading edge of this evolution. The CPUC's Risk-Based Decision Framework now requires utilities to quantify the costs and benefits of each proposed mitigation measure and demonstrate that benefits exceed costs, moving beyond earlier risk-spend efficiency approaches that measured risk reduction per dollar but did not evaluate whether the investment was justified in absolute terms. That evolution reflects a broader shift in how the industry is beginning to think about the value of resilience investments, not just their cost. Rigorous cost-benefit analysis also produces clearer tradeoffs: undergrounding power lines (burying them below ground to eliminate exposure) can cost $1.5 to $5 million per mile, while covered conductors (insulated overhead lines that reduce ignition risk without full burial) can eliminate the vast majority of ignition risk in the highest-risk corridors for around $900,000 per mile. The right choice depends on the specific risk profile of each corridor, not a blanket preference for one technology.
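The arithmetic behind that tradeoff can be sketched directly. The per-mile costs below are the figures cited above; the risk-reduction fractions are hypothetical assumptions chosen only to show how "dollars per unit of risk removed" reframes the comparison:

```python
# Illustrative arithmetic only: per-mile costs are those cited in the text;
# the risk-reduction fractions are hypothetical assumptions for comparison.

def cost_per_risk_point(cost_per_mile: float, risk_reduction: float) -> float:
    """Dollars per mile spent per percentage point of ignition risk removed."""
    return cost_per_mile / (risk_reduction * 100)

# Undergrounding: $3M/mile (midpoint of the $1.5M-$5M range), full exposure removed.
underground = cost_per_risk_point(3_000_000, 1.00)
# Covered conductor: $900K/mile, assuming ~65% of ignition risk eliminated.
covered = cost_per_risk_point(900_000, 0.65)

print(f"undergrounding:    ${underground:,.0f} per risk point per mile")
print(f"covered conductor: ${covered:,.0f} per risk point per mile")
```

Under these assumptions, covered conductors remove risk at less than half the unit cost, but on a corridor where any residual ignition risk is intolerable, undergrounding may still be the justified choice, which is the text's point about corridor-specific risk profiles.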

According to LBNL's Bridging the Gap report, utilities and regulators share common frameworks for resilience planning, but there is no industry-wide standardization around which methods work best for specific risks or performance metrics. Regulators evaluate utility-submitted risk assessments as part of reviewing investment and cost recovery requests, which means the quality and transparency of those submissions matter enormously. The NARUC Wildfire Regulator Guidebook, published in early 2026, marks meaningful progress in equipping commissioners to evaluate these plans, though the broader work of standardization is still underway.

Even when stakeholders agree on the level of risk, their incentives rarely point in the same direction. Utilities, regulators, investors, and insurers operate on different timelines and answer to different accountability structures, and those gaps show up in practice. The clearest example from the forum was what participants called the weak link problem: an investor-owned utility may invest heavily in wildfire mitigation, but if a neighboring cooperative lacks the same resources and ignites a fire, the entire region bears the consequences. Resilience is only as strong as the least-prepared participant in the system. States are experimenting with ways to address this.

In Texas, House Bill 145 establishes a framework for utilities to develop wildfire mitigation plans and introduces liability protections tied to compliance with those plans. While implementation will vary, this type of approach reflects a broader shift toward linking planning requirements with financial and legal incentives. Alignment, in this context, requires shared financial mechanisms and institutional support, not just technical standards. 

The increasing threat of wildfire sharpens these incentive problems because it is the only climate risk that a utility can directly cause. PG&E filed for bankruptcy in 2019 after its equipment sparked devastating wildfires in California. PacifiCorp and Hawaiian Electric have since faced billions in wildfire-related settlements, credit downgrades, and, in Hawaiian Electric's case, ongoing bankruptcy risk, demonstrating that wildfire liability is no longer a tail risk but a material threat to utility financial viability. Credit rating agencies now evaluate wildfire exposure as part of long-term viability assessments, which means a utility without a credible mitigation strategy faces higher borrowing costs, making the investments harder to justify at exactly the moment they are most necessary. Utilities need capital to reduce risk, but elevated risk raises the cost of that capital, a cycle that hits smaller cooperatives and municipal utilities the hardest.

The problem is not just who bears liability; it is how capital recovery rules shape what gets built in the first place. In regulated utility structures, hardware investments can typically be added to the rate base and earn a regulated return, meaning ratepayers fund them over time through their bills. Software-based resilience solutions have historically faced a higher bar for rate recovery, though the industry has made meaningful progress in developing capitalization pathways for these tools. The mechanics of capitalizing software investments are still not well understood within many utilities or among regulators, and closing that knowledge gap is as important as the regulatory innovation itself. Until that knowledge is more widely distributed, the incentive structure continues to favor physical infrastructure over data-driven improvements, even where the latter may deliver more risk reduction per dollar. 

Even where measurement improves and incentives are partially realigned, institutional capacity remains a binding constraint. Commissions are often under-resourced and working against compressed timelines. In some states, utilities submit mitigation plans that commissioners must evaluate within 180 days. These plans typically include risk assessment methodologies, cost-benefit analyses, and requests for new technology that commission staff are not always trained to assess independently. In certain jurisdictions, commissions have intentionally avoided formally approving mitigation plans out of concern that doing so increases their legal exposure, a hesitation that is rational given the constraints, but one that defers resilience decisions at exactly the level of government responsible for making them.

The data challenge compounds this, but the technology landscape is evolving to address it. A new generation of tools is helping utilities generate the kind of continuous, structured data that better supports both internal decisions and regulatory submissions. Companies like Technosylva are developing wildfire modeling and risk analytics platforms that allow utilities to assess dynamic risk conditions, while TreeSwift delivers high-resolution vegetation intelligence at scale, giving utilities a more precise and actionable picture of where to act first.

The regulatory infrastructure is beginning to catch up as well. The new NARUC guidebook gives commissioners a practical playbook covering risk management, financial mechanisms, and cost recovery, designed to be adapted to each state's regulatory context rather than applied as a uniform standard. Resilience is not only capital-intensive. It is governance-intensive, and the tools, guidance, and institutional momentum needed to close that gap are, for the first time, beginning to arrive together. 

At the Power Resilience Forum, someone posed a simple question: “What is the actual end goal of resilience?” The answer was straightforward. The fewest outages possible, and when they do happen, they affect the fewest people for the shortest amount of time. That goal is widely shared. The path to it is not. Getting there requires measurement that is consistent enough to compare across jurisdictions, incentives that reward prevention rather than just restoration, and regulatory institutions with the capacity to make proactive decisions under uncertainty. None of those things are technically out of reach, and the early signs of progress are real. What's needed now is the institutional will to engineer them deliberately and at scale, before the next disaster makes the cost of inaction impossible to ignore.

Stay Connected

Don’t miss what’s next

Join the Erb Institute mailing list to learn more about our programs and opportunities!


700 East University
Kresge Hall, 3rd Floor West
Suite 3510
Ann Arbor, MI 48109

© 2026 Frederick A. & Barbara M. Erb Institute. All rights reserved.
