Can Artificial Intelligence Outsmart Natural Disasters?

As natural disasters grow more severe across the country, local governments are increasingly using predictive analytics to understand where and when an emergency will impact their communities.

When fires start in Gilpin County, Colo., they burn hot and grow fast. Floods in Texas’ San Antonio River basin spill across highways, blocking emergency responders, and Norfolk, Va., sees homes inundated by coastal storms. Earthquakes shaking the Pacific Northwest risk derailing trains, injuring residents and causing power outages at hospitals.

Prediction and early detection tools — as well as automated responses — aim to help local governments reduce the damage from these kinds of natural disasters. Today’s tools are warning residents, triggering mitigation and helping first responders react more effectively. As artificial intelligence (AI) advances, sensors proliferate and data collections grow, prediction and detection technologies are likely to become more precise and effective.

Tarek Ghani, assistant professor of strategy at Washington University’s Olin Business School, and Grant Gordon, former senior director of innovation strategy at the International Rescue Committee, envision using AI to predict disasters in advance, thus enabling responders to take swifter actions to prevent or mitigate them.

Such tools could also anticipate how crises would develop, guiding responders toward more effective interventions, they write in their 2021 article “Predictable Disasters: AI and the Future of Crisis Response.”

Not all disasters are equally amenable to AI, however. The technology is most reliable at analyzing events whose root causes are well understood, for which plenty of data is available to train algorithms, and which recur often enough that the models’ predictions can be compared against reality and fine-tuned, Ghani and Gordon write.

Floods are a strong example.
[Photo: The San Antonio River Authority uses a forecasting model with data from the National Weather Service to predict floods in near real time. (Adobe Stock)]

The San Antonio River Authority (SARA) uses a tool to predict floods 12 hours in advance and inform emergency responders, its senior technical engineer Wayne Tschirhart told GovTech*.

In Bexar County, Texas, where the tool is currently deployed, rainfall can turn into a full-fledged flood within two hours, but the National Weather Service only updates its predictive models twice a day, meaning a flood could come and go before a new alert is out. SARA sought to supplement those services with its own, locally focused projections that it reruns every 15 minutes, giving a picture as close to “real time” as possible.

SARA uses data from the National Weather Service’s forecasting products to inform its model. To ensure the system stays accurate, SARA compares its estimates against on-the-ground readings it pulls every 10 minutes from water gauges placed at high-risk areas like dams and low-water crossings.
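
That cycle (rerun the forecast every 15 minutes, compare against gauges every 10) can be sketched in a few lines of code. The sketch below is not SARA’s system; every function is an invented stand-in, and the toy model is a placeholder for the real hydrology.

```python
import random
import time

FORECAST_EVERY = 15 * 60  # model rerun cadence: every 15 minutes
GAUGE_EVERY = 10 * 60     # gauge polling cadence: every 10 minutes

def fetch_nws_rainfall() -> float:
    """Hypothetical stand-in for pulling forecast rainfall (inches) from NWS products."""
    return random.uniform(0.0, 3.0)

def predict_stage(rainfall_in: float) -> float:
    """Toy placeholder for the flood model: creek stage (feet) from rainfall."""
    return 2.0 + 4.0 * rainfall_in

def read_gauge() -> float:
    """Hypothetical stand-in for a water gauge at a dam or low-water crossing."""
    return random.uniform(1.0, 14.0)

last_forecast = last_gauge = float("-inf")
predicted_stage = 0.0
while True:
    now = time.monotonic()
    if now - last_forecast >= FORECAST_EVERY:
        predicted_stage = predict_stage(fetch_nws_rainfall())
        last_forecast = now
    if now - last_gauge >= GAUGE_EVERY:
        # Compare the model against on-the-ground readings to keep it honest.
        error_ft = abs(predicted_stage - read_gauge())
        print(f"prediction error: {error_ft:.1f} ft")
        last_gauge = now
    time.sleep(30)
```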

Disaster responders need to balance a desire for early alerts against their need for accuracy.

“We can get up to 12 hours pretty accurately, and it’ll go out to 24 hours; that’s our maximum prediction horizon,” Tschirhart said. “We want to know … now that the rain has fallen, within the next few hours, what can we expect from this creek as far as flooding is concerned? The further you go out [in your projections], the riskier it gets.”

The idea is to inform emergency responders quickly so they can route around submerged bridges and roadways when rushing to help, rather than being forced to backtrack. Responders identifying inaccessible areas would also know to call for help from neighboring jurisdictions that may have easier access.

Earthquakes aren’t as easily — or feasibly — predicted as floods, however. In fact, the U.S. Geological Survey (USGS) stresses that accurate earthquake predictions are impossible now and in the “foreseeable future,” with scientists able, at best, to give the probability of a significant quake hitting an area “within a certain number of years.”

Researchers and city agencies seeking to minimize quake damage are instead focused on detecting, and responding to, the first signs of a quake as rapidly as possible.

USGS offers an early detection system in California, Oregon and Washington known as ShakeAlert. It uses a network of seismic sensors to detect and evaluate the first vibrations of an earthquake (known as primary waves, or P waves), then relays their readings to a data center.

If four separate sensors register shaking, the system’s algorithms treat it as a real event rather than a false positive caused by, say, a truck hitting a sensor or some other non-quake vibration, explained Bob de Groot, USGS ShakeAlert national coordinator for communication, education, outreach and technical engagement.
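
That corroboration rule is simple to illustrate. The toy sketch below is not USGS code, and the five-second coincidence window is an assumption, but it captures the idea: single-sensor picks wait until enough stations agree.

```python
CONFIRMING_STATIONS = 4   # per de Groot: four sensors must register shaking
WINDOW_S = 5.0            # assumed coincidence window; not specified by USGS

picks: dict[str, float] = {}  # station ID -> time of its P-wave detection

def report_p_wave(station: str, t: float) -> bool:
    """Record one station's P-wave pick; True once enough stations concur."""
    picks[station] = t
    recent = [s for s, ts in picks.items() if t - ts <= WINDOW_S]
    return len(recent) >= CONFIRMING_STATIONS

# Three picks are still treated as possible noise; the fourth confirms.
for i, station in enumerate(["ST01", "ST02", "ST03", "ST04"]):
    status = "confirmed" if report_p_wave(station, t=float(i)) else "waiting"
    print(station, status)
```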

Algorithms then estimate the scope, location and severity of the earthquake’s shaking. USGS gives those details to its various message distribution partners — including public transit agencies and companies like Google — which then send out warnings to residents and critical infrastructure operators via cellphone alerts, app-based notifications and other methods.

Messages should arrive seconds before the earthquake’s damaging secondary waves hit.

Ideally, residents receiving alerts have time to drop, cover and hold on to something to reduce their chances of injury.

Some cities and institutions also automatically trigger public safety responses when they receive ShakeAlerts. San Francisco’s transit system automatically slows trains, for example. Other common automations include activating hospitals’ backup generators, opening elevator doors at the nearest floor and shutting water utility valves to avoid risks of reservoirs emptying.
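
In spirit, these automations amount to a preregistered list of actions fired the moment an alert arrives. A minimal sketch, with invented names for the alert fields and actions, might look like this:

```python
from dataclasses import dataclass

@dataclass
class QuakeAlert:
    magnitude: float
    seconds_to_shaking: float  # estimated lead time before damaging S waves

# Each institution preregisters the safety actions it wants triggered.
def slow_trains() -> None: print("transit: trains slowing")
def start_generators() -> None: print("hospital: backup generators on")
def open_elevators() -> None: print("buildings: elevators opening at nearest floor")
def close_valves() -> None: print("water utility: reservoir valves closed")

ACTIONS = [slow_trains, start_generators, open_elevators, close_valves]

def on_shakealert(alert: QuakeAlert) -> None:
    # No human in the loop: fire every preconfigured action immediately.
    for action in ACTIONS:
        action()

on_shakealert(QuakeAlert(magnitude=5.2, seconds_to_shaking=8.0))
```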

Ghani and Gordon also argue that automation could accelerate post-incident response.

Mitigating large-scale disasters can require extra emergency funding, and automated systems could send financial aid to areas where algorithms calculate a high probability of a serious event. Different levels of funding could be unlocked automatically as crises worsen, ensuring responders have resources at hand to get straight to work rather than being delayed by the need to seek and wait for aid.

“Instead of an operational infrastructure grounded in post hoc fundraising and service delivery, a future humanitarian system could orient around an operational structure that flexibly increases capacity for rapid response as a crisis worsens,” Ghani and Gordon wrote.
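
A toy version of that trigger-based structure might tie funding tranches to predicted-risk thresholds. Every threshold and dollar figure below is invented for illustration; nothing like this is described as deployed.

```python
FUNDING_TIERS = [  # (minimum predicted probability of a serious event, dollars)
    (0.50, 250_000),
    (0.75, 1_000_000),
    (0.90, 5_000_000),
]

def unlocked_funding(event_probability: float) -> int:
    """Total aid released automatically at a given predicted risk level."""
    return sum(amount for threshold, amount in FUNDING_TIERS
               if event_probability >= threshold)

for p in (0.40, 0.60, 0.95):
    print(f"p={p:.2f} -> ${unlocked_funding(p):,} released")
```
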
[Photo: ShakeAlert from USGS registers quakes along West Coast fault lines, like the Inyo Earthquake Fault in Mammoth Lakes, Calif. (Adobe Stock)]

Earthquake Warnings, Models Hit Their Limit


ShakeAlert produces its analysis within seconds, or tens of seconds, of an earthquake’s start, and in earthquake response every instant of advance notice counts.

To achieve its speed, ShakeAlert can only collect a “snapshot” of information before analyzing and transmitting findings — otherwise the warning comes too late. What it offers is a rapid-fire best-guess assessment of the situation.

“With ShakeAlert, we have a trade-off between time and accuracy,” de Groot told GovTech. “We make sure we’re as accurate as possible in the short[est] amount of time as possible.”

There are other limits, too. People near an earthquake’s epicenter are so close that they are likely to feel shaking before receiving a warning, because it still takes time for multiple sensors to trigger, algorithms to analyze and partners to send out alerts.

Residents sometimes say they want to receive ShakeAlerts about any earthquake they can notice, not just those that risk injuring them. But cellular messaging wasn’t built with this kind of speed in mind, and sending phone alerts to large populations — say, an entire city — uses precious seconds. Adding recipients who don’t absolutely need to know slows the message, de Groot said.

“The more people that an app has to deliver to, the harder it is to move it quickly,” de Groot said. “Even though the cellphone company has the information within a couple of seconds, it takes time to push it out to the phones just because of their systems.”

Instead, USGS and its partners send cellphone-based alerts only for quakes of magnitude 4.5 or higher, and only to people expected to feel at least weak shaking.
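
That delivery rule reduces to two checks per recipient. Below is a toy version; the attenuation formula and the intensity threshold are invented for illustration, not USGS’s actual models.

```python
def expected_intensity(magnitude: float, distance_km: float) -> float:
    """Invented toy attenuation: felt intensity falls off with distance."""
    return max(0.0, magnitude - 0.02 * distance_km)

def should_alert_phone(magnitude: float, distance_km: float) -> bool:
    WEAK_SHAKING = 3.0  # assumed threshold on an MMI-like intensity scale
    return magnitude >= 4.5 and expected_intensity(magnitude, distance_km) >= WEAK_SHAKING

print(should_alert_phone(5.0, 20))   # moderate quake, nearby -> True
print(should_alert_phone(5.0, 300))  # too far away to feel -> False
print(should_alert_phone(4.0, 5))    # below the magnitude floor -> False
```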

The system faces one other handicap: recognizing the Big One.

ShakeAlert is likely to struggle to accurately calculate the impact of any earthquake over magnitude 8, due to lack of data, de Groot said.

The world has seen only four magnitude 9 earthquakes since 1952, per USGS, and none in the Northwestern U.S. region where ShakeAlert focuses; the last one there struck more than 300 years ago.

That leaves scientists trying to build models using synthetic data and extrapolations from events in other countries — until one hits here.
[Photo: Sensors from N5 are placed throughout Gilpin County, Colo., to detect chemicals in the air; their readings feed a cloud-based algorithm that determines fire danger. (N5)]

To Fight a Fire, You Have to Find a Fire


By the time Gilpin County, Colo., residents see enough smoke to prompt a 911 call, fires are often already out of hand. In remote areas, a tree hit by lightning might smolder for weeks without a passerby to detect it before it ignites into a full-blown conflagration, said Gilpin County Emergency Manager Nate Whittington.

And smoke is only a vague indicator of where firefighters need to go, as shifting winds can create confusion, Whittington told GovTech.

The chase to locate the flames delays response, and firefighters in the mountainous region may not learn until they arrive that the fire sits up a steep climb. Some sites are impassable to firetrucks and unreasonably slow to reach on foot, requiring responders to call for helicopter or plane assistance.

The county hopes a tool can help it detect nascent fires more rapidly and pinpoint them more accurately. That means knowing in advance whether to dispatch an aviation team — saving valuable minutes — and catching lightning-struck trees while they’re still only smoldering.

“I’m hoping that these sensors can create it to where we are fighting fires — we are not fighting wildfires,” Whittington said.

As of March 2022, the county was early into adopting a fire detection and location system from the firm N5. The company’s chief revenue officer, Debra Deininger, said Gilpin is its first commercial deployment.

The system mounts sensors throughout target areas to detect chemical traces, smoke particulates and gases in the air, as well as to take heat readings, CEO Abhishek Motayed told GovTech. The sensors relay readings to a cloud-based algorithm that analyzes the data, updates digital maps and delivers alerts and coordinates to responders’ cellphones.

The algorithms are intended to differentiate between smoke from innocuous situations — home chimneys or campfires — and smoke from a dangerous fire. Checking whether several sensors light up helps too: multiple activations are more likely to confirm a spreading fire, Motayed said.
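
Both ideas, classifying each reading and requiring corroboration, can be sketched without knowing N5’s actual algorithm. The sensor fields and thresholds below are invented.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    particulates: float   # smoke particulate level, arbitrary units
    temperature_c: float

def looks_dangerous(r: Reading) -> bool:
    """Invented thresholds; chimney or campfire smoke stays below them."""
    return r.particulates > 40.0 and r.temperature_c > 60.0

def fire_confirmed(readings: list[Reading], min_sensors: int = 2) -> bool:
    """Require corroboration: several distinct sensors must flag danger."""
    flagged = {r.sensor_id for r in readings if looks_dangerous(r)}
    return len(flagged) >= min_sensors

batch = [Reading("N5-01", 55.0, 80.0),
         Reading("N5-02", 48.0, 72.0),
         Reading("N5-03", 8.0, 21.0)]
print(fire_confirmed(batch))  # two sensors agree -> True
```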

Gilpin County will pilot N5’s system this summer, and Whittington says one incident during earlier testing was particularly promising.

When the forestry service conducts controlled burns, sensors are brought along to collect data that helps the algorithm learn to distinguish between normal and abnormal air conditions. The evening after one such burn, a pile of vegetation reignited unintentionally, and the system detected the unusually high heat signature. The abnormal reading prompted N5 to call Whittington, who contacted dispatch about sending someone to check it out. He was still on the phone when a 911 call came in reporting a fire on the pile.

Had its sensors been programmed with the alert metrics then still being designed, the tool would have warned responders well ahead of the 911 call, Whittington said.

“If sensors had been programmed to where we needed them, that notification would have come in 36 minutes before that 911 call,” Whittington said.

Whittington had several goals when selecting a technology solution, such as the ability to withstand severe weather, but said he didn’t plan to evaluate its effectiveness against specific metrics.

Instead, he’s taking a broad look and judging the investment as useful or not based on whether it helps save lives or fails to detect a fire.

“The best testament that I’m going to have to this technology is if one of my sensors goes off and I can evacuate all my people before that fire even gets close enough to their house that I have to worry about it. Then I can look back and say, ‘Yes, that worked,’” he said.

SARA similarly defines the success of its flood prediction tool in terms of lives potentially saved and time and effort spared by helping emergency responders better direct their resources and efforts.

In one instance, early flood prediction allowed SARA to warn a federal jet engine facility five hours in advance, giving the facility time to hoist sensitive equipment out of the water’s path.

AI Fills in the Blanks


Technologies are also being used to get ahead of disasters before they start. In Norfolk, Va., the city’s Office of Resilience uses tools to give residents tailored advice about better protecting their homes against flooding.

Messages around flood risks have traditionally been too general, describing all members of a community as facing the same level of risk. That overlooks how differences in home construction, and in the frequency and depth of the flooding events a property is exposed to, shape the risks it faces, Norfolk Coastal Resiliency Manager Matt Simons told GovTech.

The elevation of the property grade and of the home itself matter, as do factors like the building’s age, its foundation type and the presence of a basement or flood vents.

Blanket advice about flood preparation may not feel urgent to residents, either. Simons hopes to provide more meaningful guidance and better encourage residents to take action through a tool that offers recommendations personalized for their individual situations.

Norfolk offers an online Flood Risk Learning Center tool that allows renters and homeowners to plug in their addresses and a few other details, then view information on their chances of experiencing a flood and how high the waters might reach. The tool also suggests mitigations residents can take to lower flood insurance premiums and reduce total damages, such as filling in basements and relocating utilities out of crawl spaces.

The flood risk system draws on FEMA flood zone information, elevation data collected when Norfolk takes lidar readings of the city and housing records held by the city Real Estate Assessor’s Office to inform its calculations.
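
A simplified picture of that data join, with invented fields and thresholds, is a lookup that turns one parcel’s records into tailored guidance. This is a sketch of the idea, not Norfolk’s actual tool.

```python
from dataclasses import dataclass

@dataclass
class Parcel:
    fema_zone: str         # from FEMA flood maps
    ground_elev_ft: float  # from the city's lidar surveys
    has_basement: bool     # from the Real Estate Assessor's records
    foundation: str

def flood_guidance(p: Parcel, projected_flood_ft: float) -> list[str]:
    tips = []
    if p.fema_zone.startswith("A"):  # FEMA's high-risk zone designations
        tips.append("High-risk zone: flood insurance strongly advised.")
    if p.has_basement and projected_flood_ft > p.ground_elev_ft:
        tips.append("Consider filling in the basement to cut damage and premiums.")
    if p.foundation == "crawlspace":
        tips.append("Relocate utilities out of the crawlspace.")
    return tips

home = Parcel(fema_zone="AE", ground_elev_ft=6.5,
              has_basement=True, foundation="crawlspace")
print("\n".join(flood_guidance(home, projected_flood_ft=8.0)))
```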

But data gaps remain that can make it harder to accurately assess flood risk for some areas. That’s especially the case for lower-income communities where residents are less likely to engage in activities like refinancing homes or seeking building permits that can result in elevation certificates being shared with City Hall.

Norfolk is hoping machine learning models can fill in those gaps and produce estimates about homes’ first-floor elevations and other data that isn’t otherwise already available.

“Usually, homes do very well at handling damage in the crawlspace, but once a flood enters that first floor of living space, each additional inch starts to dramatically scale up the amount of damage. And so that’s a kind of a mystery data point,” Simons said. “The pilot is doing things like filling in data gaps that are very difficult and expensive to get.”
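
Simons’ point can be expressed as a toy depth-damage curve. The numbers below are invented, and the first-floor elevation is exactly the gap the machine learning pilot is meant to fill.

```python
def damage_fraction(flood_elev_ft: float, first_floor_elev_ft: float) -> float:
    """Invented curve: fraction of home value lost at a given flood height."""
    inches_above_floor = max(0.0, (flood_elev_ft - first_floor_elev_ft) * 12)
    if inches_above_floor == 0:
        return 0.05  # crawlspace-only flooding: limited damage
    return min(1.0, 0.05 + 0.03 * inches_above_floor)

# In the pilot, a machine learning estimate fills in this missing number.
estimated_first_floor_ft = 7.2
for flood_ft in (7.0, 7.5, 8.0):
    loss = damage_fraction(flood_ft, estimated_first_floor_ft)
    print(f"flood at {flood_ft:.1f} ft -> {loss:.0%} of home value")
```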

Norfolk piloted the machine learning tools in 2021 on communities for which plenty of data was available. That made it easier to check and verify — or correct — the AI’s predictions.

FEMA’s Hazus Program supports emergency response, preparedness and recovery activities by modeling an area’s potential damage from natural disasters. It can make basic projections drawing on “generalized national databases,” or users can feed it specific local information for more accurate estimates.

Norfolk hopes the machine learning tool will provide data it can use to produce more accurate Hazus damage models. With those results in hand, the city could give its “most vulnerable and risk-exposed populations” better information on reducing risks, Simons said. As of March 2022, Norfolk was testing the tools across all its neighborhoods.

Looking Ahead


As scientists amass more data over the years and as data-collecting technologies like camera-equipped drones and satellites become cheaper and more widespread, algorithmic predictions are likely to become more precise and applicable to new areas.

AI systems haven’t always had enough good data to assess lightning strikes, for example. But University of Washington (UW) researchers announced in late 2021 that enough information had accumulated. They created a machine learning algorithm to anticipate where lightning would strike in the southeastern U.S. It reportedly predicts strikes two days sooner than a popular physics-based prediction method could.

“Machine learning requires a lot of data — that’s one of the necessary conditions for a machine learning algorithm to do some valuable things,” researcher and UW Associate Professor of Atmospheric Sciences Daehyun Kim told UW reporters. “Five years ago, this would not have been possible because we did not have enough data, even from [the World Wide Lightning Location Network].”



*Government Technology is a sister site to Governing. Both are divisions of e.Republic.
Jule Pattison-Gordon is a senior staff writer for Governing and a former senior staff writer for Government Technology, where she specialized in cybersecurity. Jule previously wrote for PYMNTS and The Bay State Banner, and holds a B.A. in creative writing from Carnegie Mellon. She’s based outside Boston.