Catastrophe models have become more accurate and more widely used since Hurricane Katrina struck the Gulf Coast a decade ago.
Before Katrina, modeling was largely a portfolio tool for insurers and reinsurers, said Rick Miller, national property practice leader at Aon Risk Solutions in Boston. Now, underwriters model individual accounts for rate setting and other purposes, and brokers provide the service for their clients.
“Modeling on a per-risk basis has definitely become a must-have in the last few years,” and was not common before Katrina, said Duncan Ellis, New York-based national property practice leader at Marsh L.L.C.
The quality of exposure data fed into models, which was often poor before Katrina, has also improved, experts say.
For example, casino barges moored on the Mississippi Gulf coast, badly damaged in Katrina's storm surge, often were wrongly classified as normal buildings, said Jayanta Guin, executive vice president at Boston-based catastrophe modeler AIR Worldwide. Now, modelers have better data on the construction characteristics, occupancy, height and other aspects of individual buildings, he said.
FM Global, with a portfolio of complex industrial risks, uses a proprietary model that includes site-specific engineering data, said Chris Johnson, executive vice president. “We have some very large and unusual properties,” he said. “They don't lend themselves to homogeneous modeling.”
Modeling alone is not enough, said Iwan Stalder, head of global catastrophe management at Zurich Insurance Group Ltd. in Zurich. “There is a need to adjust, to calibrate” the information to suit model users' specific needs.
The unexpected severity of flooding in New Orleans also made it clear the industry needed to develop “more robust” models for storm surge risk, Mr. Guin said. Models since Katrina have accounted in more detail for factors such as the action of wind on the ocean's surface and coastal topography, he said.
Katrina also improved the industry's understanding of demand surge, the escalation in rebuilding costs caused by a shortage of materials and labor. While demand surge was evident after Hurricane Andrew in 1992, data was sparse until after the 2004 and 2005 hurricane seasons that included Katrina, Mr. Guin said.
Demand surge estimates have since been incorporated into models, Zurich's Mr. Stalder said.
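One common way to represent demand surge in a model is as a multiplier on rebuilding costs that grows with the size of the industry-wide event and is capped at some maximum uplift. The sketch below is purely illustrative; the function name, baseline, slope and cap are hypothetical parameters, not figures from any vendor's model.

```python
def demand_surge_factor(industry_loss_bn, baseline_bn=10.0, slope=0.02, cap=1.4):
    """Hypothetical demand-surge multiplier.

    Rebuilding costs start escalating once total industry loss
    exceeds a baseline (in $bn), and the uplift is capped to
    reflect that labor and materials markets eventually adjust.
    """
    excess_bn = max(0.0, industry_loss_bn - baseline_bn)
    return min(cap, 1.0 + slope * excess_bn)

# A small event sees no surge; a $40bn event hits the cap
print(demand_surge_factor(5.0))   # 1.0
print(demand_surge_factor(40.0))  # 1.4 (1 + 0.02 * 30 = 1.6, capped)
```

In practice the surge curve would be calibrated against post-event claims data of the kind the 2004-2005 seasons produced, and would vary by peril, region and line of business.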
Meanwhile, Katrina's “massive disruption in arterial traffic” focused modelers' attention on business interruption and contingent business interruption exposures, said Tom Larsen, chief product architect at CoreLogic Eqecat in Oakland, California. While business interruption models were “OK at the time” of Katrina, they have become more sophisticated and incorporated more data about individual insured locations since then, he said.
Overall, models are now aiding hurricane preparedness and response in ways not typical before Katrina, experts say.
Zurich uses them to update its exposure by coastal sector as hurricane season approaches, Mr. Stalder said. As a storm nears landfall, the insurer overlays its projected track with model data on client properties in its path to help advise clients on mitigation steps and to direct its claims team.
After a storm, models also help project losses before actual claims start to flow in, he said.
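The overlay step described above amounts to a geospatial screen: flag insured locations that fall within some buffer of the forecast track. A minimal sketch, assuming the track is a list of forecast points and using a great-circle distance check; the property names, coordinates and 100 km buffer are hypothetical, and a production system would use full track geometry and wind-field data rather than point-to-point distance.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def properties_in_path(track, properties, buffer_km):
    """Return IDs of properties within buffer_km of any forecast track point."""
    flagged = []
    for pid, lat, lon in properties:
        if any(haversine_km(lat, lon, tlat, tlon) <= buffer_km
               for tlat, tlon in track):
            flagged.append(pid)
    return flagged

# Hypothetical forecast track (lat, lon) and client locations
track = [(25.0, -80.0), (27.0, -82.0), (29.0, -84.0)]
properties = [("warehouse-1", 27.1, -82.2), ("plant-2", 33.0, -90.0)]
print(properties_in_path(track, properties, buffer_km=100))  # ['warehouse-1']
```

The flagged list is what would feed the mitigation advice and claims-team staging the article describes.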