WTC attack drives efforts to quantify terrorism risks


A tool unavailable to U.S. insurers at the time of the Sept. 11, 2001, terrorist attacks is helping them manage terrorism risks with increasing sophistication.

The first commercially available versions of that tool, terrorism risk modeling, appeared in the marketplace in 2002. Like catastrophe models, terrorism models help insurers limit their aggregate exposures in a given geographic area. Unlike catastrophe models, however, terrorism models cannot draw upon a century of data, and factors such as terrorist attack frequency remain unknown.

Still, "virtually all insurers who have any material amount of terrorism exposure" are managing the risk with such models, said Peter Ulrich, senior vp-model management at Newark, Calif.-based Risk Management Solutions Inc., which introduced a terrorism model in 2002.

Mr. Ulrich explained RMS' approach to modeling in testimony he presented July 25 to a joint hearing of the House Financial Services Committee's Oversight and Investigations Subcommittee and the Homeland Security Committee's Intelligence, Information Sharing and Terrorism Risk Assessment Subcommittee (BI, July 31). "RMS does not attempt to predict the time and place of the next terrorist attack. Our focus is on modeling the likelihood of an attack occurring at a given target, using a specific weapon, and then determining the consequences of such an attack," according to his testimony.

In 2002, Zurich North America developed its own customized version of the RMS terrorism modeling program to assess workers compensation and property exposures, said Dan Loris, a senior vp with the Schaumburg, Ill.-based insurer.

While workers comp exposure was largely tracked on a state-by-state basis before 9/11, Zurich realized after the attacks that it needed to know how many workers each policyholder had in a single building at a given time and how many policyholders Zurich insured in a given area, he said.

"The whole philosophy around reporting and quantifying workers comp exposure changed after 9/11," Mr. Loris said, noting that underwriters focused on risk accumulation in the same way property catastrophe insurers had done.

Zurich's system not only gives it real-time access to workers comp and property data on a building-by-building basis but also allows it to calculate its probable maximum loss from hypothetical terrorist attacks such as a five-ton truck bomb or a two-ton radiological "dirty" bomb detonated outside a building.
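The accumulation tracking described above, rolling insured headcounts up to the building level so underwriters can spot concentrations, can be sketched in a few lines. This is a purely illustrative example, not Zurich's actual system; the policy records, addresses, and threshold are hypothetical.

```python
# Hypothetical sketch of building-level exposure accumulation: sum insured
# employee counts by street address and flag buildings whose concentrated
# headcount exceeds a chosen threshold. All names and figures are invented.
from collections import defaultdict

policies = [
    {"policyholder": "Acme Corp", "address": "1 Liberty Plaza", "employees": 1200},
    {"policyholder": "Beta LLC",  "address": "1 Liberty Plaza", "employees": 450},
    {"policyholder": "Gamma Inc", "address": "200 Main St",     "employees": 300},
]

# Roll exposure up to the building (street-address) level.
exposure_by_building = defaultdict(int)
for p in policies:
    exposure_by_building[p["address"]] += p["employees"]

# Flag buildings where accumulated headcount exceeds the underwriting threshold.
THRESHOLD = 1000
flagged = {addr: n for addr, n in exposure_by_building.items() if n > THRESHOLD}
print(flagged)  # -> {'1 Liberty Plaza': 1650}
```

A real system would key on geocoded coordinates rather than raw address strings, so that adjacent buildings within a blast radius could be aggregated as well.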

"It allows us to optimize our capacity so we aren't shying away from risks we don't really need to worry about," Mr. Loris said.

"We've spent a considerable effort in developing a robust approach to assessing our terrorism risk," said Robert Paiano, senior vp-reinsurance and catastrophe risk management at The Hartford Financial Services Group Inc. in Hartford, Conn.

"Given the unique characteristics of the peril, including the fact that acts of terrorism are intentional and the potential magnitude of the loss is so significant, we use a multifaceted approach in assessing and monitoring our exposures," Mr. Paiano said. This includes "an internally developed deterministic model, a periodic review of loss estimates using a vendor model and monitoring our aggregate limits exposed. We perform this assessment across both our life and P/C operations," he said.

While modeling has allowed insurers to understand their exposure and measure the severity of various types of terrorist attacks, the frequency of and weapons used in attacks remain wild cards.

"Enterprise risk management has really taken hold as a discipline" among insurers, including sophisticated measurement of terrorism and other sources of financial loss, said Thomas Upton, managing director with Standard & Poor's Corp. in New York. But "I don't think most (insurers) feel they have a fully adequate grasp of their exposure. That's why relatively few companies are willing to write terrorism insurance and why we need" a federal coverage backstop, he said.

"The insurance industry has only a marginally better idea today about the likelihood of a terrorist attack than we did five years ago," said Robert Hartwig, president-elect and chief economist of the Insurance Information Institute in New York. He described the frequency of future attacks as "fundamentally unknowable."

Assessing frequency is one of the biggest challenges for modelers, said Jack Seaquist, a senior manager at AIR Worldwide, a Boston-based unit of the Insurance Services Office Inc. AIR, which introduced a terrorism model in 2002, has established processes to do the best job of estimating frequencies "but it's all subject to the limitations of the available information," he said.

Mr. Seaquist pointed to a key difference in information gathering. Whereas governmental authorities are more than willing to share information about natural perils such as hurricanes and earthquakes, governments may not be forthcoming about terrorism-related information, he said, and terrorists operate in secret.

Another challenge is dealing with the fact that terrorist events are localized, he said. "So we need to know exactly where our potential targets are and the insured location down to the street address level. That's where we've been developing a more accurate base of buildings in the U.S., increasing the depth and extent of that database to help clients refine their own data," Mr. Seaquist said.

"One other aspect is the calculation of the maximum foreseeable loss," Tom Larsen, senior vp of EQECAT Inc., an Oakland, Calif.-based unit of ABSG Consulting Inc., said in an e-mail. EQECAT also introduced a terrorism model in 2002. "Again, natural catastrophe modelers have the ability to logically link observations of the past into a credible assertion of future expectations. While 'probabilistic' methodologies do not lend themselves to the calculation of a 'worst event,' they can develop robust estimates of very extreme-level events," he wrote, for example, estimating a loss that has a 0.2% probability of occurring in a single year, often called a 500-year event. In addition, "historic data can provide us with a very good understanding of the maximum severity of an event," such as an earthquake of a specific magnitude occurring on a specific fault.
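The 500-year figure follows directly from the quoted probability: the return period is simply the reciprocal of the annual exceedance probability. A minimal sketch of that arithmetic (the function names are illustrative, not from any vendor model):

```python
# Relationship between an annual exceedance probability and its "return period."
# Purely illustrative arithmetic, not any modeler's proprietary methodology.

def return_period(annual_exceedance_prob: float) -> float:
    """Return period in years: the reciprocal of the annual exceedance probability."""
    if not 0 < annual_exceedance_prob <= 1:
        raise ValueError("probability must be in (0, 1]")
    return 1.0 / annual_exceedance_prob

def prob_within_horizon(annual_exceedance_prob: float, years: int) -> float:
    """Chance of at least one exceedance over `years`, assuming independent years."""
    return 1.0 - (1.0 - annual_exceedance_prob) ** years

print(return_period(0.002))  # 0.2% annual probability -> 500.0 (the "500-year event")
print(round(prob_within_horizon(0.002, 30), 3))  # chance over a 30-year horizon
```

Note that a "500-year event" does not mean the loss occurs once every 500 years on schedule; over any 30-year horizon the chance of at least one such loss is still several percent.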

"For terrorism, the trend is that we have not observed the maximum event. As time progresses, we see the 'maximum' event creep forward," with single-building terror incidents giving way to the Sept. 11 multiple attacks. "The extent of the future 'maximum' event is limited only by human ingenuity," he wrote.

While modeling theoretically allows insurers to fix limits on their exposure in a given geographic area, competitive pressure will continue to govern their actual underwriting, some say.

"I don't think very much has changed" because of modeling, said Stephen A. Cozen, an insurance defense lawyer at Cozen O'Connor in Philadelphia. "If you want to write financial services (companies), how can you possibly avoid accumulation and concentration of risk in a small geographic area? You can't."

The III's Mr. Hartwig agreed that competition will continue to be a factor.

"Competitive reasons will probably drag some insurers into situations where they are more highly exposed to terrorism risk than they'd prefer to be," he said.