
Modeling firms take first look at cyber risks

As cyber attacks have proliferated and morphed, so have efforts to model the rapidly evolving risk.

From the hacker theft of credit and debit card records of 40 million Target Corp. customers in 2013 to the $81 million that cyber thieves stole in February from the central bank of Bangladesh — and numerous data breaches in between — the issue has the attention of insurers, brokers and modelers.

The cost can be substantial: Last year, Basingstoke, England-based Juniper Research Ltd. predicted that the global cost of data breaches would reach $2.1 trillion by 2019, nearly four times the estimated cost of breaches in 2015.

Scott Stransky, Boston-based assistant vice president of AIR Worldwide's research and modeling group, said the Target data breach “triggered a movement to begin to model cyber risk.”

“We really needed to acquire a lot of data before we could even think about modeling or decide how to model,” Mr. Stransky said.

Armed with subsequent agreements with Richmond, Virginia-based Risk Based Security Inc. and its database of 16,000 cyber incidents and Cambridge, Massachusetts-based BitSight Technologies and its security performance measurement technology, AIR used the “information to give our modeling a real-time view of the risk” and “help calibrate the relative vulnerabilities between certain types of risk,” he said.

That led to the Verisk Analytics Inc. unit's January release of a global cyber exposure data standard to allow clients to capture necessary cyber risk information.

“We don't expect companies will collect everything,” he said. “It's a standard that they'll have to grow into, but not one that they'll outgrow very quickly ... The standard can work on just limited information on the industry of a company and its revenue.”

Rob Savage, director of product management at Risk Management Solutions Inc. in London, said “the abundance and frequency of cyber-related attacks makes it possible to observe and identify those trends and patterns that contribute to the risk modeling process.”

In February, RMS released the Cyber Accumulation Management System, which includes cyber catastrophe loss process models for data exfiltration, distributed denial-of-service attacks, cloud service provider failures, financial thefts and cyber extortion.

The models, devised in collaboration with cyber insurers as well as the University of Cambridge Centre for Risk Studies, examine “extreme but plausible catastrophic events,” he said. They include “who is carrying out the attacks; why are they carrying out the attacks; how are they carrying out the attacks.”

In recent weeks, Guy Carpenter & Co. L.L.C. said it had formed a strategic alliance with Mountain View, California-based cyber security firm Symantec Corp. to devise a cyber aggregation model for insurers to examine their frequency and severity distributions and potential losses.

“While there is not a standardized approach to underwriting this evolving risk, there certainly are key factors that underwriters contemplate when deploying their capacity for security and privacy coverage,” Julia Chu, New York-based managing director of strategic advisory at Guy Carpenter, said in an email.

The factors include a company's prior loss experience, industry class, whether point-of-sale technology is used and the portion of business conducted in the United States versus abroad, she said.

Willis Re, which released PRISM-Re in 2015 to enable insurers to quantify and manage their portfolio exposure to data breaches, plans an update to include network outages, said Alice Underwood, New York-based executive vice president and head of analytics for North America.

“I think we all recognize that what we've got in the market right now is absolutely first-generation stuff, and it will get better over time, just as the property catastrophe models have continued to improve. But, of course, it's a different science that goes into these models, just like in the terrorism models,” she said.

Karen Clark, co-founder and CEO of Boston-based Karen Clark & Co., said it likely will take years to build a “fully probabilistic cyber model.”

“This may not be the best approach for this peril,” Ms. Clark said in an email. “Many of the emerging risks are more challenging to model than hurricanes and earthquakes, so insurers should not expect a one-size-fits-all methodology.”

Lauri Floresca, senior vice president at Woodruff-Sawyer & Co. in San Francisco, said that while modeling cyber risk has improved in the past year, cyber risk is “a constantly changing landscape.”

Jeremy Platt, New York-based senior vice president and cyber solutions specialty practice leader at Guy Carpenter, said in an email that “cyber is more similar to perils like terrorism than hurricanes or earthquakes, because the motivations, vectors and methods are evolving rapidly — and the attackers can react and respond to changing defenses and mitigation.”
