
Risk managers hampered in risk analysis by insurers’ inconsistent data formats


The lack of standardization in claims and exposure data restricts risk managers’ ability to efficiently analyze their losses and risks, experts say.

Insurers tend to use their own data formats, leaving policyholders, their third-party administrators or risk management information systems consultants to “scrub” the data to present an aggregate picture, which can be expensive, time-consuming and incomplete.

“I think it’s a problem right now, especially when you’re having to use multiple carriers’ risk management systems,” said Kristen Peed, director of risk management for Cleveland-based CBIZ Inc.

“It kind of boils down to, they don’t use the same kind of nomenclature, so it’s not easily compared from one carrier to the next, so if I switch (insurers) it may be harder to understand how that would translate from year to year,” said Ms. Peed, who recently joined the board of the New York-based Risk & Insurance Management Society Inc.

“If you have multiple carriers and you’re trying to import that information, many fields are the same, but there may be missing fields for some, there may be extra fields for others, so there’s just no way you’re going to have a standardized data coming from all sources,” said Randy Nornes, Chicago-based executive vice president with Aon Risk Solutions.
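As a rough illustration of the “scrubbing” that aggregation requires, the short Python sketch below maps claim records from two hypothetical carriers, each using its own field names, onto a single common layout. The carrier formats, field names and values here are invented for the example and do not reflect any actual insurer’s feed.

```python
# Illustrative only: normalize claim records from carriers that use
# different field names into one common schema before aggregation.

# Hypothetical per-carrier field mappings (common name -> carrier-specific field);
# these formats and names are invented for the example.
FIELD_MAPS = {
    "carrier_a": {"claim_id": "ClaimNumber", "loss_date": "DateOfLoss", "paid": "PaidAmount"},
    "carrier_b": {"claim_id": "claim_ref", "loss_date": "dol", "paid": "total_paid"},
}

def normalize(record: dict, carrier: str) -> dict:
    """Map one carrier-specific record onto the common schema.
    Fields missing from a carrier's feed come through as None; extra fields are dropped."""
    mapping = FIELD_MAPS[carrier]
    return {common: record.get(source) for common, source in mapping.items()}

raw_records = [
    ("carrier_a", {"ClaimNumber": "A-100", "DateOfLoss": "2017-06-01",
                   "PaidAmount": 12500.0, "AdjusterCode": "X9"}),   # extra field: dropped
    ("carrier_b", {"claim_ref": "B-77", "dol": "2017-07-15"}),      # missing paid amount: None
]

aggregated = [normalize(record, carrier) for carrier, record in raw_records]
total_paid = sum(r["paid"] or 0.0 for r in aggregated)
print(aggregated)
print(f"Total paid across carriers: {total_paid:,.2f}")
```

Even in this toy version, the mapping table has to be rebuilt for every new carrier or format change, which is the recurring cost the experts describe.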

Stephen Rhee, Chicago-based chief digital officer for Arthur J. Gallagher & Co., said progress has been made in analyzing data on relatively uncomplicated claims, such as slips and falls, when information is submitted through an online form rather than in a handwritten description.

The issue, though, becomes more complicated with more complex coverages, observers say. “I’m thinking of cyber insurance, in particular,” where there are no standard forms, Ms. Peed said. When claims data is not truly aggregated, she said, “it’s really hard to assess what the exposure might be.”

Carol Castelloni, vice president of standards at the Pearl River, New York-based Association for Cooperative Operations Research and Development, said cyber, which was once part of other coverages but has now become a broader stand-alone coverage, “creates the need of continually enhancing and changing how we code the information, both on the policy side and the claims side.”

“We’re having active conversations with members to look at how we structure some of that unstructured information,” she said.
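One simplified way to picture “structuring some of that unstructured information” is deriving a coded field from a free-text claim description. The Python sketch below uses basic keyword rules; the cause codes and rules are hypothetical and far cruder than any actual industry coding standard.

```python
# Illustrative only: assign a coarse, hypothetical cause-of-loss code to a
# free-text claim description using simple keyword rules.
import re

# Hypothetical keyword-to-code rules; not an actual industry coding scheme.
CAUSE_RULES = [
    (re.compile(r"\b(ransom|phish|malware|breach)", re.I), "CYBER"),
    (re.compile(r"\b(slip|fall|trip)", re.I), "SLIP_FALL"),
    (re.compile(r"\b(hurricane|flood|wind)", re.I), "WEATHER"),
]

def code_description(text: str) -> str:
    """Return the first matching cause code, or 'UNCODED' if nothing matches."""
    for pattern, code in CAUSE_RULES:
        if pattern.search(text):
            return code
    return "UNCODED"

print(code_description("Employee slipped on a wet warehouse floor"))   # SLIP_FALL
print(code_description("Phishing email led to a network breach"))      # CYBER
```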

The issue is not necessarily just among insurers, but can exist even within different insurance departments, “so they even struggle internally,” said Janet Dell, CEO of Marsh ClearSight L.L.C., a risk management information systems firm, in Chicago.

TPAs are a factor as well. Chris Knight, Atlanta-based analytics practice leader for Beecher Carlson Holdings Inc., the large-account unit of brokerage Brown & Brown Inc., said, “A client may work with a third-party administrator and they manage their data and have their own data systems and metrics,” but there may be inconsistencies in the data metrics when a brokerage firm looks at clients’ data from multiple TPAs.

Robert Petrie, president and CEO of Chicago-based Origami Risk L.L.C., a risk management information systems firm, said, “I’ve been in the risk management world for 25 years, and if you’d asked me 25 years ago whether we’d still be processing data from multiple insurers and TPAs” with the same technology, “I would have thought that was just a ridiculous proposition.”

Ms. Dell said the problem arose because “there’s been a lot of merger and acquisition activity in the industry” and “everyone has their own formats that they wanted to use.” While some organizations have tried to develop an industry standard, there have been “various degrees of adoption because of the time and money involved,” she said.

For a legacy insurer, “your systems were built typically for the rating model that you filed” and captured data around that model because it helped determine pricing, Mr. Nornes said.

The problem is particularly acute for midsize and small firms, experts say.

Larger firms will use TPAs “because they’re retaining a lot of the risk themselves,” which means there is more flexibility on the data fields captured, Mr. Nornes said.

However, for midmarket and smaller companies that largely rely on insurers to manage claims, and where insurers take a lot more risk, “you really don’t have any flexibility in what information is captured or shared,” Mr. Nornes said.

Patrick J. O’Neill, president of Redhand Advisors L.L.C., an Atlanta-based information technology consulting firm, said some clients can rely on a RMIS vendor “to do a lot of that aggregation for them, but I would think for a significant portion of the marketplace that’s cost-prohibitive to them, and the main reason is there’s no standard out there around the TPA.”

Mr. O’Neill said, “Smaller companies change insurance all the time. They’re price buyers, so the more you change, the more data points you have, which obviously drives the cost of aggregating your data.” 

 

 
