Data quality, managing change key for underwriters' predictive models

NATIONAL HARBOR, Md. — Underwriters need to emphasize data quality and change management when introducing predictive models into the underwriting process, experts said during the Insurance Accounting and Systems Association Inc.'s annual conference.

Speaking at a panel discussion on Tuesday, Rich Kirste, San Francisco-based chief technical officer and chief actuary at Berkshire Hathaway Homestate Cos., said his company introduced predictive models two years ago to help determine the likelihood of workers compensation claims.

Mr. Kirste said convincing underwriters of the necessity of the new tools can be a challenge.

“The first reaction from some underwriters is to reject it at almost a molecular level,” he said. “You have to expect that.”

Karen Moritz, manager of data science analysts at San Antonio-based United Services Automobile Association, said one key to ensuring that existing employees understand the importance of predictive models is to ensure support from the highest levels of the organization.

“If you can't get senior management buy-in, you may as well give up,” she said.

Ms. Moritz said another way she works to make sure the models are widely accepted is by employing a diverse team to create them. In addition to the eight Ph.D.-level modeling experts, her team also includes business analysts who know the business intimately to help provide a bridge to front-line underwriters.

Another primary consideration for leaders of predictive modeling initiatives is finding an abundance of quality data. Ms. Moritz said the vast majority of internal transactional data held by insurance companies is not germane for modeling.


“You need a different type of data than transactional data,” she said. “Very few companies have enough of it.”

Mr. Kirste acknowledged a preference for internal data but agreed that more is better when constructing predictive models. “How much data do you need? The technical answer is 'a lot,'” he said, adding that technology providers and third-party data sources can help supplement internal data.

“A lot of times, a vendor may have access to more data to help improve the model,” he said.

Dax Craig, president and CEO of Denver, Colo.-based modeling firm Valen Technologies Inc., warned that companies need to be judicious in their use of third-party data, noting that companies sometimes pay for data that they could obtain elsewhere for free.

“External data can be expensive,” Mr. Craig said. “If you don't get 'lift' from it, you can't justify the cost.”
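The “lift” Mr. Craig refers to is a standard model-evaluation metric: how much more concentrated the bad outcomes are among the highest-scored risks than in the book as a whole. The sketch below, on synthetic data with a hypothetical external variable (not any insurer's actual model or data), shows how a carrier might check whether a purchased data element earns its cost by comparing top-decile lift with and without it.

```python
import random

random.seed(0)

def decile_lift(scores, outcomes):
    """Claim rate among the 10% highest-scored risks,
    divided by the overall claim rate. A lift of 1.0 means
    the score is no better than random selection."""
    ranked = sorted(zip(scores, outcomes), key=lambda p: p[0], reverse=True)
    top = ranked[: max(1, len(ranked) // 10)]
    top_rate = sum(y for _, y in top) / len(top)
    overall = sum(outcomes) / len(outcomes)
    return top_rate / overall

# Synthetic portfolio: an internal-only score (here pure noise) and a
# hypothetical external variable that is mildly predictive of claims.
n = 10_000
external = [random.random() for _ in range(n)]
internal = [random.random() for _ in range(n)]
outcomes = [1 if random.random() < 0.05 + 0.10 * e else 0 for e in external]

lift_internal = decile_lift(internal, outcomes)
lift_augmented = decile_lift(
    [0.5 * i + 0.5 * e for i, e in zip(internal, external)], outcomes
)
print(f"lift, internal score only: {lift_internal:.2f}")
print(f"lift, with external data:  {lift_augmented:.2f}")
```

If the augmented score's lift is not materially above the internal-only score's, the external data is not paying for itself, which is the cost-justification test described above.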

With models in place, it is important to constantly fine-tune them, Mr. Kirste said.

“The world is much more complicated than any model I've ever seen,” he said. “It will always be a work in progress.”
