The insurance industry is poised to capitalize on increasingly potent analytic technologies and big data, provided it can surmount persistent data storage, architecture and quality issues, experts say.
While many insurance companies have deployed predictive technologies for years in areas such as actuarial departments, the broader technology itself is just reaching maturity. In the case of big data, a class of emerging analytic technologies based on data sets too large for traditional software tools to effectively manage, the technology is more nascent still.
The 2012 Gartner Hype Cycle, an annual report tracking the adoption of emerging technologies and released in August by Stamford, Conn.-based research firm Gartner Inc., is instructive here. While the report places predictive analytics in the most mature of the cycle's five stages, the “Plateau of Productivity,” it relegates big data to the “Peak of Inflated Expectations,” only the second of those five stages.
Nonetheless, insurers need to investigate and get ready for these technologies. In a separate report released in August, Gartner vice president and distinguished analyst Kimberly Harris-Ferrante argued that the volume inherent in big data will strain the aging information technology infrastructures at many insurance companies and likely require large investments in data storage and new analysis tools.
“To be successful, it is critical that P&C and life insurers understand the emerging issues related to big data,” according to the report. “These include volume, velocity of data, complexities, data management, governance and the necessary IT funding.”
Additional challenges associated with big data will revolve around architectural changes necessary to retrieve data trapped in legacy core business systems. As such, the report recommends that insurers identify the limitations of legacy policy and claims management systems to support big data and budget accordingly.
Likewise, a report released in June from New York-based insurance consulting and advisory firm Novarica concludes the industry has significant challenges to overcome before the benefits of big data can be realized. The report, “Analytics and Big Data at Insurers: Current State and Expectations,” surveyed 86 insurance information technology executives who are members of Novarica's Insurance Technology Research Council. It found the infrastructure required to run big data wanting, with only 15% to 20% of respondents indicating their organization is preparing its technology infrastructure for big data in the near future.
The report cited fragmented data environments and a lack of investment in tools as primary impediments, and said the huge volumes of structured and unstructured data now available for analysis present a new set of challenges for insurers.
“The rising tide of ‘big data' threatens to overwhelm enterprises that haven't yet truly gotten a handle on ‘little data' (structured enterprise data),” according to the report.
Another stumbling block the report identified was corporate cultures that value received wisdom over critical analysis. Moreover, the report notes that proficiency in the use of analytics historically has been centered in a few departments within an insurance company.
“While usage of analytics across various actuarial and financial areas is generally widespread, insurers are much less advanced in using analytics for optimizing operational areas like marketing, underwriting or claims,” according to the report. “The insurers that will be best positioned to profit from the potential value of big data will be those who have created a culture where business leaders trust analytics and act on the insights provided.”
Just such a culture is emerging at insurer and reinsurer XL Group P.L.C., said Kimberly Holmes, head of strategic analytics for the company. The insurer's intent is to embed analytics in the decision management process across the enterprise to gain new insights.
“XL is committed to a strategy where big data and analytics are foundational,” she said. “It will revolutionize the way people make decisions.”
Ms. Holmes said that the true value of big data lies less in the size of the data sets used for analysis than in their novelty and accessibility.
“Big data to me is accessing and interpreting data that you never used before,” she said. “A lot of the data that was available 10 years ago wasn't accessible.”
Yet, many of the sources that likely will contribute to the largest amounts of data for analysis in the coming years did not exist a decade ago. A report released by PricewaterhouseCoopers L.L.P. in March contends data derived from sources including mobile devices, social networks and Internet-connected sensors placed on everything from cars to buildings to bridges will define the big data era.
“Insurers who intelligently harness this agglomeration of information will be able to better understand their customers and prospects, develop solutions — not push products — that address very specific customer needs, and build a foundation for a very positive customer experience,” according to the report.