
Industry gears up for AI decision-making


While artificial intelligence promises to make sweeping changes across society, insurance industry experts warn that disruption will accompany that change as AI devices begin making their own decisions.

AI will have an impact on such diverse areas as the economy, the environment, politics and the legal system, according to experts.

“We’ve seen this with industrial and technical revolutions over the years and decades and eons,” said Thomas Srail, technology, media and telecom industry leader in North America for Willis Towers Watson P.L.C. in Cleveland. “But clearly there’s risk in the short term for companies and organizations caught in the crossfire. Certain companies will be rendered less useable in the economy, in our society, but others will grow out of that.”

A report released last week by Allianz Global Corporate & Specialty S.E., The Rise of Artificial Intelligence: Future Outlook and Emerging Risks, said AI “will disrupt the labor market, changing the nature of long established roles, and could be used to influence political thinking and opinion.”

The Allianz report differentiates between so-called “weak” AI, such as customer service chatbots in the insurance industry, and “strong” AI agents that display human-like intelligence and the intrinsic ability to generalize and create new concepts. Strong AI agents are expected to be on the market sometime around 2040, the report said.

“The underlying topic is that we are shifting human decision-making processes from human beings to machines,” said Michael Bruch, Munich-based head of emerging trends/ESG business services for Allianz Global Corporate & Specialty. “How to program these machines will be very challenging in the future, that’s for sure.”

Autonomous vehicles are moving into the transportation sector, and the Allianz report noted that “despite the promise of streamlined travel, AI also brings concerns about who is liable in case of accidents and which ethical principles autonomous transportation agents should follow when making decisions with a potentially dangerous impact to humans.”

Mr. Bruch cited an example of a car accident where the vehicle has to either strike a child in the road or turn into the path of a tractor-trailer in the opposite lane of traffic.

“There’s a kind of ethical dilemma in how to program the software in such critical cases,” he said. “Governing bodies and regulatory bodies will have to give guidance to data privacy issues and ethical questions.”

“We’re talking about software that teaches itself what to do and what decisions can be made,” Mr. Srail said. “I’m not jumping directly to the ‘Terminator 2: Judgment Day’ scenario, but that is a concern because clearly there are biases in software and programming and decision-making that humans enter into the algorithm and into the decision-making that software will do.”

However, he added, “at the point where software starts to make those decisions and determinations by itself and starts learning on its own and programming itself to do things in certain ways on its own, that’s very different.”

Autonomous vehicles made news last week when a self-driving Uber vehicle struck and killed a woman crossing the street in Tempe, Arizona. The accident is under investigation.

Autonomous vehicles could also cause a shift in how liability is assigned in the automotive industry in the event of a collision, Mr. Srail said.

“Is it the designer of the components or the software running those hardware components — is the failure there? Or is it the manufacturer, the person who put the components in that vehicle, is it their liability? Is it the owner of that vehicle?” he said.

David Laks, Toronto-based vice president and risk control services manager for Eastern Canada for Chicago-based Hub International Ltd., suggested that with a collision, “everyone is going to get pulled in.”

“There’s going to be the vehicle manufacturer, the person providing the technology, the person involved in the incident, even the municipality that allowed that vehicle to be test driven in that particular area,” he said. “It’s a long and very expensive legal battle.”
