Insurers shun user-generated content

User-generated content presents a growing and evolving exposure for traditional media companies — one that insurers do not have much of an appetite for covering.

A key risk for media outlets occurs when an individual posts an item on social media and news organizations report the item without following their traditional fact-checking processes, according to experts.

“We’ve seen organizations have to retract very quickly after jumping on a story, publishing it and realizing how embarrassing it was because they relied on something they thought was accurate and realistic and in some cases turned out not to be,” said Thomas Srail, Cleveland-based executive vice president and tech, media and telecom practice leader, North America, for Willis Towers Watson P.L.C.

Underwriters are examining the extent to which news organizations are moving away from traditional newsgathering policies and practices in the shift toward more real-time publication of news, particularly with the use of user-generated content, because that “could arguably raise the risk and raise the chance of litigation and liability,” he said.

Underwriters confirmed they have serious concerns about this trend.

“One thing that clients need to watch out for is the appetite of insurers for user-generated content,” said Angela Weaver, London-based media liability underwriter/media specialist for Beazley P.L.C. “It’s not necessarily a policy exclusion, but they should make sure the insurer understands what the business model is and to make sure if they do have any user-generated content that their liability for that is covered.”

Underwriters previously asked for premoderation of user-generated content, but “that’s just not feasible nowadays,” due to the volume of content, she said.

A key risk management response is to quickly remove any offending items, said David Finz, New York-based media product leader, senior vice president and client advisor with the Marsh L.L.C. E&O Center of Excellence.

When it comes to copyrighted material, the Digital Millennium Copyright Act of 1998 offers safe harbor provisions that can shield companies from liability specifically for copyright infringements by their website’s users, as long as they have effective notice-and-takedown procedures and promptly remove the infringing material.

“There isn’t necessarily a take-down prerequisite when it comes to other user-generated content that’s perhaps libelous,” Mr. Finz said. “If you’re hosting comments or allowing users to upload content, it’s imperative for media organizations to monitor their websites or chatrooms or whatever electronic forums that they’re hosting to make sure that the content that’s being posted does not create a liability for them.”

Media companies cannot be sued for merely hosting defamatory comments online, per Section 230 of the federal Communications Decency Act of 1996, but can be sued for repeating a defamatory statement posted in other forums, according to legal experts.

For example, a newspaper publisher would be liable for a defamatory statement contained in a letter to the editor it printed, even if the statement is attributed to the writer, said David Greene, civil liberties director and senior staff attorney with the Electronic Frontier Foundation in San Francisco. An online publisher, however, would not be liable for defamatory statements posted by readers in the comments section of a news story, but it would be at risk if it used those comments as source material for its own story, he said. And there are exceptions to Section 230 protections, including claims involving intellectual property and federal criminal law violations such as child pornography, Mr. Greene said.

“It doesn’t eliminate the risk, but it does make it more manageable,” he said.
