Deep fake insurance losses will test cyber policies, cautions CyberCube


A cyber analytics specialist is warning of the danger and cost posed by deep fake videos and audio files.

CyberCube, a cyber analytics specialist, is warning the industry that deep fake insurance losses may test the strength of insurers selling cyber coverage. It pointed out that these videos and audio files are increasingly common in cyberspace and could pose a major threat to covered businesses and their insurers alike.

According to the report, the threat could become a major one within the next couple of years.

This technology gives cyber criminals powerful tools for creating false images of real politicians, business leaders and other public figures. As these fakes become fully convincing, they could end up causing substantial deep fake insurance losses.

The CyberCube report was titled “Social Engineering: Blurring Reality and Fake”. In it, the authors described how criminals can use machine learning and artificial intelligence to create highly realistic audio and video dupes. The threat has already emerged, and as the technology continues to develop and businesses become increasingly dependent on digital media, the problem is accelerating.

Current social media sharing trends give criminals ample raw material, setting the stage for deep fake insurance losses.

Because public figures in virtually every category share images, audio and video online, cyber criminals have access to a vast amount of data from which they can create convincing audio and photo-realistic simulations of a person. These simulations can then be used to manipulate and influence people.


Furthermore, “mouth mapping” technology, developed at the University of Washington, can make the simulations even more convincing by mimicking an individual’s unique mouth movements during speech. This level of accuracy makes false videos even harder for human viewers to detect.

Darren Thomson, author of the CyberCube report and the company’s head of security strategy, says criminals are already investing in the technology they need to exploit this trend, which could lead to costly deep fake insurance losses.

“New and emerging social engineering techniques like deep fake video and audio will fundamentally change the cyber threat landscape and are becoming both technically feasible and economically viable for criminal organizations of all sizes,” warned Thomson.
