If deep-fake videos are used illegitimately to spread false information in the public domain, then truth, morality & integrity naturally fall

Detect deep-fake videos using Artificial Intelligence (AI) technology

Most of us have watched many Tollywood or Hollywood movies 🎞️ such as Rudhramadevi, Nisshabd (Reaching Silence), Avengers: Endgame, and World War Z. Have you ever wondered what these films have in common? Lip-sync technology (a deep-fake video tool).

Many of us aren’t aware of these tools and happily fall into the beautiful trap while watching movies. Deep-fake videos are almost impossible to spot, not only for you but also for many computer-based systems. Are you keen to know more about these tools before moving on to our main topic?

Example of deep-fake videos: Avengers: Endgame

I am sure you are! So, why wait? Let’s illustrate the above-said technology.

Lip-sync technology

The tool can effortlessly insert words that a person never said, even mid-sentence, or eliminate words entirely. With its aid, you can manipulate a much smaller part of the image and then synthesize lip movements that closely match the way a person’s mouth really would have moved if he or she had said those particular words. Take a quick look at the positive & negative sides of the technology.

Selfishness is the greatest curse of the human race.

William E. Gladstone (Late Prime Minister of the United Kingdom)

Positive Side

Lip-sync technology is effectively a beneficial tool for video editors 📽️ to fix glitches without re-shooting entire scenes, in addition to tailoring TV shows or movies for different audiences in different places.

The functionality of manipulating videos can be easily understood through any fictional TV show, movie, or commercial. Lip-sync technologies can save considerable time and money by using digital tools to clean up mistakes or tweak scripts.

Example of lip-sync technology: Baahubali: The Beginning

Puzzled? Let’s elaborate on the tool with the example of the Indian epic movie Baahubali: The Beginning. Originally, the movie was shot in Telugu and Tamil, but it is also available to audiences in other languages such as Hindi and Malayalam. This is possible because of the marvelous lip-sync technology.

Negative Side

The situation becomes worse when mind-blowing technology falls into the wrong hands. Then, deep-fake videos are created for the express purpose of distorting the truth.

The problem becomes a tsunami when deep-fake tools are illegitimately used to spread false information in the public domain. There are countless examples of the negative effects: winning elections, boosting the TRP of false news, inciting riots, or tarnishing the image of a reputed personality.

Read More 👉 Discovery of Security Ink to curb fake currency notes & passports

Deep-Fake Video Tools

Lip-sync technologies: Effortlessly insert words that a person never said in the video.
Face-swapping: Superimpose one person’s face over the video of someone else. (A computer can detect these more easily because the tools are relatively crude and leave digital or visual artifacts.)

Think! Question everything! Don’t believe everything you see or hear.

Research on Deep-Fake Videos

Deep-fake videos of real people

Prof. Maneesh Agrawala and his team at Stanford University and UC Berkeley have devised an Artificial Intelligence (AI)-based approach to detect lip-sync deep-fake videos.

He teamed up on a detection tool with Ohad Fried (a postdoctoral fellow at Stanford); Hany Farid (a professor at UC Berkeley’s School of Information); and Shruti Agarwal (a doctoral student at Berkeley).

According to the study, titled Detecting Deep-Fake Videos from Phoneme-Viseme Mismatches, the program is claimed to spot more than 80% of fakes by recognizing minute mismatches between the sounds people make and the shape of their mouths.

How to detect Deep-Fake Videos?

To spot a deep fake, researchers looked for inconsistencies between mouth formations and phonetic sounds

To spot unethical uses of such technology, the researchers sought out inconsistencies between mouth formations (visemes) and phonetic sounds (phonemes). They looked at the person’s mouth when making the sounds of a “B,” “M,” or “P,” because it’s almost impossible to make those sounds without firmly closing the lips.
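To make the idea more concrete, here is a minimal Python sketch of how such a phoneme-viseme check could work. It is not the researchers’ actual method: it assumes you already have a phoneme timeline (for example, from a speech aligner) and a hypothetical mouth_openness() helper that measures the lip gap from facial landmarks, and the threshold value is purely illustrative.

```python
# Minimal, illustrative sketch of the phoneme-viseme mismatch idea:
# for bilabial phonemes ("B", "M", "P") the lips must close, so frames
# where they clearly do not close are suspicious.

BILABIALS = {"B", "M", "P"}
OPENNESS_THRESHOLD = 0.15  # arbitrary example value; a real system would tune this


def suspicious_segments(phoneme_track, mouth_openness, fps=25):
    """Flag time spans where a bilabial phoneme is spoken but the lips
    never come close to closing -- a possible phoneme-viseme mismatch.

    phoneme_track : list of (start_sec, end_sec, phoneme) tuples
                    (assumed to come from a speech-to-phoneme aligner)
    mouth_openness: callable returning the lip gap (0.0 = fully closed)
                    at a given timestamp (assumed to come from landmarks)
    """
    flagged = []
    for start, end, phoneme in phoneme_track:
        if phoneme not in BILABIALS:
            continue
        # Sample the lip gap at each frame covered by the phoneme.
        n_frames = max(1, int((end - start) * fps))
        openness = [mouth_openness(start + i / fps) for i in range(n_frames)]
        if min(openness) > OPENNESS_THRESHOLD:  # lips never closed
            flagged.append((start, end, phoneme))
    return flagged


if __name__ == "__main__":
    # Toy data: timestamps in seconds with phoneme labels.
    track = [(0.0, 0.1, "B"), (0.1, 0.3, "AH"), (0.3, 0.4, "M")]
    always_open = lambda t: 0.4  # pretend the lips never close
    print(suspicious_segments(track, always_open))
    # -> [(0.0, 0.1, 'B'), (0.3, 0.4, 'M')]
```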

Real Truth Test: Researchers openly said that their approach is merely part of a cat-and-mouse 🐱🐭 game. As deep-fake techniques improve, they will leave even fewer clues behind.

Agrawala said:

“Detecting whether a video has been manipulated is different from detecting whether the video contains misinformation or disinformation, and the latter is much, much harder.”

Last but not least, we need to increase media literacy among people and develop strict laws against those who intentionally produce disinformation and create fake breaking news everywhere. For more tutorials, please visit Techno Savie’s How To section.


Megha Jain

Engineer & Freelance Author

Straightforward