
Tahir Qureshi
Viral Video: Technology is ever-evolving and making things easier for us. It is very useful in fields such as medicine, astrophysics, and communication, and we now have better equipment and machinery to help us in our daily lives. But there is always a flip side to everything, and technology is no exception. The internet, for example, is a powerful tool that has become inseparable from our daily lives, but a few bad actors misuse this wonderful technology for criminal purposes such as hacking personal accounts and spreading viruses.
Now there is a technology called "deepfake," in which a person in an existing image or video is replaced with someone else's likeness, using powerful techniques from Machine Learning (ML) and Artificial Intelligence (AI) to manipulate or generate visual and audio content. It is very difficult, often nearly impossible, to tell the fake from the original.
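To illustrate the idea behind many face-swap deepfake pipelines, here is a minimal, untrained sketch in Python: a single shared encoder maps any face crop into a common latent space, and each identity gets its own decoder. The "swap" is simply encoding person A's face and decoding it with person B's decoder. All dimensions and the random linear "networks" here are illustrative assumptions, not a real trained model.

```python
import numpy as np

# Toy sketch of the shared-encoder / per-identity-decoder idea used by
# common deepfake face-swap approaches. Nothing here is trained; the
# weights are random and exist only to show the data flow.

rng = np.random.default_rng(0)

FACE_DIM = 64 * 64   # a flattened 64x64 grayscale face crop (assumed size)
LATENT_DIM = 128     # shared latent space (assumed size)

# One shared encoder for all identities...
W_enc = rng.standard_normal((LATENT_DIM, FACE_DIM)) * 0.01
# ...and one decoder per identity (person A, person B).
W_dec_a = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.01
W_dec_b = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.01

def encode(face):
    """Map a face crop into the shared latent space."""
    return np.tanh(W_enc @ face)

def decode(latent, w_dec):
    """Reconstruct a face from a latent code with one identity's decoder."""
    return w_dec @ latent

# Training would teach encode+decode_a to reconstruct A's faces and
# encode+decode_b to reconstruct B's. The swap then needs no new step:
face_a = rng.standard_normal(FACE_DIM)       # stand-in for a crop of person A
swapped = decode(encode(face_a), W_dec_b)    # A's pose/expression, B's likeness

print(swapped.shape)
```

Because the latent space is shared, the encoder learns pose and expression while each decoder learns one person's appearance, which is why the output keeps A's expression but B's face. Real systems add convolutional networks, face alignment, and blending, but the swap step is this simple.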
Highlighting the potential dangers of deepfakes, Mahindra Group chairman Anand Mahindra has shared an AI-generated deepfake video created by a young man as a warning to show ordinary people how this technology can deceive them with altered audio-visual content. The clip demonstrates this by superimposing the faces of Indian cricketer Virat Kohli, Hollywood actor Robert Downey Jr, and Indian actor Shah Rukh Khan.
He captioned the video, "This clip which has been making the rounds is rightfully raising an alarm. How're we preparing, as a society, to guard against potentially deceptive content which at best, can be mildly entertaining, but at worst, divide us all? Can there be tech-checks that act as a safeguard?"
— anand mahindra (@anandmahindra), January 21, 2023
This is a very serious issue that should be addressed at the earliest, because one never knows who might use this tool, or when, for destructive and damaging purposes.
According to Wikipedia, "Deepfakes have garnered widespread attention for their potential use in creating child sexual abuse material, celebrity pornographic videos, revenge porn, fake news, hoaxes, bullying, and financial fraud. This has elicited responses from both industry and government to detect and limit their use."