Behind the scenes: Making of Sachin Tendulkar’s deepfake video

On his 50th birthday, Sachin Tendulkar sat down for a lunchtime interview with the lifestyle YouTube channel Curly Tales. Eight months later, a deepfake video of the cricketer went viral on social media, depicting him endorsing a shady gaming app called Skyward Aviator Quest. The morphed interview footage was used to promote the app, and the clip also falsely asserts that his daughter, Sara, is profiting from it.

Deconstructing the Sachin Tendulkar deepfake

India Today’s OSINT (Open-Source Intelligence) team scrutinised the video frame by frame to decode how it was created. In our analysis, we sifted through numerous videos on YouTube to find the specific interview in which his attire matched the deepfake advertisement.

Upon analysing individual frames, we found that the segment at the 34.04-second mark corresponds to the portion used in the advertisement, as is evident from the matching hand and face movements. Further, the altered video masks certain parts of the original segment to hide discrepancies in lip-syncing.
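This kind of frame matching can also be reproduced programmatically. Below is a minimal sketch of the approach, assuming Python with OpenCV; the file names are hypothetical stand-ins for the original interview footage and a frame extracted from the advertisement.

```python
# Minimal sketch: find the timestamp in the original interview whose frame
# best matches a frame lifted from the suspect advertisement.
# Assumes OpenCV (pip install opencv-python); file names are hypothetical.
import cv2

ad_frame = cv2.imread("ad_frame.png", cv2.IMREAD_GRAYSCALE)

cap = cv2.VideoCapture("interview.mp4")
fps = cap.get(cv2.CAP_PROP_FPS)

best_score, best_time, idx = -1.0, 0.0, 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.resize(gray, (ad_frame.shape[1], ad_frame.shape[0]))
    # Normalised cross-correlation: scores near 1.0 indicate a close match
    score = cv2.matchTemplate(gray, ad_frame, cv2.TM_CCOEFF_NORMED)[0][0]
    if score > best_score:
        best_score, best_time = float(score), idx / fps
    idx += 1
cap.release()

print(f"Closest match at {best_time:.2f}s (score: {best_score:.3f})")
```

Under these assumptions, a peak score near the 34-second mark would corroborate the match identified manually.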

A vast array of celebrity audio is available in the public domain, and these recordings are illicitly used to clone voices with AI (Artificial Intelligence) tools. The altered videos, mostly the work of meme-makers and marketers, go viral on social media sites like Instagram and Twitter. The content they produce is sometimes called a “cheap fake”: celebrity voices are cloned, mouth movements are altered to match the substitute audio, and persuasive dialogue is scripted.

Unravelling the Entity Behind the Deepfake Video

The ‘Skyward Aviator Quest’ app is a new addition to the App Store, having been launched less than a month ago. While conducting a domain investigation, India Today’s OSINT team discovered that although the launch date on the App Store is December 18, 2023, the corresponding website was established only two days earlier, on December 16, 2023. This close timing between the app’s launch and the website’s creation adds to the dubious nature of the app.
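Such a domain-age check typically relies on WHOIS records. The sketch below shows one way to pull a domain’s registration date, assuming the third-party python-whois package; the domain name is a placeholder, not the app’s actual website.

```python
# Minimal sketch of a WHOIS-based domain-age check.
# Assumes the third-party python-whois package (pip install python-whois);
# "example-gaming-site.com" is a placeholder, not the app's actual domain.
import whois

record = whois.whois("example-gaming-site.com")

# creation_date may be a single datetime or a list, depending on the registrar
created = record.creation_date
if isinstance(created, list):
    created = created[0]

print(f"Registered on: {created}")
print(f"Registrar:     {record.registrar}")
```

A registration date only days before the associated app’s launch is a common red flag in investigations of this kind.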

When downloaded from the App Store, the ‘Skyward Aviator Quest’ game unexpectedly redirects to another page called 1Win, a hosting platform for various gaming, casino, and betting applications. Evidently, 1Win operates on the App Store masquerading as ‘Skyward Aviator Quest’.

Records obtained from OpenCorporates, an open-source corporate database, show the entity is operated by 1WIN NV, a company registered in Curacao, a Dutch Caribbean island. The company’s website asserts its legality in India and other countries, stating it has obtained a Curacao licence permitting it to accept sports bets and conduct online casino gambling.

The platform has a staggering 4,23,562 subscribers on Telegram and 150k followers on Instagram. We encountered several similar videos advertising the betting games on the 1Win platform, promoting the idea of investing 300 INR to win 3,00,000 INR. However, after the controversy broke, these videos were removed.

The deepfake video flagged by Sachin Tendulkar on his Twitter account was not the only one. Further research into the entity’s social media handles led us to another video on an Instagram handle, this one also showing Tendulkar promoting the same betting app. Its source appears to be an original Paytm advertisement featuring the cricketer. Through a similar process of voice cloning and video morphing, the original Paytm advertisement was rendered into a cheap-fake promotional piece for the betting app.

Another deepfake video of Sachin Tendulkar promoting gaming apps

India is just one of many countries where 1Win operates. Our research has uncovered its operations in several other nations, including Bangladesh, Nigeria, Pakistan, and Australia, as well as in Chile, Peru, Mozambique, Canada, Burkina Faso, Benin, Egypt, Brazil, Colombia, Mali, Senegal, Mexico, and Argentina.

What are gamers saying?

In the comments section of the advertisement, a concerning trend becomes apparent: numerous individuals report being unable to withdraw their winnings after depositing money to play. Users have raised similar complaints on platforms such as Quora, the App Store, and Instagram, highlighting the game’s addictive nature and the difficulty of withdrawing funds.

To tackle these scams, the Ministry of Electronics and Information Technology (MeitY) released an official advisory in December 2023, urging all intermediaries to comply with the existing IT rules. The advisory specifically highlights concerns regarding the spread of misinformation through AI, particularly the threat posed by deepfakes.

Rajeev Chandrasekhar, Minister of State for Information Technology, responded to the controversy hours after Sachin Tendulkar’s tweet. He said, “There is no separate regulation for deepfakes, existing regulations already cover it under Rule 3(1)(b)(v) of IT Rules, 2021. We are now seeking 100% enforcement by the platforms and for the platforms to be more proactive, including alignment of terms of use and educating users of 12 no-go areas, which they should have done by now but have not. As a result, we are issuing an advisory to them.”

Proliferation of Voice Cloning Apps

Many of the video clips featuring synthesised voices appeared to use technology from ElevenLabs, an American start-up co-founded by a former Google engineer. In 2022, the company debuted a speech-cloning tool that can be trained to replicate voices in seconds. ElevenLabs later posted on Twitter that it would introduce new safeguards, such as limiting voice cloning to paid accounts and providing a new AI voice-detection tool.

In the arena of electoral politics, such voice cloning can be put to dangerous use, spreading misinformation with unprecedented ease: just clone the voice of any political leader, superimpose the audio onto an existing video clip, and share.

As uncovered in our previous investigation, an array of social media profiles creates AI voice clones of Prime Minister Narendra Modi singing songs in multiple languages. Shared without a watermark, such clips are likely to fool many a layperson.

The technology used to create AI voices has gained traction and wide acclaim since companies released a slate of new tools last year. Audio fakes have rapidly become a new weapon on the online misinformation battlefield, threatening to turbocharge political disinformation ahead of the 2024 elections by giving creators a way to put conspiracy theories into the mouths of celebrities, newscasters, and politicians.

Published By: Rishabh Sharma
Published On: Jan 16, 2024




