South African broadcaster and actress Thami Ngubeni has distanced herself from a deepfake video that encourages people to invest in risky financial schemes.
On Thursday, the eNCA anchor issued a statement warning her social media followers to ignore videos circulating on various platforms that use her likeness to promote financial schemes.
Warning and Disclaimer: I categorically distance myself from any videos currently in circulation that encourage individuals to invest in a particular financial system. This broadcast never took place. It's a deepfake. It's a scam. Be wary of videos featuring my likeness… pic.twitter.com/c0LM7BI75o
— Thami Ngbeni (@LifeWithThami) November 28, 2024
Speaking to The Citizen on Friday morning, Ngubeni said she first learned about the video from a former colleague who was interested in investing in what it was selling.
“I hadn't seen the video yet at that point. The next day, I got a call from a journalist who said her mother had seen the video and wanted to invest. She immediately knew it was a lie, told me about it and sent me the video,” Ngubeni said.
What is a deepfake?
Deepfake content includes images, videos and audio that are edited or generated using artificial intelligence (AI) tools, and may depict real or non-existent people.
In 2023, an image of long-time SABC anchor Leanne Manas was used to promote a scheme similar to the one using Ngubeni's likeness.
“There are also ethical issues that need to be considered as part of calls for stronger regulation of the use of AI,” Ngubeni said.
In an article first published in The Conversation, Lykan van Gensen, a lecturer in commercial law at Stellenbosch University, writes that it is concerning that the South African government has yet to take legal action to combat deepfakes.
Earlier this year in the UK, The Guardian reported that nearly 4,000 celebrities had been victims of deepfake porn.
American singer Taylor Swift was also a victim, with AI tools used to depict her image in a pornographic light.
Deception
Ngubeni said she did not know who made the video using her image, and that Facebook had not helped her remove it when she contacted the platform.
“I reported this to Facebook, but their response has been horribly inefficient and has shown no desire to fight this,” Ngubeni said.
“Previous attempts by other colleagues to obtain support from Facebook have proven futile. This is a matter of global concern.”
Ngubeni said the use of her image highlights the dangers of an unregulated playing field as far as AI is concerned.
“If left unchecked, future destruction could be catastrophic, affecting the rights, integrity and dignity of individuals and nations. That is another level of deception.”