
Voice cloning: sounds like science fiction, but it’s already a reality

  • October 13, 2024


The rapid development of artificial intelligence (AI) has brought both benefits and risks. One of the worrying trends is the misuse of voice cloning. Scammers can clone a voice in seconds and trick people into thinking their friend or family member needs money urgently.


News outlets including CNN warn that millions of people could be affected by such scams. As technology makes it easier for criminals to intrude on our personal lives, it is more important than ever to be cautious about how we use it.

What is voice cloning?

The rise of artificial intelligence has created opportunities for image, text, voice and machine learning. While artificial intelligence offers many advantages, it also provides scammers with new ways to exploit people for money. You may have heard of the term “deepfake,” where artificial intelligence is used to create fake images, videos, and even audio, often involving celebrities or politicians.

Voice cloning, a type of deepfake technology, creates a digital copy of a person’s voice by capturing their speech patterns, accent and breathing from short voice samples. Once the speech model is built, an AI voice generator can convert input text into highly realistic speech that resembles the target person’s voice.

Thanks to advanced technology, voice cloning can be done with as little as a three-second voice sample, even a simple phrase like “Hello, is anyone there?”. A longer conversation lets scammers capture more details of your voice, which can then be used for fraud. It is therefore best to keep calls short until you are sure who is calling.

Voice cloning has valuable applications in entertainment and healthcare; it enables remote voice work for artists (even posthumously) and gives a voice to people with speech disabilities. However, the same technology raises serious privacy and security concerns and underscores the need for safeguards.

How do criminals use this?

Cybercriminals use voice cloning technology to impersonate celebrities, government officials or ordinary people to commit fraud. They create urgency, gain the victim’s trust, and request funds via gift cards, wire transfers, or cryptocurrency.

The process starts by collecting audio samples from sources like YouTube and TikTok. The technology then analyzes the audio to create new recordings. Once the voice is cloned, it can be used in deceptive communications, often accompanied by spoofing caller ID to appear trustworthy. Many cases of voice cloning scams have made it to the newspapers.

For example, criminals cloned the voice of a company director in the United Arab Emirates to stage a $51 million theft. A businessman in Mumbai fell victim to a voice cloning scam involving a fake call from the Indian embassy in Dubai.

Scammers in Australia recently used a voice clone of Queensland Premier Steven Miles to trick people into investing in Bitcoin. Young people and children are also targeted. In one kidnapping scam in the United States, a teenager’s voice was cloned and her parents were manipulated into complying with the demands.

It only takes a few seconds for an AI to copy someone’s voice.

How common is it?

Recent research shows that 28% of adults in the UK have experienced a voice cloning scam in the past year, while 46% were unaware such a scam existed. This highlights a significant information gap that puts millions of people at risk of fraud. Almost 240,000 Australians reported falling victim to voice cloning scams in 2022, resulting in financial losses of $568 million.

How can people and organizations protect themselves from this?

The risks associated with voice cloning call for a response on several fronts. Individuals and organizations can take various precautions to protect themselves against the misuse of voice cloning technology.

First, awareness campaigns and training can help protect individuals and organizations and reduce such fraud.

Public-private partnerships can provide clear information and consent options for voice cloning.

Second, people and organizations should use biometric security with liveness detection, a new technology that can recognize and verify a live voice as opposed to a fake one. Organizations that use voice recognition should also consider implementing multi-factor authentication.

Third, expanding investigative capabilities against voice cloning is another important measure for law enforcement.

Finally, countries need precise and up-to-date rules to manage the relevant risks. Australian law enforcement agencies are aware of the potential benefits of artificial intelligence, but concerns about the technology’s “dark side” have prompted calls for research into its criminal use and into ways of identifying victims.

There are also calls for possible intervention strategies that law enforcement can use to combat the problem. Such efforts should be linked to an overall National Cybercrime Plan that focuses on proactive, reactive, and restorative strategies.

This national plan includes a duty of care for service providers, which is reflected in the Australian Government’s new legislation to protect the public and small businesses. The legislation introduces new obligations to prevent, detect, report and stop fraud.

This will apply to regulated entities such as telecommunications companies, banks and digital platform providers. The goal is to protect customers by preventing, detecting, reporting and stopping fraud.

Risk reduction

Given that cybercrime costs the Australian economy an estimated A$42 billion, it is vital to raise public awareness and take robust action.

Countries like Australia are aware of the increasing risk. The effectiveness of countermeasures against voice cloning and other types of fraud depends on their adaptability, cost, applicability and compliance with regulatory requirements. To reduce the risk of victimization, all stakeholders (government, citizens and law enforcement) must be vigilant and raise public awareness.

Source: Port Altele
