
A call that sounds like a loved one? Decoding the dangers of AI voice cloning scams


On a quiet Delhi afternoon, Laxmi Chand Chawla’s phone rang. Since it came from an unknown number, Chawla was reluctant, yet he answered. On the other end of the line, a man claiming to be a cop told Chawla that his nephew, Kapil, had been taken into custody over a sexual assault case. “We are putting Kapil on the line,” the cop said, and moments later, Chawla could hear the panicked, frail voice of his nephew. Kapil told Chawla that he was innocent and did not know what to do. The cop took over the call and told Chawla that the matter could be hushed up by paying Rs 70,000 to the other party in exchange for not pressing charges.

Chawla and his wife, Santosh, managed to collect around Rs 50,000 and transferred the money in a bid to help their nephew. At the same time, they tried reaching Kapil’s parents, but could not get through. Later, the ‘cop’ called again, this time demanding Rs 2 lakh. The couple sensed foul play and reached out to Kapil’s family once more. Much to their relief, they found that Kapil was safe at home and unaware of any police case or phone call. The couple realised they had been scammed.

The callers had cloned the voice of Kapil to dupe his family. AI voice cloning is becoming increasingly common, with many bad actors using it to scam unsuspecting users. These scammers are not only stealing money but are also manipulating people’s fears and vulnerabilities through advanced technology.


In a similar case, Mumbai resident KT Vinod got a call from someone claiming to be from the Indian Embassy in Dubai. Seconds later, Vinod could hear the cries of his son Amit, who sounded scared. “Please, bail me out,” Amit said. Fearing for his son, Vinod did not hesitate when the caller insisted that he pay Rs 80,000 immediately. He paid the sum, and only realised it was a scam after finding out that his son was safe at home. The voice he had heard over the call was generated using AI to sound like Amit. Even though Vinod reported the incident, the emotional toll persists.

AI voice cloning has gained momentum in the last few years; the term now attracts around 23,000 searches a month. According to AIPRM, a company specialising in AI prompts, AI voice cloning was among the fastest-growing scams of 2024, and 70 per cent of adults are not confident they could tell a cloned voice from the real one. Perhaps this explains the spate of AI voice cloning scams in recent times. “Scammers need just three seconds of audio to clone a person’s voice and use it for a scam call,” said Christoph C Cemper, founder of AIPRM.

“Even something as simple as repeatedly saying ‘hello’ during a blank call can give scammers enough data to replicate your voice. It is that easy and even dangerous,” says Sagar Vishnoi, co-founder of Future Shift Labs and an AI and cybersecurity expert.

Tips to identify AI voice scams:

🎯 The caller will typically claim to be a friend, family member, colleague, or someone else you know. Ask them a question that only the real person would know the answer to, or agree in advance on a secret phrase known only to the two of you. If the caller cannot give the correct response, it is likely a scammer.


🎯 If you hear your friend or loved one’s voice only briefly, it could be a warning sign: scammers often keep the cloned voice on the line for just a few moments, knowing that the longer it is used, the higher the risk of the receiver catching on.

🎯 A call from an unknown number can itself be a strong indication of a scam, as AI voice scammers often make unsolicited calls from unfamiliar numbers. If the caller claims to be from a company or to be someone you know, hang up and call them back on a known number, either from your contact list or from the company’s official website.

🎯 Be mindful of what you share. Avoid sending voice notes or personal videos to strangers online, because once your voice is out there, it is incredibly easy to misuse.

“Whenever you receive a suspicious call, stay calm. Talk to your family about a secret code or phrase only you know. It’s one of the simplest ways to stay a step ahead of voice-based scams,” says Sagar Vishnoi.


“AI scams have seen a huge rise in recent years, but 2025 may prove to be the most dangerous year yet, with developments in AI and scammers’ tactics growing more sophisticated. As a result, understanding how to detect and avoid falling victim to these scams is crucial to prevent fraud and financial loss. It is crucial to follow the above advice and take caution if you receive any unexpected calls or texts that seem too ‘urgent’ or don’t feel right. However, some people will unfortunately be caught out by fraudsters,” Cemper added.

Cemper’s recommendations:

🎯 Register a complaint: Report the scam to a government agency that deals with cybercrime. In India, file a complaint on the National Cyber Crime Reporting Portal (cybercrime.gov.in) or call the 1930 helpline, providing as much information about the scam as possible.

🎯 Halt transactions: Freeze your bank cards immediately; this is a quick and essential step to ensure scammers cannot access your financial accounts or apply for loans in your name.

🎯 Change passwords: Change your passwords, especially if you reuse the same password across multiple accounts, and make sure each one is strong and unique. It is also a good idea to enable two-factor or multi-factor authentication to add an extra layer of security.


🎯 Report AI scams: It is crucial to report AI scams, even if you feel embarrassed or think the amount is too small to warrant action. No matter how big or small the scam, reporting it not only helps you but also builds data on scams, allowing authorities to take action against fraudsters.

“These scams work not because people are careless but because the emotional weight of hearing a loved one in distress can override logic. Scammers are exploiting this with increasing precision. AI tools can now recreate voice tone, emotion, and even pauses with frightening accuracy, and they only need a short clip to do it. This changes how we think about trust and verification. From everyday people to businesses, we need new rules of engagement. Simple callbacks, identity confirmation, and extra checks may feel inconvenient, but they’ll become essential,” suggests Apurv Agrawal, co-founder & CEO of SquadStack.

According to Agrawal, better detection tools and safeguards in critical workflows are a must, along with greater public awareness around AI-generated voices and impersonation scams. “In particular, companies providing AI-driven customer experience solutions must recognise that the key to combating voice scams lies in real-time detection and multi-layered identity checks that go beyond traditional authentication. AI gives us powerful tools, but it also lowers the cost of deception. We need to keep up not just technologically, but emotionally and socially too.”

The Safe Side

As the world evolves, the digital landscape does too, bringing new opportunities—and new risks. Scammers are becoming more sophisticated, exploiting vulnerabilities to their advantage. In our special feature series, we delve into the latest cybercrime trends and provide practical tips to help you stay informed, secure, and vigilant online.


