Chairman of the United States Senate Special Committee on Aging Bob Casey (D-PA) hosted a hearing today titled “Modern Scams: How Scammers Are Using Artificial Intelligence & How We Can Fight Back.” The hearing focused on how scammers can use artificial intelligence (AI) to deploy scams and convince targets that they are legitimate, as well as how AI technology is being used to improve the next generation of fraud-detection systems.
During the committee hearing on AI scams on Thursday, November 16, 2023, committee chairman Sen. Bob Casey, D-Pa., released the committee’s annual fraud book, which highlights the top scams reported over the previous year. The FBI found that between January 2020 and June 2021, “individuals reportedly lost $13 million to grandparent and person-in-need scams.”
According to the Senate Committee on Aging’s annual report, released this month, older Americans lost $1.1 billion to fraud in 2022, with the majority of the scams using AI technology to replicate the voices of people the victims knew, along with other artificially generated ploys.
Sen. Elizabeth Warren, D-Mass., a committee member, said the $1.1 billion figure in overall losses is “almost certainly an underestimate,” because it does not account for victims who do not report fraud out of embarrassment.
In a statement, Casey said that “federal action” is required to put up barriers to safeguard consumers from AI-generated scams. There are currently few laws governing AI capabilities, a gap that witnesses asked lawmakers to address through legislation. “Any consumer, no matter their age, gender, or background, can fall victim to these ultra-convincing scams, and the stories we heard today from individuals across the country are heartbreaking,” he said. “As a parent and grandparent, I relate to the fear and concern these victims must feel.”
Financial impersonation and fraud, automated calls, online scams, romance scams on dating profiles, identity theft, and other schemes were among the top ten categories of scams recorded in the fraud book. The most prominent frauds used AI technology to imitate the voices of family members or loved ones and then called victims to request money. Several individuals testified during the hearing that they received calls in which what sounded exactly like a loved one claimed to be in danger, injured, or being held hostage.
Tahir Ekin, Ph.D., director of the Texas State Center for Analytics and Data Sciences, stated at the hearing that this deliberate tactic of impersonation boosts “their believability and emotional appeal.” “Prioritizing the enhancement of data and AI literacy among older Americans and actively involving them in prevention and detection efforts stands as a cornerstone,” Ekin said.
One older couple, featured in video testimony at the hearing, received a call from someone they mistook for their daughter. The caller seemed distressed and asked for help. Terry Holtzapple, one of the victims, said, “My daughter was, she was crying on the phone, profusely crying and saying, ‘mom, mom, mom,’ and of course my wife was saying, ‘LeAnn, LeAnn, what is the matter?’, and she repeated it again, ‘mom, mom, mom,’ and it sounded exactly like her.”
Gary Schildhorn, a Philadelphia-based attorney who was also targeted by an AI voice-cloning fraud, testified at the hearing as well. The scammer, posing as an attorney, called Schildhorn and asked for money to bail his son out of jail after the son had supposedly been in a car accident and failed a breathalyzer test. Schildhorn was about to send $9,000 to the scammer when his daughter-in-law revealed that the call was a scam.
“There was no doubt in my mind that it was his voice on the phone — it was the exact cadence with which he speaks,” he said. “I sat motionless in my car, trying to process what had happened. How did they obtain the voice of my son? My only conclusion is that they used artificial intelligence, or AI, to clone his voice… It is very clear that this technology… provides a risk-free way for scammers to prey on us.”
However, because no money was actually sent, law enforcement informed Schildhorn that no crime had been committed and that no further action could be taken. “With crypto and AI, law enforcement does not have a remedy,” Schildhorn stated at the hearing. “Some legislation is needed to allow these people to be identified… so that there is a remedy for the harm that is being caused.” “At the moment, there is no cure,” he explained.
According to the Federal Trade Commission, older Americans are more likely than younger ones to fall victim to online fraud. During the hearing, Chairman Casey unveiled the Aging Committee’s annual Fraud Book, along with a booklet on AI-powered frauds and a bookmark with scam-avoidance tips. He also intends to explore steps to ensure that the Federal Trade Commission (FTC) tracks how AI is being used in fraud.
“Today, we heard disturbing testimony about scammers using artificial intelligence to make their ploys more lifelike and convincing,” said Casey. “Federal action is required to erect safeguards to protect consumers from AI while simultaneously enabling those who can use it for good.”