The call sounds routine at first, a calm voice saying it is from Walmart’s fraud department about a $919 PlayStation 5 charge on a shopper’s account. Then the script tightens, asking the customer to “verify” their name, address, and card details to cancel the order. By the time the person realizes something is off, the damage is already done. That is the new reality of AI‑driven phone scams, where cloned voices and big‑box branding collide to hit people right where they feel safest: their everyday shopping habits.
What makes this wave different is not just the tech, but the way it piggybacks on the trust people place in a store they visit for groceries, school supplies, and last‑minute birthday gifts. The scammers are not guessing in the dark. They know that millions of Americans have a Walmart account, that a PlayStation 5 is a high‑ticket item, and that a surprise $919 charge is just scary enough to override common sense.
How the “Carl” robocall hooks Walmart shoppers
The basic setup is simple, but the execution is unnervingly slick. Shoppers report getting automated calls from AI voices that introduce themselves as Walmart representatives, sometimes using the name “Carl,” and claim there is a suspicious gaming console purchase on the person’s account. The caller ID can look generic, but the script leans on specific details, like a PlayStation 5 bundled with a Pulse 3D headset and a total of around $919, to sound like a real order dispute instead of a random phishing attempt. According to reporting on AI-generated calls, scammers are using synthetic voices that stay calm, polite, and eerily consistent, which makes it harder for people to trust their gut when something feels off.
Once the victim is on the line, the script shifts from warning to “verification.” The AI voice walks through a series of prompts, asking the person to confirm their full name, address, and payment details, supposedly to cancel the bogus order. That is the pivot point where a fraud alert turns into identity theft. Consumer advocates at Scamicide warn that anyone who hands over this information is likely to see it reused for new credit lines, drained bank accounts, or other fraud that can take months to unwind.
Social clips have helped spread the word. One widely shared YouTube short titled “Shop at Walmart? THIS AI Scam Is Targeting YOU” plays a recording of a voice claiming that a special edition PlayStation 5 with a Pulse 3D headset is being ordered from a Walmart account for $919, then urging the listener to press a key to cancel the order, a pattern echoed in that clip. On Instagram, posts describe scammers calling and pretending to be Walmart, claiming a pre‑authorized purchase of a PlayStation 5 plus the Pulse 3D headset on a shopper’s account, and urging people to share the warning so others can report the calls to their national consumer protection agency, as seen in one viral post. Those videos and screenshots are not primary evidence of the crime itself, but they show how quickly the script is spreading and how familiar it is becoming to regular shoppers.
Why Walmart is such a tempting AI target
Scammers are not picking Walmart at random. The retailer’s sheer scale, with millions of active shoppers using online accounts and in‑store services, makes it a near‑universal reference point. If a robocall claims to be from a niche boutique, most people will hang up. If it claims to be from the place they bought groceries last night, they listen. Federal regulators say the Walmart impersonation robocall scam is hitting millions of customers, with one report noting that 8 million Walmart customers have been affected by a scam attempt that uses an automated call that seems legitimate, according to FCC data. Another breakdown of the same scheme describes AI voices identifying themselves as “Carl” and targeting millions of U.S. consumers, summarizing what the Walmart impersonation robocall scam looks like per the FCC.
Regulators say the volume is staggering. One analysis notes that the Walmart impersonation scam targets hundreds of thousands of people each week as part of a broader campaign that reaches tens of millions of U.S. consumers, according to the FCC. That scale helps explain why scammers are investing in AI voices instead of old‑school robocall scripts. If they can sound like a real customer service agent, they can keep more people on the line for longer, which means more chances to harvest data. It is the same logic that drives targeted advertising, just flipped into a darker business model.
There is also a psychological angle. A PlayStation 5, especially with a Pulse 3D headset, is a high‑status gadget that cuts across age groups, from teens to parents who know it as the thing their kids keep asking for. That makes the fake $919 charge feel plausible to a wide slice of shoppers. It is not a random cryptocurrency token or obscure software license. It is a console they have seen in store displays and holiday ads, which is why the script scammers use when they call feels so believable in the first few seconds.