Criminals are now using AI to create sophisticated phishing emails, build hacking tools and uncover weaknesses in IT systems.
While the world marvelled last year at the ease with which artificial intelligence (AI) tool ChatGPT completed tasks for everyday users, the criminal underworld celebrated the launch of FraudGPT on the Dark Web in July 2023.
Actuary and damages expert Gregory Whittaker describes the emergence of FraudGPT as particularly bad news for retirees, who are already at significant risk of losing their retirement savings to cybercriminals.
Whittaker wrote a prize-winning essay that appeared in the US-based Society of Actuaries Research Institute’s collection of essays entitled The Impact of Artificial Intelligence on Retirement Professionals and Retirees.
In his essay, A Retiree’s Guide to Artificial Intelligence Risks and Mitigating Those Risks, Whittaker describes FraudGPT as “the beginning of a new era of cybercrime at scale.”
FraudGPT enables anyone intending to commit a cybercrime to create sophisticated phishing emails, build hacking tools and uncover weaknesses in IT systems.
According to Whittaker, “it is likely that we will soon see the end of badly punctuated, misspelt, misdirected and factually inaccurate phishing emails.” This will make it much harder to distinguish between honest communication from financial services providers and fraudulent approaches from criminals.
Criminals using AI to target seniors
The US Federal Bureau of Investigation (FBI) Internet Crime Report 2023, released in March this year, shows that victims older than 60 reported losses in excess of $3.4 billion in 2023, more than any other age group.
While similar research does not exist for South Africa, it is safe to assume that retirees in this country are just as vulnerable.
According to Whittaker, cybercriminals frequently target retirees because they are likely to have access to capital through retirement savings. The increasing complexity of financial products, the growing number of retirees using computers and smartphones, and criminals’ use of AI all add to the risks retirees face, he adds.
It is therefore critically important to educate pensioners about the various types of scams and to provide them with practical risk-mitigation strategies they can use to avoid them.
He recommends that employers implement social media literacy programmes and cybersecurity training for older employees in preparation for retirement.
“An important consideration is to investigate how retirement changes the social life and social network of retirees. If they have a greater propensity to turn to social media to fill the void created by no longer interacting with colleagues in the workplace, there is the potential that more personalised information will become available to scammers to harvest.”
What to watch out for
Whittaker says while there are many scams targeting consumers, all retirees should be made aware of the following types of cybercrime:
Phishing and spear phishing
Most consumers who bank online have encountered warnings about phishing attempts, whereby criminals try to solicit information such as passwords via emails or text messages that appear to come from a reputable company.
While phishing attempts are sent out widely and randomly, with the senders hoping that someone will fall for the scam, spear phishing is more targeted.
With the help of AI tools such as FraudGPT, criminals can review large volumes of data to identify potential victims and tailor messages that capture the retiree’s unique circumstances.
This makes the approach even more believable for the targeted retiree, increasing the chances that confidential personal information will be shared with the criminal. Whittaker says this is an area of emerging risk for retirees.
Deepfakes
A common deepfake scam uses images of celebrities or trusted public figures who appear to claim, on social media, Telegram or WhatsApp, that they have made large profits from online trading.
Retirees hoping to increase their retirement savings are tricked into signing up and parting with their money. However, when an attempt is made to withdraw the “invested” funds, the accounts are locked, and the bogus investment company is gone.
Grandparent scam (voice cloning)
Whittaker says that criminals can use AI to clone a younger relative’s voice, call the retiree to report an emergency such as a car accident or an arrest, and then ask for money. He explains that in most cases the caller asks that the call be kept secret and pressures the grandparent for immediate access to the money.
While it is difficult to remain calm and think clearly when a family member calls in distress, any suspicious behaviour should prompt the grandparent to end the call and either call another family member for guidance or return the call on a number known to be genuine.
Families may also want to agree on a safe word, known to all family members, to help establish that a caller is genuine.
Mistrust is the best defence
Whittaker encourages all consumers never to share sensitive information over the phone, via email or on social media, no matter what.
“Instead of asking on social media whether something or someone is legitimate, rather call the company you believe you are dealing with, check in with your financial adviser, or call the Financial Sector Conduct Authority to check whether the company or individual is registered.”
This post was based on a press release issued on behalf of the Actuarial Society of South Africa (ASSA).