From online romance scams to bank fraud, many continue to fall for the nefarious schemes of cybercriminals. In the United States alone, elderly Americans have reportedly lost $1 billion to online scams, while victims of online dating scams have collectively been swindled out of more than $133 million.
It is safe to say that the internet will never be a safe place for the naive. Scammers abound and will stop at nothing to take advantage of those who know little about protecting themselves. In 2022, the volume of scams is set to grow further, with new types of deception and new technology making scams harder to detect and prevent.
Here’s a look at three scams everyone should be aware of. They are not necessarily entering the new year as entirely new ways to scam people, but they will likely become more polished and more effective at defrauding unsuspecting victims.
1. Deepfake scams

A portmanteau of “deep learning” and “fake,” the term deepfake refers to hyper-realistic images or videos generated from a real person or object. This technology makes it possible for scammers to convince their targets to do their bidding with the help of an AI-fabricated video message or set of photos.
A scammer can fabricate a video message of a pensioner’s son or daughter, for example, to make a convincing request for funds. The unsuspecting victim, who may not even realize that videos can be faked, would probably respond to the fraudulent request. Imagine what happens if this tactic is used to impersonate global leaders or corporate officials and order their subordinates to do unthinkable things.
In a lay person’s perception, a deepfake is like Photoshop for videos. As mentioned, though, it can also be used for photographs, making it way easier to fake multiple photos showing different angles of a person or something else that scammers want to use in their schemes.
Deepfakes have already been used in attempts to influence voters. Fortunately, malicious use of the technology has not yet become widespread. A good demonstration of its potential is this video of Barack Obama, which is not actually Barack Obama but a deepfake of the former president voiced by Jordan Peele. Most viewers would likely fail to recognize the video as an AI-driven fabrication, even though deepfake Obama’s voice is not actually Obama’s.
Deepfake capabilities are not limited to image and video simulation. The technology can also be used to generate imitations of real people’s voices. A number of AI companies have already started developing tools that create deepfake voices from just a few voice samples. New Scientist reports that deepfake voices can fool both humans and smart assistants.
2. Fintech scams
The rise of fintech has made it easier for bad actors to scam people out of their hard-earned money. It has also given birth to various ways of taking advantage of people’s insufficient knowledge of new payment and financial technologies.
With cryptocurrencies, for example, many continue to fall for the promise of high-yield investments with bitcoin and other digital currencies because of the highly publicized rise in bitcoin and other crypto prices. Crypto investing advertisements flood the internet, and they continue to defraud neophyte investors who happen to be very curious about the so-called “new age of investing.”
Some scamming companies make use of convincing web content on authoritative sites to spread the word about their fraudulent crypto investment plans. They lure investors with the promise of tripling or quadrupling investments in a year through some highly complicated processes that involve a combination of legitimate and dubious methods.
An investment plan that leverages cryptocurrency price differences across exchanges, for example, can be mixed with a plan to capitalize on the growing popularity of NFTs. The idea sounds sensible and will likely attract some investors, but it does not necessarily deliver high returns and is very risky. Many companies with questionable investment and business schemes have emerged over the years, and more will appear in the coming year to target unsuspecting newbie investors.
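To see why such an arbitrage pitch sounds plausible yet overpromises, consider a small sketch of the math involved. All prices, fee rates, and the function name here are hypothetical, chosen only to illustrate how fees erode a seemingly attractive price spread:

```python
# Hypothetical illustration: the raw price spread between two exchanges
# looks like easy money until per-trade fees and a flat withdrawal fee
# are subtracted. All numbers are made up for illustration.

def net_arbitrage_profit(buy_price, sell_price, amount,
                         fee_rate=0.002, withdrawal_fee=10.0):
    """Profit from buying on one exchange and selling on another,
    after a percentage fee on each trade and a flat withdrawal fee."""
    cost = buy_price * amount * (1 + fee_rate)       # pay fee when buying
    proceeds = sell_price * amount * (1 - fee_rate)  # pay fee when selling
    return proceeds - cost - withdrawal_fee

# A 0.5% spread on a $50,000 coin looks attractive at first glance...
gross_spread = (50250 - 50000) * 1  # $250 gross
net = net_arbitrage_profit(50000, 50250, 1)

print(f"Gross spread: ${gross_spread:.2f}")  # $250.00
print(f"Net after fees: ${net:.2f}")         # $39.50
```

The gross spread of $250 shrinks to under $40 after fees alone, before accounting for slippage, transfer delays, or price movement, which is why "guaranteed" returns built on this mechanism deserve skepticism.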
Moreover, phishing scams that target the credentials of crypto owners and fintech payment account holders are also expected to continue and even worsen in 2022. These scams will likely be upgraded, particularly in how phishing pages and emails are set up to evade immediate detection.
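One common evasion trick is registering lookalike domains that swap characters in a trusted brand's name. The following is a minimal sketch of that kind of check; the brand list, homoglyph table, and domain names are hypothetical, and real phishing filters rely on far richer signals (TLS certificates, domain age, reputation feeds, page content):

```python
# Minimal sketch of a lookalike-domain check for phishing pages.
# Brands and hostnames here are hypothetical examples.

import unicodedata

KNOWN_BRANDS = {"paypal.com", "coinbase.com", "binance.com"}

# Map a few digits commonly substituted for letters back to letters.
HOMOGLYPHS = str.maketrans({"0": "o", "1": "l", "3": "e", "5": "s"})

def looks_like_phish(hostname: str) -> bool:
    """Flag hostnames that normalize to a known brand but are not it."""
    host = unicodedata.normalize("NFKC", hostname.lower())
    # Keep only the registrable-looking tail (e.g. "paypa1.com").
    tail = ".".join(host.split(".")[-2:])
    normalized = tail.translate(HOMOGLYPHS)
    return normalized in KNOWN_BRANDS and tail not in KNOWN_BRANDS

print(looks_like_phish("secure-login.paypa1.com"))  # True
print(looks_like_phish("www.paypal.com"))           # False
```

Simple heuristics like this catch only the crudest impersonations, which is exactly why the article expects phishing pages to keep evolving past immediate detection.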
3. Fake cybersecurity products
This is a slippery slope for cybersecurity advocates to write about, since it risks making people skeptical about cybersecurity products in general. However, it has to be discussed given the rise of new cybersecurity products that purport to help secure organizations but offer nothing significantly new. At best, they provide the standard protection that many cheaper or even free solutions already deliver.
A survey conducted by the Associated Press-NORC Center for Public Affairs Research and Pearson Institute reveals that over 90 percent of Americans understand the need to address the risks of cyberattacks. This is definitely good news, as it means more organizations are now seeking out ways, tools, platforms, or entire solutions to protect themselves better.
However, this growth in demand for security products is also an opportunity for bad actors. There are “security companies” that claim to offer advanced solutions covering the full spectrum of threats. Unfortunately, their “advanced” products may turn out to be a bluff or a baseless claim. New approaches in cybersecurity, such as continuous red teaming and sophisticated purple teaming, may entice many to consider certain products. However, these products may be overpromising and under-delivering at best, or blatantly lying about their features and functions at worst.
The advantages of continuous red teaming and advanced purple teaming are not falsehoods, but these terms may be thrown around deceptively by unscrupulous companies that lack the means to provide real security validation. Organizations may end up wasting money on fake cybersecurity products whose benefits can be obtained from considerably less expensive or even free solutions.
Addressing the new challenges
Scams are likely a permanent fixture of the internet. No matter how cybersecurity solutions advance, deceptive schemes will remain, especially while internet users fail to sharpen the knowledge and instincts needed to identify, detect, and stop scams.
With deepfakes, there are software solutions designed to analyze videos and test if they are genuine or fabricated. Organizations can use these, but it is crucial to provide people with the right information and reminders for them not to unwittingly become the very tool that defeats security controls. With fintech and fake cybersecurity product scams, it is essential for everyone to learn how to spot potential scams.
The challenge is for everyone to be equipped with the knowledge and the right tools to deal with these threats. It is advisable for organizations to invest in orientation or training sessions and well-designed protocols, policies, and security platforms to prevent scammers from succeeding.