Experts Warn Artificial Intelligence (AI) Lovers Can Steal Your Cash

NEW YORK — Cybersecurity experts are raising alarms about the potential risks associated with artificial intelligence (AI) chatbots and virtual romantic partners, warning that these technologies could be exploited by cybercriminals for financial fraud and data theft.

Jamie Akhtar, Co-Founder and CEO of CyberSmart, told reporters that while AI technology for creating virtual partners is improving, it also presents new opportunities for malicious actors.

“Deepfake technology has come on leaps and bounds in the past few years,” Akhtar said. “The problem is that this technology can be used for malicious ends.”

Experts highlight two primary concerns. The first is that cybercriminals could use emotionally manipulative AI to extort money or trick users into downloading malicious software. Akhtar cited a recent incident in which a finance worker at a multinational firm was deceived into paying $25 million to criminals who used deepfake technology to impersonate a company executive.

The second concern is that even legitimate AI chatbots may pose privacy risks. Chris Hauk, Consumer Privacy Advocate at Pixel Privacy, warned that these applications often collect extensive user data and may share information with third parties.

“Many of these apps do not make it clear as to what data is shared with third parties, nor are they clear about the AI they use,” Hauk explained. He added that as users become more comfortable with AI chatbots, they might reveal more personal information, increasing their vulnerability to data breaches or identity theft.

The experts advise users to exercise caution when interacting with AI chatbots. They recommend using only official, well-known, and well-reviewed chatbot applications and avoiding downloads from third-party app stores or suspicious websites.

Users should limit sharing personal information, even with popular AI platforms like ChatGPT or Google Gemini, and be aware that chatbots can potentially leak shared information.

Experts emphasize treating AI chatbots with the same caution as interacting with strangers online and never agreeing to send money or share financial information with an AI chatbot.

As AI technology becomes more sophisticated and accessible, cybersecurity experts anticipate an increase in AI-based attacks targeting individuals and businesses, and stress the need for greater awareness and caution among users of AI chatbot technologies.

The growing popularity of AI chatbots and virtual partners underscores the importance of developing robust security measures and clear privacy policies for these technologies. As the field evolves, ongoing research and regulation will be crucial in addressing these emerging cybersecurity challenges.
