Weekly News Roundup February 13–17, 2023


Posted by Kacy Zurkus

The ChatGPT frenzy continues, with artificial intelligence and the data used to train AI algorithms making quite a buzz this week. Several news outlets warned about a concept called AI hallucinations.

“According to Greg Kostello, CTO and Co-Founder of AI-based healthcare company Huma.AI, AI hallucinations manifest when AI systems create something that looks very convincing but has no basis in the real world,” Cybernews reported.

Apparently, AI can not only hallucinate but also dream and harass targeted victims. Earlier this week, MIT Technology Review wrote that “AI is dreaming up drugs that no one has ever seen.” Meanwhile, a reputation-management company, Eliminalia, is allegedly able to “bury bad news under fake news” by manipulating search engine algorithms.

All these stories made me think of Taylor Swift’s “Look What You Made Me Do.” So, I went to ChatGPT and entered, “look what you made me do,” to which the chatbot replied, “I apologize if I have caused any negative emotions or actions. However, as an artificial intelligence language model, I do not have the ability to directly influence your actions. I am here to assist and provide information to the best of my knowledge and programming. If you need to talk to someone or require assistance, please let me know how I can help.” I found the response unsettling, as it appears to me that this artificial intelligence language model is indeed directly influencing human behavior.

Meanwhile, others are asking whether AI is at its tipping point, and clearly, many are concerned about its potential use. Reuters reported, “More than 60 countries including the U.S. and China signed a modest ‘call to action’ on Thursday endorsing the responsible use of artificial intelligence (AI) in the military.” What are your thoughts? Keep the conversation going on your social channels using #RSAC. To learn more about artificial intelligence and machine learning, visit our Library.

Now let’s take a look at other headlines you might have missed this week.

Feb. 17: “A malicious campaign conducted against entities in Armenia in November 2022 has been spotted by security researchers at Check Point Research (CPR),” Infosecurity Magazine reported.

Feb. 17: The Society for Human Resource Management (SHRM) posted a blog warning HR managers about a rise in fraudulent job applications with tips on how to spot a fake.

Feb. 17: CNN reported, “The FBI has been investigating and working to contain a malicious cyber incident on part of its computer network in recent days, according to people briefed on the matter.”

Feb. 16: Despite the lack of incidents reported in the midterm elections, ABC News reported, “Top state election and cybersecurity officials on Thursday warned about threats posed by Russia and other foreign adversaries ahead of the 2024 elections, noting that America’s decentralized system of thousands of local voting jurisdictions creates a particular vulnerability.”

Feb. 15: According to Skift, “Scandinavian airline SAS said it was hit by a cyber attack Tuesday evening and urged customers to refrain from using its app but later said it had fixed the problem.”

Feb. 15: Many cybersecurity professionals took to social media to honor Chris Inglis, who ended his tenure as US National Cyber Director.

Feb. 15: Homeland Security Today reported, “The U.S. Secret Service, in partnership with the Estonian Academy of Security Sciences and Baltic regional law enforcement authorities, opened a new cyber training program.”

Feb. 14: The Chronicle of Philanthropy reported that Craig Newmark intends to safeguard the country through his support of both journalism and cybersecurity.

Feb. 13: “Apple has released emergency security updates to address a new zero-day vulnerability used in attacks to hack iPhones, iPads, and Macs,” BleepingComputer reported.

Feb. 13: Nearly all 50 states and territories have submitted applications to receive federal funding to strengthen their cyber defenses, made available under the infrastructure spending bill recently signed into law.

Contributors
Kacy Zurkus

Director of Content, RSA Conference

RSAC Insights

fraud, identity theft, artificial intelligence & machine learning, disinformation campaigns/fake news, security awareness, security education, business continuity & disaster recovery, orchestration & automation

Blogs posted to the RSAConference.com website are intended for educational purposes only and do not replace independent professional judgment. Statements of fact and opinions expressed are those of the blog author individually and, unless expressly stated to the contrary, are not the opinion or position of RSA Conference™, or any other co-sponsors. RSA Conference does not endorse or approve, and assumes no responsibility for, the content, accuracy or completeness of the information presented in this blog.

