Healthcare Facilities Alerted to 'Scattered Spider' Cyber Threat

The group uses a form of AI technology known as deep fakes: convincing yet false recreations of voices, images and video that can trick people.

By Jeff Wardon, Jr., Assistant Editor


The Health Sector Cybersecurity Coordination Center (HC3) has issued a warning about the Scattered Spider threat actor, which has targeted organizations in many sectors, including healthcare. The group is known for using both legitimate, widely available tools and malware in its schemes, including many variants of ransomware. As of the second quarter of 2024, it has also added RansomHub and Qilin to its arsenal. 

According to HC3’s warning, Scattered Spider uses AI tools to mimic victims’ voices in order to gain initial access to targeted organizations. This follows the broader trend of “deep fakes” in cyberattacks, in which the technology creates realistic yet false images, videos or audio recordings, TechTarget reports. When such content is used in an attack, it can trick targets into unwittingly following a cybercriminal’s commands. Scattered Spider is expected to continue refining its techniques to further avoid detection. 

With cybercrime on the rise, healthcare organizations must be aware of every potential threat in order to survive in the digital age. Recognizing a deep fake or a phishing scam is crucial to protecting an organization from cyberattacks. TechTarget lists these hints for determining whether something is a deep fake: 

  • Facial and body movement: In images and videos, deep fakes can often be identified by looking closely at a person’s facial expressions and body language, because AI may introduce inconsistencies in a person’s likeness that it cannot fully overcome. 
  • Lip-sync detection: When video is paired with altered audio of a spoken voice, the lip movements may not match the words being spoken. Paying close attention to lip movements can expose these mismatches. 
  • Inconsistent or absent blinking: AI still has trouble simulating blinking, so deep fake algorithms may produce inconsistent blinking patterns or omit blinking altogether. 
  • Irregular reflections or shadows: Deep fake algorithms also do a poor job of recreating reflections and shadows, so look closely at surrounding surfaces, the background and the person’s eyes. 
  • Pupil dilation: AI often fails to adjust the diameter of the pupils, which can make the eyes look off. Watch for pupils that do not dilate naturally when the person focuses on objects or moves between light sources. 
  • Artificial audio noise: Deep fakes can add artificial noise to audio files to mask any changes, a technique also known as “artifacting” (a rough screening sketch follows this list). 
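
To make the last point a bit more concrete, below is a minimal Python sketch of how an analyst might screen a recorded call for noise-like frames, a crude proxy for the artifacting described above. It is an illustration under stated assumptions rather than a vetted detector: the file name "suspect_call.wav", the frame-level spectral-flatness heuristic and the 0.5 threshold are all hypothetical choices.

    # Minimal sketch: flag audio frames whose spectrum is unusually "flat"
    # (noise-like), a crude proxy for the artifacting described above.
    # Assumptions (hypothetical): a mono WAV file named "suspect_call.wav"
    # and a flatness threshold of 0.5; illustrative values, not vetted ones.
    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import spectrogram

    rate, samples = wavfile.read("suspect_call.wav")  # hypothetical recording
    samples = samples.astype(np.float64)

    # Power spectrogram: one column of frequency bins per short time frame.
    freqs, times, power = spectrogram(samples, fs=rate, nperseg=1024)
    power = power + 1e-12  # avoid log(0) and division by zero on silent frames

    # Spectral flatness per frame: geometric mean / arithmetic mean of power.
    # Values near 1.0 are noise-like; values near 0.0 are tonal (speech-like).
    flatness = np.exp(np.mean(np.log(power), axis=0)) / np.mean(power, axis=0)

    suspicious = times[flatness > 0.5]
    print(f"{len(suspicious)} of {len(times)} frames look noise-like")

A flagged file proves nothing on its own; purpose-built detection tools and, above all, verifying a suspicious voice request through a known, trusted channel remain the safer response.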

Jeff Wardon, Jr., is the assistant editor for the facilities market. 



November 1, 2024


Topic Area: Information Technology, Security

