A novel side-channel attack, named PIXHELL, has surfaced, capable of breaching the “audio gap” of air-gapped computers and exfiltrating confidential data through the noise emitted by the pixels on an LCD screen.
“Malware in air-gapped and audio-gapped systems generates meticulously crafted pixel patterns that produce noise within the 0 – 22 kHz frequency range,” explained Dr. Mordechai Guri, head of the Offensive Cyber Research Lab in the Department of Software and Information Systems Engineering at Ben-Gurion University of the Negev, in a recently published study.
“The nefarious code harnesses the sound produced by coils and capacitors to manipulate the frequencies emitted by the screen. Acoustic signals can encode and relay sensitive information.”
What sets this attack apart is its avoidance of specialized audio equipment, loudspeakers, or internal speakers on the compromised machine. Instead, it capitalizes on the LCD screen to generate acoustic emissions.
Air-gapping remains a vital security strategy intended to shield critical environments from potential threats by physically and logically severing them from external networks (e.g., the internet). This is generally achieved by disconnecting network cables, deactivating wireless interfaces, and disabling USB ports.
However, these defenses can be breached through insider threats or compromises within the hardware or software supply chain. An alternative scenario might involve an unsuspecting employee connecting an infected USB drive, deploying malware capable of establishing a covert data exfiltration pathway.
“Phishing, malevolent insiders, or other social engineering tactics could deceive individuals with access to the air-gapped system into undertaking actions that undermine security, such as engaging with malicious links or downloading compromised files,” Dr. Guri noted.
“Attackers may also exploit software supply chain vulnerabilities by targeting application dependencies or third-party libraries. Compromising these dependencies can introduce vulnerabilities or malicious code that may remain undetected during development and testing.”
In a manner reminiscent of the recently unveiled RAMBO attack, PIXHELL employs malware on the compromised host to forge an acoustic channel for information leakage from audio-gapped systems.
This is feasible because LCD screens incorporate inductors and capacitors within their internal components and power supply. These elements vibrate at audible frequencies, producing a high-pitched noise when electrical currents pass through the coils, a phenomenon known as coil whine.
Specifically, variations in power consumption can induce mechanical vibrations or piezoelectric effects in capacitors, generating audible noise. A critical factor influencing the power consumption pattern is the number of illuminated pixels and their distribution across the screen, as white pixels demand more power than darker ones.
“Moreover, when alternating current (AC) flows through the screen’s capacitors, they vibrate at particular frequencies,” Dr. Guri explained. “The acoustic emissions are generated by the screen’s internal electrical components. These characteristics are influenced by the actual bitmap, pattern, and pixel intensity projected onto the screen.”
“By meticulously controlling the pixel patterns displayed on the screen, our method generates specific acoustic waves at designated frequencies from LCD screens.”
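For illustration, the following Python sketch shows one plausible way such a pattern could be constructed, building on the fact that white pixels draw more power than dark ones. It assumes a row-sequential refresh in which the panel’s row-scan rate determines how fast the power draw toggles between white and black rows; the function name, resolution, and refresh rate are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

def alternating_row_bitmap(target_hz, rows=1080, cols=1920, refresh_hz=60):
    """Build a black/white row pattern whose spatial period is chosen so that,
    under a row-sequential refresh, the panel's power draw toggles at roughly
    target_hz (illustrative assumption, not the paper's exact mapping)."""
    row_rate = rows * refresh_hz                             # rows scanned per second (assumed)
    period_rows = max(2, int(round(row_rate / target_hz)))   # rows per full on/off cycle
    half = max(1, period_rows // 2)
    # 1 = white (higher power draw), 0 = black (lower power draw)
    row_values = np.array([(r // half) % 2 for r in range(rows)], dtype=np.uint8)
    return np.repeat(row_values[:, None], cols, axis=1) * 255

if __name__ == "__main__":
    bitmap = alternating_row_bitmap(target_hz=19_000)
    print(bitmap.shape, "white rows:", int((bitmap[:, 0] == 255).sum()))
```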
An attacker could thus use this technique to exfiltrate data in the form of modulated acoustic signals, which a nearby Windows or Android device can record, demodulate, and decode to recover the information.
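The study’s exact modulation scheme is not spelled out here, but a generic binary frequency-shift keying (FSK) scheme over near-ultrasonic tones conveys the idea: each bit selects one of two tone frequencies, and the receiver distinguishes them by their spectral energy. The sampling rate, bit duration, and tone frequencies below are illustrative assumptions.

```python
import numpy as np

FS = 44_100               # receiver sampling rate (assumed)
BIT_SEC = 0.2             # seconds per bit (illustrative)
F0, F1 = 18_000, 19_000   # tone frequencies for '0' and '1' (illustrative)

def modulate(bits):
    """Map each bit to a near-ultrasonic tone (binary FSK sketch)."""
    t = np.arange(int(FS * BIT_SEC)) / FS
    return np.concatenate([np.sin(2 * np.pi * (F1 if b else F0) * t) for b in bits])

def demodulate(signal, n_bits):
    """Recover bits by comparing FFT energy near F0 and F1 in each bit window."""
    samples_per_bit = int(FS * BIT_SEC)
    bits = []
    for i in range(n_bits):
        window = signal[i * samples_per_bit:(i + 1) * samples_per_bit]
        spectrum = np.abs(np.fft.rfft(window))
        freqs = np.fft.rfftfreq(len(window), 1 / FS)
        e0 = spectrum[np.argmin(np.abs(freqs - F0))]
        e1 = spectrum[np.argmin(np.abs(freqs - F1))]
        bits.append(1 if e1 > e0 else 0)
    return bits

if __name__ == "__main__":
    payload = [1, 0, 1, 1, 0, 0, 1, 0]
    assert demodulate(modulate(payload), len(payload)) == payload
```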
It is noteworthy that the strength and clarity of the emitted acoustic signal are contingent upon the screen’s specific structure, its internal power supply, and the positioning of coils and capacitors, among other factors.
Another key aspect to highlight is that the PIXHELL attack is inherently visible to users viewing the LCD screen, as it involves displaying a bitmap pattern of alternating black-and-white rows.
“To maintain secrecy, attackers might employ a strategy of transmission during user absence,” Dr. Guri suggested. “For instance, a so-called ‘overnight attack’ could utilize covert channels during off-hours, minimizing the risk of detection.”
However, the attack could be adapted for stealth during active hours by reducing pixel colors to very low values before transmission—using RGB levels of (1,1,1), (3,3,3), (7,7,7), and (15,15,15)—which might make the screen appear black to the user.
Nevertheless, this approach significantly reduces sound production levels. Additionally, it is not infallible, as a discerning user might still perceive anomalous patterns if they scrutinize the screen carefully.
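To illustrate the low-brightness variant described above, the sketch below builds an alternating-row bitmap using the RGB levels cited in the article; the resolution and row period are illustrative assumptions rather than values from the study.

```python
import numpy as np

# Low-intensity greys the article mentions; on most panels these appear black to a viewer.
STEALTH_LEVELS = [1, 3, 7, 15]

def stealth_bitmap(level, rows=1080, cols=1920, half_period_rows=2):
    """Alternating rows of black (0) and a barely visible grey 'level'
    (sketch of the low-brightness variant; levels per the article)."""
    assert level in STEALTH_LEVELS
    row_values = np.array([(r // half_period_rows) % 2 for r in range(rows)], dtype=np.uint8)
    grey = row_values * level                                     # 0 or the chosen low grey level
    return np.stack([np.repeat(grey[:, None], cols, axis=1)] * 3, axis=-1)  # RGB = (level, level, level)

if __name__ == "__main__":
    img = stealth_bitmap(3)
    print(img.shape, img.max())   # (1080, 1920, 3) 3
```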
This is not the first instance of overcoming audio-gap restrictions in experimental settings. Previous studies by Dr. Guri have utilized sounds generated by computer fans (Fansmitter), hard disk drives (Diskfiltration), CD/DVD drives (CD-LEAK), power supply units (POWER-SUPPLaY), and inkjet printers (Inkfiltration).
As preventive measures, it is advisable to deploy an acoustic jammer to disrupt transmission, monitor the audio spectrum for unusual signals, restrict physical access to authorized personnel, ban smartphone usage, and use external cameras to detect unusual modulated screen patterns.
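As a rough illustration of the spectrum-monitoring countermeasure, the sketch below flags narrowband peaks in the near-ultrasonic band that rise well above the median noise floor of a recorded audio frame; the band limits and detection threshold are illustrative assumptions, not recommended settings.

```python
import numpy as np

FS = 44_100                  # monitoring microphone sampling rate (assumed)
BAND = (17_000, 22_000)      # near-ultrasonic band to watch (illustrative)

def suspicious_tones(audio, threshold_db=20.0):
    """Flag narrowband peaks in the monitored band that stand out from the
    median noise floor by more than threshold_db (illustrative detector sketch)."""
    spectrum = np.abs(np.fft.rfft(audio * np.hanning(len(audio))))
    freqs = np.fft.rfftfreq(len(audio), 1 / FS)
    band = (freqs >= BAND[0]) & (freqs <= BAND[1])
    power_db = 20 * np.log10(spectrum[band] + 1e-12)
    floor_db = np.median(power_db)
    return freqs[band][power_db > floor_db + threshold_db].tolist()

if __name__ == "__main__":
    t = np.arange(FS) / FS
    ambient = 0.01 * np.random.randn(FS)                        # background noise
    covert = ambient + 0.05 * np.sin(2 * np.pi * 19_000 * t)    # hidden 19 kHz tone
    print("flagged frequencies (Hz):", suspicious_tones(covert)[:3])
```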