In December 2025, Zscaler's ThreatLabz team uncovered a new campaign from APT37, North Korea's state-backed cyber espionage group. They called it "Ruby Jumper." The toolkit is sophisticated: five new malware families, a full Ruby 3.3.0 runtime disguised as a USB utility, cloud-based command and control through Zoho WorkDrive, and a bidirectional relay system designed to bridge air-gapped networks using removable media.
The coverage has focused on the technical sophistication. New tools, novel techniques, creative use of cloud infrastructure. But the attack vector that makes all of it work is the same one that Stuxnet used in 2010, that Agent.BTZ used in 2008, and that every single air-gap malware framework documented over the last 15 years has relied on.
A USB drive. Carried by a person. Plugged into a machine on the other side of the gap.
Air gaps don't fail because of technology. They fail because every air gap has a person who crosses it.
The Ruby Jumper Kill Chain
The campaign starts with a malicious Windows shortcut (LNK) file containing a decoy document about the Palestine-Israel conflict, sourced from North Korean state media. When a victim opens it, PowerShell launches silently and extracts multiple payloads from the file itself.
The first payload, RESTLEAF, establishes a foothold using Zoho WorkDrive as its command-and-control channel. This is the first documented abuse of Zoho WorkDrive by APT37, though the group has a history of hiding C2 traffic behind legitimate cloud services.
RESTLEAF downloads SNAKEDROPPER, which does something unusual: it installs a complete Ruby 3.3.0 runtime to %PROGRAMDATA%\usbspeed, renames the interpreter to "usbspeed.exe" to look like a USB utility, and hijacks Ruby's automatic library loading to execute malicious code every five minutes through a scheduled task.
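Persistence of this kind is detectable from scheduled-task metadata alone. The sketch below shows the shape of such a check; the indicator path comes from the reporting above, while the function name, sample data, and CSV-parsing suggestion are illustrative assumptions, not part of any published tooling:

```python
import re

# Indicator drawn from the campaign reporting: a Ruby interpreter renamed
# "usbspeed.exe", installed under %PROGRAMDATA%\usbspeed.
SUSPICIOUS_ACTION = re.compile(r"programdata\\usbspeed\\usbspeed\.exe", re.IGNORECASE)

def flag_tasks(tasks):
    """Return names of scheduled tasks whose action path matches the
    indicator. `tasks` is a list of (name, action) pairs, e.g. parsed
    from `schtasks /query /fo csv /v` output on a Windows host."""
    return [name for name, action in tasks
            if SUSPICIOUS_ACTION.search(action)]

# Hypothetical task listing for demonstration.
sample = [
    ("GoogleUpdate", r"C:\Program Files\Google\Update\GoogleUpdate.exe"),
    ("USBSpeedCheck", r"C:\ProgramData\usbspeed\usbspeed.exe"),
]
print(flag_tasks(sample))  # → ['USBSpeedCheck']
```

A real deployment would also inspect the task's trigger interval, since a five-minute recurrence on a user-writable executable is itself a strong signal.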
This is where the campaign splits into two operational tracks.
Track 1: Air-gap bridging. THUMBSBD creates a hidden $RECYCLE.BIN directory on connected removable drives and uses it as a dead drop, staging encrypted command files and exfiltration data. When someone carries that drive to an air-gapped system and plugs it in, the malware reads commands from the hidden directory, executes them, and stages the results for the return trip. Bidirectional C2 across an air gap, using humans as unwitting couriers.
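The dead-drop location is itself an anomaly a scanner can key on. On a real NTFS volume, recycle-bin entries live under per-user SID subfolders; loose files sitting directly inside $RECYCLE.BIN on removable media are unusual. A minimal heuristic sketch, with hypothetical names (the `$I`/`$R` layout detail is a simplifying assumption about normal recycle-bin structure):

```python
import tempfile
from pathlib import Path

def suspicious_dead_drop(drive_root: Path) -> list:
    """Flag loose files staged directly inside a $RECYCLE.BIN directory
    on removable media. Legitimate recycle-bin content is nested under
    per-user SID subdirectories, so top-level files are a red flag for
    dead-drop staging of command or exfiltration blobs."""
    bin_dir = drive_root / "$RECYCLE.BIN"
    if not bin_dir.is_dir():
        return []
    return sorted(p.name for p in bin_dir.iterdir() if p.is_file())

# Demonstration against a synthetic drive layout.
demo = Path(tempfile.mkdtemp())
(demo / "$RECYCLE.BIN").mkdir()
(demo / "$RECYCLE.BIN" / "cmd_001.bin").write_bytes(b"\x00")
print(suspicious_dead_drop(demo))  # → ['cmd_001.bin']
```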
Track 2: Propagation. VIRUSTASK infects clean removable drives by replacing the victim's legitimate files with Windows shortcuts bearing identical names. When someone opens what they think is their document, they're actually executing the embedded Ruby interpreter. If the target machine doesn't already have the malware installed, VIRUSTASK handles the initial infection automatically.
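Because Windows hides the .lnk extension by default, a file named "budget.xlsx.lnk" displays as "budget.xlsx". That double-extension pattern is cheap to scan for. A hedged heuristic sketch (the extension list and function name are illustrative, and a real scanner would also parse the shortcut's target):

```python
# Document extensions a shortcut might impersonate (illustrative set).
DOC_EXTS = (".doc", ".docx", ".xls", ".xlsx", ".ppt", ".pptx", ".pdf", ".hwp", ".txt")

def shadowed_shortcuts(filenames):
    """Flag Windows shortcuts that impersonate documents: a .lnk whose
    remaining name still ends in a document extension, the pattern left
    behind when a victim's files are swapped for same-named shortcuts."""
    hits = []
    for name in filenames:
        lower = name.lower()
        if lower.endswith(".lnk") and lower[:-4].endswith(DOC_EXTS):
            hits.append(name)
    return hits

listing = ["budget.xlsx.lnk", "notes.txt", "installer.lnk"]
print(shadowed_shortcuts(listing))  # → ['budget.xlsx.lnk']
```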
The final payload, FOOTWINE, is a full surveillance backdoor: keylogging, screenshot capture, audio and video recording, file manipulation, registry access, and remote shell capabilities. It arrives disguised as an Android package file (.apk) to evade detection.
Fifteen Years of the Same Attack Pattern
Here is the part that should concern every security leader reading this: none of this is new.
ESET published a comprehensive analysis documenting 17 malicious frameworks designed to breach air-gapped networks over 15 years of nation-state operations. Every single one used USB drives as the physical transmission medium. Not radio signals. Not electromagnetic emanation. Not acoustic side channels. USB drives, carried by people.
Stuxnet crossed Iran's air gap in 2010 the same way Ruby Jumper crosses air gaps in 2025: someone plugged in a USB drive. The sophistication of the malware has increased enormously. The fundamental attack vector hasn't changed at all.
This matters because it reveals a systemic failure in how organizations think about air gaps. The industry treats each new air-gap breach as a technical problem requiring a technical solution. But the attack surface isn't technical. It's behavioral. It's the person who carries a drive between networks every day because that's how they do their job.
The EOD Parallel
I spent eight years in Navy EOD watching adversaries do exactly this.
Every time coalition forces defeated one IED trigger mechanism, bomb makers adapted. Pressure plates replaced radio triggers. Command wire replaced pressure plates. Victim-operated switches replaced command wire. The adversary didn't stop building bombs because we found better ways to detect them. They found new ways to deliver them.
The same dynamic plays out in air-gap attacks. Organizations built network isolation as a defense against network-based attacks. So adversaries stopped using the network. They used the people who move between networks, carrying data on removable media as part of their daily workflow. The air gap didn't eliminate the attack surface. It moved it from a protocol you can monitor to a behavior you can't.
Ruby Jumper's VIRUSTASK component demonstrates this perfectly. It doesn't just propagate through USB drives; it hijacks the victim's own files, replacing them with malicious shortcuts that look identical. The attack rides on trust: the user trusts their own files, trusts the USB drive they've been using for years, trusts the process of moving data between systems. That trust is the vulnerability, not the gap itself.
This is the same insight that applies to the nation-state cyber threat pattern I wrote about in the context of Iran's capabilities: when you deny an adversary their preferred weapons, they don't surrender. They adapt. And their adaptation targets whatever assumption you made when you chose your defense.
The Air Gap Is Not a Control. It's a Comfort.
The security industry has been saying "air gaps are a myth" for years. The data supports it.
When Claroty surveyed organizations during the COVID pandemic, 65% reported their IT and OT networks had become more interconnected, not less. Independent assessments routinely find an average of 11 direct connections between networks that organizations believe are air-gapped. In extreme cases, auditors have found up to 250 connections.
Waterfall Security Solutions has argued that true air gaps "largely disappeared in the late 1990s" when organizations began connecting industrial systems to enterprise resource planning software. What replaced them wasn't better security, but carefully managed mythology.
Fortinet's 2025 Operational Technology Security Report found that 50% of organizations experienced at least one cybersecurity incident in their OT systems. These are the systems most likely to be "protected" by air gaps.
The problem isn't that air gaps can be breached. It's that air gaps create a psychological model where organizations stop investing in defense-in-depth on the "protected" side. If the network is isolated, why invest in endpoint detection? If no external traffic can reach the system, why patch aggressively? If the air gap is the control, what other controls do you need?
This is how air gaps make systems more vulnerable, not less. The gap becomes the entire security posture, and everything behind it gets treated as trusted by default.
North Korea's Dual-Track Investment
Ruby Jumper also reveals something about North Korean strategic priorities that deserves attention.
In 2025, North Korean-affiliated hackers stole $2.02 billion in cryptocurrency, a 51% year-over-year increase that pushed their all-time total to $6.75 billion. The Bybit hack alone netted $1.4 billion. These groups accounted for 76% of all service-level compromises in the crypto sector.
At the same time, they're investing in sophisticated espionage tooling like Ruby Jumper: building novel malware frameworks, developing new C2 infrastructure, creating tools specifically designed to penetrate the most hardened targets.
This dual-track approach, revenue generation through crypto theft and intelligence collection through targeted espionage, suggests an operation that's well-funded and strategically directed. The Ruby Jumper toolkit embeds a full Ruby runtime, uses multiple cloud services for C2, and implements custom binary protocols with XOR-based key exchange. This isn't commodity malware. It's purpose-built tooling for persistent access to high-value targets.
The parallel to what I covered in the VoidLink malware analysis is striking. Both Ruby Jumper and VoidLink demonstrate increasing sophistication in malware construction. But where VoidLink's solo developer was undone by OPSEC failures, APT37 operates with state backing that provides resources, infrastructure, and operational discipline. The capability gap between nation-state and independent actors is narrowing in terms of tooling, but the operational gap remains significant.
What Actually Works
If the air gap itself is an insufficient control, what should organizations do instead?
Treat removable media as an attack vector, not a workflow tool. Every USB drive that crosses a security boundary should be scanned, logged, and controlled. Dedicated media stations with write-blocking and automated analysis should be the standard for data transfer between classified and unclassified environments.
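The "scanned, logged, and controlled" step can be as simple as hashing everything on a drive before it crosses the boundary. A minimal sketch of the logging half, assuming a media-station workflow; the function, filenames, and JSON-line format are illustrative choices, and real stations would stream large files rather than read them whole:

```python
import hashlib
import json
import tempfile
import time
from pathlib import Path

def log_media(mount: Path, logfile: Path) -> int:
    """Hash and log every file on a mounted removable drive: one JSON
    line per file with relative path, size, and SHA-256, suitable for
    later correlation against indicator lists. Returns the file count."""
    count = 0
    with logfile.open("a") as out:
        for p in sorted(mount.rglob("*")):
            if p.is_file():
                # Reads the whole file; a production station would chunk this.
                digest = hashlib.sha256(p.read_bytes()).hexdigest()
                out.write(json.dumps({
                    "ts": time.time(),
                    "path": str(p.relative_to(mount)),
                    "size": p.stat().st_size,
                    "sha256": digest,
                }) + "\n")
                count += 1
    return count

# Demonstration against a synthetic mount point.
drive = Path(tempfile.mkdtemp())
(drive / "doc.txt").write_text("hello")
log = Path(tempfile.mkdtemp()) / "media.jsonl"
print(log_media(drive, log))  # → 1
```

The point is not the hashing itself but the audit trail: when an incident surfaces, the log answers which drives carried which files across which boundary, and when.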
Deploy endpoint detection on air-gapped systems. Ruby Jumper's persistence mechanisms (scheduled tasks, RubyGems hijacking, registry keys) are all detectable with proper endpoint monitoring. The "it's air-gapped, so it's safe" assumption is exactly what these campaigns exploit.
Monitor for behavioral indicators. VIRUSTASK replaces user files with LNK shortcuts. THUMBSBD creates hidden directories in $RECYCLE.BIN. SNAKEDROPPER renames executables to look like system utilities. These activities generate filesystem artifacts that behavioral analysis can catch.
Implement defense-in-depth behind the gap. Network segmentation is one layer, not the security architecture. Application whitelisting, integrity monitoring, and privilege restriction should be standard on any system valuable enough to air-gap.
Audit your actual connectivity. If assessments routinely find 11 direct connections between "air-gapped" networks, your gap probably isn't a gap. Know what you actually have before you trust what you think you have.
The Lesson We Keep Not Learning
APT37 didn't invent a new way to breach air-gapped networks. They refined a technique that has worked, without fundamental change, for 15 years. Ruby Jumper is the eighteenth framework in a long line that all exploit the same vulnerability: humans moving data between networks.
The industry keeps treating this as a technical arms race. Build a better air gap. Build a better USB scanner. Build a better detection system. But the lesson from bomb disposal applies here too: you can't out-engineer an adversary who's willing to adapt faster than you can deploy countermeasures. You have to change the model.
That starts with accepting that air gaps are a network control, not a security architecture. It continues with treating the human-mediated transfer as the highest-risk operation in your environment, not the most routine one. And it requires investing in defense-in-depth on both sides of the gap, especially the side you've been assuming is safe.
Fifteen years after Stuxnet, Ruby Jumper proves the air gap is still a people problem. And the people still aren't part of the threat model.