Sandworm Page 7
But Lee himself took a different path. After enrolling in the U.S. Air Force Academy—his father tricked him into it, he says, by telling him he’d never be accepted—he found himself less interested in the endless engineering and physics courses than he was in African studies. He spent one summer on a humanitarian mission to Cameroon, working with an NGO there focused on renewable energy and water supplies. They’d travel across the countryside, sleeping in the locals’ villages, eating meals of fish and a starchy cake called fufu, and setting up simple water filtration systems and solar energy collectors.
Lee had never been much of a technology nerd. He’d played video games and built computers like other kids but never learned to program. In Cameroon, however, he became fascinated by control systems. A basic programmable logic controller, he found, made the machines he was installing vastly more efficient. The book-sized gray boxes with a few blinking lights, sold by companies like Siemens and Rockwell Automation, would allow him to program the solar-powered water filtration systems he’d place in streams so that they could swap their own filters with no manual intervention. Or the same controllers could be programmed to charge a series of car batteries attached to solar panels or wind turbines. That meant more clean water or more energy to power the LED lamps they’d give the villagers, and thus more hours of light each day, real improvements in human lives.
Lee began to see those programmable logic controllers, digital brains capable of altering the physical world around them, as fundamental building blocks of infrastructure and economic development. “I thought, I can teach you how to create energy and power your village. That’s civilization changing,” he says. “I saw control systems as the route to change.”
* * *
When Lee graduated from the U.S. Air Force Academy in 2010, he was sent to Keesler Air Force Base in Biloxi, Mississippi, to train as a communications officer. The air force, at the time, was just beginning to take cybersecurity seriously and lumped the new discipline in with that broader category of education. It was there that Lee learned the hacker basics: network analysis, forensics, exercises in “blue team” defense and “red team” attack.
But when it came to courses on control systems and their security, Lee found that his instructors often knew less about that little-understood computing niche than he had learned from his own hands-on time programming controller devices himself.
Then, during Lee’s time at Keesler, he suddenly found that his niche interest was at the center of a buzzing new field of conflict: A mysterious piece of malware called Stuxnet had begun to appear in thousands of computers across the Middle East and South Asia. No one knew what exactly it was designed to do. But the worm seemed to have the ability to meddle with programmable logic controllers, something no one had ever seen before. (Like most of the rest of the world, Lee didn’t yet know that Stuxnet was, in fact, an American creation. It had been built by Lee’s future employers at the NSA along with Israeli intelligence and aimed directly at destroying equipment in Iranian nuclear enrichment facilities, an act that would mark a new era of cyberwar. But we’ll get to that.)
Lee was, at the time, offended by the mere notion of malware capable of attacking physical infrastructure. “Here some asshole had targeted control systems,” he remembers thinking. “The path to making the world a better place was control systems. Someone was jeopardizing that, and it pissed me off.”
As more information about Stuxnet trickled out to the public, Lee’s interest in industrial control system security was elevated to an obsession. He’d spend his time between classes reading every document he could find on the subject. Soon he managed to track down a friendly nuclear scientist at Oak Ridge National Laboratory whom he’d call repeatedly, grilling him over a classified line about the minutiae of programmable logic controllers and the latest findings about the first-ever specimen of malware designed to corrupt them.
Eventually, Lee says, his views of that malware would shift as it became clearer that the code had been designed for a pinpoint strike on a single Iranian complex in Natanz, one that might serve as a key component of Iran’s efforts to obtain a nuclear weapon. But in the meantime, he had somehow become the closest thing to an expert on industrial control system security at Keesler Air Force Base. He found himself teaching other students and occasionally even briefing visiting generals.
At the end of his training, Lee took a position with an intelligence unit at Ramstein Air Base in Germany. Exactly what he did in that first real air force job remains obscured by the increasingly secret nature of his classified work. But he hints that the unit was engaged in intelligence missions for the war on terror, carried out by remotely piloted vehicles like the Global Hawk and Predator drones. Lee focused his work on the security of those vehicles’ control systems. Within months, however, he was noticed by a different agency that would fundamentally redirect his career: the NSA.
Lee had barely settled in at Ramstein when he was ordered to move to a facility elsewhere in Germany.* The small NSA department in which Lee found himself had a strange and exhilarating mission. Fort Meade, the massive NSA headquarters in Maryland, already had well-resourced teams assigned to practically every known threat to American national security. His field unit of around a hundred people was given the remit to function independently, thinking outside that massive organization’s existing patterns of thought—to look where the rest of the NSA wasn’t looking. “It was our job to find ‘unknown unknowns,’ ” Lee says.
Naturally, Lee began asking around about who in the NSA was responsible for tracking hackers that threatened the security of industrial control systems. He was shocked to discover there was no devoted group with that mission. The NSA had teams tasked with finding and fixing vulnerabilities in industrial control system equipment. It had, as Stuxnet would expose, its own offensive teams that invented infrastructure exploitation techniques. It didn’t, however, have a team assigned exclusively to hunting the enemy’s infrastructure-focused hackers.
So Lee offered to build one. He was amazed at how little bureaucracy he confronted; creating the agency’s first industrial control system threat intelligence team required filling out one form, he remembers. “So I became the lead of all of industrial control system threat discovery for NSA overnight,” Lee says.
He was twenty-two years old. “Pretty fucked-up, isn’t it?”
* Though Lee declined to say more about this base, all signs point to the Dagger Complex in Darmstadt. That NSA outpost resides on a small U.S. Army base in the west of the country whose role as an intelligence operation was at the time secret and would only later be revealed in the classified documents leaked by the NSA whistle-blower Edward Snowden.
9
THE DELEGATION
Rob Lee describes starting his job at the NSA as something like connecting his brain to a vast, ultra-intelligent hive mind.
Suddenly he had access not only to expert colleagues but to the agency’s corpus of classified knowledge, as well as its vast intelligence collection abilities. Lee, of course, says little about the details of where that intelligence came from. But thanks in part to Edward Snowden, we know that it included a broad array of secret data-gathering tools, labeled broadly as “signals intelligence,” or “sigint,” that ranged from the ability to siphon vast quantities of raw internet data from undersea cables to hacking enemy systems administrators and looking over their shoulders at private networks. “When you’re given access to essentially the entirety of the U.S. sigint system and then surrounded with the smartest people doing this on the planet, you get spun up pretty quickly,” Lee says.
For the next four years, he and a small team of around six analysts spent every working hour tracking the burgeoning, post-Stuxnet world of industrial control system hackers. “Every day was hypothesis-driven hunting. We’d ask ourselves, if I were the adversary, what would I do to break into industrial control systems? Then we’d go search for that out in the world,” Lee says. “We quickly went past any human knowledge of how to do this stuff and had to come up with our own models and methods and training.” Soon he was writing reports on new critical infrastructure-hacking threats that found their way to the desk of President Obama and briefing the director of the NSA, Keith Alexander.
Lee refuses to talk about the details of his team’s findings. But he hints that they’d uncover new, active industrial control system hacking operations being carried out by foreign governments as often as once a week. Only a small fraction of those hacking teams were ever identified in the media. (He’s careful, however, to describe the operations his team tracked during that period only as “targeting” industrial control systems. Lee won’t say how many—if any—ever followed in Stuxnet’s footsteps and crossed the line to disrupting or destroying physical equipment.)
Even as his team built a global view of an internet roiling with threats to critical infrastructure, Lee remembers that Sandworm stood out. He marked it early as a uniquely dangerous actor. “I can confirm that we knew about them and tracked them,” he says, choosing his words cautiously. “And I found them to be particularly aggressive compared to the other threats we were seeing.”
Then, in 2014, Lee’s dream job abruptly ended. As a fast-rising and sometimes brash upstart, he’d never been particularly compliant with the military’s strict adherence to rank. The NSA’s relatively freewheeling culture had unshackled him from that system. But he was still frustrated by the treatment of air force recruits, who’d sometimes cycle into his unit at the NSA, show real talent, and then suddenly be pulled out again to perform more menial tasks befitting their low rank.
So Lee spoke out, writing a strongly worded article in the military magazine Signal titled “The Failing of Air Force Cyber.” His unvarnished opinion piece accused the air force of incompetence in cybersecurity and railed against the bureaucratic dogma of rank that had stifled improvement and wasted intellectual resources.
Lee hadn’t bargained for the blowback or fully considered that he was still beholden to the same rank structure that he was attacking. Not long after his Signal piece was published, Lee discovered he had been reassigned, pulled out of his hacker-hunting team and back to an air force intelligence unit.
Back in that starched-collar military hierarchy, Lee bristled at his subordination to officers who he felt lacked the expertise he’d gained at the NSA. Worse, he had now been assigned to a team that sat on the other end of the game. He was part of a U.S. Air Force squadron based in Texas, responsible not for cybersecurity but for cyberattack. In other words, he now had orders to engage in exactly the sort of infrastructure hacking that he considered unconscionable. Just four years after first discovering that “some asshole” was targeting industrial control systems, he was that asshole.
He stayed for one unhappy year of highly classified work, then persuaded one of his commanders to let him resign, a nearly unthinkable move in a family of air force lifers. Lee says he wept as he walked out of the base on his last day as an air force officer.
It was 2015. That fall, Lee left Texas and moved to Maryland to attempt to re-create his NSA dream team in the private sector. Not long after, Christmas arrived. And with it, Sandworm reentered his life.
* * *
Despite his years working in one of the world’s most secretive agencies, discretion had never been Lee’s strong suit. Shortly after his abbreviated Christmas wedding, he’d linked the Ukrainian blackouts to an active hacker group, one that had already probed U.S. infrastructure, no less. And for the first time in his career, he was no longer bound by security clearances to keep that information hidden. He was immediately determined to warn the world.
In just the days before the New Year, Lee, Mike Assante, and another SANS researcher named Tim Conway had pieced together the broad strokes of the Ukrainian attack. Lee wanted to release it all. “By the twenty-ninth of December, we knew the public needed to know,” he says.
Despite the hints that Sandworm was behind the blackout, Assante thought it was too early to start publicly blaming the attack on any particular hacker group—not to mention a government. The three men agreed that Assante should write a blog post delicately addressing the attack without revealing too many details, to get ahead of any media reports that might hype up or misrepresent the story.
The next day, they published a circumspect post on the SANS website, with Assante’s byline: “A small number of sources in Russia and Ukraine indicate the electrical outage was caused by a cyber attack, specifically a virus from an outside source,” it read. “I am skeptical as the referenced outage has been hard to substantiate.”
Just two days later, on New Year’s Day, however, Lee went ahead with his own blog post, discussing for the first time the BlackEnergy malware sample he’d obtained. The post still took a cautious approach, but it dropped hints at a conclusion. “The Ukrainian power outage is more likely to have been caused by a cyber attack than previously thought,” he wrote. “Early reporting was not conclusive but a sample of malware taken from the network bolsters the claims.” Lee says his intention was, in the least alarmist tone he could muster, to make clear to U.S. power companies that they should check their networks immediately for BlackEnergy infections that might be footholds for Sandworm.
For the next week, Lee, Assante, and Conway continued to exchange intelligence about the attack with the Ukrainian government, the Department of Homeland Security, and the Department of Energy. But after eight days, when no U.S. officials had made any public statement about the attack, they published another post under Assante’s name that definitively confirmed the blackout had been a cyberattack, naming BlackEnergy and KillDisk as tools used in the attack, though not necessarily as the cause of the power outage. They made plans to release a full report with the blow-by-blow of the attack based on their analysis.
But at that point, to Lee’s immense frustration, a senior DHS official told the SANS researchers to refrain from any further revelations. The request to stand down was directed at Assante, who still had deep government ties from years working at Idaho National Laboratory and the North American Electric Reliability Corporation.
As the Obama administration’s cybersecurity coordinator J. Michael Daniel would later describe it to me, the government argued it wanted to give utilities a chance to address the problem discreetly before it revealed anything about those utilities’ vulnerabilities in public, where it might tip off opportunistic hackers. But Lee was furious: He instead saw the delay as bureaucratic foot-dragging.
In the days that followed, the SANS researchers and the agency officials came to a compromise over Lee’s objections. They’d assemble a fact-finding trip that would travel to Ukraine, meet with the electric utilities that had been victims of the attacks, and put together both classified reports for the government and unclassified reports for the public. Until then, everyone would keep quiet.
Assante and Conway were invited to join the delegation. Lee, whom officials had by then deemed a problematic hothead, was not.
* * *
A few weeks later, the team of Americans arrived in Kiev on a bright, freezing winter day. They assembled at the Hyatt, a block from the golden dome of the thousand-year-old St. Sophia Cathedral and just down the street from the Maidan. Among them were staff from the FBI, the Department of Energy, the Department of Homeland Security, and the North American Electric Reliability Corporation—the body responsible for the stability of the U.S. grid—as well as SANS’s Assante and Conway, all assigned to learn the full truth of the Ukrainian blackout.
On that first day, the group gathered in a sterile hotel conference room with the staff of Kyivoblenergo, Kiev’s regional power distribution company and one of the three victims of the power grid attacks. Over the next several hours, the Ukrainian company’s stoic executives and engineers laid out the timeline of a ruthless, cunning raid on their network.
As Lee and Assante had noticed, the malware that infected the energy companies hadn’t contained any commands capable of actually controlling the circuit breakers. Yet on the afternoon of December 23, Kyivoblenergo employees had watched helplessly as circuit after circuit was opened in dozens of substations across a Massachusetts-sized region of central Ukraine, seemingly commanded by computers on their network that they couldn’t see. In fact, Kyivoblenergo’s engineers determined that the attackers had set up their own perfectly configured copy of the control software on a PC in a faraway facility and then had used that rogue clone to send the commands that cut the power.
Once the circuit breakers were open and the power for tens of thousands of Ukrainians had gone dead, the hackers launched another phase of the attack. They’d overwritten the obscure code of the substations’ serial-to-ethernet converters, tiny boxes in the stations’ server closets that translated modern internet communications into a form that could be interpreted by older equipment. By hacking those chunks of hardware, the intruders had permanently bricked the devices, shutting out the legitimate operators from further digital control of the breakers.
The serial-to-ethernet converter trick alone would have taken weeks to devise, Assante thought to himself. Sitting at the conference room table, he marveled at the thoroughness of the operation.
The hackers also left one of their usual calling cards, running KillDisk to destroy a handful of the company’s PCs. Then came the most vicious element of the attack: When the electricity was cut to the region, the stations themselves also lost power. Control stations have backup batteries for just such an occasion, but the hackers had turned them off, throwing the utility operators into darkness in the midst of their crisis and slowing their recovery efforts. With utmost precision, the hackers had engineered a blackout within a blackout.