GEOG 479
Cyber-Geography in Geospatial Intelligence

CyberCity


In February 2013, President Obama issued Executive Order 13636, directing federal agencies and departments to take the steps necessary to protect US critical infrastructure. The EO was a response to increasing concern over cyber intrusions and attacks on cyber resources within the United States.

Turning this EO into a coherent interagency policy is problematic at best. The U.S. would like to protect both its classified information and the intellectual property of US industry, but the questions are twofold: how far should the government go, and what are the government's responsibilities? Mandiant published a report on the extensive activities of the Chinese People’s Liberation Army Unit 61398, though it is not clear that the unit's activities are solely PLA-derived. If the PLA steals trade secrets from a US defense contractor and delivers them to a Chinese company that then uses them for commercial advantage, what do we call that act? Cybercrime? Espionage? An act of cyber war? How we classify the act will influence the response.

What is the role of cyber deterrence? How does the U.S. dissuade cyber actors from launching attacks? Deterrence theory holds that you prevent an adversary from acting by convincing them that the retaliation would be too costly to bear. The key lies in both communicating the threat of a response and having the capability to carry it out.

U.S. nuclear policy during the Cold War was relatively straightforward. There was only one major threat, the USSR, and the consequences of a Soviet attack on the US or its allies were understood to mean mutually assured destruction (MAD). In the 21st century, how can we deter both nation states and non-state actors?

We can certainly draw on experience with naval power and airpower for ideas, but the characteristics of the new domain call for new theories. We need to understand what role the state will play as the importance of the domain continues to grow.

Presidential Policy Directive 20 (PPD-20) was recently signed and, although classified, reportedly grants the US military new authorities to begin developing offensive targeting capabilities. According to a recent Scientific American article, the U.S. military is preparing for a future war waged with computers. In this future, technologies are far more vulnerable because of ever-growing interconnectivity. Examples include networked bank accounts, streetlights and power grids, real-time transit data that can locate vehicle and rider positions, and systems that move medical records from one hospital to doctors across town. This interconnected technology invites cyber attack, with particularly grim ramifications for major metropolitan areas.

The Scientific American article reported the existence of a 48-square-foot physical model, developed and run by the SANS Institute as a training aid for the DoD (and others), that allows attack and defense scenarios to be played out by computer. The model lets those scenarios capture the possible kinetic effects produced by such an attack and allows tactics to be developed to mitigate those effects. The training model, called “CyberCity,” can simulate transportation networks, hospitals, banks, and other national information infrastructure (NII) entities. It is modeled after a small US town of approximately 15,000 people and has databanks representative in size and detail of such a population. The rest of the NII infrastructure is said to be typical of what one would see in such a city, making it a good training venue for offensive and defensive cyber scenarios.

What might the effect of a loss of power look like? Hurricane Katrina struck in August 2005 and was one of the worst U.S. natural disasters in recent times. It was a multistate event that resulted in a blackout lasting for weeks in some areas. A 2008 report titled “Report of the Commission to Assess the Threat to the United States from Electromagnetic Pulse (EMP) Attack” uses the Katrina scenario to extrapolate what could occur in a larger, nationwide event that destroyed much of the US power grid. While the scenario envisioned concerns a large EMP event, the description is essentially one of lost electric generation capability, and it documents the potentially far-reaching and disastrous effects on the general population of losing the electric grid. Chapter 2 of the report details the effects on the energy grid.

Listen to the Experts (Optional Talk)

An optional video (14:54), produced by CBS, can be viewed below or on YouTube. It is recommended as a supplement to the reading.

This Cyber War episode of “60 Minutes” was originally broadcast on March 4, 2012. The US electric grid depends on large central generators that could be severely damaged by a small number of attackers. The transmission lines often span hundreds of miles, and their vulnerability is exacerbated by the fact that the lines are now being used to move power between regions to support the needs of new competitive markets for power generation.

An additional five-minute story from 2009 shows what can happen when an electric generator is manipulated via the web. The Aurora Project demonstrated that network access to a generator's controls is very dangerous: it showed how a 27-ton, 1-megawatt power generator can be destroyed simply by hacking into it from a common laptop. Imagine gaining access to all of the generators in the U.S. and then simply pressing the Enter key on your keyboard.

Skilled hacking by a malicious programmer or an al-Qaeda professional could accomplish this, and a few malicious hackers attacking the power grid could cripple much of the US economy.

What would we do without electricity for four months, or perhaps up to a year?

Transcript of the Stuxnet video:

STEVE KROFT: For the past few months now, the nation's top military, intelligence, and law enforcement officials have been warning Congress and the country about a coming cyber attack against critical infrastructure in the United States that could affect everything from the heat in your home to the money in your bank account. The warnings have been raised before, but never with such urgency, because this new era of warfare has already begun. The first attack using a computer virus called Stuxnet was launched several years ago against an Iranian nuclear facility, almost certainly with some US involvement. But the implications and the possible consequences are only now coming to light. 

[TICKING] 

STEVE KROFT (VOICEOVER): The story will continue in a moment. 

ROBERT MUELLER: I do believe that the cyber threat will equal or surpass the threat from counter-terrorism in the foreseeable future. 

LEON PANETTA: There's a strong likelihood that the next Pearl Harbor that we confront could very well be a cyber attack. 

MIKE ROGERS: We will suffer a catastrophic cyber attack. The clock is ticking. 

STEVE KROFT (VOICEOVER): And there's reason for concern. For more than a decade, the US military establishment has treated cyberspace as a domain of conflict where it would need the capability to fend off attack or launch its own. That time is here, because someone sabotaged the top-secret nuclear installation in Iran with nothing more than a long string of computer code. 

MICHAEL HAYDEN: We have entered into a new phase of conflict in which we use a cyber weapon to create physical destruction, and in this case, physical destruction in someone else's critical infrastructure. 

STEVE KROFT (VOICEOVER): Few people know more about the dark military art of cyber war than retired General Michael Hayden. He's a former head of the National Security Agency and was CIA director under George W. Bush. He knows a lot more about the attack on Iran than he can say here. 

MICHAEL HAYDEN: This was a good idea, all right? But I also admit, this was a really big idea, too. The rest of the world is looking at this and saying, clearly someone has legitimated this kind of activity as acceptable international conduct. The whole world is watching. 

STEVE KROFT (VOICEOVER): The story of what we know about the Stuxnet virus begins in June of 2010 when it was first detected and isolated by a tiny company in Belarus after one of its clients in Iran complained about a software glitch. Within a month, a copy of the computer bug was being analyzed within a tight-knit community of computer security experts. And it immediately grabbed the attention of Liam O'Murchu, an operations manager for Symantec, one of the largest antivirus companies in the world. 

LIAM O'MURCHU: As soon as we saw it, we knew it was something completely different. And red flags started to go up straight away. 

STEVE KROFT (VOICEOVER): To begin with, Stuxnet was incredibly complicated and sophisticated, beyond the cutting edge. It had been out in the wild for a year without drawing anyone's attention and seemed to spread by way of USB thumb drives, not over the internet. O'Murchu's job was to try and unlock its secrets and assess the threat for Symantec's clients by figuring out what the malicious software was engineered to do and who was behind it. 

STEVE KROFT: How long was the Stuxnet code? 

LIAM O'MURCHU: You're talking tens of thousands of lines of code; a very, very long project; very well written; very professionally written; and very difficult to analyze. 

STEVE KROFT (VOICEOVER): Unlike the millions of worms and viruses that turn up on the internet every year, this one was not trying to steal passwords, identities, or money. Stuxnet appeared to be crawling around the world, computer by computer, looking for some sort of industrial operation that was using a specific piece of equipment, a Siemens S7-300 programmable logic controller. 

LIAM O'MURCHU: This gray box here is essentially what runs factory floors. And you program this box to control your equipment. Then you say, turn on a conveyor belt, turn on a heater, turn on a cooler, shut the plant down. It's all contained in that box. And that's what Stuxnet was looking for. It wanted to get its malicious code onto that box. 

STEVE KROFT (VOICEOVER): The programmable logic controller, or PLC, is one of the most critical pieces of technology you've never heard of. They contain circuitry and software essential for modern life and control the machines that run traffic lights, assembly lines, oil and gas pipelines, not to mention water treatment facilities, electric companies, and nuclear power plants. 

LIAM O'MURCHU: And that was very worrying to us, because we thought it could have been a water treatment facility here in the US or could have been trying to take down electricity plants here in the US. 

STEVE KROFT (VOICEOVER): The first breakthrough came when O'Murchu and his five-man team discovered that Stuxnet was programmed to collect information every time it infected a computer and to send it on to two websites in Denmark and Malaysia. Both had been registered with a stolen credit card. And the operators were nowhere to be found. But O'Murchu was able to monitor the communications. 

LIAM O'MURCHU: Well, the first thing we did was we looked at where the infections were occurring in the world. And we mapped them out. And that's what we see here. We saw that 70% of the infections occurred in Iran. That's very unusual for malware that we see. We don't normally see high infections in Iran. 

RALPH LANGNER: Please learn from Stuxnet. 

STEVE KROFT (VOICEOVER): Two months later, Ralph Langner, a German expert on industrial control systems, added another piece of important information. Stuxnet didn't attack every computer it infected. 

RALPH LANGNER: This whole virus is designed only to hit one specific target in the world. 

STEVE KROFT: How could you tell that? 

RALPH LANGNER: It goes through a sequence of checks to actually determine if this is the right target. It's kind of a fingerprinting process, a process of probing if this is the target I'm looking for. And if not, it just leaves the controller alone. 

STEVE KROFT (VOICEOVER): Stuxnet wasn't just looking for a Siemens controller that ran a factory floor. It was looking for a specific factory floor with a specific type and configuration of equipment, including Iranian components that weren't used anywhere else in the world, and variable speed motors that might be used to regulate spinning centrifuges, a fragile piece of equipment essential to the enrichment of uranium. And Langner speculated publicly that Stuxnet was out to sabotage Iran's nuclear program. 

RALPH LANGNER: What we knew at this time, that the highest number of infections had been reported in Iran. And second was pretty clear, just by looking at the sophistication, that there would be at least one nation state behind this. Now, you just add one and one together. 

STEVE KROFT (VOICEOVER): By the fall of 2010, the consensus was that Iran's top-secret uranium enrichment plant in Natanz was the target and that Stuxnet was a carefully constructed weapon designed to be carried into the plant on a corrupted laptop or thumb drive, then infect the system, disguise its presence, move through the network changing computer code, and suddenly alter the speed of the centrifuges without the Iranians ever noticing-- sabotage by software. 

LIAM O'MURCHU: Stuxnet's entire purpose is to control centrifuges, to make centrifuges speed up past what they're meant to spin at, and to damage them. Certainly, it would damage the uranium enrichment facility. And they would need to be replaced. 

STEVE KROFT: If the centrifuges were spinning too fast, wouldn't the operators at the plant know that? 

LIAM O'MURCHU: Stuxnet was able to prevent the operators from seeing that on their screen. The operators would look at the screen to see what's happening with the centrifuges. And they wouldn't see that anything bad was happening. 

STEVE KROFT (VOICEOVER): It now seems likely that by the time O'Murchu and Langner finally unraveled the mystery in November of 2010, Stuxnet had already accomplished at least part of its mission. Months before the virus was first detected, inspectors from the International Atomic Energy Agency had begun to notice that Iran was having serious problems with its centrifuges at Natanz. 

LIAM O'MURCHU: What we know is that an IAEA report said that 1,000 to 2,000 centrifuges were removed from Natanz for unknown reasons. And we know that Stuxnet targets 1,000 centrifuges. So from that, people are drawing the conclusion, well, Stuxnet got in and succeeded. That's the only evidence that we have. 

STEVE KROFT: The only information that's not classified. 

LIAM O'MURCHU: Yes. 

STEVE KROFT (VOICEOVER): And there are lots of things about Stuxnet that are still top secret. 

STEVE KROFT: Who was behind it? 

LIAM O'MURCHU: What we do know is that this was a very large operation. We're really looking at a government agency from some country who is politically motivated and who has the insider information from a uranium enrichment facility that would facilitate building a threat like this. 

STEVE KROFT: An intelligence agency, probably. 

LIAM O'MURCHU: Probably. 

RALPH LANGNER: We know from reverse engineering the attack codes that the attackers have full-- and I mean this literally-- full technical knowledge of every damn detail of this plant. So you could say, in a way, they know the plant better than the Iranian operator. 

STEVE KROFT (VOICEOVER): We wanted to know what retired General Michael Hayden had to say about all this since he was the CIA director at the time Stuxnet would have been developed. 

STEVE KROFT: You left the CIA in 2009? 

MICHAEL HAYDEN: 2009, right. 

STEVE KROFT: Does it surprise you that this happened? 

MICHAEL HAYDEN: You need to separate my experience at the CIA with your question. 

STEVE KROFT: Right. 

MICHAEL HAYDEN: All right? 

STEVE KROFT: You can't talk about the CIA stuff. 

MICHAEL HAYDEN: No. And it's-- and I don't even want to suggest what may have been on the horizon or not on the horizon or anything like that. 

STEVE KROFT: Right. If you look at the countries that have the capability of designing something like Stuxnet and you take a look at the countries that would have a motive for trying to destroy Natanz-- 

MICHAEL HAYDEN: Where do those two sets intersect? 

STEVE KROFT: --you're pretty much left with the United States and Israel. 

MICHAEL HAYDEN: Well, yes. But there is no good with someone of my background even speculating on that question, so I won't. 

STEVE KROFT (VOICEOVER): Iran's President Mahmoud Ahmadinejad, shown here at Natanz in 2008, blamed the cyber attack on enemies of the state and downplayed the damage. Both the US and Israel maintain that it set back the Iranian program by several years. What is impossible to know is how much damage the attackers might have inflicted if the virus had gone undetected and not been exposed by computer security companies trying to protect their customers. 

RALPH LANGNER: They planned to stay in that plant for many years and to do the whole attack in a completely covert manner, so that any time a centrifuge would break, the operators would think this is, again, a technical problem that we have experienced, for example, because of poor quality of these centrifuges that we are using. 

LIAM O'MURCHU: We had a good idea that this was a blown operation, something that was never meant to be seen. It was never meant to come to the public's attention. 

STEVE KROFT: You say blown, meaning-- 

LIAM O'MURCHU: If you're running an operation like this to sabotage a uranium enrichment facility, you don't want the code uncovered. You want it kept secret. And you want it to just keep working, stay undercover, do its damage, and disappear. And hopefully nobody would ever see it. 

STEVE KROFT: Do you think this was a blown operation? 

MICHAEL HAYDEN: No, not at all. I think it's an incredibly sophisticated operation. 

STEVE KROFT (VOICEOVER): But General Hayden did acknowledge that there are all sorts of potential problems and possible consequences that come with this new form of warfare. 

MICHAEL HAYDEN: When you use a physical weapon, it destroys itself in addition to the target, if it's used properly. A cyber weapon doesn't. So there are those out there who can take a look at this, study it, and maybe even attempt to turn it to their own purposes. 

STEVE KROFT (VOICEOVER): Such as launching a cyber attack against critical infrastructure here in the United States. Until last fall, Sean McGurk was in charge of protecting it as head of cyber defense at the Department of Homeland Security. He believes that Stuxnet has given countries like Russia and China, not to mention terrorist groups and gangs of cyber criminals for hire, a textbook on how to attack key US installations. 

SEAN MCGURK: You can download the actual source code of Stuxnet now. And you can repurpose it and repackage it and then point it back towards wherever it came from. 

STEVE KROFT: Sounds a little bit like Pandora's box. 

SEAN MCGURK: Yes. 

STEVE KROFT: Whoever launched this attack-- 

SEAN MCGURK: They opened up the box. They demonstrated the capability. They showed the ability and the desire to do so. And it's not something that can be put back. 

STEVE KROFT: If somebody in the government had come to you and said, look, we're thinking about doing this, what do you think, what would you have told them? 

SEAN MCGURK: I would have strongly cautioned them against it because of the unintended consequences of releasing such a code. 

STEVE KROFT: Meaning that other people could use it against you. 

SEAN MCGURK: Yes. 

STEVE KROFT: Or use their own version of the code. 

SEAN MCGURK: Something similar-- Son of Stuxnet, if you will. 

STEVE KROFT (VOICEOVER): As a result, what was once abstract theory has now become a distinct possibility. 

STEVE KROFT: If you can do this to a uranium enrichment plant, why couldn't you do it to a nuclear power reactor in the United States or an electric company? 

LIAM O'MURCHU: You could do that to those facilities. It's not easy. It's a difficult task. And that's why Stuxnet was so sophisticated. But it could be done. 

RALPH LANGNER: You don't need many billions. You just need a couple of millions. And this would buy you a decent cyber attack, for example, against the US power grid. 

STEVE KROFT: If you were a terrorist group or a failed nation state and you had a couple of million dollars, where would you go to find the people that knew how to do this? 

RALPH LANGNER: On the internet. 

STEVE KROFT: They're out there? 

RALPH LANGNER: Sure. 

STEVE KROFT (VOICEOVER): Most of the nation's critical infrastructure is privately owned and extremely vulnerable to a highly sophisticated cyber weapon like Stuxnet. 

SUSAN COLLINS: I can't think of another area in Homeland Security where the threat is greater and we've done less. 

STEVE KROFT (VOICEOVER): After several failures, Congress is once again trying to pass the nation's first cybersecurity law. And once again, there is fierce debate over whether the federal government should be allowed to require the owners of critical infrastructure to improve the security of their computer networks. Whatever the outcome, no one can say the nation hasn't been warned. 

[TICKING] 

Credit: CBS News
Figure 47. Visualizing the US Electric Grid.
Credit: Map courtesy of National Public Radio (NPR). A series of NPR stories offers detailed discussions.

All advanced societies depend on infrastructure, and the more advanced the nation, the greater that dependency and the greater the consequences when the infrastructure is compromised.

In developing this course, I was reminded of some of the lessons I learned after analyzing physical infrastructure as networks.

It is not hard to envision the infrastructure network of a major country. Such a network is critical to a nation's international trade, and such networks are built for efficiency, not for robustness and resiliency. That efficiency affects the network in two main ways: when it works as designed and when it fails. The nodes in the network above are connected with close to the minimum number of links in order to avoid redundancy. If this network is hit by a random failure, more often than not the result is not catastrophic; there are numerous places the network can fail with only localized effects.

Unfortunately, the design focus on efficiency might be used against us in an attack. Efficient networks can often handle random failures, but they are vulnerable to targeted attacks on the nodes that control their connectivity. The network above is both highly efficient and highly vulnerable: you only need to disable a few nodes for it to dissolve into disconnected components, halting flow through the system. This means geographically targeted attacks can succeed by hitting only a few nodes or a few links. If the plan were to disable the network above, which nodes would you select? The sketch below contrasts random failures with a targeted attack on such a network.
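The contrast can be illustrated with a toy simulation. The sketch below is a minimal, hypothetical example (assuming Python 3 with the networkx library) in which a scale-free random graph stands in for an efficient, hub-dependent infrastructure network; it measures how much of the network remains connected after the same number of random node failures versus targeted removals of the best-connected hubs.

```python
# Minimal, illustrative sketch (assumes Python 3 with the networkx library).
# A scale-free random graph stands in for an efficient, hub-dependent
# infrastructure network; real grid data would replace it in practice.
import random
import networkx as nx

def largest_component_fraction(graph):
    """Fraction of nodes remaining in the largest connected piece of the network."""
    if graph.number_of_nodes() == 0:
        return 0.0
    largest = max(nx.connected_components(graph), key=len)
    return len(largest) / graph.number_of_nodes()

def after_removal(graph, nodes_to_remove):
    """Return a copy of the graph with the given nodes removed."""
    damaged = graph.copy()
    damaged.remove_nodes_from(nodes_to_remove)
    return damaged

random.seed(42)
network = nx.barabasi_albert_graph(200, 2, seed=42)  # toy "efficient" network
losses = 20  # number of nodes taken down in either scenario

# Random failure: twenty arbitrary nodes go down.
random_losses = random.sample(list(network.nodes), losses)

# Targeted attack: the twenty best-connected hubs are taken down.
hubs = sorted(network.degree, key=lambda node_degree: node_degree[1], reverse=True)
targeted_losses = [node for node, _ in hubs[:losses]]

print("Largest component after random failures:",
      round(largest_component_fraction(after_removal(network, random_losses)), 2))
print("Largest component after targeted attack:",
      round(largest_component_fraction(after_removal(network, targeted_losses)), 2))
```

In a hub-dependent graph like this, the targeted removals generally shrink the largest connected component far more than the same number of random failures, mirroring the efficiency-versus-vulnerability trade-off described above.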

Our systems of the future have to be designed under the conflicting constraints of efficiency and resilience. Resilience requires redundancy, which is in conflict with efficiency. Redundancy provides failover pathways in a network, so other routes are available to continue the flow should one fail. Without it, as is often the case today, we must simply sit and wait until the network is repaired.

Redundancy as a Design Constraint

The secret to resiliency is alternative paths through the network, called "dual homing" in communications networks. This becomes a design question of where to put the alternative paths. Network analysis can be used to identify the most likely points of failure, and other factors, such as geography, can help determine which nodes and links are most easily attacked. We cannot plan for every possible attack, but we can build some alternate pathways into our systems so they are more robust: they should degrade gradually after an attack rather than fall apart after a few well-targeted strikes. A simple sketch of this kind of analysis follows.
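As a concrete illustration, the minimal sketch below (again a hypothetical Python/networkx example with a made-up eight-node topology) finds the articulation points, the individual nodes whose loss disconnects the network, and then shows that a single added cross-link, a crude form of dual homing, eliminates them.

```python
# Minimal sketch (assumes Python 3 with networkx) of using network analysis to
# find single points of failure and to check whether one added "dual homing"
# link removes them. The small example topology is purely illustrative.
import networkx as nx

# Hypothetical communications network: two clusters joined through one relay node.
links = [
    ("A", "B"), ("B", "C"), ("C", "A"),   # cluster 1
    ("C", "R"),                           # single path into the relay
    ("R", "D"),                           # single path out of the relay
    ("D", "E"), ("E", "F"), ("F", "D"),   # cluster 2
]
network = nx.Graph(links)

# Articulation points are nodes whose loss disconnects the network:
# the "easy points of failure" a targeted attack would go after.
print("Single points of failure:", sorted(nx.articulation_points(network)))
# -> ['C', 'D', 'R']

# Add one redundant path between the clusters (dual homing) and re-check.
network.add_edge("B", "E")
print("After adding a redundant link:", sorted(nx.articulation_points(network)))
# -> []
```

The design choice is where to place that extra link; in practice the candidate locations would be weighed against geography, cost, and the attack scenarios of greatest concern.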

We live in a world of increasing interconnectedness and interdependent networks. In the future, we have to build this infrastructure in new ways that focus not only on efficiency but also on robustness and the ability to bounce back from an attack or catastrophic failure.