Many still credit Engelbart only with technological innovations like the mouse, the outline processor, the electronic-mail system, or sometimes the windowed user interface. These indeed are major innovations, and today they have become pervasive in the environments in which people work and play. But Douglas Engelbart never really gets credit for the larger contribution that he worked to create: an integrative and comprehensive framework that ties together the technological and social aspects of personal computing technology. Engelbart articulated a vision of the world in which these pervasive innovations are supposed to find their proper place. He and other innovators of this new technology defined its future on the basis of their own aspirations and ideologies. Those aspirations included nothing less than the development, via the interface between computers and their users, of a new kind of person, one better equipped to deal with the increasing complexities of the modern world.
The Atlantic on the Inventor of the Mouse
Listen to the Experts
My former commander here in Germany was Admiral James Stavridis. He constantly pushed for openness and collaboration among the partner nations of NATO. As the NATO Supreme Commander, he tried to imagine global security driven by collaboration - among agencies, governments, the private sector, and the public. Transparency would be key, and it was his vision that bridges were the next century's construct for peace - walls just don't work, as has been proven time and again. In a July 2012 TED Talk, Admiral Stavridis shared moments from recent military history to explain why the security of the future should be built with bridges rather than walls. What could 21st-century security look like? He suggests that dialogue and openness will be the game-changers. After his 37 years in the Navy, it is a talk well worth listening to (16:43).
The future is only now starting to take shape based on the new connective technologies and one other aspect – what some call the rise of “Big Data.” We have already discussed the transforming technologies that were installed on the African continent in the first decade of the 21st century. What we haven’t discussed up to this point is a fourth aspect (not to say there might not be others – e.g., O3b Networks' launch of four new satellites the week of 25 June 2013, dedicated to providing low-cost Internet connectivity to underserved areas) that also contributed to such things as the Arab Spring.
As recently as 2000, most information stored on the planet (explicit knowledge) was stored in analog form. In the May/June 2013 issue of Foreign Affairs, Kenneth Cukier and Viktor Mayer-Schoenberger estimate that only about 25% of explicit knowledge was digitized in 2000. Everything else was on paper, on tape, or in some other analog form. This changed over the same decade in which connectivity changed. Today, the same authors estimate that analog information represents less than 2% of the current total – recall that analog information made up roughly 75% of the total in 2000. The vast majority of information – 98% – is now in digital form, and, according to a recent study, over 90% of this digital data has been created in the last 2-3 years.
Sampling used to be the key to understanding the subtle information of behavior and science. A small, “good,” unbiased sample gave us insight into what was going on in a population without requiring the time or the resources to measure a characteristic (a parameter) of the entire population. With the advent of “Big Data,” some advocates argue that the sample is no longer as important for these data sets. While we may never reach “n=all” of a population, it is becoming evident that for some datasets, we are developing the tools to get closer. What does this mean? Messy data is OK, as long as we get enough of it. Answering “why” becomes harder to achieve even as “what” becomes easier. The authors of the same article refer to a human tendency to infer causation from data where there may not be any. In a sense, this is similar to apophenia – seeing patterns in nature where none may exist. Regardless, “Big Data” can allow us to see the “what” of what is happening even if we may not understand the “why,” and it also allows us the opportunity to examine the “where.” The “geo” piece – enabled by the infrastructure recently installed in places where there was none previously – is also an aspect of Big Data.
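The sample-versus-"n=all" distinction can be made concrete with a small sketch. The population size, the 30% rate, and the sample size below are all illustrative assumptions, not figures from the article; the point is only that a small unbiased sample estimates the parameter with some error, while measuring (nearly) everyone recovers it almost exactly.

```python
import random

# Hypothetical population of 1,000,000 people, ~30% of whom exhibit
# some behavior of interest. (All numbers here are illustrative.)
random.seed(42)
population = [1 if random.random() < 0.30 else 0 for _ in range(1_000_000)]

# The parameter: the true rate across the whole population ("n=all").
true_rate = sum(population) / len(population)

# The classic approach: estimate the parameter from a small unbiased sample.
sample = random.sample(population, 1_000)
sample_rate = sum(sample) / len(sample)

print(f"true rate (n=all): {true_rate:.4f}")
print(f"sample estimate:   {sample_rate:.4f}  (n=1,000)")
```

The sample estimate lands within a percentage point or two of the true rate, which is exactly why sampling worked so well for so long; Big Data's promise is simply that, for some datasets, the first number is now directly measurable.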
One example of the analysis now possible is Dr. Ming-Hsiang Tsou’s work on Twitter data, discussed earlier; it is similar to the research results released by Google on predicting the patterns of a flu pandemic, cited in the same article above. Google was able to establish the patterns of the spread of flu in the US based on analysis of people's search patterns on certain key phrases and words. Again, the advantage is speed – hours for Google, compared with roughly two weeks under the CDC's current reporting methods.
Technologists view technology as the key to this transformation. Modern computers and the Internet are enablers because they lower transaction costs – storage costs, processing costs, and information-sharing costs – and as a result, masses of information not previously captured in digital form can now be digitized. Google is doing this with its augmented-reality glasses, digitizing a random glance; Twitter does the same by digitizing random thoughts and impressions. Implicit knowledge is being digitized just as explicit knowledge once was. Once the thoughts are digitized and shared, the implicit information is transformed, and new value can be gleaned from it – intelligence value from something not capturable in the past. The struggle to explain and understand falls to the traditional intel analysts looking for the “why” in the data. For the time being, the IC may have to be satisfied with the “what.” Causation may have to take a back seat to correlation while events are unfolding: knowing “what” is happening may have to be considered actionable without knowing “why” it is happening. The “why” may have to wait until there is time to reflect on the events. This is the real struggle in both the practice (tradecraft) and the explanations centered on what we refer to as the information flow through cyberspace. The old models and ways of thinking are not going to be enough. Why? Because what is being captured and digitized, in many forms, is implicit knowledge – how people think and what they see. Why they think what they do is a much larger question – but in the current state of digitization, at least the implicit ideas are being visualized dynamically, and there is a huge geolocation piece tied to this data. Governments and organizations that harness the new value created by Big Data (and the geo component is a huge piece of understanding it) will have an edge over those that cannot.
McNamara’s fixation on body counts in Vietnam is a cautionary example of seeking causation for understanding. The metric – “if we kill enough of them, then we must be winning” – implied causation between the number of enemy killed and the desired outcome of the war. It proved to be a huge falsehood the US government bought into at the time, and it led to significant policy mistakes.
Transparency in data in democratic societies is a natural evolution from the passage of laws such as the Freedom of Information Act (FOIA). The presence of such transparency can be indicative of aspects of a society, whether brought about by governance in a democratic society or by connectivity in a less-than-democratic one. Transparency is transformative – we need look no further than the Maghreb to understand this. The transparency enabled by the connectivity of technology unleashed the phenomena that brought down governments that had been in power for decades. Whether the transformation leads to democratic governance is still in question – one need look no further than the evolving situation in Egypt to be confronted with this fact. Connectivity alone may not be enough. That’s why the correlation analysis of the NGO datasets done earlier is important. That’s why the study and discussion of transformational technologies are important. That’s part of the real reason this course is important. Information cannot exist without a medium to support it, whether a stone tablet, a piece of paper, or a hard drive. That medium also has a location. Implicit knowledge now has such a medium – one that did not exist before, enabled and made possible by the Internet. As stated earlier, this understanding can provide an edge, but it also helps point out the vulnerabilities created by connectivity.
The world is increasingly connected and is getting more so. Everything from bank accounts to medical records is available, to some degree, online for a connected society. All this information can exist only if there is a medium to support it, and all of that support depends on a viable, sustainable, and reliable power grid. According to a recent article in Scientific American, the U.S. military is studying scenarios for future conflict in cyberspace. Such a war would be waged – at least partially – with computers, and its targets would be a peer competitor's information infrastructure. The connected infrastructure creates opportunities for cyber-attacks with disastrous results for large urban areas.
As far back as the Clinton administration, various lists of Critical National Information Infrastructure (NII) (read PDD 63 for a better understanding) were compiled and examined for their criticality and vulnerability. The document lists the following as types of critical infrastructures to be protected:
- Electric power
- Banking and finance
- Petroleum and natural gas
- Emergency services
as pieces of the NII that would need to be defended. These systems do not stand alone but are interconnected in terms of power and information, as shown in Figure 45; thus a failure in one system may cause disruption of services or failures in another system. The latest report from the Council on Foreign Relations estimates that "the “Internet of things”—cars, ovens, office copiers, electrical grids, medical implants, and other Internet-connected machines that collect data and communicate—could result in thirty-one billion devices connected to the Internet in 2020." (The August 2013 Scientific American actually refers to an "Internet of Everything" - to me this speaks to a lack of geospatial awareness - we need to think of the web as the Internet of "Everywhere.")
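The interdependency point can be sketched as a small dependency graph. The edges below are illustrative assumptions (they are not taken from Figure 45 or PDD 63), but they show how a single failure propagates to every system that depends on it, directly or indirectly:

```python
# Minimal sketch of cascading infrastructure failure.
# depends_on[x] lists the systems that x requires to operate.
# These dependency edges are illustrative assumptions only.
depends_on = {
    "electric power":            [],
    "telecommunications":        ["electric power"],
    "banking and finance":       ["electric power", "telecommunications"],
    "petroleum and natural gas": ["electric power"],
    "emergency services":        ["electric power", "telecommunications"],
}

def cascade(initial_failure):
    """Return every system that fails once `initial_failure` goes down."""
    failed = {initial_failure}
    changed = True
    while changed:  # keep propagating until no new failures appear
        changed = False
        for system, deps in depends_on.items():
            if system not in failed and any(d in failed for d in deps):
                failed.add(system)
                changed = True
    return failed

print(sorted(cascade("electric power")))
print(sorted(cascade("telecommunications")))
```

Under these assumed edges, losing electric power takes down every system in the graph, while losing telecommunications takes down banking and emergency services but leaves power and petroleum running – which is the asymmetry that makes the grid the critical node.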
The full CFR report can be downloaded: "Defending an Open, Global, Secure, and Resilient Internet". The CFR Task Force found that improved cyber defense and greater resiliency are necessary - but not sufficient by themselves. "Offensive capabilities are required to deter attacks, and, if deterrence fails, to impose costs on the attackers." It recommends that the United States launch an "interagency economic counterespionage program that will help prevent foreign services and corporate competitors from stealing secrets from U.S. industry." One aspect of deterrence as a strategy, left over from the Cold War, is that deterrence is not possible without talking about a second-strike capability. According to the "Second Strike" entry on Wikipedia:
"The possession of second-strike capabilities counters a first-strike nuclear threat and can support a no first use ....strategy. Reciprocal second-strike capabilities usually cause a mutual assured destruction defence strategy, though one side may have a lower level minimal deterrence response."
Many questions are on the table for this domain, for example, "What constitutes a second-strike capability in Cyberspace?" The answers, assuming they have been defined, remain classified. Martin Libicki has even argued that, "because cyberspace is so different a medium, the concepts of deterrence and war may simply lack the logical foundations that they have in the nuclear and conventional realms."
On 2 January 2013, the Wall Street Journal reported that attacks on critical U.S. energy infrastructure are occurring more frequently than previously realized. DHS’s cyber response team issued a report that thousands of SCADA systems used in the energy infrastructure are linked to the Web and, as a result, are vulnerable (Figure 46). Hundreds of attacks were reported to DHS’s Industrial Control Systems Cyber Emergency Response Team (ICS-CERT) in 2012, over 40% of which were in the energy sector. The team “has been tracking threats and responding to intrusions into infrastructure such as oil and natural gas pipelines and electric power organizations at an alarming rate.” (The full ICS-CERT report can be downloaded.)