John Johnston is Professor of English and Comparative Literature at Emory University, where he teaches literature and science, media theory, and technology. He is the author of several books on these subjects.
Malware and criminal operations performed by botnets on the Internet not only pose a new threat, but also point to our increasing reliance upon a new form of machinic agency, which I call the webbot assemblage. Whereas news media coverage of its operations considers only their human aspects, mostly in relation to crime and cyberterrorism, Daniel Suarez's recent novel Daemon examines the implications of the webbot assemblage itself.
In recent years, sophisticated new forms of cybercrime and cyber warfare have
displaced spam, pornography, and vandalistic viruses as the most visible threats from
the Internet's dark underside. Short for "malicious software," malware includes
viruses, worms, trojan horses, spyware, rootkits, and other software designed to
exploit vulnerable computer networks. These threats have drawn extensive coverage in
the news media.
While certainly justified — indeed, to ignore Internet security and the dangers of
malware would be foolish — this media attention is concerned almost exclusively with
the human face
of a vast technological assemblage whose machinic operations
mostly remain obscure. Specifically, our greatly increased dependency on the Internet
necessarily also means our increased dependency upon a variety of bots
(software robots) and intelligent agent software more generally. Much of what happens
on the Internet is enabled or carried out by web bots, spiders, screen scrapers, and
many quasi-autonomous software systems. Although essential to the functioning of our
current information society, the new forms of machinic agency that bots instantiate
have received very little critical attention outside the circles of Internet security
and data mining professionals. Here, I will examine what I call the webbot assemblage
from multiple, partially overlapping perspectives – first, new malware and Internet
security, second, a contemporary cyber-thriller in which a webbot assemblage figures
centrally, and third, the dynamically changing nature of the Internet itself. My aim
is to sketch a new understanding of the evolving and complex imbrication of human and
machinic agency that the Internet is bringing about. Indeed, the developing
technology of the webbot assemblage is inseparable from many of the dynamical changes
we have witnessed in the Internet itself over the past decade or so, as it has
acquired the traits of an ecosystem defined by netwar, software arms races, and the
possible evolution of low,
barely intelligent forms of artificial life.
Unlike the computer voices on the telephone with which we frequently interact, most
bot activity remains invisible. In eerie silence, countless numbers of bots
tirelessly search for, record, retrieve, sift through, and act upon the
ever-enlarging masses of data without which our contemporary high-tech world could
not function. While most of this activity occurs on the Internet, it is instigated by
and purportedly serves the interests of people at the front-end,
in offices
and at desktops everywhere. Much of financial management, for example, is automated
by bots, which more and more often determine whether or not we get a loan or
mortgage. Bots scan x-rays and MRIs, function as players in online games and as
purchasing agents for brokerage houses. They operate and monitor surveillance cameras
all over the globe, as unblinking eyes that watch and record many of our activities —
our movements, spending habits, commercial transactions, and health records — which
other bots in turn analyze for patterns which are then sold on the market. The
massive increase in cell phone and e-mail surveillance since 9/11 would not be
possible without bots. In fact, the Internet itself, which we commonly think of as a
network of people using machines, is increasingly used for machine-to-machine
exchange, specifically Electronic Data Interchange (EDI). In sum, Internet bots now
automate a widening range and number of activities that until recently only humans
could perform.
Initially, bots were a basic tool for network maintenance and data management. But
with the Internet's accelerated use and expansion in the late 1990s, bots were
developed that could search the Web, download pages or selected bodies of information
following refined search criteria, and then bundle it neatly in a file for the human
user.
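To make concrete the kind of search-and-retrieve bot just described, the following is a minimal sketch in Python. It is illustrative only: the seed URL, keyword, and page limit are assumptions of my own rather than features of any actual tool. Starting from a seed page, the bot follows links, keeps pages whose titles match a search term, and bundles the results neatly in a file for its human user.

```python
import json
import urllib.request
from html.parser import HTMLParser

class TitleAndLinkParser(HTMLParser):
    """Collects the page title and the outgoing links of an HTML document."""
    def __init__(self):
        super().__init__()
        self.links, self.title, self._in_title = [], "", False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "a":
            self.links.extend(value for name, value in attrs if name == "href" and value)

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def crawl(seed_urls, keyword, max_pages=20):
    """Fetch pages breadth-first and keep those whose title mentions the keyword."""
    queue, seen, matches = list(seed_urls), set(), []
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen or not url.startswith("http"):
            continue  # skip revisits and relative or non-HTTP links
        seen.add(url)
        try:
            html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        except OSError:
            continue  # unreachable page: move on, as a patient bot would
        parser = TitleAndLinkParser()
        parser.feed(html)
        if keyword.lower() in parser.title.lower():
            matches.append({"url": url, "title": parser.title.strip()})
        queue.extend(parser.links)
    return matches

if __name__ == "__main__":
    # Bundle the findings neatly in a file for the human user.
    results = crawl(["https://example.com"], keyword="example")
    with open("results.json", "w") as f:
        json.dump(results, f, indent=2)
```

Even a toy like this exhibits the traits that matter here: once launched, it works through the network tirelessly and invisibly, and its human user sees only the bundled results.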
Such capabilities, however, have also been turned to criminal ends, most notoriously in the creation of botnets. In its simplest form, a botnet is an army of compromised computers that takes orders from a "botherder." As one account of botnet software explains:
The software that creates and manages a botnet makes this threat much more than the previous generation of malicious code. It is not just a virus; it is a virus of viruses. The botnet is modular — one module exploits the vulnerabilities it finds to gain control over its target. It then downloads another module that protects the new bot by stopping antivirus software and firewalls; the third module may begin scanning for other vulnerable systems
Each compromised machine becomes a "zombie" that the botherder can direct to send waves of spam or pornography, or else enslave in huge botnets that are mobilized in Distributed Denial of Service (DDoS) attacks, which overload and crash the target website, rendering it inaccessible. Initially such attacks were directed against online gambling sites and large corporate web sites, but more recently all sorts of organizations and even governments have been targeted. Meanwhile, "darknets" constitute an extensive network of underground sites used to develop and market the essential tools of the trade.
One significant side effect of these DDoS attacks has been to make more visible both
the power of bots and our greatly increased dependency upon them. The problem of
countering the threats they pose will be considered later; here let it suffice to
note that with the constant development of Internet security measures we are
witnessing an escalation of the malware wars,
which are usually represented as
an evolving software arms race between the good guys and the bad guys,
with
the future of the Internet at stake. These developments — and foremost the greatly
increased sophistication of the tools that now make up criminal webbot assemblages —
indicate an important historical shift. This is evident in the large numbers of
criminal hackers, their complex organization, and often the concerted nature of their
actions, which sharply contrast with the practices of the preceding epoch, when only
small numbers of relatively isolated hackers wrote and launched computer viruses for
vandalistic, anti-social motives or simply to experiment with software in the
wild.
This historical shift to large criminal organizations and thus to a well-financed criminal hacker class whose motivation is purely monetary or economic is not the whole picture, however. Many recent events suggest that the line between the mafia hacker and the State-supported hacker is becoming blurred. In his book Inside Cyber Warfare, Jeffrey Carr describes attacks framed by their perpetrators and their State sponsors as a "counter cyber-offensive" against US aggression. As a result, as Carr puts it, non-State hackers "[have become] a protected asset."
This blurring recalls Deleuze and Guattari's concept of "nomad war machines," which, they theorize, have always existed apart from but can also be appropriated by a State apparatus. In their account, the State can "capture" but never fully "internalize" the war machine. Significantly, among the latter's many attributes and associations, secrets and betrayal figure largely. However, in the contemporary context the primarily technical aspect of war machines and the importance of code assume a significance not discussed by Deleuze and Guattari.
The Stuxnet worm provides a highly instructive example. Its apparent purpose was
simple: to slow down Iran's production of high-grade uranium by destroying the
functionality of its centrifuge machines. However, Stuxnet represents a striking leap
forward in complexity and functional design. Based on a rootkit and a multi-functional
set of software modules much like the criminal botnet assemblage described earlier,
it adds a large array of components to increase its chances of success. For example,
it exploits several "zero-day" Windows vulnerabilities (vulnerabilities still largely
unknown to software developers and for which vendors have yet to issue a security fix),
increases the number of possible infection paths, includes new antivirus evasion
techniques and two forged digital signatures, and it can be easily updated and
possesses a command-and-control interface.
The kind of multi-functionality now evident in webbot software is dramatically illustrated in Daniel Suarez's popular cyber-thriller Daemon. Coordinated by "the Daemon," as this new form of machinic agency comes to be called, the bots carry out increasingly complicated scenarios. First, they recruit several human agents, including a journalist who helps to disguise the Daemon's murderous and destructive actions by shifting the blame onto an investigating police officer whose supposed theft of Sobol's money is publicly exposed. Almost invisibly, armies of bots then remorselessly begin to dismantle our current society and reconstruct it as a fully distributed, automated system.
An unlikely event triggers the novel's initial action: two of the leading programmers
at Cyberstorm Entertainment, a highly successful producer of Internet games, die of
what first appear to be high-tech accidents. One has his throat slit by a wire that
rises up across the path where he normally rides his motorcycle; the other is
electrocuted when he tries to enter the company’s data center and possibly shut down
the servers. However, the ensuing police investigation reveals that these deaths are
very sophisticated, automated executions. Proceeding slowly and methodically in the
face of the skepticism and technical ignorance of the higher-ups
in the police
department and FBI, the local homicide investigator and a computer consultant who is
initially a suspect piece together evidence of an unprecedented new type of plot in
progress. It turns out that Matthew Sobol, the wealthy and inventive game designer
who had founded and controlled Cyberstorm, had recently died of brain cancer. For
reasons never fully disclosed, Sobol had programmed bots to scour Internet news
sources for the announcement of his own death, and then, in response to this
announcement, to set in motion a vast complex of orchestrated events, including the
destruction of the FBI agents who attempt to search his California mansion. The novel
renders this attempted search-turned-siege as a vivid action sequence. The house is
defended by a bot-controlled, weaponized Hummer programmed to home in on the heat
signatures of the FBI agents; inside, the house is booby-trapped with high-tech weapons like
subsonic broadcasts that leave the attacking SWAT team writhing in nausea. Later in
the novel a whole fleet of autonomous vehicles will be built according to online
specifications and will constitute a mechanized army ready for attack.
Much of the novel's action is made possible by Sobol's modification and deployment of the software he had developed for his hugely popular Massive Multiplayer Online Role Playing Games. Within these game worlds, one of Sobol's characters, Boerner, takes on a "life" of his own as a "recruiting avatar." After one of their combat encounters, Boerner leaves the young hacker Gragg an encrypted clue that unlocks a special interface to Sobol's hidden network, after which Gragg is led to take intelligence and skill tests and is then recruited by the now dead Sobol, who appears to Gragg in a video made before his death.
Gragg is only one of many among the criminals, the disaffiliated, and the out of work who are similarly induced to join Sobol's secret network. Membership gives them access to this special graphical interface, through which the Daemon's network is overlaid upon the physical world. As one character observes, "In essence Sobol is using the GPS system to convert the earth into one big game map. We're all in his game now." Within this overlay, "call-outs" identify human agents to one another and the resources that are locally available. Beginning with the actions of bots and progressing to multiple, hybrid forms of agency operating at several levels, Sobol's online games become an autonomous network whose agents begin to penetrate into and transform social, economic, and political reality.
In effect, Sobol's online game world functions as a transformational matrix for
bringing about a fully distributed and automated society, initially engineered by
webbots and other robotic agents that collectively constitute a remorseless machine —
the Daemon
of the title. In computer technology, a daemon
refers to a
small computer program or routine that runs invisibly in the background, usually
performing house-keeping tasks such as logging various activities and responding to
low-level internal events. Analogously, the reader of the novel doesn't directly
perceive the actions of the webbots, only humans carrying out their instructions —
for example, at a small firm where engineers are converting newly purchased SUVs into
autonomous vehicles according to specifications received online from an outsourcing
company. Over the course of the novel this Internet daemon extends its reach into an
increasing number of production and distribution networks, and thus into the economy
at large, slowly and systematically dismantling and rebuilding the world according to
a ruthless logic of efficiency and highly distributed, low-level intelligence. By the
novel's conclusion, the Daemon has infiltrated and taken over the databases of many
large corporate and financial institutions, and successfully frustrated the
government's efforts to defeat it.
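The computing sense of "daemon" can be made concrete with a minimal sketch, given here in Python. The file name, logging interval, and signal handling are illustrative assumptions, not details from the novel or from any particular operating system: the point is simply a loop that runs invisibly in the background, performs housekeeping, and responds to low-level events.

```python
import logging
import signal
import time

running = True

def handle_shutdown(signum, frame):
    """Respond to a low-level event: an operating-system signal asking the process to stop."""
    global running
    running = False

def main():
    # Housekeeping: quietly log activity to a file nobody normally reads.
    logging.basicConfig(filename="daemon.log", level=logging.INFO,
                        format="%(asctime)s %(message)s")
    signal.signal(signal.SIGTERM, handle_shutdown)
    signal.signal(signal.SIGINT, handle_shutdown)
    while running:
        # A real daemon might rotate logs, clean temporary files, or poll a queue here.
        logging.info("heartbeat: still running in the background")
        time.sleep(60)
    logging.info("shutting down")

if __name__ == "__main__":
    main()
```

Suarez's conceit scales this humble pattern up: instead of tending log files, the background process watches Internet news sources and, on a single triggering event — the announcement of Sobol's death — begins dispatching instructions to human and machinic agents.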
Whereas the idea for the Daemon is a fictional extrapolation, the ambition it extrapolates is not; it has been voiced quite openly by Google co-founder Larry Page. "Artificial Intelligence would be the ultimate version of Google," Page said in 2000; and in 2003: "The ultimate search engine is something as smart as people — or smarter."
To be sure, Suarez is not the first or only one to wonder if bots might constitute a
new form of machinic life.
On this view, the Internet becomes at once a reservoir of "low-life" intelligence and the medium and environment in which network agency could evolve to greater complexity. This double transformation, moreover, points to a specifically "machinic" aspect of the webbot assemblage. In effect, it evolves through the doubling back on itself or retroaction of a cybernetic loop: humans build and deploy bots in an extension of human agency, but — from a reversed perspective — the bots also reproduce and evolve by means of the human desire to build more and better bots. When a certain threshold is achieved — that of an autonomous technology — this language is no longer metaphorical, but simply indicates how an assemblage of human and nonhuman agencies has become self-sustaining and self-perpetuating.
At that threshold, in Jacques Ellul's famous formulation, "technique has become autonomous" and operates beyond the control of human agency. These different strands come together in George B. Dyson's Darwin Among the Machines: The Evolution of Global Intelligence, where he writes:
Everything that human beings are doing to make it easier to operate computer networks is at the same time, but for different reasons, making it easier for computer networks to operate human beings
In the novel, Sobol dies, but he "lives on" as a form of artificial or machinic intelligence through the operations that the webbots collectively perform. As a consequence, Sobol's relationship to his "daemon spirit" appears as complexly ambiguous. At once a posthumous and "posthuman" figure, the Daemon is not the cause but the result of the bots' collective and emergent actions. At the level of Sobol's programmed bots, there is no "human face" or purpose — only an operational logic extending along vectors of a vast communicational network, in effect defining a virtual plane of immanence actualized in a multitude of highly distributed parallel actions.
Thus, while the Daemon
is denominated as such by Sobol and assigned this role
as immanent and material cause by Sebeck and the other characters, this totalizing
effect should be understood as a metaphor, as the novel's symbolic staging of the way
humans make sense
of a complex transformation, in this instance of how Sobol's
bodily human intelligence has been extended into and replaced by the concerted
actions of thousands of little intelligences at work, as if they were swarms of
robotic homunculi. In other words, the Daemon provides a convenient fiction by which
a unified and transcendent agency can be attributed to the low-life
actions of
a highly distributed intelligence that is re-making all complex, hierarchical
organizations and structures in its own flat
image. This diffraction or
gearing down of human agency into lower machinic levels is represented as initially
violent and destructive, in keeping both with the violence of Sobol's first-person
shooter games and the literary genre of techno-thriller fiction. However, it is
ultimately not all bad news for humans, who are quite capable of living productively
in flat, web-like networks instead of large scale, corporate hierarchies. Indeed,
such flat networks may well be the necessary bedrock of a more sustainable human
future.
We can now consider from a wider perspective what Suarez has extrapolated from our
contemporary hi-tech world that requires this embedding of the webbot assemblage and
its particular form of agency in his fictional narrative. We shall see first that the Daemon's netwar is aimed largely at corporations that reside in the US or at the US government itself — indeed, sometimes these two entities are blurred. In 1886, the US Supreme Court declared that corporations were "persons" entitled under the Fourteenth Amendment to the same protections as living citizens. But as Peter Barnes points out:
... the modern corporation isn't a real person. Instead, it's an automaton designed to maximize profit for stockholders. It externalizes as many costs as it possibly can, not because it wants to, but because it has to. It never sleeps or slows down. And it never reaches a level of profitability at which it decides, "This is enough. Let's stop here."
Suarez's Daemon at once recalls and inverts this "demon" image, since profit is not its motive. The Daemon's purpose, rather, is only to perpetuate itself as a fully distributed agency with no central authority and thus as an inversion of the corporate structure. As the novel itself describes it, the Daemon
is a remorseless system for building a distributed civilization. A civilization that perpetually regenerates. One with no central authority.
In the 1990s, two researchers for the Rand Corporation, John Arquilla and David
Ronfeldt, defined netwar as "an emerging mode of conflict (and crime) at societal levels, short of traditional military warfare, in which the protagonists use network forms of organization and related doctrines, strategies, and technologies attuned to the information age."
"Hierarchies have a difficult time fighting networks," they observe, and "It takes networks to fight networks." Two features of netwar are especially important here. First, its protagonists — whether criminals, terrorists, or peaceful social activists — are defined less as specific adversaries than by their shared form of organization: they operate as completely dispersed nodes or multi-channel cells, in either case as part of de-centralized or highly distributed networks without central command and control structures; instead, they are "headless" or "hydra-headed" and allow for local initiative and autonomy. Second, a frequently deployed tactic for attack in netwar is the use of swarms, or a massively large number of agents that can simply overwhelm the enemy. Whereas the terrorist organization Al-Qaeda serves as an obvious example of the first feature, the massive swarms of bots deployed in actual Distributed Denial of Service attacks clearly exemplify the second.
Both of these features are fundamental to the netwar carried out by Suarez's Daemon. Arquilla and Ronfeldt's remarks on the "dark side" and "the ambivalent dynamics of netwar" are revealing in this respect. They see the type of conflict they call netwar in relation to a specific, historically repeating pattern: "a subtle, dialectical interplay between the bright and dark sides in the rise of a new form of organization." A new form of organization tends to find the "bad guys" on its cutting edge, who are often eager and very quick "to take advantage of new ways to maneuver, exploit and dominate." The "good guys," in contrast, "may be so deeply embedded in and constrained by a society's established forms of organization that many have difficulty becoming the early innovators and adopters of a new form." As netwar techniques "cascade across the spectrum of conflict and crime," "insiders" and "outsiders" are no longer so easily separated or even identified. And far from being a transitional phenomenon, this ambivalence will likely be a "permanent aspect of the new era."
The problem evident here, however, is that Arquilla and Ronfeldt assume that specific
types of human subject ("good guys" and "bad guys") already exist and are
simply called forth by the emergence of new forms of organization, and that the
"bad" subjects then appropriate these new forms for their own antagonistic
ends. As a consequence, the authors remain bound by a static and essentialist
conception of human agency. I suggest, to the contrary, that the advent of a new form
of organization and a new technology — and it is not evident that either of these
ever occurs separately — alters the very nature of human agency and thus our
understanding of the human subject. Specifically, as new technology both elicits and
creates new possibilities of agency, a corresponding zone of subjective
indetermination is also created. In the new age of digital connectivity — point and
click, cut and paste, rapid information searches and scanning — in which writing and
using code, adapting to completely mobile communications and collectively
participating in online gaming and social media
are all new forms of action,
the technology transforms what it connects. Specifically, the putatively human
subject is first and foremost (re)defined operationally as a dense node of complex
and adaptive functionalities in multiple networks, and thus a site of uncertain
affects, stoppages, and transductions. These operate as neither simple mechanical
transmissions of force nor as exchanges of meaning, but as both at once, as
entanglements and comminglings in which agency is not only multi-mediated and
multi-modal but viral and memetic. In a technological network society the human is
never fully separated from the nonhuman and the machinic — there are only degrees
of separation.
In effect, human agency diffracts into multiple, interacting sub-agencies — many of
which are nonhuman — only to be (but not always) re-assembled in entirely new
configurations and aggregates. In the primary example developed here, a particular
set of human and machinic agencies working together defines the webbot assemblage.
What I called at the outset the "human face" of this assemblage can now be
understood (like Sobol's posthumous presence in the Daemon) as only the most visible
surface of a much more widely distributed field of human and machinic agency.
In the past few years professional Internet security analysts have become increasingly alarmed by the sophistication and mounting costs of criminal activities on the Internet. A 2007 journal report on a workshop devoted to "the Malware Wars," held at the Santa Fe Institute, describes how representatives from the FBI, academic computer scientists, and security specialists from companies like Google and Symantec quickly arrived at full agreement on several fact-based issues: first, that the accelerated development of malware was driven by huge profits, and financed mostly by criminals residing outside the US, particularly in Russia, Eastern Europe, and China, where there are few if any laws or regulations; and second, that these criminals are highly organized and constantly innovating, sharing, or selling new malware to one another and often working together, especially on large Distributed Denial of Service (DDoS) attacks.
There was also broad agreement that "the good guys" are way behind, always playing defense or catch-up — not simply because of ingenious new software and tricks, but because of the very nature of the problem. As one participant put it: "One software bug or weakness equals millions of compromised hosts." Another was equally blunt: "The rate of evolution is so much higher. Malware has such a high evolvability, it may evolve to the point that the Internet is no longer useable." In sum, not only was there full agreement that evolvability and system robustness are the key issues, but few of the attendees had any problem accepting the assumption that "[software] programs behave enough like organisms that some lessons from nature might be applicable to the Internet and malware."
The upshot of this perspective is that we are not only witnessing but, to varying
degrees, participating in an escalating evolutionary arms race between attacking and
defending software systems. The hope, explicitly stated at the Santa Fe workshop, is
that the good guys
will prevail by keeping the operational costs of defending
systems within reasonable limits, and thus at least stabilize the situation until a
more robust and less vulnerable Internet can be evolved. But of course, arms races
are inherently unstable and thus unpredictable.
Two recent developments provide direct evidence. First, as reported in ComputerWorld Security, botnet software has acquired a new feature that amounts to a "genetic" component in the webbot assemblage: it clearly produces additional variations and thereby increases its evolutionary potential.
As already noted, these systems are constantly subjected to new evolutionary pressures. We see further evidence in another recent report, which cites a rise in "ideological" and politically motivated attacks — notably, recent DDoS attacks against Australian government websites by anti-Scientology groups incited by the government's plan to block access to pornography on the Internet. But it also reveals that new tactics are being deployed in botnets to avoid detection and thus circumvent defense measures. Heretofore, most successful DDoS attacks owed their success to sheer numbers, often involving fifty or sixty thousand zombie machines. Of course, the sheer size of these botnets makes their attacks highly visible. The report notes a new tendency to reduce this visibility: to attack in many irregularly-timed pulses ("throttling"), rather than in a few massive waves; to spread the attack across a much wider range of IP addresses; and, perhaps more significantly, to employ camouflage by encrypting the operational scripts in innocent-looking data. These new tactics have thus far proven extremely difficult to defend against. We can therefore expect a new tendency to assemble smaller, smarter, and less visible botnets, which will in turn demand new and perhaps different kinds of defense and counterstrike measures. The more recent revelations about Stuxnet, and in particular its precise targeting capacity and officially unattributable origins, only aggravate the situation, enabling the murky world of cyber warfare to transition rapidly into a potentially global and highly destructive battlefield.
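The advantage of such pulsed or "throttled" attacks can be illustrated with a small simulation, sketched below in Python. The request rates, detection window, and threshold are assumptions of my own, not figures from any report: the sketch simply compares a sustained attack wave with short, irregularly spaced pulses against a naive detector that flags any second whose trailing-average request rate exceeds a threshold.

```python
import random

def constant_wave(duration_s, rate_rps):
    """Requests per second for a single massive, sustained attack wave."""
    return [rate_rps] * duration_s

def irregular_pulses(duration_s, rate_rps, pulse_len_s=5, quiet_range_s=(20, 60)):
    """Requests per second for short bursts separated by random quiet intervals."""
    traffic = []
    while len(traffic) < duration_s:
        traffic.extend([rate_rps] * pulse_len_s)              # brief pulse
        traffic.extend([0] * random.randint(*quiet_range_s))  # irregular silence
    return traffic[:duration_s]

def seconds_flagged(traffic, window_s=60, threshold_rps=2000):
    """Naive detector: flag any second whose trailing-average rate exceeds the threshold."""
    flagged = 0
    for i in range(len(traffic)):
        window = traffic[max(0, i - window_s + 1): i + 1]
        if sum(window) / len(window) > threshold_rps:
            flagged += 1
    return flagged

if __name__ == "__main__":
    DURATION = 3600  # one hour of traffic, second by second
    wave = constant_wave(DURATION, rate_rps=5000)
    pulses = irregular_pulses(DURATION, rate_rps=5000)
    print("sustained wave:   seconds flagged =", seconds_flagged(wave))
    print("irregular pulses: seconds flagged =", seconds_flagged(pulses))
```

Under these assumed parameters the sustained wave is flagged for virtually the entire hour, while the pulsed traffic, averaged over the detection window, almost never crosses the threshold — which is precisely why smaller, smarter botnets will demand new kinds of defense.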
While the escalation of the malware wars
has been represented as a
technological arms race between the good guys and the bad guys,
with the
future of the Internet at stake, the actual discourse necessary for understanding the
complex dynamic of interactions at work has been that of biological ecosystems and
the survival and evolution of complex adaptive systems. This conceptual disconnect is
surely evidence that the concepts of netwar, cybercrime, and even cyber warfare
remain too dependent upon conventional and unquestioned notions of human agency, in
which the capacity for intentionality, self-awareness, and control remain uppermost.
But meanwhile, on another scene — should we call it a form of technological
"unconscious"? — new and different forms of agency are at work. Unfortunately,
conventional notions of human agency neither provide a reason for considering the
complex dynamics of the webbot assemblage or even the Internet itself as a
conglomerate assemblage, nor do they instigate any interest in recognizing the
rapidly developing forms of artificial life and intelligence we are busy surrounding
ourselves with and indeed building ourselves into. Unless we analyze the software
assemblages in which these processes are instantiated, we shall fail to perceive and
understand the diffraction of human agency into the mundane, barely intelligent bots
and botnets that operate on and are changing the very nature of the Internet.