'One must still have chaos in oneself to be able to give birth to a dancing star' – Nietzsche


Last year saw muted celebrations in commemoration of the 25th anniversary of the fall of the Berlin Wall and the rise of the Velvet Revolution, which seemed back then to be the first tiles to fall in the reverse domino effect that signaled the collapse of totalitarian communism and the triumph of democratic republicanism and the free market.
Václav Havel called for a new global humanism and for a while was the toast of Harvard elites and the think tanks of Washington, D.C.
Western bankers and corporations lined up, like gold-rush Conestogas, in gleeful anticipation of all the loot and booty to come, which soon windfell, even as Boris Yeltsin repeatedly did the “vodka locomotion” on stage after stage, in lieu of leadership cluefulness. It all ended badly; leave it at that.

But another much more muted, yet far more consequential, 25th anniversary celebration also happened last year – the premiere of scientist Tim Berners-Lee’s World Wide Web protocols, which allowed any two users on the Internet to share information stored on their computers through a network of links.
At first, this system greatly improved information sharing among Berners-Lee and his scientific colleagues, and then, of course, it did the same for the rest of the world not long after. A utopian idealist, Berners-Lee envisioned a brave new world of open-source knowledge, immediate and accessible to anyone, and he understood the paradigm-shifting consequences of such a shift in how information is stored and shared.
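The elegance of Berners-Lee's scheme lies in its addressing: every document gets a URL naming the protocol for fetching it, the machine that holds it, and its path on that machine, so any two computers that agree on the convention can link to each other's files. As a minimal sketch of that convention, Python's standard library can take apart the address of the first web page ever served (the URL below is the historical CERN address; the component names are those of Python's `urllib.parse`):

```python
from urllib.parse import urlparse

# The address of the first web page, served from CERN in 1990-91.
# A URL bundles three things: how to fetch (scheme), from which
# machine (netloc), and which document on that machine (path).
url = "http://info.cern.ch/hypertext/WWW/TheProject.html"
parts = urlparse(url)

print(parts.scheme)  # "http" - the transfer protocol
print(parts.netloc)  # "info.cern.ch" - the host machine
print(parts.path)    # "/hypertext/WWW/TheProject.html" - the document
```

Embed such addresses inside documents as links and you have the "network of links" described above: no central index, just documents pointing at documents across any machines on the Internet.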
But maybe this celebration was so low-key because it, too, has gone badly. So badly that an almost panic-stricken Berners-Lee has recently called for a ‘Magna Carta for the Web’, in order to bandage the potentially fatal slash and gash wounds privacy and democracy have received at the hands of wolfish capitalists and power aphrodisians.
There’s another momentous 25th anniversary that went virtually unreported last year. Just before the Berlin Wall fell, Erich Mielke, the long-time head of the Stasi, East Germany’s secret police, was developing “a comprehensive database of human intentions,” according to Andrew Keen in his just-released book, The Internet Is Not The Answer (which I’ll review at a later date).
In an ironic synchronicity that would have impressed C.G. Jung, while Berners-Lee was working toward a data linkage system that would benignly open up world knowledge-sharing, Mielke was working on an almost identical linkage system that would not free citizens but lock them into a disposition matrix of absolute (seeming) predictability. Mielke yearned to create “der gläserne Mensch” (the glass man), writes Keen: a totally transparent human, watched far more effectively than by any old panopticon.
Berners-Lee’s Web would allow humans to travel down link passages they hadn’t even imagined; Mielke would know exactly which passage you would choose and why. However, the fall of the Wall meant the dismantling of the Stasi and the end of Mielke’s relational database dreams, right?
Well, don’t ring dem 1812 Overture bells quite yet, Quasimodo. Because if you look around from the soaring heights of your übermensch vision, like Ol’ Five Eyes does, you’ll note Mielke’s system on full display.
Google’s all-seeing eye is the very model of the Mielke system; it’s right there in all those personalized algorithms. But what’s more, it was in play in 2003 when the US Department of Defense implemented its Total Information Awareness (TIA) program that collected everything on everyone before outrage forced Congress to shut it down by removing its funding.
The dystopian horror that the Edward Snowden revelations have made evident began in the Nixon years (recall Frank Church’s dire warning of 1975), had the kinks of its new technologies worked out through the TIA implementation, and has reached, in the NSA’s all-intrusive XKeyscore and the like, a stage of perfection that may have already made democracy, freedom, and privacy either obsolete or impractical.
Given the passive aggression of Google’s algorithmic data collections and the explicitly aggressive data collections of world governments, led by the US and Britain, it’s easy to understand Tim Berners-Lee’s dismay: This is not the World Wide Web he had in mind at all.
Sadly, along with political dissidents, hacktivists, artists and anybody else who deems freedom of thought vital to human progress, it is just such visionaries and true egalitarians as the Tim Berners-Lees of the world who now represent a serious threat to the totalitarian system of order nearly in place.
It is well known that the germinal idea for the Internet came about in the panicky aftermath of the Soviet launch of the Sputnik satellite in 1957, which exposed the vulnerability of the US telecommunications system to a nuclear first strike.
It was imperative that the order to retaliate get through the military computer network to make the Cold War notion of Mutually Assured Destruction credible. It’s important to maintain the Christopher Walken option, to out-roulette the Russians: call it the chamber music of the mad.
As most everyone knows, the Cold War years from the launch of Sputnik to the fall of the Wall were framed largely not as a choice between economic systems, but as a moral choice put to the world, between good and evil.
There are, of course, deeper philosophical and spiritual questions at work, such as the nature of progress and human civilization; the question of whether we are ‘all created equal’ with full benefits attending or whether that is just an absurd pipe dream for losers; the question of how to marry the high-mindedness of middle-class spoken virtues with the profoundly bleak reality of so many people living hopelessly in the world below the so-called poverty line; the question of whether we should rethink, as Andrew Keen does, and as Thomas P. Keenan does, the whole notion of human-computer merging before it’s too late.
The Internet began as a military idea, then got handed over to the public for a while before succumbing to the commercial interests – the Amazons, Googles and Facebooks, especially – that most people can see now, not just the Keens and Keenans. But the Internet grew up in a postmodern world of relative moral values, laced with the desire for multicultural tolerance and open expression. It’s clear now that such an approach is incompatible with US military and political ambitions, and the military has claimed back what it probably never stopped seeing as its own.
The ultra-potent Stuxnet virus, almost a homophone of Sputnik, would seem to mark the US launch of its plan to rule cyberspace. It is the initiation of a new Cold War, a new fomenting of the moral struggle between good and evil, and a jettisoning of the intolerable relativism that requires gloves to be worn at all times.
Working with commercial interests, which have made Internet participation all but mandatory, the US has declared the Internet a war zone and a principal battlefield in the War on Terror, and, by modus ponens, under the martial control of the US military and its commander-in-chief, the Executive in the White House. Havel’s new humanism has been co-opted by neo-liberal interventionists. Economics is the new MAD.
Christopher Walken has just spun the cylinder of the revolver held to the head of the World Wide Web, where we all live now, while the bankers, always after more bang for the buck, place jeering bets in the background.
Raising what seems like the obvious question: do we even have 25 years left?
