‘One must still have chaos in oneself to be able to give birth to a dancing star’ – Nietzsche

Monthly Archives: February 2015

While there can be little question of the transparency value of all those primary documents that Julian Assange splashed out to the public through Wikileaks a few years ago now, nor of the immense importance of the Snowden revelations in coming to grips with the staggering implications the Five Eyes’ global secrets-stalking holds for democracy and privacy, an aspect often under-appreciated is the gatekeeper crisis that all these startling eye-openers have brought with them.

Fact is, while the hidden gods are having their merry way with Words, it’s virtually impossible to know which oracle to trust any more.

And there are all kinds of oracles these days. There’s the mainstream media, which includes what’s left of the Press; there’s the Assange prototype of open media based on posting sensitive documents; there’s the Glenn Greenwald selective-leak method; and there’s Google’s Eric Schmidt calling for a kind of committee to oversee leak distribution. It’s worth briefly considering the pros and cons of each method.

It seems more evident all the time that the mainstream media has lost its way in the frontier wilderness of the digital age. There was a time when media were deeply respected as the so-called Fourth Estate, which took the public interest seriously and advocated on its behalf through aggressive reporting and adversarial coverage of government policies. Or, at least, that’s what we told ourselves.

But those days are largely gone; more and more people are looking to alternative sources for information about government operations, because they don’t sense that the MSM is independent of corporate ownership or telling everything it should in its championing of transparency.

Where the MSM has failed most miserably is in the reporting of foreign affairs, the War on Terror, and the rise of the national security state, which has engulfed every American — not just ‘terrorists’ — with its all-pervasive surveillance system, like a Stasi erotic dream. It hasn’t been much better chasing down the howling wolves of Wall Street. Unfortunately for the MSM, these issues are the most pressing of our time.

But there have been many low points. One could argue that the coverage of the 2000 presidential election debacle was downright craven, for instance. But perhaps the nadir was the New York Times’s pre-election suppression in 2004, at the Bush White House’s request, of evidence that the NSA was wiretapping Americans illegally. The disappointment here was palpable, the trust forever gone.

Enter Julian Assange, whose wicked leaks were a welcome shock to a system built on suppression and spying, and served to remind all intelligent adults of the significant gulf between realpolitik propaganda and real knowledge of political processes at work. I actually favor the publication of these primary documents that detail operations, as they are often disillusioning in the best possible way.

However, as much as one admires Assange’s chutzpah, his editorialized version of the Collateral Murder video, with its intentional manipulation of viewer response, points up the peril of having someone with an anarchic disposition in charge of a totally ‘open’ system.

Assange was benign with his political version of Collateral Murder, and I respect what he was hoping to achieve, but not every future hacktivist will be as scrupulous or have all his oars in the water. A totally open system seems ripe for disturbing levels of mischief.

A more problematical gatekeeper is Glenn Greenwald and his selective leaking. This wasn’t much of a problem before Edward Snowden came along and dramatically altered the public’s understanding of the scope and unbridled power of the Five Eyes at work.

But many people who admired Greenwald’s blogs were disillusioned when, following the sensation of his Snowden write-ups, he suddenly jumped ship from the mainstream Guardian, where he was reaching just the kinds of people needing consciousness-raising, to sideline himself for months in order to build a new journalism venture.

But more at issue was the treasure trove of Snowden leaks, which he brought with him to First Look Media, and his decision to leak selectively rather than in a Wikileaks-splash fashion.

In effect, he became the Master Gatekeeper for confidential intelligence whose revelations have so far proven to be of extraordinary public interest. For many people, this meant the public was right back where it was when it gave up on the MSM – not knowing what Greenwald was leaving out (although it’s hard to imagine how it could get much more revelatory than the Prism and XKeyscore programs) and having to trust implicitly in Greenwald’s good will and judgment.

But that was exactly the problem some people had with the arrangement: trusting Greenwald’s judgment. Not everyone adores him. When he sent his lover, David Miranda, through GCHQ-infested Heathrow with an encrypted thumb drive containing Snowden materials, after having written in the Guardian a couple of weeks before about the NSA’s ability to crack encryption, one had to wonder if he’d lost his marbles. Further astonishment came when it was discovered that Miranda had passwords in his pocket.

Another, more troubling example was the “special offer” that accompanied the sale of No Place to Hide, the book leveraging his Snowden encounters: it enticed readers to apply for a credit card vetted by JP Morgan Chase, a financial monster Greenwald has railed against in the past. More importantly, it handed over the personal data of his readers, a database any number of intelligence agencies would covet. And, jeez, what would happen if that data got lost or stolen?

Admirable writer and fierce advocate of the rule of law though he may be, it’s difficult to trust someone’s judgment about data-leak decisions when his own handling of personal data leaves something to be desired.

Another form of gatekeeping is that advocated by Google’s Eric Schmidt, who has called for a kind of oversight committee to review whistleblower leaks with a view to making sure they are ‘responsible’ leaks that don’t rock the ship of State too much. Of course, Schmidt has in mind elite corporate overseers like himself, with all his algorithms of power.

The problem with this idea, if it’s not immediately obvious, is that it would be akin to Henry Kissinger overseeing the 9/11 Commission, which nearly happened – except he was chased out of town by the torches-and-pitchforks roar of public outrage. Ditto for Schmidtty.

There is no question that when the Fourth Estate abdicated its implicit authority to act as advocate for the people and, through adversarial posturing, to force government to behave with the degree of transparency a healthy democracy requires, it not only helped foment the present crisis of trust but shot itself in the foot, accelerating the demise of the Press, whose loss of relevancy in the digital age was already underway.

Probably, as some observers have suggested, we are in for a prolonged period of news fragmentation and decentralization, with people choosing their favorite oracles to receive revelations from, leading to an eventual ‘failed state’ confusion that, naturally, will find the masses looking to the State propagandists for soothing themes, memes, and metaphors: Gatekeepers at the barbed wire fence. 

Hell, we may already be there. I can hear the vicious dogmas barking and fighting over some old bone of contention.

A highlight of this year’s Black History Month will be the 50th anniversary of the assassination of Malcolm X in New York’s Washington Heights, a neighborhood just north of Harlem. He was gunned down while delivering one of his firebrand speeches to a rapt crowd.

In the wake of Ferguson and all the other thousand bullet points of darkness and slow deterioration that have settled upon the black experience in America since his death in 1965, the Feb. 21 commemoration will no doubt bring, for many African-Americans, a renewed appreciation for the militant values Malcolm espoused.

For a few days, political rhetoric will abound, conspiracy theories will make colorful re-appearances, white hands will ring black bells, and, as with the Berlin Wall and the Velvet Revolution, the mainstream media will stream sentimental tears down well-fed cheeks for a news cycle or so; then it’ll be back to batoning and macing Dixie to keep her in line. However, I’d like to eschew all that jazz and instead talk about what Malcolm X and I had in common.

Now, you wouldn’t think that someone who grew up in a household where the adults referred to blacks by racial slurs would have much in common with the likes of Malcolm X and his army of angry, silent, well-dressed acolytes, but I did: three things, actually.

The first was good old Hate. If love is the twinkly twilight that makes us all feel warm and pretty, then hatred is its dark matter firmament, the source soup or secret sauce connecting the dots of all visible materiality. And, certainly, I shared that dispositional pattern — that hatred —with Malcolm X. We all do, in degrees.

But that’s not what I had in mind. Rather, our common hatred was in the form of the racial lines one must not cross. In the mid-Sixties I was living in Mattapan, a lower-middle-class and predominantly Jewish section of Boston that was (is) separated from lower-caste Dorchester by Blue Hill Ave. and Morton Street, and was just a stone’s throw, as it were, from Roxbury, up Blue Hill a bit, where Malcolm X was firing up local blacks with his consciousness-raising speeches about the lines of hatred whites had taught blacks to draw around themselves like cages. And then he’d push his listeners to erase those lines “by any means necessary.”

Morton Street and Blue Hill Avenue weren’t exactly akin to the passage between Israel and the Gaza Strip, but the hostility between the two cultures was sufficiently charged that crossing into the other’s territory was dangerous for whites, and for young blacks meant immediate surveillance, and likely the cops.

This friction between America’s two great ex-slave diasporas, who together have provided more cultural value and depth to American life than any others, dazed and confused me for a long time — until I better understood the corrosive nature of capitalism.

The Jews I lived among were quiet, cultured and devout. On trash collection days my neighbors would throw out amazing stuff (to an eight-year-old) — working bicycles, mounds of books, and antiques, like the wind-up Victrola with a stash of platters; right there on the sidewalk, I put on a record and heard a Beethoven symphony for the first time.

But blacks were scary, my Jewish friends said, people to fear and, ultimately, to hate. “If you get off at Dudley [a train stop in Roxbury], you will be jumped,” one of them might say. And so I hated blacks, for no real reason. That was the line drawn, and I drew it around myself.

The second thing I had in common with Malcolm X was a gone father, a mother prone to nervous breakdowns, and a carousel of foster homes. This may be our deepest sympathetic chord. Because such a combination of seeming abandonment and neglect, of constantly being handed off like a football, creates an existential crisis that is beyond issues of race and goes to the core of one’s being.

There’s the black diaspora, but it is surely blacker to be put aboard a ship that sees no shores. It’s not possible to be anchored in selfhood when your so-called being-in-the-world is flotsam on a sea of anxiety.

I can understand how Malcolm would later reject all authority, until he temporarily found a haven in Elijah Muhammad and his Nation of Islam teachings. Malcolm’s was a deeper journey than mine, perhaps, if for no other reason than his crisis of integument.

When he embraced Islam in prison and set his life “straight” as a result, giving up criminality to become the charismatic leader of a black supremacy movement under Elijah Muhammad, he may have discovered in his early abandonment the inspiration to think outside the box and the courage to believe in himself.

I found similar purpose and inspiration in my later phenomenological studies and Buddhism, which released me from the anxiety of impermanence. Most of the time.

The third thing I had in common with Malcolm X was name-changing. Malcolm Little found that he needed to change his identity altogether, to wrest control of his fate from the white system. Changing one’s name can provide a sandbox in which to experiment with the possibilities of self that were there before the dominant cultural structures imposed themselves on the will, without providing the self with a sense of belonging.

Little rejected the corruptive desires of capitalism, by embracing an Islam that promised purification, equality and a new start.

Similarly, I had long held in reserve a pen name that was in fact a kind of alter ego that allowed me to suspend the self I had been given and focus my attention on the pure act of writing. At the same time, I wanted to make a political statement that asserted a certain conclusion I had drawn about the world.

Thus, I began writing some but not all pieces under the name Jim Crowe. To me, the world was clearly demarcated by a series of dividing lines that spoke of exclusions, whether race, class, sexuality, or economic control.

During this new commemoration season, as the usual suspect pundits apply their analysis anew to old conspiracy theories about Malcolm’s assassination, I will dwell instead on his legacy of representing the black/white divide that still controls the sociological dialectic of global power.

The truth is, I didn’t really learn much about Malcolm X until well into adulthood. I tend to avoid fetishized icons, whatever their stripe. But I have to admit that Malcolm X’s speeches still move today with their prophetic understanding of the underlying dynamics of power and exclusion.

His speeches positing the consequences of America’s imperial actions “coming home to roost” are, even today, perhaps more so than ever, a fiery, soul-confronting reality check that, when properly understood, goes a long way toward revealing the machinations of our Kissinger-esque New World Order; it’s the same old world order – just the truncheons are shinier and the prisons are fuller than ever with angry black faces strutting around the yard all cock-a-doodly-doo, like the shape of things to come.

‘Some people are finding it harder and harder to distinguish between real people and targets from some Call of Duty scenario’

It’s so easy to misconstrue what we see. So easy, in fact, that eyewitnesses are not regarded as terribly reliable conduits of reality, and their testimony in a court of law is routinely regarded as suspect.

It’s not much better for what we hear (or think we hear). Many years ago in Abu Dhabi, in my IB English class, I conducted a Chinese Whispers session as prep for a novel we were reading (Camus’ The Stranger). By the time my very brief message got around from ear to ear, from student 1 through 19, it had changed dramatically.

We often bring what we expect or want to perceive into the perception and communicate accordingly.

Misconstruing is not limited to the phenomena of everyday life; it is a problem for science as well. Heisenberg’s Uncertainty Principle, for instance, implies that formulating an absolute truth about a phenomenon is problematical: certain paired properties of an object can never be pinned down simultaneously, and the act of scrutiny itself disturbs the object scrutinized.
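
For the record, the canonical form of the principle (a standard textbook statement, nothing particular to this essay) bounds how precisely a particle’s position and momentum can be known at the same time:

\[
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}
\]

Sharpen your fix on where something is, and your knowledge of where it is headed blurs in proportion; no amount of careful scrutiny gets you under that bound.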

Misconstruing came to mind the other night when I was watching a re-run of a Seinfeld episode – “The Diplomat’s Club” – the one in which Elaine is peevishly looking after a grumpy old rich guy who is now on his deathbed. Elaine lights up when she learns that her ministrations have not been for nought, as he has named her in his will in appreciation for her service.

Elaine visits him in his hospital room and, wishing to make him more comfortable, carries a fluffed pillow toward him, not realizing that she is being spied upon, and that the spy sees not an attempt at comforting but an attempt at murder. The spy “interrupts” the deed heroically and proceeds to undermine Elaine, who as a result loses her place in the will and, of course, the bounce in her step.

But misconstruing does not always lead to a punch line. There are more recent examples in “real life” where misconstruing has led to misidentification and profound repercussions. For instance, last September, just after a British-accented ISIS jihadist beheaded a journalist, Australia used emergency powers for the first time to conduct a massive raid on a Muslim man in Sydney, reportedly affiliated with ISIS, who, an informant said, owned a scary sword.

After hysterical TV coverage of his arrest, his sword turned out to be ceremonial and plastic (a fact not subsequently widely reported), and the Islamic sect he belonged to was actually hostile to ISIS. Luckily, the man was not killed in the raid, but the end result was the almost-immediate passage of sweeping new anti-terror legislation in parliament that contains severe implicit restrictions on civil liberties and the threat of torture.

It is not much of a step from misconstruing to misidentifying, and, unlike with the Sydney swordsman, it does not always end well for the targeted one. After the Boston Marathon bombing, the FBI released a photo of the suspects online. One viewer thought the photo looked like his “friend,” Sunil Tripathi, a Brown University student, and he began tweeting this information. Soon Reddit picked it up, and almost immediately a swarming vigilante mob began pursuing Sunil. The trouble is, Sunil was soon thereafter discovered in a river, drowned (circumstances still unclear).

Recently, Melissa Howard had a piece in Overland magazine, “Spot the Terrorist,” that details not only facets of Sunil’s case but the rise of online vigilantism in general. No doubt, there’s a lot of instant gratification in mobbing a target. As with the delightful distancing of remote-control drone kills and the lovely story arcs of long-distance American sniper work, donning a virtual white hood on the Internet must seem like Heaven’s playground to all the reactionary-aggressives out there.

There isn’t any question that some people – perhaps many people – are finding it harder and harder to distinguish between real people and targets from some Call of Duty scenario. But when you swarm, not only do you belong to “a greater cause” for a while, but the guilt and shame load is widely distributed, and consequently diluted.

A few years back there was the Joseph Kony saga, in which children were crowdsourced on Facebook to “Get Kony.” It is an example of how dangerous cyber-manipulation can be (the sensation died, and Kony is still at large). It doesn’t help that Facebook was recently discovered to be experimenting, in cooperation with the US military, on the emotional manipulation of users.

This corporate-military alliance is not an anomaly: we ourselves are being militarized, drafted into service, and those who resist will be tomorrow’s burning-cross victims; with more than 1.5 million people on the US secret Watch List (i.e., suspects, dissidents), there’s plenty of torching ahead.

We are all potential Sunils, Konys and bin Ladens, in the abstract. This is not mere paranoia. One of the more frightening portions of the recent film Citizenfour came early on, when privacy activist Jacob Appelbaum told a group of activists that data trails follow us everywhere, and that if you are even perceived as a suspect or government target, that status will follow you around for the rest of your life. East Germany’s Stasi reportedly had some 170,000 informants willing to snitch, undermine and settle scores to earn brownie points from The Man, but with the prevalence of the Internet, available informants no doubt number in the millions.

But what’s more, digital vigilantism and stalking feed the fires of shrill lunatics and psychopaths who were already a problem before America took her gloves off and announced to the world, “No more Mrs. Nice Girl.” There used to be an American vigilante outfit called Perverted-Justice, which fed the TV program To Catch a Predator and whose members would “expose” various miscreants to the public (they seemed to specialize in pedophile cases), doing all they could to destroy their targets, including anonymously messaging a target’s friends, family, and co-workers to germinate seeds of suspicion and hate. The program was finally cancelled, after being exposed for corruption, misrepresentations and mistakes, but not before targets had been fatally tarnished.

It gets worse still. Given the technology now available (and the terrifying stuff on its way), what with infra-red cameras, thermal imaging, rootkits, wireless hijacking, and surreptitious PC enslavement, anyone could wake up one day and discover themselves the target of abuse and degradation as part of someone else’s idea of fun or a good hunt. And depending on the stuff you are made of, you may go Luddite, or get meek, or become a model prisoner. After all, it’s a case of “national security,” right?

In this near-future mirrored world you may even try to break free. There’ll be leeway, no doubt. You will be allowed to confront yourself in the mirror. Your masters will e-jubilate when you give yourself the rude finger in the mirror. They will delight with you when you play the mirror as an absurdist prop. And they may even look the other way when frustration builds and you try to shoot your way out of the glass house. To them, it’s all a game. But may the gods help you if you turn, in earnest, to the one-way mirror and tell them what you really think, your expression all contempt and derision.

Because it’s all about controlling the narrative, and if you try to escape the part they’ve set for you, to become a True Man, if you try to alter the dialogue or the story arc – well, next thing you know, it won’t be like the relatively benign Truman Show but more like Camus’ The Stranger, wherein peripheral personages testify against your dispositional deficiencies, and you’ll wake up one day treated like Peter Lorre in M until, pursued and trapped, they come to “take” you in the end, like Stan in Pinter’s The Birthday Party, to blow out your bloody candles.

Aye, now it’s dark; there’s no misconstruing that.

‘Maybe this celebration was so low-key because it, too, has gone badly’

Last year saw muted celebrations commemorating the 25th anniversary of the demise of the Berlin Wall and the rise of the Velvet Revolution, which seemed back then the first tiles to fall in the reverse domino effect that signaled the collapse of totalitarian communism and the triumph of democratic republicanism and the free market.

Václav Havel called for a new global humanism and for a while was the toast of Harvard elites and the think tanks of Washington, D.C.

Western bankers and corporations lined up, like gold rush Conestogas, in gleeful anticipation of all the loot and booty to come, which soon windfell, even as Boris Yeltsin repeatedly did the “vodka locomotion” on stage after stage, in lieu of leadership cluefulness. It all ended badly; leave it at that.

But another much more muted, yet far more consequential, 25th anniversary celebration also happened last year – the premiere of scientist Tim Berners-Lee’s World Wide Web protocols, which allowed any two users on the Internet to share information stored on their computers through a network of links. 

This system greatly improved information sharing among Berners-Lee and his scientific colleagues at first, and then, of course, for the rest of the world not long after that. A utopian idealist, Berners-Lee envisioned a brave new world of open-source knowledge, immediate and accessible to anyone, and he understood the paradigm-shifting consequences of such a shift in information storage.
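
The core mechanism is simple enough to sketch in a few lines: any document can name any other document, and following those names is essentially all a browser does. Here is a minimal illustration in Python (a sketch only; the URL is a placeholder, and none of this is Berners-Lee’s actual CERN code), fetching a single page and listing the links it offers onward:

```python
# A toy demonstration of the Web's core idea: documents that point to
# other documents. Fetch one page over HTTP and list its outbound links.
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collects the href target of every anchor tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

if __name__ == "__main__":
    url = "https://example.com/"  # placeholder starting point
    page = urlopen(url).read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(page)
    for link in collector.links:
        print(link)  # each one names another document, on any machine
```

Every line it prints is a doorway to a document that may live on another machine entirely, which is the whole trick: no central index, no gatekeeper, just documents pointing at documents.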

But maybe this celebration was so low-key because it, too, has gone badly. So badly that an almost panic-stricken Berners-Lee has recently called for a ‘Magna Carta for the Web’, in order to bandage the potentially fatal slash-and-gash wounds privacy and democracy have received at the hands of wolfish capitalists and power aphrodisians.

There’s another momentous 25th anniversary that went virtually unreported last year. Just before the Berlin Wall fell, Erich Mielke, the long-time head of East Germany’s secret police, the Stasi, was developing “a comprehensive database of human intentions,” according to Andrew Keen in his just-released book, The Internet Is Not the Answer (which I’ll review at a later date).

In an ironic synchronicity that would have impressed C.G. Jung, while Berners-Lee was working toward a data-linkage system that would benignly open up world knowledge-sharing, Mielke was working on an almost identical linkage system that would not free citizens but lock them into a disposition matrix of (seeming) absolute predictability. Mielke yearned to create “der gläserne Mensch” (the glass man), writes Keen: a totally transparent human, watched far more effectively than any old panopticon could manage.

Berners-Lee’s Web would allow humans to travel down link passages they hadn’t even imagined; Mielke would know exactly which passage you would choose and why. However, the fall of the Wall meant the dismantling of the Stasi and the end of Mielke’s relational database dreams, right?

Well, don’t ring dem 1812 Overture bells quite yet, Quasimodo. Because if you look around from the soaring heights of your übermensch vision, like Ol’ Five Eyes does, you’ll note Mielke’s system on full display.

Google’s all-seeing eye is the very model of the Mielke system; it’s right there in all those personalized algorithms. What’s more, it was already in play in 2003, when the US Department of Defense implemented its Total Information Awareness (TIA) program, which collected everything on everyone before outrage forced Congress to shut it down by removing its funding.

The dystopian horror that the Edward Snowden revelations have made evident began in the Nixon years (recall Frank Church’s dire warning of 1975), with the kinks of new technologies worked out through the TIA implementation, reaching, in the NSA’s all-intrusive XKeyscore and the like, a stage of perfection that may have already made democracy, freedom, and privacy either obsolete or impractical.

Given the passive aggression of Google’s algorithmic data collections and the explicitly aggressive data collections of world governments, led by the US and Britain, it’s easy to understand Tim Berners-Lee’s dismay: This is not the World Wide Web he had in mind at all.

Sadly, along with political dissidents, hacktivists, artists and anybody else who deems freedom of thought vital to human progress, it is just such visionaries and true egalitarians as the Tim Berners-Lees of the world who now represent a serious threat to the totalitarian system of order nearly in place. 

It is a well-known fact that the germinal idea for the Internet came about in the panicky aftermath of the Soviet launch of the Sputnik satellite in 1957, which exposed the vulnerability of the US telecommunications system to a nuclear first strike. 

It was imperative that the order to retaliate get through the military computer network, to make the Cold War notion of Mutually Assured Destruction credible. It’s important to maintain the Christopher Walken option, to out-roulette the Russians: call it the chamber music of the mad.

As most everyone knows, the Cold War years from the launch of Sputnik to the fall of the Wall were largely framed not as a choice between economic systems but as a moral choice presented to the world: between good and evil.

There are, of course, deeper philosophical and spiritual questions at work, such as the nature of progress and human civilization; the question of whether we are ‘all created equal’ with full benefits attending, or whether that is just an absurd pipe dream for losers; the question of how to marry the high-mindedness of middle-class spoken virtues with the profoundly bleak reality of so many people living hopelessly below the so-called poverty line; and the question of whether we should re-think, as Andrew Keen does, and as Thomas P. Keenan does, the whole notion of human-computer merging before it’s too late.

The Internet began as a military idea, then got handed over to the public for a while before succumbing to the commercial interests – the Amazons, Googles and Facebooks, especially – that most people can see plainly now, not just the Keens and Keenans. But the Internet grew up in a postmodern world of relative moral values, laced with the desire for multicultural tolerance and open expression. It’s clear now that such an approach is incompatible with US military and political ambitions, and the military has claimed back what it probably never stopped seeing as its own.

The ultra-potent Stuxnet virus, almost a homophone of Sputnik, would seem to be the US launch of its plan to rule cyberspace. It is the initiation of a new Cold War, a new fomenting of the moral struggle between good and evil, and a jettisoning of the intolerable relativism that requires gloves to be worn at all times.

Working with commercial interests, which have made Internet participation all but mandatory, the US has declared the Internet a war zone and a principal battlefield in the War on Terror, and thus, by modus ponens, under the martial control of the US military and its commander-in-chief, the Executive in the White House. Havel’s new humanism has been co-opted by neo-liberal interventionists. Economics is the new MAD.

Christopher Walken has just spun the cylinder of the revolver held to the head of the World Wide Web, where we all live now, while the bankers, always after more bang for the buck, place jeering bets in the background.

All of which raises what seems like the obvious Question: Do we even have 25 years left?