'One must still have chaos in oneself to be able to give birth to a dancing star' – Nietzsche

Monthly Archives: April 2015

The signs of democracy’s demise have been there for a long, long time.

In the end, Socrates preferred his hemlock to their democracy and flipped the hoi polloi the bird (check out Crito’s sit-com cringe at the gesture), as he slipped into the Underworld. Who could blame him?

He’d been wrongly accused of impiety for not supporting the state-sponsored terrorists (for what are the gods, if not terrorists?), had his lamb basted and spit upon for pursuing “physicalist” hunches (i.e., early scientific methodology), and been abused for corrupting youths with his outrageous sophistic questions, such as, “Yes, but why do you believe that? Show me the evidence.” 

Oh, he was a gadfly, that one, a real ripe arse pimple, anarchist and symbol crasher. And don’t think for a second that twerk in the face about owing Asclepius a cock escaped raised eyebrows among the bare ruined choirboys either.

Whether shouted from the left chamber of the political fulcrum or the right, all you hear is the endless huzzah of appeals to world democracy, as if the collapse of the Bill of Rights in the world’s most exceptional democracy were not a revelation but an illusion promoted by conspiracy theorists.

In the endless millennial battle between Left and Right, the latter always had the upper hand because, garbed in the sadistic leathers of the ruling fascistas, they have almost always controlled both the means of destruction (military) and the means of production (industrial).

Being essentially non-violent in its approach, and at least tacitly egalitarian in the distribution of desires, all the Left ever had going for it was the language of idealism, with language itself being the sluiciest ideal of all. This is not to say the Right has no ideals – after all, what is the Project for a New American Century, if not an ideal unleashed like Boris Karloff in the night? But the Left moved us for a while with Utopian idealism.

But the Left went totally wrong somewhere, down another road not taken and, man, that has made all the difference. Francis Fukuyama once sighed that we’d reached the end of history and it wasn’t gonna get any better than neoliberal policies enforced by neocon thugs. But this end of history narrative wasn’t anything like the one promised in Hegel’s Phenomenology of Spirit, where the ‘anti’ disposition of the slave results in the master being kicked right in the thesis (“Who’s holdin’ the whip now, mofo?”) and they finally realize their sin-thetic co-dependence and come together at last in a spiritual bigbangery.

No, in our scenario, we get the Marquis de Sade as president; Abu Ghraib the Winter Palace.  In the neo-con-lib end of history, the dynamism of Lefty language itself ends, and the triadic dialectic is just another over-priced psychotropic with nasty side effects.

Yes, the signs of the end of the Left have been there for a while.

It’s there in the media and in the masses raising a ruckus about the “Orwellian” nature of the all-pervasive surveillance state we live in. But it’s not new, sudden or revelatory any longer. We’re post-Orwellian. Sen. Frank Church warned us about the surveillance state and secret government four decades ago, and we did next to nothing to stop it, as if the revelation itself, the “speaking truth to power,” were a talisman that would protect us.

Even now, most Lefties seem to think the problems will go away, if we just keep critically reading about it. Passing the “Freedom Bill,” they argue, is the first step in taking back the Republic, although the bill itself is akin to the inmates demanding more visits and phone calls – and not even threatening the warden with action if he doesn’t concede. No one is more deluded than the bourgeoisie.

The signs of the end were there in the simple aggregate build-up of contradictions, warped reasoning, and sacro-sanctimonious ownership of all things ethical; in the inability to modify the totalitarian pressure of political correctness. Fully supporting a social cause and working to change its inequities should not mean you must also embrace the repellent.

Some “feminists” are harpies who consciously leverage the “ism” to dominate others. Many gays are conservative, comfortably bourgeois and indifferent to progressivism.

A large proportion of the few African-Americans permitted to move on up to the East Side often forget their roots as they gorge on a piece of the porky pie.

Just because Israel has a right to exist doesn’t mean you have to wear a t-shirt for radical Zionists and their expansionism. Forcing artificial tolerance, even among natural Lefties, has resulted in resentment and fissure.

The signs are there in the gatekeepers we’re forced to trust for full disclosure of critical public interest information.  Thanks to the abrogation of adversarial responsibility by the MSM, the Left (and the rest) is forced to rely on and trust anarchists like Julian Assange and opportunists like Glenn Greenwald, or else go it alone in the vetting process.

The signs are there when you hear otherwise intelligent people yawn at the latest trenchant pronouncements of the Left’s pre-eminent intellectual gatekeeper, Noam Chomsky, who has consistently unpacked the structures of power and language for more than half a century. It’s as if his brilliant resilience were held against him.

Likewise the case for Ralph Nader, who has been the champion of consumer affairs and governmental integrity for decades, and yet, incredibly, was scourged by the Left for his role in “undermining” Al Gore in the 2000 Florida election.

This persecution of Nader allowed these comfortably numb Lefties to avoid the inconvenient truth of demonstrable racism and massive fraud that led to the results in that election.  Such signs of systemic collapse are unwelcome threats to their delusions.

The end is near when all the wonderful work of post-modernism to deflate the unexamined norms, customs and canons of centuries-old structuralist values became reified and turned into a movement, instead of remaining an effective analytical tool. The result: academia has its own sub-cultures and speaks its own proprietary language.

It can be reasonably argued that in the humanities departments in universities around the world whole sections have been taken over by a post-modernism that pushes a privileged politics at the price of being accessible to the masses they would epiphanize.

Postmodernism erupted out of the Lefty fervor of the ’60s Paris salon culture, the relativist insights tearing into pieces the assumptions of master narratives. This had profound political consequences in Europe, which has always had an experimental core, but not so much in other places, beyond the theoretical, especially the capitalist-dominated US. Abbie Hoffman, with his levity and spontaneity and street theater antics, came the closest to embodying the 1968 spirit of Paris and Prague.

In the end, what had been radically Left became essentially Right, fetishist and fascistic. The Left became caught up as never before in the ownership of words, its one bona fide commodity, and a bizarre emphasis on plagiarism ensued. 

It is the property seizure of language by Lefty elites and righteous academics that is the surest sign yet that the Right is all that’s left.

Keen Internet Not

Failed States of Conscience
Last year saw the publication of the first sizable waves of what promises to be a coming tsunami of ‘data driven’ apocalyptic narratives detailing the accomplished ravages of latter-day technology and shouting out the horrors to come, perhaps no matter how humanity responds now. Some stand-out examples: Thomas P. Keenan’s Technocreep describes the all-pervasive penetration of the technological into every facet of our lives and posits that we may have already reached the point of no return in the symbiotic dialectic between man and machine; Elizabeth Kolbert’s The Sixth Extinction paints a similarly abysmal future, arguing that we may have so fouled our own nest that humans are now irredeemably on the path to extinction. Andrew Keen’s The Internet Is Not the Answer, while certainly dire in its analysis and outlook, does at least offer up a token hope and point to solutions, even if they are unconvincing.
Keen’s thesis can be neatly summed up by laying out the Question and then expanding on its implications. He asks: “What can help us create a better world in the digital age?” Originally, he says, the answer was the Internet. Drawing on New York Times columnists and assorted economists, Keen describes an almost-idyllic late 20th-century American middle class, citing what New York Times columnist George Packer calls a period noted for “state universities, progressive taxation, interstate highways, collective bargaining, health insurance for the elderly, credible news organization,” as well as publicly funded research; in short, a system that the Internet might have helped tweak and fine-tune. But now Keen sees that all as an exploded dream, and cites innumerable examples of how the Internet has been usurped by the usual greedy, unregulated controllers of our collective destiny.

Early last year I shot English Breakfast tea skyward through my nose when I read in a Sydney paper that the world was in the throes of “a collage poetry plagiarism epidemic.”  Heck, being global and all, I guess you’d have to call it a pandemic.

Now, see, I knew there were all kinds of problems in the world, what with abstract nouns rooting like rabbits and needing beer-fuelled culling by joystick-riding heehaws, but I never saw poetry as a virus carrier before, and chances are, that had I thought so, I’d have welcomed it as a nifty development.

Of course, plagiarism’s a serious topic, and I shouldn’t scoff, like some punk from the Old Geysers Gliberation Front. When I was a kid I plagiarized a Moody Blues number that I owned so deeply in my heart that even today I’m not sure it wasn’t John Lodge who nicked it from me.

After all, hadn’t Dylan purloined “Blowin’ in the Wind” from my back pocket?  Well, most kids plagiarize; they are often trained that way. Ideological plagiarism is universal, and compulsory. Now more than ever.

Then, suddenly, I became a man, and was not accused of plagiarizing again, until a couple of years ago, when a throw-away blog comment was savaged by Glenn Greenwald’s white-blood-cell cult. Actually, I never saw more over-referencing in my life than amongst Glenn’s commentariat, so important was the property of words to them, which I found curious given his predilection for liberalism.

But then there’s a lot of funny stuff that goes on between ironic smirks in that lot. In keeping with our theme, Greenwald’s early book on two-tiered justice was treated as though it were a fresh and original idea.  Funny stuff, indeed.

Anyway, most of us know that the serious kind of plagiarism means intentionally presenting someone else’s work as one’s own. A more problematic secondary definition would be presenting someone else’s ideas as your own. I emphasize intentionality because it can be tricky: for instance, it seems clear, when you give it a good listen, that George Harrison’s “My Sweet Lord” did, in fact, plagiarize the main melody of the Chiffons’ hit “He’s So Fine.”

And yet, they are both gorgeous, unique songs that don’t depend simply on the (much-quoted) melodic line. It’s even dicier when it comes to the origin of an idea and what happens to it when it is responded to dialectically by other minds. This is what postmodernism refers to as Intertextuality.

We mustn’t confuse plagiarism with copyright infringement or lack of originality, because we’d all be in the jammer slammer if that were verboten. Nor is it okay to accuse by lazily saying, “It looks like something I’ve read somewhere.” And let’s give up the notion that much originality takes place in the MSM.

This is not a knock; it’s just an acknowledgement that media bring together ideas rather than generating new ones. Likewise, calling anyone on “borrowing” in comments sections is pointless, as even if not anonymous already, no claim is being made to exclusive rights.

I know that personally I used comments sections as idea scratch pads and have rarely looked back at what I’ve written. Blogs, too, are by nature reactive, which is to say it is usually someone else’s ideas being responded to or linked to.

It’s important to consider the politics of plagiarism, because they can be significant. A plagiarist can see his or her career spiral downward forever after a rightful conviction, depending on the severity. 

But there have been notable exceptions, it seems, to the pariah status that usually accompanies being outed: historian and PBS regular Doris Kearns Goodwin lifted whole passages from another’s work to strengthen her bestseller; but instead of complete infamy, she has gone on to fame as the author of Team of Rivals, on which the Oscar-nominated film Lincoln was based.

Dr. Martin Luther King, Jr. plagiarized his PhD thesis (calling into question the “doctor”) and apparently lifted the dream, but his crucial role in the Civil Rights movement of the ’60s surely (and rightfully, IMHO) protected his aura; Bob Dylan has sometimes been shameless, but he remains a perennial candidate for a Nobel in lit (and deservedly so); even Australia’s own artist and historian, the late Robert Hughes (“a polymath in an age of imbeciles,” as one art critic rightfully remembered him), had to suffer browbeatings for borrowing toward the end of his impressive career.

The Digital Age has not only complicated things exponentially; it is an important aspect to consider here. Before the Internet, the distribution of new ideas was limited by comparison to now. Being bound to printed material limited the speed with which new information could be acquired, while also naturally limiting the breadth of new learning over time.

The world was full of select experts we relied on for knowledge attainment. But, just as the Incas and Egyptians built pyramids in ignorance of each other, so, too, we can assume that more than one person has thought the same idea simultaneously, or thereabouts, without necessarily cribbing from another.

In short, a person might exclaim, “I’ve heard that somewhere before,” without a necessary link to another source.  The Internet has brought lots of overlap and Intertextuality that didn’t exist the same way before.

Nevertheless, the Internet no doubt adds to the problem. Academia has taken it seriously enough that it now forces most students to submit work through a plagiarism detection system such as Turnitin (TII). But some academics and many students wonder whether this method is as expedient or as valuable as the previous system of instructor-intensive review.

But maybe the actual purpose of this detection system is more devious. For instance, the controlling interest in TII is held by the venture capital firm Warburg Pincus, which is determined to make big money. This leads to the questions: Who owns the data (students or TII)? Who has access? What is to prevent the corporate purloining of brilliant student work in its early development – TII as plagiarist? That venture capital firm is making money somehow.

Getting back to the collage poetry stoush we started with, while a TII might have stopped the dead poet suicidey in his tracks, it occurred to me as I limned the account that a well-read contest judge might have done a similarly effective job, but the Internet seems to have made that portion of our brains lazy now. 

When I was a young student, academics were eager to determine if a poor boy like Shakespeare could “really” write such high-flown verse; now all anyone seems to wonder is if he was gay.

Actually, I was also quite peeved about the collage stoush because I had just completed a sequence of Ted Berrigan-like sonnets that my lit professor called “a tour de force,” and now not only was that phony Aussie poet banished, but collage poetry was being trashed – just like postmodernism in general. 

Not only are we mooning over the sweet years of the Cold War, but some folks welcome the return of a literary canon.

I haven’t even mentioned class war yet. High language, with its myriad high-hailed ideas, has always been a middle-class proprietary conceit. People have long been excluded from certain strata of society based on their language characteristics. And the influence of one’s economic background on the perception of one’s linguistic ability, and certainly originality, is commonplace.

Blacks, Latinos, and poor whites still struggle to pass through the pearly pillars of middle class-managed knowledge libraries; it is often assumed that they are borrowing or stealing from their betters.

In the end plagiarism seems to be one more thing that’s gone wrong with the Internet’s promise to liberate us all from the constraints of authoritarian knowledge acquisition. Words are becoming one more commodity to be controlled by commercial interests that trump the merely human.

  • Full disclosure: I copied and pasted from an earlier draft of this article. Gulp.