CURRICULUM VITAE

When it comes to describing others we seem to have a firm grasp of the adjectives and phrases to accurately profile almost anyone. When it comes to describing ourselves it’s suddenly more complicated, abstruse, refined, opaque, multi-layered, and delicate. It requires patience and thought. This is probably because, unlike everyone else, we are unique, aren’t we? There has never been anyone even remotely like moi.

But when we stop to really think about it, it turns out that how we describe ourselves is the product of ferreting (cherry picking) and selective attention – from what we like (and dislike) in others. Growing up is all about (dis-)identifying with and mirroring off others – called socialization. It’s what the ego does to expand itself: “I am this, but I am not that.” There are no truly original descriptions of oneself because, contrary to popular opinion, there are no truly original personalities. There are unique psyches/souls, DNA, etc., but the persona is something constructed according to the rules of conformity and belonging. Before we can individualize we must first normalize. “Fitting in” has its essential merits, no doubt: It keeps things ordered, compliant, predictable, and safe. But we fit in so deeply and thoroughly that most of us also suppress our uniqueness – except of course in ways established as mostly harmless, amusing, or temporary.

All that aside, I think about how I would profile myself if it came down to seriously doing it. I have to ask: What applies truly and accurately to “this person,” and from where do his (weird) characteristics originate? Are they about me alone, or are they things I’ve learned from those I admire, tried to be like – versus – those I’ve tried to be most unlike? Without others with whom to identify and dis-identify, would “I” still be here at all?

It’s the old question of what happens to a person stranded on a desert island, away from humanity for many years. He goes mad. We’re social beings and need one another to constantly reaffirm who we are and are not. This is what human interaction is about and what it does, mostly without us even knowing. Without those mirrors the “self” fades to nothing, and what follows is a failure to thrive. We literally don’t know who we are – so we say, “what’s the point?” Such individuals start talking to the trees and animals in their effort to rekindle any crumbs of self-hood they can. The result is quite primal.

Again, that said, I come back to myself. At the risk of sounding resigned, it mostly comes down to being a grab-bag of words and phrases gathered along the road in time. For instance, on impulse I would say I’m restless and scattered. I’m also intrigued by boundaries, by what isn’t said, by roads not only “not taken” but frequently avoided, by audible silences and empty margins lacing together unexplored narratives. It has always been about walking a thin line between belonging and a very “blue” (subversive) imagination stuck in my DNA.

It’s easy to say all this. Everyone I’ve ever known makes either the same, or very similar, claims about themselves. It’s how they actually see themselves even if their lives and mine are as antithetical as can be imagined. Everyone’s a writer, an artist, a performer, a pilgrim/seeker, and a mystic. I’ll leave it at that. My only response: sois loyal envers toi-meme. Be true to yourself.

I would also honestly say that I’m a hopeless introvert and part-time recluse who over-indulges the privileges of privacy, obscurity, and anonymity. I’ve found that the most efficient means of achieving anonymity is around/within crowds (hence, the city) as opposed to the rural community (learned the hard way). In the city my home is a kind of latter-day “Balzac’s garret” – a fortress of private fantasies. I square my accounts with the world there, amidst a disarrayed feng shui of objects, strategically positioned to support my favorite illusions, rantings, and dreams. I concoct Shakespearean soliloquies reserved for (and truly loved by) my dog. I am the first “physiognomist” of interior design and landscaping. At this level I have no intention of sharing anything, with anyone, and wouldn’t dare anyway.

Which I suppose makes me dreadfully selfish and self-absorbed. But my home (from age and borrowed time) is the universe, shared only with the “four-leggeds.” It’s a phantasmagoria of constantly shifting scenes laid out to resolve, or find detente with, the world’s imperfections. My home mobilizes all my reserves of inwardness. It is the only inner sanctum there is, sacred ground.

In line with “Gotham City” introversion, I’m also the original anachronic flaneur. The crowd lends relief to the power of observation. I observe people from the fringes, in silence. I’m a datum among countless data. The city then becomes another room of my own, a store window where I rearrange the disarrayed. I look and buy in pursuit of bringing another shelf-space to life in my home (and mind). I collect like a church mouse – things small and relatively “worthless” but richly tailored to this body and mind.

The city dwells in the chthonic, the subterranean, while always showcasing its best side. The juxtapositions of strata and substrata constantly shake the urban topography. These are elixirs which also shake the psyche. Home and city, inner and outer, form an alchemy of constant movement sparked by the fire and water (calcinatio, solutio) of swirling interactions. I read the signs every day to inventory what is truly of value, and what isn’t.

In line with this I qualify, like the artist, as one always in some kind of exile. It’s a need to constantly be liberated from conventions, stasis, labels and categories. It means seeking the margins in almost any context – not from some egoistic desire to stand out but for the simple need to breathe. It’s more claustrophobic panic than conspiracy. It just happens to be that whenever I’ve followed a prescribed path it turned out to be a lesson on what not to do the next time, where not to go, who/what not to confide in. It became an instinct to pull away and find my own clarity in the moment.

The rewards of accommodation and fitting in are tempting for many, but the need for air always leads back to voluntary exile. Exile frees me from having to move with caution, from violating protocols and rules, from upsetting others by simply trying to “belong” and then failing. Exile means a refusal to conform, a rejection of mindless norms. — I also have to say here that such exiles occur at mostly subtle levels. They are plainly nondramatic in nature – except of course when unleashed in the privacy of my living room.

Next, I would describe myself as a failed mystic – pilgrim without a shrine. This I know has been used to describe others, intellectuals mostly who have drained their resources trying to believe in “something” otherworldly/supernatural, and still failing. Every argument they posited in trying to legitimize a grand narrative of some kind never survived some festering counter-intuition buried deep down. In the end they could simply no longer lie or pretend. It’s like the Christian saying, “I was a Catholic all my life, until I reached the Age of Reason.” This is the existential cul-de-sac to which I constantly return.

The only footnote to this would be that those with whom I strongly identify are those who stay open and intensely curious about what’s left after the grand narratives – those “spaces” which can’t be named or even approached. My heroes are comfortable floating in that void. The most renowned (spiritual) atheists say that to refuse the possibility of realities beyond our understanding, our ability to grasp, is simply arrogant and stupid. At the same time such spaces are not supernatural and do not imply the existence of a divine being – in other words, terms designed to reduce “it” to conceptual models we can manage and exploit. Touche to that. The ancient term for the numinous (or numinosum) comes to mind: it’s not known what it is but is known to exist.

Next, terms like esotericist, exegete, glossator, observer, and intellectual fit well. Also the flaneur strolling through as many urban streets as possible, taking notes and laying down glyphs (graffiti) on subway walls – “Kilroy was here.” (or was it “SAMO” – Basquiat’s signature hello to New York)? Edward Said stated, “There are no rules by which intellectuals can know what to say or do; nor for the true secular intellectual are there any gods to be worshiped and looked to for unwavering guidance.” It’s always about a “spirit in opposition” which is “found in dissent.” It’s also about “disputing the images, official narratives, justifications of power circulated by an increasingly powerful media – and not only media but whole trends of thought that maintain the status quo, keeping things within an acceptable and sanctioned perspective on actuality.”

But what I really appreciate about Said is his view on the intellectual’s most necessary station – as the consummate “amateur.” He always remains one (at least in spirit), avoiding all the trappings of fealty to bosses, company lines, goals, and salaries. In this sense the amateur is always among the “exiles in their own country.” Theodor Adorno said (as did Jesus in his own terms), “it is part of morality not to be at home in one’s home.” And, on writing: “For a man who no longer has a homeland, writing becomes a place to live.” The amateur is incorrigibly independent in mind and spirit and answers to no one. Hence “our” eternal economic dilemma: we’re always broke and nearly always ignored.

There is virtually no place for the amateur writer/intellectual. Editors say time and time again, “not interested in personal opinion pieces.” But, I ask, what is not someone’s personal opinion?! What is not seen through someone’s personal lens? Editors and their hired minions reserve those spaces for their “professional” selves, which means they’re only interested in their own soliloquies captured in their own living rooms.

Not to digress, but it’s worth mentioning the problem of the so-called “intellectual” in America today. He works for the boss, nine-to-five, never rocks the boat or strays outside accepted parameters, and above all “makes himself marketable.” He’s controversial and political, but only in the service of his own needs (which are the same as those of his boss – the editor, the executive manager, the CEO). He’s seduced by prizes and awards for consistency. He is also “certified” (Ivy League educated) within only one specialty, his own field, never straying into some other intellectual’s jealously guarded territory of expertise. He stays politically correct while feigning a certain “cheekiness” by veering outside boxes with cleverly honed language skills. Fancy slogans and catchphrases become his signature byline, his earmark, his bread and butter. In fact, the truth is, this is the stamp of the pseudo-intellectual – lots of daily noise, no substance. Nothing really moves because it’s not supposed to.

This, as opposed to the amateur, who realizes more than anything that his role is to seriously break rules and force the confrontations avoided out of fear and complacency. He’s the unwanted “conscience” of most professionals trapped by market forces, PC, and specialization. He also breaks into others’ private fields and specialties without fear of giving offense. For him it’s about putting together thoughts, ideas, conspiracies, theories, and possibilities usually avoided. He does this because others simply don’t – they don’t dare. He asks the “wrong” questions, puts forth the “wrong” ideas.

Hence the benefits of staying freelanced and amateur (not in any way synonymous with the dilettanti – those with mere “superficial interest or knowledge” of things). It’s a notion and stigma amateurs have to challenge every day. In America the college degree and formal accreditation are everything.

The amateur artist observes more honestly than anyone. Nothing stands in his way. As for “art” per se, not much to say about it here, except that, again, nothing is what it seems – particularly in today’s world. We can thank the subterfuges and violence which have become the signature of living today, of simple survival, for the vertigo we all experience. So, again, there’s the appetite not just for the marginal but the underworld, for the esoteric and concealed, for creativity, and for what remains the nemesis of convention. There is never a lack of factual data (from experience) to confront the meretricious, the cheap, outright dishonesty and hypocrisy.

Proust used the word “habit” to describe the dullness of routine, what congealed into convention. Routine and convention both dull the senses. He went through life “in search of lost time” (his “remembrance of things past”) only to realize that what mattered most was relearning the simplest pleasures collected in childhood. This is what artists (including introverts and recluses) do. They search for the pure subtle moments of the everyday. A “Proustian Moment” is that which is simple and delightful, inducing a pause of intense remembering. Much of art is one long Proustian Moment while dealing with the lower rungs of consciousness which habit-ually come with it. I suppose, in the end, doing a “resume” at this time in my life is about trying to summon the memory of who I am. It’s a template for remembering, seeking out my own “lost time,” and keeping my senses from getting dull.

The remarkable thing about a resume is that it’s never static. It constantly changes and is being added to. The deeper it goes into itself, the more spread out it gets, the more it entails, the richer the topography. None of us end up being (or staying) what we say we are.

© 2018 Richard Hiatt

THE DIOGENES CLUB

I should start a club – the Diogenes Club. Even if I’m its only member. Named after, but definitely not the same as, the Diogenes Club of Sherlock Holmes fame. There are similarities: stretching mental boundaries, treating the impossible as possible – Holmes’ trademark for sleuthing. But my Diogenes goes beyond that pale, beyond where Holmes is allowed to go. Mine dignifies insanity instead of criminalizing it. Holmes has no choice, but I’m not beholden to Scotland Yard or preserving order.

That leaves a window open for “unusual” possibilities. And in this world, starting this new year, there seems to be no other sanctuary than those landscapes Holmes has no choice but to apprehend. The deeper we move into those thickets, hopefully, the denser the foliage in which to escape detection. Maybe this new year will lead us to blazing new trails. Maybe the undergrowth will posit newer and newer discoveries of the self. Maybe King Kong does exist on some fog-hidden and forgotten tropical island.

I don’t know where this idea is going, if anywhere. But emptiness (as in “emptying out”) is the first requirement for an open window which can free us from confinement. Consider this an overture for us all – a post-apocalyptic adventure (the anti-Christ apocalypse having already happened with Trump – the “orangutan” figurehead on Bosch’s Ship of Fools).

Diogenes himself was a madman – a perfect namesake for the club. He lived in a “ceramic box” (some say a large wine crate) in the streets and reminded passersby that they were all crazy. He held that animals showed more wisdom than humans and chose to live with them instead. Born in Sinope, he was exiled for defacing its currency – arguably making him the first socialist – and landed in Athens, becoming an Athenian. Having arrived, and having no need for conventional shelter, he took his instructions from “a mouse” on how to live. Having also volunteered himself as a slave, he saw himself as freer than any aristocrat. Plato said he was “a Socrates gone mad.” He was the classic cynic, attacking convention at every turn for its stupidity. He despised pretensions to knowledge that served no purpose. It was either about “right reason, or the halter.”

He scorned sophisms. Plato defined the human being as a featherless biped animal. Diogenes plucked a chicken and brought it in, saying “Here is Plato’s human.” He quickly deflated a growing superstition that some humans “grew horns” by touching their foreheads and reporting no bumps. He challenged the mathematical proposition that “motion” did not exist by standing up and walking around. He attacked religion, saying the only way to be in accord with nature was to be rational. He chose voluntary poverty to criticize the institutions he knew were corrupt. He slept and ate wherever he wanted, violating social protocols, and called himself a “citizen of the world.” He begged for a living and strategically placed his large “ceramic home” in the city’s center for all to see.

He was also notorious for daily stunts – like carrying a lamp in the daytime, announcing that he was “searching for an honest man” (never having found one). He sabotaged lectures and crashed parties. Upon seeing a peasant drink from his hands, he smashed the only bowl he had, saying, “Fool that I am, to have been carrying superfluous baggage all this time.” He also said that he wished “to be sold to a man who needed a master.” – Not surprisingly, Diogenes was one of the founders of Cynicism.

It raises the old, and new, question: Are we seeking refuge in our own wine crates? Do we wish we could? Is it me, or am I slowing down in an accelerating world, left in the “toxic dust” of a blue cosmic ball spinning off its axis? Does it feel like we are mere satellites spun out into space by centrifugal forces beyond our control? And, as dust, does that mean we’re no longer needed, no longer relevant? On one level, I have to confess, I hope it’s true. I say allow it to “let us go.”

I’m thinking of a movie scene: The Allies have broken into a French village and “liberated” it, but not before stumbling into a hospital for the mentally insane. It’s a brilliant juxtaposition, what could easily be an essay on “relativism” in a psychiatric journal. The soldiers burst into the sanitarium while killing the enemy, while “the insane” watch in horror crying for it to stop. Eventually a couple of patients observe and adapt. They pick up some abandoned rifles and start shooting. And as they also begin killing people they smile and announce, “I’m sane again, I’m SANE!!!” – Diogenes’ touche to the Athenians.

Hence my wish to be cast free like an old satellite no longer serving any useful purpose. Another reason is that I feel it’s simply too late for the human species, that is, to save itself. I no longer wish to be a member. Mother Earth (Gaia) has been taken to her limits in her effort to sustain us. But we’ve simply pushed her too far. I think it’s time our species stepped aside and gave another species its turn. I don’t even think it will be long for that wish to become prophecy because almost everything we touch we fuck up – literally. There’s virtually nothing on earth we “improve” unless we define the term differently, around our own expedient needs. If improvement means returning to its “permanent all-natural state,” in harmony with nature, then we continue to fail miserably. We’ve become the problem, cancer cells on a body trying to heal itself.

Nature’s lesson: When a wolf gives birth to a bothersome pup, she kills it to protect the litter. Gaia will kill us by virtue of allowing us to kill ourselves – via nuclear war, toxins, famine, global warming, any number of things. To be sure, the planet is not in trouble; she’s not the one “going away.” We are going away. It may take 80 million years for her to clean up our styrofoam and radioactive mess, but no matter. She has all the time in the universe. And even if the earth does go away sometime, nature never does. She (earth), nature (beyond earth), is simply rearranging the furniture. That speck of sand we live on, on the shoals of that limitless beach, simply becomes part of another sandcastle – another quasar, another nova.

In that light, the truth becomes a kind of liberation. The window is not to continue indulging excessively and stupidly (we’ve already been doing that) but to transgress old boundaries, old rules, taboos, and constraints. The worst others of our kind can do, in the Holmesian tradition, is censor us, confine us, scandalize us with their not-so-veiled allusions to crossing lines and breaking rules. Sherlock must pursue the criminally insane.

The insanity felt today (and make no mistake, we’re all feeling it) is an expression of the prison painted so artfully by Rene Ricard in The Radiant Child. An homage to Jean-Michel Basquiat, Keith Haring, Judy Rifka, and others, it depicts creativity as a prisoner:

[A] picture your son did in jail hangs on your wall as a proof that beauty is possible even in the most wretched, that someone who can make a beautiful thing can’t be all bad…. An object of art is an honest way of making a living, and this is [as] much a different idea from the fancier notion that art is a scam and a ripoff. The bourgeoisie have after all made it a scam. But you could never explain to someone who uses God’s gift to enslave that you have used God’s gift to be free.

Our creativity is eternally behind bars. It sits quietly, held hostage, waiting for any opportunity to break out with purity and impunity. Meanwhile it stays hidden, repressed, conscripted and molded to the shapes and sounds of society. It must titillate and entertain, impress us by convincing us that we’re progressive and bold, but never overstep itself. It must never threaten our sense of being here, our preeminence, our centrality to everything. Because even when we admit that we’re not central to anything, we still are. As Umberto Eco said, our mental world is still Ptolemaic: The sun still “rises” in the east and “sets” in the west.

That egocentrism has reached critical mass. There are no more corners to round even as we (blindly) continue turning endless corners. Our centrism runs amok like overpopulated mice going insane in a maze. We’re running in circles on a treadmill hoping to get somewhere new. Artificial intelligence remains artificial, mimesis remains mimetic, artifice, simulation, the simulacrum – all saccharine, plastic – opioids for the next temporary high. We’ve run out of analgesics for the soul.

Whatever goal we set for ourselves eventually proves deficient and restrictive. Consciousness is endlessly redrawn as it constantly seeks to know itself. And as it moves forward it’s almost as if it has a constant thirst for alienation and disillusionment, even estrangement (from itself). At first, and for a long time, it follows linear pathways, until “time” itself is no more. Then it faces a new conundrum. This is where we are now. Our previous goals for excellence have become insignificant. We are now in an uncomfortable (loud) silence, not knowing who we are or where to go.

On one level an existential silence is what we wanted in the first place. Wasn’t that our spiritual goal – genuine emptiness, pure silence? But instead we’ve become terribly self-conscious, and silence still requires its polarity to exist: there is no up without down, no left without right. We’ve overstepped our own ambition and “gotten ahead of our skis,” using today’s vernacular.

The term modern (L. modernus) was first used in the 5th century by Christians to distinguish the present from the Roman and pagan past. Since then it has always required a view of the past to understand the present. This is the dilemma of postmodernism. That is, it defines itself as no longer needing the past, no longer dependent upon it. It revolts against tradition, against all that is normative. Instead we have this “heroic” affinity with the omnipresent, having blown up the continuum of history. We are now “posthistoricist,” “post-avant garde.” We demand unlimited, unbridled self-realization, brain-shattering self-experiences via hyperstimulation “right now.” Though we still reach for the avant garde, it is no longer creative. It simply mimics and repeats (hence the treadmill). Postmodernism is dominant but dead.

Paul Ricoeur put it this way:

[I]s it necessary to jettison the old cultural past which has been the raison d’etre of a nation? …. Whence the paradox: on the one hand, it has to root itself in the soil of its past, forge a national spirit…. But in order to take part in modern civilization, it is necessary … to take part in … something which requires the pure and simple abandon of a whole cultural past…. There is the paradox: how to become modern and to return to sources, how to revive an old, dormant civilization and take part in universal civilization.

Being lost and silent in a disquieting “eternal present” presents a problem of mythologies and archetypes. Are we swimming blindly amidst a new mythology attempting to shape itself in a new age, as Joseph Campbell suggested? Erich Neumann said that historically the archetype “is crystallized in its realization of man in time; [it] enters into a unique synthesis with a specific historical situation.” This is what helps “constellate the individual.” But today one can’t tell whether there is “a monologue or dialogue between man and the ultimate.” There’s no fixed framework we can use to understand the archetype. Hence there is no isolation between self and other or between contexts. Each individual has become “everything” and “all at once.” We don’t even know what’s real or virtual anymore. And one wonders if there will ever be any kind of individual “separateness” again. The archetype hasn’t changed (it never does); rather, we find ourselves prematurely searching for a myth which might explain it.

The Greek tragedians had an interesting word: anagnorisis. It refers to a point in a tragedy when the protagonist recognizes his true identity and the true nature of his situation. Maybe Diogenes in his madness was on to something after all. Maybe it requires self-containment “away from the madness” to actually see our true identity and the real situation before us. Maybe extrication, expatriation from the mass-consciousness, is the only ticket to freedom.

I personally don’t believe there is anything any individual or group can do (at the grassroots or above) to alter the course of this “ship of fools.” We can pray, chant, and raise money all we want, but it’s not going to sail us into any port that’s better or safer. The thinking is all wrong in the first place. It assumes we have to get back on course to where we were. But where we were (thought of as “sane”) was what we were trying to leave in the first place. Secondly, it isn’t going to happen because we can’t go backwards anyway. And thirdly, the right direction is “where we are now.” We need to understand it. To embrace the truth is to begin embracing what is possibly a new archetypal “image” taking shape for “this moment.” We need to embrace it, in the spirit of Thanatos (winged boy akin to Cupid, an “Eros” known for his gentle passing into death). The only way to “the other side” is through it. We have no choice anyway.

The world is going mad, or is already mad. I think, in the spirit of Diogenes, we should join it, albeit in suspension from it – like an abandoned satellite. “The mouse” Diogenes listens to for guidance stays clear of the maze and the treadmill.

The old Persian tale comes to mind: The courtier reports to the king that the whole village is going mad after drinking from the well. The king says “we will drink from the same well to show our support for the people. But before we do, we will draw ‘Xs’ on our foreheads. This way we can look at each other and remind ourselves of who we are. Unlike our brethren, at least we will know we’re mad.”

© 2018 Richard Hiatt

STUFF DREAMS ARE MADE OF

The other night I had the most revealing dream I’ve had in a long time. I was in the American Army – at first not, but then magically conscripted – one of those blurred glissades.

When I was young and strong the military culture did not faze or concern me. I always out-performed the average GI Joe in terms of physical prowess, coordination and endurance. Getting old(er) has reversed that. Today it’s all about watching “kids” with too much testosterone who have suddenly “grown up”(?) with high hopes and ambition. To use the military vernacular, “young, dumb, and fulla cum.” As their generation takes over and looms larger than life, my own strength wavers in equal, inverse proportion. This dream reflects both transitions side-by-side. Our lives are like passing ships; we look over our shoulders wondering “who was that?”

The first scenes are vague and hard to recollect. But eventually I find myself along the French hedgerows of World War II – the bocage. Strangely, I am crouched, sitting (balancing myself) on a large wooden post among a line of posts connected by barbed wire and forming a fence-line. The ground below is thick with mud. The American Army is all around, engulfing me with passing tanks and trucks. GIs are marching by, saying things to me which are indiscernible. But they feel like ridiculing remarks. They’re summoning me to “get off” the post. Perched above, it feels like I’m an observer, someone who has somehow materialized out of nowhere. I’m surveying the procession from an elevation which is too high for their comfort. By overwhelming “demand” I start to climb down.

In the next scene I’m down at the lower dark side of a tank as it moves along – almost underneath it. I can see the treads churning mud up and down, slipping, sliding, and sinking as it pushes along. I’m almost sucked underneath and crushed. I’m caught in the treads which are pulling me under, while I desperately hold on to something above. Meanwhile I continue hearing dismissive remarks from passing GIs, as if telling me to “get trounced,” “commit yourself,” “show your patriotic mettle.” I’m flustered, angry, resentful, and helpless. – The dream ends.

This somewhat accurately describes my current residency in a military community and in a country at large which has gone so far over to the lunatic right that “America” feels like one large military encampment – replete with military protocols, mannerisms, salutes, effusive patriotism, censorship, flag-waving, spying on “subversive” activity – and most of all rigid mass-conformity to its rules. This is the feeling I get everyday, what I personally observe, as a “subversive/liberal” in this (what has become a) no-man’s land. The dream is not archetypal (by definition) but “personal” and direct. There are no mysteries or unanswered riddles here. It mirrors my current life in the US, as a citizen.

What alarms me more is that no one in this indoctrinated climate seems to even notice or care about how this culture has been systematically co-opted. No one even flinches at its presence. Much of it is subliminal; but much is also unabashedly loud and direct, even confrontational. People simply say it’s just “the way it is…. it’s America. Get used to it … get with the program!” They wave the flag, stand at attention for the national anthem, and as if mentally conditioned (on cue) affect great emotion about amber waves and purple mountain majesties. Behind everything “American” there’s the phalanx of “heroes” walking in slow motion on their way to national martyrdom for some holy cause, followed up with sounds of the Battle Hymn of the Republic (or a C&W station playing “country-gospel”). Everyone is in a uniform now, either literally or “in spirit” (fireman, police, paramedic, military, garbage collector, bug exterminator, and John Q. Public) ready to serve god and country.

Police now dress like paramilitary units in battlefield (SWAT) attire and helmets, equipped with the latest high-tech assault gear and armor-plated vehicles. They are now “counter-terrorism” units and “first-responders” executing “preemptive actions,” deployed every day to “protect and reassure” (quoting a news anchor just yesterday). It’s all about taking for granted that they are in the streets not to protect but to control and dominate through an intimidating omnipresence. It’s all about the militarization of America, of Main Street, while the Posse Comitatus Act of 1878 is summarily ignored (prohibiting the US military from carrying out “police-type” functions on American soil; the military will be “kept out of law enforcement”). It’s about the “policification” of the military and the militarization of the police. There’s even an American Military University now in which young people are encouraged to find new careers.

The dark side of all this is that no one dares question it, even if the commander-in-chief is openly racist and average “civilian” attire could one day actually include “funny little hats and armbands.” One dares not reveal a moment’s doubt or hesitation – lest one be thrown under a tank. The Espionage Act and the Sedition Act have both recrudesced from the muck, been reinstated, and been posted on every signpost leading in and out of camp. Winston Smith “loves Big Brother” and Sinclair Lewis’ prediction about fascism is recited with an almost sick and eager anticipation: “When [it] comes to America it will be wrapped in the flag and carrying a Bible.”

It’s not so much about the prophecy coming true as it is, again, the public’s non-responsiveness, its silence. The public not only embraces it, it celebrates it as if it were some alternate Second Coming. Democracy in America today is still all about utopia: It’s the Orwellian belief that “freedom” means doing away with freedom itself, along with civil disobedience and dissent. WAR IS PEACE, FREEDOM IS SLAVERY, IGNORANCE IS STRENGTH. Conformity (monochromatic olive drab, boots on the ground) trumps everything in national importance while the disenfranchised-poor remain disenfranchised and poor. The utopian dream is that eventually we’ll all look alike, attend the same institutions, and think so identically that no one will ever have to worry about uncertainty again. Uncertainty translates to instability, which translates to insecurity, which translates to evil (according to the national religion). There will be no more unknowns. This is the comfort-pill America still craves, the most powerful opioid (once the “opiate”). It’s the new American boot camp.

Upon final reflection on this dream, the feeling of being trampled in the mud under the tank is not just about punishment for questioning the system (or a corrupt military-industrial complex); it’s about the direction it’s going (Onward Christian Soldiers marching off to hegemony on behalf of wealthy corporatists) and my “crime” of seeing it from a higher place – absolutely verboten.

Now that I think of it, I also have a camera and a notepad with me on the fence post. I’m supposed to be on “post duty,” but instead I’m critiquing the post. Both camera and notepad are crushed and buried in the mud. I’m almost pulled under as the camera’s strap is still around my neck. But I pull free.

Normal living (for me) is now about barely hanging on and staying above the muddy undertow of a tank. It’s about “treading.” It’s about banishment for being “on the fence” with any questions and uncertainty when it comes to “the American Dream.” It’s about the darkened hedgerows of forced patriotism, supreme sacrifice, and “compulsory love” of country.

George Carlin said it most eloquently: “They call it the American Dream because you have to be asleep to believe it.”

Dreams don’t lie. They uncover lies. It’s the nightmare on Main Street to which citizens have yet to awaken.

© 2017 Richard Hiatt

THE MOBIUS ILLUSION

For all intents and purposes the 19th century ended with the First World War. The 20th century began in or around 1888 with a cocktail of energies, ambitions, and future fears – emergent stressors of industrial society, mass-production, urban migration, consumerism, and mass-marketing. The fin de siecle was more an imbrication with past and future than any notion of a clearly demarcated period. It phased in just as much as it phased out – the usual habit of movements, eras, periods, and epochs.

The same seemed to happen with the modern era. Virginia Woolf claimed that it was ushered in with the discovery of self-consciousness in the arts – in 1910, when “all human relations shifted.” Others say it was with the death of King Edward and the first Post-Impressionist Exhibition, which influenced social relations, politics, and literature. D.H. Lawrence thought “the old world ended” in 1915 just before the war with a huge push for technology and a revolution in English literature (Yeats, Joyce, Pound, Gide, Valery, Mann, Proust, Vico, Blake, Rilke). Others point to just after the war – 1922, the year of Ulysses and The Waste Land, Brecht’s first play, Lawrence’s Aaron’s Rod, Woolf’s Jacob’s Room, Proust’s Sodom and Gomorrah, and O’Neill’s Anna Christie. Still others claim it found itself in the inter-war period, when Viennese psychology, African sculpture, American detective stories, Russian music, neo-Catholicism, German innovation, and Italian desperation all converged on each other.

For still others “the modern” is traced to Flaubert, the “first self-conscious novelist” who worried about every nuance of his writing – economy, precision, density. He also ushered in “abstraction” by demoting the primacy of subject matter. When asked to describe Madame Bovary, he said it was about the “color brown.” Then he said it was about “nothing” – without a doubt a 20th century, even (post-)modern, obliquity.

The genetics of self-consciousness – expressed through absurd creations, random methods, parody, fiction, aleatory art (“chance” art), surrealism, erotica, and “new” psychology – is constantly debated. The four writers, Woolf, Eliot, Forster, and Lawrence, who in 1922 looked at each other and realized they had moved the tectonic plates of literature, were followed by the “lost generation” of mostly American writers and painters in Paris – Hemingway, MacLeish, Stein, Cummings, Fitzgerald, dos Passos, Crane, and others who took self-consciousness and existential uncertainty to their logical conclusion.

One comes away from all this with the distinct impression that history’s transitions are very messy. They usher in and out in largely nondescript, overlapping waves. When something “officially” announces itself it’s still palpably transparent, not quite there yet, still arriving. And then all of a sudden it’s over – before it arrived.

In the opposite direction from all this, and at the same time, there is the sense that, as Guy Davenport said, “the twentieth century ended in 1915.” How could this be? And why 1915? Because by then all the artists who had “survived the collapse of civilization” (the negative harbingers of the future) had already “completed the work they had planned.” We’re talking about Joyce, Proust, Pound, Wells, and others. In essence, they had already found the best and worst of what came before and used it to portend what was ahead. They had already written out the 20th century.

They had also written out another aspect of self-consciousness – the experience of “self” as going from the strictly personal to social (belonging) – the seeds of Marxism, the sensibility of identifying with the needs of others, the welfare of the commonwealth, all synonymous with the survival of “self.” This would be a prevailing and dominant theme in the 20th century. Writers had already explored this in (or before) the 1920s. Davenport’s point is taken.

This says that the 20th century merely followed a blueprint, a compass, already foreshadowed by a previous generation who challenged the rules of linear time and history. The issue came down to how it chose to imagine its own story, laid out by those already dead and gone. Or, to say it differently, the future was “all history.”

The point of this mental excursion is this: Even with Fukuyama’s prediction that “history is over” having been totally debunked, is it really? Are the future and the unknown already known and being written out? Are denouements and codas being written before their introductions, before story lines even have a chance to develop? Is this why we are just as overwhelmed and lost as those World War I survivors who stumbled into Paris looking for refuge in the form of a different future? Except now there is no Paris – no place to go.

Nothing surprises us anymore. This may be due to the effects of entertainment and sensory overload. But who’s to say it isn’t also the growing feeling that where we’re going we’ve already been? It’s essentially about past and future transposing (“flipping” is the word we use now). If we’ve been to the future, does it mean we’re inadvertently seeking the past for answers to where we’re going – to a “past-future perfect?” Or is it all just a fabrication leading us back to here and now? Is it all one long arbitrary fantasy without beginning or end, a long desperate odyssey going nowhere? Do we confuse the dips and turns in the road as signposts of places seen and not seen?

The paradox is that we seem to be coming to that existential precipice, but without wanting to. It essentially invalidates the journey and leaves us with nowhere to go, nothing to do, and no one to be – except in those Himalayan landscapes of the mind. Or are we already there, just now waking up to the frigid cold of exposure?

Science and art are now both the culprits in this conspiracy. They’re colluding together to essentially “end us.” Our brainchildren have outgrown us and are controlling both directions. The courses plotted are unknown terrain. We’re on paths already taken and not taken. We see our footprints and yet they’re not our footprints. Speaking of children, it’s like looking into the eyes of our children and saying “that’s me – oh, but wait, it’s not me.”

We’re going round and round like this. We look ahead to where we’ve been and over our shoulders to where we have yet to go. There are no clear signposts on where things start and stop. They overlap like the fin de siecle a century ago. It all comes down to perception, how each of us chooses to map out our landscapes.

Now that we’ve crossed yet another threshold into another century, long after modernism has already grown a prefix of some disrepute (postmodernism), what we bring with us is a kind of dowry for the next hundred years. I don’t pretend to know what that portends, but for me the future can only be what overlaps in those inner folds of my imagination.

For example, imagine this (just for a moment): the “simulation theory” proposed by Nick Bostrom of Oxford University. We are actually being “simulated” by computers in the future. We are simulations of their past in the same way that we now run simulations of reality in computer games. Bottom line: We are simulated ancestors, not real ones (someone else’s “rats in a maze,” as it were). This is as wild as it gets, but it isn’t any crazier than what theologians tell us. It’s also simply the reverse of what was mentioned earlier: the 20th century merely repeated what the fin de siecle writers said “already happened.” The future creates its own past, the past plans its own future. Anything is possible, but it merely shows how arbitrary “reality” is.

For myself I can only describe the odd feeling I have about my own future and past. In one way I’ve been there and “done” the past according to the laws of linear time, rules which are quite clear. But in the most transverse corners of my mind there seems to be no beginning or end, no ancient and modern, no differences between figurative and abstract. Rich complexities are still there, the filigree and flora lining the pathway, and it still excites me. But nothing is new or old. It’s both new and old, death and birth. And whatever it is, wherever it is, I’m right here.

Does that suggest another overlap – of inner and outer worlds, the intersection of private experience and social reality? It seems obvious. What else could it be? Chalk it up to “getting old,” to which there is a growing corollary. It reminds me of the look I see in the eyes of those who have been around a long time. They know that plus ca change… Add to that a favorite phrase used by Sherlock Holmes: omne ignotum pro magnifico – “the unknown tends to be exaggerated in importance.”

© 2017 Richard Hiatt

THE FREETHINKER

The intellectual is always in exile. There are several notions of what an intellectual is, what his function is, but a true freethinker goes against several grains simultaneously in pursuit of something surprisingly simple and straightforward – speaking truth to power. That very simple task bleeds him severely and casts him into hostile landscapes with which he is never fully prepared to deal.

First we need to examine a liberty I’ve already taken: treating “freethinking” as synonymous with intellection. They are related but definitely not synonymous. Historically, many intellectuals have been those academics, teachers, priests, and administrators hired to do the same thing from generation to generation, to pass on the same information which serves specific classes, enterprises, and interests. If they stepped out of that role they were disciplined. These were (are) not freethinkers. Then there are the so-called freethinkers who are rewarded for challenging status quos, but only to a point. If they step too far outside acceptable parameters they’re no longer called great thinkers but fools and kooks. They’re stripped of their imprimatur and made into clowns.

Freethinking addresses and exposes the more immediate concerns over the deceptions which widely characterize our species. Socialization seems to be mostly about order, and order about rules and inviolable limitations. By definition “creativity” is the process of “bringing new things into being.” It’s an exploring and expressing of the unknown. The gift, or curse, of those daring to venture there is not so much about discovery per se but about retrieving and translating it in a manner which safely assimilates it into society. One’s very reputation, and sometimes safety, depends on this. Genius or insanity rides on a facility to sculpt, paint, write, perform, or explain it “well.” Language is everything.

Language exposes and sugarcoats at the same time. It reveals truth in portions, degrees, and frequencies. It also readjusts those portions according to its user’s ability to “listen” to prevailing attitudes and temperament. Those most out of touch with the rhythms and cycles of intellectual, emotional and religious assimilation have been the “troublemakers and radicals” ending up crucified. Only after death and considerable time do they find themselves finally understood and forgiven, usually with great contrition by the living. Most “great” artists are dead artists.

This is the artist’s (and freethinker’s) dilemma: Creative discoveries don’t wait around for translation and presentation. Initially they explode like bombs, shake the soul, and defy rational explanation. One is quite “out of his mind” to the degree that he’s ventured into those spaces which “bring the new into being.” He must slow down and risk “bringing it down” (losing its primal essence) in order to legitimize it in user-friendly terms. He quite literally rests on a precipice between legitimate discovery and total dismissal, between getting mentally lost in the creative moment (retreating from society) and choosing to be its messenger, its conduit and intermediary.

Many choose the first path, simply slipping out of sight (underground), out of membership in the human race. If born into other, more exotic cultures, like India’s, they are still somehow acknowledged for their knowledge and wisdom, choosing to translate what they know in verbal (or psychic) semaphore which only some understand. They are the “lucky ones” to the extent that their societies/communities preserve their social relevance.

At first glance it just seems to me that it’s a minority who actually choose to stay around to contribute to the improvement of the human condition and to risk the repercussions of doing so. It seems that of all the great geniuses in history, scientists and artists, not one has ever escaped serious (and dangerous) scrutiny for his work. Genius simply carries with it severe consequences. Hence, the need to have a strong will (and ego) to enable one to fight off detractors, skeptics, and those simply hostile to new ideas and ways of thinking. Simply put, those without a strong will had better stay home.

I don’t know why I think this – that a majority simply choose the path of the “idiot” (Gr. idiotes – “one who does not participate”). Maybe it’s just a feeling, but it seems that anyone gifted enough to stray into a creative epiphany or revelation, one big enough to be life-altering, would not wish to face the gauntlets of scrutiny and moral judgment. The discovery, it would seem, would be enough – for a lifetime. Why would it matter if the world knew it or not? I suppose it depends on what the epiphany is, what field it’s in, and what it’s about. If it’s about curing a disease I suppose the world would need to know about it.

And yet another epi-phenomenon may come into play as well: the theory of “psychic resonance” – that whatever the discovery is is already known by everyone anyway, because it floats in the collective consciousness. All the idiot does is to “bring it down” and translate it. But it’s already there for everyone to experience in the recesses of their own unconscious. It’s simply a matter of time then before someone else brings it into “waking consciousness.” In that sense then, as Einstein said, “there are no [personal] discoveries.”

Baba Ram Dass once said, “You already know everything I know. You just don’t know it yet.” He observed that people prove it to themselves every time he said something. After he said it they would all nod their heads as if saying “oh yeah, that’s right.” “Well, how do you know if you didn’t know?!”

Now it becomes clear why such individuals never allow themselves to take credit for myriads of “discoveries.” Most simply say they are “messengers.” It also explains why they are at a total loss to explain how their discoveries happen. They never seem to know because it’s almost as if it happened to them. They had no choice in the matter. Einstein and others essentially described such moments less as a doing than as an “undoing,” a stepping out of one’s own way that creates a kind of space for something (visual, auditory, kinesthetic, intuitive) to pass through. Whenever a great songwriter (like Bob Dylan or John Lennon), or a composer (like Mozart or Chopin), was asked how he came up with a song or aria, he invariably said “I don’t know, I just hear it.” When asked if he could ever “do it again,” Dylan riposted, “absolutely no way!!” The artist also knows when the conduit closes up. He has no control over it.

Hence, the precarious and dangerous waters of freethinking. It’s the trump card one dares to play at the risk of his welfare and survival. This is why I said at the top that the intellectual (as a freethinker) is always in exile. It seems he’s always running for his life as he races faster and faster towards the unavoidable truth. Sometimes he must slow down in order to gather himself, but the creative impulse never decelerates and too often drags him into its vortex like a hungry beast. Again, it happens to him. At that juncture he either loses himself or saves himself. – In the end, as someone said, “Why do you think Einstein looked like that?!” He had “endorphin rushes” that would have sent any normal person over the edge. Somehow, he managed to stay grounded long enough to produce his theory of relativity. Academia also protected him from the commercial world and media.

I have the sense that many geniuses are always right at the brink of leaving the temporal plane. If they don’t remember to hold on they’ll fly off into higher spheres. In which case the rest of us end up nursing someone’s psychotic mind in a psychiatric ward, someone mumbling nonsense in a wheelchair.

Years ago, living and working in Denver, I used to commute on a hot and crowded bus often filled with irritated (and irritating) commuters. I did this every day. One day during rush hour the bus made a routine stop to let someone off, and in the middle distance I saw an elderly woman sitting on a porch. She was smiling ecstatically in a way that reminded me of the Buddhist Hotei – the “laughing Buddha.” She was reaching out into thin air almost as if grabbing beams of light. She would grab and then smile blissfully. In the simplest terms, I interpreted it as a “state of bliss.”

At that moment the bus began moving again, and I instantly came back to myself, to my own physical discomfort, my job, my life situation, to the bus’s “SRO.” In a flash I did a moral inventory of everything. It was a small epiphany of my own – or was it one transmitted to me from this woman, her consciousness tapping into mine? She was, by all psychiatric and medical standards, challenged with “severe dementia.” But then I thought of Thomas Szasz’s famous observation that dementia is culturally relative and that definitions of “sanity” are often “insane.” If life is all about achieving bliss, whether fully “in body” and “of sound mind” or not, she had the goods on us all that day. It was one of the most humbling moments of my life. It was also a reminder that “creativity” goes places, knows and sees things, we can’t even imagine. In my mind I wondered how she would be received in a culture like India’s. Here, she’s “supervised and medicated” for “her own good.” There, such people are teachers.

It reminds me of the time I treated a schizophrenic woman. She was referred to me after being diagnosed several times by other shrinks. She told me her symptoms (“voices” in the night). After a few visits I came to the radical conclusion that there was nothing wrong with her – and I told her this. I told her that what she had was “a gift” which in other cultures would be celebrated. I told her to go home and not to worry about it. It was the first time in her life she had ever been praised for something, and in the following weeks her affect, attitude, and whole persona changed. My colleagues couldn’t believe their eyes. When they asked me what I had done, I responded, “I told her there’s nothing wrong with her.” I punched a hole in the very legitimacy of the psychiatric profession.

The next thing I knew I was being invited to this woman’s wedding. Her “voices” did not end, but what was “missed” for so many years was that the focus was wrong. It wasn’t about the voices. It was about her feelings about the voices and the shame she felt for having them. They were now embraced, part of who she was as a “complete and whole” human being. It went from “I have voices which are abnormal and sick” to “I have voices and it’s okay. I accept them (and myself) anyway.”

As to knowing the history of the freethinkers, my knowledge is very limited. I can only refer to those I’ve most admired for their courage in standing up to tremendous ridicule and sometimes physical adversity. From the standpoint of notable individuals, my own opinion is that one of the very first is mostly forgotten and ignored. She lived in ancient Alexandria – the city which, for its library and emphasis on knowledge, marked the end of the ancient world and the beginning of the modern world. Her name was Hypatia – inventor, mathematician, and astronomer, who was tortured and killed at the instigation of the newly arrived Christian bishop, Cyril. He also burned the library to the ground. Of the original 500,000 books (many of which consisted of only one copy each) only one percent survived. Hypatia to me was the first freethinker, especially one who stood up to religion.

That aside, the movement officially began in the 17th century. In England, as expected, it was again about standing up to Christianity and its Bible. It was also a plea for Deism. In 18th century France, it coincided with the arrival of the Enlightenment thinkers – Newton, Diderot, Voltaire, Jean de la Barre, Erasmus Darwin (grandfather to Charles), Condorcet, Thomas Jefferson, and even Frederick the Great. In 19th century Germany, it was with the March Revolution against the Church, the “rights of man,” and Humanism.

My two favorite freethinkers: Bertrand Russell and the late Christopher Hitchens. Russell said that atheists and/or agnostics were “not necessarily freethinkers.” One had to also be free of the “tyranny of his own passions.” He said no one is ever totally free of tradition or his passions, but “in the measure of man’s emancipation he deserves to be called a free thinker.”

Russell was an anti-war activist and later a strong advocate against nuclear weapons. He was fired from Trinity College for his pacifism during World War I, and his lecturing against inviting the US to join Britain in that war earned him six months in prison. He was forced into virtual exile for his views on Hitler and England and on Stalinism (the antithesis of “socialism”), and he attacked the US over the Vietnam War. In 1950 he won the Nobel Prize in Literature.

Christopher Hitchens was a polymath gifted with “eidetic memory.” He reminded us all that a true freethinker successfully avoids political labels and categories. (S)He’s neither conservative, liberal, nor centrist on anything. (S)He will stir up arguments and criticism from all sides at a moment’s notice. Russell did this, but so did Hitchens in a very big way. He was not just unpredictable; he was a formidable debater. Among those who knew him and/or debated him, the saying went: “You can fool the grand committee, you can bid for sainthood, but you fuck with the Hitch at your own peril.”

Hitchens had his own inner circle of admired and respected freethinkers: Salman Rushdie, Ian McEwan, Richard Dawkins, Noam Chomsky, Martin Amis, and others. Then there were the “Four Horsemen” who faced down religion (“supernatural, celestial dictatorships” demanding “compulsory love”): Richard Dawkins, Daniel Dennett, Sam Harris, and “Hitch.”

In the end, it seems that, even today, there’s only so much room for the freethinker. He’s the “welcomed intruder” at the king’s table, given limited space until such time as he begins to rub elbows with tradition and personal ambition. Then he’s sent into exile, banished from the realm to the nether regions of dispossession and disgrace. But little do the inquisitors know that he’s been sent to his liberation – from the trappings of hubris and arrogance. He lives with the truth every day, the most intelligent and civilized place of all.

What makes a freethinker is not his beliefs but the way in which he holds them. If he holds them because his elders told him they were true when he was young, or if he holds them because if he did not he would be unhappy, his thought is not free; but if he holds them because, after careful thought he finds a balance of evidence in their favour, then his thought is free, however odd his conclusions may seem. – Bertrand Russell

© 2017 Richard Hiatt


GODS AMONG US

The differences between the Acropolis and Washington, rock ‘n roll and Chopin, a horse and a Trojan horse, French wine and “black soup” (pig’s blood and vinegar swilled by the Spartans) are the differences of imagination. It seems the first thing the very first human ever did was to imagine – something, somewhere, someone, somehow – which was his motivation, his reason to exist.

From that point on the imagination has been the world’s architect and designer. It molds everything. It’s the lens and viewfinder through which everything is perceived. It also has parameters and dimensions, an ultima Thule beyond which it has not yet ventured – but wants to. At the same time it drags the original imaginings, those of that very first human, along with it. Man’s first motivations (fears, dreams, interpretations) never diminish. They’re written on the cave wall and the subway wall for all posterity – one and the same place. An urban jungle is still a jungle.

What this means is that primitive man and modern man live side-by-side. Both are simultaneously alive. One is the other. One speaks to the other, tries to communicate with the other. And, like it or not, the most sublime and pure sense of the modern is the most primitive. Now it makes sense when the artist says that abstraction is the nearest thing to nature we can get. To the troglodyte it’s not abstract. To us it is. Abstraction is a “modern” viewfinder (in camera obscura).

What are today’s gods? The computer, the cash register, fossil fuel, the machine, the smartphone? Each works to rob us of our “primitiveness.” Each measures just how far we’ve strayed from ourselves. It’s one thing to take our ideas, colors, and textures from nature; it’s another to extemporize on them to the point of no longer knowing them. And yet what each false god still strives for is the power of regeneration. The imagination has strayed so far as to mistake maps for terrains and fingers for moons, and it tries, rather pathetically, to correct itself in a world of simulacra – imitations without originals. Our hubris has actually convinced us that originals are no longer needed, so we throw them away. We presume to make our own, or think we do.

Modernity is a kind of stupidity. I think we all know it. I think we intuit the worst and still deny it. We fear the alternative so we just keep plowing ahead while the past never leaves us. We can’t escape who we were and are. It’s like a shadow always following us.

The quintessential epicenter for this collision between ancient and modern, new and old gods, (imaginations doing battle) is the city – the place designed to distance us from instinct and memory. It’s the very “unit” of civilization. It all happens there. The best and the worst in us come out in constant rebellion – the full spectrum of modernity exposed. In that sense there’s really no difference between, say, religion and illegal drugs, or machines, money, power, and smartphones – our “opiates.” These are examples of the imagination stretching beyond the Thule. – Personally, I think I’ll worship a turnip today, while you worship anything you want. At the end of the day I’ll bet that what we pray for comes out roughly even every time. The only proven god amidst it all is the imagination.

Despite our false gods, the only things we hold on to are the significant moments in our lives which re-mind us of our past. They are what we commemorate, yearn for, and relive every day. We put them under glass, showcase them, archive them, auction them, worship them (as art). It becomes all about re-membering again. The greatest modern art is the most primitive cave drawing, and vice versa. Just look at Joyce’s Ulysses, Picasso’s Guernica, Buckminster Fuller’s geodesic domes, Pythagoras’s and Euclid’s laws, and on and on. Picasso said he drew his greatest inspiration from the Lascaux cave paintings. Stravinsky said the piano was a “percussion instrument” repeating drum sounds from the jungle. Wyndham Lewis said that the artist always finds his way back to “fish” (the cave wall’s vesica piscis, the ichthys). The most obviously modern transposes and transmutes into its simplest components.

Knowing it or not, this is what the artist (and the artist within us all) seeks most intrepidly and tenaciously – the primal. It’s the most honest human condition. We intuit and never forget that it’s perennially “alive.” Modernity is a dead zone unless/until we recognize it as a (hero’s) journey through a landscape of the imagination.

It all started going wrong when we decided, as Guy Davenport said, “to explain the mechanics of everything and the nature of nothing.” The fake and the real, the mimetic and the original, prompted civilization to send both polarities to north and south of the psychic map. If and when we ever agreed to embrace the primitive, it went topside into waking consciousness. It was celebrated with the gods we chose for the times. Whenever we denied the primitive it went underground and was assigned to the daemons of heresy, idolatry, and evil – mostly names which come from the ancients themselves and are summarily condemned by today’s sky-god of “light.” Today’s chosen symbolism and rhetoric make a mockery of the truth until such time as our culture decides to integrate Hades – the chthonic, the unconscious, the primitive – with the Heavenly Gates.

We need to question our gods every day. The litmus test for legitimacy is their staying power. Which ones keep dying? Which ones never die? Which ones give an updated voice and persona to the archetype? Which ones don’t? The archetype has been around longer than we have (it imagines us): Zeus, Poseidon, Hades, Demeter, Hera, Athena, and the rest all live on the cave wall AND in the stick figures of Keith Haring and the minimalist graffiti of Jean-Michel Basquiat. The gargoyles of Gothic Revivalism (beaux arts) and the austere/abstract depictions of New York’s Chrysler Building (Heinz Schulz-Neudamm’s Metropolis poster, 1926; Thurman Rotan’s Skyscrapers, 1932) are all cave wall art. Nothing has changed in 20,000 years.

As long as we want to worship deities, why not agree to be at least honest enough to recrudesce those cave wall deities? Why not go underground and bring Lascaux to the surface again? Picasso, Lewis, and Basquiat have been telling us to do just that. There are gods, and then there are false gods. We know which is which. We just need to stop lying to ourselves in following the deities of temptation, power, dominion, greed, and freedom without cost (“the easy ride”).

Writing this at Christmas time, I can offer this as the best (and only) re-minder of the moment. There’s nothing wrong in following the Christian myth. We just need to be honest and follow the meaning it teaches at its pagan religio (“roots”), without the Church’s interference (the dark irony locked in Christian obliquity). If we look for it, it’s there. Alan Watts (Episcopal priest, Buddhist) said the Vulgate Midnight Catholic Mass on Christmas Eve was the most beautiful living ritual he’d ever witnessed. It breathes (inspires, expires) our most ancient instincts. It just requires a little imagination.

© 2017 Richard Hiatt

READING THE TIMES

A successful revolution has interesting characteristics and requirements: First, it has a spirit which never dies. It’s like a river – at its beginning, middle, and end all at the same time, hence its past, present, and future. It is all-encompassing in this manner and leaves nothing out. In a sense it creates a narrative which only keeps developing and changing, similar to character development in a novel. The future then is merely the sequel to already written chapters.

Secondly, it becomes an open channel between an individual’s inner and outer worlds. It is about consciousness and a state of mind which is “in the street” and in one’s heart and mind. It moves simultaneously and equidistantly within and without. The two energies are symbiotic, codependent, and mutually reflective. The more one feels the fury of injustice, alienation, and fractured surrealism of the street, the more he lends a voice (and fist) to his own fictionality and anti-rational psychology, the same aleatory displacement and uncertainty which keeps the soul in motion. One is the other’s lighter twin and metaphor. In On Revolution Hannah Arendt spoke of certain truths which “are not held by us but we are held by them….[They are] rooted in the physical structure of the human brain, and therefore [are] ‘irresistible’.”

Literature is crucial. Revolution in the street also means revolution on the page. In the street every aspect imaginable is drawn into its vortex – politics, religion, art, music, philosophy, media, and eventually even fashion (think of the Beatles in the 1960s). On the page it means capturing “the moment” as a function of time and space, for the purpose of finding a narrative. Apropos of the Beatles and the ’60s, one thinks of all the Beat and subversive literature, dark humor, alternative spirituality, drugs, lifestyle literature, literary criticism, feminist and Marxist literature, and so on. The irony of that decade was that there was a global revolution which changed everything, but no American revolution which could claim to have really changed the equations of power. Civil rights, women’s rights – yes. But the real question of power between old and new guards remained untouched.

We can learn a lesson, however, from the ’60s and why its revolution failed. What it did not render was what the Russian philosopher Mikhail Bakhtin called the chronotope (cf. Toward a Philosophy of the Act, The Dialogic Imagination) – an anchor which fixes “time and space” to a central theme, a core event or symbol. This is what allows a narrative to ensue and literature to make sense of “the moment.” The Sixties missed this completely. Bakhtin brilliantly said there are three aspects to the human psyche: “I for myself,” “I for the other,” and the “other for me.” It was an early effort to announce, in essence, James Hillman’s “to say psychological is to say psycho-social.” Together, as one, these aspects moored reality to place and time. In all of the above, the Sixties had a major problem with synchronization.

Consider what happened. There was no revolution per se, just a myriad of small, fragmented “revolutions” – electoral, New Age spiritual, environmental, student/campus (SDS, SNCC), artistic/aesthetic, sexual, historical/Marxist, and even a criminal/anarchist element which did nothing but steal from the others and call itself “revolutionary.” Too many factions with different, often incompatible, agendas converged and rendered nothing more than mutual alienation and resentment. There was a general “dumbing down” of language and meaning in last-ditch efforts to compromise. Cohesion and substance were sacrificed for meaningless rhetoric like “people power” and “solidarity.” Conventions and rallies gave voice to specious endorsements of sometimes extremely stupid proposals.

Everyone demanded and expected sweeping reforms but ended up compromising on issues so vague and off-point that they had no legal bearing or authority. They went nowhere. Whatever “high-level” conferencing occurred (if it ever got that far) grew so burdened with protocols (subcommittees on committees, steering committees on old steering committees, caucuses hearing caucuses) that it became the travesty of a vulgar joke. It was a marathon of misrepresentation, pointlessness, egomania, and terminal sophistry. As someone said, “too many chiefs, too many tribes.”

Contrast this with the French Revolution – a real revolution anchored to its own time and place. The French knew they had no choice but to shatter old social and linguistic barriers. They eliminated the particular, the local, and the idiosyncratic, and replaced them with a new logic, national order, and clarity. It required a means of unified expression, a new language – written and physical. It was a new discourse of all styles funneled into one. The solution in effect was to “impose France” upon itself as a new living force. It forced new attitudes, and eventually new social structures, even new literary outlets (at least the Sixties had this in common). They made way for a distinct literary culture.

For the record, it didn’t automatically bring joy to everyone. It engendered many antagonisms. But the sheer increase in poets, dramatists, novelists, activists, not to mention social critics and political satirists, was striking. It made for a rich synergy for change. Amidst all the clamor, everyone was still, as it were, on the same page. It was anchored to a singular agenda for everyone.

“The moment” as of right now in America (that is, within current social structures) allows us to read the political landscape only to a point. There are many centrifugal currents growing in many places. At one level it’s all very readable: there are political, environmental, religious, and ideological crises imploding from all directions and a groundswell of “movements.” But they remain fractured and fragmented, dismissed by the corporate establishment and left to peter out like a dying wind (this was also the Sixties’ nemesis). But there’s another element which did not exist in the Sixties: “the moment” is actually more unknowable than imagined because of the existential crisis of postmodernity itself – virtual reality, cyberspace, the conflation of space and time. We don’t even know who we are anymore. Technology has become the most formidable weapon yet for killing movements – not to mention a unified “movement.”

In the early 1800s Paris had its flaneur – “without plan, without order, without method” and yet intellectually and intimately tuned in to the revolutionary theme. The difference in those days was that, as mentioned, everything was anchored to a common achievement. Its observers and participants could scarcely escape it. Paris furnished not just the background for change; the entire atmosphere set the very terms for it. Again, revolution became a state of mind, the consciousness, of those living it. – Today, the flaneur (the American equivalent) lives up to its modern connotation: the lazy and lost sightseer, observing but having no clue about what he sees. He has no map or compass to lead him to anything greater than the confusing and exhausting blur of billboards and road signs.

What’s needed is the Bakhtinian anchor (the chronotope) – a vector, a compass, or what Renata Adler long ago hinted at as a “radical middle.” Just recently I heard an echo of this reenter the public lexicon – called the “radical center.” Both suggest a magnetic axis pulling in and uniting the spokes of a spinning wheel. This is extremely difficult to do given the existential miasma we’re in, not to mention technology designed to keep us lost and divided. But one way (perhaps the only way) to offset the corporate demiurge is through consciousness itself.

The system has not yet devised a computer chip which can change consciousness, except through constant subliminal messaging. A “state of mind” persists in spite of years of forced indoctrination, which is quite amazing. This is our cue. In spite of the pressures bearing on us, revolts and movements are inexplicably finding an “axis” of meaning, a rally flag which all the separate parts recognize for asylum and a shared purpose.

We don’t know what it is yet. But it seems to be coming to the surface quickly, given all the forces now reaching critical mass at the same time. For the French it culminated in its national insignia: Liberte, Egalite, Fraternite – replete with axe, fasces, and olive branch. It required a collision and awkward collaboration of strength of will and understanding, masculine and feminine, to effectively change Republics – but most of all a clarity of purpose and direction in those collisions. The hold of the Revolution of 1789 lay in its proleptic powers (it is “now” – still happening). It’s our burden now to ferret out a common identity and direction for ourselves.

Back in 1963 (before the Sixties really ignited) Arendt said many conservatives believed that global nuclear annihilation canceled the possibility of all future revolutions (oh, how they wished!). Little did they realize what was about to happen. She said the one thing which would never die was “the cause of freedom over tyranny.” Tyranny, it seems, is man’s most powerful proclivity: to dominate and own the lion’s share of everything. Apparently it’s in our DNA.

Hence Jefferson’s counsel that “a little rebellion now and then is a good thing.” But this needs an upgrade and a reboot: today I would substitute “revolution” for “rebellion” and “necessary” for “good.” Mere opposition to authority is no longer enough – because it’s no longer allowed; or, if allowed, it is rendered impotent. Smaller, ineffectual movements are granted, but not “a movement” in its most inviolate form. It needs to go further now and achieve a transformative end, a paradigm shift.

© 2017 Richard Hiatt