SOUNDS OF SILENCE

If words are just signs, and the signs become mere indicators of other signs, and then metaphors and similes, then who are we when all of that is wiped away? Are we left with another kind of language, or just silence? And into the silence, the loudest voice of all?

I think about this when I listen to people speak – about anything. It’s as if “we” step back and position ourselves behind a wall of artifice, a loud and thick veneer of sounds. And that’s who we are. It’s also as if we’re in a conspiracy, agreeing to hide what lies beneath and behind the wall. Behind it there seems to be a silence, and a communication within that silence, which goes around (transcends) that middle ground, the intermediary known as the human ego.

We need, after all, to distinguish ourselves as unique individuals in order to promote meaning and purpose. And language is the only means by which to achieve and maintain our separateness. You have your words, and I have mine. Hence, your persona and mine.

And so, it raises the question: Are we as an evolving species headed in a particular direction with regard to language? As an analogy, I think about the centuries-old philosophical argument over the human diet, the diametrically opposite beliefs in the East and West. Aside from the same arguments concerning the wisdom teeth, the ratio of types of teeth which are evolving (devolving), and even the appendix, there’s the corollary disagreement over the types of food we’re supposed to be eating and how much. In the West we seem to think our canines are “coming in,” hence our growing dependency on meat – and more of it. Those in the East say the opposite: we’re not only supposed to be eating lighter, we’re supposed to be eating less. Hence the related claim that the appendix and the wisdom teeth are also “coming in,” not going out.

One might use this analogy to say language as we know it is “going out,” not coming in. What IS coming in is perhaps what we could tentatively call another type of language, one which requires no words or sounds. Communication continues on without interruption (maybe with some minor stops and starts). It reminds me of two people (or a human and an animal) who have known each other all their lives. At first, and for years, words/sounds are used. But by the time they’ve reached their autumn years together, communication is nearly all telepathic in nature. Think of Anne Sullivan’s work with Helen Keller. Think of the Horse Whisperer. Think of your own elderly dog’s ability to know your every mood.

Telepathic communication, and most forms of silent communication that we know of, still involve words. To know isn’t to really know (to confirm knowing) until it’s reduced to a language we all share, a language we know binds us together. Even simply thinking about what you know happens in words. But isn’t there an intuitive phase which comes into play prior to (and after) the words? And what if we were to leave it there?

It’s a place we all pass through at birth – prior to cooing, babbling, lallation, echolalia (imitation), and expressive jargon (phonemes, morphemes, etc.). Mother-child interaction is intuitive and pre-lingual. Then the mother begins talking to the child using exaggerated intonation, a high-pitched voice, repetitions, and short sentences – and “wordplay” begins. – Maybe it’s a place we will pass through again, if not before death, then evolutionarily – phylogeny recapitulating ontogeny?

As this debate is still theoretical (at least in the West), my own bias hails from intuition. And that tells me that not only is the East correct on the subject of food and diet, we are also evolving toward a place of “no language” but “high communication.” “Less” becomes “more.” There are undoubtedly lots of Western thinkers who believe in words as much as they believe in meat (“thicker” and lots more of it). But in my view more and more humans are finding themselves – to use a figure of speech – in “awe” of life in general. Awe stipulates a condition of utter amazement and speechlessness: no words can describe the experience. Words fail us – fail to live up to, to accurately measure, what is perceived by the senses and/or the intellect.

The story goes that when Lewis and Clark came around bends in the river or peered over mountaintops on their way to the Pacific, Meriwether Lewis invented (or borrowed) the term “sublime.” He meant it to denote something which could not be described in words. It was simply “breathtaking,” and one could not (dare not) diminish it by attempting to describe it. Silence was their only option, a kind of spiritual contract inviting them to communicate at a very private level. These were fleeting moments, that is, until they reached the next turn in the river and the next mountaintop.

Is the “sublime” a harbinger of some kind? Does it demarcate a direction we’re all going? Years ago, before they were exterminated by invading, disease-ridden white people, there was a South American (Brazilian) stone-age tribe that spoke in nothing but verbs. The discovery was immediately instructive and challenging. Imagine trying to communicate in verbs only, for even just five minutes. Initially it seems ludicrous, and the ego wants to stop and call the whole thing absurd. What it’s doing is trying to regain control as it panics at being set aside – even for just five minutes.

But stay with it, and that sense of timelessness becomes a direct threat to the ego. This is because a) we are forced into the here & now, and b) the separation between subject and object vanishes. There is no “I” versus “thou,” no “I am this, but not that.” There is just the becoming of what one is doing or witnessing: “raining” instead of “it is raining,” “eating” instead of “I am eating.” Being turns into endless becoming. – It magically thrusts us into a higher state of consciousness.

And by the way, as Alan Watts so brilliantly called out, “When we say ‘it is raining,’ what is IT?” – “It” is nothing more than a linguistic rule, a grammatical convention to fill in a gap in order to communicate normally. “It” serves no function beyond that. Finally, in a flourish of limericks for which he was so famous, he answered his own question in his essay titled This is It (paraphrased from memory): “I am it, you are it, it is this, it is that, he is it, she is it, it is it, and that is that!” – Slowly we begin realizing the tactics of evasion in ordinary communication.

This brings us to those arts which communicate by successfully circumventing words altogether through pure expression. Dance, music, architecture, painting – they all basically strive to achieve “the sublime.” They work to subtract the middleman, the self, between observer and observed. If done well, they succeed. Tragically, even when successful, we leave the theater or gallery and typically indulge the need to share. And sharing requires words again. “The moment” is robbed of the magic it initially captured. And we’re basically right back where we started. We kill the experience as a condition to simply function again.

So the question arises: Could there ever be an après-theater space where the sharing could remain silent? Another way of basically saying that we’ve become the dance, the painting, the musical score, and remain wrapped in that consciousness? Something to simply carry home with us and prolong “the moment” even in the company of others? More importantly, to try and integrate that higher consciousness into everyday life? Victor Hugo said, “Let us take the hammer to theories and poetic systems. Let us throw down the old plastering that conceals the facade of art.”

And that returns us to, among other things, metaphors. Metaphors probably encompass half the human vocabulary. We use them even when we don’t know it. Words themselves are merely symbols (signifiers) of something else (signifieds). But beyond that, we use abstract references every minute of the day to convey the simplest observations. Aristotle first brought it to public attention, saying it “consists in giving the thing a name that belongs to something else.” None of us can live without metaphors, but they become dangerous when they begin taking control of reality – as language has done anyway.

In her books Illness as Metaphor and AIDS and Its Metaphors, Susan Sontag stressed the tragic consequences of turning illnesses like cancer and AIDS into metaphors of failure, invasions by demons and monsters, God’s punishment, and so forth. “My subject is not physical illness itself but the uses of illness as a figure or metaphor. My point is that illness is not a metaphor, and that the most truthful way of regarding illness – and the healthiest way of being ill – is one most purified of, most resistant to, metaphoric thinking.”

In the end (if language brings us to an end), it becomes a kind of existential death wish, the ceasing to exist as we know existence. “A final characteristic of man’s existence,” wrote Rollo May, “… is the capacity to transcend the immediate situation…. [E]xistence is always in the process of self-transcending. This capacity is already stated in the term exist – that is, ‘to stand out.’ Existing involves a continual emerging, in the sense of emergent evolution, a transcending of one’s past and present in terms of the future. Thus transcendere – literally ‘to climb over or beyond’ – describes what every human being is engaged in doing every moment…. Nietzsche has Zarathustra proclaim, … ‘I am that which must ever surpass itself.’”

Being is becoming, having is knowing, and “being there is knowing there’s nothing to do, and nothing is left undone.” That more or less leaves us out of the larger equation, does it not? There is just “awe” and the sublime, and nothing is left undone or unsaid. As Buckminster Fuller once said, “I seem to be a verb.” The “metaphor” of the river comes to mind. We become it. And of the river Siddhartha says, it’s the only thing which is at its beginning, its middle, and its end all at the same time.

“Having exhausted his appetites, the man who approaches a limit-form of detachment no longer wants to perpetuate himself; he loathes surviving in someone else, to whom moreover he has nothing more to transmit…. With neither profession nor lineage, he achieves – final hypostasis – his own conclusion….

“I no longer want to collaborate with the light or use the jargon of life. And I shall no longer say ‘I am’ without blushing. The immodesty of the breath, the scandal of the lungs are linked to the abuse of an auxiliary verb.” – E.M. Cioran

© 2019 Richard Hiatt

WERE WE RIGHT?

In 1880 Henry Adams published a novel entitled Democracy, intended as a satire on American politics. One of his characters, Mrs. Madeleine Lightfoot Lee, goes to Washington after her husband and son die to restart her life in a world of high literacy and intrigue. She finds Washington enervating and dull, a bucket of corruption, cynicism, and patterned hypocrisy. After some time she becomes reflective and shares what I believe is the one question we all share almost every day:

“Who then is right? How can we all be right? Half of our wise men declare that the world is going to perdition; the other half that it is fast becoming perfect…. There is only one thing in life … that I must and will have before I die. I must know whether America is right or wrong.”

Isn’t this what we all ask as we participate in this still young “American experiment?” It feels like we barely hold on by a thread, and what seems to confirm this are the reminders from our own wise men of letters – that we basically let everything go to hell until the very last second. Then crisis inexplicably pulls us together – and thus far that derelict strategy has worked. We seem to shine best only when things get so bad we have no option but to then address the matter fully. It’s playing craps with destiny, but we don’t care. Some would have it that we have a death wish, at least a sadomasochistic streak. Still others would simply say we’re wishing to know the answer to Mrs. Lee’s eternal question – finality.

It’s easy to have faith in a program invented by “heroes” two hundred forty-three years ago, whom we’ve deified as super-human, indestructible, god-like marbled faces neatly aligned in the Capitol Building. Our granite Parnassus in South Dakota makes them bigger than life. Their words are the lapidary phrasing of giants among men. Words mattered then, and they matter now, though in abbreviated forms – prone to redaction, low-shelf psychodrama, cryptic semaphore (street rap), tabloid gossip, and pulp fiction.

Two centuries ago our Founders were even more worried about whether the new experiment would work than we are now. But we ignore that, preferring to remember them as stalwart, strong, level-headed, and supremely confident “fugitives from England” inspired by a new vision. Any confusion about this blurs the official narrative. Our hired hagiographers did their jobs: “They were led by faith and perseverance.” Transpose that same “presentism” onto current times and we’re left without that All-Seeing Eye. “Faith and perseverance” simply don’t work. It’s not enough. We’re on the brink, neurotically unsettled by the term “experiment.” It lords over us like a dark cloud. We want finality to things, decrees to be proven, history reassured and confirmed once and for all.

Gore Vidal thought it was the other way around: “[D]espite their long historic record of bad guesses … they are not we, and did not know; we know.” – In my view they knew; we do not. If it were Vidal’s way we’d have a dreadful absence of heroes and role-models. We put on a brave face, but we see our Founders with braver faces.

Is it through war and battles that we look for Mrs. Lee’s answer, and for our worthiness? Or is it simply a matter of years of survival which confirms our “rightness”? And there lies a deeper and more vexing question: If, perchance, America was “wrong,” does it mean that everything we’ve done as a nation was wrong? Our sense of righteousness, our duty to God and country, our treatment of the vanquished, manifest destiny? Would we then be forced to face our actions and the most horrible shame imaginable?

History shows that those who have conquered don’t wait around for redemption, reflection, or second-guessing their actions. They plunder, kill, take, and claim the spoils of war in the quickest and most expedient way possible. Honor, fairness, decency, and morality have nothing to do with it. Those are abstract concepts woven into the historical narrative later on by the “winners.” But still, the truth eventually catches up with the perpetrator. Time and distance clear the way for objective honesty, and we’re doomed to see things “unclouded by longing.”

There are two governments. The first is the “provisional” government: the one we all learn about and are supposed to trust – freedom and justice for all, democracy, checks & balances, and so forth. Then there’s the “permanent” government, the one we’re not supposed to see – the one-corporate party (split into two halves), the elite 1 percent controlling everything, including all the information we learn. The second one cares nothing about democracy. The first one waves democracy like a flag, reminding us every day that we’re “worth it,” that we have the best of everything, that we are the best.

The first one convinces us that we enjoy a democracy, and democracy informs us that plundering is absolutely necessary (called “democratizing”). Plundering after all is God’s plan and it’s what got us here. It’s also God’s plan to spread “democracy” to all corners of the earth – to “save” them, of course, from themselves. Democracy is the one trigger-word that exonerates us, rescues us, from any doubts about our past. And that means if democracy goes away, we go away. Even if to “democratize” a foreign land really means to “Americanize” it (American banks, American currency, American factories, American sweatshops, American tariffs, America’s national religion which is Christianity, and American military bases). No wonder the third world has hated us so much for so long.

But again, like the days of our Founders who grappled with a nation’s very meaning and purpose, even democracy is an uncomfortably elusive concept. It’s full of potentially abstract loopholes and high interpretability. It’s susceptible to “compromise.” – “Are we willing to forgo some liberties to preserve and protect our freedom?” – the ultimate oxymoron. Yet a disturbing number of Americans say yes. In fact, they go even further: They contend that their own liberties/freedoms are more important than those of others, hence more important to protect and preserve. White supremacist, native-born Christian Americans seem to think so.

Gray areas in the American argument, for me, bring up the many campaigns of World War One. They bring up Flanders Fields, Gallipoli, America’s “lost battalion” in the Argonne Forest, and so many other horrible events where thousands died. In nearly every one the question surfaced: “Was it worth it?” Or were most of those campaigns the decisions of arrogant, self-serving, and ignorant commanders, safely distant from the battlefields, who only wanted to be remembered for their “valor and honor”? Many field officers (rarely discussed) committed suicide out of horrible “survivor’s guilt.” Again, time and distance return us to our inner demons.

In the end, yes, “we won” the Revolution and the big wars. But we weren’t there. Nor were we in Philadelphia 243 years ago. We simply have no choice but to cover up our shame with redacted versions of history. The white (Christian) guys won again. God was in our corner. Errol Flynn and Ronald Reagan fly off into the sunset. John Wayne fights heathens at Fort Apache to this very day. Forget all the PTSD and failed veterans’ needs and benefits now plaguing us with the truth. “Reality” isn’t needed anymore. The victors of war rewrite history and fictionalize everything. We’re God’s favorite people once again. Just look forward, never backwards.

So, if the one link – American democracy – in a rusting chain which holds us together is eternally fluid and susceptible to change, will we end up answering Mrs. Lee sooner than later? The enigma grows when we look at the nature of democracy itself. If it fails then we’ll know. If it doesn’t fail then, hopefully, we will never know.

Hindsight is an interesting concept. It’s supposed to give us clarity and enough wisdom not to commit the same mistakes twice. But hindsight in the service of promoting a national lie takes on an entirely different purpose. It’s about selective attention and cherry-picking details which best serve the needs of mythology and propaganda. It’s about replaying mistakes in order to improve future impostures, fraud, and strategic deception – not unlike orthodox religion. Both rely heavily on superstition, hermeneutics, and third- and fourth-hand accounts.

At the end of this day, the verdict is still out on our national virtue. Even after all this time. So many alarums and excursions, noise and saber-rattling, storm and stress, still leave us in serious doubt. It’s almost become taboo to reflect this deeply and lend scrutiny to our national purpose. And yet because of that we seem to suffer from what the psychiatric profession calls “agitated depression.” We never slow down lest old haunting memories might catch up with us. Best to simply keep looking forward and move faster and faster. Count on even more illiteracy, more anxiety, and impatience to discourage too much fact-checking. Who cares anyway?! Americans want entertainment NOW! – escapism NOW! Just give them what they want. Meanwhile, keep the marching bands playing and the flags waving!

Hopefully Mrs. Lee will never get her answer. By its very design democracy is first of all a temperament or state of mind instead of a code of laws, something eternally susceptible to change. In fact, it allies itself with change. Nothing is ever final, and nobody ever knows enough. Secondly, it requires a kind of governing which is supposed to change according to the needs of each generation. This is what makes it a living contract. Government, said Jefferson, belongs to the living. – In short, democracy is always unpredictable, mercurial, and without a final verdict. It is an “eternal civil debate” which isn’t always civil. Anything short of this amounts to oppression and despotism in varying degrees.

As for America itself, Lewis Lapham said it best: “I can no longer identify myself simply as an American. The noun apparently means nothing unless it is dressed up with at least one modifying adjective [white American, Afro-American, Native-American, liberal American, gay American] …. The subordination of the noun to the adjective makes a mockery of both the American premise and the democratic spirit.”

America is also less about the rhetoric of democracy than it is about economics and markets – that is, advertising. It “concerns itself not so much with what is true as with what people believe to be true – with the image or the perception rather than the fact…. [It] relies on a vocabulary meant to persuade and seduce rather than to teach and inform.” (Lapham). – Again, like democracy, it makes America a turnstile of identities, claims, and presentations fitted to the markets of perception and style. Its mythology is constantly refitted to the shifting needs of each generation.

In the end, Mrs. Lee needs to know that democracy and America are never easy, never neatly packaged. Despite all the allusions to tradition and stability, both are a mess most of the time. If America were a human being (Uncle Sam), psychiatrists would diagnose him as neurotic, bipolar, and borderline. It’s the exact opposite of the kind of Victorian world to which Mrs. Lee is accustomed. For her to get an answer would be the day she gets the answer she doesn’t want – the day democracy dies. Which does not mean it’s the day America dies (her second unwanted answer).

© 2019 Richard Hiatt

BANANA REPUBLIC AND THE COUP

There’s the butterfly dreaming he’s a person dreaming he’s a butterfly. Then there’s the ghost looking through a lens looking at a human looking back at the ghost, each wondering if the other is real. And then the phenomenon of the little girl accidentally stepping into the “fourth” dimension, like a rabbit hole.

And now there’s a feeling of an entire culture stepping en masse into a world of mirrors, wondering what’s real. And in that situation we have one of two choices: either realize the truth and engage it, or become complacent and allow ghostly reflections to dictate reality. Either it matters that reality is fictional, or it doesn’t. If it does, we need to intervene with our ghosts. If it doesn’t, we stay on the receiving end of “the truth.”

It becomes the stuff of a good novel. What some authors discovered while searching out facts ironically made the best novels. In 1904 Upton Sinclair was a journalist commissioned by his socialist magazine to search out the real Chicago – the stuff that was making it the nation’s second major industrial hub. The factories and living conditions among workers were his targets. Little did he realize that writing about the conditions in slaughterhouses – the filth, animal cruelty, stench, the methods of storing meat – would become his opus, The Jungle. Truman Capote investigated a murder spree in Holcomb, Kansas, in an article for The New Yorker. It led him to invent a new genre, the “nonfiction novel” (reality woven into fiction). Nor did he realize that events would lead him to alcoholism and suicidal ideation. He never wrote anything of any consequence after Holcomb.

For both writers events were too unreal to be real, yet they were horribly real. At that juncture, to “report” facts could only be done in the spirit of how much anyone could mentally fathom. The narrative form was the only option allowing this reportage. It gave the reader the option of having “an out” – to think of it as fact or fiction.

An analogy could be made to shell shock, later called battle fatigue, later called operational exhaustion, later called PTSD. With each passing war (since World War I) naming the same thing had to become more and more euphemized and softened for the public to digest it. As citizens learned more about it, experiencing it vicariously, it became increasingly hard to get their minds around it. The labels responded, going from two hard words (two syllables) to four words (eight syllables) and a hyphen.

Huxley and Orwell were the consummate reporters of hard reality. Some said that their fiction inspired their social-political activism. But the reverse was true. If fiction inspired reality, then “thoughtcrime” and talking farm animals wouldn’t have had the social gravitas they so needed. They would have been filed under “high drama” and “children’s books.” What Huxley and Orwell saw in the real world rendered signs and portents which could only be conveyed through fantastic allusions. Yet, again, how they were read was left to the reader himself – fiction’s gift to the reader. Some saw only pure fiction, others saw dark days ahead.

I just finished an article I entitled The Proletariat. It’s a diagnosis of where we are today as a third-world nation and how we’ve become the world’s newest proletarians. It’s a reality “stranger than fiction,” yet here we are. In most of the categories that measure the well-being of a civilized society we’re behind almost every nation in the industrialized world: an increasingly brutal police force, overcrowded prisons, substandard infrastructure, heightened illiteracy, half the population at the poverty level, and so on. What I neglected to add was that in the third world people also have the weakest grasp of reality of all. “Leadership” is privy to this, and in America its leaders are beginning to resemble the same types of right-wing autocrats found in Turkey and Libya.

We’re looking through a darkened glass wondering who it is we’re looking at – because it certainly cannot be us?! Day-to-day lives are increasingly existential in our grasp for meaning. And then it gets even darker when we realize that who it is doing the grasping is not us. We’ve become a nation of ghosts looking at their own shadows, “imitations of imitations,” or as Baudrillard put it, “imitations with no originals.” Everything is surreal, without grounding, as we watch strangers define us.

Another phrase to substitute for the third world – banana republic. Not unlike Sinclair, Capote, and others, many go out to report on a changing America but come back beset by events too fictional to believe. We cast our leaders as actors now, the White House and Washington as daily “soaps,” even vaudeville (the political process as a “Potomac Two-Step”), and politics as alarums and excursions with heroes and villains facing off in the middle distance between Main Street and Wall Street (or the Middle East). The entire stage is decorated with bananas and “nuts.”

And what is a banana republic? “Socialism for the rich, capitalism for the poor?” Pretty much so. The government and various corporate monopolists get together and collude over private wealth at the expense of the country and its people. Debts are “socialized” by referring them to the national treasury, which means taxing the poor. There’s no accountability. Bureaucrats are quite literally “for sale” and are consulted only after the really important matters of finance, law, and the economy have already been decided. There is also the intrusion of tribal and cultish elements where superstition (religion) preempts government. A president is just as apt to call on astrologers and evangelists as economists when the NASDAQ goes south. And lastly, leaders are figureheads one day and despots the next. The only difference between us and, say, Zimbabwe or Venezuela is that we happen to be, quoting Paul Krugman, a “banana republic with nuclear weapons.”

The American writer O. Henry actually coined the phrase himself while hiding from the US government (for embezzlement) in Honduras and Guatemala. It was there that he accidentally stumbled upon the United Fruit Company’s oppressiveness over the poorest people on earth. He witnessed wholesale grabs of public lands, slave labor, and the stealing of private property. He wrote about how the imbalances between rich and poor got so huge that currency became worthless, and these nations lost their eligibility for international credit. O. Henry fictionalized the experience in short stories compiled in Cabbages and Kings (1904). Later it was the writer Gabriel García Márquez who picked up the torch and wrote about the same accounts in One Hundred Years of Solitude.

Here’s where the stuff of tragedy becomes real and reality becomes high fiction. Americans can no longer tell the difference. Psycho-socially, America is Zimbabwe and Venezuela with a “weak grasp of reality.” It’s so indoctrinated in the Orwellian commandments of “prosperity” and “freedom,” good and evil, that hearing about itself becomes the new reality. To search for real facts is to run into a wall too overwhelming to digest. The only way to fathom it, to get our minds around it, is to read about it in the papers, to hear it from media outlets. Hence we become Sinclair and Capote trying to digest the surreal, but also Winston Smith not believing it.

And here’s where “the operatives,” the plantation owners of banana-ism, take advantage. Seeing how the masses are so easily duped, they make a full-time job of writing the narratives and plot-lines. They furnish the actors, scripts, and story-lines. The nightly news anchor/narrator is the voice of newspeak (purging society of free thought) and doublethink (holding two contradictory beliefs simultaneously) while avoiding thoughtcrime (perspective, clarity). The lie, when told frequently enough and made the most grandiose lie of all (said Goebbels, said Stalin), is eventually believed. One peers through the looking glass and sees himself seeing himself but now believes his projection is someone else.

He sees a stranger he’s never met before who he then must steer correctly in a labyrinth of lies, between what’s real and good and unreal and dangerous. His mind is speaking to itself, ego to alter ego, conscious to subconscious, like two strangers fighting each other for clarity. It’s the stuff of film noir. He’s in the dark watching his own movie. Meanwhile, the screenwriter and projectionist enjoy a full-house in noir theater. The plot-line is so convincing that monsters, demons, terrorists, and ghosts are the real thing. Even though we see victims seated out in the darkness, they’re never us.

The script matters, because words matter. The agenda is to hear what’s said in form instead of content, cliché and platitude instead of detailed fact, imagery instead of substance. This is the banana republic’s style of communication. In America substance is boring and takes too much time. Details get in the way. We want the projectionist to go straight to the main feature, then to the climax.

As for the words, political conviction and summary can now be reduced to a label or a bumper sticker. The emptier the words, the more charm they have. The cliché “says it all” – trigger-words and catchphrases are sexy. Just read the sticker and we can all go home. It’s newspeak which reduces the totality of politics down to ten key words: “dream, fear, hope, new, people, we, change, America, future, and together” (said Christopher Hitchens). Our hired wordsmiths know that “intelligent” ideas are not only not needed anymore, they’re impossible to convey. They necessitate complexity, forbearance, reflection, objectivity, and most of all inaction before action. There’s no time for this, and audiences are way too vapid. Professional mind-benders serve everything up in pill form, instantly, quickly, and all at once. Just add water and it’s “morning in America” all over again.

And as for instant formulas, the media follow the calculus of “threes”: that is, three-year political memories (hence four-year elections), three-day weekly memories (on 24-hour news cycles), three-minute memories (commercials near or around every three minutes), and three-second attention spans (the camera must, on average, shift focus every three seconds or the viewer gets bored and loses interest).

In all, the collective feeling of “normal” is one of wading through a fog. There’s no grasping of anything because one doesn’t even know the person doing the grasping. And yet when we move, he moves. When we stop, he stops – like a shadow. The banana metaphor is fitting and instructive. It’s a diagnosis but also an opportunity because the human mind is gifted with a tremendous power of recovery, the capacity to remember, to know that it knows. Of course this violates today’s equivalent of Orwell’s INGSOC. There would be no more Emmanuel Goldsteins, no more Lucifer Effect, no more “war is peace, freedom is slavery, ignorance is strength.” The mirror comes into sharp focus and we finally recognize who we’re looking at, who’s doing the talking.

At that juncture we have two options: to run for the border or stage a mindful/mind-altering coup d’état. The first coup would be inward, followed by an organized political push with the kind of mental clarity that frightens the gatekeepers. History shows that the most effective and lasting revolutions are those which have remained the most silent. Silence implies concealment, composure, and retention of power. Violent revolutions are always the most costly, inefficient, wasteful, and most likely to fail. Action manifests best through inaction when dictators expect action. Action happens best when they expect complacency and apathy, informed decision-making when they expect ignorance, and enlightenment when they expect capitulation. It’s gradual, methodical, organized, and quintessentially underground and subversive.

It’s a story told many times over: the young hero breaks away from an enslaved world. Somehow he “remembers” another one. He steals away under the dark of night past armed guards and barbed wire and finds that he was right. Overlooking a whole new Eden, breathing fresh air for the first time, he knows he could simply slip away and disappear forever. But he’s pledged to his people. He returns and gathers small groups around small campfires and tells them the news. “There’s another world outside these walls, a whole sky full of stars, freedom to move and think, possibilities never imagined.”

© 2019 Richard Hiatt

THE PROLETARIAT

I’ve heard that the definition of a third-world country is one where the majority of its citizens are at or below the poverty line. Its military is also the largest, and “richest,” industrial complex of all its industries. For most of the Cold War the Soviet Union held that distinction most prominently in the entire third world. Today, according to that definition, it’s the USA.

If you haven’t noticed, what’s left of the so-called working-middle class is now scrambling for survival. There’s no more “retirement” as we’ve known it, schools have to hold bake sales to buy supplies, people have no “primary care physicians” (though the media still assume that we do), we’re splitting pills to afford meals, college students are “nutritionally poor” (BS for “starving”) because many don’t know where their next meal will come from, teenage suicides are epidemic, gun violence almost mimics the Old West, Congress wants to take our trust fund (called Social Security), families need to hold two (or three) jobs to pay bills, and the nation’s infrastructure (roads, bridges, railways, buildings) is dilapidated. Meanwhile, the military has received “half” the total discretionary budget for any given year since at least 2001, and probably before that. — It’s an old story by now, but yes, it’s a third-world nation.

So why don’t we finish the analogy? The US must then also have its own version of a proletariat and even a bourgeoisie (petit and haute) – the latter shrinking, the former growing. And so what exactly is the proletariat, since we’re all signing in?

Historically, it refers to the unskilled and mostly uneducated poor who sell their labor power, who suffer under oppressive rules with no control over their living or working conditions. We might think that it pertains to male industrial workers in factories and mills. But the original proletarians were lower-class (Roman) women. The term “proletariat” comes from the Latin for “offspring.” These were women so poor that their only contribution to labor was through their bodies (the womb) – hence proles, or children. It was their only way to serve the state. Government didn’t demand production but “reproduction.” And we might keep in mind that in the sweatshops and cheap-labor facilities around the world today, the majority are still women. Even in Marx’s time, the largest community of wage-laborers were not male factory workers but female domestic servants.

With that in mind, we still have wage disparities and discrimination, though these are finally (allegedly) being addressed and slowly eliminated. The sheer reality of the “single mom with children,” working two-three jobs, barely making her rent, no healthcare, using public transportation, insurance (if she has any) reneging on coverage when she needs it, living in substandard housing, rationing food – all makes the original proletariat a concept which simply does not go away. It’s changed very little.

Today the lines between economic classes are blurring like never before. It’s about much more than just “pink-collar” workers, middle-management careerists, and unskilled workers inter-relating. It’s not just because skills are overlapping administratively and managerially, and not just because people in suits work outdoors in “hard-hats.” It’s about all the colors of the collar starting to appear on one socio-economic plateau. It’s because they’re all feeling the same vulnerabilities in their “pursuit of happiness.” They all face a slippery slope dictated by economic forces which demand more and more sacrifice just to survive. If one doesn’t “make it” in the short term, he faces a strong possibility of losing everything. The average household has only $400 saved up in case of an emergency. The snowball effect brings everyone together. Only those already “well-situated” (with money and position) can afford the best opportunities for themselves. If they gamble and lose, they have a safety net.

The fantasy of self-determination (“rugged individualism”) has always been a myth. Only a very lucky few investors ever “built themselves up from nothing” to become independently rich. And by the way, the phrase picking oneself up “by one’s bootstraps” never meant what we thought it meant. It comes from an 18th century fairy tale intended as a metaphor for an “impossible feat of strength.” It meant “ludicrously far-fetched” or “an impossible task.” Getting rich in America, if you were initially poor, was all of these.

Especially since we’ve drifted from being a manufacturing economy to a service economy, it doesn’t much matter anymore if you’re a mechanic, dockworker, nurse, receptionist, postal worker, caterer, construction worker, waitress, chauffeur, teacher, computer operator, or a civil servant. The terms “service” and “white collar” still obscure significant salary discrepancies, but the new technical, administrative, and managerial fields all share a common fault: more and more salaries are going to fewer and fewer people at the top, putting employees in one huge category – called “working-middle class” – whether that means lower middle-class or middle-middle class. It’s an American bourgeoisie slipping into a petit bourgeoisie. It is what Terry Eagleton called the “increasing proletarianization of professionals.”

Eagleton reminds us that this trend is a repeat of what occurred in the 19th century when, quoting philosopher John Gray, “the middle classes [were] rediscovering the condition of assetless economic insecurity.” Eagleton wrote, “Many of those who would be traditionally labelled lower-middle class – teachers, social workers, technicians, journalists, middling clerical and administrative officials – have been subject to a relentless process of proletarianization, as they come under pressure from tightening management disciplines. And this means that they are more likely to be drawn to the cause of the working class proper in the event of a political crisis.”

The marshaling of energies is also a repeat of history. It conjures an old scary phrase thought to have been invented by Marx: the “dictatorship of the proletariat” – a phrase market capitalists have demonized ever since the Paris Commune of 1871 (a truly democratic, though brief, government set up by proletarians). What it simply meant was a popular democracy ruled by the majority. – And by the way, Marx himself never invented the phrase. A confederate named Auguste Blanqui coined it, meaning “rule on behalf of the common people.” Blanqui became president of the ill-fated Commune but had to preside while in prison. The oligarchy saw to it.

The idea of universal suffrage for “the people,” free education no longer controlled by the church and state, the abolition of child labor, employee ownership of businesses, magistrates, judges, and public servants accountable to the public, women’s rights and “feminist initiatives,” separation of church and state, and the idea of cooperative production – it was all just too much for the Third Republic (established a year before in Versailles). The Commune lasted just a few months before political and military authorities seized the movement, tortured and executed hundreds of innocent people, held numerous courts-martial, and imprisoned others. The Third Republic resumed control and lasted until 1940 (when the Nazis occupied Paris).

So what of the idea of latter-day “Communards” – the disenfranchised setting out to do exactly what the first Communards attempted to do? Today’s Third Republic would be the regnant hierarchies of power (banks, corporations, and financial institutions whose leaders have never been elected by anyone). We may not cotton to the title itself – way too “Marxist-sounding”? – but the task of restoring the very same moral principles prevails once again. Call it what you want, there’s no denying history repeating its oldest and most shameful behaviors, humans doing what they seem to do best – exploiting the weak – because they can.

Just a few years ago the word “socialism” was laughed at and dismissed as “un-American” as anything could be. It was the great spoiler in getting elected at any level. Then we began learning about it and how literally “half” the government was set up to be socialistic in the first place (public highways, public libraries and schools, post office, Social Security, Medicare, the GI Bill, even the military). And now today “democratic socialism” is “on the table” like never before, not because it changed but because greed and avarice have reached a tipping point for even conservative voters. Citizens in greater numbers not only don’t believe in the old system anymore, they’re facing the frightening reality of where it’s left them.

If socialism has now been acquitted of evil by the people’s court, if it’s finally being “reconsidered” by those even at FOX News and removed from the blacklist of conspiracy theories we’ve heard for generations, then why not also Marxism, the Communard, and the proletarian? What’s the hangup in including them in the national vocabulary, especially when they are so unbelievably relevant? It’s really just a matter of semantics, is it not? And, it would seem, just a matter of time.

Such inclusion of old terms and principles does three things: first, it helps us learn history (which repeats itself); secondly, it legitimizes the situation we’re in today as something very real and grave; and thirdly, it gives us a vocabulary, a moral compass and a road map for the possibilities and repercussions of political decision-making.

It’s not simply about terms and theories. It’s about changing our way of thinking – about fundamental rights, about boundaries, and most of all about self-governing and empowerment. It’s about the one thing history gives us most – perspective – which we seem to have a dreadful absence of today.

Let’s talk for a second about terms Marx was very keen on – class and labor. Class warfare is something capitalists have always charged socialist and communist countries with – that is, until the divide between rich and poor got so egregiously wide several times in our history. Class refers to much more than just economics. It refers to habits of thought, customs, and values. “Labor” as well does not just refer to physical work. It’s a concept which involves the person himself, his being, for centuries his “pride and joy” (which capitalism then robbed him of). It entails the physical senses and social participation. It involves gender, kinship, sexuality, cooperation, trust, community, tradition, and much much more.

These are factors which Marx not only stressed but which more and more of us across traditional divides are trying (perhaps unconsciously) to rekindle. But with each generation we’re further and further from ever identifying with, feeling pride for, our work. We are entering a new phase of psychological interactions where these terms may be more abstract (difficult to define) but are more deeply felt than ever.

And here’s Marx again: He said classes don’t recognize their own existence until they gain political representation. They don’t become fully conscious of themselves until the full array of legal and political factors align properly. And even before one large class is recognized, smaller ones remain divided, alone, and mutually out of kilter.

This is where I personally see the majority of Americans today, residing in different economic and social camps but inside walls which are beginning to fracture and come down. They are looking beyond and recognizing themselves elsewhere. A new class is emerging that recognizes its own terms and conditions – something “the other” (the rich minority) wants to keep fragmented, alienated, and without power.

I’ve never been an advocate of guns, though I learned how to use them growing up. My understanding of the 2nd Amendment is very different from the current popular interpretations, simply because, again, I listen and learn from history. Most today do not; they don’t give a damn about history insofar as it interferes with their using it as an expedient for their personal interests. But that aside, I see the “invisible hand” of market forces coming back and slapping them in the face. That is, one industry (guns) is now witnessing an entire nation armed to the teeth with military-grade weaponry. Its own product has generated such fear and paranoia as to affect even itself.

The CEOs of the very corporations most instrumental in fleecing citizens now live in fear, though they’ll never admit it. They live in gated communities and ivory towers protected by security guards and surveillance systems. They fear a growing working-class festering like mobs outside the palace gates. More politicians and CEOs than ever worry daily about what festers in the “grassy knoll.” “Freedom” for them, even while luxuriating on their golf courses, hotels, and spas, requires more police dogs, cameras, security clearances, and “patrolled perimeters” with each passing year. Paranoia suffers no limitation. It’s almost poetic how the laws of nature and thermodynamics balance out in the end. You reap what you sow.

So, at this late date, I see a transposition of forces going on: a minority living increasingly afraid of its own shadow, a majority just beginning (hopefully) to gain enough historical perspective to remember who and what they are. If this is indeed the case, it may also serve us to humbly adopt a terminology fitted to the purpose. We lose nothing by it. But what we “feel inside” gains strength and stability.

© 2019 Richard Hiatt

CRITICS, ONE AND ALL

The most basic axiom taken from the Manichean (Buddhist, Jungian) rule book is that the more light we generate, the more darkness is required to amplify it. Foregrounds need backgrounds, just as sound is not sound without silence, plenums without voids, etc.

It brings up an interesting question: In our search for more light, are we not defeating ourselves by not including the darkness in that equation? If the axiom is simple enough to understand, then why do so many of our moral guidebooks attempt to eliminate one for the sake of the other? These are conjoined twins that refuse to allow surgical separation, with just the one then put on life support. Darkness needs to be understood (and embraced) in (and as) another “light.”

When I studied art back in college I knew it was attempting to enlighten me in some way. Art wasn’t my major, or minor, but I kept returning to it, trying to get clear on what it was trying to say to me. Art is supposed to symbolize the highest in human achievement, and the best art is man’s highest achievement of all. It therefore has the most to say to us and about us. We invent art perhaps for a reason that still remains too cryptic for us to understand.

Creativity (art) is a prodigal child – “oppositional defiant” and “conduct disordered.” In other words, it has a rebellious, subversive side which always works against what we intend it to be. It speaks the unspeakable, it shows the darkness we’re not supposed to see. In that sense we’re used to promoting “greatness” in relation to the defeat of darkness. There was always the Christ archetype, or something symbolizing the divine, enlightenment, or goodness – and then there was the vanquished opposite, the force we could all do without, obstructive, disposable. This rule still holds today.

Through time then we’ve all become guardians of a rather fixed pathway to redemption. We’ve formalized pathways which art does not support. Society has never treated good and evil, light and dark, as it should. Hence the history of art itself, but even more so the history of its dark twin. (S)He/It becomes instructive here. It’s a lesson on a direction we should not be going.

The first critics of “boxed creativity” (i.e., high art) had but one simple assignment: to explain the artist’s goal and report whether the artist felt he succeeded in reaching that goal. And there the job ended. It wasn’t to render a verdict on whether the work was good or bad, worthy of serious public consideration, to compare it with other works of art, or to approximate its market value. But this all changed proportionate to an enormous transposition of goals – the needs of the critic over and above the work itself. It was no longer about the art. At some point the critic’s biases superseded what should have remained a sacred, subjective, inviolable, and intensely private relationship between the art and the public.

It’s reminiscent of the Church butting in as official intermediary between oneself and his God. One was no longer permitted his own understanding or private communication with his own deity. He was henceforth told that if he wanted to receive God he had to do it through those claiming a closer kinship with the divine than he could ever have. The critic here is the defender of “apostolic succession” and sacerdotalism. He is the bearer of the good and bad, high and low, legitimate and illegitimate. And based on what? His own criteria of the sacred and profane, the highest and lowest.

Insofar as all art was religious art for about 1500 years, art critics held great political power for just as long. Their “reviews” simply transferred from the doctrinal to the humanist with the Renaissance (and its rebuttal with the Counter-Reformation). But in both cases, they held positions of great prominence and persuasion in the work of dictating public perception about government, religion, and reality in general.

But through all that time they also supported an agenda of which, in my view, they were tragically unaware. It was because of an unwillingness to examine, not just their official biases, but their unconscious biases as well. Even today the critic takes his own “mental” filters for granted. In the Middle Ages one could hardly be blamed for what little he noticed about his thoughts. Descartes came about as close as anyone to noticing. But today one should be all too familiar with examining absolutes in any category, at any level.

And this leads (in part) to my thesis: Because of this, criticism has reached a cul-de-sac. Fewer and fewer people even listen to critics anymore, let alone believe them. They see them as egotistical nuisances, voices that just get in the way of one’s private experience. These are overpaid “celebs” doing book tours and lectures who have overstepped their bounds with their “thumbs up-thumbs down” decisions about what is and isn’t art. And even if we were to lend more credence to the criteria they say are essential to this business, they’re still wrong half the time anyway. Their track record is “poor to average” at best, whether it’s about a painting, a ballet, a film, or a book review.

I’m not exaggerating when I say that most of the time, when a critic says that something is “good,” I’ve learned not to see (or read) it, and vice versa. My intuition has never failed me on this. So, in that sense, I suppose you could say critics are valuable for their consistency.

By “filters” I mean personal issues (hangups, fears, obsessions, embarrassments, needs) which pollute their “professional objectivity.” One can almost carve out a psychological profile of a critic just by reading her/his reviews – by simply sifting through what (s)he considers to be good, bad, interesting, likable, worthwhile, a waste of time, “immoral,” “horrible,” and “offensive.” – And as his emotionally charged trigger-words always draw public attention, the critic has carved out a career which looks no different than a gossip column on Hollywood stars and horoscopes. Instantly then, as mentioned, the critic finds himself usurping the subject matter itself (the artist, his work) with himself. We’re almost expected to solicit his advice first before doing anything involving art.

Yes, no one is without his “mental filters” and biases. In the end “objectivity is a myth.” But not everyone claims to be an expert and guide through the mazes of art theory and history, self-appointed by dint of a PhD, academic endorsements, and/or published books claiming to lead us through the wilderness of our ignorance.

So what does this have to do with understanding the dynamics of “light-to-dark?” This became the template on how we’ve learned to treat our most honest creative efforts. It widened the chasm between good and evil – a symbiosis which creativity cannot exist without. Secondly, the dark/evil/inferior/flawed side was to be treated like the plague itself, stowed away on the Ship of Fools. It killed our sensibilities, polluted our efforts to evolve to a more “refined” world.

But coincidentally, something else always stowed away below deck, vowing to violate rules, standards, and protocols. A subversive “underworld” dynamic in art has nonetheless prevailed – thanks to nothing else but “high” art itself. The more energy given to the latter, the more given to its shadow trailing behind it. And “high” art has always tried to elude its shadow.

By definition even today high art gains its status “when someone with authority in that world treats it as art,” wrote Carol Duncan (my italics). “This happens in essentially three ways: the work is exhibited in a high-art space, it is noticed in the high-art press, or it is purchased by a high-art collector.” It’s all about “visibility.” We know where this is going.

Duncan continues: “Good critics validate work as high art by publicly testifying that it has art quality. They put into words their experience of a work, identifying the ideas and feelings the work evokes in them and describing the specific way the artist produces that effect.” And then the critique gets even murkier: “Critics thus demonstrate that the work has some transcendent meaning to them, that it gives shape to or illuminates some feeling, value, or truth that they hold to be significant, and that it does so in a form that seems appropriate.”

An effort is at least made to translate the artist’s experience (if that’s even possible, if it’s even appropriate), but what’s done with that calls into question the critic’s own credibility: “[E]ither you see and value the meaning [the artist] points to or you don’t… In this way critics …create an art context for work. They surround it with a mediating discourse … that prompts its reading as art.”

Several lines later the truth is unveiled: “Criticism sorts, labels, and measures the worth of artists, ranking them in relation to each other within one of the ever-shifting trends that waft through the market…. They know what trends are currently receiving critical support…. They know who is showing at what gallery…. who is selling for what at auction…. Artists learn that recognition means becoming visible in these contexts, and many artists measure their own value according to how much of this visibility they achieve.”

In other words, they sell their painterly souls to markets, dealers, and “pushers,” and the critic is perennially front and center in how it all gets navigated and negotiated. Lastly, the most damning statement of all: “It is … irrelevant whether or not an artist agrees with what a critic says about her work. What matters is whether or not her work is noticed.” The art itself is sacrificed to the strategies of commercial notoriety – financial success and/or fame. In the end, one is no longer an artist but a performer doing a song & dance for talent scouts. At the end of her article Duncan admits, “high art exists largely at the will and for the use of wealth and power – and I would add that it exists for the most part as a means of keeping that power in place.” 1

Then, like a good lawyer looking out for her client, Duncan tries to let the critic off the hook for having to even recognize “high” art when his own judgment (and reputation) might be called into question: “Quality … is a word that usually ends discussion. That is because … [it] is thought to be an undefinable and universal essence.” So now art is once again “subjective.” It’s in the eye of the beholder – how convenient. But again, it returns us to the initial point: the layering on of qualifiers and conditions – a job which wasn’t theirs in the first place but which they’ve elected to take on.

The critique has followed a long list of rules for centuries. And all along, as I mentioned, the “low” has always crept up behind the haute couture like a dark twin. In Paris in the 19th century painters like Manet, Monet, Degas, Pissarro, and Van Gogh set out to upend “high” protocols and standards because they were simply interfering with “the truth” of ordinary life. “Modernism” as a movement was about “self-consciousness,” about seeing life sliced into separate parts and interrogating motives. It intended not to see reality in wholes but in fragments of fragments. With the fin de siècle and an approaching World War, this was the psychology of the times.

The irony was that even when these artists tried to elude the trappings of the market, which they hated, and neutralize formalism altogether, they failed. Subversive efforts and anti-establishment turns were simply swept up by the market and packaged into a new avant-garde. A more modern term might have been useful then: camp – stylized, posed, and mannered – the new chic. It made these artists into “celebs,” which they also hated, knowing “celebrity status” would only interfere with doing their work.

But through it all an underlying fact was proved: the higher you go, the lower you go; and the lower you go, the higher you go. It’s no wonder that Freud and Marx were so pivotal during this period (more subversive voices). 2 — Later schools continued to celebrate “low” art (street art), and the market quickly seized upon even that, calling it “anti-art” – the new art. There was no escaping the jaws of commodification and consumerism or the (black-)magic of transmutation.

But maybe this history lesson is telling us something else. Maybe the whole idea of human “progress” was also going in the wrong direction. Maybe it was telling us to stand back and “undo” the process of high and low, because eventually it led to “cul-de-sacs” and the absurd transmutation (negation) of opposites. In other words, no one would know what “high” and “low” even meant anymore. Maybe to go backwards was (and is) to go forwards.

Bring in E. M. Cioran, longtime Parisian writer and (I would add) critic of critics: “In itself, every idea is neutral, or should be; but man animates ideas, projects his flames and flaws into them; impure, transformed into beliefs, ideas take their place in time, take shape as events: the trajectory is complete, from logic to epilepsy … whence the birth of ideologies, doctrines, deadly games.”

Cioran took the position of the “anti-prophet,” the idea of backing out and away from the “critical” investment. He said that, alas, the “instinct for self-preservation” compels us to find “formulas for happiness.” “Under each formula lies a corpse.” Hence, the need to divest from formulas. “The abundance of solutions to the aspects of existence is equaled only by their futility.” For Cioran life itself was about “blind[ing] ourselves [by] our own ambitions.”

This is not to say that everything was purposeless. This was not nihilism. Rather, by emptying ourselves of the investment in such heavy ideas, notions, and theories, we gain access to simpler meanings – meanings not of our own design. “If the doors of perception were cleansed…,” said William Blake.

Granted, it does sound nihilistic when he makes an analogy to stones and inert things: “Only inert things add nothing to what they see: a stone does not lie; it interests no one.” But he’s not saying that “we” become inert, only our mental inventions. We are still here, and instantly life is defined by a newly inscrutable sense of Being. Hence our way out of the “human condition.” Man is infatuated with “ghosts,” but once he surrenders them, what the universe could not give before it now gives freely.

Equally then we must divest from the highs and lows we’ve been taught to value so in life. “History is nothing but a procession of false Absolutes, a series of temples raised to pretexts, a degradation of the mind before the Improbable.” – In artistic terms, Ms. Duncan highlighted those “temples” and “pretexts” earlier on.

What we’re describing is the environment which is (post-)modernism itself. And for the first time we can see our own existential limitations. In his essay The Crisis of the Mind (1919), Paul Valéry said, “We … know that we are mortal….” “So many horrors could not have been possible without so many virtues.” “Everything has not been lost, but everything has sensed that it might perish.” – Written in 1919, it portends the world a hundred years hence: “the free coexistence… of the most dissimilar ideas, the most contradictory principles of life and learning.”

For Valéry, World War I was the supreme culmination of contradictory forces converging into meaninglessness. It could just as well be the final benediction to the errors of today – our failure to understand the symbiosis of light-to-dark. It’s humankind’s ultimate folly: “The wealth of contrasts and contradictory tendencies was like the insane displays of light…. How much material wealth, how much labor and planning it took, how many centuries were ransacked … to make possible such a carnival, and to set it up as the supreme wisdom and the triumph of humanity.”

How much “material wealth, labor and planning” indeed. And “how many centuries.” The final distillation of the all-powerful critic would be the Machiavellis and Hitlers of the last six hundred years. In The Discourses Machiavelli laid out the entire pecking order of critics-to-critics: “Of all men who have been eulogized, those deserve it most who have been the authors and founders of religions; next come such as have established republics or kingdoms. After these the most celebrated are those who have commanded armies…. To these may be added literary men…. All others – and their number is infinite – receive such share of praise as pertains to the exercise of their arts and professions.”

History’s lesson to us is this: We need to undo our collective sense of importance, or that of those who insist upon their own – their arrogance in butting into life’s creative offerings as divinely appointed arbiters of good and evil, high and low, real and unreal. There’s been way too much of this. Stop listening to the artistic, political, religious, philosophical “experts” and begin defending the right to our own human experiences – our privacy with, our “subjective” scrutiny of, what we say our highest achievements are. The way to go forward is to go backwards, but also inward.

Oh, but then Wall Street and the professional standards of measurement will disappear and we’ll be left with the collapse of all civility and order! Art will vanish, spiritual guidance will end, good people will lose their livelihoods, markets will fall, and on and on. This was the same argument capitalists leveled against socialism – the loss of personal incentive, everything reduced to inert meaninglessness, the loss of the creative impulse, etc. The only thing lost here is the “high” profit motive enjoyed by a community of smug individuals wanting to “measure” your intelligence for you. Alexis de Tocqueville himself observed that as wage labor advances, “the art advances, the artisan recedes.” Translated: the artist is singled out by critics from other artists based on the principle of “wages.”

Creativity and inspiration march on; they never die. The new critic just happens to be the ordinary citizen who decides to support them with his money, or not. The middleman, the intermediary, is going the way of “bad art.”

This is where “low becomes high.” Up until now we’ve been approaching the point of not being able to distinguish high from low anymore, superior from inferior art. The “new aesthetic/critique” now does the same thing. Only this time we do it with purpose and deliberation, replacing the vocabulary of high and low with our own terms of valuation. “Beauty is truth, truth beauty,” said Keats, and it stays in the eye of the beholder.

© 2019 Richard Hiatt

1 Quotes taken from the essay “Who Rules the Art World?” in The Aesthetics of Power: Essays in Critical Art History (Cambridge University Press, 1993) by Carol Duncan.

2 Marx’s favorite motto: de omnibus dubitandum – “doubt everything.”

ALL THINGS BEING EQUAL


… according to their equality,” to finish the phrase. The operative word here is of course “equality” and what it means. A very private and discreet encounter at home, in the wee hours of the night, on my porch, brings the politics of fairness front and center.

I have some new friends I’d like to introduce. They visit every night, though I don’t always see them. They’re denizens of the night, keepers of the nocturne, protectors of the realm patrolling the grounds like praetorian guards. Ninja-like and quick on their feet, they don masks in the true warrior spirit.

One I’ve named Boudica, after the 1st century Celtic warrior queen who led a revolt against Roman occupation – always out front, indomitable and unstoppable. The second (whose gender isn’t certain yet) I’ve named Saladin because he’s not just a conqueror but gentle, kind, merciful and chivalrous. I almost opted for Spartacus instead, because, like the warrior-slave, he’s fearless and doesn’t make a very good slave – to anything. But his gentle demeanor tipped the scales to Saladin – “Sali” for short.

Depending on the time of night, weather, food found (or not found), and of course human interference, Boudica and Sali surface as if from out of nowhere. They curl themselves around my legs. They know me, they trust me. Somehow I’m different to them. I’ve been chosen.

We met quite by accident, or should I say by an accident of misidentification. Two years ago it was “Rocky” foraging in my backyard during difficult times. I gave him some old scraps of bread and crackers and we became friends. He never abused his boundaries with me and never asked for more than what I offered. Then one evening it was time to say hello again. When I opened the door five babies converged on me, too young to know fear and all apparently looking for a father-figure. All five rushed me in a flurry of instinct and hunger, and before I knew it three of them had run over my feet and into my house. The reaction from “Monkey,” my Chihuahua-Jack Russell mix, was the classic “Kodak moment” – bewilderment mixed with one part fear and two parts “elder.”

In other words, at that instant I discovered that “Rocky” (taken from the Beatles’ song Rocky Raccoon) was actually “Adrienne” (Rocky’s girlfriend in the film Rocky). How stupid of me! Here was a trusting mother introducing her new arrivals to a protector, a human willing to offer his backyard (and the space under the shed) as a safe sanctuary in a very unfriendly and unsafe city. I have to say, her intuition about me was spot on. She read me like a book. And now I had a “family” on my hands.

At first it was touch-and-go, a perilous learning process of “fathering” – that old institution of drawing up rules while protecting and providing. Children listen to nothing, and as raccoons grow up very quickly, the “terrible twos” were upon me before I could say “stop!” Raccoons are distant cousins to bears. They have the same personalities and talent for getting into trouble. What I had were five small bears.

They were everywhere and into everything. At least I finally managed to keep them out of my house. When entering my fenced-in yard under the cover of night they knew they were safe – they were home – and it was playtime! Meanwhile, mom passed “motherhood” over to “fatherhood” like a baton and allowed herself a rest. Five banshees – climbing, scratching, and begging – while she did nothing except watch me with an annoying Cheshire-cat grin.

Fortunately, the summer saw the children grow up like kudzu. Nature is formidable in readying her critters for the seasons. Before long they were nearly the size of their mother, and I began confusing one with another. Around this time I also discovered a wonderful chemical in a spray-bottle called “critter repellent” – a concoction of rotten eggs and garlic – simply awful. A cocktail so odoriferous it promises to offend even skunks. The principle: the scent, once sniffed, gets into the victim’s nostrils and doesn’t go away for several hours, convincing the animal that this is not a place it wants to be.

Each night I took my repellent and laid down a perimeter, a DMZ, around my house. Beyond the house, nearer the shed, I offered my fatherly protection and old bread, which through trial and error I learned was their food of choice – cheap and abundant. The plan worked. They now come around to the authorized middle-distance between the alley and the house and remain there. The house has been saved.

I’ve been Uncle Rick now for the entire summer. Alas, my storm door suffered damage in the days before the DMZ and requires fixing. But I remind myself that it’s the price of “family,” no different from my neighbor’s backyard after his grandkids’ weekend stay – all’s fair in love and war. And should I ever forget (heaven forbid) to reinforce the perimeter with repellent, my new nieces & nephews are sure to attack the house again, along with other places they shouldn’t be. Their inquisitive nature is unbounded and rapacious. On the other hand, they are also creatures of habit and learn rules quickly. We’re fairly comfortable with the status quo now. I carry my spray bottle at night like a trusty sidearm.

I go out in the late hours because Monkey needs to pee before we retire to bed – another situation which required quick thinking and improvising. What’s interesting here is that both canines and “coons” have for centuries learned who was the pursuer and who was the pursued. Horrible “coon hunts” more or less determined this role-playing long ago. And today it doesn’t matter that the dog is half the size of the coon, or that the coon could actually, if cornered, turn around and probably kill the dog. The dog always chases, the coon always runs. And my 13-pound chihuahua chases Boudica, Sali, and the other four around the yard – until they stop and figure out the absurdity of this charade. Then everyone stops on a dime, freezing in their confusion and uncertainty, waiting for the other to make the next move. Time stands still and everyone is in an unfamiliar and unsettling kind of wilderness. This is my cue to step in.

It’s almost as if they learn “self-consciousness” for the first time, a human trait. If it weren’t slightly dangerous for Monkey, the scene would be hilarious. But it isn’t funny, since this all takes place at or around 3 AM. Neighbors are sleeping, and I’m chasing a dog chasing a raccoon around in circles in my yard. We’re the Keystone Kops.

Finally, in time we reached yet another modus vivendi which helped calm things down. Before my eyes, Monkey, Adrienne, and the kids stopped chasing altogether and began simply to sniff each other out with great mutual curiosity. Again, another Kodak moment. Except for occasional paroxysms of canine adrenaline, the chase has ended, and everyone gets along – guardedly. It’s a convergence of learning and instinct. Animals, more than humans, know the meaning of mutual accommodation. For myself, the process is simply too precious to interrupt.

We still have our minor relapses, spats, hissing, moments of uncertainty. Instinct never rests. Monkey still gets scared, but with five noses sniffing her from all sides, who can blame her? When this happens, I intervene. She wants inside, and I immediately oblige. I redirect the situation and “the family” seems to understand. There is what feels like an inscrutable kind of animal empathy here. My masked friends are very patient. I don’t wish to push it, and I’m very aware of anthropomorphizing this entire scene, but I sometimes fantasize about these two unlikely bedfellows actually playing together, like distant cousins. I don’t hold my breath, but I play with these thoughts.

Needless to say, I’m a doting uncle. I’m also stuck between worrying about property destruction – children forgetting the rules – and missing their company when they’re not around. They are quite aggressive, which requires some getting used to. But they are also quite lovely and soft in their ways with me.

I didn’t mention the other “three” nieces/nephews who are the siblings of Boudica and Sali. For some reason they’ve chosen to stay close to mom. They never venture far from her, establishing their own safe perimeter with me, not feeling quite safe enough, while by now also exploring their own independence. It’s a wonder of nature why things turn out the way they do.

This is a very private, unique, and discreet relationship. I tell no one about it, knowing the kind of lectures I’d get about “feeding wild animals.” Here in the city I worry about them all the time. And as fall approaches I’m sure conditions are going to shift with the temperatures – in terms of needs, behaviors, and their instinct to den up (intermittently). This I know is going to require additional accommodating and improvising. I remember the coldest nights of winters past when a raccoon would show up at my door desperately cold and hungry – again at 3 AM. I must prepare for a similar eventuality. I want to be available should conditions become dire.

By sheer coincidence, or synchronicity, I happened upon an old used copy of The Naked Ape, by Desmond Morris, at my favorite used bookstore. I then stumbled upon the following words: “The naked ape is essentially an exploratory species and any society that has failed to advance has in some sense failed, ‘gone wrong.’ Something has happened to hold it back, something that is working against the natural tendencies of the species to explore and investigate the world around it. The characteristics that the early anthropologists studied in these tribes may well be the very features that have interfered with the progress of the groups concerned.”

In the final chapter entitled “Animals,” Morris writes, “Up to this point we have been considering the naked ape’s behavior towards himself and towards his own species – his intra-specific behavior. It now remains to examine his activities in relation to other animals – his inter-specific behavior.” He then lists five different ways in how we relate to other species: “as prey, symbionts, competitors, parasites, or predators.”

Knowing this book was written in 1967, I could understand how limited in scope this was. And yet Morris did drop the suggestion that, just perhaps, there was a kind of relationship we were missing: “The second category … is that of the symbiont. Symbiosis is defined as the association of two different species to their mutual benefit…. Where we ourselves are one of the members of a symbiotic pair, the mutual benefit tends to become biased rather heavily in our favor…. because we are in control of the situation and our animal partners usually have little or no choice in the matter.”

Enough said. Alas, not much, if anything, has changed substantively with regard to the meaning of “mutual.” All is fair, it seems, as long as the bias is clearly on our side. And yet Morris reminds us that the only mutuality between two “inter-specific” species which is fair and sustaining involves no unfair advantages at all. Otherwise, it simply isn’t “mutual,” is it?! – This I think is the relationship we wish finally to establish between ourselves and all species. It is, to put it in context, the final step of humans returning to nature.

I think about this constantly when Boudica and Sali visit me on my porch late at night. It seems like they are “calling me over” to their side of the ledger, their world, in an overture of mutual trust and love. I’m invited to their level.

In my training as a psychotherapist I ran into two schools of thought when it came to what we (or Carl Rogers) called “client-centered therapy” (Rogerian therapy). It required the therapist to basically “lose himself” in his client’s experience – to “come down” to his level and feel his traumas and emotions along with him. Many therapists refuse to do this, even today, fearing they will lose their therapeutic “edge,” their objectivity, and thus their ability to counsel. They claim it’s a matter of boundaries and ethics. To me, it’s bullshit. It’s simply “fear” and not trusting themselves enough. This is a fear, by the way, which harkens all the way back to the infamous Freudian couch – the therapist sitting “behind and above” his patient as an authority figure, in complete control, with all the answers.

Another school of thought echoes my personal response. The therapist never forgets where he is or what he knows. He always comes back to himself and the ability to counsel and offer objective advice. It’s a matter of trusting, not so much the client, but himself.

Needless to say, I came from the latter group. I never lost sense of my role in that process, and my clients never lost their respect and appreciation for my “getting on the floor” with them. To do it the “old-fashioned way” was not just “spinning our wheels” and a waste of time; I could almost sense that it insulted them as well, at some level – especially at a critical moment when an imminent breakdown could become a “breakthrough.” The experience/therapy was always “we” together. (For the record, I never forced this technique on anyone. For other clients more traditional approaches were needed.)

This is the overture always being offered to me by my furry friends. They are my therapists every single time they surface from the underbrush. These are simply amazing little bears – intelligent, inquisitive, adaptable, clever, always communicating, and always risking in order to bond. They go beyond the “half-way mark” just to draw me to it. And I have to say, we humans are the ones always requiring an “unfair bias.” They comply but patiently wait for us to let go of our fear. They tell us, “Don’t worry, you’ll always come back.”

In the heat of the late night, beneath the city lights and changing moon, Boudica leans into me and whispers:

[Y]ou do not know me, you preserve me, you are my ineffable continuance; your treasure is my secret. Silence, my silence! Absence, my absence, O my closed form, all other thought I abandon, to contemplate you with full heart. You have made yourself an island of time…. My love toward you is without limit…. You await me without knowing me and I am what you lack that you may desire me. You are without defense. What ill you do me with the noise of your breathing! Through this castoff mask you exhale the murmur of stationary existence…. Man lost in your own roads, a stranger in your own mansion, furnished with alien hands that fetter your actions, cumbered with arms and legs that shackle your movements, you do not even know the number of your members and ramble astray in their remoteness. Your very eyes have arranged their own darknesses…. Alas, how you yield to your matter, conforming, dear thing of life, to the weight of what you are…. I am your emanation and your angel. We are nothing without one another and yet between us is pure abyss…. And now this Thing stirs… a declaration of love, a begging, a mumble, all isolated in the universe, without connections, with no one and no other….

– from ABC, a poem by Paul Valéry

© 2019 Richard Hiatt