Monday, June 05, 2017

political stories

narrative induction
Charlotte Linde is a rather brilliant ethnographer broadly within the symbolic interaction school – although not a participant in that school’s downhill slide into the irrelevance of infinitely coding conversations to make the smallest of small-bore points. Rather, she has taken Labov’s idea that a story is a distinguishable discursive unit and researched life stories – she wrote the standard book on the subject.

In 2000, she wrote a fine study of an insurance firm with the truly great title, “The acquisition of a speaker by a story: how history becomes memory and identity.” https://www.scribd.com/document/209339886/Linde-How-History-Become-Memory-and-Identity
Identity, with its columnally Latinate Id seemingly standing for noun in general, has during the course of my lifetime been dipped in the acid of the verbal form, and now little leaguers talk of identifying with their team – their grandparents would, of course, have used identify to talk not of a subjective process of belonging but of an objective process of witnessing, as in: can you identify the man you saw shoot Mr. X in this courtroom? Conservative hearts break as the columnar Id falls to the ground, but that’s life, kiddo.

Linde’s article circles around a marvelous phrase: narrative induction. “I define narrative induction as the process by which people come to take on an existing set of stories as their own story…” My editor’s eye was pleased and did a little dance all over my face to see that this was the second sentence in the article – getting people to forthrightly state their topic is, surprisingly, one of the hardest things about editing academic papers.  

Narrative induction properly locates story as part of a process of initiation. Linde, in this paper, is obviously moving from her concern with stories people tell about themselves – the point of which is to say something significant about the self, and not the world – to stories people tell about the world. Those stories often are about experiences not one’s own. They are non-participant narratives.

Linde divides the NPN process – as she calls it – into three bits: how a person comes to take on someone else’s story; how a person comes to tell their own story in a way shaped by the stories of others; and how that story is heard by others as an instance of a normative pattern.

There is an area, as Linde points out, where work on this has been done: in religious studies. Specifically, the study of metanoia, conversion stories. But there’s metanoia and then there’s metanoia. There’s St. Paul on the way to Damascus, and there’s Updike’s Rabbit Angstrom, on the way to the relative wealth of a Toyota dealership owned by his father-in-law. Linde, not having access to St. Paul, opted to study the trainees of a major American insurance company in the Midwest. Like Labov, Linde is interested in class issues – in particular, in stories of occupational choice. In her Life Stories book, she presented some evidence that professionals present their occupational choice stories in terms of some vocation or calling, while working class speakers present theirs, more often, in terms of accident or need for money. Philosophy professors rarely will say, for instance: well, I needed a steady paycheck, looked at the job security of tenure, loved the idea of travel and vacation time, so I went into philosophy. They will give a story rooted in their view of themselves as emotional/cognitive critters. Labov’s work was done in the seventies, and my guess is that there has been some shift since. The NYT recently published an article about “quants” in finance, many of whom came from physics, and their stories were all without a moral/personal dimension – they were all about money, not interest in finance. Interestingly, as a sort of face-saving gesture, they all talked about how there are “deep problems” in finance.
Narrative induction is obviously about politics. It is one of the great instruments by which power is made into action and organisation. To my mind, the discourse about democracy, which has become the central discourse in political philosophy, has grown sterile; using the insights of narratology might liven it up a bit. There has to be more than democracy in democracy, or democracy just becomes another gimcrack put-up job. There have to be stories within a democracy that sustain it. If the stories are simply about who is being elected, I think it is a symptom of democracy’s decay – its surrender to the old monarchical narrative.
We need, in other words, to start looking at political stories: how they work, and how they do self-work. This is an area that has not been very well explored by political philosophers who want to infinitely suss out what Locke meant, or stuff like that. We need something more novelistic. We need more ‘what is it like to be’ questions that will allow us to understand the political stories people tell. And not stories that give political agents the character depth of lab rats pressing buttons for pellets.

Cause those stories, though cynically satisfying, are ultimately untrue. They are even untrue about rats. 

Sunday, June 04, 2017

aryan nation


In 2014, Fortune magazine did a series about white collar convicts in prison. One of the convicts was Matthew Kluger, who was caught on an insider trading charge. Kluger is a name – his father, for instance, won a Pulitzer prize for Ashes to Ashes, his book about cigarette-company-financed pseudo-science. Kluger was a highly paid lawyer.
The magazine asked him about race relations in prison. This is what he said:
When I was in transit [from Butner] to here, I was being held at this BOP central transit center in Oklahoma City: 5,000 people sitting there on any given day—it's unbelievable. It's at the airport. They pull the plane right up and they have a jet way. It's quite an amazing thing to see.
So I got there. I went to my room. And my room was occupied by a black guy. I went and started moving in, because we talked and the black guy was also going to Morgantown. About ten minutes later, some big, white, tattooed guy, an Aryan Brotherhood, Texas guy, pulled me aside. He pointed to some tables and said, "This is where we sit."
“We,” meaning you, too.
Well, I was a little confused about that. I wasn't sure whether he was saying, "This is where we sit, so stay away from us," or "This is where we sit, and you're welcome to join us." And then about ten minutes later, a guard came over and said, "I'm moving you to a different room." I said, "Why?" He said, "because your friends there said it was unacceptable to them for you to be in a room with a black guy."
So they moved me into my own room. Which was fine. I mean, I felt bad. I felt very bad. But I was happy enough to get my own room for the six days that I was there.
I've met interesting characters here, and I met interesting characters at Butner. My best friend at Butner was a 28-year-old sex offender who grew up in a trailer in rural South Carolina. I mean, I grew up in Connecticut, and later I went to prep school. This is not really my thing.
So you do meet some interesting people, and you learn to interact with people you would never outside of here have had the opportunity to interact with.

So race matters.

Race matters. Yeah. Race definitely matters. I would say in some prisons they're 50 years behind the times. Here we're only 10 or 12 years behind, or 20 years behind the times. No, there is a very “us and them” view—now, that doesn't mean that there's no interaction. And I have a couple of black friends. But by and large, there's a lot of suspicion and wariness.

I was reminded of this scene when I read the NYT’s lighthearted look at what they are calling the alt-right – it used to be known as white supremacists, but the Times is nothing if not trendy, and kissing the ass of our Republican overlords by using euphemisms for racists is where it is at in the country club world. The article centered on a former felon, Kyle Chapman, who is described here:

As the founder of a group of right-wing vigilantes called the Fraternal Order of Alt-Knights, Mr. Chapman, a 6-foot-2, 240-pound commercial diver, is part of a growing movement that experts on political extremism say has injected a new element of violence into street demonstrations across the country.
I loooove that description. Kyle Chapman does have a slightly different résumé, one the NYT people decided not to front. He’s been convicted of three felonies, two for armed robbery, one for selling weapons to some urban gang. Otherwise, he’s just a nice commercial diver.
I kid. You know I love our liberal gray NYT, rolling over for the right, paper of record, advertising itself as part of the resistance. So, so… Times-ish.
But this is not so much about bashing the Times as calling attention to something that seems to operate completely under the attention zone of the establishment. It was not commercial diving that shaped Kyle Chapman's racist views. Prison has become a major station in the lives of hundreds of thousands of white males in their late teens and twenties. And that experience has majorly leaked into our national discourse.
The whole prison system in America relies on the kind of massive top-down violation of human rights – the torture of prisoners, the encouragement of convict gangs – that puts the US on the level of, for instance, North Korea. The prison system that gave birth to the Aryan Nation is doing nothing about it.
I’m not suggesting that Trumpism is merely the phylogenetic extension of the Aryan Nation. Rather, both are phylogenetic extensions of what America chose to become after the civil rights movements of the 60s. The massive incarceration system has leaked into the rest of the system, as it was bound to do. But we can all go da da da and pretend this didn’t happen. Isn’t happening.

Tuesday, May 30, 2017

on a technological achievement: the movie actor

 

I never know what to say about movies. Not that this has ever prevented me from talking about movies.


My experience of movies has been that the language used about movies doesn’t make sense of that experience. 

When Edison, among others, invented the apparatus for making film, everybody – in the West - had a pretty good idea of what an actor did and what theater was. These ideas were passed onto film, as if film were merely the extension of theater. It did not occur to Edison, or to others in the first period of moviemaking, to do more than let the camera record a basically theatrical experience. It was as if one were just taking a big extended photograph of a play. 

Now, the play is certainly not a spontaneous experience, but it soon became evident that the theater and the movie operate in different dimensions. The actor in a play may rehearse the part, certainly has to memorize the lines, appears in a stage setting, interacts with others who have also memorized lines, etc. – but all within the defining experience of the performance. The actor’s experience of the play and the audience’s are equivalent.

This radically changed with film. It was blown to hell. The idea that the film would mimic the play – photograph it - could not long ignore the technical nature of film making, which allows one to create a performance out of an ensemble of many cuts. And that is key – at that moment, the experience of the audience is fatally and finally cut adrift from the experience of the actor. It is, of course, still possible to film a play, but movies generally are built on the ruin of the old regime, in which the actor experiences the unity of his part in something that occurs from beginning to end at one time. This rarely if ever happens in movies. 

Of course, this became, very early, a trope in film. Since the silent films, movies have loved to show – to gleefully demystify – their making. They love to focus the camera on the camera focusing on the actor, they love to show the fakery of it all, they love to show the director, sitting in a director’s chair, saying cut. The cliché quickly and thoroughly penetrated the culture. 

However, even as the difference made by the movie was exposed again and again, we retained old, theatrical ways of looking at what was happening. We still called the figures mouthing the lines and pretending to be detectives or kings ‘actors’. And though auteur theory wasn’t really codified until the fifties, its characteristics appeared early on in movie appreciation – as though the director were an author.

And so, newspaper and magazine movie critics will write about the performance of the ‘actor’ in the film as something that occurs like the performance of an actor in a play – they will ignore what they know, and what every movie abundantly references: that this is very much a synthesis, rather than a spontaneous unity. The movie references this in its camera work, its transitions, its ‘special effects’, etc., and we know after we have finished it that our experience of it as a performance was an illusion. Even the dimmest moviegoer sees through the illusion. The ironic entailment of the reality effect offered by movies is that they become less ‘real’ – they reveal themselves as process – the realer they are.

So what are these figures? Are they actors?

There’s a story told on the DVD of Sans toit ni loi (Vagabond). In one of the last scenes in the film, Sandrine Bonnaire, the actress who plays Mona – the film’s central figure – wanders into a small French village where the grapes have just been harvested. The village celebrates by allowing a sort of carnival – men dressed up like wine demons capture whoever wanders by – civilians – and dunk them in a vat of wine, or throw grapes at them. According to the interview, when Bonnaire played this scene, she was not expecting these grape demons – and she was really terrified by them as they chased her around, and eventually into a phone booth. It is an excellent scene – but it would never work in theater. In the unity of the experience of audience and actors that makes up theatrical performance, an actor who doesn’t know what is happening destroys the code of the performance. He or she isn’t better or worse at that point, but becomes a non-actor. However, this rule simply doesn’t apply in film. This is why film actors often speak of acting a role in terms of the way they physically throw themselves into it – rather than, as theater actors do, the way they throw themselves into it psychologically. Bonnaire lets her hair go, doesn’t wash it, or herself – De Niro pumps himself up to 250 pounds for Raging Bull – etc. Now, it isn’t the case that the film actor doesn’t try to assume psychological characteristics, or that the theater actor is not concerned with the body as an instrument – it is a matter of what is subordinate to what. In a sense, the actor in movies, cut off from the entirety of the film by the process of making the film, is doing something very different from what we call acting. A movie is a riposte to methodological individualism – the fundamental level at which the movie works is not reducible to the separate and individual contributions of the people involved in it.
We understand it that way for giving prizes, and because the myth of the individual is something that, at least in America, we pay lip service to. In making movies, the West invented an art form that it did not have the conceptual structure to understand. 

Which is why I am uncomfortable with saying things about movies. Because the words I have to use were killed by the camera.

Sunday, May 28, 2017

the breaks

The breaks
According to Robert Craven’s 1980 article on pool slang in American Speech, breaks – as in good break, bad break, those are the breaks – derives from the American lingo of pool, which is distinct from British billiard terms. The difference in terminology emerged in the 19th century, but he dates the popular use of break (lucky break, bad break, the breaks) to the 20s. I love the idea that this is true – that the Jazz age, the age of American modernity and spectacle, saw the birth of the breaks. If the word indeed evolved from the first shot in pool – when you “break” the pyramid of balls, a usage that seems to have been coined in America in the 19th century, as against the British term – then its evolution nicely intersects one of the favored examples in the philosophy of causation, as presented by Hume.
Hume’s work, from the Treatise to the Enquiry, is so punctuated by billiard balls that it might as well have been the metaphysical dream of Minnesota Fats – excuse the anachronism – and it has been assumed, in a rather jolly way in the philosophy literature, that this represents a piece of Hume’s own life, a preference for billiards. However, as some have noted, Hume might have borrowed the billiard ball example from Malebranche – whose work he might have read while composing the Treatise at La Flèche. But even if Hume was struck by Malebranche’s example and borrowed it, the stickiness of the example, the way billiard balls keep appearing in Hume’s texts, feels to the reader like tacit testimony to Hume’s own enjoyment of or interest in the game. Unfortunately, this detail has not been taken up by his biographers. When we trace the itinerary of Hume as he moved from Scotland to Bristol to London to France, we have to reconstruct for ourselves how this journey in the 1730s might have intersected with billiard rooms in spas and public houses. In a schedule of coaches from London to Bristol published in the early 1800s, we read that there is a coach stop at the Swan in St. Clements street, London, on the line that goes to Bristol, and from other sources we know that the Swan was famed for its billiard room. Whether this information applies to a journey made 70 years before, when the game was being banned in public houses by the authorities, is uncertain. One should also remember that in Hume’s time, billiards was not played as we now play American pool or snooker. The table and the pockets and the banks were different. So was the cue stick – it wasn’t until 1807 that the cue stick was given its felt or india rubber tip, which made it a much more accurate instrument. And of course the balls were handcrafted, and thus not honed to a mechanically precise roundness.
If, however, Hume was a billiards man, one wonders what kind he was. His biographer Hunter speaks of the “even flight” of Hume’s prose – he never soars too much. But is this the feint of a hustler? According to one memoirist, Heilsberg, Kant too was a billiards player – in fact Heilsberg claimed it was his “only recreation” – and Kant obviously thought there was something of a hustle about Hume’s analysis of cause and effect, which is where the breaks come in.
There’s a rather celebrated passage in the abstract of the Treatise in which Hume even conjoins the first man, Adam, and the billiard ball. The passage begins: “Here is a billiard ball lying on the table, and another ball moving towards it with rapidity. They strike; and the ball which was formerly at rest now acquires a motion.” Hume goes on to describe the reasons we would have for speaking of one ball’s contact causing the other ball to acquire a motion. The question is, does this description get to something naturally inherent in the event?
“Were a man, such as Adam, created in the full vigour of understanding, without experience, he would never be able to infer motion in the second ball from the motion and impulse of the first. It is not anything that reason sees in the cause which makes us infer the effect.”
This new man, striding into the billiard room, Hume thinks, would not see as we see, even if he sees what we see. Only when he has seen such things thousands of times will he see as we see: then, “His understanding would anticipate his sight and form a conclusion suitable to his past experience.”
Hume’s Adam is an overdetermined figure. On the one hand, in his reference to Adam’s “science”, there is a hint of the Adam construed by the humanists. Martin Luther claimed that Adam’s vision was perfect, meaning he could see objects hundreds of miles away. Joseph Glanvill, that curiously in-between scholar – defender of the ghost belief and founder of the Royal Society – wrote in the seventeenth century:
“Adam needed no Spectacles. The acuteness of his natural Opticks (if conjecture may have credit) shew'd him much of the Coelestial magnificence and bravery without a Galilaeo's tube: And 'tis most probable that his naked eyes could reach near as much of the upper World, as we with all the advantages of art. It may be 'twas as absurd even in the judgement of his senses, that the Sun and Stars should be so very much, less then this Globe, as the contrary seems in ours; and 'tis not unlikely that he had as clear a perception of the earths motion, as we think we have of its quiescence.” 
However, this is not the line that Hume develops. His Adam has our human all too human sensorium, and is no marvel of sensitivity. Rather, he belongs to another line of figures beloved by the Enlightenment philosophes: Condillac’s almost senseless statue, Locke’s Molyneux, Diderot’s aveugle-né. Here, the human is stripped down to the basics. Adam’s conjunction with the billiard ball, then, gives us a situation like Diderot’s combination of the blind man and the mirror – it’s an event of illuminating estrangement.
It is important that these figures were certainly not invented in the eighteenth century. Rather, they come out of a longer lineage: that of the sage and the fool. Bruno and his ass, Socrates and Diogenes the cynic, Don Quixote and Sancho Panza – it is from this family that all these deprived souls in the texts of the philosophes are appropriated and turned into epistemological clockworks. 
 Hume’s point is to lift the breaks from off our necks, to break the bonds of necessity – or rather to relocate those bonds. In doing so, he and his billiard balls are reversing the older tendency of atomistic philosophy, which was revived by Gassendi in the 17th century. Lucretian atoms fall in necessary and pre-determined courses, the only exception being that slight inexplicable swerve when the atoms contact the human – hence our free will. Hume, who had a hard enough time with Christian miracles, did not, so far as I know, discuss the Lucretian version of things even to the extent of dismissing it. 
To be a little over the top, we could say that the eighteenth century thinkers disarmed necessity, exiled Nemesis, and the heavyweight heads of the nineteenth century brought it back with a vengeance, locating it – in a bow to Hume, or the Humean moment – in history. Custom. From this point of view, Hume was part of a project that saw the transfer of power from God and Nature back to Man – although we are now all justly suspicious of such capitalizable terms.
But the breaks survived and flourished. There is a way of telling intellectual history – the way I’ve been doing it – that makes it go on above our heads, instead of in them. It neglects the general populace, the great unwashed. Book speaks to book. To my mind, intellectual history has to embrace and understand folk belief in order to understand the book to book P.A. system.
Which is why we can approach the breaks in another way.
In 1980, I was going to college in Shreveport, Louisiana. I went to classes in the morning, then worked at a general remodeling store from 3 to 10. I worked in the paint department, mostly. At seven, the manager would leave for home, and Henry, the assistant manager, would let us pipe in whatever music we wanted to – which is how I first heard Kurtis Blow’s “The Breaks”. I also first heard the Sugarhill Gang’s “Rapper’s Delight” this way, and I still mix them up. I heard both, as well as La Donna and Rick James, at the Florentine, a disco/gay bar that I went to a lot with friends – it was the best place to dance in town. Being a gay bar, it was always receiving bomb threats and such, which made it a bit daring to go there. We went, however, because we could be pretty sure that the music they played would include no country or rock. It was continuously danceable until, inevitably, “Last Dance” played.
At the time, I was dabbling a bit in Marx, and thought that I was on the left side of history. At the same time, 1980 was a confusing year for Americans. The ‘malaise’ was everywhere, and nothing seemed to be going right – from the price of oil to the international order. There were supposedly communists in Central America, African countries were turning to the Soviet Union, and of course there was the hangover from the Vietnam War – the fantasy that we could have won that war had not yet achieved mass circulation, so it felt like what it was, a plain defeat. 
I imagined, then, that the breaks were falling against a certain capitalist order. In actuality, the left – in its old and new varieties – was vanishing. Or you could say transforming. The long marches were underway – in feminism, from overthrowing patriarchy to today’s “leaning in”; in civil rights, from the riots in Miami to the re-Jim Crowization of America through the clever use of the drug war; and in labor organization, from the union power to strike to the impotence and acceptance that things will really never get better, and all battles are now rearguard. 
 My horizons were not vast back then – I didn’t keep up with the news that much, but pondered a buncha books and the words of popular songs. But I knew something was in the air. As it turned out, Kurtis Blow’s breaks were not going to be kind to my type, the Nowhere people, stranded socially with their eccentric and unconvincing visions. However, after decades of it, I have finally learned to accept what Blow was telling me: these are just the breaks. That is all they are. 
You’ll live.

Saturday, May 27, 2017

De Quincey: our goth

- You like structure, right? Says the woman in the line ahead of me at the grocery store, laying down a fistful of coupons.

I, like her, like structure. But I like it not only for what it does, but for presenting itself to be undone. The first stage could be called realistic – the second stage is definitely gothic, in the broadest sense. There’s living structure, and there’s the undead. There’s Johnson, and there’s De Quincey.
De Quincey, in English literature, introduced the gothic moment into essay and autobiography. He reinvented the most gothic thing of all, the murder story: where the usual Newgate version was sensationalist and moralizing, De Quincey parodied the moralizing and introduced an element of suspense that we now consider to be natural to the genre. Suspense is inherently anti-mythic – myth, with all its dramas, follows a program its audience already knows.

However, unlike the creators of Frankenstein and Dracula, De Quincey has not gained the recognition his place in the Gothic tradition deserves. It is always nice to see him get a little publicity, as he did in the last issue of the London Review of Books, in Nicholas Spicer’s review of a new biography. – Warning: invidious comparison ahead – The NYRB featured a review, by Robert Gottlieb, of a biography of another “gothic” writer, Wilkie Collins, that was sort of astonishing, in that Gottlieb apparently thinks Dorothy Sayers is the last person who wrote about detective novels. It is like Gottlieb wandered out of the club, noticed it was later than 1940, and wandered back into the club to write a gentlemanly review. Sometimes the NYRB is so old that it is actually older than its founding, in the 1960s, when it was young. I distinctly heard some snoring in the back row of some of the sentences in the Gottlieb review.

But to return to our onions: Spicer is good about what a horrific family life the De Quinceys endured. It wasn’t just the opium – it was the poverty. As in the life of Karl Marx, money in De Quincey’s life, or the lack of it, was an ever present menace.

“As the debts piled up behind them, the family lurched close to utter destitution. De Quincey was repeatedly ‘put to the horn’, a practice native to Edinburgh, whereby a debtor was publicly denounced and made eligible for arrest. In October 1832, he was briefly imprisoned and only avoided further arrests by taking refuge in the Sanctuary of Holyrood, out of the reach of his creditors. Margaret was often ill and De Quincey suffered continually from the effects of his addiction and his attempts to break it – typically, periods of constipation alternating with debilitating bouts of diarrhoea. He sold or pawned everything he could, including most of his books. Two days before the birth of his eighth child, he filed for Cessio Bonorum, a kind of bankruptcy proceedings. In September 1833, his three-year-old son, Julius, died: he had to flee the child’s wake to give the slip to a creditor who’d discovered his whereabouts.”

Like the sound of this? It’s the ardent dream of Trumpians everywhere to make this kind of thing more common. But I digress…

Spicer has some great comments about De Quincey’s rhetoric, his bizarrerie, which was what captured Baudelaire’s admiration (Baudelaire translated De Quincey) as well as, evidently, Poe’s – Poe uses a similar mix of essayistic seriousness and parody. Spicer is right here:
“Making fun of others, he idealises himself, but, whether consciously or not, his writing always presses at the limits of seriousness, where solemnity cracks up in a snort of poorly suppressed hilarity. His style tips his grander effects into self-parody.”
Spicer doesn’t like De Quincey’s sentimentality, or his romantic flights; and it is true that De Quincey tends to weep at his own misfortunes, and to endow himself with an irritated sensibility that is easy to read as mere rationalization. But we can’t take De Quincey in pieces if we want to really appreciate him. It is in that pressing towards the non-serious, the blind spot in our hierarchized sense of occasions, that all these things find their necessity. Spicer quotes a lovely bit from De Quincey’s essay on astronomy.

"Lindop recounts an exchange between De Quincey and Pringle Nichol, professor of astronomy at Glasgow University, in which De Quincey confessed himself baffled by the professor’s attachment to the consequences of observable fact. De Quincey’s imagination had taken flight in an essay inspired by a particular theory of the nebula, which the professor pointed out had been disproved by subsequent astronomical findings: ‘Nichol apparently misunderstood the case as though it required a real phenomenon for its basis,’ he wrote."

The essayist’s contract is with truth – but truth as essai, truth as partial – and even more than that, truth as always partial, never, in the end, forming a whole. Truth as something ultimately more fundamental than the law of non-contradiction. Truth as non-serious.

Sunday, May 21, 2017

styles of saying nothing: the new york times editorial

I open the NYT site today and find this thing that looks like a sentence under EDITORIAL:
 -- Too much indulgence in impeachment notions could prove to be a distraction.

There are reasons to think of it as a sentence. For instance, it does have a subject – which is, sort of, ‘indulgence’, or more comprehensively, ‘impeachment notions’ – and it does have a verb, ‘be’, set resolutely in the conditional – under ‘could’ – and modified like the cough of a high-priced lawyer – which is the role played by ‘prove’ – and finally it slips out of the side exit in a finagling bit of murk – ‘a distraction’.

Such are its parts. Its gestalt is what interests me. Just as margarine is a chemical imitation of butter that can pretty much function as butter functions – you can spread it on toast, you can melt it in a heated pan – but misses one of those functions – that of tasting like butter – so, too, this sentence misses out somewhere in the sensory scale. If you came upon this sentence in laboratory conditions, detached from its source, and were forced to guess that source, I’d wager that you’d say: this must be from an editorial. Because editorials are constructed of these weirdly margarine-like phrases. They avoid attachment to any living subject (a lacuna that is usually filled in with a “we” that, far from being inclusive, operates to exclude as marginal any living creatures outside the special zone of the editorial office), and they never go straight to their objects, but rather sidle up to them through the equivalent of hmms and haws. Except that even a hmm or a haw is throaty – it is a creation of phlegm and hesitation – whereas these hesitations seem detached from any bodily function. The “could prove to be” litigiously melts down the “are” into an absolute vacancy, in which any statement is true. If we are hit by a meteor tomorrow, it would be true. If impeachment never comes in the more normal course of human events, it would be true. If impeachment happens, it would still be true.
Partly this omni-veridical (and omni-empty) 'could prove to be' hangs, essentially, on the oddness of the object – a distraction. Distractions don’t just get up and crawl through the physical world – they require attention. Which in turn requires a brain, or a collectivity of brains. To put these brains in time and space – to frankly situate them in history – seems to be an exercise that exhausts the sentence before it is even halfway to its target. This is not a string of words that will ever turn over and actually express itself in a human, all too human way.

We are all familiar with that ultra American thing – an attraction. As in coming attractions, the slogan of the movie trailer. A distraction is the negative of an attraction, and perhaps we can envision it as a Zen movie trailer, showing nothing. But… this can’t be right, for then distraction would lead to concentration, at least in all the ascetic traditions I am aware of. Instead, these coming distractions are notions of … coming attractions.
Hmm

This style of saying nothing seriously has a history that is intertwined with the history of liberalism in modernity. That history, in turn, is entwined with the history of critique – both in the reactionary vein and in the revolutionary one. I myself rather like, stylistically, both ends of the spectrum of critique, but I am also aware that critique doesn’t seem to have made a dent in this anonymous, liberal elitist style of saying conditional nothings seriously, in order that nothing serious really happen.

Tuesday, May 16, 2017

Burying history under its monuments: the new confederacy

The NYT article on the monuments to the Confederacy by Gary Shapiro tries to be thoughtful, but it struggles with a larger thoughtlessness. While Shapiro is right that confederate monuments have a historical value, he seems oddly oblivious to that history. These monuments were raised by the same people who either participated in or condoned lynching and terror. Slavery does not exhaust our inventory of American evils. To say that Jim Crow was "nasty" shows at the least an inadequate conception of how Jim Crow came about. To quote Bob Marley, a better authority here, "half that story has never been told." These monuments were part of a process, and that process existed not in ante-bellum times - which seems to be Shapiro's main concern - but in the bloody post-bellum times that allowed the white establishment to, in essence, reverse the verdict of the Civil War. In other words, these are not just monuments to the Civil War past, they are emblems of the Jim Crow present. Since Shapiro shapes his essay around Richmond, let's contrast the monuments to Lee and Davis with, for instance, this map of Virginia lynchings. It is poetically pertinent that as marble statues of Confederate generals were being raised in the capital of the state, a more human, struggling monument was being raised in the state's countryside - with tar and feathers, with castration, with hanging. And so far as I know, none of those advocates for "preserving" our history have ever advocated for preserving this history. Every confederate monument is an instrument to get us to forget the history being enacted around its base: lynching, mass imprisonment, mass disenfranchisement, wholesale economic fraud.

Louisiana, whose representatives recently shed tears for the good old confederate days and voted to provide more aid to their marble and concrete monuments of racists than they provide to sick living human beings, could do with hundreds of monuments to the brave band of African Americans and white reconstructionists who were assassinated or killed in pogroms, such as the one that occurred in Colfax. How many people have heard of Colfax? Its obscurity is a measure of the success of the raisers of the Confederate monuments, who wanted less to memorialize history than to bury it.
https://en.wikipedia.org/wiki/Colfax_massacre
