Sunday, June 09, 2013

Plots and secrecy



When I used to review novels for Publishers Weekly, the form was dictated partly by the editorial limitation of space: I had 250 to 300 words to operate in. Conventionally, the review would either start out with or end with some elaboration of an adjective – basically, blurb territory. Then would come characters and plot – or telling what the novel was about. If I could find the room, I might refer to the writer’s reputation.
Now, this procedure relies heavily on the idea that a novel is about a plot, and that a plot is something one can extract from the text, the thing that ‘moves’ the events and characters in the novel forward. Even if the novel varies “forward” – even if its chronology is arranged so that it looks backwards, or it mixes up narrative patches that are in the past or future of the narrative’s present – the plot is the thing that makes the novel. The plot is to the novel what the plays are to a game – a plot encloses, in a determined field, the chances that the narrative rehearses in its serial plot-parts. If an orphan goes out one foggy afternoon to visit the tomb of his dead mother and discovers an escaped convict among the graves – which happens in the first chapter of Great Expectations – I expect that this will have a bearing on the entire action of the book, an action which involves numerous small actions over the course of twenty-some years. The action, the plot, is a great maker of pertinence, that very English virtue that Grice made into a fundamental part of conversational implicature.
There is, of course, another meaning of plot, which refers not to the implicate order of fiction, but to the conspiracies or plans of human beings in secret coordination, one with the other, to bring about some event. A plot in this sense hinges very much on secrecy.
The plots of fiction and the plots of non-fiction have a way of converging – in fact, the latter seems, sometimes, to have almost swallowed the former, as though none of the stunted rituals of modern life present the interest to the reader that is associated with plotting in secret.
In The Genesis of Secrecy, Frank Kermode brings together the narrative motive and secrecy as though, in reality, the plots of non-fiction have always been the secret sharers of the plots of fiction. He usefully deploys the notion of insiders and outsiders. A secret creates an immediate divide between those who share it and those who don’t. I itch to put the term “sharing” under scrutiny, here, since it seems to stand outside of the dominant exchange system and point to other systems of wealth and power – but I am more interested, here, in the categories of insider and outsider with relation to the form of narration.
Kermode takes the Gospels as an exemplary narrative. It is an inspired choice. From the perspective of secrets, the Gospels make the very strongest claims for the privilege of the insider. It is not that the Gospels unfold a conspiracy, although certainly some conspiring goes on to do Jesus to death. But the real secret, here, is in the double life of Jesus – on the one hand, a small-time carpenter’s son, on the other hand, the beloved son of God. To understand the plot requires not only knowing that Jesus believed that he was the son of God, but believing it oneself. It requires metanoia, conversion.
Not only does the insider understand the plot, but if the insider is correct, the outsider can never understand the plot until he or she becomes an insider. The ritual of becoming an insider is not simply a matter of cognition, but of a special kind of semi-cognitive thing: belief. The belief comes not from the head – with its cognitive gearing – but from the heart – which understands that feeling is not subordinate to the world, but quite the reverse. And if this is true – death, where is thy sting?
To get away from the pull of the Gospels: Kermode makes his point about secrecy and narrative in more general terms in a later essay published in Critical Inquiry, “Secrets and Narrative Sequence”.
“My immediate purpose is to make acceptable a simple proposition: we may like to think, for our purposes, of narrative as the product of two intertwined processes, the presentation of a fable and its progressive interpretation (which of course alters it). The first process tends towards clarity and propriety (“refined common sense”), the second towards secrecy, toward distortions which cover secrets.”

This does seem like refined common sense. And yet it shakes off, way too thoroughly, the insider/outsider categories that Kermode was using in The Genesis of Secrecy. I think that shaking off points to a retreat to a classically ahistorical project: salvaging the presentation of the fable. As though the presentation came all in a block. After which – and the after here signals, again, a certain ideal temporality, not an empirical but a conceptual temporality – we find interpretation.

I’m thinking about this common sense presentation of the two elements of fiction at the moment because I’m writing a fiction. One of my readers asked me, when I sent her the fourteenth chapter, to write her a plot outline, because she has been receiving the chapters over time, as I write them, and she wanted not to have to go back to previous chapters to see what was going on. So I wrote the plot outline, and I was mildly surprised to see that the plot I wrote was, in a sense, impossible to infer from the chapters so far, which encompass more than half the book.

I have harbored Dadaist dreams of writing a novel which would have one surface plot for the reader and another for the author – and perhaps another outside of both the reader and the author. In this book, the plot that the reader thinks binds together the book is not the real plot, but incidental to the real plot, as it is understood and put together by the author. However, why not strain at that pitiable thing, the author? What if the real plot of the book is not understood by the author as well? As in the myth of Bellerophon, where a messenger carries a letter which, unbeknownst to him, requests that the receiver kill the messenger, perhaps the author of the plot could be considered a blind messenger, delivering a different plot from the one he or she knew? After all, there is a large degree of blindness in the world.

In a sense, such a novel would be an anti-gospel, because it would be closed, ultimately, to any access to its secret. The insider, here, would be defined by the fact that the secret he holds could not be shared. This would turn the world of the plot in a sense upside down. I don’t quite know how this kind of plot could even be constructed – a plot that resisted ever being known.

Surely, this is the great modernist temptation.  

Wednesday, June 05, 2013

The age of skipping



Neither Jesus nor Socrates left us any footnotes. The oral form into which they destined their teachings does allow for references, but these references have a more choral than notarial nature – they echo, they caress, they allude. But if they never exactly cite page and author, if they do not exactly locate the quote, both teachers do indeed quote, and do indeed gloss. They tease the footnote, one could say. This is the way it is, mostly, with prophets and poets – although there are exceptions, such as Pope’s translations of Homer, Swift’s joyful notes to A Tale of a Tub, Eliot’s credentialing notes to The Waste Land, and finally the takeover of the text by the note in Pale Fire and the backtracking notes of David Foster Wallace’s Infinite Jest – notes that are less in the Swiftian mode, which mocks all metalevels, and more in the mode of a certain desperation concerning metalevels, a desperation that the footnote’s totalizing authority has been lost. Or, to put it another way: if the footnote is an epistemic instrument, one that delivers a certainty with a robust reliance on the correspondence theory of truth (here is the author, the page, the publication, the publisher – everything the reader needs to find a source), then it bears its textual fate – to become a doxic instrument, a reference to, for instance, the entry concerning Uqbar in the Anglo-American Cyclopaedia (New York, 1917), or, less radically, a space that is invaded and undermined by the uncertainties of the world of midrange objects in which we live, ourselves one of them.

There’s an overview of the literature on the footnote by Fabio Akcelrud Durão in Critique 10, 2012, which has made me want to read vast volumes: from Bernays (1892) to Andréas Pfersmann’s evidently magisterial study of the topic, Séditions infrapaginales. Poétique historique de l’annotation littéraire (XVIIe-XXIe siècles), Genève, Droz, coll. « Histoire des idées et critique littéraire » (vol. 464), 2011, 536 pages. And yet, I have a feeling I won’t. I have a feeling that in this lifetime, even as I edit for money and write my little things for love, I will have to keep skipping. My hope is that I can make a certain poetry of skipping. And that at least I will know the references.
   

Sunday, June 02, 2013

Milton Friedman's wrong: the social responsibility of business is not just to make a profit



Milton Friedman wrote an article for the NYT Magazine in 1970 that was appropriately headlined: The social responsibility of business is to make a profit.

While Friedman was not the first to make the argument, his reasoning certainly is the starting point for an idea that has lodged in the American soul like a tick in a coon’s ass.
The reasoning is not, as it should be, legal, which is why, from the very beginning of his argument, Friedman goes off on the wrong track:
“The discussions of the ‘social responsibilities of business’ are notable for their analytical looseness and lack of rigor. What does it mean to say that ‘business’ has responsibilities? Only people can have responsibilities. A corporation is an artificial person and in this sense may have artificial responsibilities, but ‘business’ as a whole cannot be said to have responsibilities, even in this vague sense. The first step toward clarity in examining the doctrine of the social responsibility of business is to ask precisely what it implies for whom.”
Why, you might ask, can only persons have responsibilities? No reason is given. The premise is probably an individualism of a very weird kind, in that actually, when we look at how responsibility turns up in everyday practice, we find collectives and institutions operating under the rule of responsibility all the time. There’s nothing in ordinary speech that rules out such sentences as: the responsibility of the Highway Department is to build and care for highways. But Friedman’s lack of an argument for the proposition that only individuals have responsibilities is a minor tic – even if it reflects an individualistic mindset that is founded neither in anthropology, sociology, language nor philosophy, but solely in an individualistic ideology.
The second step, however, is where Friedman goes very wrong: “A corporation is an artificial person and in this sense may have artificial responsibilities, but "business" as a whole cannot be said to have responsibilities, even in this vague sense.” The problem here, of course, is that business as a whole is something that doesn’t make much sense. Businesses exist in a number of ways in a number of contexts, and surely the way to talk about business is to specify the context one is speaking in. A business in Alexandria in 300 B.C. is undoubtedly going to be run with some differences from a business in NYC in 1970. One of those differences is surely going to be the fact that the business in NYC in 1970 has to declare itself to the state. There are various ways in which this responsibility can be modified, depending on the scope of the business, but anybody with a halfwit’s sense of commercial law knows that past a certain size, businesses as a whole do have a responsibility – one that divides them into licit and illicit enterprises.
Friedman, of course, was not a lawyer. He was an economist, with a certain ideology. In 1970, as John Kenneth Galbraith had made clear in his 1967 book The New Industrial State, large businesses – corporations – certainly were not operating to maximize their profits. They were, to use an ugly term, satisficing – trying to achieve a level of profit consistent with their sector, while conceding a certain opportunity space that might have created, at least in the short term, more profit.
Friedman was notoriously dissatisfied with the compromise between ‘socialism’ and the ‘free market’ inscribed in the way that businesses in the post-New Deal, post-war period were actually doing business. He wanted to change the ethos of management, which is why he argued for a unilateral view of the responsibility of businesses. He succeeded in helping create a new management ethos. Unfortunately, the idea that business only exists to make a profit – a statement that is clearly false, since the purposes of business are modified by the laws to which businesses are responsible – emerged as a truism of the Reagan era, partly because it is so easy to state – it has the cocksureness of the popular maxim, on the order of “if you’re so smart why ain’t you rich” – and partly because the underlying premise is firmly rooted in a peculiarly American myth of individualism. Which is why an argument that seems to confound the way things necessarily are with the way Friedman would, as an economist, prescribe the economic order – an argument that is specious on the surface – has gained such footing. It all seems like so much melted butter in the mouth:
“In a free-enterprise, private-property system, a corporate executive is an employee of the owners of the business. He has direct responsibility to his employers. That responsibility is to conduct the business in accordance with their desires, which generally will be to make as much money as possible while conforming to the basic rules of the society, both those embodied in law and those embodied in ethical custom.”
Here one notices that the owner is reduced to a simple function: the owner wants as much money as possible, thus the business is about making as much money as possible for the owner. But in truth, these owners, especially of large organizations, are a heterogeneous and shifting lot, especially in comparison with the entrenched status of the corporate management. Unless that corporate management recognizes its responsibilities – which are spelled out not ethically, but legally, in a contract – it can manage as it will. Or it can manage to loot as much money from the business as possible, on the theory that its ultimate responsibility is to make as much money for itself as possible. Of course, the managers aren’t the only contract-bound individuals here – so are the owners, who are most often owners of the company’s stock. All of which points to the fact that owners and managers are endowed with responsibility not as a natural part of their positions, but as a derivative of their contractual positions.
My argument, here, is that there is nothing in the idea of the free market system that defines and limits the responsibilities of businesses. Rather, those definitions and limits come into play in the contractual intermediation that brings businesses into existence. One could easily require all corporations to have a portfolio of ‘social responsibilities’ that would bear on the contractual responsibilities of owners and employees alike without, theoretically, damaging the free enterprise system. The year Friedman published his influential essay, J.W. Hurst published a history of the corporation in the U.S., The Legitimacy of the Business Corporation in the Law of the United States, which disproves Friedman’s principle, at least as regards the evolution of the corporation in the United States. Hurst shows how business went from being unchartered by the state to being chartered, and how the owners and managers were endowed with or developed their degree of power over the business enterprise. As Hurst points out, in the colonies and in the antebellum U.S., “the legislature’s grant was necessary to incorporation… that it authoritatively fixed the scope and content of corporate organization.” Which means, simply, that businesses can have multiple purposes designed into their papers of incorporation. For instance, the state can decide that it doesn’t want to be burdened with the negative externalities of business – pollution, for instance – and make the corporation responsible for taking care of those externalities. At the same time, the state can desire that the enterprise undertake its business because it views the undertaking as a social good. And thus it can bring together a number of social purposes in the corporation, without thereby destroying the corporation as an entity.
Battering down the idea that the social responsibility of business is to make a profit is an excellent way of making businesses socially positive once again. It is certainly high time to rewrite the rules of incorporation, including the pernicious rule that allows interstate companies to incorporate under the rules of some selected state – interstate companies should incorporate at the national level with the Commerce Department. This simple rule would be a small start in bringing the plutocracy to heel, at least in the States.

Friday, May 31, 2013

The spoiled children of a spoiled class



Sociology without class analysis is like physics without calculus: it always entails a primitive regression to magical and occult forces. I’m thinking here of Elizabeth Kolbert’s article about spoiled American children in the New Yorker, which has bits in it like: “The books [about disciplining children] are less how-to guides than how-not-to’s: how not to give in to your toddler, how not to intervene whenever your teen-ager looks bored, how not to spend two hundred thousand dollars on tuition only to find your twenty-something graduate back at home, drinking all your beer.”
The last, of course, is the kicker – we are firmly in the territory of the top ten percent. The “you”, the “we”, these marks that seal the unsealable class divisions that now, more than ever, have appeared on the surface of every developed economy in the world – these interest me. This is not just the usual ideological brainwashing, the disguising of those divisions – this is a step further in magic thinking. The we is used to make those divisions disappear. The nudgery liberalism of the Obama era started out with a desire for bipartisanship that, in effect, was the objective correlative of the magic thinking of the upper class, which not only wants to change the lifestyles of the no doubt racist, sexist, homophobic proles, but also wants not to hear tiresome tales of increasing poverty, precariousness, and all the rest of what goes into what is really happening in the quicksand of everyday life for the vast majority of people.
Kolbert is not a stupid writer, and she has written some very good things about environmental issues, but she seems incorrigibly bound by her presumed you and we – identifying with the presumed readership of the New Yorker, who worry about what prep school to send the kids to. Her article contrasts the behavior of one six-year-old of the “Matsigenka, a tribe of about twelve thousand people who live in the Peruvian Amazon” with the children of thirty Los Angeles households being studied by another anthropologist, Elinor Ochs. In the maddening style of the New Yorker’s premier purveyor of sociology-lite, Malcolm Gladwell, the data is given as though every fact were impenetrable to criticism and to history. Sociology-lite loves these atomic givens, upon which it immediately builds a sort of theme of the moment. The theme of the moment – spoiled children – is a hotter issue now than ever before, given the royal fucking distributed to the children in the advanced economies at the moment – the fifty percent unemployment among the 17 to 30 crowd in the austerity-ravaged lands of Spain, Portugal, Greece, Cyprus – and in the U.S., where Demos put out a report last year about the state of all those “beer-drinking” young college grads that “you” casually forked over 200 thou for:

“Young adults gained little ground in 2012.
Altogether, there are more than 5.6 million 18 to 34-year-olds who are willing and able to take a job and actively looking for work, but shut out of opportunities for employment. These young adults compose 45 percent of all unemployed Americans. An additional 4.7 million young people were underemployed—either working part time when they really wanted full-time positions or marginalized from the labor market altogether. Last year, the unemployment and underemployment rates for people under 25 were more than double those for workers over 35.
Young African American and Hispanic workers face higher unemployment and underemployment than white workers in their age groups.
Young adult Hispanic workers experience unemployment rates 25 percent higher than those of whites, while African Americans face rates approximately double. One in four African Americans between ages 18 and 24 is looking for a job but cannot find one, as are more than one in seven Hispanic young adults.”

I like the New Yorker, but I also use it as my little seismograph of sentiment among the liberal elite. Mostly, that sentiment is alarming – it tends towards the “punitive” liberalism of the Clinton era, where the answer was to lock up more poor kids and definitely take their freeloading mothers off the welfare rolls, which would allow us to… well, increase the Earned Income credit and make symbolic gestures towards environmentalism. Lately, the New Yorker has been on a roll, lecturing us for instance about having too much empathy (turns out there are people out there who don’t want the young, no doubt spoiled Dzhokhar Tsarnaev to be put on death row. They even, gasp, have sympathy with him for being shot up, instead of enjoying each wound and hoping that perhaps we will find some horses and draw and quarter him like they used to do in the old days). New Yorker readers have responded to this with enthusiasm, as per this video. I suppose the return of punitive liberalism was the inevitable next step in the sad mockery of progressive politics that we see all around us – but I’m always willing to kick against the pricks, or bang my head against a wall – a spoiled child’s gesture – and scream that this is bullshit. That a spoiled class produces spoiled children is not going to break my heart – drink your parents’ beer is what I’d advise that mythical college grad in Kolbert’s imagination.

Friday, May 24, 2013

a little note on rhyme



I listen to Adam’s burbles, his hiccups, gasps, moans, and something that is a sort of pure vibration of his vocal cords, and I think of these things as creatures on the threshold of that great thing, language, peering into it, pondering the leap, although in the end all of this phonological hamming up will still be intact, in the interstices of sense, so to speak. These are elements in the Shakespearean sense – half atom, half fairy. Nobody is taught them. Who among us teaches his child to say um, to use my friend Michael Erard’s favorite example? Hein being, I suppose, the French equivalent. Nobody, that is who.
However, I’ve been thinking about phonemes and sense lately in terms of rhyme. In terms of the cognitive device that rhyme is.
A few days ago, I was taking a picture and I said to my friends, who were composing themselves to be the foci of my field of vision – I said, throw your hands in the air like you just don’t care.
Since then, I’ve been thinking about Chubby Checker’s couplet. The meeting of air and car in – well, in what? The meeting is staged in a number of spaces – phonological, semantic, mental, musical. I’m not sure where these verbal cosmonauts actually dock. I do know that they pull with them the hands in the air, and the curious meaning of those hands. When your hands are thrown into the air, usually you care a lot. You are being robbed, or the police are training their guns on you, or you are catching a ball. You are, in other words, under stress. Thus, to be told to throw your hands in the air like you just don’t care puts you slightly off balance.  How do you do that?
There’s a tradition that sees in rhyme a memory technique. In one of the drawers of the memory palace, you pull out rhyme. And it is true that rhyme becomes memorable. In the revolt against rhyme, which for some critics, like Donald Wesling (in The Chances of Rhyme), is the parameter that signals modernity in literature (that is, the revolt against it – not unrhymed verse itself, but the way unrhymed verse attacks rhyme), the fundamental objection to rhyme is that it is not natural, or sincere. Wesling thinks modernity is the era haunted by authenticity as both norm and impossibility. But of course rhyme never went away. It did become something to be justified. Myself, the thing I like about rhyme is that side of it that is a cognitive trouvaille – something we don’t expect, since we assume our semantics are going to lead us to the higher sense. That rhyme might – well, that is uncanny.
Now I am going to throw my hands in the air like I just don’t care.

The use-value of sanity

  Often one reads that Foucault romanticized insanity, and this is why he pisses people off. I don't believe that. I believe he pisses...