Saturday, March 14, 2015

a philosophical suicide pact: on relativism

One of the myths of our time is that the average college freshman brings into the classroom a belief in moral and cultural relativism. I have my doubts. I don’t doubt that a college freshman might be given to saying things like, that is only your opinion, or, well, that’s my opinion, given the unfamiliar stimulus of a first year philosophy class. But outside of that stimulus, I don’t see much evidence for either form of relativism. On the contrary, there is a lot of evidence for a naïve faith in the rightness of the established order, up through and including such absurdities as grading. As for romantic reveries in which the subject is placed on a parity with the objective world, these seem contraindicated by the strong increase in business majors and the clamor from both parents and students for a secure job after graduation.
The myth bugs me on several levels, one of which is the assumption that relativism is a sophomoric sophism that dissolves in the light of the overwhelming proofs for universal truths and values. The latter are represented, of course, by the teacher, so infinitely more sophisticated in philosophical argument than the childish, unconscious echo of Protagoras.
The place to start with those phrases is not to juxtapose them to glorious universal truths, but to question whether the conjunction of “my” and “opinion” has any sense, outside of the semantic possibility that allows possessives to modify nouns. There are, after all, few self-generated opinions among the sane. It is a rare existence that stops itself long enough to question the opinions of others that it has absorbed all its life, much less to find better ones. And those rare existences mostly shift to opinions that have also had their long career in the belief community. Although I am an opinionated person, I can’t really say that I have many opinions that are properly my own, any more than I can say that there is a part of the atmosphere consisting of my air.
In fact, doubt about the capacity to invent opinions – rather than embroider pre-given beliefs – is I think one of the important, humbling stages of self knowledge.
Then there is the irritating smugness with which relativism is usually dismissed from the city. The argument goes that if you don’t have objective, universal values, you have no footing from which to condemn Auschwitz. That’s a pretty preposterous argument, and it is founded on a very contemporary view of relativism in which respect for other beliefs is the universal rule. But why I should be relativistic about the stars and the trees, yet take my hat off for the respect rule, is usually not analyzed. In fact, one of the motives in my case for relativism is the realization that Dachau was not built by moral or cultural relativists, but for them – for putting them away and destroying them. The Nazis were all about universal truths, and ruthlessly punished those whom they felt violated the natural order of values. To think that relativism, with its endemic questioning of any absolute, was a Nazi doctrine is preposterous.
What the Nazis did, of course, was to pursue methods that were exceptions to the moral rules they ardently believed in. In this, unfortunately, they were not very original. In the United States, from 1945 until the present, the population has accepted and funded the building of a terrific force of nuclear missiles aimed at inflicting millions of civilian casualties. That doesn’t mean that Americans have adopted flexible relativistic norms – rather, it means that, as in all belief systems that build a great superstructure of moral and cultural universals, the foundations are riddled with ingenious exceptions and emergency situations.
I used to call myself a relativist, but in fact the more I have looked at this issue, the more I think it is one of those philosophical death pacts, those double binds, in which both sides are fucked up. It is hard to see how relativism can get off the ground without formal rules for recognizing belief systems – rules, in other words, that are norms. And of course, as a relativist can easily point out, in every society that claims to adhere to universal norms, one can expect a lively sub-system of exceptions. In those societies that have philosophy, the subsection called ethics is usually involved both in proclaiming universal values and in justifying the everyday exceptions to them that make life possible.

This kind of enjambment makes me think that there is something fundamentally wrong with the way the relativist issue is discussed in philosophy. It makes me think that it is a sucker’s game.   

Friday, March 13, 2015

The ownership society - they own, and you gotta ship out.

Here's what the ownership society - the neoliberal dream from Bush to Bush, from Clinton to Obama - looks like: the ripoff society! Eduardo Porter's column is as fine as it can be within the limits of the idiot episteme we suffer under, where all that exists exists to profit the oligarchs:
"A research paper by Mr. Bogle published in Financial Analysts Journal makes the case. Actively managed mutual funds, in which many workers invest their retirement savings, are enormously costly.
First, there is the expense ratio — about 1.12 percent of assets for the average large capitalization blend fund. Then there are transaction costs and distribution costs. Active funds also pay a penalty for keeping a share of their assets in low-yielding cash. Altogether, costs add up to 2.27 percent per year, Mr. Bogle estimates.
By contrast, a passive index fund, like Vanguard’s Total Stock Market Index Fund, costs merely 0.06 percent a year in all.
Of course, Mr. Bogle has a horse in the race. He founded the Vanguard Group. He invented the first index fund for the public. His case is powerful, nonetheless.
Assuming an annual market return of 7 percent, he says, a 30-year-old worker who made $30,000 a year and received a 3 percent annual raise could retire at age 70 with $927,000 in the pot by saving 10 percent of her wages every year in a passive index fund. (Such a nest egg, at the standard withdrawal rate of 4 percent, would generate an inflation-adjusted $37,000 a year more or less indefinitely.) If she put it in a typical actively managed fund, she would end up with only $561,000."
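Bogle's two endpoints can be checked with a short compounding loop. This is a minimal sketch, not Bogle's own model; the compounding convention (each year's contribution growing at the market return net of fund costs through year 40) is my assumption, since the column doesn't spell it out:

```python
def nest_egg(cost_ratio, years=40, salary=30_000.0,
             raise_rate=0.03, save_rate=0.10, market_return=0.07):
    """Retirement pot after `years` of saving 10% of a salary that
    rises 3% a year, compounding at the market return net of costs."""
    net = market_return - cost_ratio
    total = 0.0
    for t in range(years):
        # Contribution in year t, grown by the raises so far,
        # then compounded at the net return for the remaining years.
        contribution = save_rate * salary * (1 + raise_rate) ** t
        total += contribution * (1 + net) ** (years - t)
    return total

passive = nest_egg(0.0006)   # 0.06% all-in cost (index fund)
active = nest_egg(0.0227)    # 2.27% all-in cost (Bogle's estimate)
print(round(passive), round(active))   # roughly 927,000 and 561,000
print(round(0.04 * passive))           # 4% withdrawal: roughly 37,000 a year
```

Under those assumptions the loop lands within rounding distance of Bogle's $927,000 and $561,000, and the 4 percent withdrawal rule on the passive pot gives about $37,000 a year, matching the parenthetical above.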
Porter, alas, doesn't recommend the right change: tax 401(k)s, which are simply tax avoidance schemes for the upper class, and set up low-cost passive index accounts through a postal bank - which the US had at one time, since in the 40s and 50s the government still did a few things for people instead of plutocrats. This won't happen, of course - after all, the advantage of doing it would accrue to the bottom 90 percent rather than the top 10 percent, and the White Republic that has its boot on the American people - executive, legislative, and judicial - wouldn't stand for that. But, of course, the screwed up retirement scene is the direct result of the government de-regulating and encouraging the huge ripoff that fed Wall Street. Surprise!

Saturday, March 07, 2015

david autor and the new defense of the 1 percent: don't do the math! look over here at this shiny neo-liberal platitude!

It took a while for the research of economist David Autor to reach the rightwing mimosphere, but it is there now. Autor's claim has become gospel for the rightwing set. As David Brooks puts it, "If we could magically confiscate and redistribute the above-average income gains that have gone to the top 1 percent since 1979, that would produce $7,000 more per household per year for the bottom 99 percent." This is said to mislead, so that you think: oh, $7,000 isn't much. But if you do the math, that means every household in America would have taken in $315,000 more over those years.
I think this is close to my estimate. To quote the EPI institute: "The CEO-to-worker compensation ratio was 20.1-to-1 in 1965 and 29.0-to-1 in 1978, grew to 122.6-to-1 in 1995, peaked at 383.4-to-1 in 2000, and was 272.9-to-1 in 2012, far higher than it was in the 1960s, 1970s, 1980s, or 1990s."
So, imagine that CEOs were making the same salary in 2012 but the compensation ratio had stayed at 29.0. Average CEO compensation was $14.1 million in 2012. Thus, the average worker would be making around $486,000.
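The division behind that figure, using only the numbers quoted above (EPI's 1978 ratio of 29.0, the 2012 ratio of 272.9, and $14.1 million in average 2012 CEO compensation):

```python
ceo_pay_2012 = 14_100_000   # average CEO compensation, 2012 (EPI)
ratio_1978 = 29.0           # CEO-to-worker compensation ratio, 1978
ratio_2012 = 272.9          # CEO-to-worker compensation ratio, 2012

# Worker pay if 2012 CEO pay were unchanged but the 1978 ratio held:
counterfactual_worker = ceo_pay_2012 / ratio_1978   # about $486,000
# Worker pay implied by the actual 2012 ratio:
implied_worker = ceo_pay_2012 / ratio_2012          # about $52,000
```

The counterfactual is nearly ten times the pay implied by the actual 2012 ratio, which is the whole point of the comparison.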
We are fucked.
Brooks of course goes on to compare real money earned with a fake premium on college education: "But if we could close the gap so that high-school-educated people had the skills of college-educated people, that would increase household income by $28,000 per year." Of course, Brooks doesn't repeat the exercise of counting that $28,000 for every year since 1979.
Now, it is possible that Brooks, who has no head for math, is misquoting Autor. But if the quote is correct, then Autor is a bigger putz than he appears, since obviously an increase of $28,000 over 45 years is much less than an increase of $315,000. Autor's work is in the domain of justifying the wealthy and red herringism. Don't think about that one percent! But he accidentally seems to have confirmed just what we know.
We are so fucked.

Friday, March 06, 2015

Russia's Chalabi

I just read the gingerly New Yorker portrait of the old Yeltsin era crook and current fashionable Russian "dissident", Khodorkovskii. David Remnick loves Khodorkovskii, and so does the NYRB. He is their Chalabi. Of course, we have to overlook the blemishes in the past - which Ioffe, his New Yorker Boswell, sketched with perhaps some trepidation (American liberals like not to dwell too much on the past of their rich freedom fighters - a fraud here, an act of violence there, who cares?). I suppose the equation here is that since Putin is Hitler and the Devil, his opposition must be Gandhi and Solzhenitsyn rolled into one. The problem, of course, is that the cleansing operation by which a Chalabi becomes a Charles de Gaulle and a Khodorkovskii becomes a "dissident" in a new Gulag (I do admire this - that the writers of a country, the US, that has the largest and one of the cruelest prison systems on the planet can calmly talk about the New Gulag) - the problem is that when you implant them back in their native country, the natives, puzzlingly, aren't enthusiastic.
It took years for American journalists, always expecting a popular revolution in Iraq in Chalabi's favor, to get their heads around the fact that Iraqis thought he was a crook. When he received less than one percent of the vote in Iraq's December 2005 parliamentary elections, it was sort of funny, given that the vast majority of news stories about those elections in the NYT and the Post had been about Chalabi. It was like some European paper betting on Dennis Kucinich being elected president in 2004. Khodorkovskii's reputation is being kept alive by the American press, with the same disregard for reality. Of course, Americans have never had a very firm grasp on the reality of any place outside of the strange American republic. Even, it turns out, the highfliers at the New York Review of Books - who are definitely not the highfliers who used to be there.
It is a funny thing - American intellectuals are more provincial now, during the age of "globalisation", than they were before this vaunted time. Provincial in the sense that they can no longer take facts and imagine how they are perceived in another society or culture. That capacity is dead now, replaced by howlers about human rights or new gulags.

Wednesday, March 04, 2015

opening - an objection

Among the chief ornaments of the romance of philosophy is the high place accorded to the open, or to openness. Open the understanding or the mind or the eye, openness as a state of being – these are all on the plus side of the ledger. Heidegger, of course, is the great poet of openness in this tradition, charging openness with a numinous relationship to being that you can take or leave – but he is only building on a vast previous structure.
Closing, perhaps as a consequence, is never given high marks by philosophers. Closing one’s eyes or one’s understanding is, automatically, a bad thing. Even in building an argument, to come to a conclusion – a close – is often transformed, in the text, into an opening up. After the Absolute Spirit has tied itself in knots and done more tricks than Houdini, it at last is in a good place at the end of the Phenomenology of Spirit. You would think that the Absolute Spirit would be able to close up shop and go fishing. But no! It has to open up once again and go, in recollection, through the whole muddle again. No closing for it!
Such are the lessons of the masters. But Adam, ever the dissenter, disagrees.
A couple of months ago, he was with the orthodoxy. Back in those primitive days, he would often strain toward the doorknob, or at best hang from it, crying for the door to be opened. This happened most often when Mama or Papa had made their exit. Sometimes, though, it was the principle of the thing.
In the last month, however, he has (a) learned the word doorknob and (b) figured out how to turn one. Having set himself up to join the grand tradition of opening, he has instead begun to close doors meticulously.
Of course, one of the things about being out in the open is that you can be seen. This is fun and spiritual if you want to be seen. If, however, you want to hide, closing is your friend.
However, closing seems to have more than a ludic value for my wee little pea. He seems to recognize, in a closed door, a symbol of a larger order. Thus, when settling in to bed and grudgingly accepting the turning off of the lights, he delays the onset of sleep by pointing to the door and demanding it be closed. The thing about this is that he often has already closed it. It is as if Adam recognizes further degrees of closure. There is the closure that you use to hide with in a game, but there might be other types. One is, perhaps, that opening invites people to leave a room. It introduces a certain selfish individuality among one’s courtiers, who might be inclined to go through the open door and go into another room and start watching Peppa Pig or Goodnight Gorilla on the computer – such unimaginable riches!
Now, from the romantic philosophical view, closing here might be a symbol of involuntary servitude. But from another point of view, say that of a two and a quarter year old, it might be a sign of solidarity. It is, definitely, something with a dimension beyond the mere physical closing, just as opening has its more numinous dimension. One of the irritating things about opening in the tradition is that it is often treated as a natural property. The open is the natural situation. But one could well argue that, for living things, opening is unnatural. Skin, tegument, eyelids, doors, drawers, pots, urns, bags, all the paraphernalia of closing shrewdly measure the heroism of opening against the cleverness of closing.
I am not saying that Adam doesn’t appreciate opening. In fact, once he has closed the door on me, he will open it himself, eventually, if I don’t make a sound or move to do so. And he points out, every day, how much he wants an outdoor basketball court. (Lately he likes to say I want. After seeing a story about a little girl who wanted the moon, he also wants the moon, which he imagines would be a very big basketball. I want the moon, daddy. Cursed little girl!) So in the technical sense, he likes the open – the undomesticated, or at least the domesticated only to the degree that it has resulted in a basketball court and a park with a slide.

To wrap up this rap: Any child’s history of philosophy would have to cast a more skeptical eye on opening as a given and a good.    

Tuesday, March 03, 2015

photogenic and the twentieth century

Photogenic drawing was the phrase used by William Henry Fox Talbot, among others, to describe the photographic method: chemically treating a sheet of paper so that the light falling on an object made an impression of that object on the paper. Talbot and Daguerre were contemporaries, and daguerreotype soon overtook photogenic drawing as the preferred term, only to be overtaken in turn by photography. The word, from the Greek for produced by light, was not forgotten, but came to be employed in technical contexts – for instance, in discussing light-producing organisms like fireflies; but then it took a strange turn in its philological life history.
The first references to the new meaning of photogenic come from French cinema culture. Already in 1921, in Cinéa, Jean Epstein is connecting photogénie to a particular impression of a thing or a person on the screen:
“The cinema itself is movement, so much so that even its natures mortes – telephones, factories, revolvers – revive and pulsate. It isn’t a question of worrying about making them live: let it happen, and it gives life.
But it is a particular life, a life of ideas, a life of sentiments. Note: everything that bears witness to an exclusive thought – habit, tiredness, animality, distraction – plays with a marvelous photogeneity. The cinema is mystical. It attaches a uniquely important value to everything which represents, exteriorly, the signs of intelligence.”
It is probably the French use of the term that floated back to the US. In the twenties, as we all know, a new American literature was being written by expats in Paris. What is less remembered is that numerous American news bureaus sited themselves in Paris, and there was a strong trans-Atlantic flow of journalists. The earliest US source that I can find is a story in the Washington Post, dated April 23, 1922, entitled Parisian News and Views, from a special correspondent. The item recounts the movie mania sweeping France, and plays the usual coy game with the American image of France as the home of dashing male lovers, who have all the lines:
“So much is this true that if Don Juan lived today, the spiritual Clement Vautel is sure, his classic lovemaking would be transformed into such simple words as: “You are so photogenic. Would you like for me to present you to one of my friends – who is a moving picture director?”
Photogenic operates in that paragraph as an exoticism, an introduced species, something with an accent. At about the same time, the word appears in New York Times stories with quote marks around it. God bless the New York Times for having had, since forever, a stick up its ass about formal and informal English. One can go back in the archive and find words that are currently accepted as standard, like ‘leak’ for a leak of information, and trace their gradual loss of the branding quote marks in NYT stories. The appearance of the word in a cluster of newspaper stories of this time shows that photogenic was taking off, that it filled a need. Like the starling, another introduced species, it found the environment in the US conducive to massive growth.
By the 1920s, the film industry had been around for about thirty years. As Ty Burr points out in his recent book on stardom, Gods Like Us, film stars and the star system had not been around that long. The first photoplays didn’t name the people who appeared – acted? – in them. As audiences for these things grew larger, the studios began to receive massive amounts of mail asking for names. Burr picks one actor as the first star: Florence Lawrence. It is evident that Burr doesn’t quite get Lawrence:
“Her very few surviving films reveal a statuesque woman, attractive in the preferred Gibson Girl mode of the day, with a prominent nose, broad face, serene expression. Her acting is histrionic without being overbearingly so, yet there’s little that makes her jump off the screen the way a movie star is supposed to.”

“Jumping off the screen” is in the semantic neighborhood of Epstein’s terms in 1921 – reviving, coming to life, resurrection.  Epstein’s examples – the objects of ordinary life – temper, of course, the hijacking of photogenic as an attribute of stardom.  But the special correspondent to the Washington Post already caught the erotic charge, the personalization of the photogenic.

Surely we are encountering, here, one of the tripwires of modernity. Edgar Morin wrote, long ago, that the art that presents an image of reality injects that image into reality. What photogenic injected into reality was a new organization of appearances. One should, I think, see the photogenic against another term – “aura” – which was also emerging, although in philosophical culture, with Béla Balázs in Visible Man and, most famously, in Walter Benjamin's essay on the work of art in the age of mechanical reproduction.


Sunday, March 01, 2015

my problem with reductionism

I’ve never quite understood the reductionist program in the philosophy of science.
I’ve edited beaucoup papers and dissertations logically proving that, happily, the mental is a level wholly reducible to the molecular, or that the vital is reducible to the laws of physics without remainder, and I’m the editor – I don’t interpose myself in the flow of argument and shout halt! These papers grudgingly reference the problems in the field – the fact that bridging principles seem to break down, and that we still have no account that would explain the higher level phenomena completely – but in the end, they say, we can on principle correlate every mental and vital event to an underlying physical one, and that is all we need.
This is what they say. I begin to lose the thread with the word “underlying.”
Underlying. Higher and lower levels. In the arguments themselves these words are used with what seems to me a blissful unconsciousness. Because I still don’t know what level means, here.
It would seem that after we have done our tricks, we can abolish the level talk – and yet we can’t. The level itself – what it is, where it comes from – is the great stubborn residual here. Is it a fiction? I’ve not read a defense of the idea that the level is a fiction, and that underlying is simply a bow to rhetoric. Rather, it seems that we consider the level both a convenient conceptual device and a self-explanatory rhetorical convention. But it seems to me that the whole argument rests on there being a level that can be reduced.
If it is a rhetorical convention, it seems to me that it has sprung not from quasi-science or pre-science, but from the way the mind is. And if it is more than a convention – if it is sort of a natural fiction, like a mirage – then our story of reduction is certainly not finished if it can’t account for the mirage.

It is a puzzle, to me.

Pavlovian politics

  There is necessarily a strain of the Pavlovian in electoral politics - I'm not going to call it democratic politics, because elections...