I believe that economics is indeed essential for the study of a market system, but that it is a snare and a delusion for the study of the social order that this system serves. – Robert Heilbroner
When LI was in Malinalco, my friend, M., wanted to get some tomatoes and some underpants for her little boy. We walked around the cobbled streets. It was late afternoon. M. wanted to complete our task before the sun went down, because after dark, the pedestrian in Malinalco is prone to attack from the packs of dogs that suddenly seem to materialize out of the shadows. Residents have gates to shut after dark, so they can avoid unwanted canine intrusion.
To get the underwear, we went to a few shops. None of them had the kind M. was looking for. To get the tomatoes, we didn’t go to a shop. We went to the market.
Like every Mexican village, Malinalco has some streets in the center of town upon which, every day, vendors pitch their stands. Some markets are elaborate, with vendors of Barbie dolls, sunglasses, hats, and computer games pitched next to vendors of cucumbers, watermelons, corn, and tacos. Some are less elaborate. An economist, looking at these structures, might well see the materialization of the purest form of market – the one-to-one relationship between vendor and consumer is such that prices actually reflect real dickering, supply and demand in action.
However, since Karl Polanyi’s day, economists have seen something else. A recent issue of Social Research was dedicated to Robert Heilbroner, the economist who died last week. There is a nice article by Robert Dimand that compares Heilbroner’s conviction that the key to economic history is not the neo-classical market with Karl Polanyi’s notion that capitalism can only be understood in the context of its managed overthrow of pre-capitalist exchange systems.
Here are the opening grafs of Dimand’s article:
“IN THE OPENING PARAGRAPH OF HIS INTRODUCTION TO A COLLECTION OF debates among Marxist historians and economists over The Transition from Feudalism to Capitalism that were initiated by Dobb (1946), Rodney Hilton (1976: 9) recalled that Karl Polanyi (1948) "thought that Dobb had retained from Marx what was bad (the labour theory of value) whilst discarding what he, Polanyi, thought was Marx's "fundamental insight into the historically limited nature of market organisation.'" Beyond condescending praise of Polanyi's review for "a serious attitude to the problems of a Marxist analysis" and passing mention in the next paragraph that R. H. Tawney's review of Dobb "did not raise any of the general theoretical problems which Polanyi hinted at," Hilton (and the other contributors reprinted in the volume) proceeded to ignore Polanyi's challenge as thoroughly as any mainstream neoclassical economist could have done.
In contrast, Robert Heilbroner shared the fundamental insight that Polanyi derived from Marx, and brought it to the attention of millions of readers. Over four decades and in 11 editions of The Making of Economic Society, Heilbroner examined the replacement of socially embedded provisioning by the market as a means of organizing society and production during the Industrial Revolution, while in seven editions of The Worldly Philosophers that spanned nearly half a century he explored the accompanying changes in how economists thought about the economy. In his vision both of how the economy had changed and how economic thought had interacted with these changes, Heilbroner stood shoulder to shoulder with Polanyi.”
One of the things philosophers have learned from Freud and Heidegger is that forgetting is a manufactured act. It is one of the things that liberal philosophers can’t forgive in Freud and Heidegger. There were two overlapping forgettings that constituted the ideological foundation of the Cold War in the American sphere. One was the forgetting of how this “replacement of socially embedded provisioning by the market as a means of organizing society and production during the Industrial Revolution” took place. This forgetting foreclosed on both the ravages of the Industrial Revolution in Europe and America, and the cost of producing free market economies in the colonial sphere. The terror famines that occurred in Ireland and India under British rule, for instance, were dropped as a subject of discussion, or – in the case of Ireland – sentimentalized. It took Mike Davis’s book, Late Victorian Holocausts, to revive interest in a series of famines that, circa 1900, were well known to any educated English socialist.
The other great forgetting was about the wars that bubbled up in the capitalist world, finally ending in World War I. In fact, the whole history of the Russian Revolution was systematically distorted by cutting out the crucial facts of the war on the Eastern front – the senseless slaughter of up to two million Russians. In the shadow of that fact, the bitter Civil War between the Red and White forces could no longer be told as a morality tale in which bad Reds killed wonderfully royalist Whites. Nor could one construct the nice myth of Lenin as the father of the Gulag with quite the straightforward indignation required, if one asked about the capitalist forces in Britain and Germany and France that authored a war that killed eight million people. That this slaughter was crowned, in hindsight, as a war in defense of democracy – when, of course, it was a war in defense of a particular power arrangement among capitalist states, the governing classes of which were agreed on the necessity of perpetuating white power – was a grim joke.
Polanyi and Heilbroner, however, were exceptions to the Cold War rule. They would notice, about those markets, the use of public space – and often the use of ‘borrowed’ electricity. They would notice that the margins of profit were derived not from rational dickering between seller and consumer, but often from seemingly irrational prejudices on both sides that indicate different regimes of value. They would note the tie between adults and children as laborers, and they would not homogenize them indiscriminately into a quantifiable labor unit – because this quantification would lead to predictive distortions when trying to analyze the reproduction of the labor system embodied in these markets. They would question the gauge of efficiency in the role of these markets – something M. and I discovered in our odyssey in search of underwear.
These issues were Heilbroner’s specialty. In the same issue of Social Research there is an exemplary Heilbroner piece, Economics as Universal Science, which quietly mocks those who, like Gary Becker or certain members of the Chicago School, claim that economics can act as the master-science for studying human behavior (plus, of course, a thrilling dose of sociobiology).
Heilbroner starts off by asking where we locate the economy, if economics has such modeling force in telling us about human behavior. His own premise goes like this:
“I shall undertake this task by starting from the premise that the continuity of society requires structured ways of assuring social order. These ways range from the routines and habits of daily life to formal institutions of law and order. In referring to this spectrum I shall use the term "sociological" as a portmanteau term that covers the order-bestowing influences of private life, of which incomparably the most important are the pressures of socialization exerted by parents on their offspring--pressures that teach children how to fulfill the roles expected of them in adult life. The second term, "political," I use in the conventional sense of the institutional means by which some group or class within society can enforce its will over other groups or classes. The definition of these terms is less important than my intention to describe a protective canopy of behavior-shaping arrangements, part informal and private, part formal and public, that protects the community from actions that would threaten its continued existence.
“Both the sociological and political elements in this canopy are fundamentally concerned with an aspect of social order and coherence that is usually referred to only obliquely. This aspect is the general state of obedience or acquiescence without which the armature of rights and privileges that defines any social order could be retained only by force and overt repression. With his customary candor, Adam Smith called this necessary aspect of society "subordination": "Civil government," he wrote, "suppose[s] a certain subordination." We shall return many times to this theme, but the challenge it raises should now be clear. It is the disconcerting idea that economics is socialization or subordination in disguise.”
Heilbroner sees, however, that the ‘imperial’ economist, as he calls him, can give two responses to the placement of the socius at the center of society. One is that the socius is actually a network of decisions, and hence of choices. Economics is the science that is going to rationalize that hodge-podge of choices by gauging it according to an optimal model that follows a simple rule: all choices are motivated by the perception of an advantage. It doesn’t matter if the perception is wrong, or distorted, or ignores long-term advantages; what matters is the logic of advantage.
The other answer, Heilbroner thinks, is to make economics the study of the division of labor that lies at the heart of the social order. Thus, subordination can again be wrapped into economics.
Heilbroner’s consideration of these options in the light of what Polanyi calls the Great Transformation – the emergence of an international, hegemonic capitalist system over the past two hundred-odd years – is more insightful than the guff one usually gets from economists. Here are two more grafs to chew on:
"Economics thereby takes the economic system to be the living model of capitalism, containing within its categories and conceptions everything that is essential for its comprehension. It is here that economics betrays its fatal limitations as a universal science, and its knavish consequences as an imperial doctrine.
"The first such consequence is that economics itself appears as a neutral rather than a charged explanation system for capitalism. This becomes apparent in many ways. A term of great importance such as "efficiency," for example, is regarded as a quasi-engineering criterion, rather than one whose unspoken purpose is to maximize production as a profit-making--not a purely engineering--endeavor. Similar unnoticed sociopolitical meanings cling to other such terms, including "production" itself, which is counted in the national income accounts only insofar as it results in commodities, not use-values. In much the same fashion, the fundamental unit of the economic system is taken to be the rational maximizing "individual." The economic system is thus conceived as a society of hermits, not as an order of groups and classes.
"This concealment of a social order is most clearly evidenced when we notice the manner in which economics rationalizes functional income distribution. Marx wrote scathingly of Monsieur le Capital and Madame la Terre, each entitled to receive a reward for the contributions each has made to the social product, but modern economics has forgotten the fetishisms that Marx exposed. Of even greater importance, it has no explanation for, or interest in, the curious fact that the reward paid as net profit, which goes only to owners of capital, gives them only a "residual" claim on output, after all factors, including capital, have been paid their marginal products. In view of the repeated demonstrations of economics that the tendency of the market system is to eliminate such residuals as mere transient imperfections of the system, one must be a sociologist or political theorist to explain why owners of capital seem so eager to protect these dubious claims. Thus the manner in which the market supports the class structure of capitalism is a matter before which economics is silent--indeed, a matter of which it is, in some sense, unaware."
Tuesday, January 18, 2005
LI is feeling a little under the weather today. We do recommend the interview with Anatol Lieven over at Asiasource. It seems to have occurred while LI was in Mexico, but it is still strongly pertinent, especially given the news the Hersh article is making.
Lieven has a theory about the crosscurrents of American nationalism which he unfolds in his recent book, and reproduces in the interview. The theory isn’t complicated – in American nationalism, the messianic democratizing gene is at odds with the messianic xenophobic gene – and he doesn’t question enough what ‘democratizing’ means – thus letting a word that, properly, should refer to a form of governance continue to hold its fatal submeaning, given to it through the cold war and into the Bush era, of a ‘free enterprise’ form of political economy.
Here’s a coupla grafs:
“But one of the striking and tragic things about the debate leading up to the Iraq war - although one can hardly call it a "debate" - was that the vast majority of it, outside certain relatively small left-wing journals, was conducted with almost no reference to the genesis of the Vietnam war, the debates which took place then, and the insights which were generated about aspects of the American tradition. Instead of analyzing what it was about their own system which was pulling them in the direction of war with Iraq, too many members of the American elite, including leading Democrats as well as Republicans, talked only about the Iraqi side.
Even that, of course, they got completely wrong, but they did not even once ask the obvious question: "What is it about our system that may make this a disaster?" After all is this not a general pattern of American behavior in the whole world by now? This business of a Green Zone in Baghdad, American officials bunkered down behind high-protective walls, with no contact with Iraqis, is this not part of a larger trend? Yet somehow it was assumed that in the case of Iraq it would be different, that America would go in, be welcomed with open arms, quickly reshape Iraq in accordance with American norms, and then quickly leave again. “
And – to give Lieven his due – he does criticize Bush’s ‘democracy’ slogan somewhat:
“When they [19th century imperialists] did talk of bringing democracy, they only did in the context of the far future, something that might come about after several generations; in Africa, they talked about a thousand years of British or French rule eventually leading to self-government and democracy. In other words, they were absolutely clear and logical. These countries would need a long period, centuries literally, of Western authoritarian, imperial rule before they would be capable of self-government, constitutional rule, democracy and so forth. Indeed to an extent this was the way that it actually worked out: the British had ruled India or parts of India for 150 years before they introduced the first very limited local, district elections with fairly circumscribed powers and a franchise of less than 0.5 per cent of the population. They started doing that only from the 1880s on. They and the other liberal imperialists had a policy of what one might call authoritarian progress, not of democratization.
“Now, of course, it is completely different. The liberal imperialists of today, because of the completely different ideological era in which we are living, have to say that what they are bringing is democracy. So they conquer a place and then within a year or two, they have to hold elections, they have to claim to be introducing free government and so forth. That is just, once again, absolutely, manifestly contradictory. There would have been nothing contradictory in the 19th century about imposing Ahmed Chalabi on Iraq; the British and French did that kind of thing again and again. They had some client ruler, some dissident prince or whatever, whom they wanted to make emir of Afghanistan or of somewhere in Africa, and they just marched in and imposed him. People may have criticized it, but there was no suggestion that this was incompatible with what they were setting out to do. Of course, if you say that you are bringing democracy, if you preach about democracy, if you say your whole moral position is based on democracy, and then you impose a puppet leader, then frankly you look not just hypocritical but ridiculous, which is essentially how the US appears in much of the Muslim world.”
Monday, January 17, 2005
If you compare the defense of Darwinism in the nineteenth century, when it was a new and shaky venture, with the defenses mounted of it now, when it is a successful and chubby scientific paradigm, you find a strange thing: early Darwinists were much more radical critics of their opponents than today’s breed. Yes, scientists now will publish lists of common fallacies about evolution spread by creationists, or they will man the battlements to fight over this or that supposed gap in the evolutionary record. But they always play defense. We suspect this is because of a general feeling that sparing the religious sensibilities of people in a society where an appreciable percentage believe they have been abducted by aliens, while others are waiting for the rapture and the conversion of the Jews, is the better part of valor and funding opportunities.
We say to hell with that. Perhaps, we have sometimes thought, it really would be a good thing to teach ID in school, subjecting it to the rough and tumble of the scientific method, and thus induce a benign strain of skepticism in the population. Handling it with kid gloves not only encourages Intelligent Design, but coddles an unhealthy gullibility in the populace.
How would our class on ID look?
The first thing to do is to understand both intelligence and design. There is a distinction of degree between the type of intelligent design that remains at the level of technique – design that solely takes advantage of how materials work – and design that depends on a deeper level of understanding of why materials work.
To illustrate this, compare the discovery of wrought iron, which was purely a “how” discovery, with the discovery of ammonia fertilizer, which was largely a “why” discovery.
Richard Cowen has published a nice web book for his course on geology at UC Davis, which has illuminating chapters on the Bronze Age, the Iron Age, and so on. According to Cowen, we have wrought iron work going back to 1400 B.C. Tutankhamen, who was buried around then, took with him into the underworld (besides a celebrated hoard of goldwork) an iron dagger.
Cowen points to the difficulty in working with iron:
“In principle, iron can be smelted from magnetite or hematite, which are comparatively common ores, but iron does not melt at the temperatures that are reached in a primitive furnace: iron is still solid when copper and bronze are molten. Let us suppose that some iron ore, say FeCO3, siderite or "ironstone", is loaded into a furnace along with malachite, and fired with charcoal. (Siderite is a grey-green mineral when it is fresh, though it weathers to a brown rusty iron oxide mineral at the surface. It often occurs with malachite, copper carbonate.) Inside the furnace, the first reaction of the siderite is like that in copper smelting: the carbonate breaks down to form an iron oxide:
FeCO3 = FeO + CO2
Then, as the charcoal burns to form hot carbon monoxide (CO), metallic iron is produced:
FeO + CO = Fe + CO2
The problem is that even when the reaction is complete, the iron that is produced is not liquid, as the slag is; the iron is left unmelted, as a dense spongy mass of metal. There are always impurities that form a slag, but the iron will hold at least some of the slag in the small holes in its spongy texture. If the smelter is allowed to cool after the slag and the copper have been tapped off, then the iron would be left behind as an ugly solid mass that could easily be discarded as worthless.”
Notice that the very idea of carbon monoxide would be incomprehensible to our Egyptian ironsmith, who had no conceptual knowledge of the mathematics, or atomic theory, or physical laws involved in what he was doing. What he did know is that if he hammered on that “ugly solid mass” while it was still red hot, he could hammer out some stuff that was in it and shape the rest of it. Which is what he did. Intelligence, here, remains on the surface of the procedures for making the product.
Contrast this with the invention of ammonia fertilizer. In 1909, Fritz Haber knew, as a chemist, both the laws of physics and the fund of knowledge about molecules that had been discovered during the 19th century. He worked in a lab. In this lab, he produced a reaction between nitrogen gas and hydrogen gas to yield ammonia under high temperature and high pressure. To do his work, he used mathematical formulas. And to refine and scale up the process that made ammonia, Carl Bosch invented a machine.
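The reaction Haber achieved can be written in the same notation as Cowen’s smelting equations above (this is the standard textbook equation, not a quotation from Cowen or Heilbroner):

```
N2 + 3H2 = 2NH3
```

The equilibrium favors ammonia only at high pressure, which is why Bosch’s engineering mattered: the reaction had to be run, continuously, at pressures on the order of 200 atmospheres – something no Bronze Age furnace could approach.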
The Haber-Bosch process, as it is called, is a pre-eminent case of “why” intelligent design. It couldn’t have happened in ancient Egypt, since neither the physical nor the conceptual tools were present. Notice – the more complex the design, the more dependent the design is on an order. All our smith needed was a primitive furnace able to amplify heat, a hammer, and a stone of some sort. He didn’t leave behind a laboratory, or formulas, or an institutional memory. On the other hand, Haber needed a laboratory, he needed libraries filled with books, and he needed an institution in which physical space could be provided, as well as highly specialized tools. Even if we had bombed Germany more thoroughly than we did in WWII, and had totally blotted out the street on which Haber worked, we could still recover more evidence of laboratory work – enough to trace what Haber did, when he did it and how he did it – than we could ever recover about the beginnings of wrought iron smithwork. If you like, this is the second law of thermodynamics applied to intelligent design: as the system makes order, it must release waste. In a broad sense, that is what the relics of scientific achievement are.
Now that we have a sense of what Intelligent Design is – a sense of the material order that must condition the increasing complexity of design, and a sense of what we mean by intelligence designing products – let’s turn to the ID argument. We will ignore the critique of evolution to concentrate on the positive claims of the school.
The best example of such claims comes from a molecular biologist, Michael Behe, whose book, Darwin’s Black Box, argues for the intelligent design of organisms. Behe uses complexity – just as we have. He claims that evolutionary theory (which he defines as small, incremental mutations acting on organisms) can’t explain intrinsically complex organic structures.
“By irreducibly complex I mean a single system composed of several well-matched, interacting parts that contribute to the basic function, wherein the removal of any one of the parts causes the system to effectively cease functioning. An irreducibly complex system cannot be produced directly (that is, by continuously improving the initial function, which continues to work by the same mechanism) by slight, successive modifications of a precursor system, because any precursor to an irreducibly complex system that is missing a part is by definition nonfunctional. An irreducibly complex biological system, if there is such a thing, would be a powerful challenge to Darwinian evolution.”
Helpfully, he uses an example. To quote from the Boston Review article about his book:
One of Behe's goals is to show that irreducible complexity is not confined to the inanimate world: some biochemical systems are also irreducibly complex. Here he succeeds. Certain biochemical systems show exactly the properties Behe attributes to them. His description of the mind-boggling cascade of reactions that occurs during blood-clotting is particularly persuasive: thrombin activates accelerin, which, with Stuart factor, cleaves prothrombin; the resulting thrombin cleaves fibrinogen, making fibrin, etc. Knock out any of these innumerable steps and the animal either bleeds or clots to death.
To Behe, an extraordinary conclusion follows on the heels of irreducible complexity: Darwinism cannot explain such systems. The reason, he says, is simple: An irreducibly complex system "cannot be produced directly . . . by slight, successive modifications of a precursor system, because any precursor to an irreducibly complex system that is missing a part is by definition nonfunctional." You cannot, in other words, gradually improve a mousetrap by adding one part and then the next.”
So – granting, for the moment, that evolution is totally incapable of explaining the blood clotting system – what does account for it?
Weirdly enough, for a scientific theory of such sweep, Behe seems pretty reticent to give us an account. But we already have examples of intelligent design, so let’s try to fill out the unspoken here, shall we?
One thing we’ve noticed is that Haber’s work left behind a lot of evidence. Surely, the design of the hominid blood clotting system, then, should also leave behind a lot of evidence. Haber’s work involved prototypes that failed, and borrowed from other chemists extensively. Oddly enough, this system of borrowing mimics evolution. That is, parts of a theory are selected, parts are thrown out, and so on. Designs are incrementally improved on. Sudden inspirations turn out to have absorbed millennia of techniques and science and theory. But evolution is a no-no for Behe – his designer only exists in blinding flashes of inspiration. On his own terms, however, that doesn’t make much sense. We would expect such a designer to pay no attention to blood clotting systems in other species. Funnily enough, the designer or designers of the hominid blood clotting system used many of the same chemicals that exist in the blood clotting system of, say, lobsters. In fact, perhaps they were influenced by their past 500 million years of organism design. If this is true, some form of evolution is needed. Intelligence itself demands that we surreptitiously bring it back in to explain speciation on the ID level.
Now to get down to the other requirements for designing this terribly complex system: we need a number of prototype hominid bodies; we need a lab or some facility designed to keep the hominid body alive as various amazing chemical grafts are performed on it; we need the tools to perform such grafts; and of course a body of literature, however assembled, with the formulas, etc., for the chemistry of the thing.
The great thing about all that complexity is that it strongly implies a great deal of relic leaving. You know how ID-ers are about the fossil record -- sticklers. So surely they have a counter record, a history of wonderful archaeological and paleontological discoveries. After all, the individual acts of speciation ID calls for number well over 100 million, and span more than 500 million years. We have strong reason to believe, in fact, that the designers must have used some material means, and thus left behind stray lab equipment and other detritus, because those five hundred million years show a distinct order. Remember, the smith in Egypt was constrained not only conceptually but materially from producing ammonia – as design gets more complex, it requires a larger material reserve. To create even a galvanized paper clip, you have to have the tools to make iron much more malleable than it could be made in 1900 B.C., you have to be able to alloy it, and you have to be able to deal with it at very high temperatures. And since analogy is what is giving us our ID idea in the first place, we can see, here, the reason that there are no 'Precambrian rabbits': there is a clear learning curve for the designers. This is so nice, since the order of creation is perhaps the strongest reason that almost all biologists of the past sixty years, since the great synthesis of neo-Darwinism – and most of them even before then – have been Darwinians of one type or another. Obviously overlooking those scalpels from the Jurassic period.
So what we are looking for is evidence of laboratories existing from around 500 million years ago to a mere 150,000 years ago. Speciation has occurred since then, but this gives us a pretty good target range. To take Behe’s example, it should be simple to dig in the jungles of Africa and find, oh, oxygen bottles and vats of chemicals dating back to 150,000 b.c. Maybe the lab walls have utterly decayed by now, but how about some thrown out hominid prototypes? Surely they must be cluttering the ground. It would be nice to find a few books from 65 million b.c., too, ones containing formulas for dinosaur species making. Perhaps, though, the designers, being supersmart by this time, had put everything on CDs. Still, production conditions being what they are, surely we can find a few tools, plastic tubing, things like that. After all, if you find a mousetrap, you can pretty much bet that somewhere there's a mousetrap factory, right?
Disappointingly, so far the evidence is null, nothing, on this five hundred million years of activity. Apparently the designers have been very big on the vacuum cleaner idea. Build your hominid prototype, put in your complex blood clotting system, get in, get out, clean up. Beautiful work.
Or perhaps in the last 500 million years we’ve been visited by 100 million spaceships, ferrying the products of the laboratories of Alpha Centauri down to earth. Spaceships do leave traces, too. As well as occasionally crashing. Surely Behe has a full record of spaceship archaelogical digs to refer to, if that is his theory?
No? Nothing? Not a single material evidence for any ID whatsoever, over 500 million years, spanning 100 million acts of species invention?
I have been playing, of course. Behe isn’t a real scientist when he writes about these things, but a sort of ideological Ken Doll for the Right. The logic he employs is not the logic of science. It is the logic of shizophrenic breakdown. In the same way that you can never prove to a schizophrenic that the tv is not talking to him (since the messages are encoded in such a way that only the schizo could hear him), ID is formed in such a way that its production of astonishingly no evidence whatsoever is counted as a great triumph.
However, Behe's book does show the lengths of fraudulence to which the great Rightwing machine will stoop in order to enforce a political order. That Behe's book was reviewed at all, and by respectable people, is sign of the times. After the review of his book in the Boston Review, Behe, like a con caught with his hand in somebody’s pocket, responded with some baldfaced effronteries about evolution that were truly funny. We especially liked two comments. In one, he gives us the grounds for falsifying ID. Does this have to do with finding evidence for labs in Africa, books in the Jurassic period, or the like? Nothing so vulgar. Here is what he says:
“One last charge must be met: Orr maintains that the theory of intelligent design is not falsifiable. He's wrong. To falsify design theory a scientist need only experimentally demonstrate that a bacterial flagellum, or any other comparably complex system, could arise by natural selection. If that happened I would conclude that neither flagella nor any system of similar or lesser complexity had to have been designed. In short, biochemical design would be neatly disproved.”
Surely he swallowed his bubble gum when he wrote that. This is an old con trick. A man writes a book that claims that the Martians built the pyramids. What would disprove the claim? If archaeologists found a signed blueprint of one of the pyramids from 1000 B.C. This is reasoning for gulls – which is why, in the age of Bush, it is so popular. Ruled by a gull, bred by gulls, and educated by gulls -- that's the U.S.A.
The other quote that is interesting from Behe reveals his basic charlatanism. He writes:
“Orr says we know mousetraps are designed because we have seen them being designed by humans, but we have not seen irreducibly complex biochemical systems being designed, so we can’t conclude they were. I discuss this on pp. 196-197. We apprehend design from the system itself, even if we don’t know who the designer is. For example, the SETI project (Search for Extraterrestrial Intelligence) scans space for radio waves that might have been sent by aliens. However, we have never observed aliens sending radio messages; we have never observed aliens at all. Nonetheless, SETI workers are confident, and I agree, that they can detect intelligently-produced phenomena, even if they don’t know who produced them.”
This is a patently false analogy. The SETI people claim that they would know a material transmission, sent by a material transmitter, obeying physical law, to be a intelligently-produced phenomenon. ID is claiming no such thing – in fact, it rigorously avoids claiming such things, because such things are check-able. Rather, the analogy should be to a group of people who claim to be receiving ESP from outer space. In the end, ID amounts to no more than such a claim. It isn’t science, but science is so, you know, materialistic.
We say to hell with that. Perhaps, we have sometimes thought, it really would be a good thing to teach ID in school, subjecting it to the rough and tumble of the scientific method, and thus induce a benign strain of skepticism in the population. Handling it with kid gloves not only encourages Intelligent Design, but coddles an unhealthy gullibility in the populace.
How would our class on ID look?
The first thing to do is to understand both intelligence and design. There is a distinction of degree between the type of intelligent design that remains at the level of technique – design that solely takes advantage of how materials work – and design that depends on a deeper level of understanding of why materials work.
To illustrate this, compare the discovery of wrought iron, which was purely a “how” discovery, with the discovery of ammonia fertilizer, which was largely a “why” discovery.
Richard Cowen has published a nice web book, for his course on geology at UC Davis, which has illuminating chapters on the bronze age, the iron age, and so on. According to Cowen, we have wrought iron work going back to 1400 B.C. Tutankhamen, who was buried around then, took with him into the underworld (besides a celebrated hoard of goldwork) an iron dagger.
Cowen points to the difficulty in working with iron:
“In principle, iron can be smelted from magnetite or hematite, which are comparatively common ores, but iron does not melt at the temperatures that are reached in a primitive furnace: iron is still solid when copper and bronze are molten. Let us suppose that some iron ore, say FeCO3, siderite or "ironstone", is loaded into a furnace along with malachite, and fired with charcoal. (Siderite is a grey-green mineral when it is fresh, though it weathers to a brown rusty iron oxide mineral at the surface. It often occurs with malachite, copper carbonate.) Inside the furnace, the first reaction of the siderite is like that in copper smelting: the carbonate breaks down to form an iron oxide:
FeCO3 = FeO + CO2
Then, as the charcoal burns to form hot carbon monoxide (CO), metallic iron is produced:
FeO + CO = Fe + CO2
The problem is that even when the reaction is complete, the iron that is produced is not liquid. The slag is liquid, but the iron is left unmelted, as a dense spongy mass of metal. There are always impurities that form a slag, and the iron will hold at least some of the slag in the small holes in its spongy texture. If the smelter is allowed to cool after the slag and the copper have been tapped off, then the iron would be left behind as an ugly solid mass that could easily be discarded as worthless.”
Notice that the very idea of carbon monoxide would be incomprehensible to our Egyptian ironsmith, who had no conceptual knowledge of the mathematics, or atomic theory, or physical laws involved in what he was doing. What he did know is that if he hammered on that “ugly solid mass” while it was still red hot, he could hammer out some stuff that was in it and shape the rest of it. Which is what he did. Intelligence, here, remains on the surface of the procedures for making the product.
Contrast this with the invention of ammonium nitrate fertilizer. In 1909, Fritz Haber knew, as a chemist, both the laws of physics and the fund of knowledge about molecules that had been accumulated during the 19th century. He worked in a lab. In this lab, he produced a reaction between nitrogen gas and hydrogen gas that yielded ammonia under moderate temperature and high pressure. To do his work, he used mathematical formulas. And to scale up the process that made ammonia, Carl Bosch invented the necessary machinery.
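For reference, the overall reaction Haber demonstrated (and Bosch later industrialized) can be written as follows; the temperature and pressure figures are the commonly cited industrial conditions, not numbers from the text above:

```latex
% Haber–Bosch ammonia synthesis: nitrogen and hydrogen combine
% reversibly over an iron catalyst. The stated conditions are the
% usual industrial figures, added here for orientation.
N_2 + 3\,H_2 \;\rightleftharpoons\; 2\,NH_3
\qquad (\approx 400\text{–}500\,^{\circ}\mathrm{C},\ 150\text{–}300\ \mathrm{atm},\ \text{Fe catalyst})
```

Note the contrast with the smelting reactions above: the smith could stumble on FeO + CO = Fe + CO2 by trial and error, but an equilibrium that must be forced with pressure and a catalyst presupposes the "why" knowledge of 19th-century chemistry.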
The Haber-Bosch process, as it is called, is a pre-eminent case of “why” intelligent design. It couldn’t have happened in ancient Egypt, since neither the physical nor the conceptual tools were present. Notice – the more complex the design, the more dependent the design is on a material order. All our smith needed was a primitive furnace able to amplify heat, a hammer, and a stone of some sort. He didn’t leave behind a laboratory, or formulas, or an institutional memory. Haber, on the other hand, needed a laboratory, libraries filled with books, and an institution that could provide physical space as well as highly specialized tools. Even if we had bombed Germany more thoroughly than we did in WWII, and had totally blotted out the street on which Haber worked, we could recover more evidence of laboratory work – enough to trace what Haber did, when he did it and how he did it – than we could ever recover about the beginnings of wrought-iron smithwork. If you like, this is the second law of thermodynamics applied to intelligent design: as the system makes order, it must release waste. In a broad sense, that is what the relics of scientific achievement are.
Now that we have a sense of what Intelligent Design is – a sense of the material order that must condition the increasing complexity of design, and a sense of what we mean by intelligence designing products – let’s turn to the ID argument. We will ignore the critique of evolution to concentrate on the positive claims of the school.
The best example of such claims comes from a molecular biologist, Michael Behe, whose book, Darwin’s Black Box, argues for the intelligent design of organisms. Behe uses complexity – just as we have. He claims that evolutionary theory (which he defines as small, incremental mutations acting on organisms) can’t explain intrinsically complex organic structures.
“By irreducibly complex I mean a single system composed of several well-matched, interacting parts that contribute to the basic function, wherein the removal of any one of the parts causes the system to effectively cease functioning. An irreducibly complex system cannot be produced directly (that is, by continuously improving the initial function, which continues to work by the same mechanism) by slight, successive modifications of a precursor system, because any precursor to an irreducibly complex system that is missing a part is by definition nonfunctional. An irreducibly complex biological system, if there is such a thing, would be a powerful challenge to Darwinian evolution.”
Helpfully, he uses an example. To quote from the Boston Review article about his book:
“One of Behe's goals is to show that irreducible complexity is not confined to the inanimate world: some biochemical systems are also irreducibly complex. Here he succeeds. Certain biochemical systems show exactly the properties Behe attributes to them. His description of the mind-boggling cascade of reactions that occurs during blood-clotting is particularly persuasive: thrombin activates accelerin, which, with Stuart factor, cleaves prothrombin; the resulting thrombin cleaves fibrinogen, making fibrin, etc. Knock out any of these innumerable steps and the animal either bleeds or clots to death.
To Behe, an extraordinary conclusion follows on the heels of irreducible complexity: Darwinism cannot explain such systems. The reason, he says, is simple: An irreducibly complex system "cannot be produced directly . . . by slight, successive modifications of a precursor system, because any precursor to an irreducibly complex system that is missing a part is by definition nonfunctional." You cannot, in other words, gradually improve a mousetrap by adding one part and then the next.”
So – granting, for the moment, that evolution is totally incapable of explaining the blood clotting system – what does account for it?
Weirdly enough, for a scientific theory of such sweep, Behe seems pretty reluctant to give us an account. But we already have examples of intelligent design, so let’s try to fill out the unspoken here, shall we?
One thing we’ve noticed is that Haber’s work left behind a lot of evidence. Surely, the design of the hominid blood clotting system, then, should also leave behind a lot of evidence. Haber’s work involved prototypes that failed, and it borrowed extensively from other chemists. Oddly enough, this system of borrowing mimics evolution. That is, parts of a theory are selected, parts are thrown out, and so on. Designs are incrementally improved upon. Sudden inspirations turn out to have absorbed millennia of techniques and science and theory. But evolution is a no-no for Behe – his designer exists only in blinding flashes of inspiration. On his own terms, however, that doesn't make much sense. We would expect such a designer to pay no attention to the blood clotting systems of other species. Funnily enough, the designer or designers of the hominid blood clotting system used many of the same chemicals that exist in the blood clotting system of, say, lobsters. In fact, perhaps they were influenced by their past 500 million years of organism design. If this is true, some form of evolution is needed. Intelligence itself demands that we surreptitiously bring evolution back in to explain speciation on the ID level.
Now to get down to the other requirements for designing this terribly complex system: we need a number of prototype hominid bodies; we need a lab or some facility designed to keep the hominid body alive as various amazing chemical grafts are performed on it; we need the tools to perform such grafts; and of course a body of literature, however assembled, with the formulas, etc., for the chemistry of the thing.
The great thing about all that complexity is that it strongly implies a great deal of relic leaving. You know how ID-ers are about the fossil record -- sticklers. So surely they have a counter record, a history of wonderful archaeological and paleontological discoveries. After all, the individual acts of speciation ID calls for number well over 100 million, and span more than 500 million years. We have strong reason to believe, in fact, that the designers must have used some material means, and thus left behind stray lab equipment and other detritus, because those five hundred million years show a distinct order. Remember, the smith in Egypt was constrained not only conceptually but materially from producing ammonium nitrate – as design gets more complex, it requires a larger material reserve. To create even a galvanized paper clip, you have to have the tools to make iron much more malleable than it could be made in 1900 B.C., you have to be able to alloy it, and you have to be able to deal with it at very high temperatures. And since analogy is what is giving us our ID idea in the first place, we can see, here, the reason that there are no "Precambrian rabbits". There's a clear learning curve for the designers. This is so nice, since the order of creation is perhaps the strongest reason that almost all biologists for the past sixty years, since the great synthesis of neo-Darwinism, and most of them even before then, have been Darwinians of one type or another. Obviously overlooking those scalpels from the Jurassic period.
So what we are looking for is evidence of laboratories existing from around 500 million years ago to a mere 150,000 years ago. Speciation has occurred since then, but this gives us a pretty good target range. To take Behe’s example, it should be simple to dig in the jungles of Africa and find, oh, oxygen bottles and vats of chemicals dating back to 150,000 B.C. Maybe the lab walls have utterly decayed by now, but how about some thrown-out hominid prototypes? Surely they must be cluttering the ground. It would be nice to find a few books from 65 million B.C., too, ones containing formulas for dinosaur species-making. Perhaps, though, the designers, being supersmart by this time, had put everything on CDs. Still, production conditions being what they are, surely we can find a few tools, plastic tubing, things like that. After all, if you find a mousetrap, you can pretty much bet that somewhere there's a mousetrap factory, right?
Disappointingly, so far the evidence is null, nothing, on this five hundred million years of activity. Apparently the designers have been very big on the vacuum cleaner idea. Build your hominid prototype, put in your complex blood clotting system, get in, get out, clean up. Beautiful work.
Or perhaps in the last 500 million years we’ve been visited by 100 million spaceships, ferrying the products of the laboratories of Alpha Centauri down to earth. Spaceships do leave traces, too. As well as occasionally crashing. Surely Behe has a full record of spaceship archaeological digs to refer to, if that is his theory?
No? Nothing? Not a single piece of material evidence for ID whatsoever, over 500 million years, spanning 100 million acts of species invention?
I have been playing, of course. Behe isn’t a real scientist when he writes about these things, but a sort of ideological Ken doll for the Right. The logic he employs is not the logic of science. It is the logic of schizophrenic breakdown. In the same way that you can never prove to a schizophrenic that the TV is not talking to him (since the messages are encoded in such a way that only he can hear them), ID is formed in such a way that its production of astonishingly no evidence whatsoever is counted as a great triumph.
However, Behe's book does show the lengths of fraudulence to which the great Rightwing machine will stoop in order to enforce a political order. That Behe's book was reviewed at all, and by respectable people, is a sign of the times. After the review of his book in the Boston Review, Behe, like a con caught with his hand in somebody’s pocket, responded with some bald-faced effronteries about evolution that were truly funny. We especially liked two comments. In one, he gives us the grounds for falsifying ID. Does this have to do with finding evidence for labs in Africa, books in the Jurassic period, or the like? Nothing so vulgar. Here is what he says:
“One last charge must be met: Orr maintains that the theory of intelligent design is not falsifiable. He's wrong. To falsify design theory a scientist need only experimentally demonstrate that a bacterial flagellum, or any other comparably complex system, could arise by natural selection. If that happened I would conclude that neither flagella nor any system of similar or lesser complexity had to have been designed. In short, biochemical design would be neatly disproved.”
Surely he swallowed his bubble gum when he wrote that. This is an old con trick. A man writes a book that claims that the Martians built the pyramids. What would disprove the claim? If archaeologists found a signed blueprint of one of the pyramids from 1000 B.C. This is reasoning for gulls – which is why, in the age of Bush, it is so popular. Ruled by a gull, bred by gulls, and educated by gulls -- that's the U.S.A.
The other quote that is interesting from Behe reveals his basic charlatanism. He writes:
“Orr says we know mousetraps are designed because we have seen them being designed by humans, but we have not seen irreducibly complex biochemical systems being designed, so we can’t conclude they were. I discuss this on pp. 196-197. We apprehend design from the system itself, even if we don’t know who the designer is. For example, the SETI project (Search for Extraterrestrial Intelligence) scans space for radio waves that might have been sent by aliens. However, we have never observed aliens sending radio messages; we have never observed aliens at all. Nonetheless, SETI workers are confident, and I agree, that they can detect intelligently-produced phenomena, even if they don’t know who produced them.”
This is a patently false analogy. The SETI people claim that they would know a material transmission, sent by a material transmitter, obeying physical law, to be an intelligently produced phenomenon. ID is claiming no such thing – in fact, it rigorously avoids claiming such things, because such things are checkable. Rather, the analogy should be to a group of people who claim to be receiving ESP from outer space. In the end, ID amounts to no more than such a claim. It isn’t science, but science is so, you know, materialistic.
Sunday, January 16, 2005
The election, part 1
Last year, in February, we made a spotty analysis of Iraq’s situation and why Sistani’s call for an election was important. Surveyors of property consult previous plats to orient their plumb lines; purveyors of opinion should follow the same procedure. This, then, is what we wrote back then:
“February 4, 2004:
Summarizing the LI position, it would go something like this: Bush’s argument for war disguised an all too familiar American imperial adventure. As in Latin America, the administration was trying to take out a hostile dictator and replace him with a compliant puppet, under whose benevolent gaze the U.S. could spread its fine mesh of corporate interest, engulfing the resources and wealth of a conquered protectorate.
What Iraq demonstrated is that intervention on this scale, and at this distance, is not going to happen. The Empire has limits. More, the unintended consequence of the intervention was the removal of a truly horrendous regime, and the opening to an at least tentatively democratic one. Good news.
This happened as the result of two happy accidents. The first accident was the sheer incompetence and unpreparedness of the Americans in advancing towards their goal. The idea of stuffing a swindler like Chalabi down the throat of the population was quickly abandoned as impractical. The ‘liberated’ population didn’t follow the script. The looting destroyed vital infrastructure, while the infrastructure itself, after eleven years of sanctions, was incredibly decayed. Misstep after misstep was made by the imperialists, who were most successful, apparently, at building concrete berms to keep out the dangerous wogs.
Meanwhile, happy accident number two was happening. The resistance turned out to be dogged and disruptive. Like the Bush administration, the resisters were guided by a bad intention – a pure power grab – and a much worse history, that of mass murderers. They squared off against the occupiers, and as they did so, they relieved the Iraqi population of the consequences that would have ensued from a successful Bush plan – puppet status, nationwide respectable looting to the advantage of corporations and exiles. This more subtle looting, it turns out, has been forced to prey only on the American taxpayer, who is pumping money on the grand scale into keeping Cheney's retirement benefits very, very real.
The tide turned, we think, with the capture of Saddam H. This capture, in one blow, operated against the Americans and the resistance. The utter bankruptcy of the resistance, and its futility, was finally and conclusively exposed, on the one hand. On the other hand, the last excuse not to resist the Americans was blown away. The Iraqi masses could now operate without fearing the return of Saddam. And their first action was to counter the occupation.
This is why we think the elections Sistani wants are so important. Both the Bushies and the liberals are opposed to them, because they both share a managerial ideology. They both talk about democracy, but they want it organized to the point where their side retains power.
Well, we’d love to see secular democratic socialists retain or return to power in Iraq, but we believe process can't be separated from content; that the top-down implementation of a secular state breeds top-down governance, usually by the military. If you think that insulating a progressive group against real politics works, look around you in the world. It is a fatal and stupid thing to do. It creates a malignant alliance between progressives in the country and their sponsors outside the country. This, in turn, attenuates the rooting of the progressive wing within the country until it represents, to the people at large, one more aspect of a colonialist ethos.
The consequence of a direct election might well be a triumph for a reactionary, theocratic party. But we think that if that party is going to triumph, it is going to triumph no matter how much the NGOs think they can manage the country into their various versions of liberal democracy. Far better to strengthen the parties that oppose theocracy within the country from the beginning, far better to take up the election challenge and have them begin to understand the mechanism of electoral politics, than to try to manage a detour around "petty politics". Which is why we are rather disappointed that people who truly do want to see the triumph of a secular state that measures its surrenders to neo-liberalism against an ideal of social welfare are locked into the scared mode. Sure, Iraq teeters on a blood bath of factional struggle – but, as nobody seems to remember, the Kurds went through the same struggle in the 90s, and seem not only to have survived it, but to have become much more secular, democratic, etc., etc. Not that we think the two Kurdish warlord parties are the last word in secularism... however, the opportunity exists, there. Given that the Americans are blindly working towards freeing Iraq of debt and repairing the infrastructure, whoever wins the elections will have a better position than Iraq has had since 1979.
This isn't to underestimate the body count. Actually, it is hard to even estimate the body count in this country -- nobody counts it. However, the alternative body count was worse -- the attrition from sanctions, the hopelessness of Saddam, the blighting of all promise.
Of course, we are probably wrong about much of this, re the real situation in Iraq.”
Now the election is upon us, in two weeks. LI’s post showed a peculiar blindness to the fact that an accidental outcome does not automatically erase the force that brought it about. In fact, that force might refuse to recognize it. So it has proved with the resistance and the Americans. The Americans, who were opposed to elections last February, finally conceded, due to a combination of the insurgency among the Sunni and Sadr’s uprising among the Shi’ites. However, the American concession was not such as to leave either the mechanism of the election or the leadership of Iraq to the Iraqis. A number of decisions were made with the intention of maximizing American influence. These included a large provision for the votes of Iraqi exiles – many of whom live in the U.S. and have as much stake in Iraq as the American descendants of Irish immigrants have in the fate of Ireland -- and the generation of a complex national procedure that was thought, at the time, to guarantee the strong presence of America’s most faithful allies in the country, the Kurds.
On the other hand, the resistance has grown stronger in its reach, and more conscious of its lack of strength. Thus, the resistance has blindly and naturally pursued a strategy that would aggrandize its power. For the resistance to have a chance at gaining nationwide footing, the occupation must continue. Of course, with the destruction of the Iraqi army under Bremer, one could make the case that the immediate evacuation of the Americans would be good for the only organized armed force in Iraq – but we think that severely underestimates the strength of various militias associated with the major Shi’ite factions.
The American dilemma in Iraq is this: after committing the war crime of destroying Fallujah, the Americans have pretty much short-circuited any possibility of alliance-making among Iraq’s Sunnis. This lands the Americans in a contradictory position vis-à-vis their global strategy in the region, which rests – and will continue to rest – on alliances with the most fundamentalist Sunni powers in the region – Saudi Arabia and Pakistan. This is a real squeeze. While the U.S. has been able to finesse its alliance with Israel with the Saudis, repressing Iraq’s Sunnis on the Fallujah scale will, eventually, make very expensive trouble with the Saudis.
So, was support for the election a delusion? Is the election terminally bad?
Delusion, to LI, is the idea that some generalization will absorb the particulars of Iraq’s current situation. War is all about the upending of generalizations – and the generals who make them. When Sistani called for an election, back in 2003, he was absolutely right. Perhaps LI should have been more pessimistic about the material process that would result in an election under occupation. In the end, the election Sistani got was badly structured, while the use of American forces as an instrument to strike against the Sunnis has so distorted the possible election result that the government we can expect after January will certainly not have the legitimacy that, even given the flawed process, it could have had – and for this, certainly, the Shi’ite leadership is partly to blame. While Allawi’s gamble in allowing the decimation of Fallujah is understandable, though sick (how else was he to stake out political ground?), Sistani and Sadr’s silence was a huge mistake.
These are the vicissitudes of election under occupation. However, we still believe in the election route to the goal of freeing Iraq from the occupation and maintaining, for the moment, its national integrity. This, in spite of the farce of the ballot and the voting process, as described in the Washington Post:
“With the elections to pick a 275-member National Assembly just two weeks away, details are emerging about the Iraq campaign and balloting…. Many of the worries are shared by specialists from international, nongovernment organizations who are in Iraq assisting in preparation for the voting.
Election organizers are still wrestling with questions of how to publicly list names of all candidates, and with difficult details of how the votes will be counted and reported, according to telephone interviews with specialists last week. They said they face the uncomfortable prospect that it is likely to be two weeks before results are known, and the complicating possibility that the declared winners will be challenged afterward under the election rules.
Voters on Jan. 30 will get at least two ballots, one for the national assembly and one for a governorate legislature, equivalent to a state legislature, they said. The national ballot will have a line with the name, number and symbol, if there is one, for each of 111 slates of candidates. But the names of the individual candidates that make up each slate will not be on the ballot, the specialists said.
Because of the danger, the slates, even those put forward by the major parties, are not releasing the names of all their candidates.”
All of these factors favor Allawi. We wouldn’t be surprised to see Allawi stay in power after the election, in spite of the fact that he is certainly not the most popular political figure in Iraq. The best we can expect, in that scenario, is that Allawi will have to really deal with those voices that want Iraq to re-assert its sovereignty – which he has ignored, since June, given the support of the Americans. If Allawi continues to act as he has since June, Sistani will either have to move more aggressively against him, or watch the popularity of his umbrella group crumble, to the advantage, most likely, of Sadr.
Looking backwards -- at the history of the war so far -- let's sum it up like this:
One could argue for or against the American occupation in 2003. At that time, the anti-war position was an abstract matter of justice (the protest against the U.S. becoming a huge pirate ship), and a matter, at least for LI, of the hurt done to American interests by the crazy diversion of the war in the first place. However, the reality in Iraq was that the nation was no worse off than under the sanctions. And, as we have emphasized often, with the end of the horrendous Hussein regime (which, as we have also emphasized, would have been a good thing even if it was the bubonic plague which had carried off S.H. -- but, in the latter case, it wouldn't have made the bubonic plague a good thing. Such is the complexity of ethics, and such is the judgement we'd apply to the 'goodness' of the American action). But starting in around November, 2003, we’d say, the occupation has become an active evil in Iraq. The longer the American troops are there, the worse off the Iraqis will be.
Last year, in February, we made a spotty analysis of Iraq’s situation and why Sistani’s call for an election was important. Surveyors of property consult previous plats to orient their plumb lines; purveyors of opinion should follow the same procedure. This, then, is what we wrote back then:
“February 4, 2004:
Summarizing the LI position, it would go something like this: Bush’s argument for war disguised an all too familiar American imperial adventure. As in Latin America, the administration was trying to take out a hostile dictator and replace him with a compliant puppet, under whose benevolent gaze the U.S. could spread its fine mesh of corporate interest, engulfing the resources and wealth of a conquered protectorate.
What Iraq demonstrated is that intervention on this scale, and at this distance, is not going to happen. The Empire has limits. More, the unintended consequence of the intervention was the removal of a truly horrendous regime, and the opening to an at least tentatively democratic one. Good news.
This happened as the result of two happy accidents. The first accident was the sheer incompetence and unpreparedness of the Americans in advancing towards their goal. The idea of stuffing a swindler like Chalabi down the throat of the population was quickly abandoned as impractical. The ‘liberated’ population didn’t follow the script. The looting destroyed vital infrastructure, while the infrastructure itself, after eleven years of sanctions, was incredibly decayed. Misstep after misstep was made by the imperialists, who were most successful, apparently, at building concrete berms to keep out the dangerous wogs.
Meanwhile, happy accident number two was happening. The resistance turned out to be dogged and disruptive. Like the Bush administration, the resistors were guided by a bad intention – a pure power grab – and a much worse history, that of mass murderers. They squared off against the occupiers, and as they did so, they relieved the Iraqi population from the consequences that would have ensued from a successful Bush plan – puppet status, nationwide respectable looting to the advantage of corporations and exiles. This more subtle looting, it turns out, has been forced to prey only on the American taxpayer, who is pumping money on the grand scale into keeping Cheney's retirement benefits very, very real.
The tide turned, we think, with the capture of Saddam H. This capture, in one blow, operated against the Americans and the resistance. The utter bankruptcy of the resistance, and its futility, was finally and conclusively exposed, on the one hand. On the other hand, the last excuse not to resist the Americans was blown away. The Iraqi masses could now operate without fearing the return of Saddam. And their first action was to counter the occupation.
This is why we think the elections Sistani wants are so important. Both the Bushies and the liberals are opposed to them, because they both share a managerial ideology. They both talk about democracy, but they want it organized to the point where their side retains power.
Well, we’d love to see secular democratic socialists retain or return to power in Iraq, but we believe process can't be separated from content; that top down implementation of a secular state evolves top down governance, usually by the military. If you think that insulating a progressive group against real politics works, look around you in the world. It is a fatal and stupid thing to do. It creates a malignant alliance between progressives in the country and their sponsors out of the country. This, in turn, attenuates the rooting of the progressive wing within the country until it represents, to the people at large, one more aspect of a colonialist ethos.
The consequence of a direct election might well be a triumph for a reactionary, theocratic party. But we think that if that party is going to triumph, it is going to triumph no matter how much the NGOs think they can manage the country into their various versions of liberal democracy. Far better to strengthen the parties that oppose theocracy within the country from the beginning, far better to take up the election challenge, have them begin to understand the mechanism of electoral politics, than to try to manage a detour around "petty politics". Which is why we are rather disappointed that people who truly do want to see the triumph of a secular state that measures its surrenders to neo-liberalism against an ideal of social welfare are locked into the scared mode. Sure, Iraq teeters on a blood bath of factional struggle – but, as nobody seems to remember, the Kurds went through the same struggle in the 90s, and seem to have not only survived it, but become much more secular, democratic, etc., etc. Not that we think the two Kurdish warlord parties are the last word in secularism .. however, the opportunity exists, there. Given that the Americans are blindly working towards freeing Iraq of debt and repairing the infrastructure, whoever wins the elections will have a better position than Iraq has had since 1979.
This isn't to underestimate the body count. Actually, it is hard to even estimate the body count in this country -- nobody counts it. However, the alternative body count was worse -- the attrition from sanctions, the hopelessness of Saddam, the blighting of all promise.
Of course, we are probably wrong about much of this, re the real situation in Iraq.”
Now the election is upon us, in two weeks. LI’s post showed a peculiar blindness to the fact that an accidental outcome does not automatically erase the force that brought it about. In fact, that force might refuse to recognize it. So it has proved with the resistance and the Americans. The Americans, who were opposed to elections last February, finally conceded, due to a combination of the insurgency among the Sunni and Sadr’s uprising among the Shi’ites. However, the American concession was not such as to leave either the mechanism of the election or the leadership of Iraq to the Iraqis. A number of decisions were made with the intention of maximizing American influence. These included a large provision for the votes of Iraqi exiles – many of whom live in the U.S. and have as much stake in Iraq as the American descendants of Irish immigrants have in the fate of Ireland -- and the generation of a complex national procedure that was thought, at the time, to guarantee the strong presence of America’s most faithful allies in the country, the Kurds.
On the other hand, the resistance has grown stronger in its reach, and more conscious of its lack of strength. Thus, the resistance has blindly and naturally pursued a strategy that would aggrandize its power. For the resistance to have a chance at gaining nationwide footing, the occupation must continue. Of course, with the destruction of the Iraqi army under Bremer, one could make the case that the immediate evacuation of the Americans would be good for the only organized armed force in Iraq – but we think that severely underestimates the strength of various militias associated with the major Shi’ite factions.
The American dilemma in Iraq is this: after committing the war crime of destroying Fallujah, the Americans have pretty much short circuited any possibility of alliance making among Iraq’s Sunnis. This lands the Americans in a contradictory position vis-à-vis their global strategy in the region, which rests – and will continue to rest – on alliances with the most fundamentalist Sunni powers in the region – Saudi Arabia and Pakistan. This is a real squeeze. While able to finesse the alliance with Israel with the Saudis, repressing Iraq’s Sunnis on the Fallujah scale will, eventually, make very expensive trouble with the Saudis.
So, was support for the election a delusion? Is the election terminally bad?
Delusion, to LI, is the idea that some generalization will absorb the particulars of Iraq’s current situation. War is all about the upending of generalizations – and the generals who make them. When Sistani called for an election, back in 2003, he was absolutely right. Perhaps LI should have been more pessimistic about the material process that would result in an election under occupation. In the end, the election Sistani got was badly structured, while the use of American forces as an instrument to strike against the Sunnis has so distorted the possible election result, that the government we can expect after January will certainly not have the legitimacy that, even given the flawed process, it could have had – and for this, certainly, the Shi’ite leadership is partly to blame. While Allawi’s gamble in allowing the decimation of Fallujah is understandable, though sick (how else was he to stake out political ground?), Sistani and Sadr’s silence was a huge mistake.
These are the vicissitudes of election under occupation. However, we still believe in the election route to the goal of freeing Iraq from the occupation and maintaining, for the moment, its national integrity. This, in spite of the farce of the ballot and the voting process, as described in the Washington Post:
“With the elections to pick a 275-member National Assembly just two weeks away, details are emerging about the Iraq campaign and balloting…. Many of the worries are shared by specialists from international, nongovernment organizations who are in Iraq assisting in preparation for the voting.
Election organizers are still wrestling with questions of how to publicly list names of all candidates, and with difficult details of how the votes will be counted and reported, according to telephone interviews with specialists last week. They said they face the uncomfortable prospect that it is likely to be two weeks before results are known, and the complicating possibility that the declared winners will be challenged afterward under the election rules.
Voters on Jan. 30 will get at least two ballots, one for the national assembly and one for a governorate legislature, equivalent to a state legislature, they said. The national ballot will have a line with the name, number and symbol, if there is one, for each of 111 slates of candidates. But the names of the individual candidates that make up each slate will not be on the ballot, the specialists said.
Because of the danger, the slates, even those put forward by the major parties, are not releasing the names of all their candidates.”
All of these factors favor Allawi. We wouldn’t be surprised to see Allawi stay in power after the election, in spite of the fact that he is certainly not the most popular political figure in Iraq. The best we can expect, in that scenario, is that Allawi will have to really deal with those voices that want Iraq to re-assert its sovereignty – which he has ignored, since June, given the support of the Americans. If Allawi continues to act as he has since June, Sistani will either have to move more aggressively against him, or watch the popularity of his umbrella group crumble, to the advantage, most likely, of Sadr.
Looking backwards -- at the history of the war so far -- let's sum it up like this:
One could argue for or against the American occupation in 2003. At that time, the anti-war position was an abstract matter of justice (the protest against the U.S. becoming a huge pirate ship), and a matter, at least for LI, of the hurt done to American interests by the crazy diversion of the war in the first place. However, the reality in Iraq was that the nation was no worse off than under the sanctions. And, as we have emphasized often, with the end of the horrendous Hussein regime (which, as we have also emphasized, would have been a good thing even if it was the bubonic plague which had carried off S.H. -- but, in the latter case, it wouldn't have made the bubonic plague a good thing. Such is the complexity of ethics, and such is the judgement we'd apply to the 'goodness' of the American action). But starting in around November, 2003, we’d say, the occupation has become an active evil in Iraq. The longer the American troops are there, the worse off the Iraqis will be.
This week, I listened to a little debate between a Court Liberal – he’d been freeze-dried on the Washington Post op-ed page in the Clinton years, and is regularly perked up to make concessions and hew to a “centrist” position on NPR – and a conservative. The NPR commentator sometimes came in with incisive questions, such as: do you want cream with your coffee?
Hey, I’m exaggerating!
Anyway, the debate was about Social Security. The Court liberal didn’t say anything like, the President is lying through his ass. Such remarks get you kicked out of the Court. He did point out that the idea of Social Security going bankrupt in 2040 was at some, ahem, variance, ahem, from the real state of affairs. Then, having the eagerness to compromise that D.C. induces in establishment types, he muttered that the real reform should be done on Medicare. Real reform doesn’t mean – finding ways, in the wealthiest nation in history, to make sure that there is a minimum level of health care for every citizen, guaranteed by the state. No, it means cutting into those nasty entitlements, and making health care that much harder to afford among the populace.
The conservative then remarked that Bush had a chance to really appeal to young people, who can see just what the prez means. They are eager to daytrade their cool little privatized accounts, unlike the braindead Old.
The NPR guy asked, do you want sugar with your coffee?
And that was the end of that. I did have to laugh. This idea of “young people” going out, gung-ho, for eviscerating Social Security so obviously fails the “unit of analysis” test that it could only pass muster in the U.S. press.
While it is true that FICA is paid individually, the benefits do not accrue individually, as even the most casual survey of American households would show. Some scientists have postulated the radical theory that young people come from other human beings. Many of them, it is thought, were born. Moreover, they not only have mothers, sometimes they have fathers too. Furthermore, and sadly, these young people are so ill versed in economic rationality that if Mom and Dad are rotting out there on the sidewalk, due to unavoidable cuts in those retirement entitlements, some of the young people might even cut them a check, or even (horrors!) allow them board in the house! Yes, it is called the family structure.
Now, of course, family in the Bush vocabulary and on NPR exists only in terms of something called “family values,” which can be defined as an allergic reaction to the sight of Janet Jackson’s tits that causes bleeding of the gums, a disbelief in evolution, and voting against same sex marriage. But (due to the perversity of humankind), families also exist as economic units. This, of course, goes against the hardy Bush Economic theory, which sees human beings sort of as individual mountain climbers, all without connections to each other, scrambling up the slopes. The highest, of course, must be the best climbers. By coincidence, some of the highest were born to the best climbers. Must be genetic.
In reality, however, individuals don’t act in the economic system like disconnected mountain climbers, but like connected ones, bound by a complicated series of ropes with all kinds of other climbers. Somehow, I don’t see young people as rejoicing at the thought of being in close proximity to their parents during their parents’ golden retirement years. In spite of the fact that they will be daytrading their FICA money like mad, watching it climb climb climb on the next tech bubble, which is what we know will surely happen.
I was reminded of William Hazlitt’s excellent essay on Bentham. Bentham very much saw human beings as individual climbers. As Hazlitt points out, Bentham’s lack of compromise did give him rather an air of grandeur, rather like Milton. Here’s Hazlitt’s charming passage:
“When any one calls upon him, he invites them to take a turn round his garden with him (Mr. Bentham is an economist of his time, and sets apart this portion of it to air and exercise)—and there you may see the lively old man, his mind still buoyant with thought and with the prospect of futurity, in eager conversation with some Opposition Member, some expatriated Patriot, or Transatlantic Adventurer, urging the extinction of Close Boroughs, or planning a code of laws for some “lone island in the watery waste,” his walk almost amounting to a run, his tongue keeping pace with it in shrill, cluttering accents, negligent of his person, his dress, and his manner, intent only on his grand theme of UTILITY—or pausing, perhaps, for want of breath and with lack-lustre eye to point out to the stranger a stone in the wall at the end of his garden (overarched by two beautiful cotton-trees) Inscribed to the Prince of Poets, which marks the house where Milton formerly lived. To shew how little the refinements of taste or fancy enter into our author's system, he proposed at one time to cut down these beautiful trees, to convert the garden where he had breathed the air of Truth and Heaven for near half a century into a paltry Chreistomathic School, and to make Milton's house (the cradle of Paradise Lost) a thoroughfare, like a three-stalled stable, for the idle rabble of Westminster to pass backwards and forwards to it with their cloven hoofs. Let us not, however, be getting on too fast—Milton himself taught school! There is something not altogether dissimilar between Mr. Bentham's appearance, and the portraits of Milton, the same silvery tone, a few dishevelled hairs, a peevish, yet puritanical expression, an irritable temperament corrected by habit and discipline.”
Hazlitt had an unformed and unsystematic objection to Bentham’s system:
“Every pleasure, says Mr. Bentham , is equally a good, and is to be taken into the account as such in a moral estimate, whether it be the pleasure of sense or of conscience, whether it arise from the exercise of virtue or the perpetration of crime. We are afraid the human mind does not readily come into this doctrine, this ultima ratio philosophorum, interpreted according to the letter. Our moral sentiments are made up of sympathies and antipathies, of sense and imagination, of understanding and prejudice. The soul, by reason of its weakness, is an aggregating and an exclusive principle; it clings obstinately to some things, and violently rejects others. And it must do so, in a great measure, or it would act contrary to its own nature. It needs helps and stages in its progress, and “all appliances and means to boot,” which can raise it to a partial conformity to truth and good (the utmost it is capable of) and bring it into a tolerable harmony with the universe. By aiming at too much, by dismissing collateral aids, by extending itself to the farthest verge of the conceivable and possible, it loses its elasticity and vigour, its impulse and its direction. The moralist can no more do without the intermediate use of rules and principles, without the 'vantage ground of habit, without the levers of the understanding, than the mechanist can discard the use of wheels and pulleys, and perform every thing by simple motion.”
Hazlitt was on the losing end of the quantifying argument, whose pale descendants are just those atomically loosed young people frolicking about the new National Lottery/Social Security office, buying tickets. We think, however, that Hazlitt’s premonition that the stoniness at the heart of Bentham’s pleasure would bring something bad into the world turned out to be true.
But give Bentham credit for honesty: he would be shocked and appalled at the conjunction of the rhetoric of an unctuous Christian familialism and the parallel attempt to openly shed the economic ties of family.
Friday, January 14, 2005
In our post yesterday, we hung the blame for the collapse of poetry, which is surely one of the salient features of our time, on academia. This is way too easy. Perhaps the blame should be fixed, rather, on the end of walking.
Most adult Americans do not notice the landscape in terms of walking. But those of us who don’t own cars (LI is of that miserable number) have a keen sense of the difficulties thrown up by roads. Absurdly, a system that theoretically shunts people from one place to another at speeds that were impossible before the twentieth century also creates a prison. This prison, like all prisons, simply by containing certain spaces renders them unfit for human habitation. It erects areas the passage of which is forbidden on pain of death. The walker is hemmed into certain areas and certain routes, not because these routes are naturally difficult – mountains and jungles and such – but because they are humanly convenient – concrete, asphalt, and lots of metal hurtling about at bonecrunching speeds.
Ben Jacks, in the Spring, 2004 issue of the Journal of Architectural Education, penned a brief for walking: “Re-imagining Walking: four practices.” Re-imagining might be a portentous word for what, to LI, is simply getting to the grocery store without a bicycle. Before we re-imagine walking, we might want to imagine not-walking. We all know the beneficial consequences of being On the Road. Freedom, for one. The concrete embodiment of the Bill of Rights is getting in your car and traveling two thousand miles, alone. The recipe here depends, crucially, on having the right selection of CDs, mixed with a certain random selection of radio stations along the route. At no point is listening to news or talk radio allowed – although Gospel is. We have done this – we do drive. We like driving.
But the death of the walker’s landscapes, obesity, and the withering away of poetry – these , too, might be aspects of the hegemonic transportation grid that we’ve tattooed on the hide of the continent.
Jacks’s essay mentions Francesco Careri, an Italian situationist whose stalker’s manifesto is here.
Here’s a sample graf:
“Perceiving the discarded territories, in completing such a route, between that which is secure, quotidian, and that which is uncertain, generates a sense of dislocation, a state of apprehension. This altered state induces a perceptual intensification unexpectedly giving the space a meaning, making "everywhere" a place for discovery, or instead a dreaded place for an undesirable encounter. The gaze becomes penetrating, the ear becomes keen to every sound.”
We’ve recently been around an infant, a little boy. A friend’s kid. The boy showed, from the first, a desire to stand like we’ve never seen in a baby before. He learned to walk early, and does well at it. He likes to stumble through a room, he likes wandering after his Mom, he likes being given a mission – getting his shoes, for instance. Although he fastens on any shoes he finds in his path. Walking is obviously part of a very intense, sensual experience, inseparable, in infancy, from the explosions in the neural pathways, the REM sleep, the marvelous mineral of the tooth, etc., etc. Yet we know that, in all probability, by the time this boy is forty, the walking will be gone. That is, the bliss of it, or the utility of it.
For LI’s money, the best modern walker-artist is Iain Sinclair, the man who walked around the London Orbital. He invented a phrase for how he works: psychogeography. Although Sinclair doesn’t make the connection himself (that I’m aware of), he is the latest in a fugue tradition that Deleuze identified in Mille Plateaux (the “schizo out for a walk” section) and that Ian Hacking studied as Mad Travelers. Hacking’s book has been reissued as a Harvard U. paperback. This is from the UVa Press site, which originally published it:
"It all began one morning last July when we noticed a young man of twenty-six crying in his bed in Dr. Pitre's ward. He had just come from a long journey on foot and was exhausted, but that was not the cause of his tears. He wept because he could not prevent himself from departing on a trip when the need took him; he deserted family, work, and daily life to walk as fast as he could, straight ahead, sometimes doing 70 kilometers a day on foot, until in the end he would be arrested for vagrancy and thrown in prison.
Thus begins the recorded case history of Albert Dadas, a native of France's Bordeaux region and the first diagnosed mad traveler, or fuguer. An occasional employee of a local gas company, Dadas suffered from a strange compulsion that led him to travel obsessively, often without identification, not knowing who he was or why he traveled. He became notorious for his extraordinary expeditions to such far-reaching spots as Algeria, Moscow, and Constantinople. Medical reports of Dadas set off at the time of a small epidemic of compulsive mad voyagers, the epicenter of which was Bordeaux, but which soon spread throughout France to Italy, Germany, and Russia.”
Hacking's book is becoming one of those philosophic texts that artists digest in their own bizarre ways -- like Deleuze's work.
Most adult Americans do not notice the landscape in terms of walking. But those of us who don’t own cars (LI is of that miserable number) have a keen sense of the difficulties thrown up by roads. Absurdly, a system that theoretically shunts people from one place to another at speeds that were impossible before the twentieth century also creates a prison. This prison, like all prisons, simply by containing certain spaces renders them unfit for human habitation. It erects areas the passage of which is forbidden on pain of death. The walker is hemmed into certain areas and certain routes, not because these routes are naturally difficult – mountains and jungles and such – but because they are humanly convenient – concrete, asphalt, and lots of metal hurtling about at bonecrunching speeds.
Ben Jacks, in the Spring, 2004 issue of the Journal of Architectural Education, penned a brief for walking: “Re-imagining Walking: Four Practices.” Re-imagining might be a portentous word for what, to LI, is simply getting to the grocery store without a bicycle. Before we re-imagine walking, we might want to imagine not-walking. We all know the beneficial consequences of being On the Road. Freedom, for one. The concrete embodiment of the Bill of Rights is getting in your car and traveling two thousand miles, alone. The recipe here depends, crucially, on having the right selection of CDs, mixed with a certain random selection of radio stations along the route. At no point is listening to news or talk radio allowed – although Gospel is. We have done this – we do drive. We like driving.
But the death of the walker’s landscapes, obesity, and the withering away of poetry – these, too, might be aspects of the hegemonic transportation grid that we’ve tattooed on the hide of the continent.
Jacks’s essay mentions Francesco Careri, an Italian situationist whose stalker’s manifesto is here.
Here’s a sample graf:
“Perceiving the discarded territories, in completing such a route, between that which is secure, quotidian, and that which is uncertain, generates a sense of dislocation, a state of apprehension. This altered state induces a perceptual intensification unexpectedly giving the space a meaning, making "everywhere" a place for discovery, or instead a dreaded place for an undesirable encounter. The gaze becomes penetrating, the ear becomes keen to every sound.”
We’ve recently been around an infant, a little boy. A friend’s kid. The boy showed, from the first, a desire to stand like we’ve never seen in a baby before. He learned to walk early, and does well at it. He likes to stumble through a room, he likes wandering after his Mom, he likes being given a mission – getting his shoes, for instance. Although he fastens on any shoes he finds in his path. Walking is obviously part of a very intense, sensual experience, inseparable, in infancy, from the explosions in the neural pathways, the REM sleep, the marvelous mineral of the tooth, etc., etc. Yet we know that, in all probability, by the time this boy is forty, the walking will be gone. That is, the bliss of it, or the utility of it.
For LI’s money, the best modern walker-artist is Iain Sinclair, the man who walked around the London Orbital. He invented a phrase for how he works: psychogeography. Although Sinclair doesn’t make the connection himself (that I’m aware of), he is the latest in a fugue tradition that Deleuze identified in Mille Plateaux (the “schizo out for a walk” section) and that Ian Hacking studied as Mad Travelers. Hacking’s book (Mad Travelers) has been reissued as a Harvard U. paperback. This is from the UVa Press site, which originally published it:
"It all began one morning last July when we noticed a young man of twenty-six crying in his bed in Dr. Pitre's ward. He had just come from a long journey on foot and was exhausted, but that was not the cause of his tears. He wept because he could not prevent himself from departing on a trip when the need took him; he deserted family, work, and daily life to walk as fast as he could, straight ahead, sometimes doing 70 kilometers a day on foot, until in the end he would be arrested for vagrancy and thrown in prison.
Thus begins the recorded case history of Albert Dadas, a native of France's Bordeaux region and the first diagnosed mad traveler, or fuguer. An occasional employee of a local gas company, Dadas suffered from a strange compulsion that led him to travel obsessively, often without identification, not knowing who he was or why he traveled. He became notorious for his extraordinary expeditions to such far-reaching spots as Algeria, Moscow, and Constantinople. Medical reports of Dadas set off at the time of a small epidemic of compulsive mad voyagers, the epicenter of which was Bordeaux, but which soon spread throughout France to Italy, Germany, and Russia.”
Hacking's book is becoming one of those philosophic texts that artists digest in their own bizarre ways -- like Deleuze's work.
Thursday, January 13, 2005
On New Year’s Day, LI had dinner with a group of very literate folks in Mexico City. One of them, our friend L., was talking about poetry – we were all trying to think of appropriate poems for New Year’s Day – and she mentioned that she had considered, at one point, translating Dylan Thomas into Spanish. But then she learned (she sadly said) that critics say that Thomas is a bad poet.
I know that feeling: the fear of having bad taste, of some soft spot in one’s intellectual armor. Taste, one imagines, is corrected by the larger experience. There are critics I admire who have condemned Thomas’ poetry – Kenner, apparently, couldn’t stand it, or separate it from the man who made it. We respectfully disagree.
Jan Morris, in a review of a bio of Dylan Thomas in the New Statesman, quotes two disparagers:
“Dan Davin of Oxford University Press thought that Thomas's brain was not of the first class and that he spent "a great deal of noise on perceptions which are either obvious or absurd". Stephen Spender once dismissed his art as "turned on like a tap ... no beginning or end, shape or intelligent and intelligible control". Thomas spoke no foreign language, first went abroad when he was 32, and had a taste for westerns and cheap thrillers.”
One feels, like a chill coming on, that sooner or later someone will roll out Johnson’s judgment on the poems of Ossian: "Sir, a man might write such stuff forever, if he would abandon his mind to it."
In fact, of course, nobody has ever successfully written a Dylan Thomas poem except Dylan Thomas – and even he lost the knack at the end of his life, poor sod. What academics suspect is that Thomas’ poetry is all effect – the marvelous words end up echoing no larger substance. While I have some sympathy for the idea that poems should be separated from their mere effects, a little moderation, please. Academia has now created, in creative writing programs all over the world, poetry that has no effect whatsoever. Striving to be pure, it has become purely forgettable. Too often, very, very smart people will confess that they read no poetry whatsoever. For which, frankly, I blame contemporary poets, who should make a collective prison break out of the world of teaching. Do anything else.
I like a poem that, at some point, I can say to myself. That moment of saying the poem to oneself is not all a poem is about, but without it, the poem has no skin, no place where the nerves end. Anatomical dolls are not our idea of beauty.
J.S. Mill, as we know from his Autobiography, was saved from the horrid erudition shoveled on his head by his pa by poetry – specifically, Wordsworth’s. He tried to define poetry in an interestingly wrongheaded essay, making, among other distinctions, this one between poetry and fiction:
“Many of the greatest poems are in the form of fictitious narratives; and, in almost all good serious fictions, there is true poetry. But there is a radical distinction between the interest felt in a story as such, and the interest excited by poetry; for the one is derived from incident, the other from the representation of feeling. In one, the source of the emotion excited is the exhibition of a state or states of human sensibility; in the other, of a series of states of mere outward circumstances. Now, all minds are capable of being affected more or less by representations of the latter kind, and of almost all, by those of the former; yet the two sources of interest correspond to two distinct and (as respects their greatest development) mutually exclusive characters of mind.
“At what age is the passion for a story, for almost any kind of story, merely as a story, the most intense? In childhood. But that also is the age at which poetry, even of the simplest description, is least relished and least understood; because the feelings with which it is especially conversant are yet undeveloped, and, not having been even in the slightest degree experienced, cannot be sympathized with. In what stage of the progress of society, again, is story-telling most valued, and the story-teller in greatest request and honor? In a rude state like that of the Tartars and Arabs at this day, and of almost all nations in the earliest ages. But, in this state of society, there is little poetry except ballads, which are mostly narrative, --that is, essentially stories,--and derive their principal interest from the incidents. Considered as poetry, they are of the lowest and most elementary kind: the feelings depicted, or rather indicated, are the simplest our nature has; such joys and griefs as the immediate pressure of some outward event excites in rude minds, which live wholly immersed in outward things, and have never, either from choice or a force they could not resist, turned themselves to the contemplation of the world within. Passing now from childhood, and from the childhood of society, to the grown-up men and women of this most grown-up and unchild-like age, the minds and hearts of greatest depth and elevation are commonly those which take greatest delight in poetry: the shallowest and emptiest, on the contrary, are, at all events, not those least addicted to novel-reading. This accords, too, with all analogous experience of human nature. 
The sort of persons whom not merely in books, but in their lives, we find perpetually engaged in hunting for excitement from without, are invariably those who do not possess, either in the vigor of their intellectual powers or in the depth of their sensibilities, that which would enable them to find ample excitement nearer home. The most idle and frivolous persons take a natural delight in fictitious narrative: the excitement it affords is of the kind which comes from without. Such persons are rarely lovers of poetry, though they may fancy themselves so because they relish novels in verse. But poetry, which is the delineation of the deeper and more secret workings of human emotion, is interesting only to those to whom it recalls what they have felt, or whose imagination it stirs up to conceive what they could feel, or what they might have been able to feel, had their outward circumstances been different.”
This seems to me to get one of the main things right – the last sentence especially – but the main thing wrong, as well as the anthropology. Children love verse that tells no tale, but sounds funny or interesting, for one thing. As for the line about rude peoples, our friend at Brooding Persian probably could tell us a little bit about that. The main thing, though, is that Mill gets entangled in the distinction between emotion and incident. This is a familiar and endlessly tugged-against trap. I think it is just the wrong way to talk about poetry. Mill is not alone, of course – Eliot has a similar notion, and the distinction has had a long and hale life that continues today. With nefarious consequences, insofar as it empties out what we can say when we talk about a poem.
Myself, I prefer to think of poems in terms of orientation, or maps. Pound's periplum. What does this mean?
Let me explain by way of an illustration. There is a story in Oliver Sacks’s The Man Who Mistook His Wife for a Hat. A music professor was examined by Sacks. The professor was, according to all tests, physically blind. The blindness was caused by the deterioration of the retina. Yet the man claimed to be able to see. In order to understand the case, Sacks went to the man’s home. And, indeed, he seemed to get around the house, and to say things about the house, which only a man with sight could similarly do and say. Or so Sacks thought. Then they had dinner, and Sacks noticed, during dinner, that the professor was “singing” the dinner to himself. He had a song, a sort of hum, that he used to orient himself to all the things on the table.
This is what poetry does for me. Bruce Chatwin, in The Songlines, recounts (with, perhaps, some exaggeration) that Australian Aborigines, who have widely varying languages, are nevertheless able to sing directions to each other, since the directions are encoded in the intonations, and not the words, of their songs. Chatwin cites an anthropologist who was so fascinated by this cultural ability that he began to apply the song-line principle to poetry in Europe, claiming that the Odyssey was a song-line.
Well, the latter seems a little fantastic, but as a principle, this corresponds to part of what I get from poetry. And this orienting moment is what I would call the "poetry" in fiction -- not sentences highly spiced with adverbs, or that drift from specificity into spindrift.
At this point in the post, I wanted to get to Mina Loy, some of whose Lost Lunar Baedecker is published on the web at this site. But I’ll postpone that – since humanity can only bear a certain length in blog posts, n’est-ce pas?
....
As we splutter to set up the LI donation week, or month, we were pleased to get fifty dollars from a donor yesterday. We will soon be putting up more info. Thanks!