
Saturday, 30 April 2022

Review: Peter Salmon, An Event, Perhaps: A Biography of Jacques Derrida

I am one of those people who would rather read a book about Jacques Derrida than read a book by Derrida. This one, though very well-written and diligent, isn’t going to change my mind. Derrida is just not my cup of tea - a phrase which the late Richard Wollheim used at the end of a superb essay on the work of Jacques Lacan which had involved him in a great deal of diligent reading. Nonetheless, I must have picked up something and not so long ago cheerfully titled an essay “Social Construction Deconstructed”. But then there is a generic use of “deconstruction” which is almost certainly not faithful to an original idea. To use the term doesn’t make me one of the (dwindling) band of the Derrida faithful.

When I arrived in Paris for a year's graduate work in the autumn of 1971 I promptly spent most of my money on a pile of books by celebrity theorists of the time, intending to read them all. I bought Derrida's De la Grammatologie and L'Ecriture et la Différence, both published in 1967, and probably read half of each book - I don't have the physical books any more so I can't be more precise. But the material seemed at a distance from my more immediate concerns at the time and I didn't finish them or do anything with them. Nonetheless, I found my way to the École Normale Supérieure and sneaked into Derrida's (very sparsely attended) lectures, in which he was offering a close reading of Hegel on the family and marriage; that was even less of a concern to me, so I stopped going. Salmon's book (p. 166) tells me that those lectures were the basis of his 1974 book Glas. It was common practice in the Paris of the time for professors to use draft chapters of next year's book as their teaching material. I stuck with Foucault for the whole year; he was working on his Pierre Rivière study and presented work in progress in seminars. But his approach was different, and it was much more of a teaching situation that he created, not least because he had dissuaded the tourists from attending: at the Collège de France all teaching was supposed to be open to anyone, and Foucault devoted his first session to grilling those present about their motives for being there. The grilling was severe enough to reduce numbers substantially at the second session. From Foucault I got the idea of studying minor or parallel thinkers alongside major figures, and a few years later, writing about John Stuart Mill, I tried to show how part of his system of thought had been more fully fleshed out by the more or less forgotten Sir George Cornewall Lewis. Likewise, I sought to show how you could make sense of Rousseau's arguments in Du Contrat Social by putting them alongside Condorcet's work on probability theory and majority voting.

Later, in the 1980s, when I was teaching at the University of Sussex, one of my colleagues was Geoff Bennington, who has since devoted a lifetime's talent to promoting Derrida's work, initially by translation and then in many other ways, including direct collaboration with Derrida. There was an occasion when I complained that Derrida was an improvisatore and Bennington replied, "I think he's the bee's knees". I had put Derrida into the same category as Lacan, whose "seminars" (attended by hundreds at Sainte-Anne) were theatre in the tradition of Anton Mesmer and, by this stage in his career, more or less unintelligible. He entered each week wearing a fur coat; a female assistant helped him take it off and held on to it for the duration. In stark contrast, I greatly enjoyed the patient and relaxed seminars offered by Roland Barthes, who was my director of studies.

I made one more attempt to engage with Derrida's work in 1997, at a Colloque held at Cerisy-la-Salle in Normandy devoted to Derrida and the topic of L'Animal Autobiographique. Derrida spoke at length, uninterrupted, and certainly for longer than Fidel Castro's record. I found it exasperating. Peter Salmon now tells me that Derrida "presented a ten hour lecture" (p. 273). It certainly felt like it.

Unless you are very stupid, if you spend your life writing eighty books (Salmon's figure for Derrida) you are more or less bound to say something interesting somewhere. But I'm happy to leave it to others to discover where.

Sunday, 13 March 2022

Review: Toril Moi, Revolution of the Ordinary

There are so many universities in the world that we have only estimates of their number; no app can track them all, because some don't call themselves universities (MIT and many others) and because some are bogus. But I'm fairly confident that spread across those universities there are thousands of Departments which offer undergraduate degrees in Literature, most commonly the Literature of the country in which they are based or, at least, written in its national language.

I’m also fairly confident that poems and novels are always taken as exemplary for Literature and that survey courses which introduce students to representative samples of different periods and genres within the national literature are very common. It’s for that reason that I can walk into my local bookshop and buy cheap, well-edited editions of nineteenth century English language novels easily identifiable by their black Penguin Classics spines. I benefit from the student demand for these things.

Undergraduates are expected to read the representative material assigned, and quite often do. But what else are they supposed to accomplish? For over a century now, departments of Literature have struggled to make their work - well, more disciplinary. Various approaches have been proposed - almost certainly more than in the harder sciences, where a textbook author can even dream of writing a book which will be used world-wide. At school in the 1960s my textbook for Economics was simply called "Samuelson", and it probably came close to world-wide success outside the then Communist world.

To begin with, the new Literature departments could trade on what was already an established way of responding to poems and novels, one which could be found in pre-1914 European and North American journals, reviews, and newspapers, where Response often took the form of assuming a moral high ground from which, in particular, immorality could be seen for what it was. Literature was often immoral and readers needed to be told so in their own interests. How else could they know which novels to buy for themselves but keep from their servants (for whom a separate category of improving literature was available - The Blind Washerwoman, and such like)? The new university departments could easily accommodate such disciplinary activity, and still do, though nowadays there is much debate as to whether students are in the same category as servants, to be protected from immorality. Remarkably, students can now be found who will, in any case, demand protection, whereas In My Day ….

Beyond moralising raps on the knuckles, the next most common form of Discipline was the demand that students Pay Attention to the words on the page, in such a way that they would not attribute character traits or motives or moods or conclusions clearly contradicted by words to be found at page 123 et seq.

You could read Toril Moi as urging the case for a more subtle and sophisticated version of that kind of (elementary?) discipline, basing herself on the later philosophy of Wittgenstein as mediated by Stanley Cavell in particular. Literature makes use of ordinary language to do fairly extraordinary things and paying careful, engaged attention to it - and to one’s own responses - is the way in.

But is the way in also the goal and conclusion? When you've read something attentively, is that it? Toril Moi does not think so - she is not trying to resurrect what was once called the New Criticism, whose advocates would tell you very firmly that if it wasn't on the page you had no business talking about it, and that if you did talk you couldn't expect an A or even a B. Like Rita Felski, who uses the "flat" ontology of Bruno Latour [see my review of Felski's Hooked on this site, 24 February 2022], Moi accepts that to fully appreciate (acknowledge) what the words on the page are being used to do, it may be entirely appropriate to draw attention to the author's biography, to the historical circumstances in which the book was written, to the author's assumptions about likely or desired readers, to the author's awareness of current censorship practices (an awareness which, in my reading, for example, blights Oscar Wilde's The Picture of Dorian Gray, this site 16 August 2021). She wants us to think of poems and novels as forms of action or enactment connected to situated human existence, and not as detached "texts" which could have dropped from the skies.

But at the same time she wants to resist the approaches of those who just want to put the "text" through a grinder - Marxist, Freudian, Feminist, Post-Colonial … - basically in order to demonstrate how the work Fails but, as just reward, enables the grinder-operator to get an A. She is interested in keeping a mind which is at once open and informed, so that the "text" has a chance to lead us to new ways of looking at things which we may otherwise take for granted. At page 211, for example, she separates Viktor Shklovsky from other formalists and says that he got things right when he championed defamiliarization as something rather more than simple "technique". I agree - and the idea itself can be found a century earlier in Coleridge's response to Wordsworth. The genuineness of Shklovsky's commitment is to be found in his memoir, A Sentimental Journey.

*

Just now I used the phrase "situated human existence". Moi takes over from Wittgenstein the idea of "forms of life", which is closely connected to the idea of agreement (often unreflected) in our responses. I have a problem with this because I think that Wittgenstein does not distinguish two different kinds of agreement, and in so doing opens himself to rival readings, which for short one might call "naturalist" and "sociological".

If you show experimental subjects the two arrow-headed lines which comprise the Müller-Lyer illusion, they will all agree, independently of each other, that the lines are of unequal length. Even when told that they are not, the illusion persists.

The illusion reveals something about how human vision works; that we agree in our responses is a distributive agreement which has nothing to do with anything we have learnt, been taught, or discussed. Similarly, young children (before the age of four or five) make drawings which develop in ways and in a sequential order common across cultures, owing nothing to the surrounding cultures of visual representation into which some children will subsequently be inducted. The naïve child artists agree in the way they think faces and figures are to be represented, though no one has taught them this (or, in Wittgenstein's language, trained them). Pile up such examples of distributive agreement (being frightened by a scary story…) and you can begin to think of agreement in responses as something natural; you can read some of the things Wittgenstein says as supportive of that, and make him into a naturalist, as Colin McGinn did in Wittgenstein on Meaning (1984). Wittgenstein does not make it easy, however, because he has very little to say about babies and infants, and what he does say seems bleakly conventional and uncomprehending.

But, of course, there is another kind of agreement, which can be called collective. This does not require that we have voted or held debates or even talked about it, though sometimes we will have done so. We can come to agree by various means, but by those diverse means our form of life comes to have a social or communal or conventional character, as explored by philosophers like David Lewis in his Convention (1969) and much subsequent literature, including the work of Margaret Gilbert. We agree collectively, not distributively, to drive on the left not the right, and so on. Social constructionists think that everything (or nearly everything) has this character, and they can find ways of reading Wittgenstein which turn him into a sociologist of culture. They did a lot of that in Oxford, where Wittgenstein's account of "following a rule" got construed as "following our rule and don't dare disobey", as if the nature of language could be entirely understood via the local dialect. This emphasis on the social is found most clearly in the work of Gordon Baker and Peter Hacker in books such as Language, Sense and Nonsense (1984). My own view is that the Oxford Wittgensteinians fell into the trap which Dennis Wrong once characterised as "The Oversocialized Conception of Man in Modern Sociology" (a 1961 article).

Toril Moi's book covers a lot of ground and is really the product of a lifetime of careful engagement. It's lucid and held together by the thread provided by Stanley Cavell's work, which is much more humane and resonant than anything the Oxonians came up with. Whether Moi's book will in practical terms solve or dissolve the problem of what kind of Discipline is best suited to deal with Literature is another matter.


Thursday, 24 February 2022

Review: Rita Felski Hooked

I was introduced to Rita Felski's work in 2020 when I consulted what in England we call an Early Career Academic about an essay on Lolita that I was writing (now published as Nabokov's Dream); he suggested I read The Limits of Critique (2015). In that book, Felski writes about the importance of the pleasure we get from reading novels or looking at paintings, and so on. In this new book she starts from a reflection on the often odd and idiosyncratic ways in which we become attached, or attach ourselves, to a work - maybe re-reading it frequently or humming the tune every day. (In another book I was reading recently, an anecdote was told about the philosopher Gilbert Ryle, who was once asked if he read any fiction. Oh, yes. Jane Austen. All of them, once a year.)

In Felski's work there is a background hum of unease with what has happened to the humanities during her career, and even before her career began. Both institutional and broader cultural pressures have turned teachers of the humanities into purveyors of artificially narrowed preoccupations, often enough combined with a narrow-minded demand for conformity, usually in favour of some politico-cultural orientation deemed progressive but not always seen as such by outsiders. (So, for example, I myself see the "critique" of cultural appropriation as both a bit absurd - because it opposes itself to what is probably the main dynamic of all cultural change - and a bit backward-looking - which is to say, reactionary.)

Felski occupies a prominent position in the academy and her own particular reservations (cultural appropriation is not something she discusses in this book, I should add, though it often involves getting hooked on something) are expressed in a more nuanced way than might be used by an outsider and perhaps some are not expressed at all.

Her response to claustrophobia is to try to open up the field of what can and should be done under the rubrics of "The Humanities", guided by a theoretical commitment to the Actor-Network Theory (ANT) pioneered by Bruno Latour - whom I haven't read. But it seems that the slogan of ANT might well be, "Only connect!" Let me give an example of what might be involved in an ANT-ish opening up. (This is my own example and will show whether I have grasped the point or not.)

Suppose we have hitherto worked on the assumption that response to a painting begins at the point when we stand before it (at an appropriate distance) on a gallery wall. Well, how did we get to that point? In the immediate past, we ascended the steps of what is probably an architecturally impressive building (that counts as an actor in ANT), passed through turnstiles and past security guards and gallery attendants (there are people who want to steal paintings because they are often worth a LOT of money, and the guards remind us of that, so they are actors too). We have side-stepped other gallery-goers who may look older or younger than us, better dressed or worse, unevenly distributed by sex and ethnicity in ways which we may note as placing us in a majority or a minority. So many actors! Eventually, we get to the painting only to discover that twenty-seven people got there before us. (Tourist tip: if you are thinking of visiting the Louvre to look at the Mona Lisa… well, Forget It.)

All this contributes towards the state of mind in which you at last (hopefully) look at your painting, the identity of which you may now check against the gallery label (another actor).

All that has happened since you climbed the steps has gone into creating the state of mind in which you now stand before the painting. In addition, of course, there is all the preparatory reading you may have done about the painter whose work you are now looking at, or about the period or school within which they worked, and the title of the course requirement essay you have to write. What chance some supposedly pure unmediated response to what is now in front of you?

You might feel that your chances of unmediated response are better when you walk down the street listening to a new album through headphones until one song catches you, and even hooks you enough for you to spool back and listen again. And perhaps again. This scenario is also capable of being written up in the terms of Actor-Network Theory, though it might seem that a sudden epiphany - a break-out experience in which you suddenly and unexpectedly attach to something with delight - is actually a breaking out from your usual networks. Epiphanies could be described as an unlearning experience. (See footnote.)

Felski is particularly interested in this kind of experience, and it explains the title of her book. She thinks we are often coy about admitting that something has hooked us, and especially so in a college classroom, where to admit to such enthusiasms might seem out of place - a bit childish, perhaps; a bit down-market; a bit politically incorrect - there are now many readers, female and male, coy about owning up to enthusiasm for Lolita, novel or films.

There's not much to argue with in what Felski says. But the danger - which she seeks to address - is that in place of scholarly narrowness and puritan exclusion we end up with seminar discussions of marshmallow softness, lectures which are hopelessly idiosyncratic (…if I may digress for a moment, I recall Bob Marley and the Wailers ... You what? Yes, it was their first UK tour (Awed silence). Yes, it was in 1972…. (long digression)), and books which, though interesting, don't close in on any claims which might exclude other claims. And I'm not sure about claims which do not exclude other claims but rather seek to bundle them all up into a narrative which nods to every interested party.

I'm also doubtful that the dynamics of places like university seminars can actually accommodate every interested party: in my experience (and it may have been my fault) they tended to gravitate towards vicarage tea parties in which the tutor had fingers crossed that no one would say Fuck or take their clothes off (the latter once, the former more frequently).

I enjoyed reading Felski’s book. She has an especial talent for incorporating references to the literature - and there are many - into the flow of her writing, so that you are never confronted with Tombstone Quotes which always lead me to the thought that they might be skippable.

*

(Helen Thaventhiran writes an interesting review of Felski's book in the London Review of Books, 27 January 2022, and Rita Felski has a letter in reply on 24 February.)

*

Footnote: This is how I characterise them in an essay, "Lifelong Unlearning", included in Duncan Barford, editor, The Ship of Thought (2002), and in a revised version in my Silence Is So Accurate (2017).

Tuesday, 22 February 2022

Review: Benjamin J B Lipscomb The Women Are Up to Something

This book is better than its rather desperate American titling - it's published by Oxford University Press America rather than OUP Oxford, and that is relevant. It's fluently written and readable, making no great intellectual demands. But it will also help, I suspect, if as a reader you are rather over-awed by OXFORD, a place you would never dream of calling Oggsford or Oxfraud.

For his American audience, Lipscomb provides an accurate, lucid guide to how the University is organised - the collegiate structure, the tutorial system, methods of examination in Arts subjects, the character of the degree courses, and so on. But he doesn't see that many of these things have their downsides, nor does he see through the mysteries.

The college system, for example, has always allowed for the creation of backwaters of one kind or another where students may get a poor deal, academically, though the accommodation and the food and the team sports may be good. The colleges have been clublands, and their Fellows often enough people - historically, all male and until recently bachelors - only too happy to settle down to a lifetime of boarding-school existence with the usual private languages, arcane rituals, insider dealing, and hysterical feuds - the latest 21st century one at Christ Church, very obscure and lasting several years, has used up a great deal of money from its supposedly charitable funds and diverted the energy of Fellows from their proper work.

Lipscomb makes sustained early reference to the use of unseen written examinations as the basis on which scholarships, prizes, and degree classifications are awarded, and is clearly awed by the thought of how brilliant the scripts must sometimes have been. He is not the only one: the Oxford philosopher Michael Dummett, seeking to explain (many years after the 1941 event) how Elizabeth Anscombe got a First Class degree in Literae Humaniores (Classical Studies) despite History papers which were indifferent or bad, says to an interviewer, "her philosophy papers must have been astonishing" (p. 73). The truth is, we can't know whether they were or not, because all those unseen hand-written examination papers, scribbled under invigilated time pressure in the Examination Schools, were burnt very soon after the event. And until recently, no external examiner would have had a chance to look at them before bonfire night, since OXFORD saw no need for external examiners. From the nineteenth century on it provided them to upstart provincial universities which, of course, needed to be policed. (By the way, Oxford is not in London; it's in heartland Church & King provincial England, but don't mention that …). Anyway, we can't know how astonishing some of this student scribbling was; none of it was ever published to show what a Model Essay might look like. In Anscombe's case, all it took for her to get a First Class degree - against the norm that to do that you had to show some merit in all your papers - was for the Philosophers to outvote the Historians on the Examination Board and leave them to grumble later over the port.

I discovered about the burning in 1967. I had sat for a university-wide undergraduate prize in Economics awarded on the basis of the usual timed examinations, answering previously unseen questions. One paper required that just one question be answered in the three-hour time allocation. I wrote an answer to the question, "Is Economic Growth a Good Thing?" - precisely the kind of "general" question in which Oxford specialises and about which Lipscomb writes several times. I was pleased with what I had written - it must have run to three or five thousand words. So when I got the letter telling me I had been awarded the prize (worth about twenty percent on top of my total annual income) I wrote asking if I could have my script back, because I thought I could work on it and improve it. Back came the reply: it's already been burnt. So no one will ever know how astonishing (or not) it was, and don't ask me, because I don't know either.

(Imagine an Art College where the drawings and paintings submitted for your Degree Show were assessed and then burnt).

You might say that the mysteries of Oxford are those of a literate culture which has never placed high value on the written word, preferring oral debate, oral traditions, and gossip in the cloisters.

Lipscomb is good when he situates the formation and rise of his selected Quartet of female philosophers in the context of the Second World War. Though the four have been selected as a friendship group all born in 1919 - thus completely excluding their near contemporary, Mary Warnock, born in 1924 and also a noted Oxford moral philosopher - the four were part of a war-time cohort of female students who benefitted from the sudden disappearance of all the young men sent off to war. Only the men who were old, infirm, or claimed to be especially devoted to doing God’s work, remained. As a result, the proportion of female students increased to the extent that they were no longer a marginal presence, living in out-of-town poor colleges where the food and the wine cellar had never been up to much. But the women's colleges were probably more meritocratic than the men's - where birth and private school attended counted for more - with Somerville possibly the most meritocratic and Lady Margaret Hall the least among the women's colleges. But I don't have data and I don't know if they exist, though schools attended are probably reconstructible for many cohorts from even the distant past.

The war-time female students got better tuition than normal, since there were underemployed college tutors all over Oxford. Lipscomb documents their access to well-above-average tutors, notably Donald MacKinnon. He was a talented and charismatic figure, and female students who had the hots for him could and did disguise their infatuation as enthusiasm to join him in a religious Quest. Much more exciting than having an affair with a married man. (Lipscomb documents it in the case of Iris Murdoch but is less cynical about it all than MacKinnon's wife was, or I am.)

It's because they are all dead that Lipscomb can write about such things and include the scurrilous anecdotes, notably those which feature the eccentric Miss Anscombe, mother of seven children by her husband, Peter Geach. Anscombe and Geach were Roman Catholic converts of the More Papist than the Pope kind - some of it really quite inhuman. The message got through to their children, at least according to the one anecdote I recall from my time in Oxford (oh, it's apocryphal and false, of course, of course): a new babysitter had gone to the house one evening to look after the children; left alone with her, they lined up to declare: You can't tell us what to do. We are the Anscombe-Geach children.

Pulling rank was not the preserve of children, and it did happen too among marginalised female philosophers. But the men were clearly worse, and Lipscomb shows the Great Men of the time behaving badly, though not necessarily atrociously so and not always without provocation: J L Austin, A J Ayer, R M Hare, Gilbert Ryle …. The way I look at it, some of these figures, both male and female, were indeed important cultural figures whose work has been influential and will continue to be read in different cultures. Austin and his colleague H P Grice (who doesn't figure in Lipscomb's narrative) were the inspiration for a whole new world-wide scientific approach to language - language pragmatics - and are read as such now. So their work eclipses their time, their personal weaknesses, and their eccentricities.

But some of them were simply big fish in a pond smaller than they imagined. Oxford in the past was a small university, reflecting the fact that only a very small fraction of each age group in Great Britain proceeded to university education, maybe 3 to 5% - it was 5% as late as the early 1960s, as far as I can establish. In the sciences both Cambridge and Oxford were world leaders from early in the twentieth century. But in the Humanities the stand-out figures are few, the most obvious being Wittgenstein, who though eventually a Cambridge professor was an institutionally marginal figure. Anscombe succeeded to the same university chair and kept up the tradition of personal eccentricity and obnoxiousness. But though a significant figure in twentieth century British philosophy, with three volumes of Collected Papers published in the 1980s, is she more than that? And as a Roman Catholic thinker?

Friday, 4 February 2022

Review: Jean Rhys Good Morning, Midnight

I think of Jean Rhys as an Expressionist writer whose short sentences are like bold brushstrokes in startling, unpredictable colours. It’s the kind of writing facilitated by having a bottle of wine or whisky or both beside you, and provides fresh supporting evidence for the old claim that in vino veritas.

In her writing, Rhys has to get past both internal inhibition and external censorship. In real life, she does things like taking money for sex, which place her outside polite society and in the demi-monde, and she takes as husbands and lovers men who are accidental or professional criminals - criminal enough to go to jail.

The alcohol helps her evade inhibition but the external censorship is evaded by literary crafting of a character sometimes described as obsessive. Whether the crafting was done drunk or sober I don’t know; either way, Rhys is a great stylist.

Good Morning, Midnight, published in 1939 and set mostly in the Paris of the period, contains significant material written in French, not just in passages of spoken dialogue but in narratorial sentences written in a mix of English and French. Sometimes the French is left untranslated even (and especially) when the words are not ones which would have been learnt at school, like maquereau [pimp] at page 72. It's not glossed, though the context at least half-way enables a guess as to its meaning: "What she wants is three hundred francs to give to her maquereau. Will I give her three hundred francs for her maquereau?" Sometimes it is glossed, but with a delay.

In one long passage which extends over pages 42-53 and is then picked up again at page 149, much is made of a contrast in register which could not in 1939 have been clarified with accurate English translations; the censor would have intervened.

Thus, a tall English girl who "speaks French very well" (page 38) looks at the narrator sitting in a café and exclaims, "Et qu'est-ce qu'elle fout ici, maintenant?" (page 39). The dictionaries, even supposedly modern ones, are coy, but this could be translated as, "What the fuck is she doing here, now?", the fout being in the same register as in the common expression, Foutez-nous la paix! [Fuck off and leave us alone!]. Things change over time, and in the 1930s maybe less forceful translations would be in order: What the hell is she doing …? Piss off.../Clear off … Rhys does not translate but instead writes, half a page later, "But what language! Considering the general get-up what you should have said was, 'Qu'est-ce qu'elle fiche ici?' Considering the general get-up, surely that's what you should have said. What language, what language! What would Debenham & Freebody say, and what Harvey Nichols?" [These are the names of middle-class/upper middle-class London department stores where most of the shoppers would have been female.]

But nothing has been translated at this point; the reader may be able to work out that ficher is less coarse/vulgar/offensive than foutre, but even though "Qu'est-ce qu'elle fiche ici?" could have been glossed as, say, "What on earth is she doing here?" without troubling the censor, it has not been done.

The narrator returns to the language issue at page 41, when she speculates that the girl said fout "partly because she didn't like the look of me and partly because she wanted to show how well she spoke French and partly because…." But though no translation has yet been provided, the narrator then pulls herself up for her obsessional behaviour: "…why get in a state about it?" (page 41).

But the theme is continued at page 42, where she adds an implied thought to the original and at last provides a translation, but a censored one:

“Qu’est-ce qu’elle fout ici, la vieille? What the devil (translating it politely) is she doing here, that old woman? …. I quite agree….I am asking myself all the time what the devil I am doing here”

That's at the top of page 42; at the bottom of page 43 the theme is reprised again:

“Now it’s getting dark. Now the gates are shutting. (Qu’est-ce qu’elle fout ici, la vieille?)”

The play with language then shifts at pages 44-47 to simple repetitions in the key of Shakespeare. She starts with a half-allusion ("tomorrow, tomorrow…") and then runs through "Back, back, back…", "Hours and hours and hours", "Courage, courage", "Jesus, Jesus….Mother, Mother", "Chloroform, chloroform", "Back, back, back …" and several more.

But foutre / ficher gets another outing at page 53, when a male acquaintance describes a friend he wants her to meet and sums up in untranslated French, "Mais au fond, vous savez, il s'en fiche de tout, il s'en fiche de tout le monde" [But at bottom, you know, he doesn't give a damn about anything, he doesn't give a damn about anyone]. And the narrator then takes one line to comment, "He sounds fine", which in the context of all that has gone before ought to raise a smile.

At page 149 (the novel is 157 pages in my edition) we get one last throw of the dice:

“The damned room, grinning at me. The clock ticking. Qu’est-ce qu’elle fout ici, la vieille?”

Microsoft Word has had a field day with green and red squiggles all over this. I guess it struggles with many novelists. They do things differently.

Wednesday, 19 January 2022

Review: Neil Levy Bad Beliefs: Why They Happen to Good People

This is a very interesting, very readable book not least because the author, though presenting himself as a professional philosopher, draws on a range of theories and data culled from contemporary cognitive and behavioural sciences. His leading idea is that we live in an epistemic [knowledge] environment which is now heavily polluted by poor research, fake news, conspiracy theories, and so on, but where understandable concerns about free speech and poorly thought through notions of “balance” have so far prevented any serious clean up. This is important because self-help strategies alone cannot rescue individuals from forming false beliefs and making unfounded claims to knowledge. Indeed, it is the very fact that we are rational animals deploying our efforts in scarce real time that often enough leads us to false beliefs. We are reliant on those around us and society’s institutions for our sense of what is true and what is false and they ain’t making it easy. Reform is needed.

These problems are not as contemporary as Levy presents them; there is a long back story of concern about the power of charlatans, demagogues, and revolutionaries to lead even the thoughtful astray. That concern, in a recognisably modern form, dates at least from the eighteenth century. For example, when the charlatan Anton Mesmer started to make a big hit in pre-1789 France, the leaders of institutionalised French science and philosophy sought to articulate ways in which an ordinary non-expert person might see that he was a charlatan even without detailed understanding of his theories, such as they were[1].  True, in retrospect Mesmer clearly had some prescient understanding of aspects of the human mind which were then little understood. But this was incidental: Mesmer was running a (literal) theatre for the sake of fame and money. Something similar might be said of Jacques Lacan in the latter part of his life[2].

It would be easy to suppose that the institutional leaders were only concerned to keep outsiders outside but, as it happens, the most intellectually powerful figure among them - the Marquis de Condorcet - was not only a democrat who believed in a wide franchise (which would include women) but was responsible for a 1785 work in the theory of probabilities which showed that there is, as far as knowledge is concerned, strength in numbers: if certain conditions are met, it is rational to defer to majority opinion, because the majority is more likely to be right than wrong, and the bigger the absolute gap between majority and minority, the greater the reliability. Condorcet and his followers took juries as a prime location for the application of his probability work, for it leads to the conclusion that - conditions met - larger juries are better than smaller ones, and a requirement of unanimity or near-unanimity much, much better than a simple majority vote. What Condorcet could not anticipate was the TV programme Who Wants to Be a Millionaire?, which demonstrates the truth of his claims in near-perfect form.

Condorcet showed that majority voting is a good guide to truth under three conditions:

(1) the more enlightened (knowledgeable) each individual voter is, the better, with the minimum requirement that each voter be more likely right than wrong on any one occasion (p > 0.5);

(2) when voting, voters must be trying to give the right answer;

(3) and voters must vote independently of each other - if one voter simply follows the lead of another, that reduces the effective number of voters.

If these conditions are met, then in a majority vote the probability of the majority being right increases (and quite dramatically, heading towards p = 1 [certainty]) the larger the absolute vote gap between majority and minority.
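The arithmetic behind this can be checked directly from the binomial distribution. Here is a minimal sketch in Python - my own illustration, not anything in Levy's book or in Condorcet - which computes the probability that a strict majority of n independent voters gets the right answer, when each voter individually has probability p of being right:

from math import comb

def majority_correct(n, p):
    # Probability that a strict majority of n independent voters,
    # each right with probability p, gives the right answer.
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# Voters only slightly better than chance (p = 0.55); odd n avoids ties.
for n in (1, 11, 101, 1001):
    print(n, round(majority_correct(n, 0.55), 4))
# prints, approximately: 1 -> 0.55, 11 -> 0.63, 101 -> 0.84, 1001 -> 0.999

Even with voters barely better than coin-tossers, a thousand independent votes are all but infallible; that is the strength-in-numbers claim in its simplest form.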

In the TV quiz show Who Wants to be a Millionaire? contestants have the right to ask the large audience for help with a question to which they don’t know the answer. Audience members are then offered four possible answers to the question and asked to select the right answer from among them. It’s rational for the contestant to confirm the audience’s majority answer as their own - and for these reasons:

(1) the Audience is likely to be quite knowledgeable. Quiz-show live audiences tend to contain a high proportion of people good at quizzes, so p is likely to be greater than 0.5.
(2) members of the audience have no motive to give answers they believe to be untrue (they enjoy giving right answers!)
(3) they vote independently of each other using push-button consoles with little or no time to consult the person sitting next to them and explicit instructions not to.

Hey Presto, the audience's choice of right answer will, almost certainly, BE the right answer. If some researcher checked back over Ask the Audience choices, I think they would rarely find that the Audience got it wrong. Ask the Audience is a No Brainer if you don't know the answer yourself[3].
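Ask the Audience is, strictly, a plurality vote across four options rather than a two-way majority vote, but the logic carries over. A quick simulation - again my own sketch, with made-up parameters rather than data from the programme - shows why even a modestly knowledgeable audience is almost never wrong, so long as the non-knowers guess independently rather than copying their neighbours:

import random

def audience_poll(n_voters, p_know, trials=10_000):
    # Fraction of simulated polls in which the correct option (index 0
    # of four) is the outright leader, when each audience member knows
    # the answer with probability p_know and otherwise guesses at random.
    wins = 0
    for _ in range(trials):
        votes = [0, 0, 0, 0]
        for _ in range(n_voters):
            if random.random() < p_know:
                votes[0] += 1                    # this voter knows the answer
            else:
                votes[random.randrange(4)] += 1  # blind, independent guess
        if votes[0] == max(votes) and votes.count(max(votes)) == 1:
            wins += 1
    return wins / trials

print(audience_poll(n_voters=200, p_know=0.3))   # close to 1.0 in my runs

With 200 voters of whom only thirty percent actually know the answer, the right option still expects 47.5 percent of the vote against 17.5 percent for each wrong one, so it tops the poll essentially every time. That is why Ask the Audience so rarely misleads - and why the independence condition is doing the real work.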

In nineteenth century England, progressive thinkers who wanted to see a more democratic society with a wider franchise always met with objections which insisted that the masses were too ignorant and too liable to be seduced by charlatans for it to be safe to entrust them with the vote. To this, progressive thinkers - most obviously the Utilitarians - responded with proposals for extending education to all, but also with a defence of the desirability of deference to scientific authority. Deference is a key term in Levy's book, and his criteria for identifying those people and theories worthy of deference map very closely onto criteria proposed by Utilitarian thinkers. Thus Levy writes,

“A scientific consensus is reliable when it has been stress-tested, by all the disciplines relevant to the topic, for an extended period of time” (page 109)

Compare this passage from a book published in England in 1849:

“With respect to the subjects of speculation and science, the existence of an agreement of the persons having the above qualifications [natural ability, long study, personal honesty] is the most important matter. If all the able and honest men who have diligently studied the subject, or most of them, concur, and if this consent extends over several successive generations, at an enlightened period, and in all or most civilised countries, then the authority is at its greatest” (page 42) as it is in astronomy (page 43).

This passage is taken from An Essay on the Influence of Authority in Matters of Opinion by Sir George Cornewall Lewis, usually reckoned a minor Utilitarian thinker. One of his major themes is that when it comes to forming rational beliefs and opinions we have to rely on others; we cannot do it alone. This is the theme to which Levy devotes chapter 4 of his book, starting with a critique of the individualism of traditional epistemology and its impossible reliance on a chimera of “unaided individual cognition”. Sir George Cornewall Lewis provides the basic argument against this view:

“I firmly believe…that the earth moves round the sun; though I know not a tittle of the evidence from which the conclusion is inferred. And my belief is perfectly rational, though it rests upon mere authority….” (page 63).

That's the way it is: we rely on authority much more than we imagine, we have to do so, and it is fully rational to do so IF the epistemic environment is not degraded. Unlike Levy, Lewis simply asserts that the environment is a healthy one, for the passage I have just quoted continues:

“….all who have scrutinised the evidence concur in confirming the fact; and have no conceivable motive to assert and diffuse the conclusion, but the liberal and beneficent desire of maintaining and propagating truth” (p 63)[4]

Would that it were ever thus! A substantial part of Neil Levy's book tells the depressing story of the many ways in which it ain't like that. I will not summarise the various stories, which extend well beyond the obvious ones: denying the Holocaust, denying climate change, denying the efficacy of vaccines. Levy's central proposal is that we need to clean up the epistemic environment on which we have to rely, and he looks to nudge theory for inspiration, arguing in chapter 6 ("Nudging Well") that nudges can support our personal rationality and autonomy, not manipulatively undermine them, as is often enough alleged.

I can think of just one alternative that has ever been proposed: Auguste Comte reckoned that the intellectual environment of his time was pretty polluted; he thought the solution was to quarantine himself from it. He called this “mental hygiene”. I am sure he would have recommended disconnection from Facebook, Twitter, and much more besides.


[1] Robert Darnton, Mesmerism and the End of the Enlightenment in France (1968)

[2] Trevor Pateman, https://www.academia.edu/43059186/Jacques_Lacan_in_the_text_of_Elisabeth_Geblesco

[3] Trevor Pateman, https://www.academia.edu/43058086/Majoritarianism_An_argument_from_Rousseau_and_Condorcet

[4] https://www.academia.edu/43045987/Liberty_Authority_and_the_Negative_Dialectics_of_J_S_Mill

This paper, along with that on Majoritarianism previously cited, is based on my M Phil thesis How Is Political Knowledge Possible? (University of Sussex, submitted 1977, awarded 1978).


Tuesday, 18 January 2022

Review: Michael Morris Real Likenesses

Back in 1975 a philosopher, Colin Radford, presented a paper to the Aristotelian Society in London with the title, “How Can We Be Moved by the Fate of Anna Karenina?” The paper has since been much discussed. There are at least two ways of hearing/reading the title, one of which invites us to think of an explanation for a remarkable phenomenon and the other invites us to pause and seek to explain this hitherto taken-for-granted common phenomenon.

My guess is that most adults are much more frequently moved by the fate of characters in novels, plays, and films than they are by the fate of people in “real” life, exception perhaps made for those who work in settings like hospital intensive care units. Equally, this being moved in “unreal” situations - to tears, laughter, or orgasm [tragedy, comedy, pornography] - is almost entirely occasioned by the narrative arts. To put it bluntly, you do not often see people leaving art galleries in tears.

The capacity to be moved by narratives appears early. Many years ago, I had begun reading very short book-at-bedtime stories to my younger daughter, using illustrated books and (as you do) pointing my finger as I read. I got to the point in one story where a young girl who has been entrusted with carrying a basket of eggs to market drops them and all the eggs break, a drawing - to which I pointed - illustrating that fact. Oh dear! At this point my daughter, aged two and a bit and until this moment listening quietly, burst into tears.

That sort of reaction is unlearnt. It is a primitive or, as I would say, a natural reaction. Nothing to see there, no special explanation required.

True, there is a larger adaptive-evolutionary story which can be connected to this unremarkable fact. I tell it first in relation to vision. Seeing is not something of which human beings are capable; it is something to which, unless blind, they are liable. That is thanks to a remarkably powerful vision module which goes into action at birth and rapidly achieves its more or less final state, though it is affected much later by the physical decline of our bodies. The vision module works very fast and very reliably, which is hugely adaptive; it saves us from many accidents and other catastrophes. True, it can be fooled, as it is by the very recently invented Müller-Lyer illusion, Escher's drawings, and trompe l'oeil paintings. But this is a small price to pay for the many benefits which accrue from the fast operation of an encapsulated system which does not generally tolerate interference from our thinking. There are exceptions: someone may guide us towards seeing something which we don't see spontaneously, often a resemblance which we didn't catch - a resemblance which was not transparent but whose translucence could be seen through by means of a verbal hint.

Similarly, in my view, human beings are natural believers - they are credulous. That does mean that they are easily fooled, but being a believer is more adaptive, on balance, than being a natural sceptic. When I go hunting with my tribe, it is really not a good idea to play the sceptic when I hear one of my fellow-hunters shout, "Watch out! Snakes in the grass!" My natural credulity leads me to do the sensible thing; I watch out, promptly and without question. As it happens, this priority of credulity can also be given some kind of philosophical justification: you cannot even get going in life - even solitary life - unless you take things on trust. Things just have to be taken as they appear if you are to get started; doubt comes later, and it's a second-order ability (or, for some people, liability).

So it's not surprising that we are fooled by fictions. There is no need for "willing suspension of disbelief" because there is simply no disbelief to be suspended. There is a part of the brain - a module if you like - which simply does not discriminate fact from fiction. We know this in a common-sense way when we complain that horror films frighten us, romantic novels make us cry, and pornographic films arouse us sexually even when we don't want them to. Well, as in the rest of life, want doesn't mean get.

The story can be continued. When philosophers think about the use of language, they usually have in mind acts of speaking or reading, which are thought of as voluntary acts. They should more often think about hearing. If someone says something to you, audibly, in a language you understand, you have no choice about understanding the words, even if you may not understand their import. Someone says, "I like your hat", and that may be compliment or sarcasm and you may not be sure which; but it is not a voluntary act or decision to hear the words, "I like your hat". You just hear them and understand them, at least at some basic/literal level. You may really, really dislike hearing and understanding certain words, but you really, really can't avoid it. That's why you often press people not to use (or mention) them at all.

From that I derive the conclusion that the human capacity for language is not originally an ability; it begins as a liability - we are liable to language. That’s why children learn their mother tongue more or less effortlessly as they hear it being addressed to them and, more generally, spoken all around them. Language “acquisition” is something that happens to them, more like a viral infection than a property deal.

As a final example: the capacity to recognise objects depicted in a photograph or representational drawing is also a liability; it’s unlearnt and untaught. Deprivation experiments have been conducted to show that that is true: withhold all pictures from a child until the child can speak, and the child can name, unprompted, the objects in the photographs or drawings now revealed to them.

In his book Real Likenesses Michael Morris adopts the approach of the analytical philosopher, rather than that of the naturalist/psychologist, and seeks to characterise the formal status (ontological, epistemological, logical) of those artist-representations of things which affect us - in paintings, photographs and novels - rather as if they were real things. So a painted face is the real likeness of a face, and a photographed face the real likeness of a face; and likewise a fictional/novelistic character is a real likeness of a character. The argument is lucid but highly technical and I am not going to go into detail in this little review/essay. As an analytical philosopher, Morris aims for completeness, absence of contradiction, absence of falsifying examples, and no unresolved paradoxes. In philosophy, that's always a tall order.

Towards the end of the book, Morris has a section (pp. 204-214) which discusses how proper names work in novels, and he quotes the opening sentence of Muriel Spark's novel Memento Mori:

“Dame Lettie Colston re-filled her fountain pen and continued her letter”

The problem with this - as he partly acknowledges (p. 207) - is that "Dame Lettie Colston" is not a good example of a proper name. At its very first appearance, it is much more than a proper name; it's loaded with descriptive content. I open the novel, perhaps knowing nothing about it, but three words in I already know a lot. I now know that the novel is likely to feature prominently - unless I am being tricked by an unreliable narrator - a woman (that takes out fifty percent of the world's population) whose British/English title of "Dame" takes out about 99.9999 percent of the remaining fifty percent. The British/English probably resolves to English with the "Lettie Colston", taking out the Scots, the Welsh and the Northern Irish. We are now down to a few thousand, and possibly just a few hundred, members of a class ("English Dames") of whom Lettie Colston is going to be presented as a fictional member, either typical or untypical. We will have to continue reading in order to discover which. Proceed to the words "fountain pen" and, with a novel published in 1959 when the battle between fountain pen and biro was fully joined, we also have to read on to discover whether this shows Lettie taking sides or whether it shows that the time setting of the novel is a past in which biros did not exist. It is part of the novelist's skill to know how to keep us reading; in that first sentence, Spark tells (or shows) us quite a bit about one character - but not quite enough. So we have to read on.