Philosopher John Corvino (Wayne State) kindly invited me to share this funny item:
“How Do You Solve A Problem Like My Vita?”
(To the tune of “How Do You Solve A Problem Like Maria” from The Sound of Music. Written by John Corvino and Thomas Williams during their job-market days, with apologies to Rodgers and Hammerstein.)
How do you solve a problem like my vita? How do you find a job that's tenure-track? What should I call myself here on my vita? A metaphysician, an ethicist, a quack? Many an area I've claimed to cover. Making up research goals can be a chore. But if they only knew, the "Papers Under Review," have been rejected sixteen times before . . . Oh, how do you solve a problem like my vita? How do you keep from feeling like a whore?
[Quickly:] At the smoker I feel sick, sipping free beer much too quick, And this polyester suit is getting warm. Unpredictable professors, awkward talkers, tacky dressers: It's the pure apotheosis of bad form. Did my interview go well? From their questions I can't tell. But the chair was looking at me very odd. When he started turning red, I should not have plowed ahead. I'm a moron. I'm a genius. I'm a fraud.
.... How do you solve a problem like my vita? How do I stand out from the current crop? How do I look as a potential colleague? An arrogant jerk, a scatterbrained dork, a flop? What if they ask me how I'd teach aesthetics? How can I fake an interest in Descartes? The salary is poor, the teaching load's 4-4, But at the APA you play the part. Oh, how do you solve a problem like my vita? How do you make a charlatan look smart?
Please, please, will scientists, science journalists, and readers wake up to this truism: if the mind is the brain, any mental difference will be a brain difference. Suppose there are some actual mental differences between men and women, whatever their prior causes. (Hard to imagine training up half of humanity one way, half another, without creating some differences between them.) There will then be some neural differences. Suppose you have two televisions, whose images are different. You call in the technician, who trumpets the discovery that they differ in their pattern of pixels. How remarkable! Actually, no. That bit we knew already: no difference in the images without a difference in the pixels. Same for ourselves: no difference in states of mind without a difference in states of brain. That doesn’t mean it has to be that way, or is designed to be that way. Even if your mind is your brain, that doesn’t mean ‘your brain made you do it’—as if the ‘you’ were a different being, a soul puppet whose strings are pulled by your neurons. Let’s not fall for this confusion, or we’ll take what happens to be the case, and freeze it. We’ll take differences, however they may have come about, and make them seem inevitable and appropriate. We don’t need this deterministic fairy-tale. It’s bad for men and women, bad for science, bad for us all.
Robert Paul Wolff, as I had hoped, provides some useful commentary. Let us not forget that, during his long career, Mandela was an advocate of what would now be called "terrorism" against the South African state, and correctly so. He survived a quarter-century of unjust imprisonment, and yet, remarkably, assumed the mantle of political power without seeking revenge. He was a very unusual man, whom I suspect we do not yet understand, but Professor Wolff is surely right that there was something "great" about him, notwithstanding the debasement of that adjective by its promiscuous application to war criminals, political hacks, and servants of the ruling classes.
AND ANOTHER: Albie Sachs, a former justice of the South African Constitutional Court and himself a most impressive and remarkable individual, made an illuminating and moving set of remarks about Nelson Mandela several months ago.
Over the years dossiers for applicants for junior positions have gotten larger and larger. Some dossiers we have received for our current search number more than 80 pages. One thing that is particularly striking to me is the number of letters of recommendation that each dossier contains. Back in the dark ages, when I was in graduate school, dossiers included on average three letters. In this year's search not a single dossier that I have seen includes that few. Many include six, seven, or even eight letters. I don't fully understand why. I am tempted to attribute at least part of the ever-increasing number of letters to understandable anxiety on the part of job seekers. There seems to be a view that the more people you can get to say good things about you, the better off you will be. I certainly understand and sympathize with the anxiety. On the other hand, I doubt that there is a single search committee that is eager to read an average of six or seven letters per file. Nor do I think that more letters means more information or a more honest assessment of the candidate. Just for comparison's sake, a typical tenure file at my own university will generally include around eight outside letters. Do we really need the same number of letters for a junior appointment as for a tenure case? Personally, I don't think so. But I am sure reasonable people can disagree on this score and would like to know what others think.
To stir the pot just a little, let me add the following. In some number of cases -- though I'm not prepared to quantify whether it be many, most, or just a few -- the additional letters come from people not from the candidate's home institution. Sometimes this makes some sense and is helpful -- if, for example, the outside writer has taught, supervised, collaborated with, or been colleagues with the candidate. Outside PhD examiners are often asked to write, for example. And you would certainly expect people who have moved a bit from place to place to have letters reflecting that fact. Or, in another vein, if a candidate has worked in somebody's lab, say, as sort of the philosopher in residence. All that makes sense and I have no beef with it. On the other hand, many outside letters seem to be the result of proactive networking on the part of candidates. Perhaps the thought is that it is important to have letters from prominent people in other departments with whom one has had some degree of contact. The thought may be that this networking will help increase and demonstrate the reach and impact of one's work. In a few rare cases, such letters may carry some tiny bit of weight. But that, I think, really is the exception rather than the rule. In my experience, most such letters are relatively bland and pointless. They are neither particularly helpful nor, thank goodness for the candidates, particularly hurtful. I suspect that it's an issue of divided loyalties. Seldom have I seen a letter from prominent philosopher P from institution X writing for candidate C from institution Y that directly compares C to P's own students at P's own institution. Most of what such letters add is generic praise. But there's usually enough of that in the "core" letters to suffice.
I'm not sure anything can stop the momentum for more and more letters. And maybe I'm wrong to think the momentum needs to be stopped. Maybe I'm just tired from reading thousands of pages of files. But I thought this might make a good discussion point for your blog.
I generally recommend candidates have not more than five or six letters, on the assumption that the 5th and/or 6th letters really add something--e.g., a letter from someone outside the department who is expert in the candidate's area and can add something useful about the candidate's work. What do readers think? Signed comments preferred, though grad students and job seekers may post with a pseudonym (pick a distinctive one).
I get this question all the time; here's the most recent version:
I’ve been attending a community college for a few years now, and I’ll be earning my AA in Philosophy in a couple of semesters. I’ve been checking out some of the local universities, but nobody offers a philosophy degree achievable by night classes or online. I found an online undergraduate degree program at Arizona State University, but I’m not sure if they’re credible. Do you have any suggestions or advice for someone like me?
Readers? Signed comments preferred; details would be helpful (i.e., details about the program, why it was good, or why it was bad, etc.).
We are entirely capable of knowing what policies best contribute to people leading positive and rewarding lives.
In recent decades, social scientists have been studying human happiness in the same way we study any other human attribute. Vast new multidisciplinary research has emerged around the proposition that it is possible to empirically measure the extent to which people view their lives as satisfying.
So what conditions best promote more rewarding lives?
The answer is simple and unequivocal: Happier people live in countries with a generous social safety net, or, more generally, countries whose governments "tax and spend" at higher rates, reflecting the greater range of services and protections offered by the state. (These findings come from analysis of data from the World Values Surveys for the 21 Western industrial democracies from 1981 to 2007 for my book "The Political Economy of Human Happiness." Similar findings have been reported in peer-reviewed journals like "Social Research" and "Social Indicators Research.")
The relationship could not be stronger or clearer: However much it may pain conservatives to hear it, the "nanny state," as they disparagingly call it, works. Across the Western world, the quality of human life increases as the size of the state increases. It turns out that having a "nanny" makes life better for people. This is borne out by the U.N. 2013 "World Happiness Report," which found Denmark, Norway, Switzerland, the Netherlands and Sweden the top five happiest nations.
As a survivor from the wartime group [at Oxford], I can only say: sorry, but the reason was indeed that there were fewer men about then. The trouble is not, of course, men as such – men have done good enough philosophy in the past. What is wrong is a particular style of philosophising that results from encouraging a lot of clever young men to compete in winning arguments. These people then quickly build up a set of games out of simple oppositions and elaborate them until, in the end, nobody else can see what they are talking about. All this can go on until somebody from outside the circle finally explodes it by moving the conversation on to a quite different topic, after which the games are forgotten. Hobbes did this in the 1640s. Moore and Russell did it in the 1890s. And actually I think the time is about ripe for somebody to do it today. By contrast, in those wartime classes – which were small – men (conscientious objectors etc) were present as well as women, but they weren't keen on arguing.
It was clear that we were all more interested in understanding this deeply puzzling world than in putting each other down. That was how Elizabeth Anscombe, Philippa Foot, Iris Murdoch, Mary Warnock and I, in our various ways, all came to think out alternatives to the brash, unreal style of philosophising – based essentially on logical positivism – that was current at the time. And these were the ideas that we later expressed in our own writings.
A number of interesting and/or curious claims here (I am particularly struck by the confident dismissal of logical positivism, and by the implication that Oxford no longer produces talented female philosophers), but I'm curious what readers think? Signed comments will, of course, be preferred.
Mary Midgley, aged 81, may be the most frightening philosopher in the country: the one before whom it is least pleasant to appear a fool. One moment she sits by her fire in Newcastle like a round-cheeked tabby cat; the next she is deploying a savage Oxonian precision of language to dissect some error as a cat dissects a living mouse.
Interesting piece from Jacobin. (Its partial target is Mark Oppenheimer, one of the scummier journalists I've encountered, but the piece is of interest independent of the easy mark. Adorno even makes an appearance!)
Authors and/or publishers kindly sent me these new books this month:
Portraits of American Philosophy edited by Steven M. Cahn (Rowman & Littlefield, 2013) (intellectual autobiographies by, among others, J.B. Schneewind, Judith Jarvis Thomson, Ruth Barcan Marcus, and Harry Frankfurt).
Hegel's Thought in Europe: Currents, Crosscurrents and Undercurrents edited by Lisa Herzog (Palgrave Macmillan, 2013).
Professor Ivano Caponigro, a linguist at UC San Diego, asked me to share the following:
I'm working on a biography of Richard Montague (1930-1971) that aims to reconstruct his intellectual and personal life, his contributions, and his legacy.
Please contact me if you knew him personally (or just met him a few times) or have any material from him or about him (letters, manuscripts, pictures, audio recordings, etc.), or if you know anybody who knew him or may have material about him.
“Julius Caesar!” He looked up with genuine astonishment. He was a philosopher. Why on earth would anyone ask him to read a dissertation about Julius Caesar?
“Shakespeare’s Julius Caesar.”
“Oh. Right.” As though that made it clearer.
The young woman who’d cornered him in his office was a complete stranger. He’d never laid eyes on me, nor I on him. Doubtless I was wearing a leather mini skirt and high brown boots, my usual garb back then.
My take on the play was vaguely philosophical— very vaguely. I had ideas about Shakespeare and 17th century nominalism, though no philosophical context to fit them into, nor any other context, actually, self-directed as I was, the usual experience for women graduate students at Columbia in those days. The Shakespearean hadn’t liked my topic, said it was “too modern and psychological,” so I approached the Miltonist with the idea, who said, uh, no, it would be awkward for him to direct a Shakespeare dissertation. I went away and came back a year later with a completed dissertation, throwing myself on the mercy of the Miltonist, who said, okay, okay, he’d try to set up a committee, there was a theater person who might read it, and a nice young assistant professor who more or less had to say yes since he didn’t have tenure, and he suggested that I talk to Arthur Danto. And that was my committee. They more or less smuggled me out the back door, unbeknownst to the Shakespearean (who found out years later, after I was comfortably ensconced on the west coast).
So that’s what I was doing in Danto’s office that spring morning, with my request. He didn’t have to agree, he was an eminence, even then; I wasn’t his student, and the dissertation was very long. But he came through.
After the dissertation defense, I stood outside in the hall, Philosophy Hall, waiting for the verdict. The committee filed out, shook my hand, congratulated me, handed me a few pages of typed notes, and went away.
Everyone except Danto. He stood there, I stood there, an uneasy moment, then he said, “By the way, do you have a job?”
“A job?” I had, as a matter of fact, just lost the adjunct position I’d been counting on to keep me in New York for the next few years. It was 1974; the job market had crashed.
“Would you like a letter?”
“Like… a recommendation? Sure, that’d be great.” It had never occurred to me to ask, I was that clueless.
He needn’t have offered—nobody else did. And he wrote not just once but many times through the years, as I applied for grants, fellowships, positions. He wrote a good letter; I never saw it but I was told it was eloquent. It opened doors.
Those were the only times we met. We corresponded when he was writing for The Nation and, for a time, so was I; we said, the next time I was in New York, we’d get together, etc., though we never did. But I remember him more fondly and vividly than I do almost any of my Columbia professors, as the one who had a sense that there was a person on the line, a person who might need a job.
I think that’s what we take from our teachers, finally, not so much information imparted as a sense of who they are.
I was struck by this from a short Times piece about Appiah, in which he mentions "the most important questions facing us — gender, the environment, animal rights." Are those the "most important questions facing us"? Discuss.
Not really, superficial similarities notwithstanding. When he endorses the theory of ideology and non-teleological historical materialism, that will be different. Right now he's at the moralizing utopian socialist stage.
Jonathan Wolff (UCL) comments. At the end of the column, he runs together two issues that should be kept separate: the combative nature of philosophy and how one should treat students. Professor Ishiguro's approach on the latter seems the right one, but that is independent of whether philosophy as practiced among peers should, or should not, be combative. Insofar as truth is at stake, combat seems the right posture!
Anthony Appiah (ethics, political philosophy, philosophy of race) at Princeton University has accepted a senior offer from New York University, to start in 2014. According to the university press release, "He will spend half the year in New York teaching in the Department of Philosophy and School of Law; the other half of the year he will teach and lecture at NYU’s other global sites, principally Abu Dhabi."
Derrick Darby (social & political philosophy, African-American philosophy), Professor of Philosophy at the University of Kansas, has accepted a senior offer from the Department of Philosophy at the University of Michigan, Ann Arbor, where he will start in fall 2014. With Darby and Elizabeth Anderson, Michigan will now be one of the very top choices for students interested in philosophy of race.
What do people think of grad students making their work available online early in their careers? It seems pretty common for students still doing coursework to post paper drafts on Academia.edu, even when the drafts are far from publishable, and I'm not sure if the potential advantages outweigh the potential costs. Having your name known is probably good, as are generating interest in your research, starting conversations with people working in the same area, and maybe showcasing your potential; on the other hand, giving people a preview of your immature work might cut the other way if, rather than recognizing in it scholarly potential, they just see bad grad student papers. Having early drafts online also means that reviewers for a journal can google the title of your paper and see who wrote it, thereby removing anonymity. I recently had a paper of mine show up in a collection of links on a blog I'd never heard of, and suddenly Academia.edu was informing me of dozens of views per day. For a moment I was excited, but then I remembered that people were reading a hastily composed workshop draft. I think the ideas and prose represent me well, but there are sections missing and the argument needs substantial work (as I learned when certain parts of it got shredded at the workshop). I thought about taking the paper down and leaving just an abstract, or removing it altogether, but I never made it past worrying. Did I err in leaving it up for public scrutiny?
We did touch on aspects of this topic briefly once before, but these questions are more specific. I'll note that this is a major issue in academic law because of the substantial use of SSRN. The question there is when junior faculty or job seekers should put their work on-line; my advice is not to put anything on-line until they think it's ready to be submitted for publication. The reason is simple: first impressions are sticky, and if you put a half-baked piece of work on-line, you're unlikely to get a second chance with that reader. I think the same advice applies to grad students--great to put work on-line that's at a stage where you and the faculty you work with think it's ready (or very close to ready) for submission to a journal, otherwise not.
What do readers think? Faculty, please use your name; students may use a distinctive pseudonym, but must include a valid e-mail address, which will not appear.
In the past, the U.S. has sometimes been described sardonically — but not inaccurately — as a one-party state: the business party, with two factions called Democrats and Republicans.
That is no longer true. The U.S. is still a one-party state, the business party. But it only has one faction: moderate Republicans, now called New Democrats (as the U.S. Congressional coalition styles itself).
There is still a Republican organization, but it long ago abandoned any pretense of being a normal parliamentary party. Conservative commentator Norman Ornstein of the American Enterprise Institute describes today's Republicans as "a radical insurgency — ideologically extreme, scornful of facts and compromise, dismissive of the legitimacy of its political opposition": a serious danger to the society.
The party is in lock-step service to the very rich and the corporate sector. Since votes cannot be obtained on that platform, the party has been compelled to mobilize sectors of the society that are extremist by world standards. Crazy is the new norm among Tea Party members and a host of others beyond the mainstream.
The Republican establishment and its business sponsors had expected to use them as a battering ram in the neoliberal assault against the population — to privatize, to deregulate and to limit government, while retaining those parts that serve wealth and power, like the military.
The Republican establishment has had some success, but now finds that it can no longer control its base, much to its dismay. The impact on American society thus becomes even more severe. A case in point: the virulent reaction against the Affordable Care Act and the near-shutdown of the government.
...though apparently well-known in other parts of the world. Based on an actual strike by Mexican-American workers against a zinc mining company, it is set in New Mexico, and uses actual mineworkers and their families in most of the main roles (including the male lead, Juan Chacon--his wife was played by a professional actress, however). The movie is strongly pro-union, unsurprisingly, but much more surprisingly, it has a strong pro-feminist twist, that wouldn't have been surprising in 1975, say, but is remarkable for 1954 (I won't spoil that part, you'll have to watch it). The main writers and directors were blacklisted for refusing to cooperate with the fascist McCarthy Committee, but like other communist sympathizers at the time, they were, in ethical sensibility, well ahead of their time. (At the end, the movie reveals which actors were "professionals" and which locals--there are several surprises.)
Matt Evans (ancient philosophy, ethics), currently Associate Professor of Philosophy at the University of Michigan at Ann Arbor, has accepted a senior offer from the Department of Philosophy at the University of Texas at Austin, to begin next fall. That's a significant boost for ancient philosophy at Texas, which should help re-establish it as among the top North American programs.
Peter Ludlow (philosophy of language, mind, cognitive science, & linguistics), Professor of Philosophy at Northwestern University, has accepted a senior offer from the Department of Philosophy at Rutgers University, New Brunswick, where he will also have an appointment in and serve as Director of the Rutgers Center for Cognitive Science. (No word on whether his avatar will also be moving, however!) That's a significant loss for Northwestern, though I would expect them to remain solidly in the top forty even without Ludlow. (I am told they will also be doing a senior search.)
He was Emeritus Professor of Philosophy at Syracuse University at the time of his death, and had also taught at Washington University in St. Louis, Bowling Green State University and the London School of Economics at various points in his career. He was well-known for his work on the foundations of decision and game theory, and their application to a range of issues in social and political philosophy. I will post links to memorial notices as they appear.
UPDATE: Dr. Anthony Fisher, a recent Syracuse PhD, asked me to share this obituary prepared by him and Prof. McClennen's wife, Ellen:
McClennen, Edward Francis II, aged 77, died November 2, 2013. Professor of Philosophy, Edward F. McClennen was a passionate thinker and an unrepentant liberal. He contributed foundational work to the field of decision and game theory with over fifty articles and the highly influential book Rationality and Dynamic Choice: Foundational Explorations (Cambridge University Press, 1990). He was truly an original philosopher who defended an unorthodox conception of rationality in the face of traditional theories of decision and game theory. He was a "maverick" in his field, as he would say. Ned believed that people could achieve extraordinary things by cooperating and devoted his later years to integrating economics and political philosophy in the service of a theory of a just civil society and government--a Rational Society, the title of his book manuscript.
Born in Cambridge, MA, Ned received a BA in philosophy from The University of Michigan in 1959 and a PhD from The Johns Hopkins University in 1968. He taught at Purdue University, Lehman College CUNY, Washington University, Bowling Green State University, The London School of Economics, and Syracuse University and was a visiting professor at Harvard University, The University of Pittsburgh, Rutgers University, The University of Rochester, The University of Western Ontario, and The University of Amsterdam. At Bowling Green, as the Ohio Board of Regents Eminent Scholar in Moral and Social Philosophy, he was co-developer of a program funded by UNESCO, the Kennan Institute and others that brought young Central and Eastern European scholars to the US after the collapse of the Soviet Union, as a means of helping them understand the relevance of new institutional economic theory for their reemerging nations. As Centennial Professor of Philosophy at The London School of Economics, he designed and administered a highly successful interdisciplinary Masters Program in Philosophy, Policy and Social Values. From 2005-7 he participated in a group of international scholars who were brought to Libya by Saif al-Islam Gaddafi to help in the writing of a new Constitution, for which Ned drafted the Bill of Rights. As many would attest, Ned was a gracious host who loved company. He enjoyed cooking fine food to share with others over long evenings of spirited conversation. He also loved art, a legacy from the George De Forest Brush side of the family, and he treasured the view of Pleasant Bay from the McClennen family home on Cape Cod. He is survived by his wife Ellen Esrock and his children, Nathaniel Esrock McClennen and Sarah Pearmain McClennen.
Mikkel Gerken kindly called to my attention this interview with Fred Dretske that was conducted by the students, named below, for an undergraduate philosophy publication, Tanken, at the University of Copenhagen; the students kindly gave permission to publish the interview.
Interview with Fred Dretske
By Tanja Tofte Bøndergaard and Linda Fønss
The Duke campus is beautiful and, to a newcomer, seems like a quiet place. Forbes has ranked this tranquil spot 7th on its list of “power factories”, but it is probably better known for its basketball team and its fierce rivalry with the University of North Carolina. This is where Fred Dretske, internationally acclaimed philosopher, has his office and where we have set out to meet him. It wasn’t written in the stars that Fred Dretske would become a philosopher. In fact, he was only a year away from completing a degree in electrical engineering when he decided that it was not the field of study meant for him.
“I was an engineer when I decided I didn’t want to spend my life talking to engineers. I would go to the coffee shops and overhear conversations and say: ‘Oh, that sounds very interesting.’ People talking about literature and sometimes philosophy. I didn’t know these people, but I thought it all sounded fascinating.”
It was a basic introduction to philosophy at college that got Fred Dretske so hooked that he decided to scrap four years of college and start over with the uncertain prospects of becoming a philosopher.
“I was so excited about the problems that I knew I wanted to do that for the rest of my life. I think it is an advantage to keep your career options open as long as you can, because you’re going to spend 40-50 years doing this and you want to make sure you get it right. I never regretted the decision I made; I know it was the right decision, because I’ve had fun my whole life. But had I stayed in engineering, I think I would have been miserable. You have to make a career decision at some time, of course, but put it off as long as you can.”
When Dretske had finally made his decision and, after two years in the army, entered the academic world of philosophy, he was talked into writing his PhD on the nature of space, time, and substance. However, he quickly abandoned this field and instead fixed his attention on the subject of perception in relation to epistemology, a topic to which he was far more dedicated.
“My first book, 'Seeing and Knowing', was about perception; what it takes to see something and how seeing is transformed into knowing. Well, once you start looking at all the ways we talk about seeing and the various ways we use perceptual verbs, there are enormous complications there that you have to sort through to make headway. My claim was that there was something called non-epistemic seeing, so that I can see the book on the table without knowing that it is a book, without even knowing what a book is: simple seeing. Then there is seeing what it is, seeing that it is a book; that’s epistemic seeing. This distinction, I think, is, once you understand it, pretty obvious. It doesn’t seem to me, though, that psychologists and neuroscientists are much interested in such distinctions. What they want to know is: “If you do something here, does it affect what’s there?” It seems to me that philosophers have something to contribute here. Seeing what is on the table is ambiguous between “seeing what it is that is on the table” and “seeing that thing which is on the table”. That’s a big difference, and a philosopher has to be sensitive to such verbal differences, or he’s going to get confused and not be of much help to anyone. Scientists aren’t particularly interested in these subtleties of language. They just get bored. Philosophers are fascinated by them ... I am anyway.”
Speaking of the importance of the humanities, this program features, among others, philosopher Debra Satz from Stanford, where 45% of the faculty are in the humanities, but only 18% of undergraduates are humanities majors.
Continental Philosophy: Farhang Erfani, a philosopher at American University, provides a useful set of links to news, events, interviews, reviews, videos, etc. related to "Continental philosophy" (broadly construed).