Wednesday, June 30, 2010

Ass[backwards]essment in Higher Ed

About a week ago, in a NYT article entitled "Deep in the Heart of Texas," professor and provocateur Stanley Fish lambasted the Texas Public Policy Foundation and Texas Governor Rick Perry for proposing that the evaluation of faculty should move to a more "consumerist" (Fish calls it "mercenary") model. The proposal would require college and university faculty to "contract" with their students, and it promises to reward--by as much as $10,000-- faculty who meet their contracts' terms. Who decides whether the conditions of the contract have been met? Why, the student-customers, of course. And how do they decide this? By filling out teacher evaluations at the end of their contract's term... er, I mean, at the end of the course.

I've never met a colleague, in any discipline of higher education, who unreservedly supports student evaluations. The reasons for most faculty's dislike of them vary widely-- some think evals are badly designed, some think they're weighted too heavily (or, less often, not heavily enough), some think the evaluation rubrics reward and punish the wrong things, some have just been burned too badly, too many times, by negative evals-- but almost everyone agrees with Stanley Fish that there is a fundamental misconception at work in what the results of student evaluations are assumed to indicate. The misguided "consumer contract" model of the classroom figures faculty as providing a service to their customers (students). Correspondingly, this model figures student evaluations as a kind of "customer satisfaction" gauge. The idea at work in this model is that students have a right to receive something in exchange for their tuition money (and, presumably, their effort and time), so student evaluations are a way of holding faculty accountable for upholding their end of the contract. Now, it most certainly is the case that student evaluations do hold (at least untenured) faculty accountable, and there are many good reasons to advocate faculty accountability. But what are they being held accountable to? Fish claims:

... what they will be accountable to are not professional standards but the preferences of their students, who, in advance of being instructed, are presumed to be authorities on how best they should be taught.

That's no small complaint. In addition to the very obvious problem of students perhaps not being the most authoritative or reliable judges of the merit of their instruction-- at least not at the time that they submit evaluations-- there are many other suspect variables at play in student evaluations. Students who perform poorly in a course tend to evaluate their professor's skill (disproportionately) more harshly, just as students who perform well tend to (disproportionately) laud their professor's role in their achievements. Similarly, students tend to rate professors higher in courses where they already have an interest or investment, and lower for courses that are "required," outside of their major, or outside of their strong skill sets. Courses held early in the morning consistently receive lower evaluations than classes held in "prime time," that is, between 10am and 2pm. In the humanities, writing-intensive courses score lower. In the social and natural sciences, courses with labs score lower. And that's not even to mention the host of other, more ambiguous and difficult-to-locate prejudices that factor into student evaluations, like the fact that young female professors and non-white professors consistently receive lower evaluations, harsher criticisms and, frankly, more abuse from students. Of course, that is not to say that student evaluations are without merit or useful information, but only to say that, well, they're not all they're cracked up to be.

I am not opposed to student evaluations in principle. I think they offer an interesting, even if not always entirely fair and balanced, picture of the quality of instruction in any particular course. And in the case of my own students' evaluations, I have often found indications of areas in which I needed to improve-- from requiring less (or more) reading per week, to controlling dominant students better in class discussions, to talking slower, to providing more feedback or responding to emails faster-- and I do my best to weigh students' satisfaction or dissatisfaction with my course against my own standards (and my discipline's standards) for what the course requires. But I do agree with Fish and many of the people who responded to his followup article that making student evaluations the sine qua non of judging faculty merit is grossly misguided. And to do so in such a grossly "consumerist" way (as Texas is proposing) is a recipe for disaster.

The truth is, there already is something like a "contract" in every classroom. It's called a syllabus. Faculty are held accountable to the criteria laid out in that document, and although it is not technically a legal document, it is a very close analogue. As opponents of tenure will undoubtedly object, there are very few consequences for tenured faculty who breach the limits of their syllabi, but untenured faculty are most certainly held to the letter of the law there. And, presumably at least, faculty who regularly disregard their syllabi and disrespect their students are not granted tenure. So, maybe there are a few rogue tenured faculty out there who are just phoning it in without any repercussions, but that hardly seems to warrant compromising the integrity of academia by imposing the consumerist model upon it.

There is just too much to lose by forcing faculty into some dumbed-down version of a fawning wait-staff. The merit of a college course cannot be reduced to the equivalent of some Facebook thumbs-up "like" image. Student evaluations are important, even uniquely valuable, elements in the broad measure of a faculty member's contributions, but they cannot be the first principle of that measurement. It's like asking a batter who has just struck out (or been walked) to judge the merit of the home-plate umpire. As the British painter and historian Benjamin Haydon once said: "Fortunately for serious minds, a bias recognized is a bias sterilized." When it comes to the role of student evaluations in faculty assessment, more serious minds are needed.

Saturday, June 26, 2010

(Un)Paint It Black?

Recently, Blogger (the host site for this blog) began offering several new design templates to its users, which has resulted in a lot of aesthetic changes to the Blogger-blogs that I read regularly. (The design changes to Ideas Man's and Chet's blogs have been particularly noteworthy.) This site's design has pretty much remained the same for the last 4 years, the only significant change being my move from the "narrow" to the "wide" format. However, I have received over the years more than a few complaints about the white-type-on-a-black-background design, which some readers find a little too hard on the eyes. I've resisted making any major changes so far, but now I want to offer you all a chance to weigh in on the matter.

Here's the deal: If the vox populi calls for change, I'll be happy to accommodate. But you should know in advance that I'm only going to make design changes. If you want the themes, content, or general ideological filter of this blog to change, you should probably just move on along to the next hitching post in the blogosphere.

So, please register your vote in the poll below. If you vote "yes," I would really appreciate if you could let me know in the comments section what you would like to see changed.

***UPDATE: Poll closed. Changes made. Hope you like the new look.***

Saturday, June 19, 2010

Let Them Play!

As I'm still not quite over my post-New-Orleans euphoric haze, I wanted to post one more thing about that city and its blessed horns before returning to more serious material on this blog. This time, however, the news is not good...

One of the most fantastic and unique things about New Orleans is its street musicians. A lot of American cities (though not enough) have street performers, but most of the musicians are solo artists. What's great about New Orleans is that it has entire brass bands who play on street corners (and in the actual streets) throughout the French Quarter and on its edges. So, it's nothing short of an absolute tragedy that, in the last week, the New Orleans Police Department has decided to begin enforcing a long-ignored noise ordinance (still on the books because of Louisiana's bizarre system of civil law influenced by the Napoleonic Code) in a bid to run street musicians off as early as 8pm. The NOPD reportedly has been "urging" street musicians, including bands like the To Be Continued Brass Band, to sign a statement acknowledging the noise ordinance. TBC Band, which has played on the corner of Bourbon and Canal, undisturbed, for at least the last 8 or 9 years, has refused to sign the statement and is at the center of a campaign (including a Facebook group) protesting the new action.

Let me just state the obvious here: New Orleans has a lot of things to worry about-- poverty, crime, an as-yet-unreturned post-Katrina diaspora, not to mention the dire threat to its economic and environmental recovery caused by the recent BP oil spill-- so it's just ridiculous that the NOPD (at the behest of its new superintendent, Ronal Serpas) has decided that street musicians are suddenly something to worry about. If anything, street musicians are one of the BEST things about New Orleans, standing as testament to the unlikely resilience and inexplicable vitality of that city's enduring culture. Silencing the sound of brass and snare and the group-calls of second lines on the streets of New Orleans, late into the night, is tantamount to throwing in the towel on the fight to preserve one of our country's greatest cultural crucibles.

Let. Them. Play.

Friday, June 18, 2010

Kermit and Me

I just returned from a long weekend in New Orleans, the second-greatest city in the U.S. South. One of my chief aims while in NOLA was to see Kermit Ruffins, an absolutely amazing trumpet player, co-founder of the legendary Rebirth Brass Band and, more recently, a regular star on the HBO series "Treme." Before I left Memphis for grad school, I used to make trips to New Orleans fairly often, and I both first discovered Kermit Ruffins and saw him play several times at Vaughan's Lounge back in the day. But this trip was my first return to the Crescent City since moving back to Memphis. My friends and I were lucky enough to be able to see Ruffins play at a great, medium-sized club on Frenchmen Street called the Blue Nile, where we literally got to stand less than 10 feet away from the band. (The picture of Kermit Ruffins to the left was taken by me from just that position.) And, just as I remembered, he was nothing short of a-maaaa-zing.

Now, this will come as no surprise to readers of this blog, but I love, love, LOVE good music. In general, I've never really been a huge fan of jazz, though I have always considered Bebop, lounge standards, and New Orleans jazz as exceptions to that general rule. New Orleans jazz is all about the horns-- and for all of the love and respect I give to the Memphis horns sound, there simply isn't anything like New Orleans horns. And as horns go, there isn't anything quite like Kermit's trumpet. So, when Kermit finally took the stage at the Blue Nile, and we found ourselves right on the front row, I just didn't think it could get any better than that.

But it did.

Now rivaling my experience last year at B.B. King's for Greatest Music Experience EVER is my night at the Blue Nile with Kermit Ruffins. Just after his first set, my friend Elizabeth and I decided we would try to go over and talk to Kermit while he was on break (eating boiled crawfish on the side of the stage), which was a surprisingly easy thing to do as it turned out. I immediately told Kermit all the things that star-struck fans do, including the fact that I used to watch him play at Vaughan's and that his version of "St. James Infirmary" was what made me fall in love with the song. He was just as friendly as his on-stage persona would suggest, and he chatted us up about New Orleans and Memphis and music and food. At some point, I returned to the bar for a refill and left Elizabeth talking to Kermit. Then, about 10 minutes later, one of Kermit's people came over and said he wanted to talk to me again. Whaaaa?

So I went back over behind the stage, where Kermit said to me: "Your friend says you can sing. Where do you sing in Memphis?" I told him that I sometimes sing at Wild Bill's, to which he responded with a smile and an "aaaaahhhh, yeah, Wild Bill's. That's some blues, right?" I said yeah, and told him he should stop in there the next time he's in Memphis. Then, Kermit asked: "You want to sing one with us tonight?"

[This space stands in for my utterly indescribable moment of simultaneous elation and panic. Really, there are no words.]

Of course I said yes, yes, YES!, I would love to sing one with him. He was so very nonchalant about it as he went back to munching on his crawfish and said: "Well, don't get too far from the front and we'll call you up later." I was just about to walk off when two things occurred to me: (1) I don't sing jazz, I sing blues. And (2) why does he think I can sing? I mean, I'm just some chick whose friend says she can sing. (You would think that a girl needs a little bit more than the word of an anonymous, intoxicated recommender to get on stage with Kermit.) So, stupidly, I turned back and said to Kermit: "How do you even know I can sing?" He said: "Oh yeah, right. Well sing me something now. Sing some of St. James." So, I did. And that was that.

He did call me up in the second set, and I did get to sing a number with him. Yeah, that's right, I GOT TO DO A NUMBER WITH KERMIT RUFFINS! Here's my photo evidence.

In the end, Kermit Ruffins was just as kind and funny and talented and down-to-earth as I had hoped he would be. And, just as I expected, he played a mean horn all night long. It used to be the case that a lot of Memphis music included horns as well, but that's not as true anymore-- though random sax players still regularly sit in at Wild Bill's. It's too bad, really, because horns make everything better... kind of like a Hammond B3 or a pedal steel does, in my opinion. (Right outside of the Blue Nile there was an incredible, probably 12+ piece, brass band playing on the street corner. And I can attest that if one horn is good, twelve horns will practically make you want to quit your job and live in a box on Frenchmen Street.) At any rate, my night with my friends and Kermit now counts as one of those things that, if I had such a list, I would PROUDLY and CONSPICUOUSLY cross off of my Bucket List.

Tuesday, June 08, 2010

Doing Harm

The Nobel Peace Prize-winning organization, Physicians for Human Rights, has just released a White Paper entitled "Experiments in Torture" that documents medical professionals' complicity with CIA human intelligence collection programs, which include the now-infamous "enhanced interrogation techniques" (EITs), in post-9/11 detention centers. There is, of course, a continuing (and, at least on one side, entirely bad faith) debate over whether or not EITs are technically equivalent to "torture," but the overwhelming consensus among international jurists, humanitarian and human rights organizations, medical professionals and just about anyone else with a working conscience is that EITs are both morally reprehensible and illegal. Unfortunately, very few gains have been made in the campaign to call the Bush administration to account for its responsibility in the initiation and subsequent justification of EITs, despite noble efforts by people like Senator Patrick Leahy (who called for the establishment of a truth commission to investigate these issues). I've posted several times on this blog about torture, a topic that constitutes a significant part of my current scholarship, and so it won't come as any surprise to readers that I am fully convinced that EITs are the equivalent of torture techniques. I've read some really shocking, heart-wrenching, gruesome and, quite frankly, thoroughly disillusioning material in the course of my research on torture, but I have to say that the PHR White Paper is a whole new low.

In sum, the findings of the PHR report are as follows:

Health professionals working for and on behalf of the CIA monitored the interrogations of detainees, collected and analyzed the results of those interrogations, and sought to derive generalizable inferences to be applied to subsequent interrogations. Such acts may be seen as the conduct of research and experimentation by health professionals on prisoners, which could violate accepted standards of medical ethics, as well as domestic and international law. These practices could, in some cases, constitute war crimes and crimes against humanity... The knowledge obtained through this process appears to have been motivated by a need to justify and to shape future interrogation policy and procedure, as well as to justify and to shape the legal environment in which the interrogation program operated.

The White Paper argues that the Bush administration's employment of medical professionals to "monitor" EITs was a way of pre-emptively protecting itself from charges that these practices were in violation of U.S. statutory and treaty obligations prohibiting torture. The hypothesis here is that advocates of EITs (like the Department of Justice's Office of Legal Counsel) presumed that if they could point to the presence and oversight of medical professionals in these interrogations, their presence and oversight would validate the Bush administration's redefinition of procedures formerly considered torture (like waterboarding, forced nudity, sleep deprivation, temperature extremes, stress positions and prolonged isolation) as "safe, legal and effective" "enhanced interrogation techniques." The problem is, according to PHR, illegal and non-consensual human experimentation also constitutes a "war crime" (and, when its perpetration is systematic and widespread, a "crime against humanity"). So, effectively, the Bush administration and the CIA employed one criminal act (human experimentation) to protect itself against liability for another (torture).

Even those of us who are not medical professionals know that one of the fundamental precepts of medical ethics is primum non nocere ("first, do no harm"). But the legal proscription of unethical human experimentation is also codified in the Nuremberg Code and the so-called Common Rule, both of which cover not only medical professionals, but also extend to any "research" conducted by the CIA or the Department of Defense. Despite the creative redefinitions of torture as "safe, legal and effective" by the Bush administration, its legal counsel and the executors of EITs, it is undeniably clear that, at the very least, those techniques DO HARM. The fact that they employed medical professionals to ensure EIT harm stopped just short of death, and that those same medical professionals recorded and documented the effects of EITs in order to "perfect" their maximum-harm-short-of-death potency, is just another addition to our nation's growing, yet still unacknowledged, registry of shames.

It's unethical. It's illegal. It's a disgrace to all of us in whose name it was performed. Arrest them all, I say-- the soldiers, the doctors, the politicians that authorized them, the lawyers that protected them, and the corporate-security leeches that profited (and continue to profit) off of them. Drag them all before a judge and a jury and let justice be served.

Monday, June 07, 2010

Parsing the "Anti-"s

Read no further if you're not willing to consider the possibility that "anti-Israeli state policy" claims are NOT tantamount to "anti-Semitism."

The recent deadly attack on an aid-bearing flotilla headed to the Gaza Strip has re-stoked the fires at home and abroad over Israel's continued blockade of Gaza. That blockade has been in effect since June 2007, following the election of Hamas to the Palestinian government, and has resulted in what practically every international aid organization recognizes as a full-blown humanitarian crisis. According to the politically-neutral American Near East Relief Association (ANERA), the ugly details of life in Gaza under the blockade are undeniably tragic. And yet, regrettably, the basic humanitarian dimension of this crisis continues to be eclipsed in domestic and international political discussion NOT by debates over what constitutes permissible or impermissible state-sponsored actions on the part of Israel, but rather by debates over what constitutes allegiance to or enmity towards the state of Israel itself. And so, again, critics of Israel's actions in the most recent flotilla incident find themselves on their heels, forced to elaborate the distinction between "criticizing the policies of a particular state" and "rejecting the right of a particular state to exist" over and over. And over. Of course, the suggestion that a particular state might not have the right to exist is an especially thorny one in Israel's case, but the tendency of organizations like AIPAC to suggest that all actions of the Jewish state are themselves representative of Jewish values only makes a thorny issue exponentially thornier. The elision of "anti-Israeli state policy" and "anti-Semitism," which dominates all considerations of the Israel-Palestine conflict, is an old (and increasingly dated) interpretive frame that just won't go away... and which, for the sake of Israelis and Palestinians alike, desperately needs to go away.

Let's just review, for a moment, the facts surrounding the recent Gaza flotilla incident. The explicit aim of the activists aboard the flotilla was to deliver embargoed goods to the residents of Gaza, of course, but it could be argued that the actual delivery of those goods was incidental to their mission. That is, it seems entirely reasonable to assume that a chief aim of the flotilla activists was simply to break the Israel blockade, full stop. Perhaps those flotilla activists were nothing more than humanitarians, perhaps they were conscientious objectors in the long tradition of civil disobedience, perhaps they were lawbreakers plain and simple. Whatever one may think about the legality or illegality of Israel's blockade and the morality or immorality of the flotilla activists' disregard of that blockade, it requires a gross and terribly prejudicial denial of the facts on the ground to claim that what is happening in Gaza today is not still a humanitarian crisis and that Gaza residents were not in dire need of the embargoed goods. So, the question is: why did Israel respond so violently to the renegade aid-deliverers?

Here is where an interpretation of the facts gets considerably murkier. Israel claims that its blockade was originally instituted, and has been subsequently extended, in an attempt to curb the terrorist threats (and actual assaults) of Hamas, which is the democratically-elected representative of the Palestinian people and which refuses to acknowledge the right of Israel to exist. But we should remember, as M.J. Rosenberg persuasively argues, that "Israel does not need permission from anyone-- let alone Hamas-- to exist. All it needs from Hamas is an end to violence and that is precisely what Hamas is offering, in exchange for lifting the blockade." Hamas has, in fact, offered Israel the promise of an indefinite cease-fire in exchange for lifting the blockade several times. And Israel has accepted that offer, then refused to live up to its end of the deal, several times. Assassinations, attacks, and political posturing of both the literally and figuratively violent sort have continued from both sides. Who is suffering the most in the interim?

The people of Gaza.

What seems beyond debate--outside of the echo-chamber of reductively pro-Israel talking heads-- is that the Gaza blockade is primarily intended to inflict collective punishment on the people of Gaza, regardless of their sympathy or lack thereof toward Hamas. A closer look at the details of the blockade (below), published earlier this week in The Economist, demonstrates exactly how needlessly obstructionist the blockade really is.

It's hard to look at those details and NOT think that the Israeli attitude toward Gaza residents is little more than "we intend to make your lives miserable by whatever means necessary." What is the strategic reasoning at work behind this blockade? Israel is both an occupying force in Gaza and controller of its borders (and everything that passes through them). For just a moment, let's put aside the question of the "right" of Israel (or Palestine) to exist. Let's put aside questions about settlements, reciprocal recognition, two-state solutions, or who counts as a terrorist organization. Do we have any reason at all to presume that every single resident of the Gaza Strip is a member of Hamas and wants to see the legitimacy of the state of Israel obliterated from the face of the planet? Is the blockade really fueled by the spirit of anti-anti-Semitism?

Quite simply, no. It is not. But the more (morally and politically) questionable Israeli state actions become, the more it becomes necessary for advocates of Israeli state policy to mask those actions in the protective garb of anti-anti-Semitism. The discussion between Peter Beinart and Steven Rosen on NPR this past weekend ("Criticizing Israel, Outside of Israel") is an excellent study in the nuances of "anti-Israel-state-policy" vs. "anti-Israel" vs. "anti-Semitic" positions. Beinart, exhibiting a graciousness and patience well beyond that of his interlocutor, repeatedly emphasized that his own position, which is highly critical of Israeli state policy, ought to be considered of a kind with other pro-Israel advocates. He placed himself and other critics of Israel state policy squarely within the noble tradition of conscientious objectors, who realize that sometimes the very best expression of one's love for a country is to criticize it when it seems to have gone astray. Rosen countered with the (highly-suspect) proposition that one is either a "friend" of Israel or her enemy, and to be a "friend" of Israel means to back her actions, warts and all. Rosen stopped short of outright accusing Beinart of anti-Semitism, but the very strong inferences that could be easily drawn from his argument were all but an accusation.

Now more than ever, the force of this rhetoric needs to be explicitly rejected. Back in 2002, then-President of Harvard University (currently, Director of the White House National Economic Council for Barack Obama) Larry Summers caused a bit of a stir when he said:

Profoundly anti-Israel views are increasingly finding support in progressive intellectual communities. Serious and thoughtful people are advocating and taking actions that are anti-semitic in their effect if not their intent.

In response to Summers' claim, (progressive intellectual) Judith Butler published an essay in the London Review of Books entitled "No, it's not anti-semitic", in which she tried to make the case for the separation of Zionism from Jewry, the separation of the state of Israel from the Jewish diaspora, the separation of political critique from hate speech. Butler wrote her piece in the context of what she viewed as a threat to academic freedom on the campus at Harvard, but her words extend beyond the bounds of Cambridge. From Butler:

Here, it is important to distinguish between anti-semitic speech which, say, produces a hostile and threatening environment for Jewish students – racist speech which any university administrator would be obliged to oppose and regulate – and speech which makes a student uncomfortable because it opposes a particular state or set of state policies that he or she may defend. The latter is a political debate, and if we say that the case of Israel is different, that any criticism of it is considered as an attack on Israelis, or Jews in general, then we have singled out this political allegiance from all other allegiances that are open to public debate. We have engaged in the most outrageous form of ‘effective’ censorship.

A couple of years later, Butler revisited this same issue in the fourth chapter of her excellent book Precarious Life, in which she argues that one (among many) casualties of the prohibition against criticisms of Israeli state policy is our ability to "mourn" Palestinian lives. That is, the formal or informal restriction of our ability to criticize Israel compromises us as members of the human community, effectively blinding us to the suffering of a certain segment of that community. What is, perhaps, the most regrettable and offensive dimension of this censorship is that it exploits one people's historical suffering to justify the exploitation of another's.

The public shame that we have learned to associate with anti-Semitism since World War II is a legitimate and earned one, but its current deployment in the service of deflecting any and all criticism of Israeli state policy is itself shameful.

Saturday, June 05, 2010

In Praise of Smartypants

This morning, in stereotypical egghead fashion, I tuned my radio to the Saturday NPR programming, poured myself a cup of coffee, and sat down to read the most recent (June 7) issue of the New Yorker. In the mini-essay section "Talk of the Town," located near the beginning of every New Yorker, there was a piece by Rebecca Mead entitled "Learning By Degrees," which questioned both the logic behind and the wisdom of a recent trend in advice-giving that cautions young people against attending college. This is the advice of Richard K. Vedder (Professor of Economics at Ohio University and founder of the Center for College Affordability and Productivity), for example, who recently told the New York Times that, because eight of the top-ten job categories that will add the most employees in the next decade (e.g., home-health aides, customer service representatives, store clerks) do not require post-secondary schooling, young people are better off taking the money they would have spent on a college education and spending it somewhere else, like on a house. This kind of advice seems to be finding a particularly receptive audience even among professional academics, largely because of the recession's negative impact on employment and the corresponding bleak outlook for recent grads looking for jobs. According to Vedder and those sympathetic with him, high schools would be better off equipping students with the skill-sets they need to enter the workplace (which he lists as "the ability to solve problems and make decisions," "resolve conflict and negotiate," "cooperate with others," and "listen actively") rather than treating all students as if they should be readied for college. (Whatever that means. Vedder doesn't clarify how "readying students for college" might not include equipping them with the same skills, or equipping them with different skills.)

Mead also notes our collective "romantic attachment" to the figure of the successful college dropout (a la Steve Jobs or Bill Gates) that, when coupled with the unapologetic anti-intellectualism currently in vogue in political discourse, serves to buttress the case of skip-college advocates far and wide.

I soldiered on through the New Yorker this morning (including an excellent piece on the goalkeeper for USA's World Cup team, Tim Howard), but couldn't quite shake the sting of that first little essay. So, like anyone who wants to restore his or her faith in humanity does, I went to check in with the Facebook to see what my little corner of humanity had to say for itself. Alas, right there on my Recent News Feed was a link to this article from the Washington Post about conservative revisionist historian Earl Taylor, president of the National Center for Constitutional Studies, who has been touring the country and mis-educating an angry and naive portion of our citizenry about my beloved U.S. Constitution and the Founding Fathers. Taylor, clearly some perverse ilk of originalist, advocates in his seminars (among other absurdities) that the original intention of the Founding Fathers would not have supported any of the Constitutional Amendments ratified after the 10th Amendment. (For those of you keeping score at home, that means no abolition of slavery, no federal income tax, no women's suffrage, no Presidential term limits and, most curiously, neither the prohibition of alcohol nor the repeal of that prohibition.) And, of course, in textbook originalist fashion, Taylor speculates that if the Founding Fathers wouldn't have endorsed it, neither should we. Now, Taylor's wacko interpretive frame for the Constitution aside, what disturbed me most about this article was the repetition of a sentiment that I had seen in the earlier New Yorker piece, namely, an unreflective and self-congratulatory embrace of anti-intellectualism that almost bordered on... well, stupidity. At the end of the article, Taylor speculates that America is facing a moral and ideological crisis, one for which we are ill-advised to consult the learned among us. The plain-spoken, Main Street, libertarian and so-called populist teachings of his seminar, contrary to the egghead speculations of the amoral (or immoral) educated elite, are the only true salve. Taylor remarks:

When it is all said and done, there will have to be good people who have answers. These things have to be taught far and wide. It's right and it's good, and it's not limited to just a few uppity-ups.

So, I read this piece this morning, adjusted my spectacles, re-tied my smoking jacket, confronted my godless world and heaved the kind of sigh that registers the same "oh-the-humanity" despair that the sight of a helium-filled promise engulfed in flames inspires. I mean, really? Don't go to college because higher education only equips you with impotent, unmarketable skills? Because it necessarily resigns you to the disconnected, useless, irrelevant class of the "uppity-ups"? Really? Seriously, what's so scary about the smartypants?

Rebecca Mead, bless her heart, did try to make the case for a generalist liberal-arts education (even with a major in Philosophy!) near the end of her New Yorker piece, arguing that "what an education might be for" is something other than, but not totally unrelated to, getting a job. Putting aside for a moment the highly specious suggestion by Dr. Vedder that a college education does not equip one with fundamental (read: "marketable") skills-- and also putting aside the ressentiment-fueled, misdirected populism of the "good" Mr. Taylor's smug dismissal of "uppity-ups"-- what is needed in this debate, in my view, is a vigorous re-assertion of the "value" of education for a citizenry. One of the chief problems with a reductively "economic" view of the "value" of education is that it forces us all into the exclusive category of "worker." Whatever does not maximize the output of our work or, correspondingly, maximally increase the exchange-value of our work, is taken to be of diminished (i.e., merely supplemental) value. What is missing in that sort of evaluative schema, I think, is the very basic insight that the merit of the contributions of all "workers" is (or should be taken as) equivalent to the merit of their contributions as "citizens." (Arizona-inspired sympathies notwithstanding.) Mead gets at exactly this point when she writes:

One needn’t necessarily be a liberal-arts graduate to regard as distinctly and speciously utilitarian the idea that higher education is, above all, a route to economic advancement. Unaddressed in that calculus is any question of what else an education might be for: to nurture critical thought; to expose individuals to the signal accomplishments of humankind; to develop in them an ability not just to listen actively but to respond intelligently. All these are habits of mind that are useful for an engaged citizenry, and from which a letter carrier, no less than a college professor, might derive a sense of self-worth...

Indeed, if even a professionally oriented college degree is no longer a guarantee for employment, an argument might be made in favor of a student's pursuing an education that is less, rather than more, pragmatic. (More theology, less accounting.) That way, regardless of each graduate's ultimate end, all might be qualified to be carriers of arts and letters, of which the nation can never have too many.

I can hear the objections of Dr. Vedder, Mr. Taylor and anti-uppity-ups decrying Mead's insight as the predictable prejudice of smartypants everywhere. But, even if one were to grant the reductively utilitarian objections of their negative argument (i.e., "one should not go to college because it is not an economically sound decision"), what is their positive argument? Is it-- rather, can it be anything other than-- a counsel to become a "worker," and no more than a "worker," just for the sake of what makes a worker a "worker"? Isn't that, in the end, the heart of Marx's critique of capitalism? That it reduces the "human being" to the "worker" and, consequently, makes him or her nothing more than one more variable in the algorithm of profit-maximization?

There is, of course, an almost perfectly mathematical rationale behind that logic, but it is one that ultimately valorizes the "business" model and, consequently, erases the contribution of the citizen as independently significant and meaningful, because it necessarily conflates the worker and the citizen. My obligations and duties as a citizen must be calculated independent of my interests as a worker, if for no other reason than that the very definition of a "citizen" includes (and sometimes negates) a social dimension that exceeds my own personal interests, my own individuated pleasure and pain. Academia, for all its faults, is one of the last places that human beings are allowed to consider their roles as workers, as citizens, as individuals, as family or community members, in conjunction or disjunction qua parts of what it means to be a complex human being. And it is one of the last places that human beings are given exposure to the whole history of human reflection upon how those conjunctions or disjunctions ought to be prioritized. For many, many college graduates, even a cursory encounter with Homer's Iliad, or Shakespeare's Richard III, or Aristotle's Nicomachean Ethics, or Darwin's On the Origin of Species, or Franz Boas' The Mind of Primitive Man, or Immanuel Kant's Groundwork for a Metaphysics of Morals, or Michel Foucault's Discipline and Punish can serve as the spark for a whole life of dedicated critical and intellectual engagement. Perhaps more importantly, exposure to those texts and those thinkers can serve as a bulwark against the dangerous seduction of rhetoricians (and so-called "populists") like Glenn Beck, Sarah Palin and Rush Limbaugh, whose anti-intellectual diatribes slowly and steadily lull the populace they are supposedly defending into self-defeating quietism.

If the choice is between unproductive workers and uncritical citizens, which is what the current business-model approach to higher education seems to favor, let me go on record as favoring the unproductive workers over the uncritical citizens. Why? Because unproductive workers who are also uncritical citizens have no means at their disposal for questioning the social, political, and economic frameworks that imprison them in the restrictive and dehumanizing mode of workers. It is clearly in the interest of that model to figure "the educated" as elitist and disconnected from the "real" concerns of workers, and to dissuade workers from becoming educated themselves, but I think the left-leaning character of the Academy-- so bemoaned by libertarians and conservatives-- is evidence of the disingenuousness of that rhetoric.

Take note, workers (and potential workers) of the world: If you're worried about getting a job, the smartypants are not your enemies.

Wednesday, June 02, 2010

My Poor Memphis

The New York Times ran a story earlier this week entitled "The New Poor: Blacks in Memphis Lose Decades of Gains," which painted a very grim picture of the recession's effects on African-Americans in Memphis. According to most demographers, Memphis will soon be the first metropolis in the U.S. with a predominantly black population, which means that our fair city is often looked to as a prognostic indicator for "black urban life." The NYT article describes Memphis as a city in which the black middle-class was, up until very recently, on the steady rise. But the two major effects of the recent recession (unemployment and foreclosures) are ravaging the black community disproportionately when compared to white communities, and that is the real tragedy of this story. Memphis is not the only place in the country where this is happening; it's just the place where what is happening is the hardest to deny.

A few weeks ago, our mayor (A.C. Wharton, Jr.) testified before Congress about the Reverse Redlining Lawsuit that the city of Memphis filed against Wells Fargo. "Reverse redlining" is a special variety of predatory lending, which has literally eviscerated black communities in Memphis, and which has its own racially-charged history. As you may recall, before Title VIII of the Civil Rights Act of 1968 (a.k.a., the Fair Housing Act) was in effect, urban communities were starved for credit and denied loans for decades by "redlining" investment and lending practices. Banks, insurance companies, employers, even supermarkets literally drew a red line on a map around certain communities, most often racially determined, in which they would not invest. The Fair Housing Act, for the most part, outlawed redlining practices, which made possible the rise of a black middle-class in cities like Memphis over the last three decades, giving blacks access to homes, jobs and credit that translated into accumulable and, eventually, accumulated wealth. However, it now appears that all of that progress has been, or soon will be, reversed. The banks returned to their old tricks during the so-called housing "boom," this time with the more insidious practice of "reverse" redlining. Rather than targeting black communities for the denial of loans and credit, financial institutions literally flooded those "redlined" areas with exploitative loan products that have now drained residents of their wealth... and, significantly, drained the city of its progress.

Mayor Wharton called this the "changing face of discrimination," astutely linking the racist practice of reverse redlining to the racist history of redlining. If predatory lending is the chief culprit in our current recession, Wharton wants to make sure we all know that predation is not, and never has been, equally distributed. In his testimony to the U.S. Congressional Judiciary Subcommittee on the Constitution, Civil Rights and Civil Liberties, Wharton did not mince words when he stated:

Simply put, predatory lending is to this generation what 'no lending' to Blacks and Latinos was a generation before.

Although the NYT article did include a few remarks from Mayor Wharton, and although it did make mention of Memphis' lawsuit against Wells Fargo, the fact that the article is titled "The New Poor" indicates that (per usual) they have failed to connect the dots when it comes to race, class and the law. The "new poor" is not new. It refers to the same people, the same communities, who are made and kept poor in the same ways that they have been for the entire history of our country. Intentional blindness to the direct lineage that connects Jim Crow redlining to Obama-era reverse redlining is just another failure to see the big picture. Race continues to operate as the primary American filter through which all goods, privileges, rights and resources pass. Mayor Wharton implored Congress to connect ALL of the racial dots, rather than just the "cultural" ones, when he said: "For those who would say that the lawsuit we've filed is aligned with our city's history for 'singing the blues,' I assure you that this is not the case. We are not 'singing the blues'-- we are 'crying foul'."

Foul, indeed.

Tuesday, June 01, 2010

Say What?

Straight from the You-Gotta-Be-Kidding-Me Files, we have this update from the Supreme Court of the United States: if you want to invoke your right to silence, you better say so. OUT LOUD.

Oh, SCOTUS, why do you hate Miranda so?

As you may remember from Civics class, the 5th Amendment to the U.S. Constitution guarantees all of us the right to remain silent, that is, the right not to incriminate ourselves. And as you no doubt remember from every Law&Order episode ever, suspects are to be reminded of this right when they are arrested-- before they are interrogated-- and arresting officers are to be sure that suspects understand their Miranda rights, even if that means translating the warning into the suspect's native language. Until this most recent Supreme Court decision (Berghuis v. Thompkins, 08-1470), an arrestee's silence was not considered to be a waiver of these rights. But in a 5-4 decision today, SCOTUS ruled that remaining silent was not tantamount to invoking your right to do so.

If it weren't such a devastating blow to civil rights, the decision would be almost comical. There is more than a bit of cartoonish absurdity involved in the logic that concludes one must speak in order to invoke his or her right to be silent. (Reminds me more than a bit of the old "Duck Season, Wabbit Season" sketch.) In her dissent, our newest Justice Sonia Sotomayor got right to the heart of this absurdity, writing: "Criminal suspects must now unambiguously invoke their right to remain silent – which, counterintuitively, requires them to speak... At the same time, suspects will be legally presumed to have waived their rights even if they have given no clear expression of their intent to do so. Those results, in my view, find no basis in Miranda or our subsequent cases and are inconsistent with the fair-trial principles on which those precedents are grounded." Yeah, counterintuitive is an understatement.

The majority opinion attempted to draw a parallel between an arrestee's right to an attorney, which he or she must explicitly request, and an arrestee's right to silence. If you're being interrogated and you want the interrogation to stop so that you can consult legal counsel, you have to ask for that. If you cannot afford legal counsel, the court is obliged to provide it for you, but there is nothing to stop police from proceeding with their interrogations and investigations in advance of your invoking your right to an attorney. According to Justice Anthony Kennedy, the same logic applies to an arrestee's right to silence. Unless he or she says-- out loud and in an unambiguous declarative statement-- that he or she is invoking the right to silence, then it should be assumed that said right is being waived.

Of course, the problem with this parallel is that it seems entirely reasonable to presume that a suspect who has been informed of his or her right to an attorney, and who does not ask for that attorney, is effectively waiving his or her right to one. (At least temporarily, because we know, of course, that this right can be invoked at any time.) On the other hand, it seems entirely unreasonable to presume that a suspect who has been informed of his or her right to remain silent, and who remains silent, is not in effect invoking the right to do so as a "right." At the very least, it seems safe to assume that he or she is NOT "waiving" the right to remain silent by remaining silent!

What's more, there is an easily anticipated slippery slope that proceeds from the logic of this decision. For example, what exactly are we going to require suspects to SAY in the course of invoking their right to silence? It's going to have to be something more than "I don't want to talk" or "I won't answer any questions," because neither of those positive declarations is substantively different from simply remaining silent. Will courts require that suspects say something like "I invoke my right to remain silent" or "I am exercising my privilege against self-incrimination"? Again, I'm not sure that's substantively different than remaining silent. What's left to make sense of the SCOTUS decision except a demand that suspects declare something similar to the "pleading the Fifth" statements that are often delivered in trials?

When witnesses plead the Fifth in trial, their statement of that plea usually follows this form: "On advice of counsel, I invoke my right under the Fifth Amendment not to answer, on the grounds that I may incriminate myself." In a courtroom, no inferences at all can be drawn from this declaration. But I wonder whether or not such a statement, inside a police interrogation room, would be treated so judiciously. My guess is that it would be taken, effectively, as an admission of guilt-- or, at the very least, as a justification for heightened suspicion-- which would no doubt undermine further the already-waning presumption of innocence in our legal system.