Monday, September 22, 2008

      Blogging in the Classroom

      I'm trying out a new pedagogical technique in all of my courses this semester. I've set up a blog for each course and have required students, as a part of their grade, to contribute regularly to those sites. In one of my courses, blog posts and comments are the only writing students are required to do, though the total amount of writing (in word-count) is equal to what I would have them write anyway. In the other two courses the blog contributions are supplements to more traditional writing assignments (like a seminar paper). I will admit that at least part of my motivation for experimenting in this way was to try to bring philosophy coursework into the 21st century... but that wasn't my chief reason.

      My real justification for this experiment is two-fold. First, I think one of the chief advantages of a "class blog" is that it provides a space for maintaining a consistent and uninterrupted conversation about the subject matter outside of the regularly scheduled "class time." I'm sure all of us hope that our students carry their reflections on the course material with them when they leave the classroom and, if we're really hopeful, we probably imagine them talking about these sorts of things in their dorm rooms, over lunch or beers, at parties. I suspect that some, maybe many, of them do this... but the truth is that students today are overburdened with extra-curricular activities and commitments (at my institution, they're called "co-curricular" activities and commitments), so there are numerous artificially-imposed limits to the attention students can direct at any one particular subject. So, my requirement that students participate in and keep up with their class blog is a bit of a ham-handed way of forcing them to see their work in my class as extending beyond the 3 hours that they spend with me every week. Also, and not unrelatedly, I have learned over the years that students often spend their time in class absorbing and attempting to process new material, which means that they often can't formulate something reflective to say until after class is over. How many times have we all had that experience where a student comes and speaks with us during office hours and says something particularly astute and relevant, prompting us to ask, "Why didn't you make this comment in class?!" The blog allows for just this sort of lag-time, giving students a chance to come back and make that comment that didn't occur to them until class was over. Philosophy is best done in conversation, and no "natural" conversation has a 50- or 75-minute time limit. 
The blog also allows for semi-tangential or moderately-relevant contributions, which we often need to squash in class but which make for a deeper and more comprehensive consideration of the material.

      Second, the more you write, the more you write. Because blog-writing requires not only "posting" (equivalent to "essay" writing) but also "commenting," students end up writing more often... and just plain more. I'd like to say something like "the more you write, the better you write," but of course that is not always the case. Nevertheless, developing the habit of writing regularly is one particularly effective way, in my view, to improve one's writing. An added advantage of blogging is that everything that students write for the course is subject to the scrutiny of the entire class (rather than just me, as the "grader"). My experience so far this semester is that students' writing is of a higher quality because they know that everyone will be reading it. There is less misspelling, less sloppy grammar, less weak argumentation. And, in a sense, everyone must "edit" his or her ideas in response to the comments of his or her classmates, which is another invaluable writing skill. Although I was initially worried that blog-writing, because of its shorter length, would result in incomplete or merely pithy essays, I find that this limitation in fact forces students to distill and focus their thoughts into the fewer words they are allowed. So, gone is all of the "fluff" material that we often find in student papers (biographical information, long quotations, irrelevant opining, repetitive argumentation). And finally, students at last are allowed to view their writing as a manner of engaging ideas and other people, as another way to have a conversation, rather than some purely utilitarian tool in the service of a grade.

      For those who are wondering about the "nuts and bolts" of this practice, here's how my class blogs work: Each course has a blog (here, here and here, on Blogger) that is "public" in the sense that anyone in the world can view it, but "private" in the sense that only members of the class are authorized to post or comment. Students have a set number of posts and a set number of comments that they are required to complete before the mid-term, and another number of posts and comments that must be completed after the mid-term. Participation above and beyond the minimum requirement is rewarded. There is a minimum word-count for posts. Post authors are responsible for responding to any direct question or challenge that appears in the comments to their posts. And, finally, I don't grade each post individually, but rather I give a "blog participation" grade at the midterm and again at the end of the semester based on the quality and quantity of the student's writing.

      So far, I'm happy with the results of this experiment, though I intend to evaluate its effectiveness again at the end of the semester, as well as distribute a "student survey" to gauge students' experience with the blogs. Even if this fails, we at least will have saved some trees this semester!

      UPDATE: Read the post-semester follow-up post: Blogging in the Classroom, Revisited

      Sunday, September 21, 2008

      The "Handwriting" of College Radio

      Yesterday, in my capacity as the faculty advisor for Rhodes Radio, I was a part of the committee charged with interviewing and selecting the next General Manager for the radio station. Because our little Rhodes Radio is still in its infancy stage, in a town with an abundance of colleges/universities yet a paucity of independent/college radio stations, the selection of our next (read: "second") General Manager was an important one. I was relieved to find that the two student finalists were both excellent candidates for the position, and they both gave impressive and mature interviews, which made the decision extremely difficult.

      One (of many) reasons that it was so difficult to pick a General Manager for Rhodes Radio is that we needed someone not only with discipline, leadership, time-management skills, and the ability to navigate a tremendous amount of stress, but we also needed someone with a "vision." As we all know, independent/college radio is an endangered species-- but it's a quirky, strange, and beautiful little animal that I, for one, don't want to see die. The students who are in college now are probably one of the last generations that will remember what it was like to listen to the radio... and that memory is fading fast, even for them. This is too bad, really, because the current college-age generation is also extremely savvy about their musical tastes. They listen to a broader and more diverse range of tunes, partly as a result of being able to surf-and-compile their own "playlists" rather than listening to the repetitive cycles of commercial hits by corporate radio stations (almost all of which are beholden to contracts with record labels and long ago lost anything resembling a "love" of music). But the problem, of course, with this current generation's music-listening habits is that they have become more and more solipsistic, more and more isolated, more and more individualized and, consequently, less and less communal.

      College radio is the last bastion of that old, paradoxical approach to broadcasting, which is both "independent" and "communal." That is, college radio is the last place where "communal" doesn't mean "commercial," and "independent" doesn't mean "idiosyncratic." In my view, it requires a fairly sophisticated sensibility to get this, even more to implement and sustain it, and I do not envy the job of the General Managers whose charge it is to do that.

      You can imagine, then, how pleased I was to find that our new General Manager at Rhodes Radio "gets" it. Not only does he get it, but he can articulate it, and he seems to have some pretty good ideas about how to achieve it. It's always a risk to go with the "big idea" candidate, because he or she is invariably untested and, hence, unproven. But let me tell you what won me over in this case: In his application, which included an essay describing the candidate's "vision" for the radio station over the next two years, he listed all of the requisite "pragmatic" plans that need to be implemented (fundraising, standardization, promotion, etc.)... but he spent most of his essay explaining why college radio matters and what it should be. He noted the "sense of smart independence" that mainstream radio cannot and does not offer, and "the DIY vibe that only non-commercial, volunteer radio provides." He described college radio as "the box in the middle of town that gives anyone with something to say a place to stand up and have their voice heard." And then there was this:

      Few people on campus get hand-written letters anymore, and yet e-mailboxes are bursting at the seams. Let’s remember what handwriting looks like. Let’s humanize music again. Let’s get back to our mix-tape days, where music told you something about the person and the way they worked. Let’s provide students an alternative to their computer-screen-headphone-personal-playlist mentality by making the musical experience more communal than individual.

      Yeah, that's something I can believe in. What's more, I think that sort of vision is about the only way to keep college radio alive and flourishing. Because, the truth is, college radio doesn't run on money (which we don't have) or technological innovation (which we can't afford) or mass appeal (which would require a bigger antenna, which we don't have and can't afford), but rather college radio runs on the passionate investment of people who believe in it, who work hard for it, who don't want to see it die, and who care that it still bears the mark of their community's handwriting.

      Monday, September 15, 2008

      The Trouble with Rapture

      With Hurricane Ike, Lehman Brothers' bankruptcy, and the sale of Merrill Lynch to Bank of America, it was pretty rough last weekend in the news. So, you may be interested to learn that according to The Rapture Index, which is a "Dow Jones Industrial Average of end-time activity," we're sitting at a very uncomfortable level of 162. (Anything above 160 is practically apocalyptic.) The rapture, of course, is a central component of Christian eschatology, and it refers to a future event in which it is believed that Jesus Christ will descend from heaven and judge the quick and the dead in advance of his establishing the Kingdom of Heaven on Earth. According to Paul's first epistle to the Thessalonians, believers will literally be swept up into the clouds to join Jesus... and the rest of us will be left, damned, wondering where they went.

      Like many people who grew up in the church, I suspect, the rapture was a mysterious, frightening, and practically ubiquitous imaginative possibility in my youth. I remember watching a movie about the rapture in church camp one summer, in which the film's protagonist (a young girl about my age at the time) wakes up to find everyone in her family missing from her home. (As predicted in Scripture, the rapture had come "like a thief in the night.") It was creepy and utterly terrifying to me, and I'm sure it inspired my taking serious inventory of my pre-teen soul. I suppose there are a range of intensities with which one can believe in the rapture, but for "true believers" (who must be seriously disturbed by our current "162" ranking on the Index) this always-anticipated but never-precisely-anticipatable event adds a level of urgency and magnitude to all of our otherwise mundane actions and experiences.

      There's been a lot of harping on Sarah Palin recently for her belief in creationism. Last December on this blog, I offered a criticism of then-Presidential-candidate Mitt Romney's creationism in a post titled "The Trouble with Fossils." I was particularly disturbed by Romney's (ultimately untenable) claim that "to be asking presidential candidates about their specific beliefs of books of the Bible is, in my view, something which really isn't part of the process which we should be using to select presidents." Of course, the "specific book of the Bible" to which Romney was referring is Genesis, and the "specific belief" that he felt should not be an element in our selection process is creationism. I argued that Romney had missed the Good Judgment Boat on two counts with that remark: first, by believing in a literalist rendering of the Biblical creation account and, second, by believing that such beliefs are irrelevant criteria in the process of selecting a President.

      As I hope was obvious in that earlier post, my criticism of Romney was directed less at his particular beliefs about the origin of the world, but rather primarily at his beliefs about (1) what constitutes "good judgment" and (2) the role that our evaluations of candidates' powers of judgment should play in the election process. To believe that the world and all that is in it was created in a span of six days and nights, all scientific evidence to the contrary, is a manifest demonstration of bad judgment. But I can imagine ways in which this sort of judgment-- IF we consider it as a judgment of meaning and not a judgment of fact-- may serve other, quasi-justifiable ends in one's life. So, even though I would seriously question the judgment of any candidate who regularly disregards the legitimacy of scientific truth, my skepticism might be somewhat assuaged by the realization that s/he at least judges the world in which we live to be a meaningful and purposive place.

      Belief in the rapture, on the other hand, bothers me both as a judgment of fact and as a judgment of meaning. I do not want the Leader of the Free World, with his or her finger on whatever "button" might destroy said world, to believe (1) that this world is a fallen and temporary place, (2) that the "end" of this world is an event to be welcomed and possibly facilitated, and (3) that his or her eternal happiness is being postponed by the perpetuation of this world. In short, I don't want a "Rapture-Ready" President. In fact, I want a President who is absolutely terrified by the prospect of the rapture, and who instead focuses his or her energies on bringing about justice in this world. I want a President who sees the troubles of our times as problems to be solved, not signs of the apocalypse.

      And, pace Romney, I think this process of judging candidates' powers of judgment is exactly the process we should be using to select a President.

      Sunday, September 14, 2008

      R.I.P. David Foster Wallace

      Novelist, MacArthur "genius" and postmodern wunderkind, David Foster Wallace, was found dead in his home on Friday night after hanging himself. He was only 46.

      I know that just a week ago I was poking fun at the cult-status of Wallace's Infinite Jest. I feel bad about that now. So, let me say for the record that one of the things that I most admired about Wallace was that he somehow managed to find and maintain that delicate balance between a skeptical, disillusioned and exceedingly-academic nihilism on the one hand, and a probative, resilient and thoroughly genuine belief in the inherent meaning and meaning-making genius of humanity on the other hand. Wallace was exactly the sort of "postmodern" with whom I am the most sympathetic, and about whom the right-wing critics of postmodernism understand nothing-- that is, the sort of guy who believes wholeheartedly that "the Emperor wears no clothes" and yet still recognizes that Emperors are people, too, and it must really suck to be caught out there in front of everybody all naked and vulnerable like that.

      From a commencement address that he delivered at Kenyon College in 2005:

      But most days, if you're aware enough to give yourself a choice, you can choose to look differently at this fat, dead-eyed, over-made-up lady who just screamed at her kid in the checkout line. Maybe she's not usually like this. Maybe she's been up three straight nights holding the hand of a husband who is dying of bone cancer. Or maybe this very lady is the low-wage clerk at the motor vehicle department, who just yesterday helped your spouse resolve a horrific, infuriating, red-tape problem through some small act of bureaucratic kindness. Of course, none of this is likely, but it's also not impossible. It just depends what you want to consider. If you're automatically sure that you know what reality is, and you are operating on your default setting, then you, like me, probably won't consider possibilities that aren't annoying and miserable. But if you really learn how to pay attention, then you will know there are other options. It will actually be within your power to experience a crowded, hot, slow, consumer-hell type situation as not only meaningful, but sacred, on fire with the same force that made the stars: love, fellowship, the mystical oneness of all things deep down.

      Not that that mystical stuff is necessarily true. The only thing that's capital-T True is that you get to decide how you're gonna try to see it.

      This, I submit, is the freedom of a real education, of learning how to be well-adjusted. You get to consciously decide what has meaning and what doesn't. You get to decide what to worship.

      Friday, September 12, 2008

      In Praise of "Very Short Introductions"

      This semester, I've decided to use a few of the texts from the Very Short Introductions Series published by Oxford University Press in my courses. These very small, very cute, and very inexpensive little books, according to OUP, offer "concise and original introductions to a wide range of subjects." There are almost 200 titles in the series now, ranging from very broad topics like "Ideology," "Globalization" and "Sexuality," to more specific topics like "Medical Ethics" and "Chance Theory," to topics that focus on a single figure like "Mandela" or "Foucault" or "Darwin." My experience with this series so far has been that the "very short introductions" (hereafter, VSI's) that address topics about which I am already familiar seem like fair, even if cursory, treatments of their subject matter. And the VSI's that address topics about which I am not familiar have, in fact, fulfilled the promise of their title, providing very useful bibliographies as well as a broad conceptual map to help in navigating the new material.

      One of the challenges of teaching "intro to philosophy" courses, which tend to be organized as historical "survey" courses, is that they can often feel self-defeating. That is, many students come to intro courses without much (or any) experience with what it means to read or write or think "philosophy," and they are then bombarded with the details of Descartes' Meditations, or Kant's Groundwork, or Plato's Republic without any sort of meta-structure in which to situate these figures and arguments and assign them real importance. For many students, in my experience, intro "survey" courses end up requiring them to memorize material-- what are the three formulations of the Categorical Imperative? What is the ontological proof for the existence of God?-- that doesn't have any meaningful "uptake" (to use Austin's term). So, predictably, they forget the details of philosophy as soon as the course is over and, what's worse, they don't get much sense of what "philosophy" is apart from those details.

      The VSI's are helpful in getting students to the thinkers by way of the ideas, instead of the other way around. So, I'm peppering them in with the regular, orthodox tomes as an experiment this year. I'll let you know how it goes.

      Tuesday, September 09, 2008

      Is he still around?

      There are plenty of nut-jobs out there who dominate the news for a brief time, then sort of fade away. While they're lying low, it's easy to forget about them-- until they raise their ugly heads again and show up on something like VH1's The Surreal Life. Such is the case with David Duke (pictured left, in all his glory), who is a former Louisiana State Representative, a former Presidential candidate and, oh yeah, former Grand Wizard of the Ku Klux Klan. Duke is not going to star on The Surreal Life-- which is perhaps unfortunate, because I wouldn't mind seeing him go mano a mano with someone like Pepa or Janice Dickinson-- but he is crawling out of his hole again.

      In one of the most brilliantly titled articles I've read in a while-- "Racist A-Holes to Gather in Memphis for Convention"-- I learned that, just 4 days after the Presidential election, Duke and his cronies will be coming to Memphis "to say clearly that neither Black Radical, Barrack Obama [sic], nor Mr. Amnesty, John McCain truly represent the will of the American people." On Duke's website (which I am not linking to on purpose), I learned that their conference is called the "European American Conference" and also that "college students will tell you that a university education today is a guilt-trip for whites."


      Monday, September 08, 2008

      2 years and 10K people

      Sometime last night, this blog received its 10,000th visitor. Crossing the 10K mark seems like a big deal, though I know that there are plenty of blogs out there that get that many visitors in a day. Even still, the event arrived at a serendipitous time, as this week also marks the 2-year anniversary of my blog.

      I wanted to extend my sincere gratitude to those who have stuck with this blog over the last couple of years. As Prof. Grady once said, the life of a blog really happens in its "comments"-- and that has definitely been the case here. In the true spirit of this site, you visitors (and lurkers) have motivated me to read more, write more, think more and be more. Thank you.

      Sunday, September 07, 2008

      Wake Up!

      I'm teaching a course on "Existentialism" this semester, which is not only one of my favorite philosophical movements, but also one of my favorite things to teach. As I've said to my colleagues many times before, existentialism is the one philosophy that seems to have been created for 18- to 25-year-olds. The list of existentialist themes and tropes reads like a "Greatest Hits" of philosophy. Freedom. Death. Angst. Identity. Alienation. Authenticity. Teaching existentialism to undergraduates is like handing out candy to babies. It's probably not that good for their (mental) health, but it's so, so very tasty and delicious.

      For a lot of us, there came a point in our educational journey when we learned that existentialism was passé, and we were encouraged to quickly dispatch with it if we intended to do "serious" philosophical work. It's difficult for me to explain exactly why or how this happens... even in my own case. I suspect that it has something to do with the reductive and cartoon-y version of existentialism that is hawked in a lot of philosophy classrooms (and conferences), which tends to (mis)represent existentialism as a school of naive beliefs in "the subject" and his or her "absolute freedom." Or it may be a result of existentialism's undeniable "popular" appeal, which is always a black spot in the opinion of The Academy. Or maybe it's because existentialism, strictly speaking, was one of the most short-lived philosophical "movements" in history. (If you mark existentialism's beginning with Jaspers and Heidegger in the 30's and its end with the "death of the subject" in the late 60's... well, that's only about 3 decades, if you're being generous!) Whatever the real reasons for its dismissal are, I find myself seriously questioning them whenever I teach existentialism again.

      Last week, we were covering Kierkegaard's Fear and Trembling in my class, a text which almost always reveals something new to me whenever I re-read it. This time around, I was struck by Kierkegaard's account of the churchgoer who, upon hearing his or her preacher's sermon on Abraham's binding of Isaac, falls asleep. Kierkegaard lambasts any version of watered-down Christianity that transforms the horrible and horrifying story of Abraham and Isaac, in which the "father of faith" can only be "understood" as a murderer or a madman, into some easily digestible morality tale. Kierkegaard asks what should be the obvious questions: how can Abraham serve as the Christian model of faith? And, if he is the model of faith, how can we (Christians) not be shocked and horrified by that implied imperative? And, more importantly, why in the world does this story not keep us up at night?!

      I suppose there's a sense in which I want to ask the same questions about existentialism that Kierkegaard asks about Christianity. When did those texts and thinkers get so "watered-down" and hackneyed that we practically fall asleep when reading them (or induce sleep when teaching them)? This query is not unrelated to my last post, in which I suggested that we should not give up on the possibility that college courses can be life-changing. But nobody's life gets changed if they're asleep, or bored, or so busy with skills-acquisition that they can't muster the energy for serious self-reflection. Kierkegaard is right, I think, to argue that the only catalyst for "being more" (or, at the very least, "being differently") is to be shocked out of complacency. For some students, the texts themselves will bring about that transformation... but for the rest, we teachers must wake them up and keep them awake.

      And that means that we have to be awake first.

      Thursday, September 04, 2008

      Drinking the "Liberal Arts" Kool-Aid

      I know, I know. I should be writing about Sarah Palin's speech at the Republican National Convention. But I just can't bring myself to do it. I'm still shocked and dismayed that I didn't see one single non-white face in the Convention audience on television last night. And I'm also still amused that, at one point, the television cameras were panning the Convention floor, and one of the signs that read "Country First" was partially blocked so that it looked like it read "...try First." That cracked me up.

      Instead, I want to weigh in on "the first-year college experience," which is the topic of a couple of interesting posts at Perverse Egalitarianism ("Welcome to College. May I Take Your Order Teach You Something?") and Dead Voles ("The Wonders of College"). Both Mikhail and Carl make short work of dispatching with whatever romantic notion we might have of the "first-year college experience," which most of us believe to be eye-opening, mind-expanding, and life-changing for our young charges. According to a study conducted by sociologist Tim Clydesdale, discussed in his book The First Year Out: Understanding American Teens After High School, very few fresh college students experience any change in their identities, values, religious or political views during their first year. Rather, according to Clydesdale:

      Most of the mainstream American teens I spoke with neither liberated themselves intellectually nor broadened themselves socially during their first year out... What teens actually focus on during the first year out is this: daily life management.

      So, Clydesdale warns, don't expect your first-year students to climb atop their desks and yawp "O Captain, my Captain!" They're too busy trying to figure out how to balance the demands of life without parental control: studying, partying, working, making friends, managing finances, waking up on time. And while they're doing that, they're keeping the characteristics that have defined them so far in what Clydesdale calls an "identity lockbox."

      Of course, this can't be true of all first-year students, and Clydesdale acknowledges that a (very) few of them do in fact have the deep and profound experience that college brochures promise. But the sorts of students who are inclined towards peripeteia and anagnorisis are basically just little versions of us and, according to Clydesdale, most of them grow up to be professors just like us. This is how we keep the dream alive.

      In the end, Clydesdale advises that we basically give up on that dream. Stop telling first-years that your course is going to expand their world-view, because it probably won't. Instead, he advises that we focus on skills-development, which is not only more attuned to what new college students want, but what they need. At least in part, I agree with Clydesdale that teaching critical skills to new students-- how to read well, how to write well, how to think well-- is time and energy well-spent. But, having drunk the liberal-arts Kool-Aid a long time ago, I still believe that those skills are partly useless, and mostly meaningless, if they are not contextualized within some "for the sake of which."

      For the time being, I'll keep my Rule #1 as it is... including the "Be More" part. Maybe I'm just preaching to the Future Professors and Purveyors of the Dream Choir, but maybe not. At any rate, despite the nay-sayers, I'm going to keep practicing a lesson I learned from the Republicans last night: "...try First."