Showing posts with label technology. Show all posts

13 May 2013

And now for a new journey into the unexpected.

Some fifteen years ago, perhaps as many as eighteen, I was approached at my old university about the possibility of doing an online course.

Online education was in its infancy, at least at major universities (I'm not saying there weren't early adopters, only that there weren't many). Options were extremely limited -- no videos, very little ready-made content from publishers, and only the most basic chat functionality.

I said no.

The technology was too primitive, and I wasn't sure how to bring a real course to students with such tools. In fact, it was more or less a correspondence course that exchanged email (and file upload and storage) for snail mail.

Things change. These days I am responsible for helping people teach online as well as being an online teacher myself, and of course the tools have changed significantly. Courses are very full featured and can be very rigorous (although just as in traditional classrooms, rigor is not always offered nor sought out). I am still not sure online education is a substitute for traditional education (caveat: studies do show that objective measurements of content learning are comparable in online and traditional classes, but I'm not talking about objective measurements...I'm talking about the co-curricular aspects of a course and of college itself), but I won't deny that it has opened up possibilities for non-traditional students that were hard to imagine before online education existed.

All this as prelude to the fact that for the first time ever I will be attempting to teach a composition course online. I know I'm not the first to teach composition online, but it will be my first time, and therefore, I'm busy with pacing, assignment sequence, and the wonderful logistics of getting students to peer review using the horrible tools Blackboard provides.

I will most likely be posting updates as time goes on this summer.

10 May 2013

MOOCs and Their Discontents, Part 1: Financial Winners and Losers

Anyone who deals with higher education has heard of MOOCs -- pronounced exactly like the ethnic slur, but spelled differently -- and the controversy surrounding their emergence, dissemination, and utilization. The Massive Open Online Course promises at this point to give access to education previously out of reach for the unwashed masses: lectures from Harvard and MIT, for example. Moreover, MOOCs promise not only to give access to these courses, but also to provide some form of "credit" for completing them. "Credit" is in quotation marks not to undermine the legitimacy of the courses, but rather to indicate that it can mean anything from a printable certificate, to a badge, to actual college credit hours, depending on the university and course.

It's this last part that has some faculty and administrators nervous. For instance, why should Student A pay $1500 for a 3 credit course in Western Civ from their local college when they can transfer in the credits from passing their free MOOC course in Western Civ? For faculty, the danger is that with fewer courses to teach, fewer faculty members are needed. For administrators, it amounts to roughly the same thing: fewer courses means fewer tuition dollars rolling in. While many administrators delight in the idea of destroying faculty power and reducing the labor costs associated with faculty, the sane ones understand that one can't really burn the village to save it: the administrators only have jobs because of the surplus value they've extracted from faculty labor.

So who benefits from MOOCs?

The first answer would seem to be students. After all, if you can get for free what used to cost you $1500 (and we can extrapolate beyond my one example to, let's say, the maximum credits an institution would allow a student to transfer in -- pretend it's 30 credits, and that's a savings of $15,000), then it seems like you benefit. Furthermore, you could argue that perhaps the student is getting better instruction from a Harvard MOOC than they would be getting from their local college or university. I'll let that point go for now, though: the issue of MOOC quality is another post altogether.

The second answer, I would argue, would be the big name brand universities that produce the MOOCs. Harvard and MIT, along with a few other well-respected universities, are nationally known names that can attract participants (or at least enrollees) in their MOOCs based on name recognition alone. Even the slightly curious might sign up for a Harvard MOOC in The Heroic and Anti-Heroic in Ancient Greek Civilization. The university gets free publicity and good will -- not an entirely bad thing when some segments of society love to hate you -- and perhaps revenue down the road through selling course materials, advertising, and other peripheral products. I would argue, though, that the MOOC revolution will stand or fall based upon its ability to make money.

The losers here seem to be nearly every other college and university. It's hard to see how they make any money accepting transfer credit from MOOCs. Now it's easy to say that they don't make any money now off transfer credits, regardless of where the student earned them, and that's true. However, it's also true that currently most students are paying for those credits somewhere, and if MOOCs become more accepted for transfer credit, fewer students will be paying community colleges, state schools, and even small liberal arts colleges for those credits, which means that collectively those schools will lose out on a substantial amount of revenue. More importantly, those students who spend four years at an institution, in other words those students who take all or nearly all of their undergraduate credits from your institution, may decide that summertime is better spent earning six free MOOC credits than six paid credits from your summer offerings.

Small schools can't even get in the game in the same way as the big players, although some will try, I'm sure. However, unlike traditional online courses (it feels odd writing "traditional" and "online" together like that), which many small schools have shown are viable forms of outreach for them, these MOOCs do not generate revenue: they are IT resource intensive for a small school and would to a large extent cannibalize the pay-for-credit offerings those schools have already put online.

It's this dynamic, free courses for students and the consequent squeeze-out of the small schools by the big names, that sets the stage for the questions that follow: the pedagogy and the ideology of the MOOC. MOOCs threaten fundamental structural changes in higher education in the way that online education never did: online education has been quickly subsumed into the traditional structure of colleges and universities, either as separate "world campuses" or as another delivery method in the existing continuing education structure. Sure, online education has given rise to diploma mills like the University of Phoenix and Capella, but even that phenomenon isn't new -- it's just easier to get access to, and with student loans as a lucrative revenue stream, it's not going away anytime soon. The MOOC threatens to do away with revenue altogether -- it doesn't just divvy it up differently. As such, it strikes at the core of the current university model.

23 September 2011

Yet again, there's no free lunch.

On The Guardian's website, Dan Gillmor raises a very good point about our increased reliance on and desire for technological interfaces in everyday life. We love the convenience of mobile phones, GPS, and the like. We enjoy the "free" services provided by facebook and, well, blogger.

At facebook, we go apoplectic when they make changes to the interface, acting as if we've paid dearly for a product that the company won't keep as we want it, when really we've paid absolutely nothing...at least in material compensation (we have paid quite a bit in privacy and provided companies like facebook with valuable marketing information, so in essence, they're the ones getting something for next to nothing).

Gillmor argues, though, that facebook is really only the tip of the iceberg. As our devices get smarter and more interconnected, they and we become reliant on, and visible to, the global network of data exchange, and that exposes us to ever more pervasive surveillance. Speaking of the GM OnStar service, Gillmor paints a rather dystopian future:
We're only at the beginning of this trend, I fear. Someday soon – count on it – governments will order car makers to install software and communications "services" that give government not just the power to know where you are, but also to govern your top speed or, should it decide it needs to do this, stop your car, dead, on the highway.
I submit it's not terribly far-fetched to speculate in this manner.

Moreover, it raises the point, uncomfortable to many, that Marx was more right than even he knew about the long-term effects of Capitalism. Capitalism created the modern consumer and through the mechanism of commodity fetishism we are being drawn ever deeper culturally into a world in which we become the objects we consume; our identities are no longer even ours, but are rather pieces of data shared around the world and marketed back to us.

Concurrent with the market infiltration of our everyday life, we have the rise of the surveillance state that grows, through our own desire for consumer objects, in its ability to track us and our activities.

Which is not to say that technology is bad. However, we do grow closer to those dystopian imaginings of the 1980s and 1990s in which the only people who can effectively resist the state are those who can re-program or disable the surveillance, like Neo in The Matrix or the Gene Hackman character in Enemy of the State, who exists completely disconnected from the grid and whose most dangerous moments occur when he must reconnect for brief periods.

Once again, the piper gets paid one way or another.


28 June 2011

Perhaps the nearest sign that I'm growing crotchety and old.

I like newspapers.

I like holding one and reading the columns. I think there's an entire ritual that's passing away centered around sofas, coffee tables, and bulky Sunday papers with all those sections and circulars and supplements.

I don't know the economics of it, but I wonder if -- counter to all our amazement at the joys of the internet and the economic engine we believe it to be -- the internet hasn't killed not only the newspapers but also the entire economic system around it, from advertising artists and salespeople to printers and shippers and paper suppliers. Like I said, I don't know if there's a net gain or loss economically, and since I'm not Michael Gerson, I'm not going to write some utterly uninformed piece about it.

Besides, I like the internet, too.

Now it used to be that if I wanted a copy of the Post, I plunked down my coins and picked up the daily (OK, I actually subscribed when I lived in DC, and would be subscribing still if I weren't in PA). You paid for it. And advertisers paid for it. Then the internet came along and we all thought news was free. Newspapers were caught in a bind: they had to get onto the internet or become irrelevant, but the moment they got on the internet they undercut their print editions. People won't pay for internet content...or so the theory goes.

Even papers you never had to pay for are struggling in the internet age.

One of my particular joys in living in DC, especially when I was in my twenties, was reading the CityPaper's matches section. I especially liked the "none of the above" category, because it had the potential to supply in three or four very short lines astounding humor. Pair those ads with the ludicrous porn shop ads and there was great clipping material to send to friends in faraway places.

Then Craigslist came along.

I like Craigslist, too, but it's too easy. The trolls aren't terribly inventive, and the potential for surprising humor just isn't there, except in the area where musicians try to form bands...that can still be comedy gold.

Ben Franklin got his start printing papers. When papers close, old Ben sheds a tear.

26 July 2010

No stone unturned.



During the Vietnam War, it became increasingly evident that television had changed the war. Not only did television speed up the home front's access to information about the war, but it also brought the war vividly into everyone's evening news. Unlike the newsreels of World War II, which were highlight clips shown in movie theaters, the news reports from Vietnam showed reporters in the midst of firefights; the chaos of the war entered the living room.

Compared to nightly news reports, newsreel footage is quaint, sterile, distant, and downright naive:



In the decade and a half between the fall of Saigon and the opening of Gulf War I, the government and the televised news media learned some important lessons. For the government's part, they learned they had to control the message, so they released footage of "smart bombs" and held press conferences explaining exactly what was happening (or at least what they said was happening), and that information was dutifully lapped up and disseminated by the various news organizations.

News organizations, in particular CNN, had learned that war was not an event to be reported but a bankable commodity to be exploited. War coverage could be branded and developed: panels of experts could convene, pre-packaged pros and cons could be aired as if they were open debate, and occasionally an overview of the war, complete with military supplied footage and analysis, could occur. CNN saw the war as an incredible visibility boost, and of course marketed their coverage and references to their coverage to convince viewers that they were a reliable source for information:



More importantly, though, they branded the war. It became a show, complete with recognizable graphics and theme music:



But you don't have to take my word for it; you can read Baudrillard's excellent The Gulf War Did Not Take Place for a more lucid analysis of the media victory in Gulf War I. While some illiterate morons believed Baudrillard was arguing that the Gulf War was a hoax (much like conspiracy theorists argue about the moon landing), Baudrillard's points consisted of a critique of the mediated nature of the event and of whether the action actually satisfied the definition of war as opposed to massacre.

The advance of the First Gulf War was live 24-hour coverage and the development of stations devoted to nothing but news (which of course meant nothing but infotainment, since hard analysis doesn't sell and there's not enough news to fill 24 hours unless you repeat it, extend it, manipulate it, and turn it into an event). The advance of the Second Gulf War and the Afghanistan War (perhaps we could label both neatly as "Bush's Boondoggle" or "Middle East Adventurism") is the advent of the internet.

Digital recording has made (to use CNN's term) "iReporters" out of nearly everyone. Cheap cell phone images have fueled the cable channels' speculation shows, while higher quality hand held recording devices and widespread internet connectivity have allowed nearly anyone to produce and disseminate footage (and the accompanying phenomenon of "viral video" simply drives home the point that the production, dissemination, and consumption of images cannot be contained or controlled by the traditional media infrastructure).

Digitized material spreads beyond the control of its producer or its original broadcaster. Derrida argued that all text is "always already" beyond the control of its creator and especially so if it becomes public discourse (and you have to have a sense that Emily Dickinson understood that as well when she wrote that "publication is the auction of the mind"), and in the internet age the avenues of dissemination are simply multiplied and accelerated. They approach "real time," the "real" being more of a tease, a promise of revelation that often doesn't materialize or disappoints. Much like the CNN reporters of the 1990's (and present) who often stand around desperately trying to fill time in order to fulfill the promise of presence, the internet as entity promises everything -- unmediated access to information without respect to broadcast schedules, as well as an unfillable archive of everything that has ever happened.

Into this medium springs Wikileaks, a site whose visibility depends upon its access to formerly secret information; like most news sites, its raw material is information, but unlike other news sites, it doesn't do anything with the raw material: it simply dumps it on the internet, making it freely available to anyone with an internet connection. Wikileaks represents the next watershed in the public relations of warfare, which is to say in warfare. Prior to the Vietnam War, the military and government could rely on a distance between the war zone and the home front; prior to the First Gulf War, the military and government could rely upon the dominant model of infotainment to spin their messages (and the embedded reporters of Gulf War II simply represented a tremendous advance, both in terms of control and in terms of PR victory, in the military's response to that model); however, the internet age represents a challenge that Lyotard first identified back in 1979 in The Postmodern Condition: A Report on Knowledge: control of information will be the dominant field of warfare or interstate rivalry:
Knowledge in the form of an informational commodity indispensable to productive power is already, and will continue to be, a major -- perhaps the major -- stake in the worldwide competition for power. It is conceivable that the nation-states will one day fight for control of information, just as they battled in the past for control over territory, and afterwards for control over access to and exploitation of raw materials and cheap labor.
In other words, knowledge as commodity has always served traditional interests. Wikileaks represents a denial of knowledge as commodity, or at least in the traditional sense. However, the news outlets who have always made information their stock in trade will find no real challenge from wikileaks -- they have simply been given immense raw material with which to work; the real challenge is to the government and the military, who are now finding that just as battlefield television cameras brought their combat actions under intense scrutiny, wikileaks (and the internet in general) will now bring their internal discourse on war into the light and under the same intense scrutiny.

Endlessly.

22 June 2010

The age demanded...

So we live in the age of twitter, which may in fact be perfect both as a communications means and a symbol of a pathetically shallow and simplistic culture. In an age where Obama's recent speech, written at a nearly tenth grade level, apparently is too difficult for most Americans to understand, Sarah Palin comes to the rescue with her twits, as reported by cnn.com:
"RahmEmanuel= as shallow/narrowminded/political/irresponsible as they come,to falsely claim Barton's BP comment is 'GOP philosophy,'" Palin also tweeted in reply to Emanuel's comments.
Deep. Really deep. Her argument is ironclad, her support unimpeachable. Sure, you could go on and on detailing how Barton's comments, while completely at odds with the PR desires of the Republican Party, actually reflect the laissez-faire attitude of the party and its belief that corporations trump government, but I'm already beyond 140 characters and therefore way beyond the attention span of Palin's supporters.

Until I can pare that down to a series of grunts and hand signals, I'm afraid I will not be able to communicate with the right wing.

29 March 2010

How did it come to this?

How did our nation end up in the mess it's in?

We are the world's wealthiest nation. Our education system, for all the criticism it takes, is fairly extensive and in many cases exceptional. Our institutes of higher education are magnets for international students, demonstrating global esteem and at least the perception of quality. A great many of us have instant access to information through the internet, and since nearly every American household has cable/satellite, we are subject to a barrage of news and information...oh.

Wait a minute. Maybe we have too much information, as that old band The Police once sang. Not too much information as in let's stifle it and censor things and close avenues of communication, but too much information as in we're not processing it properly. And if we don't have the tools to process it properly, we're simply awash in information, with little way to get our bearings as to which is good information and which is bad.

Just as the advent of the newspaper allowed information -- and let's not forget, gossip -- to spread at exponential rates (see Balzac's Lost Illusions for an excellent commentary on the then state-of-the-art, lightning-fast communication), and television did the same thing in the 1960's (bringing, among other things, the Vietnam War direct to the American public), so too have the internet and the spread of cable infotainment channels like CNN and Fox revolutionized information access and transmission.

YouTube allows the semi-intelligent to become celebrities for a short time by doing stupid things to their bodies (and for the stuff YouTube won't show, there's 4chan) or by filming their children in states of dental-sanctioned inebriation. Andy Warhol's prediction becomes absolutely prophetic.

Unfortunately, our ability to process the information seems not to have kept pace with the access to it. It's become even worse since Fredric Jameson talked about "total flow" back in the early 1990's in Postmodernism, or the Cultural Logic of Late Capitalism.

Poststructuralist attacks on the notion of objective presentations of truth were necessary interventions that dislodged the monolithic power both of myths of the state (see the Schoolhouse Rock videos of American history) and of media as a fundamentally objective pursuit. Unfortunately, the right wing has by and large failed to understand these arguments and incorporated only the first part into its analysis both of poststructuralism and of the media. Interestingly and paradoxically, the right wing is quite comfortable arguing that poststructuralism is morally bankrupt because it denies objective truth ("eternal, universal, and natural God-given truths"), while at the same time adopting poststructuralism's critique of that sort of truth as it condemns the "liberal media."

What's missing, of course, is the second part of the poststructuralist critique, one that Derrida for instance was at pains to return to again and again (see "Violence and Metaphysics," Of Grammatology, Spectres of Marx, or nearly any of his late works -- the quickest gloss may be "Violence and Metaphysics" contained in Writing and Difference -- see esp. pp. 128-29): that the absence of an unmediated access to universal truth does not mean that we can therefore throw out standards of judgement. It's quite simple, but easily forgotten in the easy soundbite of "moral relativism" that right wingers like to throw around.

I'll skip a bit here, but suffice it to say that eventually we get around to the idea that it isn't so much knowledge that's power -- at least culturally -- but the transmission of information, good or bad. Conspiracy theories, which used to be confined to small groups of isolated crackpots, are now given the power and reach afforded by globally linked communities. Neither the speed nor the format of information lends itself to extended critique or immersion in the object: instead we are immersed in an unending stream of information that doesn't separate the latest Disney-channel star's scandal from market news or political maneuvers -- other than the fact that the scandals are given higher billing and more air time.

So we have the advent of the Tea Party movement -- a gathering of malcontents (which isn't a bad thing in itself) whose numbers wouldn't qualify them for any sort of attention in the days when the supposedly evil mainstream media (and look, I have plenty of critiques of traditional media outlets, but I'm really tired of the idea that they can all be collapsed into some monolith -- the great media conspiracy theory) actually evaluated the newsworthiness of events and movements. However, in these latter days of news as entertainment, we have Fox in particular actively promoting the Teabaggers -- surely an odd position to be in if one is interested in notions of "objective journalism" (of course, I'm all for reportage, but there's a fundamental difference between activist-journalists filing reports for explicitly aligned outlets and a major news corporation pursuing a "news story" as though it's part of their new fall line-up).

Stupidity parades itself around on the basis that the "mainstream media" has silenced the "real story." Truth claims can't be evaluated because "liberals" (who are all at once progressives, communists, socialists, and fascists) won't let the truth be told. Conversely, mainstream television pundits, whose truth claims are eviscerated on a daily basis by, of all things, a comedian, are the heroes of the teabaggers, who apparently have no critical faculties for evaluating truth claims. Evidently having been served miserably in their varied educational histories, they are unable to distinguish between liberals, communists, and fascists. All are wrong, and all are one.

Stupidity is on tour, coming to a city near you, in the form of the Tea Bag Express.

21 November 2008

In partial explanation of my absence, in lieu of a doctor's excuse.

I'm at a conference on teaching at the college level, so I'm getting a bit of a break from the daily grind. Conferences are usually energizing experiences for me, since the environment is intense and the locations are, if not exotic, at least out of the ordinary. However, I can't help but think that the main problem with college-level education conferences is that they tend to present information I learned as an undergraduate as if it were new material. Hey, look -- students learn more when they have to manipulate content rather than simply take notes at your lecture! Oh, check this out -- varied assessment techniques are more valid than just midterm and final!

However, the best one I've had so far was a session on using technology in the classroom -- or to be more specific, it was a session on "trends" in technology that could be applied to education. In other words, it turned into a "did you know that there are things called blogs that can be used for interacting with students?" and a "search engines can pull up all sorts of information about you, even information you may not have provided yourself."

I kid you not. 2008. Late 2008.

But I'd be remiss if I didn't tell you that the food was good, the accommodations pleasant, and that some of the sessions have been much better than those I've chosen to outline. I suppose my big problem is that I'm comparing it to English conferences, in which you'd most certainly get laughed out of the room if you tried to present some twenty-year-old reading of Moby Dick as if it were something new (not that I haven't seen my share of bad English conference papers, but in general you get savaged in the question-and-answer session for presenting old ideas as new discoveries).

I've also discovered that most of the people at this conference aren't actually trained education researchers -- they're college professors from one discipline or another who have a great interest in their teaching, and it's good they have that interest; they're probably all good teachers, dedicated teachers. However, that doesn't mean they can design valid research studies on educational models. For instance, I question the validity of a study in which students are divided into two groups: one group takes the course online, the other takes it in a traditional classroom setting. Learning is measured by a multiple-choice post-test that the traditional students take closed book in class under time constraints, and the online students take online under time constraints. The instructor seemed to think that the time constraints precluded online students from looking up answers, even though the tests were based on readings in their textbooks (hello, google books anyone, or even the good old fashioned method of having the book open and marked to key chapter summaries, bullet lists, etc.).

Maybe I'm just a curmudgeon.

12 February 2008

Untimely meditations.

I wonder about the future of literacy.

Democracy assumes both equal access for all and an informed electorate -- for instance, theoretically one should know about the candidates in a given race prior to voting, or be somewhat apprised of the issues of a referendum that might be on the ballot. Obviously, equal access is a bit easier to enforce than an informed electorate.

For some strange reason, I feel that literacy is tied to an informed electorate. Maybe I'm being idealistic, but I'd like to think that being able to read Hegel or Emerson or Dewey should be a skill more widespread throughout our society. For one thing, it would knock morons like Thomas Friedman and George Will off the editorial pages, because a populace that could make sense of Emerson's "The American Scholar" (for instance) wouldn't stand for two minutes the sort of mushy thinking those two goons spout.

Literacy isn't simply the ability to read. Most first graders can read words on a page and figure out simple directives like "Employees must wash hands before returning to work." (and here again, literacy doesn't necessarily have anything to do with following those directives...) Literacy involves critical thinking skills because it requires us to make meaning of the words we've read -- an always imperfect task, but one which even Derrida argues is a necessary task.

I suppose much of my worrying is due to the reach of technology into our leisure time: just as recorded music more or less killed family musical recitals as a form of popular entertainment, so too will Guitar Hero, Nintendo DS, and Virtual Worlds kill what little time remains to our thoughts after cable television carved out its chunk.

Am I sounding too Frankfurt School?

Maybe I'm just getting old.

26 October 2007

Being sick can make time for reading otherwise worthless columnists

Another thing I did while I was sick was sit around and read the paper. Thoroughly. In fact, I read two papers. Thoroughly. Because we get home delivery of both the Washington Post and, as of recently, the New York Times. We'd cancelled the Times delivery back in the dark days of dissertation deadlines, when both papers often sat wrapped and stacked in not-so-neat clear and blue plastic piles in our front hallway. Ahem. Now in the heady days of post-doctoral bliss (and indeed that "we" a few sentences back is not royal, it's plural) both papers come thumping onto the doorstep (generally) before seven a.m.

So I was reading the editorial page of the Post when I noticed that one of their numerous conservative columnists was pushing yet another ill-framed and unwieldy argument across a few columns like so much shit rolling through the gutter. Yes, it was Michael Gerson opining on James Watson, our addled Nobel laureate of DNA code-cracking fame. What a coincidence, I thought, since I'd written about Dr. Watson about a week previously. Gerson was Bush's chief speechwriter for about five years and is largely responsible for the ridiculous scare phrases that Bush used to justify his illegal invasion of Iraq back in 2003 (remember the "don't let the smoking gun be a mushroom cloud" bullshit? pure Gerson). He cut his teeth working for the rabidly anti-egalitarian "Heritage Foundation," and somehow the Post allows him to spew filth twice a week on their pages.

Gerson of course didn't see Watson's gaffe as a problem for the usual consumers of racist eugenicist claptrap, but rather a problem for "liberalism":

Watson is not typical of the scientific community when it comes to his extreme social application of genetics. But this controversy illustrates a temptation within science -- and a tension between some scientific views and liberalism.

The temptation is eugenics. Watson is correct that "we already accept" genetic screening and selective breeding when it comes to disabled children.

Oh. Well, maybe I can accept it so far, since you could read Gerson as tacitly acknowledging that it's only a problem for liberals because conservatives already agree with Watson's racist argument. But you only have to read a little further before you realize that Gerson has actually just set up a straw man argument that he labels "liberalism," and he doesn't even do a very good job of it:

This creates an inevitable tension within liberalism. The left in America positions itself as both the defender of egalitarianism and of unrestricted science. In the last presidential election, Sen. John Kerry pledged to "tear down every wall" that inhibited medical research. But what happens when certain scientific views lead to an erosion of the ideal of equality?

OK. Not hard to spot the first one, right? That little slip between "unrestricted science" and Kerry's attitude toward medical research. Gerson would probably have us believe that Kerry was looking to bring back Josef Mengele as head of NIH. The second one though is more important, and it's that moronic conflation, so common among conservatives, of "equality" and "identity." To ask for equal rights is not to assert that everyone is identical down to every last molecule of their bodies. Of course you could scientifically ascertain that some people are taller, some are shorter, some are stronger, some are weaker, but one doesn't go about handing out political rights based on such distinctions.

Sure, you could argue that Gerson, behind his straw man argument, is really scared that more genetic research will lead to attempts to "perfect" the race (like many of us, he's probably seen Gattaca), though such attempts have certainly been made before. Those attempts generally come from the conservative side of the table, you know, the ones who at one time or another are trying to keep immigrants from the "wrong places" out of the country because they'll "mongrelize" America, or who tried to keep anti-miscegenation laws on the books to maintain the "purity" of the white race.

However, Gerson tries to slip it by us one more time, arguing that because progressives trust in science so much yet believe in egalitarianism (which again he sees as somehow opposing one another), they might yield to the temptation of creating a master race:

Watson and many scientists assert a kind of reductionism -- a belief that human beings are the sum of their chemical processes and have no value beyond their achievements and attributes. But progressives, at their best, have a special concern for the different, the struggling and the weak. When it comes to eugenics, they face not only a tension but a choice -- and they should choose human equality over the pursuit of human perfection.

Ahhh, he shows some real concern over the plight of the progressive, which is nice, except he's the only one who ever asserted there was a danger of progressives advocating genetic manipulation to "weed out" the potential weaklings, etc. (seriously, Michael, it's pretty clunky to shove eugenics into progressives' laps, as if that's a big progressive talking point -- keeping in mind that I do concede that back in the teens and twenties many individuals aligned with the capital-P Progressives, like Margaret Sanger, were enamored of eugenics). Mainly, eugenics has been wielded by the conservative movement, which in the US argued that immigrants' feeble-mindedness and biological inferiority made them susceptible to Bolshevism. I kid you not -- a political outlook linked to one's genetics.

Meanwhile, in today's world, most of us realize that it's the progressives who tirelessly work to protect the rights of the downtrodden (physically, economically, or otherwise) already in this country. It takes real chutzpah for a neocon like Gerson to tell progressives they should side with "human equality"; after all, it's Gerson whose rhetoric has been essential in dehumanizing Arabs (especially Iraqis) and in ensuring the linguistic success of Bush's hubristic war of political eugenics, his attempt to install democracies from the barrels of guns.

I have a hard time taking seriously the moral advice of a man so closely linked to the most corrupt, blood-stained, morally bankrupt administration in US history, with the possible exceptions of Richard Nixon's (and maybe Andrew Jackson's).

11 March 2007

Time and distance are out of place here.

It's an hour later than my body thinks it is. At least my computer knows the real time. Daylight Saving Time reminds us all that time is a relative matter, or at least our methods of keeping it regular are. Congress could just as easily have passed legislation sending us two hours forward instead of one. Or even better, to appease business, we could still require the eight-hour workday but decide there were only twelve hours in a day. Everything would have to be recalibrated! Old watches would be obsolete. That would generate all sorts of work for people: replacing timepieces in public places, engineering new watch faces that reflected the changed time standards, and of course employing the programmers who would have to teach every little and large thing with a timekeeping chip in it that we were now on a new schedule.

Happy DST all!

08 March 2007

When I'm Virtual, I'm Ten Feet Tall

The bbc.co.uk has an interesting story about virtual reality today. They're talking about "massively multiplayer online" games, or MMOs, and a few game developers were making some predictions:
Daniel James, chief executive of Three Rings, said: "You are about to see, and this is happening already in Asia, many different kinds of games that are massively multiplayer and less based on role-playing games."
He added: "This medium is going to destroy TV - and it's going to happen in short term."

Destroy TV? Wow. A long time ago, like maybe a year or so ago, I wrote about virtual reality and how it's rapidly challenging our "reality." After all, why be the limited you that you are in real life when you can be the suave cosmopolitan virtual you? Get yourself an avatar that scores well on beauty and sophistication, and new worlds open up. Anyway, that rumination was based upon my brother-in-law's telling me about his friend who plays World of Warcraft and has dropped engagements with many of his real-life friends so he can attend "parties" with his Warcraft friends in the virtual world.

If you haven't read your Baudrillard, now is the time to do so. Jean Baudrillard died this week. I don't read French, but I wish I could read this. Baudrillard predicted the collapse of the "real" and the "virtual," although he argued forcefully that even what we call the "real" world has been virtualized by such forces as the mass media.

But getting back to the destroy TV bit: television, that great gaping hole in nearly everyone's living room, offers us one-way versions of virtual reality: we can get stuck inside someone else's plotlines. MMOs offer us two-way versions of virtual reality: we can create and interact within the rather vague limitations of the software. It's like those old-fashioned "Choose Your Own Adventure" books from a pre-internet youth...but now it's almost endlessly malleable. You create your own reality when you want to. Simply log out when you've had enough.

Perhaps most interesting for the bloggers and the myspacers and friendsters and facebookers out there is this little tidbit:
"It will be really hard to tell what is and what isn't an MMO. There will be a lot of experiments in convergence between social networking and MMOs.
"Five years from now a social networking site without a 3D universe will look like a dinosaur."

So we will go from the flat experience of a webpage (think of the myspace experience now) to a more surrounding and "realistic" world of social networking, where your current myspace page becomes a house people can visit, can look around, can pick up objects in, maybe make themselves a spot of tea, etc. Conversations won't be the bloops and blips of instant messenger, but the spoken dialogue between two avatars.

Honey, it's cold outside, but the sun's always shining in my computerworld...

31 January 2007

Scholarship, Past and Present.

Recently I've been thinking about the changes that have come over academic society in the last two decades. When I was sent off to college in the lazy late summer of 1987, precious among my luggage was my handy electric typewriter with a dual cartridge that included the correction tape right with the normal tape. Sometime during those magical four years in which I supposedly learned what I needed to function in some career out in the "real world," I learned to use a Macintosh (like that was hard) and Microsoft Word and to deal with the rigid demands of a dot matrix printer. The library still had its card catalog, but a new electronic system had been in place for a short time, which you could access from the library lobby. I spent a good deal of time in the periodicals room thumbing through bound copies of Studies in American Fiction, American Literature, and the Faulkner Journal.

These days, you don't have to go to the library to read most journals. JSTOR and other services have put that content online in PDF format, and you can browse the catalog and download articles in the comfort of your own home. Research has never been easier. At the same time, the status of scholarship has declined to the point that students believe Wikipedia entries are as authoritative as PMLA. They aren't.

However, there's something to be said for a library as a refuge. I certainly feel more scholarly when I'm sitting in the main reading room of the Library of Congress or even in the stacks of a university library.

To be surrounded by books is a consummation devoutly to be wished.