Monday, December 31, 2007

Resolutions

I have mixed feelings about New Year's resolutions. Sometimes I think they're just hype. Other times, I think they are useful ways to consider where you want your life to be going. It's just the getting there that's hard.

For example, I probably made a New Year's resolution to quit smoking a dozen times during my life as a smoker. It never worked. That is, it never worked until the year I made the resolution to quit smoking not on January 1st, but sometime that year. It was April by the time I did it, but it worked.

I've also had some other rather remarkable successes with New Year's resolutions.


On the other hand, of the resolutions I made last year, I only accomplished one of the four. The resolutions were (in no particular order):
1. visit a new restaurant or coffee shop every month
2. finish the novel I've begun
3. run a half marathon
4. read or think about the dissertation at least 15 minutes every day.

I only really accomplished number 1, and it wasn't a terribly hard thing to do - I only made it a resolution because I wanted to explore the new places in the city.

Numbers 2 and 3 didn't get completed at all. I had to shelve the half marathon because of injury, which I suppose can't be helped, and I was making good progress until I was forced to stop. I may put it on the list again. I'm a little leery of it because of the injury, so we'll see. But the novel? It's not like it actually requires a lot of work. It's 80% complete. It's just that last year I didn't write a single additional page. Why is that? I know I tell myself it's because I've been busy - and I have - but there's gotta be more to it than that because I could've made the time. I'll have to keep pondering that one.

Only number 4 came really close to being accomplished, especially if I cut myself slack and make it 15 minutes every weekday. Then I pretty much accomplished that every day except for perhaps August when we were moving. I've made really good progress on the dissertation, and I really enjoy working on it, so the goal behind the resolution - to pick up my progress on the dissertation - was definitely met.

So now I'm left with wondering if there's anything I want to put on a list for this year. Right now, I'm not certain I want to make any resolutions. Can you still call them New Year's resolutions if it takes you till February to articulate them?

Friday, December 28, 2007

Is elearning really better?

For someone who works in elearning, I realize this might be a career-threatening question to ask, but it's not that I'm questioning the value of technology in learning, but that I'm struggling with the social and cultural values around technology and learning.

One of my tasks this week is to create the schedules for two hybrid courses I'll be teaching next week. The syllabi are ready, but I'm struggling with the schedules still. I know my learning objectives, and I know what results I want.

What I'm struggling with is what combinations of online and face to face (f2f) activities will accomplish those objectives and produce the desired results.

Part of the problem is just scheduling. One of the hybrid courses meets only once a week, which means we've got 4 hours f2f then a stretch of a week of nothing but online. The students will mostly be first term students as well, which will mean they'll need LOTS of hand holding to adjust to this kind of class. But even for the course that meets twice a week f2f, the schedule is still less than ideal.

What would be ideal?

A schedule where we meet for about 6 hours a week f2f and 2 hours online for the first half of the course, then 3 hours a week f2f and 5 online for the second half would be ideal. But I've yet to hear of a post-secondary institution with such flexible scheduling.

If I had a schedule like that, we could concentrate on learning the basics, practicing them in the lab, learning to collaborate within groups and as a whole class, then I could send them off to practice what they've learnt mostly on their own. That would fit the objectives of the course nicely, and it also follows the progression that writing naturally undergoes as well. Brainstorming is a mostly collaborative affair (even if the "collaboration" exists primarily in preliminary research, not discussion), and writing a mostly solitary one. So lots of f2f at first, followed by less direct contact and more remote, online contact as support for the writing would be ideal.

[At first I was going to write that the schedule would be unique to writing courses, but then I realized that might not be the case. Perhaps it would also be ideally suited for the teaching of science, or communication studies, or even engineering. I was going to say that teaching writing is more holistic and harder to divide into discrete units, but then thinking of my own science degree, I sometimes wonder if I might have had better insight into the discipline as a whole if all my classes weren't divided into distinct units, with testing and evaluation at the end of each of them. It made each topic seem so discrete from the others. Perhaps the scheduling of those courses could also do with some flexibility!]

Since I won't get my wish for needs-based scheduling, and will have to meet regularly online and f2f, I'll have to adapt the course. But that desire got me thinking about elearning in general.

There are a few things that I think I can state with relative certainty.

1. elearning has been around long enough that most educators are familiar with the concept. Early adopters still continue to push the envelope, but most post-secondary institutions and instructors (and a good number of primary and secondary school boards) are using at least some technology to "enhance" learning.

2. most educators did not experience elearning in their own education, and if they did, it was in its very early stages (I realize this is not universal, but note that I said "most"), so it is sometimes a challenge for them to understand the learner's experience.

3. learners (particularly post-secondary learners) are generally familiar with at least some of the elearning technologies they will encounter in the classroom, so as a designer of elearning and blended/hybrid courses, you can count on most of your students/users being familiar with some of the basics.

4. elearning has changed so rapidly that there is little long-term research on the effectiveness of various technological solutions to educational problems, although most educators and designers would seem to feel that at least some technology has made the job of teaching (or the administration behind it) easier.

Perhaps you might challenge some of these assumptions, but they are based on my observations both in post-secondary classroom settings, and in the corporate elearning environment.

But the title of this post asked if elearning is really better because I have some doubts about it.

To be honest, my doubt does not arise from some kind of anxiety about technology itself. I think as a tool, technology provides us with a means to produce effective learning. What I do have reservations about is that in adopting technological methods to try to deliver learning, we are adapting our teaching to the technology rather than using the technology to aid our teaching.

I'll try to explain by describing a few experiences over the last few weeks that have got me wondering about it all.

A few weeks ago, I assigned an essay to my students. It's a challenging essay, but I've taught it before, and when students are prepared for the challenge, encouraged to undertake it, and then supported in the classroom through their questions about it, they generally perform fairly well. Not this time. This time, even with extra support and time, the students disengaged. They thought the essay too long. They thought it "irrelevant" to their lives. They didn't like that they had to look up some of the words they didn't understand (or they didn't even bother and just ignored their ignorance). They complained en masse about the length of the assignment. As I'm working on the schedule for this term, this complaint is on my mind and I'm trying to decide whether to use the essay again.

A week after this, two of my students bragged to me that they hadn't read a book since high school. Another one told me that even in high school he hadn't read a book - he would just read the back cover when required to write about it!

Also a few weeks ago, I watched this video over at Phendrana Drifts, which began to worry me. The students in the video seem to justify their non-reading of course material by pointing to how much time they spend on Facebook or their cell phones. The argument seems to be equating reading a textbook with reading a social networking page. Call me old fashioned, but I think the cognitive engagement involved in reading a textbook is qualitatively different than what is involved in reading a Facebook funwall.

Then a few days ago during a family discussion, my children told me that their friends think they use big words. They also said this is because I use big words at home, and that being surrounded by a large vocabulary it became natural for them to adopt the same.

I object. I don't use big words. I don't use a big word just to sound more intelligent than I am - but I will use a less than common word if I think that it more accurately represents what I want to say. I am a bit particular in that way.

But the discussion got me thinking about vocabulary and how one acquires it. I never spent time reading a dictionary to learn the words I know. Sure, I've looked up a few words in my time, but mostly I acquired vocabulary from reading and discerning through context what a word meant. Usually multiple contexts. I surrounded myself with language, and it became second nature to me to use the language that I've read.

A light dawned then. Although I've known for a long time that good writers are also readers, I'd always had a bit of a suspicion that at least some of my students wrote so horribly (and unimaginatively!) because they just didn't want to try. It really hit home during that discussion that they just don't have big enough vocabularies to actually write accurately. They just don't know enough different words to be able to express themselves, or describe things, accurately.

So maybe you can see what I'm thinking now. For myself, I know that the sustained effort of reading books, including long ones, and reading many of them, has trained me how to read long pieces of writing and has given me a large enough vocabulary to be able to understand most of what I read. If my students have read only a handful of books in their lives, and their only current reading consists of magazines and webpages, then when will they have had the opportunity to learn how to maintain interest in a longer piece of writing or to even understand enough of the words in it to know what they all mean?

The problem with elearning is that it encourages this kind of superficial engagement with the material in the course. This isn't to say that it isn't possible to create elearning opportunities that go beyond just the superficial, just that the association between online and superficial is strong.

For learners/users who are accustomed to short, pithy online writing like they find in email, instant messaging and social networking sites, this kind of superficial writing becomes associated with the medium in which they view it. So they come to expect everything online will be a sound bite, a short, focused piece of writing, or that it won't contain any big words. When confronted with writing online that requires a sustained, thought-provoking engagement with the material, how are they then to respond? There is a good chance they will respond by rejecting the material, because it involves a level of concentration and attention they are not accustomed to giving to online content.

All this makes me think that there is a real value to the old fashioned book. Books require discipline to sit down for extended periods of time, focusing your attention on only one thing. I realize that many of my students can focus their attention on only one thing for a long period of time - I do have plenty of gamers in my classroom. But there's a difference in cognitive engagement with a book, which requires active imagination of the action taking place, and the reaction to a video game, isn't there? I realize many games require active participation - more so than say television watching - but they still are passive at a certain level, aren't they? Even if the cognitive activity of gaming is equal to that of the book, things like vocabulary development are still absent.

But even if we don't bring back the book as the standard text for education, as an educator and elearning designer, I think it's important to work to create that kind of discipline in students. In corporate elearning, the sound bite works, but that's because elearning is a secondary layer of education that always pre-supposes a more formal layer of education in the form of a degree, or training. eLearning in the corporate world is usually for upgrading or skills maintenance. But in that first degree, the one post-secondary students (and even secondary students) are pursuing, one of the skills they will need to develop is the ability to stick with a project for an extended period of time.

The challenge, at least for me, will be to develop the depth of understanding that students need while at the same time using the technological tools at my disposal to do it. Part of the challenge will be re-training students to understand that just because it's online doesn't mean that it won't be hard and they won't have to work at it. Shifting students' mindset from online = superficial to online = just another medium is a challenge I think those of us who work in elearning have to manage in order for the technology to really make a difference in how we teach. The difficulty of meeting this challenge is part of the reason why I question a wholesale embrace of elearning in the classroom. Without understanding the nature of the medium, it would be far too easy to dumb things down to the level students come to expect from online instead of challenging them to reach the levels of complexity the content requires.

I've run across an analogy a couple of times that is usually used to demonstrate the need for technology in education. The analogy compares surgery and education. A surgeon of 100 years ago brought into a modern surgery would not be able to perform surgery because so much has changed, while a teacher from 100 years ago would be able to teach in a modern classroom (aside from lacking knowledge of what's happened in the last century). In other words, the practice of surgery has radically changed, but the practice of teaching has not. This story is usually used to demonstrate how teaching needs to get on the technology bandwagon and upgrade itself.

But I have to wonder. Is the practice of teaching mostly the same as it was 100 years ago because it works? Is this a case of "if it's not broke, don't fix it"? Surgical survival rates are higher now, which demonstrates the success of the changes in surgical practice. But is teaching the same because we've already figured out a good way to do it? After all, by most measures we're smarter, more literate, and better educated today than we were 100 years ago. We must be doing something right, mustn't we? Is elearning just a case of unnecessary messing with something we already do well?

Friday, December 21, 2007

Just an average Joe... or Stephen

Today after work I had one last stop to make aside from groceries. I stopped into my local Chapters bookstore, which was packed with people, though the helpful salesclerk who directed me to what I was looking for told me it was actually quiet!

As I approached the long line at the cash register, I noticed a security man next to the line. He wasn't wearing a uniform, but those distinctive ear pieces are a dead giveaway. I found myself wondering what had recently happened in the store that they'd decided to hire security. I'd been a regular at this store for many years before we'd moved away, and had never seen a security guard there before. I assumed it had something to do with the holiday rush.

As I stood in line, I saw a gentleman leaving the tellers, far ahead of where I was, and thought, "he looks familiar" and then a second later, "he looks like Stephen Harper." Then it dawned on me. It was Stephen Harper.

Two thoughts flashed through my mind in rapid succession. The first was curiosity. What's in his bag? What kind of books does the Prime Minister read? (Then I realized peeking in his bag wouldn't do any good because it was probably a gift.)

The second was a sense of approval. Some of it was for the man, but only a bit, since I don't agree with a lot of his policies; mostly it was a kind of sense of inclusiveness, a bit of homegrown pride, and a touch of admiration. I live in a country where the leader of my nation does his own Christmas shopping! I think that's pretty cool. Sure, he was surrounded by three men in dark coats and another in a plain suit who I assume was some assistant (I'd finally noticed the other security men when I'd realized the Prime Minister was in the checkout line with me). But at the same time, he was making his own purchase. And when a woman stopped him to ask if she could introduce her daughter to him, he stopped and chatted a moment before leaving the store.

Just another one of us poor, benighted souls trying to squeeze in a little last minute shopping... there's a kind of humility there that I can't help but admire. But then I realized, while he might be fighting the last minute shoppers, at least he has an entourage that can help clear the way! Not like the rest of us...

Wednesday, December 19, 2007

Feathered Fan

Walter must be fairly impressed with the Flames' 6 road-trip win record...

Tuesday, December 18, 2007

Posts begun and not completed

By the looks of this blog, there's been no activity. But that's not true. I've begun many posts. They just don't get completed. Sometimes they haven't even gone beyond the boundaries of my brain! They've been started; just not completed.

It's not that I don't have anything to say - I certainly do! It's just that I've got so many other demands on my time right now I can't seem to get any posts finished. Perhaps all those drafts will get completed at the beginning of the year....

I will say I was disappointed by the end of I Am Legend. They made it a Hollywood ending, and their explanation of the "legend" was lame, tacked on as if it was an after-thought. I went back and flipped through the ending in the book because I thought perhaps I was being unrealistic in thinking they could've filmed that ending. But no. I can clearly see how they could've translated the book's ending - far superior - to the screen. Someone just chose not to. Disappointing.

Last week I hated my students. This week I love them. It might have something to do with having a lot of marking to do last week. Too much marking. But if you've ever taught, you know what I'm talking about... and even if you haven't, you know what parts of your job you hate. Yep, that's where I was last week.

Holiday preparations are mostly complete. There's the grocery store still, and some gift wrapping yet to do as well as one more party, but I should not have to enter another retail outlet till after the holiday season, so I'm breathing a sigh of relief (partly because I won't have to listen to that horrible Christmas music anymore).

January and February are going to be very busy months for me, so I'm scrambling right now to finish academic and teaching activities as well as the online proposals that need to be gotten out of the way before the crunch hits during those months, but I'm hoping to finish those up this week as well and actually relax. (Yes, I know it won't actually happen, but the fantasy is keeping me going right now...)

The great thing about December is that you can always hope that the new year will be different than this one!

Wednesday, December 12, 2007

Straight No Chaser - 12 Days

If the department stores could just play this kind of music instead of the stuff they're playing, I'd be a much happier shopper!

Tuesday, December 11, 2007

Wow! That IS a long time...

My head hasn't been in the dissertation lately - which I'm missing terribly. I keep looking at the pile of dissertation books, wishing I could pick up the next one. I suppose I should just be glad that I'm not at the part of the process where I dread working on it.

My online work has been really busy, as has teaching, but that's not the only reason I'm not dissertating these days. What I am working on is that project I mentioned a while ago - the one where a colleague contacted me out of the blue to contribute to her book project.

I've set aside the next two weeks to work on the revisions. I figured two weeks (which really with everything else translates to about 3 days) would be enough when I first heard of it. But I just looked at the requirements again, and because I actually have to pretty much double what I've got right now, I've got lots of room to talk. That's good. I've got lots to talk about.

I'm also thinking there's another really good reason I should give this project a bit more attention. After all, I've been working on it for almost 10 years already. Such commitment deserves more than a couple weeks revisiting, don't you think? Hard to believe it's been that long, but it has. I first wrote on the Dark Tower series when there were only 4 books in the series. In fact, when I wrote the proposal for my undergrad honors thesis, the fourth volume was still in production and I read it as I was doing the research for the project.

I enjoyed it so much, that once the last three books were produced, I decided I wanted to revisit the work I'd done back in undergrad, so I submitted an abstract to a conference and dug out the thesis and started working. I certainly needed to do a lot of work on the thesis - after all, I'd written it many years before as a novice scholar. But to my surprise as well, there were some really good bits in it. Surprisingly good. I found myself marvelling that I'd actually written it in some places. I didn't remember being that insightful...!

And now I'm returning to it again several years later. It's a really interesting project to look back at my work on this particular piece of scholarship over the last decade. It kind of nicely chronicles my development as a scholar. The thing that surprises me the most about looking back at my undergrad writing is that there were some good ideas there - but they were badly expressed. Or more often, only partially expressed. I wrote a lot of observations, but failed in a lot of cases to make the connections between them. I didn't move beyond the surface. I didn't pose difficult questions of the text. Probably because I was still in the grip of author-awe I wrote about a few weeks ago. (Being reminded of my own struggles, even as an honors student, has also helped me understand the fumbling rhetoric of my students a bit better.)

But the work I did then was a good start. It has created a nice groundwork for me to take now and shape into a more nuanced argument. Or at least that's the hope! But it's funny how my grasp of the implications of some textual moments that I just observed earlier is much firmer now than it was then.

The neatest part of all this is that it gives me an idea of what it might be like to be a professor, working as a scholar, knowing all the foundational texts, or having heard all the arguments before, having that huge reservoir of knowledge and experience to draw upon when making an argument.

In the dissertation, as much as I love what I'm doing, I feel like a novice. I feel like there's so much that I DON'T know in comparison to what I DO know that it's difficult to adopt the authoritative voice that I need in order to make the very arguments I need to make. In this project, I've worked over the material so many times now that the writing is all about developing the best argument possible and then choosing an effective means of expressing it - I find I'm not worrying so much about authority.

It's a really nice feeling. I'm looking forward to attaining that confidence in more than just this one text. So this must be what it feels like to really be an expert, not just write as if you are one!

Monday, December 10, 2007

Nothing

It's not that there's nothing happening in my life. There's a good bit going on. But most of it is work - which is boring. The parts that aren't boring are mostly just making me grouchy, and I've never mastered the art of the rant. When I rant, it just sounds grouchy. The world doesn't need more of that.

So until I find something pleasant or intelligent to post on, I'm afraid all you're getting is silence. I hope it doesn't last long.

Wednesday, December 05, 2007

'Tis the season...

... for funding applications. Apparently many funders out there leave their application season till the end of the year. Part of my online learning job is to file funding applications, proposals, precis, whatever you want to call them, for the projects we undertake.

I'm good at it. I've usually got a pretty good idea of what a funder is looking for and can translate their "criteria" into something that works with the project we're trying to do. So far the vast majority of my proposals have been accepted, which is good because that means I have work. It also helps that our marketing guy finds some really good funding sources for us to make proposals to.

But proposal writing is a tedious, long process, often requiring a great deal of research in order to mesh the purpose of our e-learning program with the criteria the funder uses to decide who they want to fund.

So on top of the holiday season, which is already making my life busy*, I've now got funding season to contend with - I'm feeling tired already....

*strange thing is, I have never really felt a holiday crunch before - it's always seemed manageable. I'm not taking on any more than I would any other year, perhaps even a bit less, and yet it seems to be more of a hassle this year and I can't figure out why. It's strange. It's also making me feel even more Grinch-like than I usually am!

Sunday, December 02, 2007

Time, money, brains and children

There's a lot floating through my head these days, making it difficult sometimes to figure out how to make sense of it all. A couple of threads came together for me as I was reading an article on public perceptions of genetic technologies. From "What Do You Think About Genetic Medicine?", a paragraph stood out. It's the statement of a woman participating in a focus group of parents of children with genetic diseases such as cystic fibrosis:
You have an imperfect child in this society and its a real handicap, its the money, its your time. The demands on a woman’s time [are] profound. If you have a handicapped child, then you’re looking at the heartbreak if she loses that child [and] all the investment. And these mums develop a very excellent and moving characteristic that makes humans human you know. They’ve got their feet on the ground and they can see the way things really are. That’s the mothers who have a child that’s disabled within the family. It does develop part of you that if you’ve never had to look at you’ve just not grown up . . . perhaps it keeps us being human. I don’t know how boring we’d all be and how dangerous we’d be if we were all a mob of intellectuals.
It's the last sentence that really got my attention. I don’t know how boring we’d all be and how dangerous we’d be if we were all a mob of intellectuals. The juxtaposition of boring and dangerous seems to border on oxymoron - danger tends to be thrilling, does it not? But that's not what my first response was. My first response was to wonder about this woman's need to contrast human-ness with intellectualism. Mothers who care for disabled children are kept "human" while intellectuals are not. Aside from the gendered bias of her statement, which of course brings up the question of where the fathers are, and whether they can be kept more "human" by caring for their disabled children as well, her statement is interesting.

As a mother and an intellectual, I feel like I have the authority to respond to this kind of sentiment. This kind of anti-intellectual rhetoric is not unusual, and I've certainly heard my share of it. In its most common expression, it opposes intellectualism to common sense, or life skills of some sort. The intellectual is derided for an inability to read a map, use a hammer, cook a meal, or other "basic" human skills.

Now I don't wish to take issue with this generalization as a generalization. I agree that there are plenty of intellectuals - myself included - who fail miserably at tasks that many people find simple. Sometimes it's because we over-complicate things, but more often I think it just comes from a lack of experience. In some cases, intellectuals (and others lacking common skills) are like children who just haven't had enough experience with something to develop a real competence at it. Even when intellectuals engage in routine or everyday activities, I suspect there are times when their attention is elsewhere, and thus they never pay close enough attention to those tasks to really get them right. I know that there are many times I'm not paying real close attention to mundane activities, and sometimes I find myself at a loss when I suddenly realize that I haven't been paying attention. You can be inexperienced at something because you haven't undertaken that task before, or you can be inexperienced at it because you've never taken the time or energy to really pay attention to the task and learn it well. Of course you can also just be plain old incompetent, but that quality seems not to be limited to any single socio-economic factor.

I will give the author of the above quote credit for identifying a difference between the mother of a disabled child and an intellectual, but she's misplaced the nature of that difference. She identifies the difference as levels of "human-ness" but I think the primary difference between the two lies more simply in activity.

The activities of the mother of a disabled child are largely physical - caring for the child, working to secure funds, resources, and other necessary items in order to raise that child. In kind, this isn't much different from what parents of able-bodied children do. In quantity however, the workload of a parent of a disabled child is far greater. Everything takes more time and energy (and money). Although I don't have a disabled child myself, I know from the stories told to me by parents of disabled children that they have to expend far more effort to provide for their disabled child's needs than I do for any of mine.

The activities of an intellectual are largely mental. In fact, it's one of the things I sometimes find the hardest about an intellectual life - sitting still for the kinds of time that solid intellectual inquiry requires. (It's also what makes the life of the intellectual and the life of the parent clash so often)

I think it would be more accurate to oppose physical and mental activity as the hallmarks of the mother of a disabled child and the intellectual than to assign humanity to the mother and inhumanity (that's what I read the dangerousness to be a sign of) to the intellectual.

But as human beings, are we not both physical organisms and thinking creatures?

I realize the humanist characterization of 'man as thinking creature' is outdated, but that's mostly because it defined 'man' as man, not as a dual-gendered humanity. If you let women into the thinking creature category, then the definition of humans as thinking creatures does seem fairly inclusive as well as definitive, marking off human from animal in a clear and reasonable way.

I'm starting to get a little lost in my own argument to some extent at this point, but I'm also trying to puzzle out this idea of the "dangerous" intellectual because I think it's critical to the kinds of public discourses we need to be having in our society with the emergence of so many potent biotechnologies - biotechnologies that the woman quoted above was responding to. Is the "dangerous" intellectual in this case based on the "mad scientist" stereotype of a man (not a woman) in a lab, trying to create life in a test tube? Is the speaker in the quote above actually making an opposition between women (who take care of disabled children) and men (who build dangerous technologies)? Or is it more simply as I first thought, an opposition between "real" people, and intellectuals?

I don't know that I have the answer. Unfortunately, the article I was reading gave no additional information beyond the quote I've reproduced above. I do know from other research that there is a real opposition between the general public's concerns over biotechnology and biotechnologists' embrace of it. Certainly for researchers in biotechnology, there is the tendency, driven by curiosity (a strong human characteristic), to see how far the research can go without necessarily stepping back to take a look at the big picture. The research also shows that there is no correlation between genetic knowledge and concern about biotechnologies - people who know their genetics are just as concerned as those who don't - so it is not actually a case of those in the know being set against those who don't know.

How pervasive are these assumptions - the stereotype of the mad scientist - or the equally damaging stereotype of the martyr-like mother who expends all her energy on her child? And more importantly, how do they get in the way of our understanding of each other? I don't know. As a mother and an intellectual, I can understand both, but identify with neither. Perhaps I'm just an oddball that way.

Wednesday, November 28, 2007

Not a Hollywood ending...

Went to see The Mist over the weekend, which does NOT have a Hollywood ending. Or if that's a Hollywood ending, the studios are certainly changing, because I'd resigned myself by this point to lame, everything-turns-out-alright-in-the-end fare from them. Everything absolutely does not turn out well in the end, which was quite realistic, and it really left you wondering what things would be like if the movie continued.

I've actually had the pleasure of watching another big-ticket movie over the last couple of weeks that turned out to be far more thought-provoking and narratively complex than the trailers would've had me believe. But then again, Beowulf's script was co-written by Neil Gaiman, so I should've known it would be more than the big-muscle smash-em-up action film the trailers tried to make it out to be. I haven't yet decided whether I like this new Beowulf better than Beowulf and Grendel. I think it depends on what kind of mood I'm in - whether I want an extrapolated story, or one closer to the original. Both are quality films, but for different reasons.

My experience with both The Mist and Beowulf has me cautiously optimistic that Hollywood might not have ruined the ending of I Am Legend. Having seen trailers for it almost a year ago, and remembering a colleague at a conference raving about how good the book was, I read it several months ago. And I immediately despaired about the movie, because the book's ending is anything but Hollywood.

So now I am very curious to see if the producers of I Am Legend have made it palatable for an audience who wants everything to turn out alright in the end, or if they have remained faithful to the book's darker and more philosophical ending. I won't give anything away, except to say, if you've seen the trailer, have you ever wondered why it's called I Am Legend?

Tuesday, November 27, 2007

Dissertating

Strange-sounding word, "dissertating"... it simultaneously sounds ominous and a little creepy, but its verbal proximity to "dessertating" (which isn't a word, but if it were, would be a delicious word) makes it a strange kind of word.

So you might notice over on the left column that I've removed the banner declaring me a 2006 NaNoWriMo winner. I figured since the 2007 NaNoWriMo season has started, it would soon become a dated banner... but my blog certainly misses the color it added.

Instead, there's a little dissertation meter. It's recording the number of words I've typed in the dissertation compared to the number of words I should have by the end. It's not very accurate though. For one, the number of words I have written is fairly large but the number of GOOD words I've written is much smaller. Right now I'm spilling stuff out on the page and it needs to be organized and made more concise, so it's not really an accurate idea of where I am, just how much I've spilled.

I also suspect that my dissertation will be longer than the 70,000 words I've allocated. That number is based on the ideal length, at least according to Patrick Dunleavy in Authoring a PhD. The book indicates that each chapter should average 10,000 words, and I've got 7 chapters mapped out, hence the 70,000. But I think some of my chapters will exceed that ideal just because of the things I need to accomplish in them. Without footnotes a chapter might be 10,000, but with them, they'll be considerably longer.

But the book is very useful. I considered buying Writing Your Dissertation in Fifteen Minutes a Day, but I suspect that you can't really write a dissertation in 15 minutes a day, and the title is just a clever way of getting desperate people to buy the book. Maybe I'm wrong, but I was just a bit suspicious. And besides, the Dunleavy book came well recommended.

I bought the Dunleavy book because I felt so overwhelmed at the start of the dissertation that I didn't know where to actually start! The book has been valuable, giving me some good advice on how to get started. But when I started reading one of the later chapters, I realized that I don't really need this book, at least not to write the dissertation.

The dissertation is indeed a monstrously huge beast, but when you come right down to it, it's just like any other piece of writing. You need to do all the same things that you do for any shorter piece - even blog posts - that is, you need to introduce your subject, you need to organize the content into a logical sequence, and you need to finish it off in some way.

Sure, in a dissertation, you have to do this very well. And you have to do it over many, many pages. But really, the process is just the same. And no book will help you get out of the necessity of just doing it. So the beginning of the book was useful, but I might not turn to it again until I'm finishing up, to see if there are any final words of wisdom I might glean from it. All I really need right now is to get the dissertation written!

One book I have also been reading, and am continuing to read is William Germano's From Dissertation to Book. It might seem in reading this book that I'm putting the cart before the horse, but it's actually quite valuable while writing the dissertation.

First off, I do intend to turn the dissertation into a book (provided of course that I end up in a job where I'm not teaching a 4-4 or even 5-4 load and haven't the time to turn around, let alone revise!). So knowing now what I would have to do to accomplish that might help me craft a dissertation that can more easily be turned into a book than if I didn't know what's necessary. Or at least that's the theory. So far, from what I've learnt in reading the book, I think it will turn out to have been a good idea. I also suspect that a dissertation written with the principles of a book in mind will be more readable and will avoid many of the pitfalls that dissertations fall into - but that's just a suspicion, I could be wrong.

It might also be appropriate, since I'm on the subject, to mention a book I read several years ago, near the beginning of my doctoral degree: Gregory Semenza's Graduate Study for the Twenty-First Century. Although I read it a long time ago, I should've read it even earlier. It would be an excellent book to read before you enter graduate study, or at least to start reading before you begin a doctoral degree.

The book is valuable for what it tells you about what you're about to get into (or have gotten into). It discusses the nature of graduate study in the humanities, then moves through all the activities of a grad student - seminar papers, conferences, dissertation - and ends with a chapter on the job market. I found it very comforting to read as I went through each of these things (well, except for the job market of course), and it didn't sugar coat anything. If you're thinking at any point of entering graduate study, it would be a very good read.

Of course I question whether I needed to read these books or not. I certainly have a better idea of what I'm involved in and what's going on around me, but I suspect I might've survived quite well without them. But sometimes survival isn't enough, and I've found my reading about graduate study in some cases just as valuable as my reading for graduate study.

Saturday, November 24, 2007

Facebook vs. f2f

... so I've got a Facebook account. Most of my friends on Facebook are family members, some from far away. There are also some friends' pages that I've connected to, though people of my generation tend to use this social networking software much differently (and much less frequently) than its original target demographic. I've found a couple of friends from high school as well, which is a real trip down memory lane, that's for sure!

But now I've got a message in my in-box saying that one of my students has added me to his friends list. We're only in the third week of our semester, so I don't know this student well, but we've talked about assignments and such, and he seems a harmless and fairly nice person.

Now I have to decide how open I want this application to become. I never made my settings private, because I felt like doing so would defeat part of the purpose of having a page. But I also don't know that I'm comfortable with this latest request. I would have little problem adding this particular student, if it was just him. But it's the web that it will connect me to that I'm more concerned about.

At the same time, I'm also not sure I want my students to know much about my personal life. I've never felt comfortable as a teacher sharing personal details - professional ones, yes - personal, not so much. I know some teachers of adults who are comfortable doing that - I've just never found myself comfortable with sharing personal information. I'd rather keep our relationship friendly, but bound by the classroom, or at least the institution we both are affiliated with.

So now I have to decide. It doesn't really help that the institution I'm at fosters a kind of informality between instructors and students. I'm barely getting comfortable with students calling me by my first name. To share an online presence as well just feels like a boundary I'm not comfortable breaching.

Wednesday, November 21, 2007

For the bibliophiles in the audience

I first ran across Library Thing a few years ago during a book club discussion. I checked it out, entered my 200 free books and then left it. But this summer I went back to it and got the lifetime membership.

Then I set about the task of entering all my books into the site. The search function is really flexible, and I had few troubles finding the bibliographic information for the books I have on my shelf. For Canadian versions of books, I set the search to access Canadian databases. For American versions, I could search American databases. For obscure academic books, I could set the Library of Congress or a university library as my database. Overall, there were a few tricky books to find, but they were usually small press publications, and of course any independent or desktop publications weren't listed (though I don't have many of those). You can also upload from or download to an Excel file if you already have a list of books.

The other interesting thing is that it tells you how many other members have the same book you do. It's been interesting seeing which books are the most popular and which ones only a handful of other people own.

Why did I do this?

Well, first off, because I like organizing things. I'm one of those people whose CDs are alphabetized, though my spice rack isn't - I have a life after all.

But my biggest motivation came from changing our insurance coverage in preparation for owning a home. As part of that preparation, I created an inventory of the household items, and came swiftly to two conclusions: a) the value of all my stuff greatly exceeded my current coverage (which we fixed), and b) no one is ever going to believe that I have that many books. I began to envision some insurance agent telling me after a fire "Ma'am, I don't believe that you had $20,000 worth of books in that house". Yes, that's the approximate value. No, I don't think it's exaggerated. I'd like to think it's less, but looking at it honestly, that's probably the value of it.

So if at any point you're looking for a way to catalogue your books, Library Thing's a great place to start. You can post 200 books for free. If you want to post more, you can buy a yearly, or a lifetime membership, both of which are very reasonably priced. So now, if God forbid, my library should ever be damaged, I can prove that yes indeed, it really was that big, and here's the list of what was in it...

Tuesday, November 20, 2007

Mom and Dad approve

I received a student submission of an assignment with the note: "I read it to my mom and dad and they thought it was great."



My parents think I'm the greatest too... can I just read my dissertation to them instead of defending it?

Friday, November 16, 2007

Using other people's words

A few days ago, I wrote about authority in writing. Since it was a long blog post that you might not have suffered through, I'll reproduce a couple of paragraphs:
There's something that happens as you progress through academia that changes your sense of yourself as a writer. I know I took to heart the lesson my English teachers tried to impress upon me that what was important in academic writing was not my opinion, but a careful analysis of other people's opinions. I can distinctly remember several undergraduate papers where the professor told me to put a lid on my own opinions and just focus on the facts - or "textual evidence" as we call it in literature.

I now understand what they were trying to say - that my opinions had to be supported by the text - but at the time the concept was a bit fuzzy, which meant that I studiously avoided expressing any opinion. I invested a lot in the authority of the texts I was examining. I needed to find the right quote to support the argument that I was making, and I know that a few times, the argument emerged out of what quotes I could find, rather than working the other way round.
Today, I came across a series of essays written for a course discussing open education. Since I've been talking about open source with my writing students this week, I took a closer look.

The students were asked to write essays in which they contributed as little as possible to the assignment. One particularly impressive effort produced this essay. In it, the writer uses nothing but other people's words to make an argument. Impressive!

It reminds me of a comment I read one time made by Hannah Arendt, who wrote the biographical introduction to Walter Benjamin's Illuminations, that Benjamin loved using other people's writing and his ideal was to use nothing but others' writing. While the essay above is not as artistically crafted as Benjamin's writing, I think he would've approved.

What I'm wondering is whether this kind of exercise might be useful to my students. If I required them to do the same - perhaps not for an essay but for a journal entry or writing-lab assignment - would they get the point? Or would they (maybe willfully) misinterpret it as me saying they don't need to generate their own arguments in their papers?

I suspect as an assignment it would backfire, but it's an intriguing possibility nonetheless.

Thursday, November 15, 2007

If you can read this...

cash advance

I suppose this just reinforces my statement earlier this week that I find the blog an informal venue for trying out new ideas.

At least I don't have to worry about being too erudite!

Tuesday, November 13, 2007

On writing and blogging

I tell my writing students that writing is not just one activity. I try to convince them that writing involves pre-writing activities as well as drafting and editing. Part of that pre-writing is thinking about what they want to write about, but part of it also involves reading. I know I don't convince some of them. Many of them still persist in only starting to think about the essay two days before it's due and writing it the night before. Some of them can produce passable work that way.

I don't blame them. If they're strong writers and can pull off a B in my course by writing essays the night before, then I can't fault them for not trying to get an A. They choose their own level of involvement.

What I do try to impress upon them, and what I do pay a good deal of attention to in their writing, is how they respond to things they have read. (We employ a loose definition of reading, in which we include visual, auditory and textual elements.) If they try to write those high-school-type essays where they just express their own opinions about things, they don't get a very good grade. If they express their opinion as a response to something they've read, that they've clearly thought about and critically analyzed, then they get a better mark.

But this post isn't about students. It's about the connection between reading and writing.

I start with students because in my writing classes, it takes considerable effort to get students to see that it's not just about writing - that in order to write well, they have to read as well. And in between, they've gotta think.

I was thinking about this the other day, after doing my "writing as conversation" lecture/discussion, and realized that once you get to where I am, the balance has tipped.

There's something that happens as you progress through academia that changes your sense of yourself as a writer. I know I took to heart the lesson my English teachers tried to impress upon me that what was important in academic writing was not my opinion, but a careful analysis of other people's opinions. I can distinctly remember several undergraduate papers where the professor told me to put a lid on my own opinions and just focus on the facts - or "textual evidence" as we call it in literature.

I now understand what they were trying to say - that my opinions had to be supported by the text - but at the time the concept was a bit fuzzy, which meant that I studiously avoided expressing any opinion. I invested a lot in the authority of the texts I was examining. I needed to find the right quote to support the argument that I was making, and I know that a few times, the argument emerged out of what quotes I could find, rather than working the other way round.

Even in my graduate career, I've seen this kind of comment - that I depend too heavily on the text instead of my own scholarship.

I don't think it's a bad thing to teach students at the beginning of their careers to put a lid on their opinions. It's a necessary step in the process of becoming a critical thinker - to shut up long enough to hear what other people are saying in the texts you read. Most of my students desperately need to learn how to listen to others because they've spent way too much time expressing themselves.

But when you reach advanced levels of scholarship - when you're expected to become one of those people that students should shut up and listen to - there's very little guidance in making that transition. Or at least I've struggled with making it.

My advisor has told me I have to "find my own voice". But how I go about doing that isn't as simple. It's easy to give the advice; taking it and implementing it is harder.

You may have noticed that in the last few months there have been a larger number of posts that directly relate to arguments I'm trying to make in the dissertation, or my responses to things I've read. I've tried to only blog things that I think would interest other people, but I've also been trying to create blog posts as a way of "finding my voice", as my advisor suggested.

The format of the blog post lends itself well to this project I think. While there are opportunities to "cite" through linking, its more informal nature means it feels more like a conversation with my readers than a dissertation. So I feel less of a need to "support" my arguments with citation. The informal nature of the blog encourages me to express my opinions on things without feeling the need to rely only on other authorities.

One of the hardest things for me is to see myself as having the authority to speak on a subject. It's a malady that many graduate students suffer from. It's what makes it difficult to write the dissertation. I mean literally write the dissertation. We use the term "write" to encompass all those parts of writing that I tell my students are part of the process. But in my case (and, judging from what I've heard others say, for most graduate students), the actual act of writing - committing words that represent thoughts to paper - is the hardest thing to do. It's terribly, terribly easy to put off writing because you convince yourself you have to read one more important source before you could possibly write, and then another, and another.

But the blog frees me from that self-induced constraint. Since the opinions or thoughts on a topic come out of my reading and thinking about these things, I'm confident that they still have validity, but in the informal venue of the blog post, I feel less need to rely on what others have said about things. This allows me to "try out" an argument to see how it works.

The feedback function of commenting has been an added bonus in this way. The comments of my readers have often pointed out to me places where I'm being simplistic, or even where I've missed something entirely. So not only am I getting a chance to "try out" some of my arguments, but I have people pointing out - sometimes obvious - places where my thinking is incomplete.

I'm happy that I've found a venue that allows me to shift from authority-slave to finding my own voice, but at the same time, I really think graduate education could do a better job of teaching people like me how to make that transition. I've found my own method, but a little more guidance would've gotten me to this point much faster. I have a colleague who is flying through the program, and I suspect that part of the reason is that reluctance to express an opinion is NOT a problem in that case.

As an undergrad, I needed to shut up and listen to others. But as a doctoral student, I need to learn to speak up. My professors were really good about pointing out how to do the first, but on the second, I've been floundering with little guidance. I'll just have to figure it out myself, and if blogging is the way to do it, then blogging it is.

Saturday, November 10, 2007

The difference between bodies and machines

There's something that's been bugging me lately in a lot of the reading I've been doing. It started bugging me a while ago - probably even a year ago - but I didn't really think it important then. But this week during my reading I came across the Psymbiote project.

isa gordon, whose body provides the scaffolding for Psymbiote, is a very young and attractive woman (though it might be more accurate to say her body is being incorporated into Psymbiote, at least according to her conceptualization of the project). Which got me thinking.

See, I used to have one of those young, tight, and attractive bodies like hers. Used to. The older I get, the looser things get in general, and of course the older I get, the less my body can be described as young. This is not unusual. It happens to everyone. (Or at least everyone who doesn't go under a plastic surgeon's knife... and even then, there's no way to really turn back the clock.) I'm not saying this out of bitterness, please don't get me wrong there. I'm rather happy with the state of my body... and my mind and my soul come to think of it. I'm really rather satisfied with the state of things. But this satisfaction has only come as a recognition of the inevitability of that loosening that takes place as our bodies age.

But I'm starting to digress from the point.

My point is that the human body changes throughout its lifecycle. When you think of what a newborn human looks like, their proportions are different from an adult's. Any adult who had the head-body proportions of an infant would look freakish. And of course children have to learn how to control the voluntary functions of their bodies.

ASIDE: There's a whole other argument out there regarding human bodies and tools, that envisions the human body as a tool which we gradually learn to use, but that's another issue and would have to be a different post. It also is an issue that I discuss in the dissertation, but so far my thinking about it isn't complete, so if a post is ever forthcoming, it will have to wait.
But the point is that human bodies change. We all know this. Yet whenever I read, or hear discussions about how our bodies and our machines will become more closely integrated with each other, the technophiles who write these things usually ignore that changing body.

They act as if the body they inhabit is the body they have always inhabited and always will.

Now as I said, I didn't really think about it much. I was aware these theorists were ignoring the changes in the body, but I didn't really think about why. I pretty much chalked it up to the fact that the theorists I've encountered talking about this are men.

Now, before the word "sexism" can pop into your head, let me try to explain. I will grant that it is sexist of me to expect that men writing about bodies and technology will ignore the body to a greater extent than women might. Yes, I'm guilty of thinking that. BUT. It wasn't so much that these were men writing, but that they were adopting a masculinist point of view.

Several months ago I read N. Katherine Hayles's How We Became Posthuman. In it, she talks about the history of cybernetics - I would highly recommend the book if you're looking for a general discussion of cybernetics that incorporates its history with analysis of its emergence in literature (e.g., Gibson, Stephenson, etc.). She recognizes that much cybernetic discussion dismisses the body, and when it does discuss the body, it is a normalized body, imagined to be white, adult and male.

If you think about it, she's right. Most of the time, the cyborg is this. Look at Robocop. Look at Terminator. Look at Case, the console cowboy. Look at Hiro Protagonist. Look at Johnny Mnemonic. All men. All white. All adult. Which means that they inhabit bodies that can go for years and years without changing.

For women, change occurs much more easily. For adult women of the same age as these men, there's the possibility that their body will change every month for a few days. There's also the very real possibility that their body will change shape radically if, for example, they become pregnant. And of course the nursing afterwards changes the shape of the body just as its function changes.

Which is why I started tweaking about this idea when I saw Psymbiote. In one of the pictures on her webpage, it shows how she was fitted for a kind of exoskeleton that she'll wear. It looks to be made of fiberglass. Now, fiberglass, when it's unprocessed, is highly malleable. But once you've formed it, it's very solid. (I dated a guy who worked in a fiberglass factory for a while, so I probably know more about it than is really necessary for an adult to know.)

My first thought on seeing this photo?

"What happens if her arm changes shape?"

Well, what does happen?

If Psymbiote is the combination of isa and technology, then what happens when isa changes so that the technology no longer fits comfortably into/onto her body? It will happen. isa's body will change. It may not change for many years. It may not change by much. But it will change.

If the technology attached to her body is indeed symbiotic, as is the project's supposed aim, then what happens when one member of that symbiotic relationship changes? How extensive are the effects?

In nature, if one organism in a symbiotic relationship changes, it can affect the health and even life of the other. Would a symbiotic relationship with a machine be the same? If you think about what would happen if the machine changes, images of all those disastrous science fiction scenarios where people are damaged by damaged machines certainly come to mind. Would the reverse be the same? Would a change in the human damage the machine?

And of course, this leads to the question of how much we can really integrate the body with machines, since in order for machines to change, they require an outside agent to effect that change (at least our machines of today - perhaps we'll invent machines that can alter their own structure in the future, the same as how we alter our body structures, intentionally or not). Our bodies can change through outside agency. But they also frequently change without any external intervention.

Of course, what got me thinking about this is that Psymbiote's organic component - isa gordon - is clearly female. What if not just her arm changes, but large parts of her body, if she chose to become pregnant? How would that affect those static technological pieces? How would they potentially constrain the function of her pregnant body? Can technology/machine adapt to interact in such an intimate manner as Psymbiote is constructed to act if its organic components change?

No matter how imaginative I get about it, I keep coming to the same conclusion. The machines that exist today, unable to adapt their form to changing circumstances, would be ill equipped to interact intimately with a human body that is capable of changing - often in radical ways - in response to its function, that is, its changing physiology.

This is a fundamental difference between humans and machines: our ever-changing and changeable bodies. Until machines are capable of such change, there will always exist a gap between us and them that will make intimate mergings of body and machine difficult, to say the least.

Friday, November 09, 2007

Flip side

"In the game of life and evolution there are three players at the table: human beings, nature, and machines. I am firmly on the side of nature. But nature, I suspect, is on the side of the machines."
George B. Dyson, Darwin Among the Machines

Wednesday, November 07, 2007

How to deke out an American*

As a Canadian, you have to be extra vigilant. There are a lot of impostors out there. If you suspect that someone is falsely trying to pass themselves off as a Canadian, make the following statement - and then carefully note their reaction:

"Last night, I cashed my pogey and went to buy a mickey of C.C. at the offsale, but my skidoo got stuck in the muskeg on my way back to the duplex. I was trying to deke out a deer, you see. Damn chinook, melted everything. And then a Mountie snuck up behind me in a ghost car and gave me an impaired. I was S.O.L., sitting there dressed only in my Stanfields and a toque at the time. And the Mountie, he's all chippy and everything, calling me a 'shit disturber' and what not. What could I say, except, 'Sorry, EH!'"

If the person you are talking to nods sympathetically, they're one of us. If, however, they stare at you with a blank incomprehension, they are not a real Canadian. Have them reported to the authorities at once.

The passage cited above contains no fewer than 19 different Canadianisms.
In order:

pogey: EI (Employment Insurance). Money provided by the government for not working.
mickey: A small bottle of booze (13 oz) (A Texas mickey, on the other hand, is a ridiculously big bottle of booze, which, despite the name, is still a Canadianism through and through.)
C.C.: Canadian Club, a brand of rye. Not to be confused with "hockey stick," another kind of Canadian Club.
offsale: Beer store.
skidoo: Self-propelled decapitation unit for teenagers.
muskeg: Boggy swampland.
duplex: A single building divided in half with two sets of inhabitants, each trying to pretend the other doesn't exist while at the same time managing to drive each other crazy; metaphor for Canada's French and English.
deke: Used as a verb, it means "to fool an opponent through skillful misdirection." As a noun, it is used most often in exclamatory constructions, such as: "Whadda deke!" Meaning, "My, what an impressive display of physical dexterity employing misdirection and guile."
chinook: An unseasonably warm wind that comes over the Rockies and onto the plains, melting snow banks in Calgary but just missing Edmonton, much to the pleasure of Calgarians.
Mountie: Canadian icon, strong of jaw, red of coat, pure of heart. Always get their man! (See also Pepper spray, uses of.)
snuck: To have sneaked; to move, past tense, in a sneaky manner; non-restrictive extended semi-gerundial form of "did sneak." (We think.)
ghost car: An unmarked police car, easily identifiable by its inconspicuousness.
impaired: A charge of drunk driving. Used both as a noun and as an adjective (the alternative adjectival form of "impaired" being "pissed to the gills").
S.O.L.: Shit outta luck; in an unfortunate predicament.
Stanfields: Men's underwear, especially Grandpa-style, white cotton ones with a big elastic waistband and a large superfluous flap in the front. And back!
toque: Canada's official National Head Apparel, with about the same suave sex appeal as a pair of Stanfields.
chippy: Behaviour that is inappropriately aggressive; constantly looking for a reason to find offense; from "chip on one's shoulder." (See Western Canada)
shit disturber: (See Quebec) a troublemaker or provocateur. According to Katherine Barber, editor in Chief of the Canadian Oxford Dictionary, "shit disturber" is a distinctly Canadian term. (Just remember that Western Canada is chippy and Quebec is a shit disturber, and you will do fine.)
Sorry, eh.

*I can't take credit for this one, but thought you might enjoy.

Thursday, November 01, 2007

Synthetic biology

One of the hazards of the kind of dissertation I'm doing - one that engages with contemporary literature based on emerging technologies - is that there are always new technologies emerging that have an impact on the literature being produced. And that means there's always something new I'm learning about.

Take Synthetic Biology.

I actually hadn't even heard the term until a couple of weeks ago, just before the whole James-Watson-discoverer-of-DNA-is-a-racist thing blew up in his face. Frankly, I think the man made a big mistake making the comments he did - first because they smack of racism, and second (and more importantly) because people assume that, since he's a scientist, he's speaking from a scientific perspective - which he's not. But that's a whole different issue - the link above will give you a good idea of the nature of Watson's comments.

Synthetic biology is an interesting extension of transgenics, and it is moving human beings much, much closer to the science fiction scenarios in the books I'm writing about for the dissertation.

From the ETC report (PDF) on synthetic biology, "Extreme Genetic Engineering: An Introduction to Synthetic Biology":
Transgenics, the kind of engineering you find in genetically modified tomatoes and corn, is old news. As recombinant DNA splicing-techniques turn 30 years old, a new generation of extreme biotech enthusiasts have moved to the next frontier in the manipulation of life: building it from scratch. They call it synthetic biology.
Essentially, genetic engineering is the alteration of existing organisms, while synthetic biology is the creation of new organisms from scratch. The report goes on to describe synthetic biology as "the design and construction of new biological parts, devices and systems that do not exist in the natural world and also the redesign of existing biological systems to perform specific tasks" and notes that the technologies that allow this design and construction are becoming readily and cheaply available to anyone with a laptop, some knowledge of genetic engineering and a few dollars.

Synthetic biology is interesting to me for two reasons. First, because it is a controversial topic that will require social input into policy production (and that policy will need to be more globally oriented than nationally oriented since the scientific community transcends national considerations and since globalization means that anyone anywhere can get ahold of the materials needed for synthetic biology). Proponents of synthetic biology, the most prominent of whom is J. Craig Venter (quite a character by all accounts), point to the humanitarian solutions to problems that synthetic biology brings. Producing medicines, like those needed for malaria treatment, or bio-fuels through synthetic organisms does indeed have the potential to solve some of the most pressing public health and environmental problems that the world faces today.

But producing beneficial drugs and other solutions to human-made problems is accompanied by a darker potential offered by synthetic biology, and that is the potential to create biological weapons or synthetic organisms that are harmful to human beings on this planet. Just as we have had to do for preceding technologies (e.g. nuclear energy and weapons), humanity has to come to some sort of consensus about the appropriate use of these technologies and how to regulate them.

Not only am I interested in the development of public policy regarding synthetic biology as a human occupying a space on this planet, but it also impacts the discussion of genetic engineering and other emergent technologies (AI, alife, genomics, nanotechnology, robotics) that emerges in the novels that I'm examining in the dissertation. So it's both my life and my work, so to speak, that are at stake.

This is in part the second reason that I'm interested in synthetic biology. Synthetic biology offers the means to produce the kinds of people/organisms/beings found in the science fiction novels I'm examining. In other words, synthetic biology, and associated technologies like nanotech, alife, and robotics could create the kinds of humans that right now only exist in science fiction. It could turn science fiction into science fact.

This transformation is the reason why science fiction theorists exist. Science fiction offers readers the opportunity to look into a potential future and decide whether it's a future they want or not. Certainly many of the scenarios imagined in past science fiction adventures have not come to pass. But in some cases I have to wonder if that's simply because history took a different turn, or because someone read a science fiction story that sounded a warning about a society we might not want to create and turned away from an avenue that would have led there.

So synthetic biology interests me as a human who will live in a world where it will become more and more prevalent, and it interests me as someone who studies literature that has already imagined what synthetic biology might produce.

And from what I've seen of the literature, even benign uses of synthetic biology have some negative and unpleasant consequences. But that also is another post.

Closely related to synthetic biology is posthumanism (or transhumanism or extropianism - they're all variations of a similar sentiment, even though some within the movements vehemently draw lines between them). Posthumanism tries to imagine what human beings could look like in the future, as we develop various technologies (robotics, alife, nanotechnology, genetic engineering etc.) that have the potential to change the nature of the human from a fully organic organism to a technologically-mediated one.

One of my first encounters with the ideas of posthumanism was with the Lifeboat Foundation, which appears at different times to be part crack-pot and part rational voice in the wilderness. I suppose that's what happens when you get enough people together that have the same goal, but different ideas about how to get there.

Posthumanists generally embrace the idea that humans will change, with the often unstated assumption that this change will be for the better. They have embraced some of the stranger manifestations of this - e.g. cryogenics - but they have also encouraged researchers in more prosaic areas of endeavour, such as weather management and storm control.

For the most part, Lifeboat is more supportive of synthetic biology than ETC, though I think the difference has more to do with organizational structure and funding than with the relative assessment of risk associated with synthetic biology. Whether the general population embraces it or fears it, there will need to be public discussion of the technology in order to control it, and the sooner that happens, the better (both for my life and my dissertation!)

Tuesday, October 30, 2007

Still thinking about humans

I'm still thinking about "more human than human" especially after such great comments from everyone. Thanks!

I sat in the airport this weekend thinking about it, not just because of White Zombie, but because I thought it might be a worthwhile thing to think about.

At first, I rationalized that figuring out what "more human than human" means would require defining "human" first. But I also quickly realized that defining what makes a human, human in the first place would require a really long answer... say, dissertation length.

But then I rationalized that the reverse might be just as productive. If I were to examine the answers to what "more human than human" looks like, those answers might reveal the assumptions on which those answers are based. After all, in order to answer the question, you have to have some idea what "human" is in order to imagine what could transcend the human to become "more".

So I started thinking about it. The music video features a lot of kids in costumes, with the most frequent costumes being a pumpkin head and a robot. While in the live show, the huge creature wandering the stage is more alien looking. (I posted the best clip I could find from Youtube - and yes, that's actually the best one)

The first thought that came into my mind when I saw the thing in the stage show we attended was that it looked a bit like a homunculus. Not a lot, but a bit. More so like the sensory homunculus that is supposed to represent the brain's designated spaces for bodily sensations.

What both homunculi have in common is an overly large head, and the sensory homunculus has very long arms. I think that's what caught my attention the most with the big monster/alien thing at the Zombie show. But that's beside the point.

Zombie's answer to what is more human than human is an alien or a monster. And that got me thinking that the monster just might be more human than human.

Part of what makes the monster monstrous is exactly that it is "more". It's not more "human" than humans, but it's certainly "more" than human. Monsters either have more limbs, or eyes, or larger heads, or even just larger bodies overall. Their bodies tend to be "more" than human bodies are, which is what actually makes them monstrous.

Then I got to thinking about the novels I've been reading and realizing that many of the clones and cyborgs in them are monstrous in just this way - they are "more" than human. So maybe Zombie's answer of "monster" to the question of what is more human than human isn't so far off. Maybe what really makes us human is what unacademic advisor and rebeckler suggest in the comments below, that it is our frailty and our failings that make us human. To be "more human than human" is simply a construction we hold up that fails to provide us with a guideline about what it means to be human because it expects more from us than we are capable of. And that more is itself monstrous.

But I think in proposing that, I'm waxing philosophic in a way that would not be useful in a literary dissertation. But it's fun to think about, and has served the function of organizing some of my thinking about the topic and about the books I'm reading. In helping me organize my scattered thoughts, there is indeed much good.

I still am not sure about the robot in the music video though... that might be another post. Any thoughts?

Oh, and in answer to my question/suggestion last week: yes, a couple of days of sunshine and warm weather have made me feel less cheated about our short summer. Now I'm ready for fall and winter to arrive.

Rob Zombie More Human Than Human Live

Short clip from a live performance

White Zombie - More Human Than Human

The images in this video are more robotic than the images in the live show

Friday, October 26, 2007

Viva!

I'm running off this weekend to a place where I can wear dresses without needing tights, and can even wear shorts during the daytime. I can't wait!

I don't write this to make people envious - I know how it feels when other people are taking off for fun vacations in warm places and you're stuck at home, and it's not a nice feeling. But I am writing it because I'm looking forward to it. The last few months have been much more hectic than I would've expected them to be if you'd asked me about them at the beginning of the summer.

And August here was so wet and cool that it didn't feel like summer at all. Not to mention that the month was eaten up by moving and such. So I'm hoping a dose of sun AND warmth this weekend will dispel the feeling that I somehow missed summer this year. I'll have to let you know when I get back whether it worked...!

Wednesday, October 24, 2007

More human than human

Friends, readers, curious bypassers...

What image does this phrase bring to mind? What could be "more human than human"? If I asked you this question, what would your answer be?

Tuesday, October 23, 2007

Impatient yet overloaded

I get to go away this weekend... which I'm very much looking forward to doing.

But in the meantime, I've got lots to do.

Which I don't really want to do.

I want the weekend to come soon, but I need it to wait so that I can finish off everything that needs to be done this week.

Two equal but opposite desires...

Thursday, October 18, 2007

Hangin' out with the smart kids

I knew leaving my colleagues far behind and trying to write a dissertation from afar would be a challenge. But I thought it would be a challenge to stay focused, not that it might make me stupider.

Well, not really more stupid, but less articulate at least. I've been listening to Malcolm Gladwell's Blink, in which he proposes that split-second decisions can be the most accurate. He calls this rapid cognition and it results from something he calls the thin-slicing of experience. The reason thin-slicing and gut instinct can yield some of the most accurate information on which to base decisions is because they are processed subconsciously. For example, he found that gamblers playing a rigged game responded physiologically to the rig (sweaty palms, indicating increased stress) before they consciously figured out that the game was rigged.

This subconscious attention to the world also affects our interaction with it. In an experiment, researchers gave two groups of people 30 Trivial Pursuit questions to answer. Just before they were to answer the questions, the first group was told to think about what it would be like to work as a professor, while the second group was told to think about soccer hooligans.

Just thinking about intelligence (presumably a requisite to being a professor) yielded 55.6% correct answers in the first group, while thinking about a lack of intelligence (again, presumably a requisite for being a soccer hooligan) yielded correct answers only 42.6% of the time. Wow.

It wasn't just intelligence either. Test subjects given a linguistic test that included a high number of words having to do with aging walked slower after exiting the test room, and those given words about rudeness and aggression were ruder to the examiner upon leaving.

So... maybe it would be of benefit to hang out with smart people... to think of myself as one of them... to admire how smart they really are.

Certainly couldn't hurt.

But I've left my academic community far behind, and try as I might, I don't belong to the ones here.

Might have more of a negative impact than I originally thought...

Wednesday, October 17, 2007

Tidbits

*I began this post almost a week ago before things went really hairy around here, so I'll give you the beginning and continue on from that point.

The bone marrow people think I'm a low risk donor, so they've decided that I'll move on to the testing phase of pre-screening. This means I have to go have blood drawn next week, which surprises me because this whole process has already taken a few weeks. I would've thought it would be faster since the sooner someone receives a transplant, the better, right?

William Gibson looks like a hippie farmer. Now, it's true that I've seen pictures of him before. But I'd never seen him in person till last night, and he looks and moves a bit like a hippie, and a bit like a farmer. I've met a few farmers before, and they have that same kind of look haunting their faces - I'd always thought it was a result of the unreliability of their business. After all, for a farmer, no matter how savvy a businessman you are, you're still at the mercy of the weather to some extent. This seemed to always inject a look that was one part concern, one part resignation into their faces.

But maybe I don't really know farmers that well. All I know is that Gibson had that same kind of look and could've passed for a prairie farmer except for the retro-vintage Converse on his feet. Oh, and his writing is really cool!

One of these days I'd like to take a computer course that would explain to me in a bit more detail how this machine that I use every day, that I sometimes feel chained to, that frustrates me to no end at times, works. Just as I was starting this post, I ran into a problem that kept shutting down some of the functions of my system.

Now, I'm not afraid of trying to fix stuff myself, so since web access wasn't one of the problems, I googled the error message. The most prominent answer was that I'd picked up a virus, but as I kept reading, the only "people" claiming this was the problem were all pointing me to the same software - available for purchase at the low, low price of $49.95. So I had to wonder about that one, especially since I run virus scans fairly regularly with a product that I've been using successfully for years.

The other solution to the problem was to enter the registry and delete the files that had corrupted. Now, I've actually done this before - even though, for a non-techie person like me, it's a scary venture - but that time I had a good set of instructions from several sources (including MS themselves) and it was successful. This time, the only instructions were to erase a particular kind of file, whose designation was not provided. That meant I could go into the registry, but I'd have no clear idea which files I should delete. Not something I'm ready to mess with.

Logically, reformatting the hard drive to original specification should restore the corrupted files in their original form, yes? Yes. It did. But it also meant I had to back up all my data files, then reload them all (a process not quite complete), as well as reload all the additional software that I use (another process not quite complete). In consequence, I have been reloading but not producing much for the last few days. I guess there isn't an interesting observation there. Just some kvetching.


I have a greater appreciation for the time and care that goes into the production of a television show, or any other video production. Over the weekend, we shot a 20 minute segment for one of our elearning programs. It took about 10 1/2 hours.

Of course on a tv show, there are multiple cameras, so that if you change locations, you don't have to haul one set of lights, camera, monitor, associated cables, power units etc. to the new location as we did, which would make the process faster. And you could easily access backup equipment if a malfunction happened, instead of having to drive across town to check that the equipment was indeed working. So we might have been able to finish a bit faster, but not by much. It is certainly a time-consuming production.

It was also really neat to see two professionals interacting in that kind of work environment. The talent and the cameraman we used had worked together before (in fact the talent was the person who recommended the cameraman to us), and the easy shorthand that they fell into was fascinating to watch. Perhaps it's just that film still holds a fascination for me that I was so impressed, but I was.

We're nearing the end of our semester at work (yes, we're off kilter from every other institution around us), which means students are starting to get desperate. I have one who is trying to do 10 weeks' worth of work in 2 to catch up. Every time before this when students have tried to catch up like that, they've failed. But this guy just might make it - a week in and he's still on track. I'm rooting for him to do it, if for no other reason than to restore my faith in the human ability to transcend difficulties. Other students, however, are asking for bonus assignments to make up work due 6 weeks ago. Maybe I'm a hard ass, but I'm saying no. I've got better things to do than dream up additional work for them when they should've done the original stuff.

There you go. That's part of what's been going on lately. But it looks like things will slow down now, so I'll post a bit more often, at least for a little while...