Thursday, December 23, 2010

This One Ought To Be Popular

So . . . it's almost Christmas, that most joyous time of year, the one I can't stand.  Before you go rushing to judgment about my motivations, let me offer my modest proposal for how we all handle the holidays going forward and my reasons for it.  Keep in mind I am not out to ruin anyone's fun, that this is just a blog written by someone who occasionally goes on crazed rants (about pizza, amongst other things) and that I did this on December 23rd so that this could be well ignored and forgotten before the actual holiday hits.  My vitriol towards xmas after the jump.

Friday, December 17, 2010

Satisfyingly Maximized

So . . . there's two types of people in the world.  I like this intro because it can really go anywhere - just fill in the following blanks: those who (blank) and those who don't (blank).  It can be fun and is really flexible.  Use things like "watch The View" and "don't watch The View."  Or "like Dan Brown's novels" and "read crap."  Today I'm writing about a different kind of distinction, though.  On what may be the fundamental difference between types of people after the jump.

Sunday, December 12, 2010

This Blog is Three Times as Awesome as Others!

So . . . I had a couple of other topics I was thinking of writing about today (basketball, people's biases and how they affect behaviour) but I'll save those for another day.  Instead, I'm going to write about the commercial I just saw on TV (no, television hasn't given me a short attention span at - hey!  look at that!).  One of the topics that I teach that students tend to have a hard time accepting is that while we all know that we shouldn't trust ads, ads are the source of much of our product information.  Exploring this paradox, and a link to a unique ad, after the jump.

Friday, December 10, 2010

Warning! Steep grade ahead!

So . . . it's grading time!  That bittersweet time of year when I'm happy that the semester is just about over, but faced with a mountain of student papers to assess.  This is a bigger task than most people realize, because I actually have to read the darn things.  And reading them can be tough.  Not to sound trite, but the quality of student writing is usually horrible.  But with one final push, I'm done the semester, on holidays, and ready to face a new batch of students in January.  Even so, there is one change I would like to see with regard to grading, discussed after the jump.

Tuesday, December 7, 2010

Pasta Sauce!

So . . . last week I made pasta sauce (tomato & meat) for the first time in six months.  No, I didn't give it up for an extended Lent (or Omer, or Ramadan, depending on your faith) - I tend to make large batches (and freeze them) and only recently finished the last batch.  I love making pasta sauce, even though it takes several hours.  This twice-a-year (or so) activity is probably one of my favorites to do in the kitchen (cooking-wise; get your mind out of the gutter, sicko).  Why I love making it, and the recipe (kind of) after the jump.

Friday, December 3, 2010

We Are All (Lousy) Witnesses

So . . . it's coming up on exam time for my students, and I just did our course review and exam prep in class earlier this week.  The exam is for my business case course, which has the students read a business case (a story about a business and the problem they are facing).  They then must use the available information to come to a defensible decision about what the company in the case should do.  One of the skills involved is to be able to sort out useful information from useless (or in some cases, incorrect) information.  Which brings me to what I'm talking about today - how we can use information provided to us.  Examples and clarification after the jump.

Tuesday, November 30, 2010

News of the Day

So . . . once in a while I hear about a bunch of news stories within a short time, all of which inspire in me some emotion (usually disgust, but not always).  Today was one of those days.  In a matter of an hour or so I heard about four news stories on the radio that stuck with me, and I thought I could comment on them.  I found out about all of them on the Howard Stern show this morning, but have since checked into them a bit further to get more detail.  The stories after the jump.

Friday, November 26, 2010

True or False Question

So . . . I thought that today I would train my keen mind on a most deserving target - a couple of lines from Hamlet.  A couple of lines that find their way into the general consciousness through not only the play but also countless valedictory addresses each spring.  Hey, it's Friday, I don't want to complain about current events or stupid decisions people make (other than possibly mangling the intention of Shakespeare's words in graduation speeches), so I'll just ramble about some iambic pentameter.  Said ramblings after the jump.

Thursday, November 25, 2010

Thank God! (or Someone Else)

So . . . it's the busiest travel weekend of the year for our American friends, Thanksgiving weekend.  One of the most popular holidays, and it's not only because of the extended long weekend and Black Friday deals.  Many people get fulfillment from the act of giving thanks, of taking a moment to acknowledge that we are lucky to have what we do and that it may not be all our own doing.  But who should we thank?  The answer (at least one version of it) after the jump.

Tuesday, November 23, 2010

Billboard Charts and Mathematical Graphs

So . . . I like music (yeah, kind of like saying "I like food," I know).  That is to say I enjoy music - I'm not really one of those "music can change the world" types.  My tastes tend to be eclectic (very few genres I don't listen to) and my collection is fairly extensive, I think (not a world-record-sized one, but a fairly robust 15,000 songs).  Being a researcher, though, I also have an interest in what makes some music popular and other music not.  The scientific answer after the jump.

Friday, November 19, 2010

Being Healthy and Wealthy, But Not Wise

So . . . one of the reasons I started this blog is that occasionally I read something in the newspaper that just makes me want to throw it across the room.  In fact, this is a primary reason why I still read paper newspapers - it would get very expensive if I threw the computer screen across the room every time something ticked me off.  Well, anyway, today was one of those days.  Find out what's peeving me today after the jump.

Thursday, November 18, 2010

I'm No Airhead But I've Got Gas On My Mind

So . . . yesterday I was asked to comment for a story in the local paper on gas prices.  And, because I have become accustomed to my mental spewings having a greater share of the article on the web (thanks to this blog!), I thought I'd expand.  Gas prices not only give me an opportunity to study my first research love, dynamic pricing (maybe I'll discuss that another day) but give a view into how consumers respond to ever-changing prices and stimuli.  And we aren't very good in our responses.  More gassy emissions after the jump.

Tuesday, November 16, 2010

Spare Change for Promising Politicians

So . . . I had a request from one of my regular readers (yes, they exist) that I provide a follow-up to the U.S. midterm elections earlier this month.  I had done a pre-election post discussing the sad state of reductionism in politics.  Today I'm going to write about the fallout from the Democrats' poor showing and the emergence of the Tea Party.  And the news will be good, kind of.  Also, I once again relate political decisions to buying a television, all after the jump.

Friday, November 12, 2010

Taking the Happy out of the Meal

So . . . toys are making kids fat.  Or so says Eric Mar, a city supervisor in San Francisco who sponsored a now-passed city law preventing quick-service restaurants (read: McDonald's) from giving out toys with (happy) meals unless said meals meet nutritional guidelines.  Taking toys away from kids, that's horrible, and right around Christmas, too.  Worst of all, it won't even work.  Fat kids and crappy toys after the jump.

Thursday, November 11, 2010

Are Babies Clairvoyant?

So . . . I thought I'd copy from local news reports and ask a provocative question as the title ("The snack food that will destroy your stomach lining is in your pantry . . . we'll tell you which one it is at eleven!"). Or maybe you thought this was the tagline of Christopher Lloyd's latest cinematic triumph.  Alas, no.  It's just an example of our blindness to our own behaviour and how it may be perceived by others (huh? you lost me there . . .).  I'll try to add clarity to infant clairvoyance after the jump.

Tuesday, November 9, 2010

Is the Web of Research Related to the 'Net of Teaching?

So . . . it seems that this online learning thing may be for real.  This is a terrifying thought for many professors, because if lectures or notes are fully available online, what use are they (we)?  Isn't it enough that all of the information is available online, or do the dronings of a researcher into obscure South Pacific polydeistic tribal rituals have to be there too?  Can't online learning be limited to those trade schools and administrative studies colleges that advertise on daytime TV?  The answers after the jump.

Friday, November 5, 2010

Batty Batty Batty Bat

So . . . apparently white nostril fungus is causing problems for bats.  The problem specifically is that a type of mold is killing whole populations of certain types of bats.  And even though there is zero indication that human activity caused this, there is a small but growing movement to do something to help these bats.  Some articles (like this one in Scientific American) are framing the bat deaths as something that will be a problem for people if we don't do something about it.  We'll do the batty bat after the jump.

Thursday, November 4, 2010

Power Ranger or Texas-Sized Bust?

So . . . a belated and bitter congratulations to the world-champion San Francisco Giants (bitter because back when I used to follow baseball the Rangers were "my" team).  They handled the Texas Rangers very adeptly en route to a 4-1 World Series win.  One of the biggest surprises was that supposed playoff ace Cliff Lee of the Rangers had two losses in the series, after having never lost a playoff game before.  On why the notion of a "playoff performer" is a fallacy after the jump.

Tuesday, November 2, 2010

Complex Issues, Simple Candidates

So . . . today is voting day for our friends to the south.  Maybe it's because I didn't really follow the politics of a foreign country when I was a kid, but it sure seems to me that midterm elections get a lot more attention than they used to.  Of course, this particular midterm election is being viewed as a referendum on both Obama's first two years and the Tea Party movement, so people on both sides feel as though they have a lot to say.  But part of the problem is that even though there is a lot of talking going on, no one is saying very much.  I contribute my piece of speech after the jump.

Saturday, October 30, 2010

Wearing a Costume To Show Who We Really Are

So . . . Boo!  It's hallowe'en tomorrow.  Candy, kids in costumes, pumpkins, and of course normally modest women dressing up like whores.  Okay, the last one is a bit harsh, but we all know that hallowe'en is one of those times of year when people allow themselves to behave in ways that they normally wouldn't (such as actually speaking with neighbours).  But is the excuse of "I'm not normally like this, this isn't who I really am" actually true?  Who are you people, really?  The answer may be after the jump, along with some literary spoilers.

Thursday, October 28, 2010

Pray To the Basketball Odds

So . . . the NBA season opened a couple of days ago and we're pretty much ready to crown the champions.  After a 1-1 record, losing to heavyweight Boston (literally, now that they have Shaq and Jermaine O'Neal) and beating lightweight Philadelphia (whose new coach is already missing games and blaming it on a concussion from months ago), the oddsmakers, pundits, and so-called experts claim that the Miami superfriends are ready to be champions.  Based on odds I spent all of about 4 minutes researching, they have a 35-40% chance to win a title.  More on the problems with these odds after the jump(-ball).

Tuesday, October 26, 2010

More Powerful than 1.21 Gigawatts (or Jigawatts?)

So . . . today is the 25th anniversary of "Back To the Future Day," the day that Marty McFly went back in time from 1985 to 1955.  In honour of the day, BTTF is being released on Blu-Ray, there were screenings of the film yesterday evening, and half of Toronto is wishing they had a time machine so they could go back to yesterday and vote strategically.  I had my own time travel experience recently, and it's gotten me thinking about counterfactuals (basically an extended "what if" - alternatives to one's own reality).  Time jumps after the jump.

Friday, October 22, 2010

Mickificki

So . . . it seems strange to me that we as a society still have major issues with the idea of swearing on television.  While coarse language may not be an ideal, and isn't necessarily something we deliberately teach our children to use, it is pretty much used by everyone.  Just like the notion that on television a couple in bed before or after sex is always covered up to shoulder-level, a lack of profanity on TV just doesn't reflect reality.  More on this, some profanity, the death of the sitcom, and some fun clips after the jump.

Thursday, October 21, 2010

If We Choose Crap, Then Crap Is Good

So . . . I like to read, you know, books.  Sometimes this makes me feel old, but really it's just an enjoyable experience next to which reading from a screen pales in comparison (in fact, as pale in comparison as the complexion of someone who stares at a screen all day).  My taste in books (and TV, movies, and music) is quite eclectic; it isn't that I am not discerning (I'm no literature whore) but rather that my interests span a wide variety of fiction genres and non-fiction topics.  And recently I was presented with an interesting question (found, where else, in a book): how do we decide the merit of something like a book?  The answer (kind of) after the jump.

Tuesday, October 19, 2010

Freely Speeching Off

So . . . I have a problem with free speech.  Not the right to it, but rather how that right is misinterpreted.  Freedom to say whatever we want is extremely important in any society.  We can see from our privileged perch here in the western world (not jingoistic at all!) how the lack of this freedom wreaks havoc in other parts of the world.  Unfortunately, though, we are also spoiled by free speech, and not in the way you would expect (e.g. the "fair and balanced" speech of Fox News).  Instead, I have a problem with how the right is understood, and I would be exceedingly happy to explain after the jump.

Friday, October 15, 2010

Curse These Young Whippersnappers!

So . . . I correspond a lot with people born after 1990 (no, that doesn't sound creepy at all!) because I teach in a university.  Nary a day goes by without some e-mails from my students, and it is very distressing.  I know I'm far from the first (or last) person to complain about this, but those young people today don't know how to write a message.  Whether this is caused by the popularity of texting and acronyms (with LOL and its descendants being pet peeves of mine) or the decline of proper English in general, it's a damn shame.  A lesson in e-mail writing after the jump.

Thursday, October 14, 2010

Here's a Quality Definition

So . . . one of the words that gets thrown around a lot in my classes is "quality."  This is a practice that I try to squash early in the semester.  Even though quality is used (over-)extensively in the business world (e.g. "Quality is Job One") and even has legitimate uses (i.e. quality control in an operational sense), it is more or less a meaningless word when applied to marketing.  It usually is applied as a synonym for "good," or "reliable," or "durable," but I don't buy that.  In other words, I think "quality" is a low-quality adjective.  And the same could be said for a lot of other adjectives we use.  Examples after the jump.

Tuesday, October 12, 2010

ACR Part Two: Change the World

So . . . the other topic I wanted to talk about from my consumer research conference in Jacksonville has to do with the motivation for a lot of the consumer research that's done.  Basically, a lot of the academics in the consumer behaviour field, including some of its most highly regarded members, have chosen to do work with the purpose of making the world a better place.  More on how upset this gets me after the jump.

Saturday, October 9, 2010

ACR Part One: Not Being an Ass in Jacksonville

So . . . I'm here at an academic conference in Jacksonville, Florida.  The conference is the Association for Consumer Research (ACR) and is pretty much the biggest conference for people in my line of work.  I presented some of my research, saw other people's research, and basically marketing-prof-nerded it up for a couple of days.  I'm splitting my comments on the conference into two postings - one today and one in the next day or few.  Skip to after the jump to find out my thoughts on Jacksonville and the various decision processes that go into attending a conference.

Tuesday, October 5, 2010

I'm Not Cowed by Vegetarians

So . . . I'm a carnivore.  I have no qualms about eating the roasted flesh of animal carcasses.  No moral dilemma, no health concern, just a love of meat.  I know that not everyone feels this way, and this was brought to the top of my mind recently.  I was walking the halls at the university and passed a display set up to convert people to vegetarianism (or veganism).  I also overheard a snippet of conversation between the people manning the booth and a passerby.  And as I am in the habit of writing blog posts about an overheard sentence that came to me out of context, here we are.  More about cows, people, and how they juxtapose after the jump.

Friday, October 1, 2010

A Few Updates

So . . . wow, Friday already and no posts this week yet.  Busy times.  Today I'm just going to do some housekeeping for my blog (blogkeeping?) and give a few updates about some of the stories and topics that I have covered in the few months I've been doing this.  And because I've been lax this week I've packed a bit more into this one. 

Bonus feature: I'm going to try a new thing that I may incorporate more extensively in the future.  But more about "after the jump" after the jump . . .

Friday, September 24, 2010

The Paradox of Outcomes, revisited

So . . . one of the topics that I am most interested in is outcomes, and I wrote a post about it a couple of months ago. If you can't be bothered to follow the link, the essence of the post is that we confuse best decisions (choosing the best option prior to learning the outcome) and right decisions (the best decision in light of the eventual outcome). The example I use is the decision of whether to trade three lottery tickets for one; the best decision is to decline the trade, but if the single ticket is the eventual winner, the right decision would have been to do the trade.
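
For the numerically inclined, here's a minimal sketch of that lottery logic in Python (the draw size and jackpot are made-up numbers, purely for illustration):

# Hypothetical lottery: 1,000,000 tickets, one winner (numbers assumed).
N = 1_000_000
jackpot = 5_000_000  # dollars (assumed)

ev_keep_three = 3 / N * jackpot  # expected value of holding three tickets
ev_trade_down = 1 / N * jackpot  # expected value after trading three for one

print(ev_keep_three, ev_trade_down)  # 15.0 vs 5.0 - declining the trade is the best
                                     # decision, whichever ticket turns out to be "right"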

One comment I got about this post was that it sounded like "hindsight is 20/20." I would agree insofar as the right decision goes - we determine whether we made the right decision through hindsight. The best decision remains unchanged - it is based on probabilities, not outcomes. And the fact remains that unless we are Marty McFly or Doc Brown, we never have access to the eventual outcomes. So ultimately what we really want are best decisions, even if that means accepting that some of them won't be right decisions.

This was illustrated when I appeared on Jeopardy! a few years back. Prior to taping the show, the contestant wranglers give an hour-long spiel about all aspects of the show - rules, gameplay, and some rudimentary strategy. One suggestion that was given was to avoid jumping around the gameboard looking for daily doubles, and instead pick the clues from lowest dollar amount to highest. Now if you could somehow suss out where they were, your probability of winning would be much higher. But looking for one or two clues out of 24 is a fool's errand, because the odds of actually finding them are low. Should you jump around the board and find one, you would have made the right decision, but the strategy itself would never be the best decision (interestingly, the contestant wranglers said that only one contestant in the show's history had a talent for knowing where they were, and he was very successful; a more likely explanation is that he was lucky and that luck led to his success).

But the hindsight-is-20/20 argument also assumes a different point, that we would in fact change our decision given the opportunity. There's an old joke: two guys are watching a boxing match at a bar, and they make a bet on the outcome. One of the boxers makes a big mistake and gets knocked out. The winner collects his money, but then has a pang of guilt and admits that the match was actually re-broadcast, and he had seen it already, knew the outcome, and bet on the boxer he already knew had won. The loser then admits to having seen the match before as well. When asked why he bet on the loser, he says that he didn't think he would make the same mistake twice.

If we had access to time travel (oh, if only we did . . .), would we change our behaviour? One of Kurt Vonnegut's novels, Timequake, centers on an incident in which everyone re-lives the previous ten years of their life. The catch is that we are passive observers reliving everything, and are unable to change anything. We have to watch ourselves make the same mistakes over again. Painful, huh? But if given free will, would we change anything, or hope that this time it works out?

And if you really believe in randomness, as I do, then you should continue to make the best decisions (rather than what you 'know' to be the right ones) in such a circumstance, because the outcome is still the result of a random mechanism. Because you change your action, the result may very well change. Look at Bill Murray in Groundhog Day - he changes a few things, and the fates of everyone in the town are also changed. Marty McFly meeting his parents negates his existence (but, luckily, he doesn't disappear immediately, and has time to entertain us with his attempts to restore his being). And swapping three lottery tickets for one may very well affect the outcome.

Okay, I'm going to stop now, because my brain is tired. Hopefully this kind of makes sense.

Thursday, September 23, 2010

They're Acting Like a Couple of Boobs

So . . . the breastfeeding propagandists are at it again (by the way, I love coming up with unexpected opening sentences. I defy anyone to have predicted that I would start a post this way). For the uninitiated, when you have a child you quickly learn that there is a large, well-organized, vociferous group of advocates for breastfeeding (of children). These medical professionals, mothers, and other interested parties are very eager to inform you of the vast superiority of breastfeeding vs. formula feeding (common examples of the positive benefits of breastfeeding: smarter children, happier children, healthier mothers, better bonding between mother and child, children who can fly, fewer incidences of gout. Okay, I made the last two up).

I am not entering the boob juice debate here (I was formula fed, and I turned out just . . . . well, I'll let you draw your own conclusions), just commenting on the latest lunacy by a group of people with too much time on their hands. In case you missed the earth-shattering news, Old Navy had to apologize to a group of breastmilk zealots yesterday because they sold a onesie with an apparently offensive message printed on it. The lactators and their supporters (the human kind, not the lycra-spandex kind) have organized a boycott of Old Navy as a result of their outrage.

Let's get back to reality, people. First of all, are there no worthier causes for which these people could be expending their effort? I'm not saying that breastfeeding isn't important, but it certainly isn't essential (in the sense that there is a safe substitute - of course, those with extreme enough views will dispute that as well). It's a shirt. That no one is forcing you to buy.

Apparently those blogging in favour of the boycott (you know those blogging types) claim that the onesie is a tool of the formula industry. That's it - you've cracked the code and found out all of Old Navy's secrets. It's all a plot to advance the interests of the formula makers. They are going to take over the world one onesie at a time. I mean, do people actually believe the crap they write? (don't worry, I do)

The extremism involved in breastfeeding advocacy is also troubling. It is often promoted as the only way to feed your baby. The reality is that there are, in fact, formula-fed babies, and what's wrong with a onesie for them? There are plenty of mothers out there who cannot or will not breastfeed, and they are made to feel inferior. I'd like to advance a few reasons as to why the debate is so heated:

1. Mothers who have decided to breastfeed are trying to combat cognitive dissonance (the holding of two conflicting ideas simultaneously). If formula is acceptable, breastfeeding is inferior, because it is more time-consuming, tethers the mother to the baby almost all the time, prevents a return to work, etc. Therefore, if I choose to breastfeed, it must be superior (and studies have shown it to be better, on average, than formula). The more I promote breastfeeding, the more I justify my own decision. This is the same type of behaviour as when new converts to a religion vigorously promote that religion, or when people starting a diet talk about how others should also get healthier.

2. The health-care field supports it because it is in their interest to do so. Lactation consultants, breastfeeding doctors (those doctors who study and help with breastfeeding, not the doctors who breastfeed), maternity nurses, and so on all receive attention and respect from new mothers' need to breastfeed. They are placed in a position of authority and given another field in which they can act as experts. I'm not saying that doctors seek out areas in which to be experts, but rather that people in general do that. Nor would I suggest that doctors would promote something unsafe for the enhancement of their ego. But giving up authority is a hard thing to do, and as I wrote about a few weeks ago, the medical establishment can be slow to change.

There will be extremists in any group, but as the saying goes, sometimes one side of a debate is right, sometimes the other side is right, but the extremists are never right. And if you are so pro-mommy-milk that you can't accept a cute shirt for kids (some of whom might be, gasp! formula-fed), then you've got bigger problems than this.

And hey, maybe some of those people boycotting Old Navy will instead shop at The Gap or Banana Republic.

Tuesday, September 21, 2010

If You Know the Cost of Everything, How Can You Know the Value of Nothing?

So . . . over the past few weeks various sports teams have announced that they will use demand-based pricing. This term simply refers to businesses allowing themselves the flexibility to change their pricing based on a number of factors. An example would be charging different prices based on who a team's visiting opponent is; it's likely that Raptors fans will pay more to see the Lakers or the Heat than they will the Hornets or the Timberwolves.
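
To make the mechanics concrete, here's a toy sketch in Python of what such a pricing rule might look like - the base price, opponents, and demand multipliers are all invented for illustration, not anyone's actual model:

# Toy demand-based ticket pricing: a base price scaled by a demand multiplier.
# All numbers are invented; real systems estimate demand from sales data.
BASE_PRICE = 60.0
DEMAND_MULTIPLIER = {
    "Lakers": 1.8,        # marquee opponent, high demand
    "Heat": 1.7,
    "Hornets": 0.9,       # low-demand opponent
    "Timberwolves": 0.8,
}

def ticket_price(opponent: str) -> float:
    # Unlisted opponents get the base price unchanged.
    return round(BASE_PRICE * DEMAND_MULTIPLIER.get(opponent, 1.0), 2)

print(ticket_price("Lakers"))        # 108.0
print(ticket_price("Timberwolves"))  # 48.0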

A more complex example would be the sort of pricing that airlines do. There are so many factors that affect price (time of day of flight, connections, time of year, day of week, other flights, to name just a few) that the average person would not be able to predict which flights would be more expensive, outside of a few rules of thumb. At the far extreme of variable pricing is stock market pricing or pricing for gasoline - the price floats relatively freely based on real or expected demand.

A lot of the research that I did early in my academic journey had to do with variable pricing, specifically research into how people would react to such pricing schemes. From a company's perspective, variable pricing is great. If you have a stock of something (e.g. concert tickets), being able to change your prices based on demand and supply works well, as you can get the maximum that the market will bear at any one time. But consumers may view things differently. We have become used to the notion that we may have paid more for our airplane seat than the person sitting next to us, but we aren't ready to transfer that to a lot of other contexts.

But the thing of it is that consumer decision-making tends to operate under the assumption that people are free to buy something or not; in other words, no one is holding a gun to our heads forcing us to buy. Therefore it is necessarily true that if we buy something, we ascribe to it at least as much value as its cost. If not, we wouldn't buy it. That value may not be tangible or measurable (outside of price), but it is no less real.

All of this makes it difficult to complain that we have been gouged by sellers. If you want to see a Leafs game, you still will have (somewhat) affordable options, but if you specifically want to see the Leafs play the Canadiens or Canucks, you'll probably have to pay more. And you'll only do that if seeing the game is worth the exorbitant price at the time you make the decision.

I'd go even further. Gas companies have us over a barrel - they could jack the price of gas up to $5 a litre tomorrow and would still have customers, at least in the short term. And we would willingly pay $250 to fill up our tank if avoiding the alternative (e.g. not getting to work, lack of mobility) was worth more to us than the money. Taking advantage? Sure. Gouging? Not so sure. Unfair? Definitely not - we're willing parties to the transaction.

When I was in my MBA program I did some tutoring on the side, and chose an unusual pricing scheme - I let my tutees (?) decide for themselves how much they wanted to pay once the tutoring was complete. I could have put a price on it, and might have done ok, but this way I did pretty well (averaging about what I would have charged anyway), and no one complained about the price, because they themselves chose it. Likewise, had I quoted a price and they paid it, they would have had as much basis to complain (i.e. none), but in that case I might have heard some grumbling.

Then again, forget what I said. By this definition, this blog has no value, because it's free.

Thursday, September 16, 2010

The Small Screen is a Bigger Canvas

So . . . I read an interesting column in yesterday's Globe, by TV writer John Doyle. He takes the position that television has overtaken movies in providing high-quality narratives, and I gotta say, I tend to agree with him.

Now don't get me wrong, I love movies and always have. I see a lot fewer of them now than I used to (combination of parenthood and no longer working for a film exhibitor), but give me the chance to go and I will. But I also watch a lot less TV than I used to (parenthood again), and I think that there is better stuff to be found on the small screen than the large.

Now, both media have taken great strides over the past few decades. For everyone who bemoans the lack of quality movies these days, I say to you that I think the average movie is better now than ever. The best movies of the past few years (in my opinion, these include The Prestige, Almost Famous, Inglourious Basterds, 40 Year Old Virgin, just to name a few, and I'm sure I'm leaving out some that I like even better than these) rank among the best of all time. There are no movies that surpass Singin' in the Rain or The Godfather, but there weren't any in most other decades, either.

Essentially what I'm saying is that the best movies of the past 10 years stand up to the best movies of any 10-year period, but at the same time the average movie has improved. There are still clunkers (Transformers), but they're no worse than the worst movies of times gone by (Computer Beach Party, Manos: The Hands of Fate).

Television, however, has made huge strides forward. There are shows that I used to love (Knight Rider, Frasier, L.A. Law), that were considered quality shows once upon a time (well, maybe not Knight Rider), that are now unwatchable. Thanks to the myriad "oldies" TV stations, we can see just how excruciating some of these shows were. The Cosby Show was an enormous hit; I was never a huge fan, but trying to watch it now is painful. Even Seinfeld seems stale.

And while current TV is not always stellar, it is of generally high quality. The sitcom is going through some extremely long death throes (the most popular and most critically acclaimed series, Two and a Half Men and Modern Family, are both horrible as far as I'm concerned); even so, we have The Office, and Curb Your Enthusiasm, and Glee (inconsistent, but still innovative, even if they did steal the concept from Cop Rock).

The TV drama seems to be where it's at. The Wire is the best thing I have ever played on my DVD player. Ever. Mad Men is outstanding. Lost, despite its flaws, was unlike anything ever seen on TV before. Even formulaic network shows (The Good Wife, Law & Order SVU) are really good formulaic network shows. I'm excited to see shows I haven't seen before that have been well-received (e.g. Dexter, Deadwood, Weeds) the way I used to be excited to see movies that were coming soon.

Maybe it's just that I'm watching more TV than I'm seeing movies. Maybe it's because I'm no longer living in the downtown of a city, where innovative films are easier to find (though I never liked overly artsy stuff). Maybe I'm just older and would like the commercial Hollywood crap more if I were ten years younger. And maybe it's because the biggest televisions are rivalling the smaller movie theaters in terms of screen size.

But maybe, just maybe, TV is better.

Tuesday, September 14, 2010

Uncertain Odds on my Ambiguous Ignorance

So . . . as I've been beating you over the head with so far in this blog, we are a predicting, gambling, prognosticating species. We are compelled to try to determine what will happen in the future (and we are bad at it). Today I want to write about different types of future randomness, those of uncertainty, ambiguity, and ignorance (or to use dated political analogies, Paul Martin, John Kerry, and George W. Bush).

Let's start with uncertainty. An uncertain situation is one in which we know the potential outcomes and their probabilities, but we don't know what will happen. Think of a card or dice game; using math you can determine the odds of every possible outcome. If you roll two dice, there is a one in 36 chance of getting a total of two. Uncertainty is extremely useful in this regard (but not in much else - more on that later).
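
(If you don't trust my arithmetic, this is exactly the kind of thing you can verify by brute force. A quick Python sketch:)

# Enumerate all 36 equally likely rolls of two dice and count each total.
from collections import Counter
from itertools import product

totals = Counter(a + b for a, b in product(range(1, 7), repeat=2))
print(totals[2], "/ 36")  # 1 / 36 chance of a total of two
print(totals[7], "/ 36")  # 6 / 36 - seven is the most likely total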

Ambiguity is just as it sounds - you are ambiguous about the probabilities. You have some knowledge of the outcomes and probabilities, but not complete knowledge. One example would be a sporting event. Before a hockey game (assuming you know a little about hockey) you have some idea about who is more likely to win, but you could not put specific odds on it, no matter how sophisticated your math. Bookies try to, but their odds are based on past performance, thorough knowledge, and intuition. It would be impossible to say with certainty that one team has a 57% chance of winning, partially because it would be impossible to assess whether that was correct unless the same game was played the same way multiple times.

Ignorance describes a situation in which you don't know the outcomes or the probabilities. Imagine trying to guess the outcome of a hockey game if you had never heard of hockey before. You might expect that there would be a winner, but not even that is certain (ties, overtime losses, etc).

Our natural tendency as humans is to select situations where there is uncertainty as opposed to ignorance. We don't like to feel dumb. Ideally, we'd like certainty, but no one has certainty in their life (and trying to find it leads to other social problems, like being certain your religion is correct or certain that Mike Myers would have a long and illustrious film career).

The problem with all of this is that we treat most situations as though they only contained uncertainty, whereas in most real-world situations ignorance rules (and I'm just talking about future randomness here, not even what is, or isn't, in people's heads, though that statement could still be correct). According to Nassim Taleb (author of The Black Swan - what, you haven't read it yet? Go now!), the financial meltdown happened because a lot of people were mistaking ignorance for uncertainty. Part of ignorance is blindness to possibilities; many of the financial models used ignored low-probability, high-impact situations.

In other words, we treat situations in our day-to-day lives as though they were a card game. If I go to a job interview, I look at it as though I either get the job or not. But I may perform so poorly that my reputation is sullied and word gets around the industry that I am not to be touched. I may perform so well that I get a better job than the one I applied for. Both are of very low probability, but possible. It is impossible to think of all the possible outcomes (go on, try - I bet you didn't think of the one in which the interviewer is a carnivorous alien who eats you - ok, maybe that one isn't possible). Yet we behave as though we know them.
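
To see how badly the card-game mindset can fail, here's a minimal simulation sketch. It assumes, purely for illustration, a "model" that treats outcomes as normally distributed while the real process has fat tails, with a Student's t distribution standing in for the fat-tailed reality:

import math
import random

random.seed(1)

def fat_tailed_sample() -> float:
    # Student's t with 2 degrees of freedom: Z / sqrt(chi2 / 2),
    # where a chi-squared with 2 df is an exponential with mean 2.
    z = random.gauss(0, 1)
    chi2 = random.expovariate(0.5)
    return z / math.sqrt(chi2 / 2)

trials = 100_000
model_worst = min(random.gauss(0, 1) for _ in range(trials))
real_worst = min(fat_tailed_sample() for _ in range(trials))

print(round(model_worst, 1))  # around -4: the worst case the normal "model" ever shows
print(round(real_worst, 1))   # typically far more extreme - the low-probability,
                              # high-impact tail the model never saw coming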

Next time you think you have a handle on future outcomes, think of Lindsay Lohan. No one knows what she's going to do next. Anything is within the realm of possibility. Now apply that to your own life. Scared yet?

Friday, September 10, 2010

Meet Your Professor, Dr. Chuckles

So . . . here's my last installment of my back-to-school week blogs. Today's posting has to do with the notion of professor as entertainer (hence "chuckles"; I could have used Dr. Giggles, but that would have conjured notions of a bad horror movie starring the mentally-challenged guy from L.A. Law as a psycho doctor). Steve Martin claimed in his book Born Standing Up that if he hadn't made it as a stand-up comic, he would have been a college professor, because then he would still be performing before a room every day. I don't necessarily think this sentiment is rare.

According to a recent survey of over 10,000 Ontario university students, what students want is a motivating, enthusiastic, entertaining professor. The OUSA survey reports that these qualities are desired by 74.6% of students, which is only eclipsed by "delivers interesting, well-prepared and organized lectures" at 83.7%. The next-most important factor, the ability to communicate in multiple ways, is quite a bit less important (52.4% of respondents say it is necessary). From there, the more nuts-and-bolts elements of the course are listed, and none is chosen by more than 25% of students (things like outlining expectations, availability of the professor to meet, etc.).

So what common thread holds the three most important factors together? Communication. Style. Delivery. Not content. Which is a major disconnect with what professors consider important. Many profs believe that content is king, and that a) the students are there to learn the content, and so the onus is on them to pay attention and b) they have a responsibility to deliver the content, and there is no time for anything but.

The first point, that students should just pay attention, is fine in theory but wholly unrealistic. Could you sit and listen to a content-heavy lecture for 90 minutes, even on a topic you have an interest in? What many profs forget is that the topic is not nearly as interesting to the students as it is to them. Most students will begin by paying attention, and at least try, but after 10-15 minutes of droning it's hard to focus. Being enthusiastic and engaging helps combat this.

As far as content delivery goes, we profs do have a responsibility. But too often, "delivery" is considered to be the transmission of information, without consideration for receipt. Delivery is only complete when the information is received, not when it is sent. If FedEx regarded delivery in this way, they would be out of business.

Which raises the question of how to be engaging. Oh, how many of my profs from undergrad could have benefited from a basic teaching or communications course. First and foremost, consider the perspective of the students and what will interest them (in terms of style - I'm not advocating only teaching the most interesting content). Use humour, or examples, or a problem-solution approach. Make the information relevant. Give over some class time to practical problem-solving.

Humour works well, but I know that not everyone is as naturally funny as I. It is also potentially dangerous, because you don't want to be seen as a clown (as in this clip - fast forward to 3:19). The key factor is to be enthusiastic, i.e. behave as though you actually want to be there. Instead of counting on the information being interesting (because it is to you), consider what will make it interesting for the students. The book Made To Stick is a great read and has excellent points to make on this topic.

Anyway, now we're back to school, and next week I'll get back to my usual topics. And only 11 weeks till the end of semester!

Thursday, September 9, 2010

Getting Into a Group Scene

So . . . continuing with my "back to school" theme (the time of year, not the movie, though that Triple Lindy was sweet), today I'm talking about group work. I include group work in some of my courses, and typically students dislike it. It also presents many headaches for me too. So why do it?

Besides the fact that I'm a sadistic jerk, I include it because most of the work that anyone does in any job includes interactions with others. Outside of lighthouse keeper or Unabomber, there aren't many jobs where you never have to work in a group situation. Sure, relying on others is a pain in the ass, but you can't do everything yourself (even I know that, and apparently I have a superman complex - though I don't get the same reaction when I wear colorful spandex). Students should learn early that they will be disappointed by their colleagues, that other people will do as little work as possible, and that coming together to reach a common goal is a painful process.

This year I'm instituting some changes in how I manage my course group work. First of all, in at least one of my courses I'm going to assign the groups myself, rather than letting the students pick them. There are advantages to letting the students pick their own groups, but two big problems. First, that's not how it works in the real world. Second, there are invariably one or two groups that are composed of those who could not form groups on their own, and they could be at a disadvantage because there is then the need for members to get to know one another, an issue that would not be present in other groups.

The second change I'm instituting is that each group must meet with me a minimum of once during the term (prior to the assignment due date). This will not only allow the students to ask questions and get guidance from me (I may not know everything, but I will be the person grading the assignments, after all), but also allow me to observe group dynamics. Too often I have only heard about intra-group problems on the day the assignment is turned in, and that is too late for me to do anything about it (even Superman couldn't go back in time; I don't count the spinning-round-the-world thing in the first movie - it was a cop-out). Not that I especially want to play cop to a misbehaving group, but I also don't want five people's grades to be bad because of one disruptive member.

Which brings me to the last point about group work - assigning grades. The first time I taught a course that included a group component, I asked for peer evaluation of group members. Never again. Led to many issues, vendettas, and at least one threatening e-mail. It seems that every group agreed amongst themselves to evaluate everyone the same, and then everyone did just the opposite. I had students complaining that other group members got higher grades than them, even though they all evaluated each group member the same (which they hadn't). I had one group in which four members downplayed the contributions of the hardest-working member, because they were all friends with each other and she wasn't. Too much pettiness.

My typical way of dealing with this is to make clear at the outset of the course that all group members will receive the same grade, regardless of anything that goes on within the group. Therefore, students should choose group members carefully. However, this method is far from perfect (especially if you are imposing groups on students). It assumes all students have the same goal (to do well) and are fair-minded (hah! why should students be any different than the rest of us?). But it is the most egalitarian. Furthermore, it should provide an incentive for those who want higher marks to manage the groups and light a fire under the butts of the lazier members. Or at least do more work themselves and cover for the coasters.

And if that isn't like working in a group in the real world, I don't know what is.

Tuesday, September 7, 2010

Let's Have a Discussion about Lectures

So . . . this is the first week back to school and university, and I thought I would dedicate this week's posts to education, particularly the higher kind. As someone with a vested interest in the topic I am keen to learn the opinions of my reader(s). Today I am going to discuss something that I feel very strongly about, which is how information and knowledge are passed to students.

In yesterday's Globe and Mail, Julia Christensen Hughes, the dean of the faculty of business at the University of Guelph, basically expressed that lectures do not work and that teaching at universities must be more interactive (in the classical sense, not the electronic sense). Simply reading/reciting/giving lectures to (or at) students is ineffective, she says, and you must pause and ask questions of the students (gasp!) or solicit questions from them (double gasp!). Predictably, several university professors wrote letters to the editor disagreeing with her.

I, however, happen to agree with her points. From the time I was a university student myself I hated lectures, and I think I hate giving them more than I hated hearing them. We no longer live in an age where all of the knowledge on a particular topic resides in people's heads - we have books, websites, etc. If you want to know some information, as Hughes says, you google it. I remember sitting in lectures in undergrad and thinking that I would just prefer if the professor would print out the lecture and give it to us, so that we could read it on our own time. Really, does hearing a professor talk add value over the written word? It can, and with some profs it does, but in many cases the answer is no.

Where value can be added is in teaching students to think critically, make decisions, and express opinions. In other words, participate in a discussion. This seems to be more the norm in business schools than other faculties (disclaimer: I am not saying that all business schools do this and all other faculties do not). In my courses, the learning goals are for students to develop their own way to face business situations. I assign textbook readings, but in class the discussion of the text focuses on clarifying what is not well understood, challenging conventional wisdom, and applying the knowledge. To simply repeat the text without adding value (as was the case in many an undergrad lecture I attended) is pointless.

Ultimately, the students in university are going to need to get jobs (well, most of them) and these jobs will entail the performance of tasks and the making of decisions (well, most of them - not academia). Lectures are not the best training method for these end goals. Lecture with discussion is a different story.

Leading the class in discussion requires different skills, knowledge, and experience than do lectures. I am not saying that a professor who can do one or the other is necessarily smarter or more knowledgeable. I for one could not deliver 1-3 hour lectures without wanting to hang myself (I don't like to hear myself talk too much . . . no, really I don't . . . writing this blog, that's another matter) and I know I would not be good at it. And I can understand why university lecturers would push back against this idea, as it would lead to greater uncertainty as well as the need to work in order to acquire new skills.

I certainly wasn't very good at leading class discussions when I started, but I think I have improved with time: practice makes perfect (no, I don't think I'm perfect, it's an expression). Which is exactly the point - I'm advocating giving the students the opportunity to practice expressing their opinions and analytical skills in a safe environment, before they have to go out and do so in the real world.

Wednesday, September 1, 2010

Putting Dr. Zamboni on Ice

So . . . a doctor named Paolo Zamboni has been trying for some time to get people to think differently about multiple sclerosis. MS has long been perceived to be an auto-immune disorder that causes degeneration of the nerve endings, which in turn causes a wide variety of symptoms, and eventually physical and mental disability. Dr. Zamboni wants to begin trials in Canada and lots of MS patients want to undergo the surgery, but the Canadian government has decided not to fund the trials.

Ok, that was kind of dry and probably not what you have come to expect from my blog entries. But it does relate - because it may come down to the same type of errors in thinking that I'm so fond of writing about. I hedge my claim because I am not a medical doctor, and cannot assess the merit of the medical claims of Dr. Zamboni; I can only speak to the decision-making biases that surround his ideas.

Essentially what is going on is that someone is trying to help people with MS, and the way in which he is trying to do it is different from the accepted conceptualization of the disease. Dr. Zamboni is claiming that MS is caused by iron build-up in the brain due to vein blockages, and is therefore treatable with angioplasty (surgery in which veins are opened up). Data regarding this new procedure is mixed, but apparently there is evidence that it can work.

This all reminds me of the story of Helicobacter pylori. Once upon a time, doctors all knew that peptic ulcers were caused by stress, diet, or blood type. Two Australian doctors, Barry Marshall and Robin Warren, came up with a crazy idea - ulcers were caused by bacteria. All the other doctors laughed at them; silly people, thinking bacteria could cause ulcers. To prove them wrong, Barry Marshall drank a beaker full of H. pylori bacteria (sort of like Sam Beckett proving that his theories of time travel were correct on Quantum Leap). Lo and behold, Marshall developed ulcers, which he then treated with antibiotics. Despite this (and other) proof, the medical establishment took a long time to accept this new idea regarding ulcers. And then they did, and everyone lived happily ever after, especially Drs. Marshall and Warren, who shared a Nobel Prize.

Is Dr. Zamboni the next Marshall and Warren? I don't know, and the point is that no one does (he does share one similarity - while he didn't do the procedure on himself, he did do it on his wife, to apparent success). He has a method that has worked. There is risk involved (as with any procedure), but as long as the people willing to undergo the trials are aware of the risks, they should be allowed to volunteer themselves. The sense around the government's decision is that it was driven less by budget and more by dogma - MS is an autoimmune disorder, so why look for other causes? Prominent doctors are applauding the no-fund decision.

(I also think that if Dr. Zamboni had a different name, people might be more accepting of his ideas, especially in our hockey-mad country. Unfortunate but probably true)

The point is that it can be difficult to accept new ways of looking at old problems. Evolutionarily, this makes sense; the existing system works ok, and new ideas can be dangerous, so let's be biased against new ideas. The problem with this is that a few new ideas are improvements, and impediments to their acceptance are harmful in the long term. The costs here are not huge, and no one is being coerced into anything. Just pony up the dough and let's see if we can help people.

Monday, August 30, 2010

The Will to Win

So . . . it is a time of anticipation in the sports world. The baseball playoffs are around the corner. The NFL season is beginning. Hockey and basketball, while still a bit away, are about to get going again. As such, it's prediction season! Who will win this year? Lucky for you, I have the answer; the knowledge of who will be the champion in every sport. Just examine the contenders closely. Don't bother looking at talent, or skill, or experience. None of those matter. According to most sportswriters and commentators, it usually comes down to only one thing.

Just find the team that wants it more.

So, amongst the division leaders in baseball, who wants to win the World Series the most? In the NFL, which team is hungriest for a Superbowl victory? Because if wanting it more truly leads to success on the field (or ice, or court), then it should be easy to suss out the eventual champion.

Funny thing about that. It's usually very hard to tell at this point who wants it more. I think that overall, it is usually much easier to determine who wants it more after the champion wins than to figure that out before the game is played (or while the game is being played). As such, "wanting it more" is not an antecedent to a win, but rather a post-hoc story that is supplied, a retrospective explanation.

I know what you're thinking - you've watched games where you could tell who was trying harder, who was more driven, who wanted it more. Two things about that. First, is it not possible you were entangling success with drive? Does a basketball player miss a shot because he didn't want it to go in? If wanting it more leads to success, why don't all of a very driven player's shots go in? Second, it is sometimes true that a team or a player is complacent (or hungover, or has just signed a long-term lucrative contract). This may seem like semantics, but while wanting it more doesn't actually lead to a win, wanting it less certainly can. I would argue, however, that wanting it less is rare - reaching the professional leagues of any sport requires a whole lot of dedication, drive, and thirst for victory, and most people don't shrug that off once they make it.

Wanting it more also often gets mixed up with aggression. In many sports aggression can be successful (e.g. hockey teams down one goal nearing the end of the third period). But aggression is also a risky strategy. What ends up happening is that a confirmation bias gets introduced; when aggression is successful, that team wanted it more. When aggression is unsuccessful, then the losing team got reckless, or lost their head, or resorted to thug-like play.

I have suggested this idea to several other people and I tend to receive a negative reaction. It is part of the mythology of sport that inner strength and drive lead to success; after all, if they don't, then it's just talent and luck. Talent can be measured and often quantified, and luck is random and usually an unsatisfying explanation. But heart, heart is something that anyone can have. Rudy had heart.

The other thing I'm asked is if I have competed in sport, because then I would understand "wanting it more." I'm no athlete (no, please, it's nice of you to say, but I know my limitations), but when I do play basketball, or soccer, or whatever, I really, really want to do well. And I often don't. I get "in the zone" and it doesn't improve my shot. In other words, I have firsthand experience with the idea that wanting it more doesn't matter.

And if you need further proof, consider this example. If I were to play one-on-one with LeBron James, I guarantee you I would want to win more. The win would mean so much more to me than to him. And there is absolutely no way you could convince anyone that I would win.

Thursday, August 26, 2010

You're So Vain, You Probably Think This Blog Is About You

So . . . since I've started writing this blog I have, on a few occasions, been accused of writing it with certain people in mind (I haven't). A few friends and family members have asked if my comments or central point was directed right at them (it wasn't). In fact, this is probably the first time in my blog where I am referring to some people in particular (but definitely not you, so don't get paranoid).

Today's post is kind of a follow-up to yesterday's entry, which dealt with our need to come to half-assed conclusions about the causes of events or the motivations of others. I'd like to expand the topic by talking about how ego fits into all of this.

When we make attributions for actions, we are subject to what is called the egocentric bias; if the event is positive (we won the game!), we attribute the success to ourselves (because I'm awesome!). If the event is negative (we lost the game) we tend to attribute the failure to others (because the other team cheated and the refs were biased). Watch any sporting event on a local station and you'll see this in spades: the announcers will treat any favorable outcome as the result of the home team's incredible god-like talent, and any negative event as bad luck or an error on someone else's part.

But there is also another manifestation of the egocentric bias. We also tend to believe that we are at the centre of everyone's universe. So if we get cut off in traffic, the other driver wanted to cut us off specifically (ignoring the fact that in all likelihood the other driver either didn't notice us or didn't give us a millisecond's thought). If a teenage girl goes to a Justin Bieber concert, naturally he made eye contact with her in particular and will go to sleep dreaming of her. And, from my own life, when my wife goes out and leaves me with the baby, the baby chooses to take that opportunity to wake, cry, and prevent me from writing this blog.

Several years back there was a research study in which university students were tasked with buying condoms. These condoms were to be purchased from a machine in a seldom-used bathroom down a seldom-used hallway. Even so (and despite the fact that precautions were taken to make sure the purchaser was alone), participants in the study reported that they were watched (or scrutinized) by others (there were no others!). Try it out - go to your local bookstore, browse the how-to sex books or the erotica section, and see if you can shake the feeling that you're being watched. The reality is that you're probably not.

Why? Because other people don't care about you. They're far too consumed with feeling like they themselves are the centre of the universe. Think about it. When you go to the bookstore, do you pay attention to what strangers are looking at? (Well, maybe if an attractive stranger is browsing the erotica section . . .) No - you're looking for your books, cool books, books that will impress all those people who really aren't paying attention to you.

The next time you feel like the victim of a slight or an insult, really make sure it's intended, because it may not be. It isn't necessarily malice, just lack of consideration in its most literal sense. Remember also that the Carly Simon song I use in the title of this post was believed by several famous men (Mick Jagger, Warren Beatty, James Taylor) to be about them (despite two of them singing backup on the song), when in fact it was about none of them (Simon recently admitted that it was about David Geffen).

And that's right, this blog was directed at you. No, not you, you there.

Wednesday, August 25, 2010

Cause for Concern

So . . . I was reading a column the other day by one of my favorite writers, Bill Simmons (this is the second consecutive blog in which I go after him for something, but I actually really enjoy his stuff), and the opening paragraph made me gasp in horror (figuratively). Here's what he wrote:

"I hate hearing the phrase "There's no answer." I can't accept it. Everything within reason should have an answer. And if you can't come up with one? Come up with a theory."

Guess what? There isn't always an answer. And coming up with a theory is often a bad idea, because we tend to draw on the obvious and apparent, and ignore that there may be causes that are hidden and unknown - unknown even to the actor.

Whether everything within reason has an answer, knowable or unknowable, is a philosophical question. That we cannot and do not always know the answer, or have the capacity to discover it, is not a philosophical question; it is a fact. When we ascribe causes or blame, bad things usually happen (world wars, genocides, Ann Coulter books). Yet we still persist in doing this (yes, it persists even though I've already written a similar blog on the topic - I guess no one followed my advice).

But this time I'm not writing about placing blame vs. taking action, but rather our propensity for searching out causes (noble) and coming to conclusions (bad idea); in other words, our naive scientism. A scientist methodically theorizes and tests various causes. What we do, instead, is the theorizing without the testing. And we seem to believe that knowledge of underlying causes is within our grasp, which it isn't.

Equally troublesome is our tendency to take causes and see effects that may not be there. Howard Stern (whom I enjoy) does this all the time - radio or Wi-Fi waves must be bad for you because they are floating around. How can they not be bad for you? This type of thinking caused a good portion of a whole generation to avoid vaccines. Putting part of a virus or disease into you? Must have an ill effect. I know, let's say it causes autism. Doesn't matter that the data doesn't support it; there is a cause, so there must be an effect.

In today's paper I read about a group of churchgoers who were proselytizing in a Toronto neighbourhood. Apparently an altercation ensued because some of the neighbourhood people were offended at the recruiting, believing that the religion-pushing was directed specifically at a homosexual couple who lived nearby. Now, it is certainly possible that this was the motivation of the parishioners; however, it is equally likely that it was not. The altercation could have been caused by the neighbours deciding on their own that this must have been the parishioners' motivation, due to their own biases about people who would proselytize Christianity. In other words, they ascribed a cause to others' behaviour with incomplete information. Maybe the church-folk were only trying to spread the word about Jesus to people who didn't really want to hear it.

I do this as much as anyone (ascribing causes, not proselytizing - I'm not a character in a Tim LaHaye book) - when I get bad reviews back about my research, my initial reaction is that the reviewer didn't understand the topic or the paper, or that they didn't read it thoroughly enough. Of course, that same reasoning could apply to me (I didn't communicate effectively). The real reasons for the bad review could have nothing to do with either (maybe a personal vendetta?). Anyway, keep this in mind when you comment on my blog - I'll be in the background checking and ascribing causes to your reactions.

Thursday, August 19, 2010

The Narrative Fallacy of Bill Belichick

So . . . we like stories. Stories populate our lives and help us make sense of the world around us. As part of our natural inclination to ascribe causality to disparate events, we like to see events linked in stories. After all, if there is no narrative to drive events, then it’s all just random noise (not Lou Reed-like Metal Machine Music random noise, but just random events), and we can’t have that, can we?

Now, a while ago, back when I was blogging at my daily/breakneck pace, my cousin Rob mentioned the Bill Belichick 4th-and-two as a good example of the paradox of outcomes (which it is). I would like to use the same situation as an example of the narrative fallacy, which involves the power of stories. Because of our propensity to invent stories, we are also unduly affected by them – we remember events better in narrative form, they have a greater impact, and we often view them as instructional. That’s fine for Aesop and Joe Eszterhas, but reality often doesn’t have a moral. In invented stories (books, movies, folk-rock) there is a grand design, a plan, a beginning/middle/end; in reality, there is no grand design and therefore no cohesive tale.

Background: in a meaningful regular-season football game last November, Bill Belichick (head coach of the New England Patriots) made the somewhat rare move of going for a first down on fourth down with two yards to go. Because New England led by only 6 points, they needed to run down the clock so that Indianapolis wouldn’t have another chance to score. The risk was huge – if they failed to get the first down, Indianapolis would get the ball in excellent field position, but if they succeeded, New England would almost certainly win the game. Conventional wisdom dictated that New England punt and leave it to their defence (and Indy’s subsequently poor field position) to hold the lead.

Belichick didn’t follow the conventional wisdom. He went for the first down and failed; Indianapolis then scored and won the game. It makes an interesting probability example (most analyses had the move leading to victory 55% of the time, whether they got the first down or not) and illustrates the paradox of outcomes nicely. It also led to some boneheaded analysis (Bill Simmons claims that one reason it was dumb to go for it was that Indianapolis had already scored twice in that quarter, and it hardly ever happens that a team scores three times; I guess conditional probabilities don’t apply for some reason).
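
(For the curious, here’s the shape of that kind of analysis, sketched in a few lines of Python. Every number is a placeholder I invented to land near that 55% figure – the point is the structure of the calculation, weighing each outcome by how likely it is, not the inputs.)

    # Back-of-the-envelope structure of a "go for it" analysis.
    # Every number below is an invented placeholder, not the actual 2009 figures.
    p_convert = 0.60          # chance of gaining the two yards
    p_win_if_convert = 0.80   # win probability if they convert
    p_win_if_fail = 0.17      # win probability if Indy takes over in great field position
    p_win_if_punt = 0.50      # win probability if they punt instead

    # Weight each outcome by its chance of happening (law of total probability).
    p_go = p_convert * p_win_if_convert + (1 - p_convert) * p_win_if_fail
    print(f"go for it: {p_go:.2f}, punt: {p_win_if_punt:.2f}")

Fiddle with the placeholders and the recommendation flips – which is exactly why these analyses get argued about.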

But what it really illustrates is the narrative fallacy. Belichick is a well-regarded coach and considered one of the game’s thinkers. If New England had been successful, it would have been just more evidence of the same – not an interesting story. If Belichick had not gone for the first down, then win or lose it would have just been status quo (and the story would have been Indy’s dominance more than New England’s gamble). Not an interesting story either.

But what was provided instead was a classic tale of hubris and knocking someone down to size. Belichick had over-thought; he had bucked the trend and the gods punished him. It is a morality tale: hero (or anti-hero) is too successful/arrogant/invulnerable, so he does himself in with a poor strategic decision. And that’s the story that gets reported and remembered.

And that’s a fallacy. Because it’s not a narrative, it’s just stuff that happens. It’s not informative, because he took a roughly 50/50 proposition and lost – it’s just as likely the opposite could have happened. It does not overshadow his success, but it may be remembered longer (because it stands in contrast).

If he finds himself in a similar situation this year, do you think Belichick will go for it? Difficult to say - to punt would be seen as an admission of error, but to go for it again (and potentially fail again) would certainly provide another tale.

Monday, August 16, 2010

The World Last Week

So . . . I decided today to just briefly comment (me, briefly comment? Ha!) on a few news items from last week's paper. I'll get back to my screeds and didacticism in my next post.

From the "Republicans Can't Do Math" department: Republican Senator Mitch McConnell said that "There’s no evidence whatsoever that the Bush tax cuts actually diminished revenue." Forgive me if I'm wrong, I'm not mathemagician, but don't tax cuts, by their very nature diminish revenue? I understand the fundamentals behind trickle-down economics, but if I let a person keep $100, and the tax rate is 10%, that money would have to be spent ten times over to make back the money I originally let go. There are intelligent arguments behind tax cuts, and this ain't one of them.

From the "While We're Mocking Republican Senators" department: Senator Ted Stevens of Alaska died last week. And while many remember him for corruption, pork-barrel politics, and generally being nasty, I choose to remember him for this quote:

"And again, the Internet is not something that you just dump something on. It's not a big truck. It's a series of tubes. And if you don't understand, those tubes can be filled and if they are filled, when you put your message in, it gets in line and it's going to be delayed by anyone that puts into that tube enormous amounts of material, enormous amounts of material."

What's odd is that I didn't even find out about his death via the series of tubes, but rather in a newspaper.

From the "Really? I'm Remembered For That?" department: Shirley Thompson, former director of the National Gallery of Canada, also died this past week. She is best known for paying $1.8 million for Barnett Newman's Voice of Fire, a painting that consists of one red stripe flanked by two blue stripes. I actually went to the National Gallery while it was there, and it was explained to me that if you look at the border between the red and the blue, you see what looks like fire. The optical illusion worked, but I was left wondering why they needed to pay so much for it - a trip to Home Depot and some canvas would have done fine.

On the other hand, I'm no expert in art. But I am well-versed in marketing, and I don't think the gallery would have gotten many visitors if they hadn't paid such an ostentatious sum for it.

From the "Not Getting to the Root of the Problem" department: The town of Shitterton, in England, unveiled a 1.5 tonne rock with the town name carved into it, to serve as the town's welcome sign. The problem was that conventional signs would often get stolen, because they say "Shitterton" on them. In my opinion, a better solution would have been to change the name of the town. Maybe the sign won't get stolen now, but the citizens still live in a town called Shitterton.

And finally, from the "Best Story of the Week" department: A 75-year-old man in Massachusetts was found to have a pea plant growing in his lung. The plant was discovered when his doctors ordered a biopsy of a dark mass in his lung, fearing cancer. It seems the man had accidentally inhaled a pea and it had sprouted. Apparently a similar thing had happened in the past to a man in Russia, who had a fir tree growing in his lung (well, the sprout of a fir tree).

And the pea-man's first meal after the operation? You guessed it, peas.

Saturday, August 14, 2010

Writing Late, Tempting Fate

So . . . yesterday was Friday the 13th, and though I'm a few hours late I thought I would talk about superstitions. As you might have guessed, I'm against them. I know that there are people out there who have specific instructional anecdotes illustrating the value of a particular superstition, but I don't buy it. It seems to me to just be a combination of confirmation bias and narrative fallacy.

(Confirmation bias is the overweighting of supportive evidence; for example, if you believe that Yahoo Serious is a comic genius, you cite the success of Young Einstein and ignore the subsequent failures, or worse yet, claim they are the exceptions that prove the rule. Narrative fallacy is our general bias towards stories and using narrative to make sense of disparate events, thus making them more impactful and memorable. So if you get fired, then get drunk, and then find yourself laughing at a Yahoo Serious movie on at 4 A.M., you may believe the events are related and that a higher power had guided you to comedy gold.)

If you believe in a superstition (e.g. black cat crossing your path equals bad luck, don't walk under a ladder, don't light three cigarettes on one match, etc.), then any time that the superstition is "supported," you have both evidence and a story to tell. Any time it is not supported, you tend not to notice, remember, or accept it.

I can see the logical basis for some superstitions. You shouldn't walk under a ladder because crap can fall on you. You shouldn't light three cigarettes on a match because it raises the probability that you'll get burned. But avoiding the 13th floor? Carrying a lucky charm? Come on.

One of the things that drives my wife nuts is when I'll make a statement like "Wow, the baby is sleeping really well these past few nights," because she believes I am tempting fate and that our good fortune will be reversed. Think about what would be required for that to be a causal chain. Our newborn baby would have to hear and understand what I am saying, and then make the conscious decision to get upset. Highly unlikely. More likely is that if we've had a run of good luck, it will eventually end (due to random fluctuations and variance), and that we recall the times when "tempting fate" led to negative outcomes and ignore the times it didn't.
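
(Don't believe me? Here's a toy simulation - the 70% is invented - of how often a run of three good nights ends in a bad one all on its own, with nobody saying a word to jinx it:)

    import random

    random.seed(13)
    P_GOOD = 0.7   # invented probability that the baby sleeps well on a given night

    nights = [random.random() < P_GOOD for _ in range(365)]

    # Count how often a streak of three good nights is followed by a bad one,
    # i.e. how often "fate" punishes us whether or not anybody tempted it.
    jinxes = sum(1 for i in range(3, 365) if all(nights[i-3:i]) and not nights[i])
    print(jinxes)

Run it a few times and the bad nights keep arriving on schedule, speech or no speech.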

I've done some research on the topic of tempting fate as well. One of the studies involved priming people with either neutral words or words related to superstition (priming was done by having people do a word completion task, and the words that needed completion either had to do with superstition or not). Just reading and thinking about words that have to do with superstition (e.g. charm, fate, thirteen) caused people to avoid tempting fate (in this case, they said they would buy insurance) much more often than those with the neutral words (e.g. chart, date, twelve). Funnily enough, even though the superstition prime caused people to behave differently, when people were asked directly about their beliefs, the prime had no effect. In other words, people believed that they were not superstitious, but behaved superstitiously.

Anyway, I'm off to walk behind a black cat under a ladder on the thirteenth floor of a building while lighting three cigarettes on one match. While uninsured.

Thursday, August 12, 2010

Using the Wrong Map To Find My Way

So . . . like pretty much anyone else, I've done personal budgets in the past. After all, we have to make sure we're keeping tabs on our spending and not buying things we can't afford (yeah, right, like no one does that). But the funny thing about budgets is that they're always wrong, if only because we can't predict the future. I've gotten better at making budgets by avoiding the planning fallacy (i.e. don't allocate every dollar of your income, because there will always be unexpected events) and leaving a generous (seemingly too big) buffer. But whether we overestimate or underestimate, our budgets and predictions will be wrong.
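
(For illustration, here's the buffer idea in a few lines; the categories and amounts are made up:)

    # A minimal sketch of budgeting with a generous buffer; numbers are invented.
    income = 4000.00
    buffer = income * 0.20   # a deliberately big cushion for the unpredictable

    planned = {"rent": 1400, "groceries": 600, "transit": 150, "fun": 300}
    spent = sum(planned.values())

    print(f"planned: ${spent}, ceiling: ${income - buffer:.0f}, buffer: ${buffer:.0f}")
    # planned: $2450, ceiling: $3200, buffer: $800 left for the surprises

The budget will still be wrong; the buffer just means being wrong doesn't hurt as much.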

One recent example that caught my eye was that there is a proposed rail link between downtown Toronto and Pearson airport, and apparently the project is going forward. The lease for the lands is 46 years long. Besides wondering why they specifically chose that number, I had to wonder whether we would actually still be using airports 46 years from now (probably, but you never know . . .). My point is that we can't predict two weeks into the future - I am almost positive that something will have changed such that this rail link will no longer be (as) useful in 2056.

But I digress. In the book The Black Swan, Nassim Nicholas Taleb (who is an awesome thinker and writer, by the way, despite my occasional disagreement with his ideas, like here) takes issue with the notion that even flawed predictions are better than no predictions. Because a central tenet of his ideology is that we can't predict, he is often accused of being a nihilist and of advocating no predictive tools at all. One response he has is that it would be a bad idea for someone to use the wrong map (e.g. using a map of the Pyrenees to navigate the Alps, or a pilot using a map of LAX to find a gate at JFK), and it would be better to use no map at all in such instances.

I disagree (with a caveat). It would be beyond foolish to expect that the wrong map would accurately tell you where to go. But what the wrong map can give you is general information that may be helpful in finding your way. A map of LAX would at least give some information about the general design of airports and terminals, and their associated component parts. A map of the Pyrenees would inform you of mountainous terrain in general. But if we are going to use the wrong maps, we need to be aware of that fact and incorporate it into our thinking. Use the limited tool for limited purposes.

The problem occurs when people use flawed predictive tools as the decision-maker, rather than as information helpful in making decisions. Let's say you play the stock market, and have a computer program that predicts future prices. If you take the output of that program and blindly follow it, you are using the wrong map in the wrong way. If you take that output and combine it with other information and your own judgment, you are using the wrong map in a potentially helpful way.
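
(Here's a toy version of that distinction; the forecast, the weight, and the threshold are all invented for the example:)

    # Toy illustration of the wrong map as information, not decision-maker.
    model_forecast = 0.62   # the program's predicted chance the stock rises
    own_judgment = 0.45     # your read of earnings, sector, management, etc.
    model_weight = 0.30     # how much trust the model's track record has earned

    blended = model_weight * model_forecast + (1 - model_weight) * own_judgment
    action = "buy" if blended > 0.5 else "hold"
    print(f"{blended:.3f} -> {action}")   # the map nudges; you still steer

Blindly following the model would mean setting model_weight to 1 - that's using the wrong map in the wrong way.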

So go ahead, make a budget. But don't ask why reality differs from your expectation, because it always will and we won't always know why. Recognize that it is a guideline rather than a rule, and that it is a flawed map. It could help, but to find your way home you're going to have to use your own sense of direction.

Monday, August 9, 2010

A Rock Story

So . . . today I’m going to take a departure from my usual moaning and complaining about people in general to tell one of my favourite rock’n’roll stories. I’m actually amazed that it hasn’t been made into a movie yet – it would have sex, drugs, and some of the best rock music ever.

Picture this: England, 1970. The Beatles have officially broken up after a long and lingering illness. All four Beatles are working on solo albums, but none would be as acclaimed as All Things Must Pass by George. The album is stunning – tons of classic tunes both well-known and less so (My Sweet Lord, What is Life, Isn’t it a Pity, Let it Roll, the title track, and the list goes on). Everyone in rock music took part in the sessions: John, Ringo, Badfinger, Gary Wright (Dream Weaver), the drummer from Yes, Phil Collins, Billy Preston (Nothing from Nothing), Peter Frampton, Ginger Baker from Cream. And, of course, George’s best friend Eric Clapton.

Clapton at the time was between bands, having left Blind Faith the previous year. He was also addicted to heroin, which didn’t help his situation. Looking to be part of a band where he wasn’t the only star, he put together Derek and the Dominos (apparently originally called Del and the Dynamos, and changed when a concert announcer got it wrong). The band consisted of Clapton, Duane Allman (of the Allman Brothers, themselves riding a crest of success) and members of Delaney & Bonnie and Friends (a band Clapton had been playing with).

And they created what is one of the greatest albums of all time: Layla and Other Assorted Love Songs. Though most of the songs (other than the title track) don’t get much radio airplay, there are tons of great tunes (pretty much all of them). They recorded Hendrix’s “Little Wing” as a tribute, because he died while the album was being recorded. They recorded “Key to the Highway” because they heard another artist (Manfred Mann) recording it in an adjacent studio. These were some talented musicians, and their album bears their credo of “no chicks, no horns.” Good, simple rock music.

So we already have the kick-ass soundtrack and I haven’t even gotten to the story yet.

Layla was inspired by a woman who had captivated Clapton and occupied all of his thoughts. Unfortunately for him, she was a woman he couldn’t have, even as a big rock star, because she was Patti Harrison, George Harrison’s wife. He wrote Layla for her, basing it on the Persian love story “The Tale of Layla and Majnun,” in which a man is driven mad by his love for a woman he cannot have. Other songs are less coded – in “Have You Ever Loved A Woman,” Clapton sings “Have you ever loved a woman so much it’s a shame and a sin/All the time you know she belongs to your very best friend.”

Unfortunately for George, Clapton eventually succeeded in winning Patti’s heart, and George had the misfortune of walking in on them when they were in bed together. Distraught, George took the natural step of going out and bedding Maureen Starkey, Ringo Starr’s wife. Wait – what?

Yup, that’s what happened. Clapton’s need to have his love requited broke up the marriages of two of the Beatles. Clapton ended up with Patti for about 15 years until they broke up (she also inspired the less-inspiring “Wonderful Tonight”), George re-married, Ringo re-married, and Duane Allman died in a motorcycle accident a year after Derek & the Dominos’ only album (they recorded some other songs intending a second album, which never materialized). Most amazingly, the friendships between Clapton and Harrison and Starr were not affected. Which speaks to how the two cuckolded men regarded their wives’ virtue (well, that and the fact that they cheated on their wives incessantly).

It would make a great movie and a killer soundtrack. Desmond from Lost could play Clapton. And if you haven’t before, listen to the album. It rocks.

Thursday, August 5, 2010

Try Re-Assembling an Egg

So . . . let's talk about problems. I have always maintained that there are two types of problems: those you can do something about, and those you can't. And why worry about problems that you can't do anything about? The trick, however, is knowing the difference. And we as a species seem to have a problem figuring out problems (which is a problem that we probably can't do anything about, but I'll write about it anyway).

The problem we have is arrogance. Humans are arrogant in that they think they can solve problems that are beyond their reach. And this arrogance is best summed up in a quote by John F. Kennedy, who said in June of 1963:

"Our problems are man-made, therefore they may be solved by man. And man can be as big as he wants. No problem of human destiny is beyond human beings."

Frankly, he's wrong. There are lots of problems created by man that cannot be solved by man. A simple example: I can crash a car, but I can't fix it. And it's ironic that JFK said this a scant 5 months before he had a problem, created by man, that couldn't be solved by man.

This problem problem puts us in a lot of messes. Two of the biggest issues of the past couple of years, the economy and the environment, are being treated as though we can solve them. Most of the debate around global warming has centered on whether it is man-made, as though that means it is man-solvable. Guess what: even if it is man-made, that doesn't mean we can do anything about it. We could cut off all CO2 emissions starting today and it doesn't mean that global warming will reverse, or correct, or change. We can throw as much stimulus money as we want at the economic downturn; it doesn't mean it will work. And it's arrogant to think otherwise.

War, famine, the Dustin Diamond sex tape - all man-made problems, all unsolvable by man. This isn't a philosophical issue like whether a deity is so powerful as to make a rock too heavy for him to lift; it's a plain fact that we are much better at creating situations than resolving them. There exists pirate treasure from the 17th century that was so well hidden and booby-trapped that to this day it cannot be retrieved. I could continue with examples all day. So we need to get off of our high horses and be a little more humble. We can't fix everything and anything - we aren't all Vanilla Ice (if you got a problem, yo, he'll solve it).

This of course will raise the question of "well, what are we supposed to do, just sit on the sofa and eat Cheetos?" No - we should focus our efforts on those problems we can solve. To borrow an example from Bjorn Lomborg (author of The Skeptical Environmentalist, and maker of good points even if they are sometimes obscured by a rather forceful writing style), rather than try to stop global warming because coastal areas will be submerged, take direct steps to prevent the flooding of coastal areas - build a wall. We could help millions of people dying of malaria and other tropical diseases right now, but instead we press on to try to solve a problem that is probably beyond our reach, and may affect people several decades from now. Seems like a no-brainer to me.

You got a problem with that?

Tuesday, August 3, 2010

The Good Guys is One of the Good Guys

So . . . I started watching a show called The Good Guys, starring Josh Lyman from The West Wing (who has a moustache now) and Tom Hanks’ younger clone, er, son. Not a bad show – really super-cheesy, but entertaining. And a complete departure from how TV has gone in the past few years. I like this change so much that I stopped watching the show.

Because I could! Imagine that – a TV show you can enjoy without having to watch every single episode. Most of the shows that I have watched in recent years (Lost, Mad Men, The Wire, The Sopranos, etc.) require that you watch every episode if you want to know what’s going on. You can’t just tune into a single episode of Lost and expect to follow the action. But with this show, The Good Guys, you can miss as many episodes as you want. It’s a definite throwback.

The serialized-series phenomenon goes back as far as TV itself, with soap operas. I don’t mean to generalize, but these were shows that were geared towards housewives with their hair up in curlers who sat around after the cleaning was done and waited for the husbands to come home from work so they could give them their martini and dressing-gown before serving dinner (hey, I said I’ve been watching Mad Men). Prime-time soaps like Dallas and Dynasty also had storylines that bled from one episode to another. But the soap-opera format was also designed to pick up viewers along the way, always providing enough information to let a new audience understand. Now they just do “Previously on [whatever show]” and expect that to be enough.

Two shows in the 1990s committed fully to the idea that viewers had to watch every episode, and both failed. Twin Peaks was more like a 25-hour movie than a series, with the disadvantage that anyone not watching from the start felt like they were walking into the middle of a (very weird) movie. As a result, it could only lose viewers, not gain them, and was cancelled after two seasons. The other was Murder One, which told the story of one murder trial stretched over a whole season. It abandoned this premise in season two and told several overlapping stories. As a counterpoint, Law & Order used to highlight the fact that each episode was stand-alone.

There was no penalty for missing an episode of Knight Rider, Diff’rent Strokes, or MacGyver; though there may have been the occasional recurring character, each week brought a clean slate. One of the most celebrated episodes of Family Ties had to do with Alex P. Keaton’s grief over the death of a close friend, but even the most loyal viewer wouldn’t have mourned the dead character, because he had never been on the show before. Clean slate.

Now if you want to watch decent TV, it has to be a ritual – miss a week, miss a lot. If you didn’t start watching from the beginning, you don’t know the whole story. And with iTunes and DVD sets and all of that, you can watch the whole thing, but sometimes it’s fun to just tune in when you feel like it. Otherwise, TV can feel like a chore; I recently finished watching season one of The Good Wife, which we had recorded all season because we didn’t have time to watch it every week. And we would refer to it as “working our way through the episodes.” TV shouldn’t be work!

So I’m happy that I can delete episodes of The Good Guys from my DVR worry-free when I need space. And unless I want to start watching the crappy dregs of TV (I don’t think I’m too at sea if I skip an episode of Ghost Whisperer or Cougar Town, but who knows), this is what TV will be, at least for a while. And I can be more selective about the shows I take on, because it is a commitment, at least until the show starts sucking (e.g. Heroes). Even so, a few more cheesy throwback shows wouldn’t hurt.

Sunday, August 1, 2010

O Ban-It-A, My Home and Native Land

So . . . I read in the paper last week that they’re trying to ban teens from using tanning salons. If there’s one thing the Canadian governments are good at, it’s banning stuff. That’s their forte!

I can think of many recent bans and rules that have been proposed: light bulbs, weed killer, helmets for tobogganing, garbage disposals in the sink. This last one is my favourite, because it’s legal to sell one, legal to buy one, and legal to have one in your house – but it’s illegal to install one. So you can see a selection of them at Home Depot, purchase one, and bring it home, but don’t even think of putting it under the sink or the cops will come and get you. Or at least fine you.

Banning stuff is a very visible, obvious way for a government to come out against something, but it is almost always a misplaced overreaction. A lot of people drink bottled water, and some of the bottles end up in the garbage – let’s ban bottled water. Something similar happened a couple of winters ago, when a kid got a head injury while tobogganing – let’s make a toboggan helmet law. Never mind that there had only been 4 such accidents in the previous seven years; we need to make sure that everyone knows that we’re against toboggan-based injuries.

The tanning salon example is another case of the government stepping in with a rule that, when you think about it, makes little sense. Unless tanning, like smoking, is something that is picked up as a teenager and is then hard to stop (or addictive), setting the rule at eighteen years of age will have little long-term impact on the health of the populace. People will just wait a bit longer to get that healthy bronze sheen.

There are two other things going on here. One is the modern “think of the children” attitude, where we have to protect our kids from everything. I have kids, and I want to protect them. I expect the government to help protect them too, but more from criminals or alien invasions than from tanning beds. I don’t want my kids going to tanning salons, so I will make that clear. If my kids don't listen, that’s on me. I don’t want them avoiding certain behaviours because The Man says so; I want them to make the choice themselves.

The other complicating factor is that the government is also an investor in our health. Because we have government-funded health care (not going into the merits of that today, that’s a whole other series of posts), they have a financial interest in the populace staying healthy. But to single out one or two things will not lessen the tax burden on health care. And I don’t think tanning is the major concern. If they really wanted to cut costs, ban fat and sugar and salt. And alcohol, smoking, driving, guns, knives, swimming pools, and toboggans. But because too many people like those things (or at least some of them), they won’t do it; better to pick a target that won’t cost too many votes. And we all know that people under eighteen don’t vote.

Let’s work on helping people make better choices rather than constricting them. Because historically, reducing choices and options doesn’t work (see: prostitution, drugs, etc.). People will do what they want to do, even if it’s getting a deep, brown, carcinogenic tan while smoking crack with a hooker. Banning things isn’t going to stop people from making bad choices – you could ban Chevy Chase and there would still be a few people secretly selling DVDs of his movies. Now that’s a crime!

Thursday, July 29, 2010

Concussive Window Dressing in the NFL

So . . . apparently smashing your head against someone else's repeatedly over a three-hour period is not good for your brain. So say researchers who are examining the long-term impact of playing football, and the head injuries that go along with it. In fact, the results are downright disturbing (see Malcolm Gladwell's article on the subject for a more in-depth look).

The NFL had been denying and ignoring the issue, and with good cause; if the league were to take on this topic seriously, it would necessitate such big changes to the game that the very survival of football as we know it would be at stake. But yesterday the league announced a wide-ranging, hands-on approach to dealing with the long-term effects of concussive and sub-concussive injuries. They created a poster.

Calm down, calm down. I know that this is a major policy move for commissioner Goodell. A whole poster (actually, 32 of them - one in each team's locker room) may seem like going overboard, but I think such bold action is necessary. And this poster does more than just hang on the wall. It also informs the players that getting hit in the head is bad for your long-term health. And to not play if you have a head injury.

Because sarcasm does not lend itself well to the written form, you may think I'm a bit nuts right now. Just re-read the previous paragraph in an overly dramatic and slightly sneering tone.

This poster is going to accomplish one thing and one thing only - dressing the windows. To half-assedly say that the league takes this issue seriously and will do something about it (they won't, unless forced). Because no highly-paid, over-juiced NFL player is going to read that poster and say, "Really? I never knew. I should give this up and go be an accountant." The poster has a greater probability of being used as toilet paper than of affecting the thinking of a single player. These are men who have given their blood, sweat and tears to achieve one goal and one goal only - to get to the NFL. They are focused on winning, on battling, on doing what is asked of them. To admit that there is a risk they had not previously considered, and to capitulate in the face of that risk, is to deny the very essence of who they are.

If you truly want to stop the long-term effects of playing football, start early. Get them while they're kids and high-schoolers, and make plain the risks. Trot out an old, doddering former pro who has even more trouble putting together a coherent sentence than he did when he had all of his faculties. Scare 'em. And then despair when the kids choose to continue banging their heads.

Because ultimately, that's what's going to happen. Just like all of the corner boys who earn less than minimum wage slinging drugs, in the hope that they will be the survivor who gets to be boss. Just like in that episode of Sliders where you can take as much cash out of the ATM as you want, but each dollar is an entry into a lottery where the prize is death (damn right I watched Sliders, you know, on occasion, when nothing else was on). People constantly take risks to earn rewards, even when the rewards are unlikely to occur.

The bottom line is that people like watching football, there are more than enough people willing to play football despite the risks (and earn a lot more than accountants do), so who are we to stop them? And while we're at it, I saw that movie Gladiator, we should get back to doing that too.

Tuesday, July 27, 2010

I Like My Movies Flat and Without Depth

So . . . I heard on the radio yesterday that the official policy of Hollywood (set at their last meeting, I guess) is that all big summer blockbusters must now be released in 3D. So all of next summer’s big films like Harry Potter, Transformers, and Pirates will be 3D. Which sucks.

I have seen three movies so far that use this new 3D technology: Journey to the Center of the Earth, Avatar, and Alice in Wonderland. Only the last was what I would consider a decent movie, and I found that the 3D took away from the experience. Avatar was really crappy but looked really pretty, and it likely could have looked just as great without the 3D; it was the lush backgrounds and imaginative creatures that provided its aesthetic appeal. Journey was just bad.

Haven’t we gone down the 3D path before? Several times, like in the fifties and seventies? Never caught on then. Hope it doesn’t catch on now. It’s distracting, fuzzy, and annoying. I don’t want to wear glasses to watch a movie – that’s why I got laser eye surgery 12 years ago.

Are movies so visually lacking that we need an ersatz third dimension? We’ve done pretty well over the past century in creating stimulating and inspiring images. I think the marginal gain on making a good movie into a 3D one is small – if you have a fun, summer blockbuster (e.g. Iron Man, Dark Knight), how much would it be improved by making it 3D? I see the upside being much smaller than the potential downside. And don’t even get me started on 3D TV.

The biggest issue with 3D, though, isn’t even its lack of true (or even clear) 3D visuals. It’s that 3D is now yet another excuse for making bad movies that people will flock to see. It used to be that special effects alone could draw people to the theatre, no matter how lousy the story, acting, or characters were (see: Twister, Independence Day, anything by Michael Bay). Then people started catching on a bit more, until we went in for digital effects (the main culprit, other than George Lucas’s ego, as to why the Star Wars prequels weren’t all they could have been). Now it’s 3D. These new technologies are making filmmakers lazy and audiences apathetic.

But even that isn’t the primary motivation for why Hollywood is insisting on all of its major releases being 3D. It’s money. Tickets to 3D movies are more expensive, so only releasing 3D movies gives the distributors and exhibitors license to implement a de facto 50% ticket-price hike. And we, the dummies that we are, will go along with it.

So, if you have the choice, don’t go along with it. Pick the non-3D option and don’t pay the extra. Because the only way the message will get across is through the cash flow (or lack of it). I don’t actually think this will work (I’m not the type of person to take part in boycotts), but at least you won’t have to wear the glasses.