“Escape From Tomorrow” sounds like an urban legend that originated among Disney employees. “These guys made this horror movie and shot it in Disney World and Disneyland without anyone knowing. Everyone used smartphones. The actors had their scripts on their phones and the crew filmed them with their own phones. The director had to map out where the sun would be on each particular day so they always had the best lighting. Then they edited it in South Korea so there was less of a chance of Disney finding out.” The ending to the story would be that the movie was lost somewhere in South Korea but somehow, someday, it will be found and put on the Internet for all to see in all its glory. Except that’s not what happened. What happened is that “Escape From Tomorrow” was entered into Sundance and got picked up by a distributor. A distributor prominent enough that I saw it in a Raleigh multiplex and not, say, a basement in Brooklyn. This entry into the Mixed Reviews series is a little bit different from the others in that it’s not one of my own DVDs and therefore there’s no question as to whether it was really as good as I remember it. However, “Escape From Tomorrow” did receive mixed reviews from critics. And although I wouldn’t buy the DVD, I wouldn’t necessarily throw it out if I got it some other way.
When it comes to Disney parks, I have mixed feelings. I would like to go to one at some point in my life, but I also have a kind of punk-rock pride in not having gone to one. When I do think of Disney World, I think of being overwhelmed. I picture being confronted with the enormity of the whole thing and the Florida heat and long lines and overpriced everything—like any amusement park, but on a much larger scale. At the same time, I also picture an overwhelming sense of nostalgia and wonder and all those happy thoughts. That’s why I walked into the movie theater thinking that “Escape From Tomorrow” would be about how such an imposing place can trigger overpowering feelings. I was sort of right, but it’s not the whole story.
On Sunday I live-Tweeted the Emmys for my Writing for Digital Media homework assignment. Doing so meant that I missed the “Breaking Bad” finale (or so I thought), so yesterday I issued a personal moratorium on all entertainment news to avoid any spoilers. There would not even be an image search for Emmy winners. Then it turned out that Sunday’s “Breaking Bad” episode was not the finale; that will be this Sunday’s. So while it was still nice not knowing anything about the next-to-last episode until I finally saw it On Demand, it wasn’t worth the worry.
That said, I will never live-Tweet again. It’s not that it was a painful experience. It was just a weird mix of boredom, anxiety and mild amusement. To give you some background: I don’t think I’ve ever watched the entire Emmys before. Truth be told, I hadn’t watched an award show in years. I’ll watch snippets of the Oscars sometimes, but the last time I sat down and paid attention to them was when Jon Stewart hosted in 2006. Award shows were more important when I was a kid. That’s partly because the only other program I was allowed to watch past my bedtime was Nick News and partly because award shows have lots of sparkly things. Heck, I even watched the Miss America pageant back then.
The relative distance from award shows gave the evening a certain novelty when paired with the new experience of live-Tweeting. Sure, there was “Breaking Bad”, but there was always the encore presentation, and if I missed that, it would reach On Demand pretty quickly. Equipped with my laptop and some homemade hummus, I was ready to go at eight o’clock sharp. My first Tweet of the evening read, “Let’s get this party started!”
I know. I’ve neglected this blog. My excuse is that I’ve been busy writing for another blog that’s part of a class I’m taking at UNC Chapel Hill. My life is barely exciting enough for one blog, not to mention two. This blog, though, still serves its original purpose as an outlet for things I don’t always get to express in real life.
Like my interest in marketing/advertising. It goes way back. The true origin is my introduction to Zillions magazine in the mid 1990s. Zillions was a magazine published by Consumer Reports as a kid-oriented offshoot of its flagship publication. There were reviews from kids who tested similar kinds of toys and snacks from different brands. There were articles about saving money and keeping a budget. Then there were the articles that told you exactly how companies tried to take advantage of you. Sometime between the ages of 9 and 11, I learned why candy was sold at cash registers and displayed to meet a child’s eye level. I learned the reason why, in the movie Cool Runnings, the only soda they were shown drinking was Coke. I learned why you had to walk to the back of the store to find less expensive items while the newer and more expensive ones were kept near the entrance. There was one Zillions issue that had a diagram of a shopping mall and pointed out all the common strategies involved with attracting each type of customer. It was an education that I wouldn’t have received anywhere else.
This enlightenment sometimes came at a price. Being told that adults are targeting you for money is a little like being told there’s no Santa. It makes sense and it explains a lot, but it’s still a letdown. That could explain my aversion to product placement. I’m not opposed to the idea of product placement. It’s just that most of the time, it’s distracting, and it takes away from the illusion that a movie or TV show is in no way a calculated grab at consumers’ wallets. I thought about this the other day while watching reruns of It’s Always Sunny in Philadelphia from what I like to call the Early Coors Period.
It’s a good strategy: the show largely takes place in a bar, so featuring Coors seems only natural. The problem is that whoever was in charge treated the Coors logo like a 6-year-old treats unicorn stickers. They’ve toned it down a bit since, but it’s still obvious. (More subtle product placement is possible: in the movie The Blind Side, it almost looked like a coincidence that everyone wore Under Armour.) I’m not sure if it was Zillions that made me extra-sensitive to product placement, but I’m sure it helped.
Not that I’m complaining. Zillions took product marketing apart and pointed out how everything functions, like taking apart a toy car to see how it works. Sometimes it seemed cynical, but it was also a way to show kids how the world works while keeping it interesting. It ceased publication in 2000. Trying to start it up again in the current state of magazine publishing would be a bad idea, but I’m sure there’s a place for a Consumer Reports-type website for kids, if there isn’t one already. To twist a quote misattributed to Eva Perón: It will come back and it will be Zillions.
Like everyone else, I love a good train wreck. I especially love a good Development Hell story and a good Marketing Gone Wrong story. So when I saw this article in the NY Times recently, I thought “Jackpot!”. It was the story of Foodfight!, one of the greatest animated disasters of our time.
Foodfight! is a computer-animated film that was intended to do for groceries what Toy Story did for toys. Sure, Toy Story is a classic, but don’t forget that it features already-established products like Slinky Dog and Mr. Potato Head, as well as cameos by Etch-A-Sketch and View-Master. It may not have been true product placement: Toy Story was a huge gamble for Disney because it was the first major feature film to be entirely computer-animated. Still, by including these familiar products, Toy Story appealed to nostalgic adult viewers. Generic versions may not have achieved that. It also lent itself to reviving some of those products. It’s not like Slinky Dog was a hot toy before Toy Story came out in 1995.
With that mindset, in 2000, independent studio Threshold Animation announced that it was making its own computer-animated movie that featured established food brand mascots. In fact, 80 different brand mascots made appearances, although, like Toy Story, the major characters were generic. The intention was to use the movie to turn those characters into food and household product mascots in their own right. While Threshold’s strategy is ethically dubious, it makes some sense when you compare it to Toy Story. Over the next couple of years, Foodfight! merchandising deals were made, then-relevant celebrities like Charlie Sheen and Hilary Duff provided voices, and everything seemed to be on track. Then in 2002, the year the movie was supposed to be released, the studio was broken into and part of the film was stolen with no backups to depend on. Whether or not that actually happened doesn’t matter. The rest, as they say, is history. (Read the article for the rest of the crazy story. The NY Times tells it better than I would.)
However, when you put aside all the production problems and moral issues, I doubt Foodfight! would have reached a Toy Story level of success. Kids are attracted to animated movies regardless of quality, but it shouldn’t be a surprise that the highest-grossing animated movies tend to be genuinely good ones. Toy Story and its sequels are well-written films that put most of the focus on the story and make even the pre-existing toys well-developed characters. Based on this 15-minute highlight video from YouTube, the brand mascots of Foodfight! are little more than props that occasionally make jokes. Another of Foodfight!‘s biggest problems lies in its concept: kids’ relationships with brand mascots don’t have the same depth as their relationships with toys. I had a stuffed Snuggle Bear when I was little, but I really only knew who he was because of the TV commercials, where he’s more like a teddy bear. My Snuggle Bear had nothing to do with the two-dimensional character on the fabric softener bottle. Not that the Snuggle Bear appears in Foodfight!, although Mr. Clean and the Brawny guy do, and we all know how much kids love them. The characters on the DVD cover are probably the most kid-friendly ones in the movie, but I doubt there are many people of any age who are particularly attached to Charlie the Tuna or the Vlasic pickle bird. (I accidentally called it the Claussen’s pickle bird before I looked more closely at the DVD cover and saw “Vlasic” written on its hat. That should say it all.)
The lesson here is that brand loyalty is not enough to sell a movie. Threshold Animation placed so much importance on marketing Foodfight! that they neglected to make a decent product. I know hindsight is 20/20, but this kind of thing happens over and over again. Someone thought making a sitcom based on the Geico cavemen was a good idea and look how that turned out.
I blame the California Raisins, who disprove most of what I just said. And yes, they do make an appearance in Foodfight!.
I was going to use this movie to close the Mixed Reviews series, but the combination of watching the wonderful new Much Ado About Nothing and a recent trip to New York (which delayed this post a bit) brought it to mind. Hamlet and Much Ado About Nothing are vastly different movies, but they are both “modern-dress” Shakespeare film adaptations with excellent senses of place. However, Much Ado About Nothing has been well-received. Hamlet, on the other hand, has the most mixed reviews of any movie I own, which is why I intended to save it for last. The truth is, it did get good reviews from some prominent sources. Ebert and Roeper liked it. Rolling Stone liked it. The NY Times and LA Times liked it. It’s just that no one else did. It made me wonder how I could have such terrible taste. I understand why some people wouldn’t like it, but that many of them?
Hamlet is the familiar story: The king of Denmark dies and his brother quickly marries his widow and takes the throne. The new king’s nephew, Hamlet, gets a visit from his father’s ghost, who tells him to avenge his death. Madness and violence ensue. This version is an indie film originally released in 2000. It takes place in New York City of the same year. The king is the CEO of the Denmark corporation. Hamlet is an untalented filmmaker. Ophelia is a photographer with a penchant for Polaroids.
The Shakespearean dialogue is preserved, which might make it seem like a ripoff of Baz Luhrmann’s 1996 version of Romeo and Juliet. Luhrmann didn’t invent the idea of putting Shakespearean dialogue in an anachronistic film setting; Derek Jarman did it in the 1970s and I’m sure he wasn’t the first either. Still, I think one reason Hamlet garnered so many unfavorable reviews is that there was a trend in the late 1990s for movies to modernize Shakespeare for young audiences. After Romeo and Juliet came the similarly teen-oriented 10 Things I Hate About You, an updated Taming of the Shrew, and O, an updated Othello. (Those movies did not preserve the original dialogue, but they both star Julia Stiles, who plays Ophelia in Hamlet.) So by 2000, Hamlet was more of the same.
What does set Hamlet apart is its emphasis on technology and emotional disconnect. Both old and then-new devices are at the movie’s forefront. Hamlet records his soliloquies with a digital camera. Rosencrantz and Guildenstern are put on speakerphone. The “Get thee to a nunnery” speech ends as a series of voicemails. Shakespeare’s words are printed on faxes and typed on laptops. The technology allows the characters to distract themselves while others are trying to communicate with them, so they don’t always listen to each other. It takes desperate circumstances for them to have meaningful interactions.
The technology is part of why Hamlet is an even better movie now that some time has passed since its release, but it’s not the only factor. As played by a 29-year-old Ethan Hawke, Hamlet appears either to be in his eleventh year of film school or to have spent the last decade trying to find something to do with his life. Either way, he’s not so much mad as he is erratic and immature. (Although the character of Hamlet was written as a 30-year-old, it’s been suggested that Shakespeare intended him to be younger and revised the part to accommodate a famous but aging actor. It’s like what happened with The Wiz.) It works here because Hamlet comes across as the over-privileged product of the 1990s economic boom. You get the sense that he hasn’t had to grow up because he’s been living off a trust fund. Not that he’s alone in this regard: other than CEO Claudius (Kyle MacLachlan) and his right-hand man Polonius (Bill Murray), no one has or needs a job. Like Elizabethan royalty, they have a lot of time on their hands for cultivating obsessions. At the same time, there’s a sense of impending doom behind their ennui now that we know they’re situated in pre-9/11, pre-Bernie Madoff New York. It’s like reading Les Liaisons Dangereuses with the knowledge that the French Revolution was about to destroy the same bored bourgeoisie the story depicts.
Of course, I didn’t see any of those things when I first saw Hamlet. I had rented the VHS from the grocery store, so that should tell you how long ago that was. I was a sulky teenager then, and I loved the movie’s angst, its language and its beautifully shot New York City setting. It’s not that my emotional attachment keeps me from seeing the movie’s flaws. Hawke’s a decent Hamlet, but it helps that there’s a strong supporting cast, particularly Liev Schreiber (Laertes) and Sam Shepard (the Ghost). Cutting such a lengthy play to fit a two-hour runtime takes away some of its complexity as well. There are other weaknesses, but I still believe that Hamlet’s strengths outweigh them. More importantly, now that I’m an adult, it shouldn’t matter whether the cool kids agree with me or not. It shouldn’t, but I’m still hesitant to show Hamlet to anyone else. There’s a 50/50 chance that person will hate it, and I don’t like my odds.
Last Saturday night I happened to look at my phone before going to bed and saw a news update from my NY Times app: “Zimmerman is Acquitted in Trayvon Martin Killing.”
I said a bad word and thought, This is going to be bad. I immediately thought of the Rodney King beating and the ensuing LA riots. I knew deep down that there wouldn’t be a repeat of that level of violence because there were a lot of other factors that led to the riots, but the thought was still there. So while I knew I should go to sleep, I couldn’t help but wonder what was going on in the country while I was lying safely in my bed.
Don’t go on Twitter. Don’t go on Twitter. Don’t go on Twitter.
I went on Twitter.
Understandably, the verdict dominated the trending topics, with #NoJustice at the top. Curiously, though, another trending topic was #RacialDraft. “The Racial Draft” is a sketch on Chappelle’s Show that is undoubtedly one of its best. It involves an NFL-style draft that defines the identities of multi-racial celebrities and celebrities who embrace a culture that is associated with a different race or ethnicity. For example, Tiger Woods officially becomes black, Lenny Kravitz officially becomes Jewish and the Wu-Tang Clan officially becomes Asian. It’s a lot funnier when you actually watch the sketch, but it’s also a telling reminder of how Americans seem to demand that everyone fit neatly into racial or ethnic groups.
I’m not sure how #RacialDraft started, but I think it was an outlet to define the white/Hispanic George Zimmerman in light of the verdict. If he’s white, he’s further proof that all white people are racist, particularly towards black people. If he’s Hispanic, white people don’t have to feel guilty. However, it evolved into an update of the Chappelle’s Show sketch, a bizarre version of an online fantasy football-type draft. Judging from profile pictures, it was mostly between black and white people, although there were one or two Asian ones and one Native American delegate who was determined to get Johnny Depp and Channing Tatum on his team. The whole thing was far from politically correct and there were many tweets that made me cringe, but overall it was one of the funniest Twitter topics I had seen in a long time.
Even I started getting in on the action. Eminem was a hot property, and there was no way the White delegation could trade him to the Black team for Soulja Boy, so I offered Eminem for either Tupac or a Biggie/Nas package deal. The counteroffer was Nas and the ability to dance, which was tempting, but I didn’t feel qualified to confirm such a huge trade.
All kinds of potentially offensive stereotypes were thrown around, and yet no one complained. It was crazy that at the same time the rest of the country simmered with outrage over such a racially charged trial, there was this little space where people came together to joke about race without anyone getting hurt. I’m not saying that jokes about trading fried chicken for fried rice are going to save the world, but #RacialDraft did provide a nice distraction, if not a tiny bit of hope for the improvement of interracial relations in America. It was something only social media could generate and I’m glad I was there to see it.
Below is the closest thing to an official record of the results, although it’s by no means definitive as Clinton was drafted to the Black team pretty early on. For what it’s worth, Zimmerman was made an undrafted free agent.
This post isn’t as timely as it should be. Part of this is because I wasn’t sure whether I should publish it at all and part of it is because I knew I had to choose my words carefully to avoid any misunderstanding. That incidentally relates to what I have to say.
While I was following the Trayvon Martin/George Zimmerman case on Twitter, one of the more popular topics was Martin’s friend Rachel Jeantel and the way she spoke. Judging from the commentators’ Twitter profile pictures, this came from users of several races, but more often from black users. It got to the point where others suggested in blogs that the criticism from the black community was a form of self-loathing. Everyone has something to say about Rachel Jeantel and I’m sick of the analysis of her every move. As a white person, I don’t need Russell Simmons retweeting “What White People Don’t Understand About Rachel Jeantel” every hour to tell me that I’ll never fully understand what it’s like to be another race. However, as someone who’s interested in language and dialects, I do think I can fairly examine the criticism of her speech.
One of the criticisms was her pronunciation of “asked” as “axed”. That stood out to me because I think it’s been unfairly racialized. The truth is, lots of people say “axed”. People from the outer boroughs of New York say “axed”. People from Jersey say “axed”. Pauly D from “Jersey Shore”, who’s originally from Rhode Island but sounds like he’s from Boston, says “axed”. I will admit that outside the Mid-Atlantic/Southern New England region, I’ve heard “axed” mostly used by African-Americans. I don’t know why that is, but “axed” isn’t exclusive to one race. The other issue I take with the “axed” criticism is that, technically, it’s a matter of pronunciation, not grammar. Everyone likes to point to Jeantel’s socioeconomic and educational status to explain her lack of eloquence, but when you really listen to her words, she’s not as grammatically inept as she is perceived to be. She’s not even difficult to understand. She just happens to have a slight lisp and what should be called an accent.
My theory is that people with more education and/or of a higher socioeconomic status sound rather bland because of assimilation. The United States is a nation of immigrants, and in order to achieve a higher socioeconomic status, one had to drop one’s native culture. That would include an accent. People of the upper classes had Anglo-Saxon roots, which meant they were more established in the country. That could be why British accents sound posh to Americans. That also could be why thicker American accents are associated with lower socioeconomic statuses. It sounds terrible, but it’s true. (Watch any movie with scientists in it. I’ll bet that not one of them sounds, say, Cajun.) However, as elocution lessons became less necessary over the generations, accents were passed down without interference. Jeantel is a multilingual child of Haitian immigrants. She shouldn’t be expected to sound like a national news anchor.
It is also true that some grammatical errors can be part of a dialect. As a Yankee living in the South, I used to be annoyed that Southerners often say “waiting on” instead of “waiting for”. One is waiting on someone in the context of providing service. “Waiting for” describes expecting the presence of another person, yet “waiting on” is the one I usually hear in this context. Then at some point I remembered an incident from when I was in college. We were watching “Seinfeld” when a friend pointed out that people from New York City say “waiting on line” instead of “waiting in line”. I am from Poughkeepsie, which is two hours upstate from the city. Therefore, it makes sense that I use both forms interchangeably. I’m no better than those who “wait on” people.
On the flip side of this issue, there’s Paula Deen’s appearance on the “Today” show. In her trademark thick Southern accent, she memorably ended her “apology” for her racist behavior with “I is what I is.” That’s a lot different from Rachel Jeantel’s incorrect use of “at” (as in “where he’s at”). “I is what I is” is clearly intentional. That kind of grammar corruption is part of Paula Deen’s image as an ordinary, no-frills Southern woman. She could have taken elocution lessons before appearing on television, as many TV personalities do, but that would have hurt her brand. She was one of the common folk and that made her appealing. Now that everyone knows she not only holds herself above certain people, but an entire race at that, it makes her the object of ridicule on “The Soup”.
In the U.S., we don’t like to admit to class differences, but I think our attitudes toward diction and grammar reflect what is unsaid. Rachel Jeantel was scrutinized as a witness partly because she didn’t sound intelligent enough, but good grammar isn’t always a good thing either. I remember when Madonna was criticized for saying “whom” during an interview. Bloggers asked, “Who does she think she is?” (Her earlier adoption of a vaguely British accent probably didn’t help.) Thus, Paula Deen made an obvious grammatical error to pander to the public. We all walk the line between sounding uneducated and sounding elitist, although for African-Americans, this can translate into a matter of “not sounding black enough”. Were Jeantel to speak the Queen’s English, she still would have been criticized, though maybe not as much. But I’m not going to venture any further into that issue. Or, as I would say if I were speaking in conversation, I won’t try to go there.
Rope is the DVD I can’t get rid of. There’s nothing wrong with the DVD itself. It’s not scratched. There’s no damage to the case. Yet my local used book/movie/music store won’t accept it. On one or two occasions, I waited a few months before trying again; that worked once before with a previously rejected book. Then I gave up and thought, “Maybe I’m just meant to keep this movie.” The reason I even own Rope in the first place is that I read about it and wanted to see it, but I couldn’t find it anywhere except for a cheap copy on Amazon. (It was probably on Netflix, but it wasn’t worth getting a subscription.) Again, I don’t know why that is. It’s not a terrible movie; it’s just that it’s an Alfred Hitchcock movie that isn’t of Psycho or Vertigo caliber.
Rope is based on a play of the same name, which is based on the true story of Nathan Leopold and Richard Loeb. In the movie, two Harvard students, Brandon and Philip, strangle their classmate David with a rope simply because they consider him a lesser human being. They hide the body in a trunk, then host a dinner party with the food served on top of it, buffet-style. The guests include David’s father, his aunt, his fiancée and his fiancée’s ex-boyfriend. The last guest is Rupert, the prep-school teacher who inspired the students’ adoption of the “man and superman” philosophy.
Saying that Rope has a better premise than The Birds is an understatement. But then, I never understood how the latter attained “classic” status in the first place.
There’s also more going on beneath the surface. The “making of” featurette on the DVD includes an interesting interview with the screenwriter, Arthur Laurents. In it, he beams at the fact that a movie was made in 1948 about what everyone involved with the movie referred to as “it”: homosexuality. The implication is that Brandon and Philip have a relationship and Brandon used to have a relationship with Rupert. Cary Grant was offered the role of Rupert but he declined because there were already rumors that he was bisexual in real life and he didn’t want more fuel for the tabloids, although that could just be a rumor itself. The part was ultimately given to the decidedly straight James Stewart. Grant does show up in the movie, if only as the subject of conversation between David’s aunt and his fiancée. They adore him.
The biggest reason Rope is not a great movie is that, except for the last 15 minutes, it’s missing the kind of urgency that makes a good thriller. Part of the problem is that you know Rupert is going to save the day simply because he’s played by James Stewart. Even Stewart thought he was miscast. It’s not that he’s a bad actor. He’s charming, but from the dialogue it seems like Rupert should be more suave. In fact, there are certain points in the movie where you get the sense he’s silently asking himself why he’s there. He’s much better toward the end, when his character plays the detective.
The other problem I have with Rope is more of a personal one. Rope is a movie for English majors. I found myself distracted by the possible symbolism of the unspeakable act of murder as the then-unspeakable act of gay sex. I also found myself recalling the homosocial-triangle theory: the literary tradition of having two men, usually enemies, use a woman to communicate with each other. (The example at the time was Dracula, Van Helsing and Lucy Westenra/Mina Harker.) In Rope, Philip has the woman’s role. You see, I can’t turn off the English-major switch in my brain. I usually prefer movies that make me think, but this time I found myself wishing for a spontaneous car chase or an explosion that would give me a break from thinking for 30 seconds.
Rope had a lukewarm critical reception in 1948, but it’s become more appreciated over time, probably due to a greater appreciation of the gay subtext. If it weren’t an Alfred Hitchcock movie, I’d say it deserved a remake, but we all know what happens when you remake Hitchcock. Still, despite Rope‘s intriguing story and solid performances, it’s not the kind of movie you own. It’s the kind of movie you watch on TCM late at night. You don’t forget it, but you don’t need to revisit it either unless you’re writing a six-page analysis for your Gender in Literature class.
Maybe that’s why the used bookstore won’t take it. At least I can count on Goodwill.
I’ve wanted to pare down my DVD collection for a while. Though no one ever said so, DVDs were status symbols among my friends and me. Having an extensive collection with the right mix of classic, cult and obscure movies proved you had good taste. Then I moved around a few times and the DVDs became more of a nuisance. In the last five years, I’ve gone from a small dresser full of DVDs to two full shelves. I was preparing to move again a couple of months ago, so it seemed necessary to cut down the collection even further. The move didn’t happen, but my DVDs could still use a good clearing out.
Another motivation for finally going through my movies was my recent viewing of The Great Gatsby, which, as far as I know, received mixed reviews. Despite the “mixed” part of the term, “mixed reviews” has a bad connotation, and I realized then that while I often take the negative side when it comes to mixed movie opinions, a good portion of the movies I own either received mixed reviews or are critically reviled. Since I was going to watch all my DVDs again anyway to help me sort them, I decided to write reviews of the mixed/bad ones. The ones I still like will be defended, and the ones that aren’t as good as I remember will go into the pile to take to the used bookstore. This won’t be every new post from now on; just when I can’t think of anything better.
The first DVD on the chopping block is Tommy. This adaptation of The Who’s concept album was written and directed by Ken Russell and released in 1975. I think the critical derision largely comes from Who fans who hold the album sacred and resent the hallucinatory big-screen incarnation. That’s perfectly understandable. It’s a different situation when one isn’t familiar with the original work. For example, I don’t hold The Great Gatsby novel sacred. However, I was familiar enough with the story to have a pre-existing interpretation of it; specifically, that Daisy never loved Gatsby. (My view could be purely the product of teen angst, which is what happens when you read things in high school, but I can’t help that.) Therefore, watching The Great Gatsby movie would have been different if I didn’t know anything about the story beforehand. In the same way, I think the reason I hold a more favorable opinion of Tommy than most is because I had never heard the entire album before watching the movie.
No one can argue that the basic story isn’t ridiculous. A boy is traumatized into becoming deaf, dumb and blind. He’s abused throughout his life. That’s not the ridiculous part. I learned in a college psychology class that there was a case of a British soldier whose post-traumatic stress disorder caused him to become deaf, dumb and blind. I would have raised my hand and said, “Hey, it’s Tommy!” if it hadn’t been during a lecture.
Here’s the rest of the story, though: Tommy discovers that he’s a genius pinball player and somehow becomes a millionaire because of it. Then he’s miraculously cured of his condition and uses the experience to become a messiah figure who preaches sensory deprivation and pinball as the keys to spiritual fulfillment.
Ken Russell evidently decided that the music needed some over-the-top visuals and hammy acting to go with its inherent weirdness. Here are some sentences I had fun writing:
Eric Clapton plays a preacher at a church that worships Marilyn Monroe and substitutes scotch and pills for bread and wine in their version of Communion.
Pete Townshend is some kind of deacon at the same church, where he waves around a six-pointed star featuring Marilyn’s face while screaming in the parishioners’ faces.
Tina Turner plays a prostitute called the Acid Queen who turns into a psychedelic iron maiden with the help of several syringes.
Tommy’s mother (Ann-Margret) goes insane watching TV commercials until the next thing you know, she’s rolling around in chocolate, soap suds and baked beans.
There’s more great stuff in there too, but this post is long enough as is. The funny thing is that it actually could have been weirder: the role of the female prostitute was originally envisioned for David Bowie. At one point, I think in the 1980s, Ken Russell wanted to direct a movie adaptation of Evita. I’m glad it never came to fruition, but I do like to imagine what it would have been like. My guess is that David Bowie would have played Evita and instead of waltzing with Che, they’d have a wrestling match involving Argentine beef.
The only real problem I have with the movie is Uncle Ernie. Tommy covers some heavy topics; namely, physical and sexual abuse. They’re handled well for the most part. The scene with the Acid Queen is appropriately nightmarish. The scene with the sadistic Cousin Kevin isn’t as terrifying, but there’s still a minor-key creepiness to it. Yet I can’t overlook the scene where Uncle Ernie molests Tommy. Thankfully, it’s not explicit, but it is played as comedy. This is probably to balance out the unavoidable darkness of the subject matter. I also have a feeling Keith Moon was dying to play a role that allowed him to be his “Moon the Loon” self on camera but didn’t require more than talk-singing. The result, however, just isn’t funny, especially since it comes from a song written by a man who himself was sexually abused as a child. Even worse, Uncle Ernie is shown reading a newspaper called “Gay Times”, which seems to imply that gay men are rapists by nature. Hilarious.
At least with this last viewing, I finally saw the cracks in the movie’s candy-colored exterior. By this I mean that while the stylization makes an excellent effort at burying the emotional aspects of the story, there still are glimmers of an underlying sadness. A lot of that’s due to the music, but not all of it. For example, Ann-Margret spends most of her screen time chewing the scenery, but there’s a genuine quality to her character’s guilt over her son’s condition, especially in the scenes where he’s a child. Another newly appreciated element was the way that after each traumatic experience, Tommy saw (presumably in his mind’s eye) a different version of himself when he stared into the mirror. Each new Tommy was the color of the room where the experience took place, until all of the Tommys combined to make a single one who eventually took over the role of spiritual guide once played by Tommy’s dead father. That’s pretty profound for a detail too minor to even show up in the lyrics.
The third thing that changed for me was the second half of the movie, where Tommy’s condition is cured through a psychological breakthrough and he becomes a messiah figure. I used to find it boring because while it certainly has a good deal of absurdity, it’s less manic. Some of that’s because it mostly takes place outdoors as opposed to the gaudy interiors of the earlier scenes. Most of it, though, is in Roger Daltrey’s earnestness. In the Who documentary Amazing Journey, Pete Townshend says that despite all the tension between them, Daltrey was Tommy. I like that Daltrey played the part with a complete lack of cynicism. Tommy has a faith in humanity that allows him to forgive his abusers, to try to save the world as he was saved, and to be heartbroken when his followers ultimately turn on him. By the end of the movie, the excess of the first part has mostly been stripped away and we’re left with a message about self-salvation. Delivered in a stupid way, but it’s still there.
That’s why I don’t know what to do with this DVD. On one hand, it’s so crazy you want to show it to other people simply to prove it exists. On the other, the Uncle Ernie scene requires some forewarning, which is awkward. There is some cringe-worthy singing from Oliver Reed and Jack Nicholson, but there are amazing performances not only from The Who, Eric Clapton and Tina Turner, but also Paul Nicholas and Elton John. (In fact, I prefer Elton John’s version of “Pinball Wizard” to the original. He sure plays a mean piano.) I was all set to send Tommy to the reject pile simply for Uncle Ernie, but after being reminded of what I liked about the movie in the first place, as well as seeing new and more redeeming things in it, I think I’ll keep it. Maybe next time I’ll change my mind.
I mentioned last time that something was missing from my beloved Yorkie bar, the English candy bar I had for the first time in four years a few weeks ago. Never mind that World Market doesn’t carry the Raisin and Biscuit bars: the real problem was that the wrappers didn’t have the distinct “It’s Not for Girls!” warning on them. My trusty source Wikipedia says that the slogan stopped being used in 2011. I just hope it’s not because people complained.
You might think that as a woman, I would be offended by the unapologetic circle-and-line logo and the all-caps text. Not so. Eating a chocolate bar I wasn’t “allowed” to have made it taste even more delicious. Plus, I understand how the marketing of a British candy bar would fit in with the sadomasochism that is British humor. The first time I saw a Yorkie bar, I thought, “There’s no way this would work over here.” I never thought I’d be proven right.
A couple of years later, Dr. Pepper introduced a soda called Dr. Pepper Ten. The tagline was “It’s Not for Women”. I didn’t like it simply because it was a ripoff of Yorkie’s, but I did admire the fact that Dr. Pepper had the balls, as it were, to use it. The decision made some sense; “manly” ad campaigns have been generally successful in recent years. Dr. Pepper Ten’s TV commercials even resembled those of Old Spice. Plus, fellow soft drinks Coke Zero and Pepsi Max are also marketed towards men.
Dr. Pepper Ten, however, flew too close to the sun. People complained and the ads quickly disappeared.
I don’t really care about Dr. Pepper Ten, but now that Yorkie is apparently not for girls anymore, I’ve had enough. It’s time for a sit-down, ladies (and I know you’re the majority of the ones who complained about the ads). Here’s why you shouldn’t have a problem with products that are “not” for us:
1. It’s not about us. It’s about playing off male insecurity. Seriously, before Coke Zero and Pepsi Max came out, I knew a guy who made his female friend carry his diet soda for him because he thought it looked effeminate.
2. Its ridiculousness shows it’s not serious. Ads that implicitly exclude women are serious about their message. For example, women are perfectly capable of enjoying beer, but most beer commercials only show us as a reward for men who drink that particular brand. The ads have generally become less overtly objectifying over the years, but the message is still the same.
3. It works both ways. Look at the marketing for Dove chocolate. I very much enjoy eating chocolate, but it is not the quasi-erotic moment experienced by the women in the ads. There’s something condescending about the suggestion that chocolate consumption is a sacred feminine act.
In short, it’s humor. Laugh at the fact products have to be marketed to assuage gender identity crises in the first place. Laugh at the men who can’t enjoy a diet soda unless an ad campaign says it’s okay. These sad souls were the target of the joke all along.
Now if you don’t mind, I’m off to have a tryst with a charming English candy bar.