Friday, November 12, 2010

The Dumbest Author? Mark Bauerlein and "The Dumbest Generation"

This post is going to be an in-depth, drawn-out, and often scathing (but fair) analysis and criticism of the book The Dumbest Generation by Mark Bauerlein. This could take a while, so grab yourself a warm beverage and settle in...

In his book The Dumbest Generation, Mark Bauerlein attempts to make a case for the declining intellect of younger generations and the ramifications of technological advancements. In short, Bauerlein argues that a "digital" generation is a "dumb" generation. As a teacher, I can tell you that there are, indeed, troubling gaps and trends in academics, work ethic, and critical thinking. What I can't do, however, is paint with so broad a stroke and claim that these trends are occurring because of our information-technology era. Unfortunately, neither can Mark Bauerlein... try as he might.

Bauerlein is a "professor of English at Emory University and has worked as a director of Research and Analysis at the National Endowment for the Arts." This does not bode well for indicating the level of empirical analysis skills required to get a PhD in literature nor for those same skills as a prerequsite for being a director of Research and Analysis – a position in which, presumably, you should know how to work with numbers, datas, and surveys. Unfortunately, Bauerlein's book and even the very studies he was involved in are rife with bias. It's a shame, because it dismisses his very real and important cause as nothing more than the ramblings of one of the antiquated literati.

Let's look at some of the many false assumptions, inadequately-supported conclusions, and examples of sheer bias riddling this book. "The 2002 Survey of Public Participation in the Arts (National Endowment for the Arts) charted several exposures over the preceding 12 months to record the bare presence of the fine arts in individuals' lives. Except for people 75 and older, 18- to 24-year-olds emerged with the lowest rates of all three groups" (p. 24). What, exactly, constitutes "fine arts"? Well, the data used to support his claim come from these instances: attending a jazz or classical music performance; attending a ballet or play; going to a museum or gallery; playing a classical instrument or singing in a choir.

You might be able to see, right away, what kind of absurd bias is inherent in this "study." How very convenient that "fine arts" does not include any "popular" art... thus films, pop music, and photography cannot, by this definition, be "fine art." What, then, is it defined by? Is it based on antiquity of the medium? This could account for many of them, sure... but what about jazz? Didn't that emerge in the 20th century? Well, there goes that theory. In reality, there is nothing to account for what Bauerlein portrays as "fine art" here other than his own personal tastes. And this guy was a Director of Research and Analysis??

Let's consider some very interesting thoughts that blow holes all through Bauerlein's argument. What makes attending a jazz or classical performance any different from attending a rock concert? Because rock is popular and the others are "old"? Consider this: in the early 1900s, jazz was the "rock" music of its time – it was new, it was popular, and even somewhat rebellious. In the 18th and 19th centuries, "classical" music was de rigueur and was simply called "music" because it was what people played and listened to. So, since jazz and classical both started out as "pop" music, what makes them any different from rock, hip-hop, disco, or any other genre? Sheer bias.

What about playing a "classical" instrument? What, exactly, is a classical instrument, and what makes playing one superior to playing any other instrument? Does a drum count as a classical instrument? After all, there is often a percussion section in an orchestra and, in fact, drums are the oldest form of instrument known to man. Yet they are used in rock and hip-hop, so does that disqualify them? What about guitar? Clearly out of the question... right? Well, it's not so clear; what if a student is playing "classical" nylon-string guitar?

As for live performance: film always has been an extension of theatrical performance. So what makes a movie "worse" than a play? Is it because it is recorded on a medium instead of being immediate, transient, and fleeting? I don't see how that affects the art or the thinking and appreciation processes involved. Seeing "live" performances generally costs more money, so if that is the deciding factor, then the National Endowment for the Arts is showing elitism and prejudice against less-fortunate people who can't, unfortunately, attend live performances but can consume the art in a recorded format. And if it's not the fact that it is recorded that makes the difference, then what is it? What if kids watch a ballet that has been recorded... would that count, even though they aren't "attending" the performance?

You can see how patently absurd this survey and Bauerlein's conclusions drawn from it are. It makes him seem no better than a fascist Luddite. "Arts" are not the only area, however, where Bauerlein makes wild and erroneous pontifications. Being an English professor means, of course, that we must lament the decline of literature consumption.

This same SPPA, written by the NEA and conducted by the U.S. Census Bureau, is also used to compare literature reading to previous decades and draw some conclusions. "From 1982 to 2002, reading rates fell through the floor. The youngest cohort suffered the biggest drop, indicating something unusual happening to young adults and their relationship with books." On page 46, Bauerlein shares the resulting numbers with us in a table, broken down by age groups. This table represents the percentage of people in each age group who read for pleasure – meaning voluntary reading of literature (not non-fiction) not required for work or school. Of these age groups – 18-24, 25-34, 35-44, 45-54, 55-64, and 65-74 – every single one saw a decline in leisure reading from 1982 to 2002. In 1982, the reading rate of 65-74-year-olds was 47.2%, so, sure, it didn't decline much over 20 years – but it was never that high to begin with. Is that something to celebrate? Should we be calling them the "dumbest generation"?

In 2002, 42.8% of 18-24-year-olds counted as "literary readers"; the oldest group ticked only slightly higher, at 45.3%. In fact, the highest-scoring age group (45-54-year-olds) tallied only 51.6%. The table makes it appear that this is only a 3-point decline from 1982, when the percentage for that group was 54.9%. However, you have to think about this logically: the people in the 45-54-year-old group in 2002 were actually in the 25-34-year-old group in 1982. So how much did literary reading decline for that cohort? It went from 62.1% to 51.6% – a decline of 10.5 percentage points.
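For anyone who wants the arithmetic spelled out, here is a minimal sketch in Python – my own illustration, not anything from the book or the survey – using only the percentages quoted above:

    # Cohort comparison sketch, using only the rates cited in this post
    # (percent of each age group reading literature for pleasure).
    rate_1982 = {"25-34": 62.1, "45-54": 54.9}
    rate_2002 = {"45-54": 51.6}

    # The naive, same-bracket comparison the table invites:
    naive_drop = rate_1982["45-54"] - rate_2002["45-54"]
    print(f"Same-bracket drop for 45-54: {naive_drop:.1f} points")  # 3.3

    # The honest comparison: people aged 25-34 in 1982 are 45-54 by 2002.
    cohort_drop = rate_1982["25-34"] - rate_2002["45-54"]
    print(f"Cohort drop, 25-34 (1982) -> 45-54 (2002): {cohort_drop:.1f} points")  # 10.5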

So if even the "most literary" of the bunch are seeing a 10-point decline, what does that say about our society and the causes at hand? Bauerlein jumps to a conclusion throughout the book that it must be because of digital technologies and media. Well, maybe he's onto something... after all, these reading rates only dropped an average of 3 points from 1982-1992, but then plummeted 10 points from 1992 to 2002. What happened?

Well, for one thing, the World Wide Web emerged in the early 1990s. Ironically, though, the Web has (and especially had in the 1990s) an enormous amount of text. To claim the Web distracts from reading is counterintuitive, to say the least: the Web requires literacy and reading (especially in the days of 28.8k modems and no streaming music, video, or animation).

However, there's no denying that personal computing power has seen an amazing rise in multimedia possibilities, both for consumption and – perhaps more importantly – for production. Shrinking technology costs have made video and audio production extremely efficient and affordable. It's no wonder, then, that some may find it much more efficient to communicate an idea using an audio-visual medium instead of printed text. Is this such a bad thing? There are many instances when a picture truly is "worth a thousand words" – and motion media runs at 24 or more pictures per second, so... do the math. Visual media can better illustrate sequential occurrences, cause and effect, or hard-to-imagine concepts for subjects as broad as history, geography, math, and the sciences. Sure, words have their place and always will; they are symbolic representations of ideas that can be too complex or vague to represent visually. The fact remains that all groups across the board are spending less time on leisure reading... there is a reason for that, and it's not necessarily a bad one.

What is especially unfortunate, however, is that Bauerlein jumps to the conclusion that technology must be the causal variable... as if that's the only variable at play! What about the fact that the past 20 years have seen an increase in school homework, extracurricular participation, and average workweek hours? Couldn't these factors impact the amount of free time available for leisure reading?

"Technophiles celebrate the ease and access, and teens and young adults derive a lesson from them. If you can call up a name, a date, an event, a text, a definition, a calculation, or a law in five seconds of key-punching, why fill your mind with them?" (p. 94) This question is raised in the snarky, sarcastic, sanctimonious tone Bauerlein likes to arrogantly pepper throughout his book. The question is meant to be rhetorical, the answer self-evident. Yet I would argue that this is a valid question, and should be asked in earnest. Why, indeed, should our thinking process regress or stagnate to a mode of mass acquisition and memorization, when in fact the tools are available to "outsource" those cognitively-intensive tasks to a computer, network, or communications system? Let's consider this logically. First of all, why were dictionaries and encyclopedias created? Were they invented to teach us the meaning of words that we are meant to forever-after record in our minds for easy recall whenever necessary? No. If they were, why would they be found in the "Reference" section of the library? The answer is simple: because they were designed for us to refer to them as necessary, when there is a specific piece of information we need that is not stored and readily available for instant recall in our minds. According to cognitive science, this makes sense; although the mind can store vast amounts of information, it prioritizes its storage (and connections required to recall the knowledge) based on frequency and recency of use. Therefore, we are only meant to memorize those items we will be using on a frequent basis in our life – and it makes sense to keep everything else in an external reference for which we memorize the knowledge of how to locate the information when we need it. This exactly parallels the knowledge network known as the Internet and World Wide Web. But let me guess, Mark... encyclopedias are "good" (because they are made of paper?) and online information is "bad" (because it is electronic?).

I hate to break it to you, but the answer to the question is self-evident. Why fill your mind with knowledge you (a) may never use again, and (b) could easily and instantly locate if you do happen to need it? This paradigm shift in thought processing has already occurred... and it happens to be an advantageous approach. When I applied to computer programming jobs 10 years ago, the interviews would involve barrages of "pop quizzes" and scenario-based tests. However, the examiners were not testing whether I had memorized syntax rules like the need to put a semicolon at the end of each statement of C++ code. No, the interviewers couldn't have cared less; in fact, my answers were given in "pseudocode" – in other words, I would describe a process, an algorithm that would need to be used given a certain set of circumstances. This is what they cared about, and it makes sense... after all, every programmer I know has a stack of reference materials ready at hand as they work. Yes, even computer wizards use textbooks and occasionally have to look up how to make a certain something happen in the computer. I've even seen this in play when, during visits to the doctor's office and ER, doctors (of all people!) referred to computer databases of symptoms in order to cross-reference or double-check their suspicions of illnesses based on their (traditionally memorized) knowledge.
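To make the distinction concrete, here is a hypothetical illustration of the kind of "pseudocode-first" answer I'm describing (the problem and the code are my own example, not taken from any actual interview). The pseudocode is the part the interviewer cares about; the exact syntax beneath it is the part any reference book or quick search can supply:

    # Pseudocode (the thinking the interviewer is actually testing):
    #   1. Walk through the list once, remembering every value seen so far.
    #   2. If the current value has already been seen, it's a duplicate – return it.
    #   3. Remembering values in a hash set keeps each "have I seen this?" check fast.
    #
    # Concrete syntax (the part you can always look up):
    def first_duplicate(values):
        seen = set()
        for v in values:
            if v in seen:
                return v
            seen.add(v)
        return None

    print(first_duplicate([3, 1, 4, 1, 5]))  # -> 1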

Again, I ask... what is wrong with that? This thought process allows us to put energy into memorizing things we know we will need on a regular basis – driving a car, cooking food, perhaps even quickly locating information on the Internet or efficiently using our cellphones (yes, Dr. Bauerlein, cellphones are actually a requisite part of many professions outside the ivory tower). Let's play devil's advocate for a second here and say, for the sake of argument, that there is a danger in outsourcing our knowledge because we will then become dependent on a resource that may suddenly not be available when we need it. What if you've changed your mode of operation to simply look something up on the Web the second you need it – and suddenly the Web is unavailable? In fact, this is not hard to imagine... I recall some scenarios while traveling when I had become so accustomed to using MapQuest that, when plans changed or I found myself in an unfamiliar location with no Internet access, it would have been very nice to have a good old-fashioned map. (Then again, a map is an external reference – it would be absurd to memorize maps, especially for trips that are not likely to be repeated. But I digress...) Well, consider this: the information is becoming more readily accessible as time goes on... not less so.

I suppose if an apocalyptic catastrophe strikes, we may suddenly find ourselves plunged into a world devoid of the energy resources needed to fuel our modern technology. But I think we can safely assume that, if that day comes, we will have much more pressing things to memorize than Shakespeare or Planck's constant. I imagine we'd more likely need to stay focused on seeking shelter from lightning storms and hunting down rodents or other meager sources of protein. So the argument that we need to memorize politics, history, fine art, literature, erudite vocabulary, specialized sciences, or 99% of other academic information is simply moot.

Then there is the chapter in which the author laments the lies inherent in a visual advertisement for an Apple laptop, in which the laptop is compared to a pile of books. "In fact, it proclaimed, the laptop has rendered all those books pictured in the display window obsolete. Who needs them? The computer is now the only book you need. Many books and journals you can find online, especially now that the Google project is well under way. And soon enough, e-publishers predict, all books will appear in digital format, making hardcover and paperback books something of a curiosity, like card catalogs and eight-track tapes. Already, with the Internet, we have enough handy information to dispense with encyclopedias, almanacs, textbooks, government record books, and other informational texts. That was the argument behind the declaration, and with so many techno-marvels inside, one could understand a 15-year-old absorbing the slogan as patently true." (p. 99) "For the average teenager looking for the latest iTunes, the titles in the pictures meant nothing, and the display assured them that they do, indeed, mean nothing." (p. 100) "To replace the book with the screen is to remove a 2,500-year-old cornerstone of civilization and insert an altogether dissimilar building block." (p. 101) Here I feel the need to resort to a banal colloquialism: Mr. Bauerlein, what the fuck are you talking about? You yourself concede that "users can always gear down to reading page by page in PDF format" (p. 103), yet you offer no evidence-backed rebuttal of this point; instead you refer back to techno-evangelists Johnson and Will Richardson, who use the term "the Read/Write Web."

From here we shotgun into a rebuttal backed not by facts or evidence but by fictional anecdotes (more evidence of a literature professor at work), in which Bauerlein invokes the visions of Ray Bradbury and Aldous Huxley – authors who both wrote about bleak futures dehumanized by infatuation with technology. The logician and empiricist in me says... "HUH?" I love Fahrenheit 451 and Brave New World as much as the next guy... but how did we get from point A to point B? I understand that we are supposed to recall Neil Postman comparing our modern "reality television" entertainment with the devices in these bleak fictional futures, but how does that rebut the statement that you can, in fact, use a laptop computer the same way you read a book, via PDFs and eBook software?

The plain answer is: it doesn't.

And then there is Bauerlein's "debunking" of the idea that "real," meaningful reading happens online. His conclusions are based mainly on studies of web use conducted by the Nielsen Norman Group, which "has no stake in grand pronouncements about the Digital Age" (p. 142) because it is a private firm that researches user interactions for the sole purpose of improving the business/commercial viability of websites. This research has led to such shocking revelations that Bauerlein must share them with us in boldface type:

  • only 19% of email-newsletter subscribers fully read the newsletters
  • people scan headlines and blurbs in newsfeeds
  • readers choose the content they are interested in and focus only on that particular news
  • sites are advised to use a sixth-grade reading level on the main page and only an eighth-grade level for internal articles
All of this sounds pretty dismal and appalling, right? Until you notice the hilarious fact that, despite being a professor of English at Emory University, Mark Bauerlein doesn't seem to realize that every single one of the above findings also applies directly to print newspapers!

Yes, newspapers are generally written at an eighth-grade level, so as to ensure accessibility of information for the majority of the population. Very few people read an entire newspaper -- in fact, there is a reason why the newspaper is broken down into different sections by topic and interest. There are many people who might pick up a paper just to read the "Sports" section or the "Finance" news. There are many more who will grab a paper simply because they scanned the headlines on the front page and found something of interest to them (this is, after all, why multiple headlines and story segments are placed on the front page to begin with. Because it's what people naturally do -- and have done for more than a century.) He laments the fact that people only look for the minimal amount of reading to get pertinent facts and information -- but isn't this what the news is for? The entire structure of journalistic writing is designed to cater to this style and purpose of reading: you start with an eye-catching headline (which, in itself, contains most of the important facts), then you follow up with an introductory paragraph which does not, unlike other essays and treatises, force you to read further to glean details. In news writing, all of the most pertinent facts and details -- the "5 W's" -- are answered right from the start. Why? Because that's what most people want to know. Many readers stop reading after that first paragraph or two. The remainder of the article will contain the extra details and specifics to elaborate on the information, for those people still interested in knowing more.

But hey... why would I expect an English professor to know this and take it into account, and realize that the habits described on the web almost perfectly mimic the habits practiced in print?

All of this is not to say that I disagree with everything revealed or lamented in The Dumbest Generation. For example, on pages 77-87, Bauerlein juxtaposes the results of a Kaiser report entitled Generation M: Media in the Lives of 8-18 Year-Olds with the pro-pop-culture affirmations of Steven Johnson's Everything Bad Is Good for You: How Today's Popular Culture Is Actually Making Us Smarter. Setting aside the content of modern media (which is conceded to be "coarse and inane"), Johnson proclaims that modern entertainment is more sophisticated in its format and plot structures – that, as Bauerlein puts it, it builds "aptitudes for spatialization, pattern recognition, and problem solving, virtues that reflect twenty-first-century demands better, in fact, than do traditional knowledge and reading skills." Well, to be sure, spatial-temporal and problem-solving abilities are important in the new millennium... but that doesn't mean they are the only important pieces of the puzzle. Bauerlein notes – and rightly so – that, given the Generation M study showing kids 8-18 spending an average of more than 6 hours per day consuming media, and given the claim that modern pop media inculcates "intellectual or cognitive virtues," shouldn't kids' cognitive capacities be growing by leaps and bounds? Yet standardized test scores falter and stagnate, and in some academic areas (sciences, social studies, politics), knowledge and interest seem to be declining altogether. So where is the disconnect?

In his own way, Bauerlein points to a problem of perception – that sometimes what is being assessed is too narrow in scope to be a valid marker of overall cognitive prowess or academic proficiency. Bauerlein's example is the "Flynn Effect" – essentially, the general IQ test has had to be re-normed on a regular basis since its inception, because performance on it has continued to improve. One could argue that, early on, this was due to improvements in medicine, nutrition, and education. But the past 50 years have not seen major strides in these areas... and yet IQ scores have consistently climbed about 3 points per generation. Some would like to point to modern media as a catalyst for this... but if people's general intelligence is really higher than it used to be, then why are students faltering? Bauerlein draws the likely conclusion that people have become more adept at spatial-temporal thinking due to increased exposure to spatial-temporal media (p. 93). This seems logical, since the IQ test does, indeed, use visual, spatial, and sequential-logic questions as its basis. Moreover, he points out that such skills do not transfer into "general" intelligence or learning. Still, Bauerlein unknowingly makes a bit of a concession here: perhaps the new media do improve this one specific type of cognition. Unfortunately, I have to agree that this type of thinking, alone, is not enough to thrive. The real argument, then, should not be that screen time is bad, but that it should be moderated in order to allow time for other activities (like reading) that will likely stimulate other cognitive modes, such as verbal or interpersonal proficiency.

More to the point, I agree nearly wholeheartedly with the focus of the chapter "Online Learning and Non-Learning," in which Bauerlein points out that "for all their adroitness with technology, students don't seek, find, and manage information very well... They're comfortable with the tools, but indiscriminate in their applications. ETS terms the missing aptitude Information and Communications Technology (ICT) literacy, and it includes the ability to conduct research, evaluate sources, communicate data, and understand ethical/legal issues of access and use" (p. 113). I think this is really the core and crux of the "problem" Mark Bauerlein is grasping at. In actuality, he tries to make a case that technology and media, in general, cause kids to become less critical, less informed, illiterate... dumb. However, as already pointed out, this is a weak argument because the amount of information and text involved with computers and the Web is staggering. The real issue is the assumption people make that, being born "digital natives," the "millennials" inherently know how to use technology. This unfortunate assumption arises from the fact that, being immersed in a digital-media society, kids today do seem more comfortable and adept at approaching and operating various technologies. However, using a technology and using it well or productively are two entirely different matters. Knowing the nature of kids (and humans in general), it's not surprising that, left to their own devices, they choose the path of least resistance – i.e., if they have a choice between online research and reading or a non-educational video game, they will choose the latter. If they have a choice between writing a meaningful, well-structured blog post or building credibility in their social circles via 140-character tweets, they will choose the latter. If it's a choice between streaming Carl Sagan's "Cosmos" or watching a humorous-but-inane 2-minute YouTube video, which one are they going to choose?

In an ETS study of ICT skills in 6,300 students, Bauerlein notes, "Few test takers demonstrated key ICT literacy skills" (www.ets.org/itcliteracy). He goes on, in this chapter, to pick apart technology initiatives and show that they don't work to improve performance. First, he points out that "Taking the enthusiasm of 18-year-olds as a measure of educational benefits may sound like a dicey way to justify such a sweeping change in classroom practices, but many studies out to evaluate e-learning do precisely that" (p. 119). In my research, I have found this to be true and, like Bauerlein, I find it ridiculous and lamentable. Satisfaction surveys should be nearly irrelevant to determining the efficacy of educational technology, and here's why: the purpose of education is not to make students feel happy; the purpose of education is to increase knowledge and learning. Period.

Now, I don't think anyone would argue that motivation can play a crucial role in whether someone learns or not; if someone is very resistant to learning, it likely won't happen. But this does not mean that motivation equals learning. If knowledge is still not being acquired despite significant motivation, that is not good news! It is not something to celebrate, as many edtech articles do. In fact, it points to a serious disconnect; if students are so happy and motivated, yet are learning less and less, then it raises the question: what, exactly, are they motivated to do?

On pages 120-124, Bauerlein lists a series of cherry-picked studies indicating no significant gains from technology acquisitions (computer hardware and software), despite the often pricey investments involved. The irony here is that he segues from this into an analysis of the benefits of reading print media outside of school, to showcase why reading at home is important. To say this feels like a non sequitur is an understatement. After all, didn't we just establish that it's not the technology that is causing the problem, but rather the students' lack of skill and awareness in how to use the technology in meaningful, appropriate ways?

Some good points are made regarding the value of language use outside of school – that language skills and vocabulary are improved through constant use, and especially through leisure reading (p. 129). However, this doesn't explain why, after outfitting students with laptops, a New York state school board stated, "After seven years, there was literally no evidence it had any impact on student achievement – none" (p. 124).

The explanation for this (much to Bauerlein's chagrin, I'm sure) has nothing to do with books and everything to do with education. The fact that students are missing "key ICT skills" shows the real problem: students are not being taught how to use technology in creative, productive, useful, knowledge-building ways. As Bauerlein points out (in reference to language and vocabulary acquisition), accessing and using information at home is a huge component – and I believe it is critical for parents to treat computer literacy the same way. Parents have a responsibility to monitor, guide, and tutor their children in responsible and practical uses of technology. Sadly, many (if not most) parents are unable to do this, being untrained and uninformed themselves. In other cases, parents may "know" technology but have adopted the same (highly-marketed) attitude that technology is solely for entertainment and leisure – and their transgressions are often just as egregious as their children's.

It's no wonder, then, that – despite huge potential for expanding and improving knowledge and understanding across a limitless swath of topics (including literacy) – the technology is not living up to its promises. However, this is not a product of the technology itself, but of the neglect we show toward dutifully training our educators and, subsequently, educating our students in the proper, effective, purposeful use of technology. This, to me, does not sound like a justifiable reason to slash funding for technology initiatives. On the contrary, it sounds like a call to increase funding, albeit perhaps with a shift in focus from the medium (technology) to the message (how to use technology). In short, knowledge and the ability to share it with others are more important, valuable, and powerful than any device out there.

Perhaps Bauerlein should avoid anything approaching math, science, or empiricism and stick to English-lit-style "analysis," wherein one can ponder the intricacies of emotion or character development and not actually have to worry about making erroneous statements or conclusions. I scored a perfect 6 on the analytical writing section of the GRE (and – what a coincidence! – I've been using computers gratuitously since I was a child), and I can safely say that, sadly, with this level of illogical rhetoric I have my doubts whether this Emory English professor would even pass the test. At the very least, The Dumbest Generation fails.

Monday, November 8, 2010

CUE, Hall Davidson, and the Cult of "Isn't it cool??"

WARNING: This post will be longer (and more substantial) than a Tweet. If your attention span is not long enough, might I suggest you watch videos of Fred on YouTube?


Over the weekend I attended my first education technology conference in a while: it was the Fall 2010 CUE (Computer-Using Educators) conference -- "Soaring to New Heights" -- held in Napa Valley.

Yes, despite the fact that I have more on my plate than ever before (teaching computers to 400+ students in grades K-6, coordinating two different gifted/advanced learner programs at the school, and working on the final thesis for my MS in Educational Technology -- not to mention recovering from a nasty bout of pneumonia), I voluntarily chose to spend half of my weekend at a professional conference on the use of computer technology in education. The things I do out of passion.

Anyway, it's been a few years since I attended an educational technology conference like this (the last one being a three-day conference about five years ago at the Baltimore Convention Center in Maryland), and I was reminded of something that really, truly bothers me about conventions like this: they are 50% sales pitch, 40% pep rally, and 10% useful information or tools. In the words of that great paragon of fast food, Wendy's: "Where's the beef??"

I have no problem, inherently, with sales or pep talks. They are both necessary sometimes. Without marketing, it's likely we wouldn't know about some of the truly wonderful tools and technology that exist for education. And without an occasional morale boost -- well, we'd all probably kill ourselves.

But what this CUE conference really drove home was the dangerous combination of glitz and hype that surrounds what I'm going to call "the cult of technology" (if you've ever seen people at an Apple store, you'll know what I'm talking about).

I'm going to use Hall Davidson's closing keynote speech from the CUE conference to illustrate this point. To the man's credit, he came onto the stage with much energy, enthusiasm, and gusto (following a brief appearance by the newly-elected California State Superintendent of Public Instruction). He has a vibrant personality, and that's all well and good for a keynote speaker / pep rally, right? And he didn't just give a speech -- he made use of his laptop, the projector, and the wireless internet to make it multimedia, to dive right in and show myriad tools in action. That's cool, right?

This is the part where I always have to groan, where the use of technology doesn't quite "float my boat" and make me giddy like the drug it appears to be for the iPad-wielding peers around me. Here's an example: Hall Davidson showcased how you can go to Google Images, do a search, and the pictures will show up. But if you really want to make it better, you can go get a plugin -- oh, what's it called? He couldn't remember, so here was a "learning moment" to showcase how "cool" yet another new technology is: Twitter. To drive the point home, he showed how he had popped on and Tweeted that he couldn't remember the name of the plugin that would make Google image searches show up in a format like an Apple gallery view. Somebody on Twitter (with clearly nothing better to do with his time) quickly responded that the app was called "CoolIris".

So Davidson jumps back to Google images in the web browser and explains how you can download and install the CoolIris plugin and it will make the Google images show up in a shiny Apple-like interface in which they appear to be all lined up on a 2D plane (like a wall) in a 3D environment.

"Isn't that cool??" Mr. Davidson prompts. Nods and applause from the iSheep.

Pardon me if I don't jump on the bandwagon. Inadvertently, this "let's get excited about technology" speech pointed out exactly what I'm not excited about: people who think something is great just because it is a new, shiny novelty. People who covet style over substance -- even if it ends up wasting resources (like time or money).

So, let me get this straight: you used Google image search and all the images you wanted to find showed up. You could scroll down the page and find more as necessary. In other words, Google image search functions just fine and lets you do the task you set out to do. But instead of just being practical and getting the job done, you decided it was "cooler" to waste time Tweeting about it, getting a response, then searching for an app and installing it (bogging down your browser with wasted processing power and RAM)... all for ZERO benefit to effectiveness or efficiency?

You can call me a Debbie Downer, but no... no, I do not think that's "cool". See, to me what makes technology cool is that it allows us to do things faster, do them better, or do things that would otherwise be impossible; this does none of the above. It merely illustrates the inanity of our modern-day Cult of Technology and how willing its members are to trump substance with style. This app (and others like it) may be free, but the time wasted -- especially when we're talking about our limited instructional minutes in education -- is anything but.

Oh, but was that example just a poor one, a fluke? Well, Mr. Davidson followed it up by showing how you can create a Wordle (which could, in and of itself, be argued to be another example of replacing substance with style; however, I can understand some of the ostensible benefits of this one for education -- raising awareness of key words, front-loading vocabulary, etc.). Okay, so in case you don't know: a Wordle is a computer-generated, stylistic visual representation of some words or text. In and of itself, it doesn't serve much purpose (other than -- you guessed it -- "cool!"), but it doesn't take much time, and it could build some awareness of words or their relationships, or maybe give students some motivation to explore synonyms, antonyms, etc. It's not amazing, but it could have its uses. But... useful?? That's not good enough for the Cult of Technology -- we need it to be flashy... cool! Hall Davidson shows you how: create a Wordle, then use PhotoBooth software (Mac required... we all know PCs are just... practical and... uncool). What you can do is create a chromakey effect that will magically erase the background and replace it with your Wordle! And then you can... well, you can video-record yourself with a Wordle image in the background, which is very useful for, umm... uhh... it serves a practical application in lessons about... chromakey??

Oh, who cares! It's COOL!


(Personally, I would have found it a lot cooler if the educators in the room had been able to correctly spell their favorite beverages, which they texted into a central repository -- submissions that included classic wines and beers like "mascado" and "heifenvizen". But beggars can't be choosers, I suppose. We may no longer know how to spell, but at least we can take those misspelled words and make them glitzy, shiny, and cool through the newest -- and therefore the best, of course -- iPad apps available!)