Tuesday, November 15, 2011

iPad Apps vs Flash Files

There are currently about 96 million .swf Flash files on the internet. There are about 250,000 iPad apps.

At the current rate of app development, it will take roughly 500 years for iPad apps to catch up to the number of SWF files on the web -- you know, those millions of websites (most of them free) that won't work on iPads.

Monday, April 11, 2011

Critical Reading and Evaluation on the Internet

Ever since I first became a teacher (8 years ago, after being a computer programmer), I have been saying the same types of things as this quote by Michelle Herczog, pointed out on the Teachinghistory.org website: "Education as a paradigm needs to change from acquiring knowledge, to finding, understanding, and using knowledge to become responsible, engaged members of the human race capable of solving complex problems in a complex world."

This is easier said than done. There are still a lot of old-school teachers (and even standards) requiring kids to use dictionaries, and even when they ask students to look up information "on the web", they use the same guidelines as finding information in a library. This is a big no-no, because it doesn't take into account that anybody can publish anything on the web; at least with print materials, there is some level of editing and filtration that goes on (which is not to say you shouldn't read newspapers, magazines, etc. with a critical eye as well...)

This article about learning history content from the Internet has links to some good resources for emphasizing that skill: Is the Internet a Reliable Source for History Content?

Flesch-Kincaid Reading Level and Calculators

So, here's something interesting: I am developing a program to assist students in reading, and as such I am looking into automated systems for determining the readability/difficulty of any given text. One popular automated algorithm is the Flesch-Kincaid Grade Level, which gives you a supposed grade level for a text based on word length and sentence length. Seems simple enough, right? In fact, the formula is here:

Flesch-Kincaid Grade Level = 0.39 × (total words ÷ total sentences) + 11.8 × (total syllables ÷ total words) - 15.59

Why, then, does every Flesch-Kincaid calculator I use return different numbers... sometimes drastically different ones?

Microsoft Word: Flesch Reading Ease - 51.9, Flesch-Kincaid Grade Level - 8.8
Joe's Web Tools: Flesch Reading Ease - 59.6, Flesch-Kincaid Grade Level - 7.8
Standards-schmandards Readability Index Calculator: Flesch Reading Ease - 37, Flesch-Kincaid Grade Level - 11 (I find this highly unlikely)
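As far as I can tell, the culprit is that the formula's inputs are not as objective as they look: every calculator has to decide for itself what counts as a sentence, a word, and (especially) a syllable, and those heuristics differ from tool to tool. Here is a minimal Python sketch of a Flesch-Kincaid calculator -- with a deliberately naive vowel-group syllable counter -- just to show where the wiggle room comes in; this is an illustration of the general approach, not the code any of the tools above actually use.

```python
import re

def count_syllables(word):
    # Deliberately naive heuristic: count runs of consecutive vowels and treat
    # a trailing silent 'e' as non-syllabic. Real calculators use different
    # heuristics (or pronunciation dictionaries), which is one big reason
    # their scores disagree.
    word = word.lower()
    if word.endswith("e") and len(word) > 2:
        word = word[:-1]
    return max(1, len(re.findall(r"[aeiouy]+", word)))

def readability(text):
    # What counts as a "sentence" or a "word" is another judgment call.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    words_per_sentence = len(words) / max(1, len(sentences))
    syllables_per_word = sum(count_syllables(w) for w in words) / max(1, len(words))
    grade = 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59
    ease = 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word
    return round(grade, 1), round(ease, 1)

print(readability("This is a sample passage. It has two short sentences."))
```

Because the syllables-per-word term is multiplied by 11.8, two implementations that disagree by even one syllable per dozen words will differ by roughly a full grade level -- which looks a lot like the spread in the numbers above.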

Thursday, February 10, 2011

iPad: A POOR CHOICE for Education -- Myths and Facts

I am getting really, truly fed up with having to educate people who like to spread propaganda and sales pitches about Apple products. Are the masses really that easily brainwashed?

Here is one that I hear over and over again:

MYTH: "The higher cost of Apple products is worth it because they are more reliable and last longer."

FACT: Macbooks have been on the market for about 4 years, so the longest you can say they last is 4 years -- you can't possibly know beyond that point. Likewise, it has been noted that iPods (the technology on which iPads are built) last about 4 years before dying, usually due to the battery -- which can be replaced, but of course you have to let Big Brother Apple do it for you, to the tune of $50 for an iPod or $100 for an iPad. Meanwhile, I have a still-functioning IBM ThinkPad that is 12 years old (although the screen and battery have died, so it has to be used like a desktop), as well as a Toshiba tablet laptop that is about 7 years old with no problems whatsoever. My iPod, however, just died after exactly 4 years of ownership.

The durability argument for Apple products just doesn't ring true, but what about "reliability" -- the claim that you have fewer technical problems? Also false. A Consumer Reports survey of 43,000 laptop users found that Apple was only the third most reliable brand (19% of owners reported serious technical problems), while Windows-based PC laptops from Toshiba and Acer tied for the fewest reported technical problems (17%).

Misconceptions like the popular myth above also spread their seed when it comes to the iPad. I find it unfathomable and unforgivable that people in positions of authority in public education somehow think iPads are the best investment for students. The fact of the matter is that these people are wasting taxpayer dollars and unnecessarily stressing already-thin budgets. And for what? Not even a defensible improvement to education.

MYTH: The iPad is a computer.

FACT: The iPad is NOT a computer. For one thing, the iPad is not a stand-alone device. You have to plug it into a real computer in order for it to work. This makes the iPad a peripheral device -- not a computer. Just like the iPod. For another, it cannot do the same things a computer does. Even if you wanted to argue that it's simply a web-browsing device, it fails at that task: the iPad does not run common multimedia engines (Flash, Java, Shockwave, etc.) that are already installed on 99% of computers and used by millions of websites. How can you call something a web browser when it breaks half the usability of the web?? As one blog so succinctly shows with screenshots of multiple inaccessible websites on the iPad: "Millions of websites use Flash. Get used to the blue legos."

MYTH: You don't need those multimedia tools like Flash anyway, because you can get any of the same things as an app, and HTML 5 is going to make Flash obsolete.

FACT: There are millions of rich internet applications that rely on Flash. Many of them are free. Companies are not going to invest the time and money to create Apple App versions when the vast majority of users out there are still not using iPads. And HTML 5 is not a replacement for Flash. It could make the need for Flash video obsolete, but FLV, or Flash Video, is a relatively new development and use of Flash -- it's not even what Flash was designed for. Flash was designed for scalable vector animations and interactivity. HTML 5 will not support the vector graphics and level of interactivity that Flash uses for things like games, simulations, and business applications.

MYTH: iPads are the most affordable multi-purpose solution.

FACT: iPads are a more affordable multi-purpose solution than Macbooks. Then again, Macbooks are actual computers that can do things like run on their own without being plugged into a computer, and can access Flash-enhanced websites. iPads are also more affordable than PC laptops, though those in turn are more affordable than Macbooks and can do the same things (or more, since there is more software supported on Windows). iPads are more expensive than eReaders, so if all you want is an eReader, you'd save a lot of money getting a Nook or Kindle... but to be honest, education is about more than just books, so why would you waste the money on something that isn't a multipurpose device?

The actual most affordable multi-purpose solution: netbooks. Netbooks are smaller laptops which are slightly less powerful but much more portable, affordable, and efficient. In other words, they are like iPads except they are actually computers. Like iPads, their batteries last for 8 hours of continuous use. Like iPads, they do not have a built-in CD/DVD drive; unlike iPads, you can plug in an external one if you really need it. Unlike iPads, they have USB ports for attaching peripheral devices. Unlike iPads, they have a built-in keyboard (it's pretty hilarious that people said "It's better not to have a keyboard" and then, lo and behold, Apple produced one for the iPad anyway because, guess what? It's still the most efficient means of getting text into a device, and people tend to like to be able to do that efficiently). Unlike iPads, they can use the entire extent of web pages and rich internet applications -- many of which do the same things as Apple Apps, except for free.

I was pretty skeptical of netbooks when I first saw them -- they seemed too small to feel natural to work on, and I doubted what kind of power/performance you could get for that price. But I was wrong. The Acer netbooks my students use in the classroom run not only the Internet, but also multimedia and 3D applications -- Photoshop, Google Earth, Google SketchUp CAD software -- and even 3D games with no problem. And, as I mentioned, Acer scored higher for technical reliability than Apple in the Consumer Reports survey. So what does this mini-laptop cost? About $250 each... HALF the price of an iPad. And when the battery eventually dies, you can buy your own replacement and pop it in for $30. Apple charges you $100 to replace the iPad battery when it dies (which it will -- this is true for any electronic device).

So what do you miss out on? A touch-screen and motion/angle sensors. But I have yet to hear anyone explain to me why these are necessary -- or even beneficial -- in education. There are a few musical and artistic apps that can benefit from them, but for online research, document creation, collaboration/communication, and most educational games... it's simply not necessary. (And you can even get tablet netbooks which allow you to use a stylus on the touch-screen monitor, for still less than an iPad.)


MYTH: "iPads are the way of the future, so even though the infrastructure is not really there for them at this time, it will be soon because all of that Flash, Java, and other interactive multimedia software will be made iPad-compatible."

FACT: We don't have a crystal ball and don't know what the future holds, so we have to focus on the present and the immediate future -- things that we know are in the pipeline. You cannot justify purchasing a device that doesn't yet have the universal infrastructure and support it needs when the thing has been out for less than a year. Ever heard of Betamax? HD-DVD? They, too, claimed to be "the wave of the future." How about Tesla? A very cool, all-electric car, but one missing the infrastructure that would allow it to be charged or battery-swapped at stations along the road, rendering it impossible to travel long distances.

What we do know is this: the iPhone and iPod Touch have been out for years now, and they still do not support Flash, yet multimedia/RIA developers are still choosing to develop in Flash... so it doesn't look like the near future is going to look a whole lot different from the present. Don't gamble taxpayer dollars on the bet that it will, just so you can boast of having the newest flashy gizmo.

Friday, November 12, 2010

The Dumbest Author? Mark Bauerlein and "The Dumbest Generation"

This post is going to be an in-depth, drawn-out, and often scathing (but fair) analysis and criticism of the book The Dumbest Generation by Mark Bauerlein. This could take a while, so grab yourself a warm beverage and settle in...

In his book The Dumbest Generation, Mark Bauerlein attempts to make a case for the declining intellect of younger generations and the ramifications of technological advancements. In short, Bauerlein argues that a "digital" generation is a "dumb" generation. As a teacher, I can tell you that there are, indeed, troubling gaps and trends in academics, work ethic, and critical thinking. What I can't do, however, is paint with such a broad stroke and claim that these trends are occurring because of our information-technology era. Unfortunately, neither can Mark Bauerlein... try as he might.

Bauerlein is a "professor of English at Emory University and has worked as a director of Research and Analysis at the National Endowment for the Arts." This does not bode well for indicating the level of empirical analysis skills required to get a PhD in literature nor for those same skills as a prerequsite for being a director of Research and Analysis – a position in which, presumably, you should know how to work with numbers, datas, and surveys. Unfortunately, Bauerlein's book and even the very studies he was involved in are rife with bias. It's a shame, because it dismisses his very real and important cause as nothing more than the ramblings of one of the antiquated literati.

Let's look at some of the many false assumptions, inadequately-supported conclusions, and examples of sheer bias riddling this book. "The 2002 Survey of Public Participation in the Arts (National Endowment for the Arts) charted several exposures over the preceding 12 months to record the bare presence of the fine arts in individuals' lives. Except for people 75 and older, 18- to 24-year-olds emerged with the lowest rates of all three groups" (p. 24). What, exactly, constitutes "fine arts"? Well, the data used to support his claim come from these instances: attending a jazz or classical music performance; attending a ballet or play; going to a museum or gallery; playing a classical instrument or singing in a choir.

You might be able to see, right away, what kind of absurd bias is inherent in this "study." How very convenient that "fine arts" does not include any "popular" art... thus films, pop music, and photography cannot, by this definition, be "fine art." What, then, is it defined by? Is it based on antiquity of the medium? This could account for many of them, sure... but what about jazz? Didn't that emerge in the 20th century? Well, there goes that theory. In reality, there is nothing to account for what Bauerlein portrays as "fine art" here other than his own personal tastes. And this guy was a Director of Research and Analysis??

Let's consider some very interesting thoughts that blow holes all through Bauerlein's argument. What makes attending a jazz or classical performance any different than attending a rock concert? Because rock is popular and the others are "old"? Consider this: in the early 1900s, jazz was the "rock" music of its time – it was new, it was popular, and even somewhat rebellious. In the 18th and 19th centuries, "classical" music was de rigueur and was simply called "music" because it was what people played and listened to. So, since jazz and classical both started out as "pop" music, what makes them any different from rock, hip-hop, disco, or any other genre? Sheer bias.

What about playing a "classical" instrument? What, exactly, is a classical instrument, and what makes playing one superior to playing any other instrument? Does a drum count as a classical instrument? After all, there is often a percussion section in an orchestra and, in fact, drums are the oldest form of instrument known to man. Yet they are used in rock and hip-hop, so does that disqualify them? What about guitar? Clearly out of the question... right? Well, it's not so clear; what if a student is playing "classical" nylon-string guitar?

As for live performance: film was and has always been an extension of theatrical performance. So what makes a movie "worse" than a play? Is it because it is recorded on a medium instead of being immediate, transient, and fleeting? I don't see how that affects the art or the thinking and appreciation processes involved. To see "live" performances generally costs more money, so if that is the factor then the National Endowment for the Arts is showing elitism and prejudice against less-fortunate people who can't, unfortunately, participate in live performances but can consume the art in a recorded format. And if it's not the fact that it is recorded that makes the difference, then what is it? What if kids watch a ballet that is tape-recorded... would that count, even though they aren't "attending" the performance?

You can see how patently absurd this survey and Bauerlein's conclusions drawn from it are. It makes him seem no better than a fascist Luddite. "Arts" are not the only area, however, where Bauerlein makes wild and erroneous pontifications. Being an English professor means, of course, that we must lament the decline of literature consumption.

This same SPPA written by the NEA and conducted by the U.S. Census Bureau is also used to compare literature reading to previous decades and draw some conclusions. "From 1982 to 2002, reading rates fell through the floor. The youngest cohort suffered the biggest drop, indicating something unusual happening to young adults and their relationship with books." On page 46, Bauerlein shares the resulting numbers with us in a table, broken down by age groups. This table represents the percentage of people in each age group who read for pleasure – this means voluntary reading of literature (not non-fiction) not required for work or school. Of these age groups – 18-24, 25-34, 35-44, 45-54, 55-64, and 65-74 – every single age group saw a decline in leisure reading from 1982 to 2002. In 1982, the reading rate of 65-74-year-olds was 47.2% so, sure, it didn't decline much over 20 years, but it was never that high to begin with. Is that something to celebrate? Should we be calling them the "dumbest generation"?

In 2002, 42.8% of 18-24-year-olds counted as "literary readers"; the oldest group ticked only slightly higher, at 45.3%. In fact, the highest-scoring age group (45-54-year-olds) tallied only 51.6%. The table makes it appear that this is only a 3-point decline from 1982, when the percentage for that group was 54.9%. However, you must think about this logically: people in the 45-54-year-old group in 2002 were actually in the 25-34-year-old group in 1982. So how much did literary reading decline within that group? It actually went from 62.1% to 51.6%, a decline of 10.5 percentage points.
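To make the cohort arithmetic concrete, here is the same comparison laid out in a few lines of Python, using only the figures quoted above from Bauerlein's table (a back-of-the-envelope sketch, not his analysis):

```python
# Percentage of each age group counted as "literary readers,"
# limited to the figures quoted above from Bauerlein's table.
rate_1982 = {"25-34": 62.1, "45-54": 54.9}
rate_2002 = {"45-54": 51.6}

# Same-bracket comparison (what the table invites you to make):
bracket_drop = rate_1982["45-54"] - rate_2002["45-54"]   # about 3.3 points

# Cohort comparison (the same people, 20 years older):
cohort_drop = rate_1982["25-34"] - rate_2002["45-54"]    # about 10.5 points

print(f"same bracket: {bracket_drop:.1f} points; same cohort: {cohort_drop:.1f} points")
```

Same table, same numbers -- the only thing that changes is which 1982 row you line the 2002 row up against.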

So if even the "most literary" of the bunch are seeing a 10-point decline, what does that say about our society and the causes at hand? Bauerlein jumps to a conclusion throughout the book that it must be because of digital technologies and media. Well, maybe he's onto something... after all, these reading rates only dropped an average of 3 points from 1982-1992, but then plummeted 10 points from 1992 to 2002. What happened?

Well, for one thing, that is right around when the World Wide Web was born (it opened to the public in 1991). The irony, though, is that the Web has (and especially had in the 1990s) an enormous amount of text. To say the Web distracts people from reading is ironic, to say the least: the Web requires literacy and reading (especially in the days of 28.8 modems and no streaming music, videos, or animation).

However, there's no denying that personal computing power has seen an amazing rise in multimedia possibilities, both for consumption and – perhaps more importantly – for production. Shrinking technology costs have made video and audio production extremely efficient and affordable. It's no wonder, then, that some may find it much more efficient to communicate an idea using an audio-visual medium instead of printed text. Is this such a bad thing? There are many instances when a picture truly is "worth a thousand words" – and motion media is 24 or more pictures per second, so... do the math. Visual media can better illustrate sequential occurrences, cause and effect, or hard-to-imagine concepts for subjects as broad as history, geography, math, and the sciences. Sure, words have their place and always will; they are symbolic representations of ideas that can be too complex or vague to represent visually. The fact remains that all groups across the board are spending less time on leisure reading... there is a reason for that, and it's not necessarily a bad one.

What is especially unfortunate, however, is that Bauerlein jumps to the conclusion that technology must be the causal variable... as if that's the only variable at play! What about the fact that the past 20 years have seen an increase in school homework, extracurricular participation, and average workweek hours? Couldn't these factors impact the amount of free time available for leisure reading?

"Technophiles celebrate the ease and access, and teens and young adults derive a lesson from them. If you can call up a name, a date, an event, a text, a definition, a calculation, or a law in five seconds of key-punching, why fill your mind with them?" (p. 94) This question is raised in the snarky, sarcastic, sanctimonious tone Bauerlein likes to arrogantly pepper throughout his book. The question is meant to be rhetorical, the answer self-evident. Yet I would argue that this is a valid question, and should be asked in earnest. Why, indeed, should our thinking process regress or stagnate to a mode of mass acquisition and memorization, when in fact the tools are available to "outsource" those cognitively-intensive tasks to a computer, network, or communications system? Let's consider this logically. First of all, why were dictionaries and encyclopedias created? Were they invented to teach us the meaning of words that we are meant to forever-after record in our minds for easy recall whenever necessary? No. If they were, why would they be found in the "Reference" section of the library? The answer is simple: because they were designed for us to refer to them as necessary, when there is a specific piece of information we need that is not stored and readily available for instant recall in our minds. According to cognitive science, this makes sense; although the mind can store vast amounts of information, it prioritizes its storage (and connections required to recall the knowledge) based on frequency and recency of use. Therefore, we are only meant to memorize those items we will be using on a frequent basis in our life – and it makes sense to keep everything else in an external reference for which we memorize the knowledge of how to locate the information when we need it. This exactly parallels the knowledge network known as the Internet and World Wide Web. But let me guess, Mark... encyclopedias are "good" (because they are made of paper?) and online information is "bad" (because it is electronic?).

I hate to break it to you, but the answer to the question is self-evident. Why fill your mind with knowledge you (a) may never use again, and (b) could easily and instantly locate if you do happen to need it? This paradigm shift in thought processing has already occurred... and it happens to be an advantageous approach. When I applied to computer programming jobs 10 years ago, the interviews would involve barrages of "pop quizzes" and scenario-based tests. However, the examiners were not testing whether I had memorized syntax rules like the need to put a semicolon at the end of each line of C++ code. No, the interviewers couldn't care less; in fact, my answers were given in "pseudocode" – in other words, I would describe a process, an algorithm that would need to be used given a certain set of circumstances. This is what they cared about, and it makes sense... after all, every programmer I know has a stack of reference materials ready at hand as they work. Yes, even computer wizards use textbooks and occasionally have to look up the method to make a certain something happen in the computer. I've even seen this in play when, during visits to the doctor's office and ER, doctors (of all people!) refer to computer databases of symptoms in order to cross-reference or double-check their suspicions of illnesses based on their (traditionally memorized) knowledge.

Again, I ask... what is wrong with that? This thought process allows us to put energy into memorizing things we know we will need to do on a regular basis – drive a car, cook food, perhaps even quickly locate information on the Internet or efficiently use our cellphones (yes, Dr. Bauerlein, cellphones are actually a requisite part of many workplace professions outside of the ivory tower). Let's play devil's advocate for a second here and say, for the sake of argument, that there is a danger in outsourcing our knowledge because we will then become dependent on a resource that may suddenly not be available when we need it. What if you've changed your mode of operation to simply look something up on the Web the second you need it – and suddenly the Web is unavailable? In fact, this is not hard to imagine... I recall some scenarios while traveling when I had become so accustomed to using MapQuest that, when plans changed or I found myself in an unfamiliar location with no Internet access, it would have been very nice to have a good old-fashioned map. (Then again, a map is an external reference – it would be absurd to memorize maps in my head, especially for trips that are not likely to be repeated. But I digress...) Well, consider this: the information is becoming more readily accessible as time goes on... not less so.

I suppose if an apocalyptic catastrophe strikes, we may suddenly find ourselves plunged into a world devoid of the energy resources needed to fuel our modern technology. But I think we can safely assume that, if that day comes, we will have much more pressing things to memorize than Shakespeare or Planck's constant. I imagine we'd more likely need to stay focused on seeking shelter from lightning storms and hunting down rodents or other meager sources of protein. So the argument that we need to memorize politics, history, fine art, literature, erudite vocabulary, specialized sciences, or 99% of other academic information is simply moot.

Then there is the chapter in which the author laments the lies inherent in a visual advertisement for an Apple laptop, in which the laptop is compared to a pile of books. "In fact, it proclaimed, the laptop has rendered all those books pictured in the display window obsolete. Who needs them? The computer is now the only book you need. Many books and journals you can find online, especially now that the Google project is well under way. And soon enough, e-publishers predict, all books will appear in digital format, making hardcover and paperback books something of a curiosity, like card catalogs and eight-track tapes. Already, with the Internet, we have enough handy information to dispense with encyclopedias, almanacs, textbooks, government record books, and other informational texts. That was the argument behind the declaration, and with so many techno-marvels inside, one could understand a 15-year-old absorbing the slogan as patently true." (p. 99) "For the average teenager looking for the latest iTunes, the titles in the pictures meant nothing, and the display assured them that they do, indeed, mean nothing." (p. 100) "To replace the book with the screen is to remove a 2,500-year-old cornerstone of civilization and insert an altogether dissimilar building block." (p. 101)

Here I feel the need to refer to a banal colloquialism: Mr. Bauerlein, what the fuck are you talking about? You cite the claim that "users can always gear down to reading page by page in PDF format" (p. 103), yet you make no evidence-backed rebuttal of it; instead you refer back to techno-evangelists Johnson and Will Richardson, who coin the term "the Read/Write Web."

From here we shotgun into a rebuttal backed not by facts or evidence but by fictional anecdotes (more evidence of a literature professor at work), in which Bauerlein refers to visions by the authors Ray Bradbury and Aldous Huxley – both of whom wrote about bleak futures dehumanized by infatuation with technology. The logician and empiricist in me says... "HUH?" I love Fahrenheit 451 and Brave New World as much as the next guy... but how did we get from point A to point B? I understand that we are supposed to realize Postman was comparing our modern "reality television" entertainment with the devices in these bleak fictional futures, but how does that rebut the statement that you can, in fact, use a laptop computer the same way you read a book, via PDFs and eBook software?

The plain answer is: it doesn't.

And then there is Bauerlein's "debunking" of the idea that "real," meaningful reading happens online. His conclusions are based mainly on studies of web use conducted by the Nielsen Norman Group, which "has no stake in grand pronouncements about the Digital Age" (p. 142) because it is a private firm that researches user interactions for the sole purpose of improving the business/commercial viability of websites. This research has led to such shocking revelations that Bauerlein must share them with us in boldface type:

  • Of email newsletter subscribers, only 19% fully read the newsletters.
  • People scan headlines and blurbs in newsfeeds.
  • Readers choose the content they are interested in and focus only on that particular news.
  • Sites are advised to use a sixth-grade reading level on the main page and only an eighth-grade level for internal articles.
All of this sounds pretty dismal and appalling, right? Until you notice the hilarious fact that, despite being a professor of English at Emory University, Mark Bauerlein doesn't seem to realize that every single one of the above findings also applies directly to print newspapers!

Yes, newspapers are generally written at an eighth-grade level, so as to ensure accessibility of information for the majority of the population. Very few people read an entire newspaper -- in fact, there is a reason why the newspaper is broken down into different sections by topic and interest. There are many people who might pick up a paper just to read the "Sports" section or the "Finance" news. There are many more who will grab a paper simply because they scanned the headlines on the front page and found something of interest to them (this is, after all, why multiple headlines and story segments are placed on the front page to begin with: it's what people naturally do -- and have done for more than a century).

He laments the fact that people only do the minimal amount of reading needed to get pertinent facts and information -- but isn't this what the news is for? The entire structure of journalistic writing is designed to cater to this style and purpose of reading: you start with an eye-catching headline (which, in itself, contains most of the important facts), then you follow up with an introductory paragraph which does not, unlike other essays and treatises, force you to read further to glean details. In news writing, all of the most pertinent facts and details -- the "5 W's" -- are answered right from the start. Why? Because that's what most people want to know. Many readers stop reading after that first paragraph or two. The remainder of the article will contain the extra details and specifics to elaborate on the information, for those people still interested in knowing more.

But hey... why would I expect an English professor to know this and take it into account, and realize that the habits described on the web almost perfectly mimic the habits practiced in print?

All of this is not to say that I disagree with everything revealed or lamented in The Dumbest Generation. For example, on pages 77-87, Bauerlein juxtaposes the results of a Kaiser report entitled Generation M: Media in the Lives of 8-18 Year-Olds with the pro-pop-culture affirmations of Steven Johnson's Everything Bad Is Good for You: How Today's Popular Culture Is Actually Making Us Smarter. Setting aside the content of modern media (which is conceded to be "coarse and inane"), Johnson proclaims that modern entertainment is more sophisticated in its format and plot structures -- that, as Bauerlein puts it, it builds "aptitudes for spatialization, pattern recognition, and problem solving, virtues that reflect twenty-first-century demands better, in fact, than do traditional knowledge and reading skills." Well, to be sure, spatial-temporal and problem-solving abilities are important in the new millennium... but that doesn't mean they are the only important pieces of the puzzle. Bauerlein notes -- and rightly so -- that, given the Generation M study showing kids 8-18 spending an average of more than 6 hours per day consuming media, and since modern pop media supposedly inculcates "intellectual or cognitive virtues," shouldn't kids' cognitive capacities be growing by leaps and bounds? Yet standardized test scores falter and stagnate, and in some academic areas (sciences, social studies, politics), knowledge and interest seem to be declining altogether. So where is the disconnect?

In his own way, Bauerlein points to a problem of perception – that sometimes what is being assessed is too narrow in scope to really be a valid marker of overall cognitive prowess or academic proficiency. Bauerlein's example is the "Flynn Effect" – essentially, the general IQ test has had to be re-normed on a regular basis since its inception, because performance on it continued to improve. One could argue that, early on, this was due to improvements in medicine, nutrition, and education. But the past 50 years have not seen major strides in these areas... and yet IQ scores have consistently climbed about 3 points per generation. Some would like to point to modern media as a catalyst for this... but if people's general intelligence is really higher than it used to be, then why are students faltering? Bauerlein draws the likely conclusion that people have become more spatial-temporal due to increased exposure to spatial-temporal media (p. 93). This seems to be a logical conclusion, since the IQ test does, indeed, use visual, spatial, and sequential-logic questions as its basis. Moreover, it suggests that such skills do not transfer into "general" intelligence or learning. Still, it is a bit of a concession that Bauerlein unknowingly admits: perhaps the new media do improve this one specific type of cognition. Unfortunately, I have to agree that this type of thinking, alone, is not enough to thrive. The real argument, then, should not be that screen time is bad, but that it should be moderated in order to allow time for other activities (like reading) that will stimulate other cognitive modes, like verbal or interpersonal proficiency.

More to the point, I agree nearly wholeheartedly with the focus of the chapter "Online Learning and Non-Learning," in which Bauerlein points out that "for all their adroitness with technology, students don't seek, find, and manage information very well... They're comfortable with the tools, but indiscriminate in their applications. ETS terms the missing aptitude Information and Communications Technology (ICT) literacy, and it includes the ability to conduct research, evaluate sources, communicate data, and understand ethical/legal issues of access and use" (p. 113). I think this is really the core and crux of the "problem" Mark Bauerlein is grasping at. In actuality, he tries to make a case that technology and media, in general, cause kids to become less critical, less informed, illiterate... dumb. However, as already pointed out, this is a weak argument because the amount of information and text involved with computers and the Web is staggering. The real issue is the assumption people are making that, being born "digital natives," the "millennials" inherently know how to use technology. This unfortunate assumption arises from the fact that, being immersed in a digital-media society, kids today do seem to feel more comfortable and adept at approaching and operating various technologies. However, using a technology and using it well or productively are two different matters entirely.

Knowing the nature of kids (and humans in general), it's not surprising that, left to their own devices, they choose the path of least resistance. If they have a choice between online research and reading or a non-educational video game, they will choose the latter. If they have a choice between writing a meaningful, well-structured blog post or building credibility in their social circles via 140-character tweets, they will choose the latter. If it's the choice between streaming Carl Sagan's "Cosmos" or watching a humorous-but-inane 2-minute YouTube video, which one are they going to choose?

In an ETS study of ICT skills in 6,300 students, Bauerlein notes, "Few test takers demonstrated key ICT literacy skills" (www.ets.org/itcliteracy). He goes on, in this chapter, to pick apart technology initiatives and show that they don't work to improve performance. First, he points out that "Taking the enthusiasm of 18-year-olds as a measure of educational benefits may sound like a dicey way to justify such a sweeping change in classroom practices, but many studies out to evaluate e-learning do precisely that" (p. 119). In my research, I have found this to be true and, like Bauerlein, I find it ridiculous and lamentable. Satisfaction surveys should be nearly irrelevant to determining the efficacy of educational technology, and here's why: the purpose of education is not to make students feel happy; the purpose of education is to increase knowledge and learning. Period.

Now, I don't think anyone would argue that motivation can play a crucial role in whether someone learns or not; if someone is very resistant to learning, it likely won't happen. But this does not mean that motivation equals learning. If it is determined that knowledge is still not being acquired, despite significant motivation, this is not good news! It is not something to celebrate, as many edtech articles do. In fact, it points to a serious disconnect; if students are so happy and motivated, yet are learning less and less, then it raises the question: what, exactly, are they motivated to do?

On pages 120-124, Bauerlein lists a series of cherry-picked studies indicating no significant gains from technology (computer hardware and software) acquisitions, despite their often being pricey investments. The irony here is that he segues from this into an analysis of the benefits of reading print media outside of school, to showcase why reading at home is important. To say this feels like a non sequitur is an understatement. After all, didn't we just establish that it's not the technology that is causing the problem, but rather students' lack of skill and awareness in how to use the technology in meaningful, appropriate ways?

Some good points are made regarding the value of language use outside of school – that language and vocabulary are improved through constant use of language and especially leisure reading (p. 129). However, this doesn't explain why, after outfitting students with laptops, a New York state school board stated "After seven years, there was literally no evidence it had any impact on student achievement – none." (p. 124).

The explanation for this (much to Bauerlein's chagrin, I'm sure) has nothing to do with books and everything to do with education. The fact that students are missing "key ICT skills" shows the real problem: students are not being taught how to use the technology in creative, productive, useful, and knowledge-building ways. As Bauerlein points out (in reference to language/vocabulary acquisition), accessing and using information at home is a huge component – and I believe it is critical for parents to treat computer literacy the same way. Parents have an imperative responsibility to monitor, guide, and tutor their children in responsible and practical uses of technology. Sadly, many (if not most) of those parents are unable to do this, being ignorant and untrained themselves. In other cases, parents may "know" technology but have adopted the same (heavily-marketed) attitude that technology is solely for entertainment and leisure – and often their transgressions are just as egregious as their children's.

It's no wonder, then, that – despite huge potential for expanding and improving knowledge and understanding of a limitless swath of topics (including literacy) – the technology is not living up to its promises. However, this is not a product of the technology itself, but a product of our neglecting to dutifully train our educators and, in turn, educate our students in the proper, effective, purposeful use of technology. This, to me, does not sound like a justifiable reason to slash funding for technology initiatives. On the contrary, it sounds like a call to increase funding, albeit perhaps with a shift in focus from the medium (technology) to the message (how to use technology). In short, knowledge and the ability to share it with others is more important, valuable, and powerful than any device out there.

Perhaps Bauerlein should avoid anything approaching math, science, or empiricism and stick to English-lit-style "analysis," wherein one can ponder the intricacies of emotion or character development and not actually have to worry about making erroneous statements or conclusions. I scored a perfect 6 on the analytical writing section of the GRE (and – what a coincidence! – I've been using computers gratuitously since I was a child), and sadly, given this level of illogical rhetoric, I have my doubts whether this Emory English professor would even pass that test. At the very least, The Dumbest Generation fails.

Monday, November 8, 2010

CUE, Hall Davidson, and the Cult of "Isn't it cool??"

WARNING: This post will be longer (and more substantial) than a Tweet. If your attention span is not high enough, might I suggest you watch videos of Fred on YouTube?


Over the weekend I attended my first education technology conference in a while: it was the Fall 2010 CUE (Computer-Using Educators) conference -- "Soaring to New Heights" -- held in Napa Valley.

Yes, despite the fact that I have more on my plate than ever before (teaching computers to 400+ students in grades K-6, coordinating two different gifted/advanced learner programs at the school, and working on the final thesis for my MS in Educational Technology -- not to mention recovering from a nasty bout of pneumonia), I voluntarily chose to spend 1/2 of my weekend at a professional conference for use of computer technology in education. The things I do out of passion.

Anyway, it's been a few years since I attended an educational technology conference like this (the last one being about 5 years ago, attending a three-day conference held at the Baltimore Convention Center in Maryland), and I was reminded of something that really, truly bothers me about conventions like this: they are 50% sales pitch, 40% pep rally, and 10% useful information or tools. In the words of the great paragon of fast-food -- Wendy's: "Where's the beef??"

I have no problem, inherently, with sales or pep talks. They are both necessary sometimes. Without marketing, it's likely we wouldn't know about some of the truly wonderful tools and technology that exist for education. And without an occasional morale boost -- well, we'd all probably kill ourselves.

But what this CUE conference really drove home was the dangerous combination of glitz and hype that surround what I'm going to call "the cult of technology" (if you've ever seen people at an Apple store, you'll know what I'm talking about.)

I'm going to use Hall Davidson's closing keynote speech from the CUE conference to illustrate this point. To the man's credit, he came up onto the stage with much energy, enthusiasm, and gusto (following a brief appearance by the newly-elected California State Superintendent of Public Instruction). He showcased a vibrant personality, and that's all well and good for a keynote speaker / pep rally, right? And he didn't just give a speech -- he made use of his laptop and the projector and wireless internet to make it multimedia, to dive right in and show myriad tools in action. That's cool, right?

This is the part where I always have to groan, where the use of technology doesn't quite "float my boat" and make me giddy like the drug it appears to be for the iPad-wielding peers around me. Here's an example: Hall Davidson showcased how you can go to Google Images and do a search and the pictures will show up. But if you really want to make it better, you can go get a plugin -- oh, what's it called? He couldn't remember, so here's a "learning moment" to showcase how "cool" yet another new technology is: Twitter. To drive the point home, he showed how he had popped on and tweeted about how he couldn't remember the name of the program that would make Google image searches show up in a format like an Apple gallery view. Somebody on Twitter (with clearly nothing better to do with his time) quickly responded that the app was called "CoolIris."

So Davidson jumps back to Google images in the web browser and explains how you can download and install the CoolIris plugin and it will make the Google images show up in a shiny Apple-like interface in which they appear to be all lined up on a 2D plane (like a wall) in a 3D environment.

"Isn't that cool??" Mr. Davidson prompts. Nods and applause from the iSheep.

Pardon me if I don't jump on the bandwagon. Inadvertently, this "let's get excited about technology" speech pointed out exactly what I'm not excited about: people who think something is great just because it is a new, shiny novelty. People who covet style over substance -- even if it ends up wasting resources (like time or money).

So, let me get this straight: you used Google image search and all the images you wanted to find showed up. You could scroll down the page and find more, as necessary. So, in other words, Google image search functions just fine and allows you to do the task you set out to do. But instead of just being practical and getting the job done, you decided it is "cooler" to waste time tweeting about it, getting a response, then searching for an app and installing it (bogging down your computer with wasted processing power and RAM while your browser is running)... all for ZERO benefit to effectiveness or efficiency?

You can call me a Debbie Downer, but no... no, I do not think that's "cool." See, to me what makes technology cool is that it allows us to do things faster, better, or that would otherwise be impossible without it; this does none of the above. It merely illustrates the inanity of our modern-day Cult of Technology and how willing its members are to let style trump substance. This app (and others like it) may be free, but the time it wastes is not -- especially when we're talking about our limited instructional minutes in education, which are invaluable.

Oh, but was that example just a poor one, a fluke? Well, Mr. Davidson followed it up by showing how you can create a Wordle (which could, in and of itself, be argued to be another example of replacing substance with style; however, I can understand some of the ostensible benefits of this one for education -- raising awareness of key words, front-loading vocabulary, etc.) Okay, so in case you don't know: a Wordle is a computer-generated stylistic visual representation of some words/text. In and of itself, it doesn't serve much purpose (other than -- you guessed it -- "cool!"), but it doesn't take much time and it could build some awareness of words or their relationships, or maybe give students some motivation to explore synonyms, antonyms, etc. It's not amazing, but it could have its uses. But... useful?? That's not good enough for the Cult of Technology -- we need it to be flashy... cool! Hall Davidson shows you how: create a Wordle, then use PhotoBooth software (Mac required... we all know PCs are just... practical and... uncool.) And what you can do is create a chromakey effect where it will magically erase the background and replace it with your Wordle! And then you can... well, you can video record yourself with a Wordle image in the background which is very useful for, umm... uhh... it serves a practical application in lessons about... chromakey??

Oh, who cares! It's COOL!


(Personally, I would have found it a lot cooler if the educators in the room had been able to correctly spell the favorite beverages they texted into a central repository -- submissions that included classic wines and beers like "mascado" and "heifenvizen". But beggars can't be choosers, I suppose. We may no longer know how to spell, but at least we can take those misspelled words and make them glitzy, shiny, and cool through the newest -- and therefore the best, of course -- iPad apps available!)

Sunday, May 2, 2010

A Reflection on Learning Educational Technology Integration

Most of the posts I have made thus far to this blog have been in response to required prompts for my Boise State University course EDTECH 541: Integrating Technology into the Classroom Curriculum. I've learned a lot in this course, and it's nearing an end, but this blog is not. I strongly believe that, although educational technology has existed for decades, we are only now beginning to realize not only how ubiquitous it can -- and should -- be, but also the value and importance of applying technology in purposeful ways. To that end, I intend to continue this blog with a critical-but-hopeful eye on how to effectively and efficiently use educational technology... not technology for technology's sake, but technology for the sake of learning and forging a better future.

Over the past few months in this course, I have learned quite a bit. A central part of this course has been exposure to the multitude of free or easily-accessible resources out there, many focused on leveraging the Web for production, communication, and collaboration -- accounting for half of the key components of ISTE's National Educational Technology Standards (NETS) for Students. It is truly mind-blowing how many useful and innovative tools are now available online, and perhaps even more staggering is the rate at which they are emerging and expanding.

However, the aspect I appreciate the most, and feel is most important, is the idea of critically analyzing when and how technology can provide a "Relative Advantage" over traditional methods of learning, and how best to incorporate such technology through use of the Technology Integration Planning (TIP) Model. (Roblyer, 2006, p. )

I strongly believe in the value of empirical research and research-based decision-making. As such, I am pursuing a Master of Science in Educational Technology rather than the M.E.T. degree offered by Boise State University; this means that, in lieu of an M.E.T. standards-based portfolio, I will be writing a Master's thesis. It was therefore refreshing to note the thorough theoretical foundations of Roblyer's (2006) recommendations and conclusions regarding suggested educational technology use. In addition, my own research into studies regarding multimedia yielded very interesting and scientifically-supported evidence of its beneficial value:


However, this does not mean that I do not value the practical importance of the M.E.T. standards defined by the AECT. After all, what good is theory and research if it does not get applied in a practical way? (Hence the title of my blog.) This EDTECH 541 course has been extremely beneficial in addressing the standards set forth by the AECT. Perhaps its most significant contribution has been in the realm of Standard 3, "Utilization" (Earle, 2000, p. 22):

“Utilization is the act of using processes and resources for learning” (Seels & Richey, 1994, p. 46). This domain involves matching learners with specific materials and activities, preparing learners for interacting with those materials, providing guidance during engagement, providing assessment of the results, and incorporating this usage into the continuing procedures of the organization.

3.1 Media Utilization
“Media utilization is the systematic use of resources for learning” (Seels & Richey, 1994, p. 46). Utilization is the decision-making process of implementation based on instructional design specifications.

3.2 Diffusion of Innovations
“Diffusion of innovations is the process of communicating through planned strategies for the purpose of gaining adoption” (Seels & Richey, 1994, p. 46). With an ultimate goal of bringing about change, the process includes stages such as awareness, interest, trial, and adoption.


These standards directly relate to the idea of "technology integration planning," and I would say that idea is the area in which I have grown most professionally. It's true that I learned and used many new tools in this course, and I have in fact already held peer-training professional development sessions to teach other teachers at my school about these tools and how they can be used. But for the most part, ideas like the TIP model and AECT Standard 3 simply resonate with what I have already felt strongly and tried to express in the workplace -- Roblyer and this course have given me the means to eloquently express the value of such planning, to organize a systematic way of implementing it, and to show examples of the TIP model in action. After all, tools and technologies will change over time (now more rapidly than ever!), but wise planning and systematic, purposeful decision-making will likely always be in vogue.


References

Earle, R. S. (Ed.) (2000). Standards for the Accreditation of School Media Specialist and Educational Technology Specialist Programs. Bloomington, IN: Association for Educational Communications and Technology.

Roblyer, M. D. (2006). Integrating Educational Technology into Teaching (4th ed.). Upper Saddle River, NJ: Pearson.