Friday, November 12, 2010

The Dumbest Author? Mark Bauerlein and "The Dumbest Generation"

This post is going to be an in-depth, drawn-out, and often scathing (but fair) analysis and criticism of the book The Dumbest Generation by Mark Bauerlein. This could take a while, so grab yourself a warm beverage and settle in...

In his book The Dumbest Generation, Mark Bauerlein attempts to make a case for the declining intellect of younger generations and the ramifications of technological advancements. In short, Bauerlein argues that a "digital" generation is a "dumb" generation. As a teacher, I can tell you that there are, indeed, troubling gaps and trends in academics, work ethic, and critical thinking. What I can't do, however, is paint with such a broad stroke and claim that these trends are occurring because of our information-technology era. Unfortunately, neither can Mark Bauerlein... try as he might.

Bauerlein is a "professor of English at Emory University and has worked as a director of Research and Analysis at the National Endowment for the Arts." This does not say much for the level of empirical-analysis skill required to earn a PhD in literature, nor for those same skills as a prerequisite for becoming a director of Research and Analysis – a position in which, presumably, you should know how to work with numbers, data, and surveys. Unfortunately, Bauerlein's book and even the very studies he was involved in are rife with bias. It's a shame, because it reduces his very real and important cause to nothing more than the ramblings of one of the antiquated literati.

Let's look at some of the many false assumptions, inadequately-supported conclusions, and examples of sheer bias riddling this book. "The 2002 Survey of Public Participation in the Arts (National Endowment for the Arts) charted several exposures over the preceding 12 months to record the bare presence of the fine arts in individuals' lives. Except for people 75 and older, 18- to 24-year-olds emerged with the lowest rates of all three groups" (p. 24). What, exactly, constitutes "fine arts"? Well, the data used to support his claim come from these instances: attending a jazz or classical music performance; attending a ballet or play; going to a museum or gallery; playing a classical instrument or singing in a choir.

You might be able to see, right away, what kind of absurd bias is inherent in this "study." How very convenient that "fine arts" does not include any "popular" art... thus films, pop music, and photography cannot, by this definition, be "fine art." What, then, is it defined by? Is it based on antiquity of the medium? This could account for many of them, sure... but what about jazz? Didn't that emerge in the 20th century? Well, there goes that theory. In reality, there is nothing to account for what Bauerlein portrays as "fine art" here other than his own personal tastes. And this guy was a Director of Research and Analysis??

Let's consider some very interesting thoughts that blow holes all through Bauerlein's argument. What makes attending a jazz or classical performance any different from attending a rock concert? Because rock is popular and the others are "old"? Consider this: in the early 1900s, jazz was the "rock" music of its time – it was new, it was popular, and even somewhat rebellious. In the 18th and 19th centuries, "classical" music was de rigueur and was simply called "music" because it was what people played and listened to. So, since jazz and classical both started out as "pop" music, what makes them any different from rock, hip-hop, disco, or any other genre? Sheer bias.

What about playing a "classical" instrument? What, exactly, is a classical instrument, and what makes playing one superior to playing any other instrument? Does a drum count as a classical instrument? After all, there is often a percussion section in an orchestra and, in fact, drums are the oldest form of instrument known to man. Yet they are used in rock and hip-hop, so does that disqualify them? What about guitar? Clearly out of the question... right? Well, it's not so clear; what if a student is playing "classical" nylon-string guitar?

As for live performance: film is, and always has been, an extension of theatrical performance. So what makes a movie "worse" than a play? Is it because it is recorded on a medium instead of being immediate, transient, and fleeting? I don't see how that affects the art or the thinking and appreciation processes involved. Seeing "live" performances generally costs more money, so if that is the factor, then the National Endowment for the Arts is showing elitism and prejudice against less-fortunate people who can't, unfortunately, attend live performances but can consume the art in a recorded format. And if it's not the fact that it is recorded that makes the difference, then what is it? What if kids watch a ballet that is tape-recorded... would that count, even though they aren't "attending" the performance?

You can see how patently absurd this survey and Bauerlein's conclusions drawn from it are. It makes him seem no better than a fascist Luddite. "Arts" are not the only area, however, where Bauerlein makes wild and erroneous pontifications. Being an English professor means, of course, that we must lament the decline of literature consumption.

This same SPPA written by the NEA and conducted by the U.S. Census Bureau is also used to compare literature reading to previous decades and draw some conclusions. "From 1982 to 2002, reading rates fell through the floor. The youngest cohort suffered the biggest drop, indicating something unusual happening to young adults and their relationship with books." On page 46, Bauerlein shares the resulting numbers with us in a table, broken down by age groups. This table represents the percentage of people in each age group who read for pleasure – meaning voluntary reading of literature (not non-fiction) not required for work or school. Of these age groups – 18-24, 25-34, 35-44, 45-54, 55-64, and 65-74 – every single one saw a decline in leisure reading from 1982 to 2002. In 1982, the reading rate of 65-74-year-olds was 47.2%, so, sure, it didn't decline much over 20 years, but it was never that high to begin with. Is that something to celebrate? Should we be calling them the "dumbest generation"?

In 2002, 42.8% of 18-24-year-olds counted as "literary readers"; yet the oldest generation ticked only slightly higher at 45.3%. In fact, the highest-scoring age group (45-54-year-olds) tallied only 51.6%. The table makes it appear that this is only a 3-point decline from 1982, when the percentage for that group was 54.9%. However, you must think about this logically: people in the 45-54-year-old group in 2002 were actually in the 25-34-year-old group in 1982. So how much did literary reading decline in that group? Actually, it went from 62.1 to 51.6, a decline of 10.5 percentage points.
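
For the arithmetic-inclined, here is the cohort comparison spelled out in a few lines of Python. The three percentages are the ones quoted above from Bauerlein's table; the only assumption is the 20-year shift between age brackets:

    # Reading rates (percent) quoted from the table on p. 46.
    rate_1982_ages_25_34 = 62.1  # this cohort's rate in 1982...
    rate_1982_ages_45_54 = 54.9
    rate_2002_ages_45_54 = 51.6  # ...and the same people, 20 years later

    # The naive same-bracket comparison the table invites:
    print(rate_1982_ages_45_54 - rate_2002_ages_45_54)  # ~3.3 points

    # The cohort comparison (what actually happened to those readers):
    print(rate_1982_ages_25_34 - rate_2002_ages_45_54)  # ~10.5 points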

So if even the "most literary" of the bunch are seeing a 10-point decline, what does that say about our society and the causes at hand? Bauerlein jumps to a conclusion throughout the book that it must be because of digital technologies and media. Well, maybe he's onto something... after all, these reading rates only dropped an average of 3 points from 1982-1992, but then plummeted 10 points from 1992 to 2002. What happened?

Well, for one thing, the World Wide Web opened to the public in the early 1990s. Ironically, though, the Web has (and especially had in the 1990s) an enormous amount of text. To say the Web would distract from reading is ironic, to say the least: the Web requires literacy and reading (especially in the days of 28.8 modems and no streaming music, videos, or animation).

However, there's no denying that personal computing power has seen an amazing rise in multimedia possibilities, both for consumption and – perhaps more importantly – for production. Shrinking technology costs made video and audio production extremely efficient and affordable. It's no wonder, then, that some may find it much more efficient to communicate an idea using an audio-visual medium instead of printed text. Is this such a bad thing? There are many instances when a picture truly is "worth a thousand words" – and motion media is 24 or more pictures per second, so... do the math. Visual media can better illustrate sequential occurrences, cause and effect, or hard-to-imagine concepts for subjects as broad as history, geography, math, and the sciences. Sure, words have their place and always will; they are symbolic representations of ideas that can be too complex or vague to represent visually. The fact remains that all groups across the board are spending less time on leisure reading... there is a reason for that, and it's not necessarily a bad one.

What is especially unfortunate, however, is that Bauerlein jumps to the conclusion that technology must be the causal variable... as if that's the only variable at play! What about the fact that the past 20 years have seen an increase in school homework, extracurricular participation, and average workweek hours? Couldn't these factors impact the amount of free time available for leisure reading?

"Technophiles celebrate the ease and access, and teens and young adults derive a lesson from them. If you can call up a name, a date, an event, a text, a definition, a calculation, or a law in five seconds of key-punching, why fill your mind with them?" (p. 94) This question is raised in the snarky, sarcastic, sanctimonious tone Bauerlein likes to arrogantly pepper throughout his book. The question is meant to be rhetorical, the answer self-evident. Yet I would argue that this is a valid question, and should be asked in earnest. Why, indeed, should our thinking process regress or stagnate to a mode of mass acquisition and memorization, when in fact the tools are available to "outsource" those cognitively-intensive tasks to a computer, network, or communications system? Let's consider this logically. First of all, why were dictionaries and encyclopedias created? Were they invented to teach us the meaning of words that we are meant to forever-after record in our minds for easy recall whenever necessary? No. If they were, why would they be found in the "Reference" section of the library? The answer is simple: because they were designed for us to refer to them as necessary, when there is a specific piece of information we need that is not stored and readily available for instant recall in our minds. According to cognitive science, this makes sense; although the mind can store vast amounts of information, it prioritizes its storage (and connections required to recall the knowledge) based on frequency and recency of use. Therefore, we are only meant to memorize those items we will be using on a frequent basis in our life – and it makes sense to keep everything else in an external reference for which we memorize the knowledge of how to locate the information when we need it. This exactly parallels the knowledge network known as the Internet and World Wide Web. But let me guess, Mark... encyclopedias are "good" (because they are made of paper?) and online information is "bad" (because it is electronic?).

I hate to break it to you, but the answer to the question is self-evident. Why fill your mind with knowledge you (a) may never use again, and (b) could easily and instantly locate if you do happen to need it? This paradigm shift in thought processing has already occurred... and it happens to be an advantageous approach. When I applied to computer programming jobs 10 years ago, the interviews would involve barrages of "pop quizzes" and scenario-based tests. However, the examiners were not testing whether I had memorized syntax rules like the need to put a semicolon at the end of each line of C++ code. No, the interviewers couldn't care less; in fact, my answers were given in "pseudocode" – in other words, I would describe a process, an algorithm that would need to be used given a certain set of circumstances. This is what they cared about, and it makes sense... after all, every programmer I know has a stack of reference materials ready at hand as they work. Yes, even computer wizards use textbooks and occasionally have to look up the method to make a certain something happen on the computer. I've even seen this in play when, during visits to the doctor's office and ER, doctors (of all people!) referred to computer databases of symptoms in order to cross-reference or double-check their suspicions of illnesses based on their (traditionally memorized) knowledge.
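
To illustrate the kind of answer those interviews rewarded, here is a hypothetical example (my own scenario, not a question I was actually asked): how would you find the duplicate entries in a list? The answer that matters is the algorithm – one pass, remember what you've seen, flag repeats – and the exact syntax is something you can always look up. Sketched here in runnable Python rather than true pseudocode:

    # One pass, remember what you've seen, flag repeats.
    # The algorithm is the answer; the semicolons (or lack thereof) are not.
    def find_duplicates(items):
        seen = set()
        duplicates = set()
        for item in items:
            if item in seen:
                duplicates.add(item)
            else:
                seen.add(item)
        return duplicates

    print(find_duplicates([3, 1, 4, 1, 5, 9, 2, 6, 5]))  # {1, 5}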

Again, I ask... what is wrong with that? This thought process allows us to put energy into memorizing things we know we will need to do on a regular basis – drive a car, cook food, perhaps even quickly locate information on the Internet or efficiently use our cellphones (yes, Dr. Bauerlein, cellphones are actually a requisite part of many workplace professions outside of the ivory tower). Let's play devil's advocate for a second here and say, for the sake of argument, that there is a danger in outsourcing our knowledge, because we will then become dependent on a resource that may suddenly not be available when we need it. What if you've changed your mode of operation to simply look something up on the Web the second you need it – and suddenly the Web is unavailable? In fact, this is not hard to imagine... I recall some scenarios while traveling when I had become so accustomed to using MapQuest that, when plans changed or I found myself in a foreign location with no Internet access, it would have been very nice to have a good old-fashioned map. (Then again, a map is an external reference – it would be absurd to memorize maps in my head, especially for trips that are not likely to be repeated. But I digress...) Well, consider this: the information is becoming more readily accessible as time goes on... not less so.

I suppose if an apocalyptic catastrophe strikes, we may suddenly find ourselves plunged into a world devoid of the energy resources needed to fuel our modern technology. But I think we can safely assume that, if that day comes, we will have much more pressing things to memorize than Shakespeare or Planck's constant. I imagine we'd more likely need to stay focused on seeking shelter from lightning storms and hunting down rodents or other meager sources of protein. So the argument that we need to memorize politics, history, fine art, literature, erudite vocabulary, specialized sciences, or 99% of other academic information is simply moot.

Then there is the chapter in which the author laments the lies inherent in a visual advertisement for an Apple laptop, in which the laptop is compared to a pile of books. "In fact, it proclaimed, the laptop has rendered all those books pictured in the display window obsolete. Who needs them? The computer is now the only book you need. Many books and journals you can find online, especially now that the Google project is well under way. And soon enough, e-publishers predict, all books will appear in digital format, making hardcover and paperback books something of a curiosity, like card catalogs and eight-track tapes. Already, with the Internet, we have enough handy information to dispense with encyclopedias, almanacs, textbooks, government record books, and other informational texts. That was the argument behind the declaration, and with so many techno-marvels inside, one could understand a 15-year-old absorbing the slogan as patently true." (p. 99) "For the average teenager looking for the latest iTunes, the titles in the pictures meant nothing, and the display assured them that they do, indeed, mean nothing." (p. 100) "To replace the book with the screen is to remove a 2,500-year-old cornerstone of civilization and insert an altogether dissimilar building block." (p. 101) Here I feel the need to refer to a banal colloquialism: Mr. Bauerlein, what the fuck are you talking about? You cite the claim that "users can always gear down to reading page by page in PDF format" (p. 103), yet you make no evidence-backed rebuttal of it; instead you refer back to techno-evangelists Steven Johnson and Will Richardson, who coin the term "the Read/Write Web."

From here we shotgun into a rebuttal backed not by facts or evidence but by fictional anecdotes (more evidence of a literature professor at work), in which Bauerlein refers to visions by Ray Bradbury and Aldous Huxley – authors who both wrote about bleak futures dehumanized by infatuation with technology. The logician and empiricist in me says... "HUH?" I love Fahrenheit 451 and Brave New World as much as the next guy... but how did we get from point A to point B? I understand that we are supposed to realize Neil Postman was comparing our modern "reality television" entertainment with the devices in these bleak fictional futures, but how does that rebut the statement that you can, in fact, use a laptop computer the same way you read a book, via PDFs and eBook software?

The plain answer is: it doesn't.

And then there is Bauerlein's "debunking" of the idea that "real", meaningful reading happens online. His conclusions are based mainly on studies of web use conducted by the Nielsen Norman Group, which "has no stake in grand pronouncements about the Digital Age" (p. 142) because it is a private firm that researches user interactions for the sole purpose of improving the business/commercial viability of websites. This research has led to such shocking revelations that Bauerlein must share them with us in boldface type:

  • Of subscribers to email newsletters, only 19% fully read the newsletters.
  • People scan headlines and blurbs in newsfeeds.
  • Readers choose the content they are interested in and focus only on that particular news.
  • Sites are advised to use a sixth-grade reading level on the main page and only an eighth-grade level for internal articles.
All of this sounds pretty dismal and appalling, right? Until you realize the hilarious fact that, despite being a professor of English at Emory University, Mark Bauerlein doesn't seem to realize that every single one of the above facts also applies directly to print newspapers!

Yes, newspapers are generally written at an eighth-grade level, so as to ensure accessibility of information for the majority of the population. Very few people read an entire newspaper -- in fact, there is a reason why the newspaper is broken down into different sections by topic and interest. There are many people who might pick up a paper just to read the "Sports" section or the "Finance" news. There are many more who will grab a paper simply because they scanned the headlines on the front page and found something of interest to them (this is, after all, why multiple headlines and story segments are placed on the front page to begin with: it's what people naturally do -- and have done for more than a century). Bauerlein laments the fact that people look for the minimal amount of reading needed to get pertinent facts and information -- but isn't this what the news is for? The entire structure of journalistic writing is designed to cater to this style and purpose of reading: you start with an eye-catching headline (which, in itself, contains most of the important facts), then you follow up with an introductory paragraph which does not, unlike other essays and treatises, force you to read further to glean details. In news writing, all of the most pertinent facts and details -- the "5 W's" -- are answered right from the start. Why? Because that's what most people want to know. Many readers stop reading after that first paragraph or two. The remainder of the article contains the extra details and specifics that elaborate on the information, for those people still interested in knowing more.

But hey... why would I expect an English professor to know this and take it into account, and realize that the habits described on the web almost perfectly mimic the habits practiced in print?

All of this is not to say that I disagree with everything revealed or lamented in The Dumbest Generation. For example, on pages 77-87, Bauerlein juxtaposes the results of a Kaiser report entitled Generation M: Media in the Lives of 8-18 Year-Olds with the pro-pop-culture affirmations of Steven Johnson's Everything Bad Is Good for You: How Today's Popular Culture Is Actually Making Us Smarter. Setting aside the content of modern media (which is conceded to be "coarse and inane"), Johnson proclaims that modern entertainment is more sophisticated in its format and plot structures – that, as Bauerlein puts it, it builds "aptitudes for spatialization, pattern recognition, and problem solving, virtues that reflect twenty-first-century demands better, in fact, than do traditional knowledge and reading skills." Well, to be sure, spatial-temporal and problem-solving abilities are important in the new millennium... but that doesn't mean they are the only important pieces of the puzzle. Bauerlein notes – and rightly so – that, given the Generation M study showing kids 8-18 spending an average of more than 6 hours per day consuming media, and since modern pop media supposedly inculcates "intellectual or cognitive virtues," kids' cognitive capacities should be growing by leaps and bounds. Yet standardized test scores falter and stagnate, and in some academic areas (sciences, social studies, politics), knowledge and interest seem to be declining altogether. So where is the disconnect?

In his own way, Bauerlein points to a problem of perception – that sometimes what is being assessed is too narrow in scope to be a valid marker of overall cognitive prowess or academic proficiency. Bauerlein's example is the "Flynn Effect" – essentially, the general IQ test has had to be re-normed on a regular basis since its inception, because performance on it has continued to improve. One could argue that, early on, this was due to improvements in medicine, nutrition, and education. But the past 50 years have not seen major strides in these areas... and yet IQ scores have consistently climbed about 3 points per generation. Some would like to point to modern media as a catalyst for this... but if people's general intelligence is really higher than it used to be, then why are students faltering? Bauerlein draws the likely conclusion that people have become more spatial-temporal thinkers due to increased exposure to spatial-temporal media (p. 93). This seems logical, since the IQ test does, indeed, use visual, spatial, and sequential-logic questions as its basis. Moreover, he points out that such skills do not transfer into "general" intelligence or learning. Still, Bauerlein unknowingly makes a bit of a concession here: perhaps the new media do improve this one specific type of cognition. Unfortunately, I have to agree that this type of thinking, alone, is not enough to thrive. The real argument, then, should not be that screen time is bad, but that it should be limited in order to allow time for other activities (like reading) that stimulate other cognitive modes, like verbal or interpersonal proficiency.

More to the point, I agree nearly wholeheartedly with the focus of the chapter "online learning and non-learning," in which Bauerlein points out that "for all their adroitness with technology, students don't seek, find, and manage information very well... They're comfortable with the tools, but indiscriminate in their applications. ETS terms the missing aptitude Information and Communications Technology (ICT) literacy, and it includes the ability to conduct research, evaluate sources, communicate data, and understand ethical/legal issues of access and use" (p. 113). I think this is the core and crux of the "problem" Mark Bauerlein is grasping at. In actuality, he tries to make the case that technology and media, in general, cause kids to become less critical, less informed, illiterate... dumb. However, as already pointed out, this is a weak argument, because the amount of information and text involved with computers and the Web is staggering. The real issue is the assumption people are making that, being born "digital natives," the "millennials" inherently know how to use technology. This unfortunate assumption arises from the fact that, being immersed in a digital-media society, kids today do seem more comfortable and adept at approaching and operating various technologies. However, using a technology and using it well or productively are two different matters entirely. Knowing the nature of kids (and humans in general), it's not surprising that, left to their own devices, they choose the path of least resistance: if they have a choice between online research and reading or a non-educational video game, they will choose the latter. If they have a choice between writing a meaningful, well-structured blog post or building credibility in their social circles via 140-character tweets, they will choose the latter. If it's a choice between streaming Carl Sagan's "Cosmos" and watching a humorous-but-inane 2-minute YouTube video, which one are they going to choose?

In an ETS study of ICT skills in 6,300 students, Bauerlein notes, "Few test takers demonstrated key ICT literacy skills" (www.ets.org/itcliteracy). He goes on, in this chapter, to pick apart technology initiatives and show that they don't work to improve performance. First, he points out that "Taking the enthusiasm of 18-year-olds as a measure of educational benefits may sound like a dicey way to justify such a sweeping change in classroom practices, but many studies out to evaluate e-learning do precisely that" (p. 119). In my research, I have found this to be true and, like Bauerlein, I find it ridiculous and lamentable. Satisfaction surveys should be nearly irrelevant to determining the efficacy of educational technology, and here's why: the purpose of education is not to make students feel happy; the purpose of education is to increase knowledge and learning. Period.

Now, I don't think anyone would argue that motivation can play a crucial role in whether someone learns or not; if someone is very resistant to learning, it likely won't happen. But this does not mean that motivation equals learning. If knowledge is still not being acquired despite significant motivation, this is not good news! It is not something to celebrate, as many edtech articles do. In fact, it points to a serious disconnect; if students are so happy and motivated, yet are learning less and less, then it raises the question: what, exactly, are they motivated to do?

On pages 120-124, Bauerlein lists a series of cherry-picked studies indicating no significant gains from technology (computer hardware and software) acquisitions, despite their often pricey cost. The irony here is that he segues from this into an analysis of the benefits of reading print media outside of school, to showcase why reading at home is important. To say this feels like a non sequitur is an understatement. After all, didn't we just establish that it's not the technology causing the problem, but rather students' lack of skills and awareness of how to use the technology in meaningful, appropriate ways?

Some good points are made regarding the value of language use outside of school – that language and vocabulary are improved through constant use, and especially through leisure reading (p. 129). However, this doesn't explain why one New York state school board, after outfitting students with laptops, concluded: "After seven years, there was literally no evidence it had any impact on student achievement – none." (p. 124)

The explanation for this (much to Bauerlein's chagrin, I'm sure) has nothing to do with books and everything to do with education. The fact that students are missing "key ICT skills" shows the real problem: students are not being taught how to use the technology in creative, productive, useful, and knowledge-building ways. As Bauerlein points out (in reference to language/vocabulary acquisition), accessing and using information at home is a huge component – and I believe it is critical for parents to treat computer literacy the same way. Parents have an imperative responsibility to monitor, guide, and tutor their children in responsible and practical uses of technology. Sadly, many (if not most) of those parents are unable to do this, being ignorant and untrained themselves. In other cases, parents themselves may "know" technology, but have adopted the same (highly marketed) attitude that technology is solely for entertainment and leisure – often their transgressions are just as egregious as their children's.

It's no wonder, then, that – despite huge potential for expanding and improving knowledge and understanding of a limitless swath of topics (including literacy) – the technology is not living up to its promises. This is not a product of the technology itself, however, but of our neglect of dutifully training our educators and, subsequently, educating our students in the proper, effective, purposeful use of technology. This, to me, does not sound like a justifiable reason to slash funding for technology initiatives. On the contrary, it sounds like a call to increase funding, albeit perhaps with a shift in focus from the medium (technology) to the message (how to use technology). In short, knowledge and the ability to share it with others is more important, valuable, and powerful than any device out there.

Perhaps Bauerlein should avoid anything approaching math, science, or empiricism and stick to English-lit-style "analysis," wherein one can ponder the intricacies of emotion or character development and not actually have to worry about making erroneous statements or conclusions. I scored a perfect 6 on the analytical writing section of the GRE (and – what a coincidence! – I've been using computers gratuitously since I was a child), and with this level of illogical rhetoric on display, I have my doubts whether this Emory English professor would even pass the test. At the very least, The Dumbest Generation fails.

Monday, November 8, 2010

CUE, Hall Davidson, and the Cult of "Isn't it cool??"

WARNING: This post will be longer (and more substantial) than a Tweet. If your attention span isn't up to the task, might I suggest you watch videos of Fred on YouTube?


Over the weekend I attended my first education technology conference in a while: it was the Fall 2010 CUE (Computer-Using Educators) conference -- "Soaring to New Heights" -- held in Napa Valley.

Yes, despite the fact that I have more on my plate than ever before (teaching computers to 400+ students in grades K-6, coordinating two different gifted/advanced-learner programs at the school, and working on the final thesis for my MS in Educational Technology -- not to mention recovering from a nasty bout of pneumonia), I voluntarily chose to spend half of my weekend at a professional conference on the use of computer technology in education. The things I do out of passion.

Anyway, it's been a few years since I attended an educational technology conference like this (the last one being about 5 years ago: a three-day conference held at the Baltimore Convention Center in Maryland), and I was reminded of something that really, truly bothers me about conventions like this: they are 50% sales pitch, 40% pep rally, and 10% useful information or tools. In the words of that great paragon of fast food, Wendy's: "Where's the beef??"

I have no problem, inherently, with sales or pep talks. They are both necessary sometimes. Without marketing, it's likely we wouldn't know about some of the truly wonderful tools and technology that exist for education. And without an occasional morale boost -- well, we'd all probably kill ourselves.

But what this CUE conference really drove home was the dangerous combination of glitz and hype that surrounds what I'm going to call "the cult of technology" (if you've ever seen people at an Apple store, you'll know what I'm talking about).

I'm going to use Hall Davidson's closing keynote speech from the CUE conference to illustrate this point. To the man's credit, he came onto the stage with much energy, enthusiasm, and gusto (following a brief appearance by the newly elected California State Superintendent of Public Instruction). He showcased a vibrant personality, and that's all well and good for a keynote speaker / pep rally, right? And he didn't just give a speech -- he made use of his laptop, the projector, and the wireless internet to make it multimedia, to dive right in and show myriad tools in action. That's cool, right?

This is the part where I always have to groan: the use of technology doesn't quite "float my boat" and make me giddy like the drug it appears to be for the iPad-wielding peers around me. Here's an example: Hall Davidson showcased how you can go to Google Images and do a search and the pictures will show up. But if you really want to make it better, you can go get a plugin -- oh, what's it called? He couldn't remember, so here was a "learning moment" to showcase how "cool" yet another new technology is: Twitter. To drive the point home, he showed how he had popped on and tweeted about how he couldn't remember the name of the program that would make Google image searches show up in a format like an Apple gallery view. Somebody on Twitter (with clearly nothing better to do with his time) quickly responded that the app was called "CoolIris."

So Davidson jumps back to Google images in the web browser and explains how you can download and install the CoolIris plugin and it will make the Google images show up in a shiny Apple-like interface in which they appear to be all lined up on a 2D plane (like a wall) in a 3D environment.

"Isn't that cool??" Mr. Davidson prompts. Nods and applause from the iSheep.

Pardon me if I don't jump on the bandwagon. Inadvertently, this "let's get excited about technology" speech pointed out exactly what I'm not excited about: people who think something is great just because it is a new, shiny novelty. People who covet style over substance -- even when it ends up wasting resources (like time or money).

So, let me get this straight: you used Google image search and all the images you wanted to find showed up. You could scroll down the page and find more, as necessary. In other words, Google image search functions just fine and allows you to do the task you set out to do. But instead of just being practical and getting the job done, you decided it was "cooler" to spend time tweeting about it, getting a response, then searching for an app and installing it (bogging down your computer with wasted processing power and RAM while your browser is running)... all for ZERO benefit to effectiveness or efficiency?

You can call me a Debbie Downer, but no... no, I do not think that's "cool". See, to me what makes technology cool is that it allows us to do things faster, better, or otherwise impossible without it; this does none of the above. It merely illustrates the inanity of our modern-day Cult of Technology and how willing its members are to let style trump substance. This app (and others like it) may be free, but the time wasted is not -- especially when we're talking about our limited instructional minutes in education.

Oh, but was that example just a poor one, a fluke? Well, Mr. Davidson followed it up by showing how you can create a Wordle (which could, in and of itself, be argued to be another example of replacing substance with style; however, I can understand some of the ostensible benefits of this one for education -- raising awareness of key words, front-loading vocabulary, etc.). Okay, in case you don't know: a Wordle is a computer-generated stylistic visual representation of some words/text. In and of itself, it doesn't serve much purpose (other than -- you guessed it -- "cool!"), but it doesn't take much time, and it could build some awareness of words or their relationships, or maybe give students some motivation to explore synonyms, antonyms, etc. It's not amazing, but it could have its uses. But... useful?? That's not good enough for the Cult of Technology -- we need it to be flashy... cool! Hall Davidson shows you how: create a Wordle, then use Photo Booth software (Mac required... we all know PCs are just... practical and... uncool) to create a chromakey effect that will magically erase the background and replace it with your Wordle! And then you can... well, you can video-record yourself with a Wordle image in the background, which is very useful for, umm... uhh... it serves a practical application in lessons about... chromakey??

Oh, who cares! It's COOL!


(Personally, I would have found it a lot cooler if the educators in the room were intelligent enough to correctly spell their favorite beverages, which they texted into a central repository -- a list that included classic wines and beers like "mascado" and "heifenvizen". But beggars can't be choosers, I suppose. We may no longer know how to spell, but at least we can take those misspelled words and make them glitzy, shiny, and cool through the newest -- and therefore the best, of course -- iPad apps available!)

Sunday, May 2, 2010

A Reflection on Learning Educational Technology Integration

Most of the posts I have made thus far to this blog have been the result of requisite prompts for my Boise State University course, EDTECH 541: Integrating Technology into the Classroom Curriculum. I've learned a lot in this course, and it's nearing an end, but this blog is not. I strongly believe that, although educational technology has existed for decades, we are only now beginning to realize not only how ubiquitous it can -- and should -- be, but also the value and importance of applying it in purposeful ways. To that end, I intend to continue this blog with a critical-but-hopeful eye on how to effectively and efficiently use educational technology... not technology for technology's sake, but technology for the sake of learning and forging a better future.

Over the past few months in this course, I have learned quite a bit. A central part of this course has been exposure to the multitude of free or easily-accessible resources out there, many focused on leveraging the Web for production, communication, and collaboration -- accounting for half of the key components of ISTE's National Educational Technology Standards (NETS) for Students. It is truly mind-blowing how many useful and innovative tools are now available online, and perhaps even more staggering is the rate at which they are emerging and expanding.

However, the aspect I appreciate the most and feel is critically important is the idea of critically analyzing when and how technology can provide a "Relative Advantage" over traditional methods of learning, and how best to incorporate such technology through use of the Technology Integration Planning (TIP) Model (Roblyer, 2006).

I strongly believe in the value of empirical research and research-based decision-making. As such, I am pursuing a Master of Science in Educational Technology rather than the M.E.T. degree offered by Boise State University; this means that, in lieu of an M.E.T. standards-based portfolio, I will be writing a Master's thesis. It was therefore refreshing to note the thorough theoretical foundations of Roblyer's (2006) recommendations and conclusions regarding educational technology use. In addition, my own research into studies regarding multimedia yielded very interesting and scientifically supported evidence of its beneficial value.


However, this does not mean that I do not value the practical importance of the set of M.E.T. standards defined by the AECT. After all, what good is theory and research if it does not get applied in a practical way? (Hence the title of my blog.) This EDTECH 541 course has been extremely beneficial in addressing the standards set forth by the AECT. Perhaps its most significant contribution has been in the realm of Standard 3, "Utilization" (Earle, 2000, p. 22):

“Utilization is the act of using processes and resources for learning” (Seels & Richey, 1994, p. 46). This domain involves matching learners with specific materials and activities, preparing learners for interacting with those materials, providing guidance during engagement, providing assessment of the results, and incorporating this usage into the continuing procedures of the organization.

3.1 Media Utilization
“Media utilization is the systematic use of resources for learning” (Seels & Richey, 1994, p. 46). Utilization is the decision-making process of implementation based on instructional design specifications.

3.2 Diffusion of Innovations
“Diffusion of innovations is the process of communicating through planned strategies for the purpose of gaining adoption” (Seels & Richey, 1994, p. 46). With an ultimate goal of bringing about change, the process includes stages such as awareness, interest, trial, and adoption.


These standards directly relate to the idea of "technology integration planning," and I would say that idea is the area in which I have grown the most professionally. It's true that I learned and used many new tools in this course, and I have in fact already held peer-training professional development sessions to teach other teachers at my school about these tools and how they can be used. But for the most part, ideas like the TIP model and AECT Standard 3 simply resonate with what I have already felt strongly and tried to express in the workplace -- Roblyer and this course have given me the means to eloquently express the value of such planning, to organize a systematic way of implementing it, and to show examples of the TIP model in action. After all, tools and technologies will change over time (now more rapidly than ever!), but wise planning and systematic, purposeful decision-making will likely always be in vogue.


References

Earle, R. S. (Ed.) (2000). Standards for the Accreditation of School Media Specialist and Educational Technology Specialist Programs. Bloomington, IN: Association for Educational Communications and Technology.

Roblyer, M. D. (2006). Integrating Educational Technology Into Teaching (4th ed.). Upper Saddle River, NJ: Pearson.

Tuesday, April 27, 2010

Accommodation Through Assistive Technology

A huge education buzzword, for at least as long as I have been teaching (8 years now), has been: Differentiation. The Neag Center for Gifted Education and Talented Development at the University of Connecticut states: "Three components that are most notably associated with differentiation are: content -- what is being taught; process -- how it is being taught; and product -- tangible results produced based on students' interests and abilities" (Dinnocenti, 1998).

This idea of "differentiation" stems from a goal, over the past 20 years, of maintaining inclusive, heterogeneous classrooms that contain a variety of students, ability levels, and needs. Although there are many benefits to this, it is easier said than done! It means that an average class of 20-35 students will likely contain special-education students and gifted and talented students -- and even among the remaining "normal" population, students vary wildly with regard to their strengths, weaknesses, learning styles, personalities, and attitudes.

The idea of "differentiation" also stems from standards-based instruction -- the idea that the same core standards of learning should apply to the multitude of students in the same class or grade. As such, differentiation attempts to define ways to impart the same set of knowledge or skills to students, even if the products, processes, or systems of delivery vary from student to student.

In theory, this sounds wonderful. In practice, nearly every teacher -- from the overwhelmed rookie to the seasoned veteran -- will tell you that it is a Herculean task! Designing alternative lessons and locating (or *gasp* creating) the variety of materials to accommodate various groups and types of learners obviously takes much longer than "teaching to the middle." Yet the benefits are also clearly visible, as "teaching to the middle" leaves some students in the dust while shackling and boring speedier, more capable and gifted students. Is there a way to differentiate without overworking as a teacher?

Yes... and technology can play a critical role in that process. For as long as there have been federal mandates regarding special education, there have been provisions for technologies to assist those students -- from wheelchair ramps and mobility tools, to personal planners and modified classroom tools, to cutting-edge electronics. But differentiation recognizes that "special needs" students are not the only special ones; every student is unique and can benefit from tailored instruction.

So we need to get away from the view of technology as merely "assistive" for the disabled or "challenging" for the whiz kids. The fact of the matter is that modern technology may be the single most effective means of making truly differentiated instruction a reality -- for all students.

The Center for Applied Special Technology (CAST) prescribes a Universal Design for Learning (UDL) framework "for designing curricula that enable all individuals to gain knowledge, skills, and enthusiasm for learning."

It simply makes sense to design lessons, projects, activities, and classrooms such that they use multimedia, multi-sensory stimuli, multiple streams and sources of information and support, a variety of collaborative or independent opportunities, and tools to support student self-planning, organization, goal-setting, independent scaffolding, and other meta-cognitive tasks. By doing so, you reach not only "subgroups" of students, but you actually enable individually differentiated instruction that can meet the needs of every student -- whose profiles are as distinct and unique as fingerprints and snowflakes.

This seems like a daunting task, but through use of modern technologies it is actually much easier on the teacher than trying to do this with traditional tools, and management of students becomes much more hands-off as student/community autonomy develops. Here are some examples:
  • By using audio-video tutorials/demonstrations, students are able to get direct instruction as usual... but are also able to repeat the lesson as needed on an individually self-assessed basis. This frees the teacher to monitor and manage other tasks and saves the teacher valuable time that would otherwise be spent tutoring or re-teaching.
  • Interactive tutorials on the computer can act like "super texts," extending textbook information with audio narration, animated or guided examples, and practice with immediate feedback. Not only that, but adaptive systems can determine student needs and provide immediate, tailored intervention. Students benefit from the guided instruction, while teachers save time to provide more personal assistance.
  • Likewise, computerized practice and assessment tools allow students to immediately gain awareness of their academic strengths and weaknesses and, with some correct procedures instilled, can seek their own remediation or challenges accordingly.
  • Social networking and digital communication tools allow students to contribute to the class community, to share resources with each other, and to have equitable access to discussions, regardless of time constraints or personality types. Teachers no longer need to worry about the ticking clock, as students can contribute "on their own time" in asynchronous discussions... and even shy students can feel empowered to contribute their thoughts, ideas, and knowledge. All of this means much less stress and classroom management by the teacher. And by locating, evaluating, and sharing resources themselves, teacher planning time for preparing and providing materials gets drastically cut, while students gain 21st century information literacy and evaluation skills that will be critical in their lives.
  • Creativity and productivity software allows students to create the same types of professional products the pros and experts do -- while allowing for easy corrections, sharing, and storing. Students get the benefit of motivational real-world skills, potential vocational training, and tools that make up for possible artistic or motor-skill weaknesses. Meanwhile, teachers save time, headaches, and money by not having to deal with acquiring, distributing, managing, cleaning up, and storing physical products and materials.
  • Assistive technology for disabled students automatically mitigates a variety of handicaps, and may often benefit "in-between" students on the spectrum as well. For example, magnification/zoom features and text-to-speech may benefit not only the legally blind, but also those with merely poorer-than-ideal vision, or students with reading difficulties.
As you can see, there are many examples of technology that are (a) individually and communally beneficial, and (b) a win-win for both students and teachers.

References

Dinnocenti, S. T. (1998). Differentiation: Definition and Description for Gifted and Talented. Storrs, CT: University of Connecticut.

Roblyer, M. D. (2006). Integrating Educational Technology Into Teaching (4th ed.). Upper Saddle River, NJ: Pearson.

Wednesday, April 14, 2010

Relative Advantage of Technology for History

The study of history is powerful and important for many reasons. It allows us to determine the variables that influence why we are where we are, but the ultimate goal of studying history is actually much more pragmatic; in fact, the study of history, we could argue, is the first step in the prediction and forging of the future. It gives a window into past situations, problems, challenges, and solutions -- it allows us to learn from the mistakes and successes of others in similar circumstances.

If one thing has revolutionized the study of history more than anything else, it has been multimedia. History requires understanding a multitude of variables, experiences, and perspectives. Such ideas were expressed, recorded, and shared throughout time in oral histories, ballads, songs, engravings, and other arts. Eventually, the invention of the printing press allowed, for the first time, mass production and distribution of these individual experiences, stories, and perspectives.

The benefit of being able to share and consume a multitude of viewpoints and perspectives cannot be overestimated in its value to truly understanding and using the lessons history provides. When an event occurs in space and time, its impact, consequences, and connotations will always be perceived differently depending on who is involved and on their first-hand experience. Thus, we can say there is not really such a thing as one "true" or "objective" history, but rather that every point in history is a Gestalt experience. In other words, the more perspectives and viewpoints and artifacts and experiences we can piece together, the more "true" our impression of that event or experience will be, as a whole. The sum of the parts is what provides the synergistic power of studying history.

For this reason, multimedia and the Internet are perhaps the most powerful tools ever created to understand history.

  1. A variety of media allow for multiple types of "experiences" -- visual, verbal, physical, interpersonal, intellectual, emotional -- to be shared and witnessed. The printing press allowed a variety of stories and perspectives to be shared, but words alone are not always sufficient to convey a place, time, thought, or feeling. Art, photography, television, radio, and film have provided the tools required to convey and evoke the emotional and visceral impact that words alone may not be capable of evincing. Having our own sensibilities addressed in the variety of ways a first-person witness of a historical event would means that we can more realistically understand and empathize with the situation. In essence, multiple media create a "virtual reality" of the past.

  2. Unlike the mass media of the past, the Internet and World Wide Web have become a huge game-changer in the social studies -- especially politics, sociology, and history. Our understanding of history is enabled through the examination of stories and artifacts -- items and images and recordings throughout time. The proliferation of affordable media tools and the advent of "Web 2.0" user-created online content have resulted in an explosion of such individual experiences, perspectives, stories, and artifacts. Instead of a single expert or textbook delivering the "true" history, we now have an overflowing source of primary-source materials and experiences that is growing and evolving daily.
The great power and benefit of this is that we can observe, consume, analyze, and compare a wide variety of perspectives, viewpoints, and ideas. Unfortunately, the drawback is that this also allows for wide dissemination of misinformation, or even pure bias.

However, I believe this does not need to be a reason to avoid using the Internet to study history. The more perspectives we have, the closer we can get to understanding the "universal truth" of a historical event. How, then, can we reconcile the fact that many sources on the web are likely to be biased or just plain wrong? I believe that instead of avoiding the plethora of resources out there, we should embrace this opportunity to learn, and to teach our students, some of the most basic and useful lessons of social studies:
  • History is not a one-sided story
  • There is no such thing as "objective" -- everything is shown through a lens or told through a framework of personal experience
  • Drawing conclusions and making predictions requires analyzing and comparing multiple sources and considering many variables
The fact of the matter is that this is now the world in which we live -- a world of universally shared and universally accessible content and media. We now have an imperative, and a wonderful opportunity, to teach our students how to be thoughtful, careful, critical, and perceptive interpreters and forgers of conclusions, beliefs, predictions, and solutions... and this can all be aided through Internet and multimedia technology.

Wednesday, March 24, 2010

Technology for Reading and Writing in a Content Area

I began my teaching career 8 years ago as a 7th grade language arts teacher. I remember often getting frustrated at how far below grade level the abilities of the students coming to me were. I would get upset when I saw that they either: (a) didn't have reading and writing assignments in other subjects (like math); or (b) even if they did read and write, teachers would ignore reinforcing reading and writing skills, dismissing it with "It's not my job to teach reading and writing... that's what language arts class is for."

At that time, I also saw a disconnect: I saw the growth and potential of the Internet and technology, yet there were no computers in the classroom. I had also been a computer programmer for a while (2 years professionally, though I had been programming computers since using DOS and MS-BASIC as a 3rd grader). I expressed concern about this to one of the assistant principals, explaining what would later be affirmed by Kinzer (2003) and Leu (2002): traditional definitions of literacy are no longer sufficient, and it is critical for students to learn how to learn, how to analyze, and how to navigate in a world of new media and information technology.

Although the AP agreed with me, most people from the principal to the community dismissed it at the time. Now, however, that is not the case. People from the President of the United States down to a kindergarten student are starting to realize that the ways they must be literate are vast and varied.

Ironically, many people cast computers, technology, and multimedia as the antithesis of literacy -- the antagonistic foe of "written language" (books). This could not be further from the truth. Although it is true that many forms of media are now audio-visual and thus do not engage the reader in visual text processing and construction, the Internet is still chock-full of written content. Web 2.0 tools like blogs have caused an unprecedented explosion in the amount of written content being created and shared. Proper spelling has reemerged as a priority, due to the demands of search engines and global communication. Quite contrary to entrenched traditional beliefs, reading and writing skills may be more important and valuable than ever, and fortunately there are many ways technology tools can help.

Reading

Two of the most common and popular computer-based approaches to improving reading are interactive/multimedia books and evaluation/tracking systems.

Interactive books. I remember when my family first installed a "SoundBlaster" audio card in our computer in the late 1980s. For the first time, our computer could churn out more than blips and bleeps -- suddenly, fully-orchestrated music and voice narration existed. One of the first games to take advantage of this was "Reader Rabbit," which combined a visual and interactive experience with spoken words and phonetics to assist in decoding practice.

Over the two decades since then, this use of computer media has become widespread. One example I use with primary students is Starfall.com, a free Flash-based online system. These types of books allow students to click on words to hear decoding and pronunciation. There are also animations and interactions within the books themselves, making them motivational and engaging, and small games are incorporated for practicing those skills.

There are many sites and programs such as these, and their primary focus is on decoding and verbal fluency. Although this is a great first step, it is disappointing that in 20 years we have not seen the technology go much further. What would next steps look like? Well, students can currently hear narration and words -- but the programs do not require the students themselves to read out loud. Some systems allow students to record and play back their own voices, which is a start, but there is no assessment of their actual ability. With modern technology, shouldn't we be using voice recognition to engage, assess, monitor, and dynamically adjust instruction for the user?
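To make that last idea concrete, here is a minimal sketch of what automated oral-reading assessment might look like, written in Python using the third-party SpeechRecognition package. This is purely a hypothetical design, not a description of any existing product; the audio file name, the passage, and the word-overlap scoring method are all assumptions for illustration.

    # A minimal sketch of automated oral-reading assessment (hypothetical design).
    # Requires the third-party package: pip install SpeechRecognition
    import difflib
    import speech_recognition as sr

    def score_reading(audio_path, target_text):
        """Transcribe a student's recorded reading; return a 0-1 accuracy score."""
        recognizer = sr.Recognizer()
        with sr.AudioFile(audio_path) as source:
            audio = recognizer.record(source)  # read the entire recording
        try:
            spoken = recognizer.recognize_google(audio)  # speech-to-text
        except sr.UnknownValueError:
            return 0.0  # nothing intelligible was recognized
        # Compare the recognized words against the target passage, word by word.
        matcher = difflib.SequenceMatcher(
            None, spoken.lower().split(), target_text.lower().split()
        )
        return matcher.ratio()

    passage = "The quick brown fox jumps over the lazy dog."
    print(f"Accuracy: {score_reading('student_reading.wav', passage):.0%}")

Even a crude word-overlap score like this could flag students who need fluency intervention; a real system would of course need to handle children's voices, dialects, and reading miscues far more carefully, and could adjust the difficulty of the next passage accordingly.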

Evaluation/Tracking systems. Another use of technology is to serve as a repository, database, or assessment tool for student reading progress and comprehension. Two examples of this are Accelerated Reader and Scholastic Reading Inventory/Scholastic Reading Counts. These programs evaluate student reading levels based on comprehension abilities. For example, SRI can determine a student's Lexile score and recommend books based on appropriate reading level and the student's own interests. Students can then read those books and take comprehension quizzes in Scholastic Reading Counts about elements of the story they read. Such tools can be motivational and can certainly provide useful data for tailored instruction or targeted interventions. One obvious improvement would be an "all-in-one" literacy system that combines interactive, engaging digital books and games with an assessment/tracking system to measure performance in fluency, comprehension, and literary analysis.
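The kind of level-and-interest matching these systems perform can be illustrated with a tiny sketch in Python. The book list, Lexile values, interest tags, and matching band below are invented assumptions, not SRI's actual data or algorithm.

    # A toy sketch of Lexile-style book matching (all data invented for illustration).
    books = [
        {"title": "Charlotte's Web", "lexile": 680, "topics": {"animals", "friendship"}},
        {"title": "Hatchet", "lexile": 1020, "topics": {"survival", "adventure"}},
        {"title": "Holes", "lexile": 660, "topics": {"mystery", "adventure"}},
    ]

    def recommend(student_lexile, interests, band=100):
        """Suggest titles within +/- `band` of the student's score matching an interest."""
        return [
            b["title"] for b in books
            if abs(b["lexile"] - student_lexile) <= band and b["topics"] & interests
        ]

    print(recommend(700, {"adventure"}))  # -> ['Holes']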

Writing
Although topics and purposes of written assignments may vary dramatically between content areas, there is one thing that stays the same: the writing process. Let's look at the standard phases of the writing process -- for any content standard -- and see how technology can help.
  1. Prewriting: Think and organize your ideas.

    One of the most difficult things for students to do well when writing is to structure and organize their ideas and information in a logical way. Visual organizers such as webs, Venn diagrams, compare/contrast columns, and outlines have long been used on paper to help students organize and plan their writing. There are now virtual versions of these planning and organizing tools that go one step further because they are dynamic and easily changeable -- important when planning, because ideas need to be quickly moved or modified. Furthermore, all of the typical drawbacks of paper -- it consumes resources, is easy to lose, and is hard to share or replicate -- are avoided. Examples of these useful tools include Kidspiration/Inspiration, Webspiration, Text2MindMap, and Read-Write-Think.

  2. Drafting: Begin to write down and structure the thoughts and supporting details.

    Drafting is one area where computer technologies may provide little advantage other than speed and efficiency. One would think hand-writing a draft may be faster than typing it, and in some ways it is. But consider the previous prewriting tools -- with typed text, students can quickly cut-and-paste their existing written text and notes into a draft-paper format, saving time while simultaneously making the cognitive connection between visual-spatial organization and final verbal/textual structure. Drafting most often occurs in word processing software such as Microsoft Word, Google Docs, or OpenOffice.

  3. Revising and Editing: Revising looks at how the structure, organization, and choices for words and supporting details might be improved. In editing, writers make sure the "mechanics" are correct -- spelling, grammar, punctuation, formatting.

    There are many, many ways computer software improves the editing and revision process. For one, proofreading is easier (for the author and especially for other readers) in typed format than it is when reading handwriting. More importantly, there are tools specifically suited to the task. Highlighting or font colors can be used to check for important elements such as topic sentences, transitions, and supporting details. I also have my students use these tools to point out spelling, grammar, or punctuation errors on peer papers. Many programs also have a built-in dictionary and thesaurus, which can help students quickly choose the most appropriate words. Finally, the ability to cut and paste blocks of text allows students to quickly omit, rearrange, or insert text in appropriate locations... a feat that is nearly impossible on paper.

    In addition to the word processing tools, multimedia can prove extremely useful in the editing and revision process. For example, students can use microphones to record themselves reading their own papers as well as their classmates'. By doing so, problems with flow (sentence fragments, missing or misspelled words, etc.) and structure may become clear in a way they would not with silent reading.

    These are some of the key reasons that multiple studies have shown increases in effective writing skills and performance when using a word processor (Kulik, 2003, p. 60; Roblyer, 2006, p. 300).

  4. Publishing: Make the written work presentable and easily consumable by your target audience.

    Computer technology has made a huge and obvious impact on publication. In the past, a "finished paper" might look like a rewrite of the edited draft in indelible ink. Through the use of desktop-publishing and presentation software, anyone can now produce finished products to rival professional publications... and the Internet even allows sharing and distribution of that work to a mass audience. In short, there is no longer any reason for the "classroom" experience to be inauthentic or inferior to the professional world.

References

Kinzer, C. K. (2003). The importance of recognizing the expanding boundaries of literacy. Reading Online, 6(10).

Kulik, J. (2003). Effects of Using Instructional Technology in Elementary and Secondary Schools: What Controlled Evaluation Studies Say. Arlington, VA: SRI International.

Leu, D. J., Jr. (2002). The new literacies: Research on reading instruction with the Internet. In A. E. Farstrup & S. J. Samuels (Eds.), What research has to say about reading instruction (3rd ed., pp. 310-336). Newark, DE: International Reading Association.

Roblyer, M. D. (2006). Integrating Educational Technology Into Teaching (4th ed.). Upper Saddle River, NJ: Pearson.

Tuesday, March 9, 2010

Social Networks: To Use or Not to Use?

For the past decade or so, the idea of "social networking" has been a hot topic, buzzword, and source of much revenue on the Internet. To be sure, this idea has been around a bit longer than that... in fact, I was "social networking" on a local BBS (bulletin-board system) using a modem when I was a middle-school student in the late 1980s... before the World Wide Web was invented and any websites existed.

We all know of ubiquitous sites now like MySpace, Facebook, and Twitter. A few years ago the buzzwords were Friendster and Xanga. Now there are also new systems like Ning and Edmodo, which allow for smaller network groups or collaborative spaces online. This could be useful for education... but does that mean we should use them?

Some advocates, like José Picardo (posting on his "Box of Tricks" blog), give a resounding "YES!" But I'm not so sure. Currently, many (if not most) schools use a "walled garden" approach to learning with technology, in which students are given a well-defined "sandbox" that limits the ways they can explore, browse, use, and communicate via the Internet. The argument is that we should break down the wall and avoid sheltering these kids, for the following reasons:
  • Students are growing up in a global economy requiring global citizenship. They "face a future where boundaries are abstract and global learning is critical. Tomorrow’s citizens must be global communicators, must be able to participate successfully in project-based activities, and must have collaborative skills" (Reed, 2010).
  • Web 2.0 is how students like to communicate outside of the classroom. "Many will argue that most students are just wasting their time and gossiping online but, whatever anyone’s opinion on the benefits or dangers of social networking is, it cannot be denied that they are all sharing, collaborating and networking and they are doing so in a way which they enjoy and find engaging, otherwise they simply would not do it" (Picardo, 2010).
  • Communication, teamwork, and collaboration are valuable skills both in and out of school.
Okay, these may all be very real and valid statements... but these ideas alone do not constitute a reason to jump on board the social-networking bandwagon. One thing I find particularly disturbing is the lack of references to scientific studies or empirical evidence showing the results of using social network technologies for learning. All of the reasons I have seen on various blogs and in web propaganda are mere opinion or conjecture -- and the only studies I have seen have been qualitative ones, in which students indicate that they do enjoy using the technologies. So, clearly, social networking can be a benefit for motivation -- but does it help learning? And at what cost? Breaking down the walls of the garden may reveal some unfortunate apples in this newly-discovered Eden. Let's do a cost-benefit analysis:

Benefits
  • Improved motivation / enjoyment.
  • Simulates or represents what the "real world" will be like after school.
  • Allows for collaboration (but can't this be done without the technology? For online education or home-schooling, no -- but for the vast majority of students in traditional schools and classrooms, the school itself is a "social network" where the kids can communicate, interact, and work on projects together).
Costs
  • Distraction. Recent studies have shown that students do not multitask well and actually learn less when multitasking. Social networking does not necessarily cause students to attempt multitasking, but it certainly encourages it and makes it easy to attempt.
  • Online privacy / safety -- and lack of parental guidance. Statistics compiled at NetSmartz.org reveal the dangers of online communication for students: (1) 45% of teens have been asked for personal information by someone they don't know online; (2) 42% of parents do not review or monitor what their children do online, and 30% allow them to use the computer in a private room. Some of these problems can be solved by using private networks/groups such as those Edmodo and Ning allow. However, if we as educators encourage or require online networking, we cannot monitor what these students are doing at home, such as...
  • Cyberbullying. Tomorrow, my elementary school is holding a special session to teach parents about the issue of cyberbullying. It has started to occur in students as young as 4th grade: posting disparaging or defamatory remarks, hijacking MySpace accounts, etc. Most social networking sites are not moderated -- this opens up a huge potential for cyberbullying in a forum that you, the teacher, created and endorsed. Unless the service you are using allows moderation and approval of posts and comments, students can post inappropriate or damaging material and do serious psychological damage to their peers in your classroom before the offending content is caught and removed.
  • Laws prevent children from using many online services. The Children's Online Privacy Protection Act (COPPA) of 1998 stipulates that any website that collects personal information (including name, address, phone number, or email) from a child under 13 years old must provide special notification to, and obtain verified approval from, a parent. Because of the difficulties in doing this -- and the fact that a web service can be fined $10,000 if it is discovered not to have adequately complied -- many sites and services simply refuse to allow children under 13 to use them. The sites that explicitly bar children under 13 include some of the most common and popular email and social networking services: Google, Yahoo, MySpace, Facebook. So... how are cyberbullying and Facebook-jacking being done by 4th graders? Due to the lack of parental guidance mentioned above, students can simply do what they like at home... and they know all they need to do to create an account is lie about their age.
    But if kids can't get an email account... how can they use a service like Ning, which requires an email address to sign up? (Note: Edmodo does not require an email account or personal information, so it is a viable option.)
  • Increasing the Digital Divide. Finally, something to seriously consider is the fact that many students are disadvantaged and simply do not have computers, Internet access, or smartphones outside of school. Mr. Picardo argues that microblogging is what kids do at home... but this is not the case when the kids don't have the tools. Let's assume that there are benefits to online social networking for education. If you teach at a school (like mine) with a large disparity in socio-economic status, then moving your collaboration and communication tools online for homework activities and projects (instead of keeping them in the classroom) means you are providing those educational opportunities only to the students privileged enough to have the tools to participate in the discussions and learning outside of the classroom. The only solution, then, is to use computers and social-networking tools while in the classroom... but doesn't that defeat the purpose? The students are there -- together, in the same place, as a group -- so they can collaborate and "social network" just by using their mouths and hands and brains, without needing any technology at all.
Conclusion
Social networking tools are certainly a hot topic right now, and I think they have their place. I'm not going to say they would not be useful for education; Vygotskian theory (among others) really highlights the value of communication for constructing knowledge. For this reason, I think online social networking would be a crucial component of home-schooling or online education environments. For adults -- such as college students -- they may prove to be a useful extension of the classroom.

However, there are serious costs and problems with trying to use them in a traditional school classroom, especially with students in elementary school. I believe there will come a time when they are designed in a way that makes them feasible to use (in fact, I am in the process of developing a new website for my school district using a web-hosting service called SchoolFusion, which includes teacher-created -- private and moderated -- blogs, discussions, forums, and activities). However, we will not see the true success of social networking for education until these three conditions are met:
  1. The digital divide must be closed, so that all students have access to technology and the Internet at home.
  2. Any system used with kids in school must be private, moderated, and must not require personal information (name, address, or email) for student sign-in.
  3. We need more scientific studies and evidence of the value and efficacy of online social networking (i.e., what is the benefit over traditional classroom collaboration?).
References

Federal Trade Commission. (1998). Children's Online Privacy Protection Act. Retrieved from COPPA.org: http://coppa.org/

National Center for Missing and Exploited Children and Boys & Girls Clubs of America. (2010). Statistics. Retrieved from NetSmartz website: http://www.netsmartz.org/safety/statistics.htm

Picardo, J. (2008, November 15). Social networking in education: why is it taking so long? [Weblog message]. Retrieved from http://www.boxoftricks.net/?p=549

Picardo, J. (2010, February 16). Microblogging: making the case for social networking in education [Weblog message]. Retrieved from http://www.boxoftricks.net/?p=1727

Reed, J. (2010). Global collaboration and learning. Retrieved March 8, 2010, from EdTechMag.com website: http://www.edtechmag.com/k12/events/updates/global-collaboration-and-learning.html

Tuesday, February 23, 2010

Using Spreadsheets and Databases in the Classroom

Considering the impact computer-based spreadsheets and databases have had on the business world over the past few decades, it is astounding how rarely they seem to be used in the classroom.

Sure, teachers these days often use spreadsheets or databases to record class rosters or keep digital gradebooks, and that's useful and great... but it doesn't address the fact that our students are growing up in the information age and will be finding work in a global, knowledge-based economy. Because of this, it is more critical than ever that students know why and how to use tools that help them sort, organize, and explore the vast amounts of information being created and shared every day via the Internet.

This is where spreadsheets and databases come in.

Spreadsheets
Most teachers in classrooms today have at least some experience with Microsoft Excel. This is a spreadsheet program, but it is not the only one. There is a free online Google Spreadsheets program which is similar to (but not as full-featured as) Microsoft Excel. There is also OpenOffice Calc, a free download to install on your computer that also reads and writes Excel files.

Spreadsheet software provides a large table of cells which can contain a variety of data -- text, dates, numbers, currency, etc. Although the spreadsheet can be used as a basic storage and organization system this way, its real strength lies in the ability to perform long or complex calculations as well as the ability to easily visualize data sets using charts and graphs.

Relative Advantages: Spreadsheets allow both students and teachers to perform calculations on large sets of numbers quickly and efficiently. One benefit is that spreadsheet software allows for quickly creating visual representations of data using a variety of charts and graphs, which can sometimes illustrate disparity or similarity between numbers far more quickly or powerfully than the numbers themselves could. Another benefit of working with calculations and formulas is that students can quickly see how changing certain numbers affects the overall output or result of the formula. This allows them to focus on higher-level concepts (Roblyer, 2006, p. 132), but it is important, if students are new to spreadsheets, to teach spreadsheet concepts and familiarity before using them to teach math concepts (Thorsen, 2009, p. 237).
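To make the "what-if" benefit concrete, here is a minimal sketch -- in Python, standing in for a spreadsheet formula -- of the instant recalculation a spreadsheet performs when a student changes one input cell. The fundraiser scenario and all numbers are invented for the example.

    # A sketch of the "what-if" recalculation a spreadsheet makes effortless.
    # Hypothetical scenario: a class fundraiser selling boxes of cookies.
    def profit(boxes_sold, price_per_box, cost_per_box=1.50):
        """The 'formula cell': profit = boxes * (price - cost)."""
        return boxes_sold * (price_per_box - cost_per_box)

    # Change one "input cell" and instantly see the effect on the result:
    for price in (2.00, 2.50, 3.00):
        print(f"Price ${price:.2f}: profit = ${profit(200, price):.2f}")

In an actual spreadsheet this is even more immediate: the student edits the price cell and watches the profit cell (and any chart built on it) update automatically, which is exactly what frees them to think about the relationship rather than the arithmetic.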

Databases
Database software seems to be much more daunting for teachers to approach, perhaps because databases are a little more time-consuming and complex than Excel, or perhaps because teachers are less familiar with them, since a database would not be a likely tool for a class gradebook (they are, however, perfectly suited to student information systems -- SIS software). Many databases -- especially high-end ones used by businesses -- cost money. However, there are several smaller, easier-to-use databases, including FileMaker and Microsoft Access (part of the Office suite) as well as OpenOffice's free offering, Base. Computer programmers who want to create database-driven websites often use a free and powerful database called MySQL.

It's really a shame that databases are underutilized in the classroom, because they are truly powerful tools and are useful for organizing, filing, and "making sense" of large quantities of data. A "database" is just what it sounds like: a "base" or place for storing and organizing data. Like spreadsheets, this data can include a variety of formats such as dates, numbers, and text. However, the grouping and relationships between types of data can be more clearly defined in a database. Unlike spreadsheets, databases are used less for calculations or visual representations and more for organizing, analyzing, and predicting by using queries or "questions" to ask the database about what types of records it contains in its tables.

Relative Advantages: The biggest benefit of using a database is that it allows people to quickly find specific facts or information and to organize that information in a way that helps them determine relationships or answer questions. For this reason, it is an ideal problem-solving and inquiry-based learning tool. Effective uses of a database would be lessons that require students to describe something unknown based on its characteristics; make a decision or analyze a problem; or make a prediction based on existing data (Thorsen, 2009, p. 203). Databases also allow multiple people to access a central repository of data, which eliminates redundancy of storage and helps ensure accurate and up-to-date content. However, it is important that students have structured guidance in asking relevant questions and analyzing the results (Roblyer, 2006, p. 139).
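As a small illustration of this kind of query-driven inquiry, here is a minimal sketch using Python's built-in sqlite3 module. The table and records are invented for the example (anticipating the transportation theme below); a classroom tool like Access or Base would expose the same idea through menus and forms rather than code.

    # A minimal sketch of database inquiry using Python's built-in sqlite3 module.
    # The vehicle records are invented for illustration.
    import sqlite3

    conn = sqlite3.connect(":memory:")  # throwaway in-memory database
    conn.execute("CREATE TABLE vehicles (name TEXT, wheels INTEGER, max_speed_mph INTEGER)")
    conn.executemany(
        "INSERT INTO vehicles VALUES (?, ?, ?)",
        [("bicycle", 2, 20), ("car", 4, 120), ("bus", 6, 70), ("train", 8, 150)],
    )

    # A query is a "question" we ask of the data:
    # which vehicles travel faster than 60 mph?
    query = ("SELECT name, max_speed_mph FROM vehicles "
             "WHERE max_speed_mph > 60 ORDER BY max_speed_mph")
    for name, speed in conn.execute(query):
        print(name, speed)  # -> bus 70, car 120, train 150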

Example Academic Activities
Click here to see examples of activities that could be performed by students as young as elementary school as part of a thematic unit about transportation.

References

Roblyer, M. D. (2006). Integrating Educational Technology Into Teaching (4th ed.). Upper Saddle River, NJ: Pearson.

Thorsen, C. (2009). Tech Tactics: Technology for Teachers. Upper Saddle River, NJ: Pearson.