Classroom Clickers and the Cost of Technology

by Stephanie Chasteen on February 11, 2009

There’s been a quite interesting (and sometimes vitriolic) exchange of ideas on the usefulness (and cost) of clickers in college classrooms, in which I recently took part.  A “clicker,” for those of you who haven’t heard of them yet, is just a little device that lets an instructor take a real-time poll of the class.  Each clicker has 5 buttons labeled A through E, corresponding to answer choices to a question posed by the instructor.  A real-time histogram is created showing the class response. In themselves, clickers are no magic bullet; they’re just a tool.  But we’ve found them to be incredibly effective tools for learning when we use them in conjunction with Peer Instruction (as coined by Eric Mazur).
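Under the hood, the receiver software is doing nothing more exotic than keeping each handset’s latest button press and counting them up. Here is a toy Python sketch of that tally (the clicker IDs and vote stream are invented for illustration; this is not any vendor’s actual software):

```python
# Toy sketch of a clicker receiver: keep each student's latest vote,
# then count votes into the A-E histogram shown to the class.
from collections import Counter

CHOICES = "ABCDE"  # the five buttons on each handset

def tally(votes):
    """votes is a stream of (clicker_id, answer) pairs; a student who
    presses twice is counted only by their most recent valid press."""
    latest = {}  # clicker_id -> last valid answer pressed
    for clicker_id, answer in votes:
        if answer in CHOICES:
            latest[clicker_id] = answer
    counts = Counter(latest.values())
    return {c: counts.get(c, 0) for c in CHOICES}

def ascii_histogram(counts):
    """Render the counts as a small text histogram."""
    return "\n".join(f"{c}: {'#' * n}" for c, n in counts.items())

# Example: student 2 changes their mind from B to C before the poll closes
votes = [(1, "A"), (2, "B"), (3, "C"), (2, "C")]
print(ascii_histogram(tally(votes)))
```

The one design choice worth noting is that only the latest press per handset counts, which is what lets students change their answer mid-discussion without double-counting.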

We ask students a challenging question and then ask them to talk to their neighbors about the question.  In this way, the clicker isn’t a quiz, but rather it focuses the class on a particular conundrum in the physics (or biology, or what have you) material being taught, and they get a chance to wrestle with it and teach each other.  Their answers are anonymous (at least to their peers), so there’s no embarrassment if they raise their hands for the wrong answer.  If they’re not sure of the right reasoning, their peers can often help them better than the instructor, who is a long way past his or her own struggles with the material.  And then the instructor gets a snapshot of whether the class is following his or her (brilliant) lecture — both by seeing the histogram, and by eavesdropping on conversations that students have with each other. It can be a fantastic tool.

But here’s the “controversy” (in quotes, because I’m not so sure that both sides are really arguing about the same thing).  In a recent issue of the Chronicle of Higher Education, Michael Bugeja argues that clickers are the product of successful marketing by companies who want to make a buck, at the expense of our students and academic integrity (see “Classroom clickers and the cost of technology”).  He argues this not just for clickers, but for technology in general (see “Could you be a Hoopla-dite” in the Chronicle).

He says, for example (from the article in the Chronicle):

Clickers, or “audience-response systems,” were designed in the 1960s in Hollywood to test unreleased movies, commercials, and television shows. A decade later, a retired planner at IBM, Bill Simmons, developed a rudimentary response system to simplify boring business meetings. Soon the business world commercialized and adapted audience-response systems to augment consultations and presentations.

Then, in one rhetorical stroke, manufacturers substituted “student” for “audience,” introducing clickers into education.

Institutions have much to learn from students about the cost and effectiveness of technology. Chief information officers need to be consulted before departments invest in expensive for-profit consumer technologies. Professors need to realize that technology comes at a price, even when advertised as “free.” Finally, administrators need to double their efforts at cost containment, demanding assessment before investment, especially in schemes that bypass mandated accountability standards.

Otherwise business as usual will continue to disenfranchise our students, who will hold their debt-ridden futures in their clicking hands.

There have been, of course, many responses to this, such as Richard Hake’s very complete and referenced online response, a detailed post (and response by Bugeja) on Derek Bruff’s blog, as well as several letters to the editor which appeared in the January issue, including one by me, and one by Doug Duncan (also at CU-Boulder).  Those are online at the Chronicle but only available to subscribers.  Derek Bruff commented on those letters as well.

Derek Bruff responds to Bugeja, in part:

I agree with some of Bugeja’s takeaways from his institution’s experiences with clicker vendors.  He argues that students should be involved in decisions about instructional technology, that chief information officers should be consulted by departments making such decisions, that faculty adopting technologies should be aware of not-so-obvious costs of using these technologies, and that administrators should be prudent when conducting cost-benefit analyses of new instructional technologies.

Those are all very sensible points.  However, I see some problems in the ways Bugeja uses clickers as an example in support of these points.  The fundamental weakness of the essay is that Bugeja seems to be doing a cost-benefit analysis on clickers without paying much attention to the benefits portion of that analysis.  As well-referenced as the cost portion of his analysis is, he fails to consider any of the research looking into the impact of teaching with clickers on student learning.

In kind, here is a portion of Doug Duncan’s response to Bugeja:

Most of the practices [Bugeja] describes are what our research shows to be worst practices. We see them fail, too. When instructors use clickers as part of peer instruction and explain to students that they will attend class more, work harder, learn more, and be rewarded for that, peer instruction and clickers produce learning gains. When instructors ask low-level memorization questions and don’t explain why they are using clickers, students call them dumb and worthless.

And here is my response in its entirety.  (Here is a PDF including Duncan’s whole response.)

Based on years of extensive systematic surveys, observations, and peer-reviewed research on personal response systems (“clickers”), we strongly disagree with many of the points raised in Mr. Bugeja’s December 5 article “Classroom clickers and the cost of technology.”  This simple tool, when used well, can result in a remarkable transformation of a university classroom, increasing how much students learn and enjoy a given course through increased interaction and engagement with their instructor and peers.  We have found no other single tool that achieves as many benefits for such low cost.

Mr. Bugeja argues that “manufacturers substituted ‘student’ for ‘audience’” when adapting clickers from audience-response systems in Hollywood.  We do not find it interesting to hypothesize about the intentions of manufacturers or their profit motives, but rather to adapt their tools for our needs. As a familiar example, computer manufacturers’ goal is profit, yet nobody doubts the utility of the personal computer as a tool for accomplishing great things.  Or not (think Tetris). And so, as with any technology, a clicker is simply a tool; in the hands of a craftsman it can be either an expensive toy or a vehicle for classroom transformation.  Here at the University of Colorado (CU) at Boulder we use iClickers campus-wide, which are designed to be technically simple and robust.  Students buy the iClicker once, for about $40 ($20 used), and use it in multiple courses. That’s it.  And from there, it’s up to the instructors.

Now, if instructors substitute “student” for “audience,” then there is a problem.  In fact, that already is the problem.  Decades of research on learning and cognition show that students learn more when they are engaged interactively.  But it’s hard to change university classrooms to encourage active learning, given a myriad of institutional constraints such as fixed stadium-style seating.  Clickers, however, offer an easy, cost-effective way to engage students in the material, by the following process:  (1) asking students a question that’s challenging, but not too hard, (2) giving them adequate time to discuss with their neighbors before they give their final vote, and (3) asking students to explain their answers, including why the wrong answers are wrong.

Using this method of clickers and peer instruction (modeled after that of Eric Mazur at Harvard), research at CU and elsewhere has shown that student learning increases compared to traditional lecture, across several disciplines. Additionally, several studies show that, after talking to their neighbors, students gravitate toward the correct answer, and the experience of talking to their neighbors substantially contributes to learning.  In addition, clickers give students a chance to practice communicating their thinking to their peers, a skill that they would not develop while passively listening to a lecture.  Classes that use clickers to ask simple quiz-like questions without peer discussion aren’t achieving these full benefits.

Obviously, teachers don’t need clickers in order to ask students thoughtful questions and have them discuss the answers with each other.  But clicker technology itself provides several key benefits that promote active engagement, namely:  (A) focusing the class clearly on a question, (B) having students commit to an answer, instead of retroactively deciding that they would have answered correctly, and (C) allowing the safety of anonymity. Mr. Bugeja quotes Ira David Socol claiming that clickers are “no more sophisticated pedagogically than raising your hand.”  Come now.  If most people are raising their hands for answer “A”, will you still bravely raise your hand for “B”?  And once you know the answer to the question, your own reasoning process has been short-circuited.  And lastly, clickers offer (D) the computer tabulation of responses, giving both instructors and students real-time systematic feedback about student understanding.

Mr. Bugeja hypothesizes that students would vote against the use of clickers because the costs outweigh the benefits.  Research suggests otherwise.  In our own large introductory physics courses, 95% of students stated that clickers were helpful to them in learning the material.  Studies in other disciplines suggest that students are more likely to value clickers when they’re used to promote discussion, rather than to ask simple questions or take attendance.

Sure, clickers cost students money.  But students are paying thousands of dollars to sit in your classroom.  Clickers can make this experience more educationally productive, for a marginal additional cost.  Let’s not concern ourselves with the business of the manufacturers.  We’re in the business of supporting the development of thinking minds.

Dr. Stephanie Chasteen (Physics Department) on behalf of the Science Education Initiative of the University of Colorado at Boulder
http://www.colorado.edu/sei
Stephanie Chasteen
Research Associate
Science Education Initiative
University of Colorado at Boulder
Boulder, Colo.

Note that Bugeja was not given the opportunity to respond to these letters in the Chronicle.  You can see his unpublished reply here. I was saddened to see that he still didn’t address the pedagogical usefulness of the tool.  He focuses on the cost ($20–40 for an item that can be used in multiple classes, unlike hundreds of dollars’ worth of textbooks good for a single semester).  On the other hand, those of us who advocate clickers are arguing for their pedagogical usefulness, while Bugeja is arguing on the basis of cost.  Are we comparing apples to oranges?  We advocates don’t want to tax students unnecessarily for their education.  We do the cost-benefit analysis and see the benefit far outweighing the cost.  Does Bugeja see no benefit, so that his cost-benefit analysis focuses only on the cost?

For the record, at CU we like the i>clicker system a lot!


sciencegeekgirl February 27, 2009 at 10:35 pm

Here’s another recent article in the Chronicle about clicker use — this one more positive in general.

http://chronicle.com/wiredcampus/article/3637/best-ways-for-professors-to-use-student-response-systems

Diane Bugeja August 27, 2009 at 2:42 am

What is the sample size and how was the survey distributed to students? The research methods should be examined here so the reader has some basis for understanding and drawing his/her own conclusion. Why not include a link to the study? Did I miss it? How many different schools were in the sample? What were the questions that were asked of students, and when? Separate survey, or part of the end-of-the-year evaluation? Were all, some, or none of the instructors trained in clicker use? If you are going to use research studies to shore up your point of view, don’t you think the reader should be given some important facts about the study?

sciencegeekgirl August 27, 2009 at 5:57 am

Diane,

What survey are you referring to??? I don’t see a reference to a survey in this post. My article to the Chronicle (reposted here) did not have room for detailed references, though I could certainly add them. Multiple studies document how peer instruction improves student learning, which I could certainly reference. However, the research literature is not the main point of the post. Here, I wished to air the discussion that was created by Michael Bugeja. In this post I argue that Michael and his critics were really not arguing the same issue, and I also gave some voice to his sadly unpublished response to the critique of his article (of which mine was one). Your comments are voiced in such a manner that it seems that you’re hostile to my ideas, rather than genuinely curious about the details of any research studies of clickers. I would be most interested if Michael Bugeja would be interested in responding thoughtfully to some of the respectful questions that I posed at the end of this post. I think that would further this discussion, and all of our understanding of the issue.
