This is a repost of my work on the iclicker blog.


Lately, I’ve been thinking about the purpose and approach that we take in various forms of assessment. Today I’d like to step back into a little bit of theory-land, and consider a broader framework of assessment, and the ways that clickers fit into that broader framework.

Point 1: There is a disconnect between how we see a course, and how our students see the course.

As instructors, we have strongly-held values about what our course is about, and why it’s important. But we don’t always teach for what we value. Sometimes we cram in content because it’s interesting, even though it doesn’t serve our broader purpose. Sometimes we think that the ultimate point of our instruction is clear through our activities and assignments – but that is not always the case.

Students operate in a different reality from us. Because of this, and the fact that our course goals are not necessarily as transparent as they could be, it is critical to be explicit about our expectations for students, and the connections between our instruction and the course goals.

What does this have to do with clickers? Hang on, I’ll get there.

Point 2: Designing a course around core objectives provides clarity – for instructors and students

One way to address the mismatch I describe above is to explicitly design our courses around learning goals that are identified in advance. This isn’t how we usually design a course – a common approach is to decide what goes into the course by finding activities that we think will be fun and interesting for students, and building the course around those. Instead, consider starting with your goals, in an approach called “Backwards Design”. Backwards Design isn’t a prescribed set of steps, but rather a philosophy of education. First, define explicit learning outcomes, such as “Students will be able to recognize equilibrium points on a graph.” (You can read more about learning goals here.) Then, determine how you will assess each outcome – for example, by having students use graphs to predict the behavior of objects. Then, LAST, identify the instructional approaches that will help students succeed on those assessments – perhaps working in groups on a tutorial on equilibrium, and getting more practice on their homework.
  • Backwards Design: Start at the End
  • Backwards Design & Alignment: An example

Point 3: Frequent assessment gives powerful feedback as to whether students are achieving your goals

So, now that you have your clearly defined objectives, how do you know whether students are making adequate progress? How do your students know whether they’re on track? Assessment is a critical piece of instruction – it’s not just about finding out whether students got the message at the end of the day; assessment is about continually evaluating and giving feedback to students on where they stand. In fact…

Rapid, targeted feedback is perhaps THE most important element of learning.

Without feedback, we can’t improve. This kind of feedback is achieved through “formative assessment” – “assessments that provide information to students and teachers that is used to improve teaching and learning” (NRC, 2001). Compared to exams and other end-of-instruction assessments (termed “summative assessment”), formative assessments are low-stakes, aimed at helping students improve – and at helping teachers appropriately target instruction – before the exam. Another way of looking at it: formative assessment is “when the cook tastes the soup,” and summative assessment is “when the customer tastes the soup.”

Point 4: Clickers are an excellent form of formative assessment, for student and teacher

So, here’s where we get to clickers. Clickers and peer instruction fit perfectly into the formative assessment model – clicker questions are low-stakes, rapid, ongoing opportunities for students to find out how they’re doing, for instructors to see how the class is going, and for both students and instructors to redirect their efforts based on those results.

Point 5: Align your clicker questions with your goals for students

This is important! Too often, we write clicker questions that simply test students’ ability to recall basic information. Those are, after all, the easiest questions to write. But if our goals are that our students be able to evaluate information, or analyze pieces of an equation – we had better be asking those kinds of questions through our use of clickers, for two reasons: (1) to give students practice in achieving our goals, and (2) to appropriately communicate our expectations to students. Otherwise, students may not find out what’s important to you until the exam, when it’s too late. Clickers define for students, continually, what it means to you to “understand” a topic. See my previous post on “Bloomifying Up” your questions for more tips on this.

[Image: Backwards Design alignment – align your goals, instruction, and assessments]

So, in summary:

  • Write learning goals for your courses
  • Use clickers to assess student achievement of those learning goals
  • Make sure your clicker questions are aligned with your actual expectations for students

In the next post I’ll write more about how clickers tie into the research on effective forms of feedback.



This is a re-post from my work on the iclicker blog.

There are lots of different types of clicker questions you can draw from (see last post for some examples), but there’s a clear distinction between two types of questions:

  • Questions that have a right answer
  • Questions that don’t have a right answer

Questions that DO have a right answer are the canonical fodder of many a clicker user – we pose a question, offer a list of answer choices that represent common errors students make on that question, and ask them to hash it out amongst themselves.

But in the social sciences and humanities, such questions often don’t quite cut the mustard – what instructors in these disciplines often want students to do is wrestle with course content, arguing for a position or a point, or juxtaposing two theories or points of view. While it’s important to know certain facts, or to understand certain concepts, in order to argue for a position, such knowledge or understanding is not sufficient. Thus, it may be fruitful for instructors in the humanities and social sciences to move beyond one-right-answer conceptual questions to a broader range of questions that push students to draw on their own experience, opinions, or arguments. No-one-right-answer questions can be very effective at doing this.

Note: I believe that instructors in the natural and physical sciences could also benefit from using more no-one-right-answer questions, and many of these techniques can be used in such courses. However, for the sake of focus, this article will address only the social sciences and humanities.

A recent article by a pair of sociologists gives a wonderful framework for the types of questions that can be used in sociology, and forms the basis for this article: “A Meeting of Minds”: Using Clickers for Critical Thinking and Discussion in Large Sociology Classes (Mollborn and Hoekstra, Teaching Sociology, 28, 2010).

The authors identify several types of questions for achieving various goals.

Some types of one-right-answer questions. The following can be used to assess student understanding of the material:

  • Reading quiz questions
  • Concept questions

For example, one might ask students to state ideas from the reading, to test their comprehension or attention. Or, ask students to apply an idea or concept to predict an outcome.

An example from Mollborn and Hoekstra:

Does the sex labeling of occupations affect supply-side gender discrimination, demand-side gender discrimination, or both?

  1. Supply side only
  2. Demand side only
  3. Both
  4. Neither
  5. Don’t know/other

In this case, there is a correct answer: option 3, “Both”. Such questions have their utility, but Mollborn and Hoekstra remarked that using too many such conceptually oriented questions resulted in “a learning community that felt examination oriented, rather than a cooperative exploration of course material, and these questions seemed too ‘detached’ from real-life experience.” This led them to explore other types of questions to support students’ ability to apply, discuss, evaluate, and critique.

Some types of no-one-right-answer questions. On the other hand, a rich set of question types that have no single right answer can be used to elicit higher-order thinking skills:

  • Demographic questions
  • Opinion questions
  • Past experience questions
  • Student-designed questions

Such questions can help students wrestle actively and deeply with ideas – by relating material to real-life experiences and data, by critiquing sociological theories and methods, or by improving the learning experience itself.

For example, opinion questions can be used to gauge students’ incoming ideas about a topic and initiate discussion. Or they might be used after a lecture, to see whether students’ ideas have changed. Students can also be asked to give their opinions as a way to directly build disciplinary skills – such as whether they agree with a particular research finding, or whether they feel a survey question is well worded.

A particularly powerful example of an opinion question is given by Mollborn and Hoekstra:

How much do you personally think cultural factors explain differences in evidence of violent behaviors between men and women?

  1. Not much at all
  2. A little
  3. They are sometimes useful
  4. They explain most of what we see
  5. Don’t know/other

The social sciences are particularly rich areas for such questions, since their topics of study are very accessible to students through prior experience and knowledge. However, the physical sciences can also leverage students’ interest in course material through such opinion or prior-knowledge questions, which can motivate students to engage with the lecture topic.

Past experience questions are also highly relevant in the social sciences, since we (humans) are the unit of study. Students can compare their experiences with those of other students in the class, or with the population of a particular research study. One of my favorite example questions in this area is given in the paper:

When you were growing up, which of your parents earned the most money?

  1. Don’t have two opposite-sex parents, one or both didn’t work / varied year to year
  2. Dad usually earned a lot more
  3. Dad usually earned a little more
  4. Mom usually earned a lot more
  5. Mom usually earned a little more

Dr. Mollborn tells me that her students usually don’t quite believe that gender can influence wages – the gender wage gap seems like something that happens to other people. However, class results for this question consistently mirror published research: men typically earn more than women in a household.

Demographic questions are another question type highlighted in the article. In the social sciences, data is often presented about a particular population – such as its ethnic diversity. The authors describe asking students to use clickers to indicate their race, overlaying these responses with US census data, and using the comparison to prompt a discussion of how the wording of survey questions can affect results.

Lastly, the authors suggest that upper-level students can even design their own clicker questions as part of an assignment.

All these question types place a heavy emphasis on discussion. But unlike conceptual questions – where students need to discuss the answers with their neighbors in order to construct their understanding – with no-one-right-answer questions the power often lies in the aggregate student responses.

Thus, the instructor must learn to gracefully facilitate a whole-class discussion, using such real-time class statistics to build the point that he or she wishes to emphasize. Often, these no-one-right-answer questions serve as a springboard for a broader class discussion.

Reference: “A Meeting of Minds”: Using Clickers for Critical Thinking and Discussion in Large Sociology Classes (Mollborn and Hoekstra, Teaching Sociology, 28, 2010).


