The PERC bridging session was kicked off by my colleague Mike Dubson, who presented an experiment we ran at Colorado comparing a MOOC with a traditional university course.

MOOCs have been hailed as revolutionary educational technology. What other revolutionary technologies have affected education? The printing press, and the gasoline engine (which allowed us to eliminate one-room schoolhouses). But there have been many other “revolutionary” educational technologies: movies, radio, television, the personal computer, the internet, and now… the MOOC. Why do we assume that a complex social issue such as education can be solved with a technological fix?

These all follow the Gartner hype cycle; we see a peak of inflated expectations, and then a trough of disillusionment, until we eventually reach a plateau of productivity. MOOCs are now near the trough of disillusionment; where will they settle out?

At CU Boulder, a MOOC provided an opportunity for an educational experiment. We taught our brick-and-mortar introductory course (PHYS 1110) in parallel with Coursera’s Physics 1. The two courses were made as similar as possible: the on-campus course used a 50-minute lecture with clicker questions and clicker discussion, and the recorded lectures were edited down to about 30 minutes of lecture delivery plus clicker questions. The reading assignments and schedule were the same, though the online course ran 12 weeks. The same homework and exams were used for both. However, the MOOC students didn’t have a recitation meeting using the University of Washington Tutorials, or access to the physics help room.

So, how did the two courses differ? The Coursera course, of course, had a much higher attrition rate: 41% of the students who attempted HW2 took the first exam, and the numbers continued to drop thereafter. (It’s been suggested that the number of students who attempt the second homework is a truer measure of enrollment, since the raw number of enrollees is large and largely irrelevant.) That said, many students watched all the lectures even if they didn’t do the assignments.
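The convention described above (treating HW2 attempters as the effective enrollment) is easy to sketch in code. The counts below are invented for illustration, chosen only so the first milestone reproduces the 41% figure; they are not the actual course data:

```python
# Illustrative retention calculation: treat students who attempted HW2
# as the "true" enrollment, then report each later milestone as a
# fraction of that baseline rather than of raw registrations.
activity = {
    "registered": 16000,     # raw sign-ups: large and mostly irrelevant
    "attempted_hw2": 2000,   # the suggested "true" enrollment baseline
    "took_exam1": 820,
    "took_final": 400,
}

baseline = activity["attempted_hw2"]
retention = {
    milestone: count / baseline
    for milestone, count in activity.items()
    if milestone != "registered"
}
print(retention)  # took_exam1 comes out to 0.41 with these invented counts
```

The point of the convention is visible immediately: measured against registrations the course looks like a 97% failure, but measured against students who actually engaged, the retention curve is interpretable.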

Students in both courses took the FMCE at the start and end of the course. The FMCE pre-test scores were surprisingly similar between the two courses. The MOOC students were older and better educated, and more of them were international and female.

The exam scores for MOOC students were slightly better than for the brick-and-mortar course, with a similar distribution. However, he found that because the attrition rate in the brick-and-mortar course was small, the weaker students (as measured by FMCE pre-test score) tended to stick around there, while in the MOOC they dropped out. The end result is that, by the end of the course, the surviving MOOC students were better prepared than the students in the brick-and-mortar course. Understandably, then, the MOOC students did better on the final exam, but they scored about as you would expect given their FMCE pre-test scores. The FMCE post-test score for the MOOC students was 80%, which is quite high.
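Pre/post concept-inventory comparisons like this are commonly summarized in physics education research with Hake's normalized gain, g = (post − pre) / (100 − pre), the fraction of the possible improvement that was actually achieved. The talk reported only the 80% post-test average, so the pre-test score below is an invented placeholder:

```python
def normalized_gain(pre: float, post: float) -> float:
    """Hake's normalized gain: fraction of possible improvement achieved.

    `pre` and `post` are percent-correct scores (0-100).
    """
    return (post - pre) / (100.0 - pre)

# Invented pre-test score paired with the reported 80% post-test average.
g = normalized_gain(pre=45.0, post=80.0)
print(f"normalized gain = {g:.2f}")
```

The normalization matters here precisely because the two populations differed: it lets you compare improvement across groups that started from different pre-test scores.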

So… MOOCs are great: they can be very effective if the student is well prepared and well motivated. They’re like a new kind of public library. But students spend thousands of dollars to come to a brick-and-mortar campus because learning is hard, and they need to be immersed in an environment that supports them and draws them into a culture of learning. The job of teachers can’t be mechanized, despite all the hype.

Questions included:

  • What was the cost to run the MOOC? About $10,000 in equipment and about $50,000 in faculty and staff time, which is unsustainably large; it was a full-time job. The second time around, though, it only required about 3 hours/week.
  • How was this as an educator; was it satisfying? I was initially discouraged by the lack of contact with students, so I just concentrated on the materials. But I gradually got drawn into the discussion boards, and felt that I was making contact with a few hundred dedicated students.
  • What percentage of time is spent in passive vs. active learning in the MOOC? We are still mining that data, but I would guess it’s mostly passive, given the time it takes to watch the videos.
  • Can we get an active learning experience like tutorials in the MOOC? It would be difficult to get students organized to work together, but perhaps Google Hangouts would be an option. Students did organize Facebook study groups, which we didn’t join.
  • Why did students not persist, or fail? We had a pre-survey about their interests and motivations. People who insisted they would stick with the course no matter what were actually more likely to fail, while those who said they would try very hard were more likely to persist. The difficulty is that it’s hard to get information about why students dropped the course.



Peer Instruction and Student Preparation (#AAPTsm14)

by Stephanie Chasteen on July 29, 2014

I write a lot about the effective use of clickers and peer instruction, so I was excited at AAPT to see a talk with some interesting results on this educational technique.  Judy Vondruska (South Dakota State University) spoke about the “Influence of previous subject experience on interactions during peer instruction.”  She was using clickers like a pro, including peer discussion, but she consistently found that about 10-15% of students indicated that they didn’t find those discussions to be beneficial.  She found that a lot of those people were those without high school physics.  She wondered if there were some issues being masked by lumping all the clicker outcome data together: were the discussions actually less beneficial for some groups, in terms of coming up with the right answer?

So, she looked across several different questions (I missed how many) with 46 students, some of whom had had high school physics and some of whom had not.  Students were paired so that both partners had had high school physics, neither had, or one had and one had not.  She then looked at whether the pairs had the correct answer to the clicker question before discussion and after discussion.  The most interesting categories (and those comprising the most students) are those who got the correct answer both before and after discussion, and those who got the incorrect answer before discussion and changed to the correct answer after discussion.

Here are the results:

  • Both students have HS physics: 50% correct before discussion + 28% switch after discussion = ~80% correct by the end of the question
  • One student has HS physics: 36% correct before discussion + 21% switch after discussion = 57% correct by the end
  • Neither student has HS physics: 30% correct before discussion + 9% switch after discussion = 39% correct by the end

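The arithmetic behind those bullets (fraction correct before discussion plus the fraction that switched to correct equals the fraction correct by the end) can be tallied directly. The percentages below are the ones reported above, treated as fractions:

```python
# For each pairing condition: (fraction correct before discussion,
# fraction switching from incorrect to correct after discussion).
# Values are the percentages reported in the talk.
conditions = {
    "both_hs_physics": (0.50, 0.28),
    "one_hs_physics": (0.36, 0.21),
    "neither_hs_physics": (0.30, 0.09),
}

for name, (correct_before, switched) in conditions.items():
    correct_after = correct_before + switched
    print(f"{name}: {correct_after:.0%} correct by the end")
```

Note that the first condition sums to 78%, which the talk rounded to roughly 80%; the gap between conditions in the "switched" column is the headline result, since it isolates the benefit of the discussion itself from incoming preparation.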
As you can see, the students without HS physics not only get fewer questions correct, they are also much less likely to change their answer as a result of discussion.  These students were also less likely to indicate that they would like to work with the same partner in the future.

So, discussion may be less useful for students with less background preparation.

This was very interesting to me in light of some previous research by Jenny Knight.  She found that, especially for difficult questions where few students had the right answer before discussion, student groups were able to put together the right answer even though none of the members had it in advance.  So, I would be interested to see Judy’s results broken down by question difficulty.  However, Knight et al. also found some nuanced results when looking at weak vs. strong students.  In a majors course, the weak students had large learning gains when they talked to their peers.  In a nonmajors course, however, the weaker students did NOT gain very much from discussing with one another.  Knight hypothesized, “One likely reason for this difference is that nonmajors were less inclined to regard their peers as learning resources.”

So, this seems to be a real phenomenon, though it’s not clear whether it occurs in both majors and non-majors classes.  I’ve seen this happen in my own nonmajors courses; it’s really hard to get good discussion going among the students on clicker questions, unless they already know their stuff.  They seem shy, and just not that interested.  So, the question remains: how can we make peer discussions among weaker students more valuable, and encourage students to participate in them?  I wonder if this relates to the fixed vs. growth mindset distinction, where these weaker students might feel like either they know it or they don’t (a fixed mindset), and so they just need to get the answer from an authority figure (the instructor), instead of recognizing that the process of reasoning through the ideas can help them improve (a growth mindset).



The gap between knowledge and practice (#AAPTsm14)

July 28, 2014

I’m at the American Association of Physics Teachers conference this week, and will be liveblogging from a few sessions. One of my main interests is how to support successful uptake of innovative educational techniques.  My talk on Wednesday will focus on some of the outcomes from the Science Education Initiative at Colorado, and lessons learned […]

Read the full article →

“Because the research tells me so”: Best practices in facilitating peer instruction

June 14, 2014

This is another repost from an article I wrote on the great i>clicker blog. — As a follow-up to last month’s post on research showing that peer discussion helps students learn, I’d like to share a variety of the messages that are coming out of the research on clickers and peer instruction – with particularly pertinent implications […]

Read the full article →

Do students learn by talking to each other?

May 30, 2014

Here is another re-post from an article I wrote on the i>clicker blog. —- This month I’d like to highlight a study which I think is crucially important in cementing the value behind peer instruction. It’s not new work anymore, but it so elegantly answers a key question – “do students learn by talking to […]

Read the full article →

New videos on undergraduate biology instruction

May 20, 2014

I’m happy to share the news about a new set of Creative Commons videos on undergraduate instruction – the Scientific Teaching series from iBiology:  These videos are all Creative Commons licensed so you can use them in your workshops, etc.  They have a newsletter you can sign up on to find out about new releases, […]

Read the full article →

How can you make a “good” clicker question GREAT?

May 16, 2014

This is another re-post of a blog post at the i>clicker blog. —- Sometimes we can be lucky enough to have access to a great set of clicker questions (see, for example, the list at But often a good set of questions for our course doesn’t exist, or another instructor’s questions don’t quite fit. Or, […]

Read the full article →

Getting students on-board with clickers and peer discussion

May 2, 2014

I have been blogging recently for the i>clicker blog (which has a lot of great articles on clicker use).  With their permission, I am reposting some of my articles here. —- I work a lot with faculty who are considering using clickers and peer instruction. Many faculty confide in me that they are concerned that students […]

Read the full article →

Free #clicker webinar: Facilitating Peer Instruction Effectively

January 25, 2014

I’m giving two free webinars this coming Wednesday on the use of clickers in the classroom to promote student discussion.  I’ve given a lot of these and they’ve always been very well received, come join us, it should be a good time!  Each is one hour long. 11 am PT / 2pm ET:  Recording  (I […]

Read the full article →

George Washington U. clicker workshop – Dec 10th

December 10, 2013

I am giving a workshop at George Washington University on the effective use of clickers, along with my wonderful colleague Stefanie Mollborn from Sociology.  This is a four-part half-day workshop, including information on facilitation, question writing, and tips for success. Do you want to learn how to use clickers – or any student voting technique […]

Read the full article →

Why I donated to PhET for #GivingTuesday

December 3, 2013

When I first came to CU from the Exploratorium — the premiere hands-on, “tinkering” science museum in the world — I was pretty disdainful about the idea of spending a lot of resources creating interactive simulations.  These aren’t hands-on, I thought, they’re fake, they’re missing the point.  Then I got to know the PhET simulations ( […]

Read the full article →

PhET is looking for a K12 specialist!

October 30, 2013

I work part-time with the PhET Interactive Simulation project (, which many readers are familiar with.  They have a rare position open, focusing on simulation design and use at the K12 level, and I wanted to share with you all!  Please share this announcement with others who might be interested. The online posting can be found […]

Read the full article →

Getting students to buy-in to non-traditional instruction

August 26, 2013

As the new semester is starting up, many of you are considering how to best promote student engagement in your course  – especially if you use non-traditional, research-based forms of instruction such as clickers, student discussion, or group work. We have a compiled set of approaches and materials, representing how instructors around the country help to […]

Read the full article →

PhET Simulations: Now on tablets! And a new logo!

August 21, 2013

Two big announcements from the PhET Interactive Simulations project! New!  Now for touch screens! First, PhET has been working their techie little butts off for quite a while to port their simulations over to HTML5.  No, I didn’t know what HTML5 was before this project started either.  It doesn’t really matter except that (a) it’s […]

Read the full article →

Clickers 101: Free webinar on Weds

August 19, 2013

Are you a college faculty member interested in clickers?  Come to our free, introductory webinar on Wednesday, 10:00 PT / 1:00 ET. To register, and for other webinars in this series, see (Note the session on October 30th geared towards humanities and social sciences, by my colleague Angel Hoekstra at CU Boulder). Handouts and slides […]

Read the full article →

Series of workshops on clickers and learning goals

August 19, 2013

I just completed a series of workshops on writing learning goals and using clickers to help with student achievement of those learning goals. You can find all the workshop materials on our website at the Science Education Initiative.  (Look for Past Workshops).  Includes handouts and slides, and you can download a zip of all materials. […]

Read the full article →

Postdoc job to transform UG courses at Colorado + STEM Center Director in Boston

July 26, 2013

Looking for a postdoc position in science educational research and course transformation?  Two exciting opportunities here at CU Boulder; these are fairly similar positions to my work here in the Science Education Initiative.  I get a lot of queries about where to find such positions, so hopefully this announcement will get out there to the […]

Read the full article →

How math anxiety affects performance (#PERC2013)

July 24, 2013

My other favorite talk at AAPT/PERC was by Sian Beilock (University of Chicago, Psychology), titled “Academic Performance under stress.”  Who would have guessed that from such an innocuous title would spring an intensely interesting, well-researched, sparklingly-clear exposition.  It is so refreshing to find a speaker who has clearly worked hard to communicate her field, and I’m […]

Read the full article →

Transformative experiences in science education (#AAPTsm13, #PERC2013)

July 23, 2013

One of the better talks at AAPT/PERC last week was one by Kevin Pugh of the University of Northern Colorado (Psychology dept).   Kevin discussed the psychology of a phenomenon that we are probably all implicitly familiar with as instructors, but wouldn’t generally consider to be the topic of scholarly work:  Under what conditions does […]

Read the full article →

Moving beyond telling faculty about educational innovations #aaptsm13

July 18, 2013

This post details a talk by Chandra Turpen about how faculty decide to adopt new instructional methods. A lot of previous work by Charles Henderson and Melissa Dancy has shown that the “develop and disseminate” model doesn’t work.  This is business-as-usual for educational innovators:  We develop innovations, share them at conferences and in papers, explain […]

Read the full article →