New meta-analysis: Active learning improves student performance

by Stephanie Chasteen on March 27, 2015

It’s not quite so new anymore, but still exciting!

While we have more and more data showing that active learning techniques improve student learning, this field has sorely needed a systematic review of the evidence. Recently, a crackerjack team of education researchers stepped up to the plate with just what I’ve been looking for: a lovely meta-analysis that highlights the impact of active learning across disciplines, classrooms, and study designs (Freeman, Eddy, McDonough, Smith, Okoroafor, Jordt, and Wenderoth, PNAS, 111(23), 2014).

A meta-analysis is a study in which other studies (rather than people) are themselves the object of research. For this meta-analysis, 225 studies were chosen to determine whether active learning improves exam scores and/or lowers failure rates. Active learning was defined (after extensive analysis) as follows:

“Active learning engages students in the process of learning through activities and/or discussion in class, as opposed to passively listening to an expert. It also emphasizes higher-order thinking and often involves group work.”

All the studies contrasted traditional lecturing with some kind of active-learning intervention, delivered within class or recitation time, in any STEM field. The studies measured effects on student learning using exams, concept inventories, and other types of outcomes. The power of a meta-analysis is its ability to coalesce such diverse outcome data into a single measure – the average effect size. In this case, the effect size is a weighted average of the differences between traditional and active-learning classrooms, adjusted to a common scale.
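
To make that concrete, here’s a minimal sketch of how a standardized effect size (Cohen’s d) is computed for a single study; the exam means, standard deviations, and enrollments below are made-up numbers for illustration, not data from the paper. A meta-analysis then takes a weighted average of many such values.

```python
import math

# Hypothetical exam results for one study: a traditional section vs.
# an active-learning section (made-up numbers, not from the paper).
m_trad, sd_trad, n_trad = 71.0, 12.0, 120
m_active, sd_active, n_active = 76.5, 11.5, 115

# Pooled standard deviation: puts both sections on a common scale.
sp = math.sqrt(((n_trad - 1) * sd_trad**2 + (n_active - 1) * sd_active**2)
               / (n_trad + n_active - 2))

# Cohen's d: the difference in means, in units of the pooled SD.
d = (m_active - m_trad) / sp
print(f"effect size d = {d:.2f}")  # about 0.47 with these numbers
```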

Here are the core findings of the study:

  1. Student performance on exams increases by 0.47 standard deviations in active-learning classrooms – roughly equivalent to a 0.3-point increase in final grade.
  2. Students in traditional classrooms are 1.5 times as likely to fail as students in courses with active learning (failure rates were 33.8% and 21.8%, respectively); see the quick arithmetic check after this list.
  3. These results held true across different STEM disciplines, in courses for majors or non-majors, and in lower- and upper-division courses.
  4. Effect sizes were greater for concept inventories than for instructor-written exams.
  5. Active learning had the greatest impact in courses of 50 students or fewer.
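
As a quick sanity check on finding #2, here’s the arithmetic behind “1.5 times as likely to fail,” using the pooled failure rates reported in the study. (The paper’s own analysis models odds ratios weighted by study, so this back-of-envelope version won’t exactly match its headline numbers.)

```python
# Pooled failure rates reported in the meta-analysis:
fail_trad, fail_active = 0.338, 0.218

# Relative risk: how many times as likely a student is to fail
# under traditional lecturing.
risk_ratio = fail_trad / fail_active
print(f"relative risk = {risk_ratio:.2f}")  # ~1.55, i.e. 1.5 times as likely

# Odds ratio computed from these same pooled rates, for comparison.
odds_ratio = (fail_trad / (1 - fail_trad)) / (fail_active / (1 - fail_active))
print(f"odds ratio    = {odds_ratio:.2f}")  # ~1.83
```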

Point #4 above is interesting: the effect sizes were greater for concept inventories than for traditional exams. Concept inventories are tests developed by researchers to assess student learning of difficult concepts. They do not contribute to a student’s grade, and they allow comparison across courses, instructors, and institutions. The authors note that concept inventories often test higher-order cognitive skills (like critical thinking, rather than memorization). Active learning has been shown to have a greater impact on these higher-order skills, which may explain the higher gains on concept inventories. Additionally, instructor-written exams are likely to have variable levels of reliability and validity – in short, they may or may not measure the student learning that the instructor hopes they will.

But what about the fact that we tend to publish studies that show an effect, and not publish null results? The authors took that into consideration: they found that an additional 114 studies showing no effect would be needed to erase the result on student performance, and an additional 438 studies to erase the result on failure rate. Also, the effect size is likely an underestimate, because active-learning classrooms retain more students (including underperforming students) than traditional ones do. Lastly, as the authors point out, the effect size they calculated nearly matches that of two other meta-analyses.
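
For readers curious how a “how many null studies would it take” number can be derived, here’s a minimal sketch of the classic Rosenthal fail-safe N, using made-up per-study Z scores. I’m not claiming this is the exact variant the authors used; it just illustrates the logic of a file-drawer analysis.

```python
def failsafe_n(z_scores, z_crit=1.645):
    """Rosenthal's fail-safe N: how many unpublished null results
    (Z = 0) it would take to drag the combined Stouffer Z below
    the one-tailed p = .05 threshold (Z = 1.645)."""
    k = len(z_scores)
    z_sum = sum(z_scores)
    # Combined Z with n extra null studies is z_sum / sqrt(k + n);
    # solve z_sum / sqrt(k + n) = z_crit for n.
    return max(0.0, (z_sum / z_crit) ** 2 - k)

# Toy example: five studies with made-up Z scores.
print(f"fail-safe N = {failsafe_n([2.1, 1.8, 2.5, 1.2, 3.0]):.0f}")
```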

These results are very compelling, but the authors do note that it’s hard to know whether all faculty would achieve these results if they were required to use active-learning techniques. The volunteer faculty in these studies may be more motivated, or more skilled in such techniques, than average. Also, most of the studies were not randomized trials, which reduces the robustness of the results.

Some highlights of the authors’ discussion of the results:

  • If these were randomized medical trials, effects of this size on failure rate would have led the trials to be halted early for ethical reasons
  • The failure rates cited in the study translate to over US $3,500,000 in lost tuition, which could be saved through the use of active learning techniques (a rough sketch of how such an estimate is built follows this list)
  • Retaining students in STEM disciplines would help address our national “pipeline” problem, which is often exacerbated by low passing rates and low grades in STEM courses
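
I don’t know the exact enrollment and tuition inputs behind the authors’ $3,500,000 figure, but the shape of such an estimate is simple; here’s a sketch with entirely hypothetical numbers.

```python
# Entirely hypothetical inputs, just to show the shape of the estimate:
students = 10_000            # students enrolled across the courses
tuition_per_course = 1_000   # tuition dollars tied to one course

# Extra failures under traditional lecturing (33.8% vs. 21.8%), each of
# whom must retake (and pay again for) the course.
extra_failures = students * (0.338 - 0.218)
lost_tuition = extra_failures * tuition_per_course
print(f"lost tuition = ${lost_tuition:,.0f}")  # $1,200,000 with these inputs
```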

So, gosh, it’s hard to justify not using active-learning techniques in class!


This is a repost of my article on the i>clicker blog. 



Tutorials in Introductory Physics at CU

by Stephanie Chasteen on March 24, 2015

I just finished a short video on the use of Tutorials in Introductory Physics at the University of Colorado Boulder, and wanted to share it with you all.  It gives a good overview of Tutorials and why you would want to use them.

You can find out more about Tutorials here.

Here is a link to the playlist, which will eventually contain all the videos, including a history of Tutorials at CU and facilitation tips.



Using clickers in small classes

March 14, 2015

As more instructors are trying clickers and peer instruction in their courses, I get more questions about how to use them in small classes. I’d like to share a few things I’ve learned through talking with faculty who teach courses of various sizes. The first question I ask is, “what do you mean by small?” […]


Student motivation to engage with clicker questions

February 27, 2015

I’ve been doing a lot of reading in the educational psychology literature lately, to better understand what the learning sciences have to tell us about student motivation – and how that might relate to what we should do as instructors to motivate students to engage with clicker questions. I wanted to share what I’ve found […]


Learn the latest advances in physics education… from your living room

January 22, 2015

I’m excited to announce that the New Faculty Workshop videos are online! https://www.physport.org/nfw This is a project that I helped with, doing the filming and editing of the presentations.  For those of you who aren’t familiar with them, the Workshop for New Faculty in Physics and Astronomy is a 3-day workshop for new faculty in physics and […]


Videos on scientific teaching

January 9, 2015

I wanted to make a pitch for a very nice set of videos on research-based teaching methods: the iBiology Scientific Teaching Series. This is a series of videos about active learning in undergraduate biology education, but it is applicable across STEM. They are looking to publicize their videos and get feedback! From the producers: The videos include […]


Feedback codes: Giving student feedback while maintaining sanity

January 5, 2015

One of the most important things in learning is timely, targeted feedback. What exactly does that mean? It means that in order to learn to do something well, we need someone to tell us, specifically, what we can do to improve, soon after we’ve completed the task. Unfortunately, most feedback that students receive is too general […]


Learning, and assessing, collaboratively: Group Exams

December 29, 2014

I am one of many who are convinced that people learn better in collaboration with others.  However, there’s always this somewhat disturbing schizophrenia when it comes to assessment — we spend all this time emphasizing group work and collaboration, but come exam time — it’s everyone for him or herself. So I was very excited […]


Free webinar, December 11th: ClickerStarter

December 5, 2014

I’m giving another free webinar for i>clicker this coming Thursday, December 11th, at 10 am ET (7 am PT).  This is called “ClickerStarter for College Faculty” and is intended as a quick primer on the effective use of clickers for those who want an overview of the benefits and uses of clickers. Have you heard […]


Clicker Q&A

December 4, 2014

As some teachers are just getting things rolling with clickers and peer instruction for the Spring, I thought I would share some questions that faculty have asked me about clickers and peer instruction. This is something I’ve added recently to my workshops, and am really liking it – I ask participants to share their questions in advance, […]


Why NOT to grade clicker questions for correctness

November 15, 2014

One thing that faculty really struggle with is whether or not, and how much, to give students credit for their clicker question answers. You want to give students some incentive to participate, but grading opens a whole can of worms. One of my faculty workshop participants explained the dilemma very astutely: “If I do not […]


Measuring and improving students’ engagement

November 2, 2014

I’ve been working over the last year or so to better understand how to promote student buy-in to interactive techniques such as clickers and group work.  That work resulted in a set of resources on how to “frame” students’ roles in the class, especially in the first week. Now I’ve been delving deeper into this […]


What is effective feedback? And how do clickers provide it?

October 2, 2014

Another re-post from my work on the iclicker blog. Last time I wrote about how clicker questions fit into a theoretical framework of assessment, and some considerations for aligning your clicker questions with your goals for your course. This week I want to review some of the literature on what features and kinds of feedback are most […]


Backwards design: Where clicker questions fit into a framework of assessment

September 14, 2014

This is a repost of my work on the iclicker blog.   Lately, I’ve been thinking about the purpose and approach that we take in various forms of assessment. Today I’d like to step back into a little bit of theory-land, and consider a broader framework of assessment, and the ways that clickers fit into […]


Using clickers in social sciences and humanities: No-one-right answer questions

September 4, 2014

This is a re-post from my work on the iclicker blog. There are lots of different types of clicker questions you can draw from (see last post for some examples), but there’s a clear distinction between two types: questions that have a right answer vs. questions that don’t have a right answer. Questions that […]


Opening your eyes to new types of clicker questions

August 25, 2014

This is a re-post from material that I’ve shared on the iClicker Blog. One of the best things that I think you can do to get fresh ideas for clicker questions is, simply, to look at lots of different types of questions. One of the things that I have enjoyed the most about giving workshops […]


FTEP workshops on learning goals and clickers

August 12, 2014

I am giving a set of three workshops on learning goals and clickers at the University of Colorado; here are the slides and handouts for participants to download.  Please let me know if you have any problems with these or are looking for something that’s not here.  (I gave a similar set of workshops at […]


Spreading reform – beyond development and dissemination (Raina Khatri, #aaptsm14)

August 11, 2014

I’m catching up on some blog posts from the AAPT meeting.   I have to say, it’s nice to blog again, and I hope to make some time for it in the future! Writing a grant?   One effort that I wanted to make sure that more people know about is the Increase the Impact project […]


Lessons learned from 8 years of institutional transformation (#aaptsm14)

August 7, 2014

I was so busy blogging about everybody else’s presentations that I haven’t had a chance to write about my own talk at AAPT!  I’ve been working madly for the past few months to pull together a monstrosity of data on the outcomes and lessons learned from our work in the Science Education Initiative at the University […]


Plenary at #perc2014: Carl Wieman and the future of PER

August 1, 2014

My mentor Carl Wieman was called upon to synthesize some of the main themes of the physics education research conference (PERC) this year. Here are some of the things he discussed. Note that he had a hard job: drawing meaning from a lively conference, with little preparation time! Talking to some of […]
