Providing validated tests for instructors to use (#AAPTsm11)

by Stephanie Chasteen on August 4, 2011

Our plenary speaker this morning was Thomas Holme, of Iowa State University, speaking to us about standardized assessments in chemistry.  Sounds boring, but he raised some interesting and insightful thoughts about assessment.

He started out by describing the fine line he has to walk as an instructor:

“Teaching is inherently personal and inescapably corporate.  The corporate interests are largely articulated in terms of assessment.”

“Assessment” has typically been equated with “accountability,” he pointed out, making it somewhat unpopular as a topic.  This is unfortunate, since assessments are useful on so many levels.

The American Chemical Society (ACS) develops validated exams for use in courses every year, which I found astounding.  Back in 1921 the Division of Chemical Education decided to construct objective tests for use by the education community.  These exams are written by committee: over a series of meetings the committee sets the content coverage, writes the items, edits them, runs trial testing in classes, examines the item statistics, and then sets the exam.
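To give a flavor of what “item stats” can mean in trial testing, here is a rough sketch of two classic quantities a committee might compute for each question: the difficulty index (fraction of students answering correctly) and a point-biserial discrimination (correlation between getting the item right and the rest of the score).  This is my own illustration, not the ACS Exams Institute’s actual procedure.

```python
# Sketch of two classic item statistics used in trial testing:
# difficulty (fraction correct) and point-biserial discrimination.
# Illustrative only; not the ACS Exams Institute's actual workflow.
import numpy as np

def item_stats(responses):
    """responses: (n_students, n_items) array of 0/1 item scores."""
    responses = np.asarray(responses, dtype=float)
    totals = responses.sum(axis=1)
    difficulty = responses.mean(axis=0)            # fraction correct per item
    discrimination = []
    for j in range(responses.shape[1]):
        rest = totals - responses[:, j]            # total score excluding item j
        r = np.corrcoef(responses[:, j], rest)[0, 1]
        discrimination.append(r)                   # item vs. rest-of-test correlation
    return difficulty, np.array(discrimination)

# Made-up trial-test data: 5 students, 4 items
data = [[1, 0, 1, 1],
        [1, 1, 1, 0],
        [0, 0, 1, 0],
        [1, 1, 1, 1],
        [0, 0, 0, 0]]
p, r = item_stats(data)
print("difficulty:", p)
print("discrimination:", r)
```

Items that nearly everyone gets right (or wrong), or that correlate poorly with the rest of the test, are the ones a committee would typically rewrite or drop before setting the final exam.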

They’ve written a variety of instruments: the full-year chemistry exam, first- and second-term exams, conceptual exams, and brief exams.  A new exam is added every year.  There is a bit of an issue with publishing averages, however.  They invite users to send in their course averages, but there is a strong self-selection effect, since teachers often don’t send in poor scores.  When they published their pilot-test averages, they found a “Lake Wobegon” effect: nobody would send in scores that were below that average.  Still, the averages are helpful for instructors, who can use them to set the grading scale for the exam.  (Unfortunately, I missed their solution to this problem.)
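The self-selection he describes is easy to see in a toy simulation (mine, not his): if instructors only report class averages that beat the previously published norm, the recomputed “national average” drifts above the true one.

```python
# Toy simulation of the "Lake Wobegon" effect from self-selected reporting.
# Assumes (hypothetically) that instructors only send in class averages
# at or above the previously published norm.
import random

random.seed(0)
true_mean, spread = 60.0, 8.0      # hypothetical true distribution of class averages
published_norm = 60.0              # the pilot-test average that was published

class_averages = [random.gauss(true_mean, spread) for _ in range(500)]
reported = [a for a in class_averages if a >= published_norm]  # only "good" scores get sent in

print(f"true mean of all classes:     {sum(class_averages) / len(class_averages):.1f}")
print(f"mean of voluntarily reported: {sum(reported) / len(reported):.1f}")
```

The reported mean comes out several points higher than the true one, which is exactly why averages collected from volunteers overstate performance.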

The process seems quite well thought out, including checks on content validity, psychometrics, and so on.  Good stuff.  And more, of course, than we could expect an individual instructor to do.  This is a grassroots process, he says; ACS isn’t telling anybody how to teach.  They hope the exams will be used in 15–20% of classes.  Hundreds of volunteers put in time to write the exams.  Interestingly, the ACS doesn’t actually fund the tests; the program makes its money by selling the exams to instructors.

“I’m simultaneously jealous that ACS is doing this,” commented Noah Finkelstein, “and happy that APS isn’t doing it because then I’d have to do it.”

Still, we’ve noticed in PER that instructors aren’t using research-based assessments, and typically rely on exams and homework to tell whether their instruction is working.  If they’re not using instruments like the FCI, I wonder whether we could develop research-based assessments that give instructors meaningful feedback on student performance and are also valued by the administration.  But how would we avoid the K-12 dilemma of “teaching to the test” and judging instructors too heavily on student performance?

{ 2 comments }

BlackGriffen August 4, 2011 at 6:10 pm

“But how would we avoid the K12 dilemma of “teaching to the test” and assessing instructors overly on student performance?”

Not to mention the problem that users of “standard” textbooks face, with solutions being posted online?

Krishna Chowdary August 5, 2011 at 6:40 pm

Hi Stephanie

(thanks for your reports on the meeting!)

Here’s a link to the ACS Division of Chemical Education Examinations Institute

http://chemexams.chem.iastate.edu/
