Over the past several years, I’ve been working with the PhET Interactive Simulations project to determine how best to make video “walk-throughs” of their simulations for teachers, showing the main features and how they can be used in a teaching setting. We call these short videos “teacher primers,” and I now have a set of them online that I wanted to share. (Note that you will need to be logged into the PhET site to view them; that keeps students from short-cutting the learning process by looking at the primers. Click on “For Teachers” to find each primer.)
- Color vision
- Resistance in a wire
- John Travoltage
- Balloons and Static Electricity
- Faraday’s Law
- Wave on a String
- Ohm’s Law (coming soon)
It was a great pleasure to work with master teacher Mike Dubson on most of these. I am but an instrument, bringing his great ideas to life. There are also other great primers out there on many other HTML5 sims. We certainly learned a lot while putting these together, both about the technical requirements of making a walk-through and about how to guide the viewer’s eye as you talk them through a visual landscape. These are fun to make, but more work than you’d think! If anyone would like access to our guidelines for making these, just let me know.
(Liveblogging from the AAPT meeting.) Adrian Madsen shared some work to identify faculty ideas and beliefs around research-based assessments in physics, such as concept inventories (think FCI) or non-content instruments (e.g., the CLASS). This work is part of a project by PhysPort.org to collect research-based assessments on a website, providing a more coherent portal so these assessments are more accessible to the broader faculty.
To design the site, they conducted a wide variety of semi-structured interviews with faculty to figure out their needs regarding assessment.
The following themes emerged:
- Faculty have practical needs: How do you find and administer assessments? How can you score and interpret them? PhysPort is addressing these needs by organizing the assessments, providing guidance on using them, and giving automated analysis through the Data Explorer.
- They believe research-based assessments are limited. The assessments might not be well aligned to the course the faculty member is teaching, or might be hard to interpret in a small course, and faculty are concerned about the content the assessments cover. These are valid concerns; they suggest a need for more flexible assessments, for assessments of non-content knowledge, and for ways to coordinate research-based assessments with other forms of assessment.
- Faculty want help. What do the scores mean? How do I compare to other faculty? How do I use these results to improve my teaching? How can I talk to colleagues about my scores? So, community resources (like learning communities) are helpful, as well as more accessible information and comparison data.
- Faculty consider broader contexts. There are programmatic assessments they need to consider, as well as accreditation requirements. Some faculty are also skeptical of these assessments, feeling that they will limit their academic freedom. To address these issues, PhysPort is currently working on departmental assessment tools that can feed into accreditation reports.
So, faculty make decisions about their teaching in complex ways, and what we really need are different kinds of data, as well as help in interpreting data, rather than just more data. PhysPort’s assessment page is at http://physport.org/assessments, and the Data Explorer, where you can upload your data and compare it to national samples, is at http://physport.org/DataExplorer. Their Expert Recommendations, which give helpful guidance, are at http://physport.org/expertrecommendations.