Computer simulations are a fantastic tool in education — especially in science. They can show us the models that we have in our heads, like how atoms pack themselves into a molecule, or let us travel to the moon to see how gravity works. They can give us insight into the invisible, or let us see exactly what’s supposed to be happening if the world were really a frictionless vacuum. Some great simulation tools are the PhET interactive simulations and the Physlet applets. But we too often show students the “right” model, the “right” way of doing the problem, or the “correct” way of thinking about it. Think about it, though: does a wine expert learn the taste of “oak” in a Cabernet only by tasting the very best exemplar of “oak”? No, he tastes a wide variety of wines, learning to distinguish flavors of “oak” from flavors of fruit or smoke. This idea of using “contrasting cases” to help students learn to discern and differentiate features of a problem has been used in a variety of educational settings, from teachers’ understanding of educational psychology to interactive lecture demonstrations. If you’re not familiar with this idea, or its twin, “A Time for Telling,” take the time to at least skim the seminal article by Schwartz and Bransford in Cognition and Instruction: “A Time for Telling.” This article changed the way I think about teaching and lecture.
But how does this relate to simulations? Most simulations do the best they can to depict an accurate vision of how the world really works. And that’s fine — that’s what the design goal tends to be. However, we also want to teach students to be critical consumers of information, yet they tend to blindly trust simulations. So I was very pleased to see a recent article in The Physics Teacher by Anne Cox et al., in which they took the instructional idea of asking students to identify “What, if Anything, is Wrong?” and applied it to instructional simulations. The “What, if Anything, is Wrong?” technique is a wonderfully simple and powerful tool from the TIPER project — check out their website for a variety of other little gems, such as ranking tasks, working-backwards tasks, and predict-and-explain tasks.
The authors of this study not only had students identify the error in a simulation, but provided the code for them to fix it, thus building computational skill into classroom practice as well. For example, we know that students tend to do poorly on questions about the electric force on a charged object due to another charged object. The authors created a simulation where students can add and move charges around, visualizing the force vectors on each object. However, if they make the two charges unequal, they will find that the charges no longer push on each other with equal and opposite forces (as required by Newton’s Third Law). So not only must students identify this error, but they must also find what in the simulation is causing it.
Regardless of whether your aim is for students to be able to do computational physics, the brilliance of this task is that students go in knowing that they’re looking for an error — and if that error is one that students often make themselves, then finding it is both challenging and illuminating, and MUCH more powerful than simply telling students the proper way to think about the concept. Now it’s cemented for them. The authors’ “What is Wrong?” package for electric fields is available on Open Source Physics.
For those who don’t happen to have the time or resources to create intentionally incorrect simulations, you can still use this same method with pencil-and-paper tasks, such as the TIPER “What, if Anything, is Wrong?” tasks, or “Find the Flaw” problems. Daniel Styer writes about Find the Flaw problems in the same recent issue of The Physics Teacher. He has a very nice method for giving these problems: he presents the problem to the students, and tells them that four friends have worked it and produced four different answers. He then asks students to provide simple reasons showing that three of these candidate answers must be incorrect. So this is basically a multiple-choice problem, with the focus on the incorrect answers rather than the correct one. He provides some examples here. I can imagine using this method with clicker questions and peer instruction — ask the clicker question, but instead of telling students to “find the correct answer” by discussing with their peers, have them determine why the incorrect answers are wrong. This gives students valuable practice in checking their own work — how do they know if their own answer is right or wrong? They should be able to check the values, units, dimensions (or whatever is important in your discipline). And, says Styer, students like them: find-the-flaw problems “appeal to their sense of adventure and of Sherlock Holmes-style sleuthing.” He finds that students do get better at checking their own work — certainly better than when students are just asked, vaguely, to “discuss your result.” What a boring question.
And, of course, this can easily apply to watching films in class. Finding the flaw in films is an old mainstay of science instruction — see, for example, Insultingly Stupid Movie Physics, Bad Astronomy’s Bad Movies, or the more geology-focused Good and Bad Sci-fi Movies. Showing students clips from movies and asking them to identify the incorrect science is fun and valuable.