Success Story Time!
Swift
Written by Barbara Drescher   

We often talk about the need to learn the process of science, rather than simply memorizing the things that science has discovered. In fact, “science literacy” is defined by most as a combined knowledge of process and information. Indeed, the National Science Education Standards state that “A literate citizen should be able to evaluate the quality of scientific information on the basis of its source and the methods used to generate it.”

Evaluating the quality of scientific information is not easy, especially in areas in which the methods are complex. In my own field of psychology, most undergraduates are terrified of the core requirements for the major: intermediate statistics and research methods. These are notoriously difficult courses, but without them it is nearly impossible to evaluate the quality of any study in the field.

So if college students have a tough time understanding the scientific method or how to use knowledge of it to properly evaluate claims, how can we expect middle school students to learn this?

One solution is to teach the rather specific skill of skeptical thinking—separating science from pseudoscience. Because this approach is critical by nature, it promotes scientific thought in general. When we look for signs that something is pseudoscientific, we think about what methods would make it scientific. When we consider what questions need to be answered, we figure out how to answer them. For example, if someone claimed that studies showed that a bracelet had special powers to increase athletic ability, what would you want to know about how that study was conducted? The list (which might include how many people tested it, if all wore the bracelet, etc.) is one that tells you how to design a good study.

There are many colleges and universities with courses teaching about pseudoscience, but primary and secondary schools rarely have room in the curriculum. Supplementary programs are becoming more common and there are good indications that these programs are effective.

Dr. Van Dyke talks about the pseudoscience of creationism

David J. Van Dyke received a JREF Educator Grant to execute such a program. He held a two-week workshop during which he taught a group of 8th graders about the difference between science and pseudoscience, using examples of claims from skeptical history such as the “Mars Effect” and “orgone energy.”

Dr. Van Dyke is no stranger to this approach. His dissertation, available for download here, examined the relationship between origin beliefs (e.g., special creation) and science achievement. The good news is that beliefs do not seem to be a barrier to learning science for high school students, but they need the instruction to get there. Supplemental instruction in pseudoscience is an excellent way to increase science literacy.

The students in this year’s program recreated Emily Rosa’s “therapeutic touch” experiment, then chose a topic and designed their own double-blind study to test a claim. They presented their findings at a local science fair in a special category.

Students recreate Emily Rosa’s
“Therapeutic Touch” experiment
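The logic of Rosa’s design is simple enough to sketch in a few lines. In the sketch below (all numbers hypothetical, not data from her study or from this program), the experimenter randomly places a hand over one of the practitioner’s two hands behind a screen; if the claimed ability to sense a “human energy field” doesn’t exist, the practitioner’s guesses are effectively random, so roughly half should be correct by chance:

```python
# Illustrative simulation of a blinded "therapeutic touch" test.
# All trial counts are hypothetical; under the null hypothesis (no real
# ability), the practitioner's guesses are independent of the hand placement.
import random

random.seed(1)  # fixed seed so the sketch is reproducible

trials = 100
correct = 0
for _ in range(trials):
    placement = random.choice(["left", "right"])  # blinded hand placement
    guess = random.choice(["left", "right"])      # guess carries no information
    if guess == placement:
        correct += 1

print(f"{correct}/{trials} correct ({correct / trials:.0%}); chance is 50%")
```

A result near 50% is what a blinded test of a nonexistent ability should produce; a claim survives the test only if the hit rate is reliably above chance.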

But how do we know that they learned anything? We know because Dr. Van Dyke measured it.

The students took a pretest and a post-test to assess learning. On average, student scores increased by 6.7 points out of 20; individual gains ranged from 1 to 14 points, and none of the students’ scores went down.

This is often the best that we can do in real-world situations, but it leaves us with some questions. Any gains in scores could easily be attributed to something called “testing effects”—on average, most people perform better on a test the second time through it simply because they have experience with the test. This is why we need to compare treated subjects with those who have not received the treatment (control subjects), and he did. The average score of students in the control group remained the same, so even testing effects were not present. For the statistics-minded among you, a t-test comparing the change scores produced a Cohen’s d of 2.3 and a p-value of less than .001.
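For readers who want to see how an effect size like that is computed, here is a minimal sketch using made-up change scores (post-test minus pretest) for a treatment and a control group; the numbers are hypothetical and are not the study’s data:

```python
# Sketch: Cohen's d on change scores, treatment vs. control.
# Scores below are invented for illustration only.
from math import sqrt
from statistics import mean, stdev

treatment_changes = [1, 4, 5, 6, 7, 7, 8, 9, 10, 14]   # hypothetical gains
control_changes = [-2, -1, 0, 0, 0, 0, 1, 1, 0, 1]     # hypothetical, ~no change

m1, m2 = mean(treatment_changes), mean(control_changes)
s1, s2 = stdev(treatment_changes), stdev(control_changes)
n1, n2 = len(treatment_changes), len(control_changes)

# Pooled standard deviation across both groups, then
# Cohen's d = (difference of means) / (pooled SD).
pooled_sd = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
cohens_d = (m1 - m2) / pooled_sd

print(f"Cohen's d = {cohens_d:.2f}")
```

By the usual convention, d above 0.8 counts as a “large” effect, so a d of 2.3 (as reported above) is an unusually strong result for an educational intervention.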

Furthermore, gender was not a factor in either performance on the pretest or the effects of the course. Both groups were ethnically and economically diverse.

In other words, instruction clearly mattered. The program works.

This was a volunteer program. The kids were not graded on participation and were not required to participate, so why did they? I think there are two things that make this program work. First, the students were rewarded for completion of the program with a book (Bausell’s “Snake Oil Science”). Second, and perhaps more important, they presented their findings at a local science fair. The opportunity to show off one’s work and compete for recognition (the fair was judged) is a strong motivator. It makes the work exciting and fun.

The JREF is proud to have played a role in the work Dr. Van Dyke has done and is equally proud to offer other educators the opportunity to conduct their own study sessions by providing the basic materials he used. We have edited the lesson plan slightly to ensure that it is flexible enough for many situations, but otherwise the materials are intact.

The JREF offers a limited number of grants to educators, including scholarships to attend The Amaz!ng Meeting. All educators can apply for the TAM Teacher Grant here. If you are an educator with a project you would like to implement, watch Swift (at randi.org) for announcements regarding the next round of awards.

If you would like to help us create more of these success stories and invest in our most precious resource, our teachers, a gift to the James Randi Educational Foundation is tax-deductible. You can also contribute directly to the TAM Teacher Grant, which provides registration to The Amaz!ng Meeting to educators here.

Download:

Pseudoscience versus Empiricism Lesson Plan

What is Scientific? PowerPoint

Comments (9)
Quiz tests trivia, not understanding
written by Tim Smith, March 20, 2013
I think this kind of education is very valuable for both students and adults. I'd love to see it spread. I looked at the lesson plan, and tried out the 20-point online quiz linked there:

http://dsc.discovery.com/tv-shows/curiosity/topics/real-science-pseudoscience-quiz.htm

I assume it's the same one the students took. The quiz is made mostly of trivia questions that don't test understanding of skepticism or science. Really only the first question tests fundamental understanding. A typical question from the quiz is:

Practitioners of applied kinesiology (AK) diagnose illnesses and determine treatment (aura testing, muscle testing, saliva testing, skin testing).

Critical thinking skills and science literacy are no help in trying to answer these questions.

The results of the test, then, don't demonstrate that students learned the fundamental skills needed to think critically about claims to truth and to spot fakes, quacks and hoaxes in the real world.

I applaud the lesson concept. I understand that with resource constraints, it's reasonable to use a pre-existing test rather than develop a new one. However, I would encourage anyone who wants to implement this lesson plan to find or create a meaningful test, and not use the Discovery Channel's online quiz.
John Van Dyke
written by Johnvandyke, March 20, 2013
You are correct.

I should have used a better instrument. I was unable to locate a pseudoscience vs. science test online that met measures of reliability and validity. I considered using John Miller's "Civic Scientific Literacy" study, but that, too, measured just how much science-knowledge had been acquired. (As opposed to thinking critically.)

Next time, I'll write the pre- and post-assessments myself. Live and learn.
...
written by daveg703, March 20, 2013
Mr. Smith is correct. That alleged "test" is a trivial waste of time if you're trying to evaluate someone's grasp of skepticism and the scientific approach to a situation or a problem. As with most tests, it is created with the (often unjustified) belief of the designer that it will accomplish its purported mission. Unfortunately, that belief is little more than an assumption based on questionable evidence, with no standard of validity that can be applied. It comes down to the question of reliability versus validity. It is very easy to be consistently wrong.
Doing what is practical is better than doing nothing at all.
written by badrescher, March 20, 2013
Although the measure is far from ideal, it is also far from "a waste of time". It is extremely difficult to measure how well lessons transfer to other situations and it is extremely important to keep in mind the goals and scope of this project.

The project was not a scientific experiment. It was a course. Measuring its effectiveness is secondary, especially considering that these types of courses are not new or untested.

Furthermore, if anyone expects to teach critical thought in a single, two-week, one-hour/day course to 8th Graders, I think they've got a challenge on their hands. What these courses do is introduce the concepts of questioning claims. That is a huge step toward critical thinking, but it is just a step. Many of the questions in that quiz measure just that - the willingness to question claims. It is not "a waste of time" simply because it is not ideal.
John Van Dyke
written by Johnvandyke, March 20, 2013
Another problem with Miller's study is that it was written for adults, not middle-schoolers.

I also considered standardized assessments.
The MEAP test (Michigan's standardized assessment) measures the scientific process, but the ISTEP (Indiana's) does not. The problem with the MEAP, of course, is that I only had the students for two weeks and taking the science portion of the MEAP would have encompassed at least 1/4 of the sessions.
...
written by daveg703, March 20, 2013
I see that Mr. Van Dyke's comment slid in a split second ahead of my response to Mr. Smith. On my screen it was immediately after Mr. Smith's comment, but the comment counter read 3 comments. Be that as it may, what we have here is a fine micro example of the scientific approach, complete with peer review, acknowledgement of error, self-correction, and the promise to do better. Wonderful! Now if only Dr. Oz were amenable to participation in this healthy process.
...
written by Johnvandyke, March 20, 2013
It happens.

I wouldn't say the intervention itself was a waste of time by any reasonable measure. Most of the students didn't know the difference between astrology and astronomy (just one example) prior to the study, and informal formative assessments throughout the study confirmed they were able to differentiate the two.



Dr. Oz?
written by Johnvandyke, March 20, 2013
I was disappointed to read Dr. Oz promotes quack medicine, which was noted here.
