Medical Microbiology at Univ of Florida College of Medicine, Gainesville
(Southwick FS: Spare me the powerpoint and bring back the medical textbook. Trans Am Clin Climatol Assoc 118: 115-122, 2007.)

At the Univ. of Florida (Gainesville), Frederick Southwick changed the way his medical course in Microbiology & Infectious Disease was taught, from a traditional lecture-based (PowerPoint-intensive) course to one using Just-in-Time Teaching & Peer Instruction. Students' NBME shelf exam scores increased from an average between the 50th & 60th percentiles to the 83rd percentile in one year.

Another interesting tidbit: when they switched to active learning strategies, they relied heavily on a textbook for background reading by students – readings to be done prior to class. Textbook sales skyrocketed.

(Personal Note: Our Medical Pharmacology course at Tulane is fairly traditional in the way it is taught, and almost all lectures use PowerPoint. I went down to our Medical School bookstore and asked how many of our “recommended” Med Pharm textbooks they sold this academic year. The answer was 20. We have a class size of 175 medical students + 25 graduate students. However, they did sell ~100 of the companion board review books. I think I see a parallel here between our Med Pharm course & what the Microbiology course experienced in Gainesville!)


Medical Respiratory Physiology at Wayne State Univ.
(Rao SP & DiCarlo SE: Peer instruction improves performance on quizzes. Adv Physiol Educ 24:51-55, 2000.)

Rao & DiCarlo used peer instruction during 10 classes of 50 min duration. Each class session was divided into short presentations of 12-20 min, each followed by a quiz on the subject discussed. Students (256 1st-year medical students) answered a total of 35 single-best-answer questions during the 10 class sessions.

Simple Recall Questions:
Before Peer Instruction: 94.3 +/- 1.8%
After Peer Instruction: 99.4 +/- 0.4% (p<0.05)

Intermediate Questions:
Before Peer Instruction: 82.5 +/- 6.0%
After Peer Instruction: 99.1 +/- 0.9% (p<0.05)

Integrative Questions:
Before Peer Instruction: 73.1 +/- 11.6%
After Peer Instruction: 99.8 +/- 0.24% (p<0.05)

Renal Pathophysiology at UC Davis School of Medicine
(Stevenson F: Clickers. The use of audience response questions to enliven lectures and stimulate teamwork. JIAMSE 17: 106-111; 2007)

Frazier Stevenson reports on his use of an ARS over the past two years in his 2nd-year medical course in renal pathophysiology. His results indicate that its use increased lecture interactivity & student teamwork, provided formative feedback, & energized both faculty & students. He also describes the limitations & challenges of using an ARS. His 2nd-year medical students rated the educational value of the ARS highly (6.8 on a 7-point scale, n=54). The descriptors they associated with its use (by percent responding) were:

Percentage of Respondents:
75%: “added needed variety to the session”
70.5%: “clarifying”
77.3%: “provided feedback on my understanding”
72.7%: “stimulating”
25%: “confusing”
2.3%: “tedious”


OB/Gyn Residents – Contraceptive Options at UMDNJ-Robert Wood Johnson School of Medicine
(Pradhan A, Sparano D, Ananth CV: The influence of an audience response system on knowledge retention. An application to resident education. Am J Obstet Gynecol 193: 1827-30, 2005)

17 ObGyn residents were randomized into 2 groups. One group was given a traditional lecture on “contraceptive options” (n=9), and a second group (n=8) was given an interactive lecture on the same topic using an audience response system. Each group was given a pre-test before the lecture and a post-test 6 weeks after the lecture. There was no significant difference in pre-test scores (indicating that both groups had similar background knowledge prior to the lecture). Residents given a traditional lecture had only a 2% improvement in scores between the pre-test & post-test. However, the group given the interactive lecture had a 21% improvement between the two tests (p=0.018).

Traditional Lecture:
Pre-test: 80 +/- 2.8%
6 weeks Post-test: 82.4 +/- 2.3%

Interactive Lecture:
Pre-test: 78 +/- 1.4%
6 weeks Post-test: 95 +/- 1.6%

Conclusion: Residents given an interactive lecture retained significantly more of the lecture material 6 weeks later than residents given a traditional lecture.


Family Medicine Residents – at Univ. of Illinois, Chicago & St. Elizabeth Hospital
(Schackow TE, Chavez M, Loya L, Friedman M: Audience response system. Effect on learning in family medicine residents. Family Medicine 36(7): 496-504, 2004.)

A prospective controlled crossover study of 24 family medicine residents compared quiz scores after: a) didactic lectures with no interactive component, b) lectures with an interactive component (questions were posed to participants) but without an audience response system (ARS), and c) interactive lectures that included the use of an ARS. Post-lecture quiz scores were significantly higher following interactive lectures (with or without an ARS) than following didactic lectures with no interactive component (p<0.001).

Post-Lecture quiz scores (7 point scale):
Didactic lecture: 61% correct (4.25 +/- 0.28) (n=32)
Interactive lecture w/o ARS: 93% correct (6.50 +/- 0.13) (n=22)
Interactive lecture w/ ARS: 96% correct (6.70 +/- 0.13) (n=23)

Quiz scores obtained in a follow-up session a month after the lecture were lower, but remained highest in the ARS group.



Introductory Biology at Rensselaer Polytechnic Institute
(McDaniel CN et al.: Increased learning observed in redesigned introductory biology course that employed web-enhanced, interactive pedagogy. CBE Life Sci Educ 6:243-249, 2007).

A “knowledge gain” was obtained by measuring the difference in performance on a standardized exam given before & after the course (pre-test vs. post-test). Two courses taught with interactive peer instruction & Web-based pre-class activities showed a significant increase in knowledge gain compared to the same courses taught in a traditional manner:

Course Type & Knowledge gain (post test – pretest)
Traditional Ecology Course Knowledge Gain: 0.10
Interactive Ecology Course Knowledge Gain: 0.19 (p = 0.024)

Traditional Evolution Course: 0.05
Interactive Evolution Course: 0.14 (p = 0.000009)


Biology Courses at New Mexico State Univ.
(Preszler RW et al.: Assessment of the effects of student response systems on student learning and attitudes over a broad range of biology courses. CBE Life Sci Educ 6:29-41, 2007.)

Preszler et al. (2007) observed a correlation between the extent of a course's use of response systems & student performance on exams in six different biology courses. Student surveys also showed that the use of response systems increased student interest in the courses, attendance, and understanding of course content.


Introductory Biology at Univ. of Washington
(Freeman S, O’Connor E et al.: Prescribed active learning increases performance in introductory biology. CBE Life Sci Educ 6:132-139, 2007)

Introduction of active-learning & audience response system – based questions in lectures:

• Reduced failure rates
• Increased exam scores
• Increased attendance
• Students did better on clicker questions if they were graded for correctness vs. for participation.


Developmental Biology (Upper level college course) at Univ. of Colorado, Boulder CO.
(Knight JK, Wood WB: Teaching more by lecturing less. Cell Biol Educ 4:298-310; 2005.)

Knight & Wood compared student learning gains in a large upper-division lecture course in developmental biology before and after changing to a more interactive classroom format. They used a mixture of classroom formats employing an Audience Response System and peer instruction, and a course pre-test vs. post-test method to assess learning gains when the course was taught with a traditional lecture format vs. one that included a component of active learning. The course taught with an interactive component had a significantly higher learning gain (p=0.001).

Teaching Method Learning Gain (post vs. pre test)
Traditional Course: 16%
Interactive Course: 33% (p=0.001)


Undergrad Exercise Physiology at Wayne State Univ.
(Cortright RN, Collins HL, DiCarlo SE: Peer instruction enhanced meaningful learning: ability to solve novel problems. Adv Physiol Educ 29:107-111, 2005.)

A randomized crossover design was used to divide students into two groups of 19 each. The group that underwent peer instruction performed significantly better both on multiple-choice questions and in solving novel problems.

Solving Multiple Choice Questions:
Traditionally Taught 44 +/- 5%
Interactive Course: 59 +/- 6% (p<0.02)

Solving Novel Questions:
Traditionally Taught: 24 +/- 2%
Interactive Course: 47 +/- 5% (p<0.04)


Anatomy & Physiology I at Univ of Akron (undergrad)
(Laipply RS: Interactive electronic response systems boost understanding and enjoyment in anatomy & physiology. HAPS-Educator, Summer 110-11; 2006.)

A pilot program to increase student learning & confidence was implemented that employed in-class conceptual questions delivered using an Audience Response System, with peer instruction. Student surveys indicated that the new method of teaching:
• improved student performance in class
• increased their willingness to ask questions
• made class more enjoyable.


Microbiology (Undergraduate) at Colorado State University, Fort Collins
(Suchman E, Uchiyama K, Smith R, Bender K: Evaluating the impact of a classroom response system in a microbiology course. Microbiology Education 7:3-11, 2006.)

This study was performed in a large lecture microbiology course given to college juniors. Two renditions of the course (Sections A & B) were taught where a classroom response system (CRS) was used in different ways.

In Section A, class began with a multiple-choice review question posed using the CRS (on material covered in a previous lecture). Students who answered correctly were given extra credit. In addition, the CRS was used to ask questions throughout the lecture, but not for credit. For these in-lecture CRS questions, students were asked to vote once and then given the opportunity to discuss the question with fellow classmates before voting again (peer instruction).

In Section B, only the first question was asked at the beginning of class using the CRS, worth extra credit if answered correctly.

In both sections, the percentage of students answering the first question correctly was comparable (78.2% vs 82%).

In Section A, the percent of students answering correctly improved dramatically when they were allowed to discuss a question and then re-answer it (after peer instruction), with an average gain of 22%.

On formal exams, students in both sections were asked questions on material covered in lecture (non-CRSQs) or, in Section A only, questions previously posed by CRS (CRSQs). Students in Section A performed better on BOTH types of questions compared to students in Section B (p<0.0001).

Physics at Harvard University
(Rosenberg JL, Lorenzo M, Mazur E: Peer Instruction: Making Science Engaging. In: Handbook of College Science Teaching. Chp 8. Edited by JJ Mintzes & WH Leonard. NSTA Press Book)

Documents that the learning gain in traditional lecture-based courses (23%) is less than half of that obtained in courses involving active engagement / peer instruction (48%).


62 Physics Courses (high school thru college)
(Hake RR: Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. Am J Phys 66:64-74, 1998.)

Comparison of student performance on physics course pre-tests vs. course post-tests was used to determine a knowledge gain for each course. Data from 14 “traditional” courses (N=2084 students) and 48 courses using “interactive-engagement” (N=4458 students) was collected. The average gain following the interactive engagement courses was almost 2 Standard Deviations above that obtained from traditional courses:

Teaching Method Learning Gain (post vs. pre test)
14 Traditional Courses 0.23 +/- 0.04 (SD)
48 Interactive Courses 0.48 +/- 0.14 (SD)
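Hake's “gain” is the average normalized gain ⟨g⟩: the fraction of the possible pre-to-post improvement that a class actually achieved. A minimal sketch in Python (the class averages in the example are made up for illustration):

```python
def normalized_gain(pre_pct, post_pct):
    """Hake's normalized gain <g> = (post - pre) / (100 - pre),
    i.e., the fraction of the available room for improvement
    that the class actually gained."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Hypothetical class averages, for illustration only:
# 40% on the pre-test, 71% on the post-test
g = normalized_gain(40.0, 71.0)  # (71 - 40) / (100 - 40) ~ 0.52
print(round(g, 2))  # 0.52
```

On this scale, Hake labeled gains below 0.3 “low-g” (typical of the traditional courses) and 0.3-0.7 “medium-g” (typical of interactive-engagement courses), which matches the 0.23 vs. 0.48 averages above.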


Physiology Course
(Paschal CB: Formative assessment in physiology teaching using a wireless classroom communication system. Adv Physiol Educ 26:299-308; 2002.)

A wireless audience response system was used to provide instant feedback on in-class quizzes. It stimulated 100% student participation. However, student GPA & course performance data were not significantly affected compared to a previous year.

(Perhaps the absence of peer instruction is the reason? After all, clickers are just another piece of technology. It’s how you use them that counts.)


Medical Pathology Course
(Duggan PM, Palmer E, Devitt P: Electronic voting to encourage interactive lectures: a randomized trial. BMC Medical Education 7:25, 2007)

A group of 127 5th year medical students at the Univ of Adelaide (Australia) were randomized into 2 groups of 63 & 64 students. One group was given traditional lectures, and the other was taught using sessions that incorporated the use of an electronic voting system. The performance on multiple choice questions was not significantly different.

(Note: No peer instruction type interaction between students was used in this study).

Quoted From: Handheld Transmitters Connect Students And Teachers in Class. Science Daily (Feb 8, 2005)

“Teaching a large class without a classroom response system is like driving with your eyes closed,” said William McNairy, lecturer in physics at Duke… “We don’t drive cars that way, but it is how we teach, with eyes closed until a midterm and a final.”

“The feedback from these systems enables students to learn better rather than just finding out at the end that they failed.” – Sherryl Broverman (Asst Prof Biology, Duke)

We’ve been piloting the use of Tegrity Classroom 2.0 for the past 9 months at Tulane Medical School. I composed a ~7 min “how to” video on how it can be used to record and play back lectures, for those interested. Be forewarned that the volume level of the audio increases substantially during playback of the lecture. Be prepared to adjust the volume!

I created a “How to” video on creating video podcasts for the April 2008 meeting of the Southern Group on Educational Affairs (SGEA). I thought I would make it available to anyone else who has an interest in the topic. The 22 min video describes how to create a video podcast on either PC or Mac platforms. It also contains, at the end, a brief commentary on my negative outcome with the use of a course blog (there was a general lack of student interest).

Eric Mazur is a Harvard Physics professor and one of the pioneers in the use of Audience Response Systems. This article, published in the Harvard Journal in the summer of 1995, explains how & why he started using this form of active learning & how it changed the culture of the large lecture classes that he teaches. Harvard Journal 1995

I tested various modes of the Olympus Digital Voice Recorder (Model WS-300M; ~$80 from B&H online) and decided that the following settings gave the best sound when using a noise-canceling lapel microphone (Olympus Model ME52W; ~$20). This model of microphone has a 1 m long cord that plugs directly into the top of the digital recorder & does not need another adapter.

HQ mode has a sample rate of 44 kHz and sounds slightly better than Standard Mode (22 kHz), which has a little distortion to it. LP mode has significant distortion.

Mic Sensitivity: Dict(ation) instead of Conf(erence). This cuts out a lot of background noise and dampens the “high pitch” overtones that I did not like. I am not 100% sure, but there may be more variation in audio volume when using this mode. I tried moving the mic up from mid-chest to about 6 in from my mouth (near the first major button below the collar button), and this improved the sound a bit.

File Sizes: The Olympus recorder saves files in the WMA (Windows Media Audio) format, which can be listened to using either the Windows Media or QuickTime players. A 53 min lecture recorded at a 44 kHz sampling rate (HQ mode) created a file size of 12.3 MB.

File Conversion: If you want to upload or play the file on iTunes (e.g. you wish to store the files on an iTunes University website), you will need to convert the file to either MP3 or AAC format. I used a software program called “Switch” (Vers. 1.05, 2005 Copyright NCH Swift Sound; shareware) on a Macintosh workstation to convert files to MP3 (the conversion takes 4-5 min per hour-long lecture). When converting the files from WMA to MP3, I tried using lower bit rates to reduce the file size (the smaller the file, the faster it will download). Doing this sacrifices some degree of sound quality. The following bit rates resulted in the following file sizes and sound quality:

Original WMA (44 kHz) file: 12.3 MB (53 min lecture)
Converted to MP3 at 32 kb/sec: 12.1 MB file (almost identical sound)
Converted at 24 kb/sec: 9 MB MP3 file (slightly less sound quality)
Converted at 16 kb/sec: 6 MB MP3 file (clearly detectable distortion)
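As a sanity check on the sizes above, a constant-bit-rate MP3 occupies roughly bitrate × duration, converted from bits to bytes. A quick sketch in Python (the helper name is mine; real files come out slightly smaller or larger because of how the encoder packs frames):

```python
def mp3_size_mb(bitrate_kbps, minutes):
    """Estimate constant-bit-rate audio file size:
    (kilobits/sec x seconds) converted to megabytes (10^6 bytes)."""
    bits = bitrate_kbps * 1000 * minutes * 60
    return bits / 8 / 1e6

# Predicted sizes for a 53-min lecture at the bit rates above:
for kbps in (32, 24, 16):
    print(kbps, "kb/sec ->", round(mp3_size_mb(kbps, 53), 1), "MB")
```

The estimates (12.7, 9.5, and 6.4 MB) land close to the measured file sizes listed above.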

The amount of distortion you wish to tolerate is highly subjective.