Welcome! This week, we’re chatting about elaboration and assessments. Specifically, we’re hoping to answer:
- Does “Make & Take Quizzing” boost learner studying and assessment scores?
- Can generative tasks improve performance on assessments?
Make a Question, Take a Quiz
Here’s the thing: I’m a bit of a sucker for “proof of concept” studies - I think they’re such an important stepping stone for research. Recently, I filtered through my *many* open Chrome tabs and landed on a study looking at a new quizzing strategy. Now, we know that learners are prone to “cramming” for assessments, and, as we’ve talked about before here, setting deadlines doesn’t seem to help. So, how can we help? We know that spaced (or distributed) practice, retrieval (or practice testing), and elaboration are all important pieces of enhancing long-term retention and comprehension. Gallagher (2022) sought to build a way for learners to engage in evidence-based study habits that draw on distributed practice, retrieval, and elaboration.
In this study, a “novel in-class quizzing strategy” was assessed - Make & Take (M&T) Quizzes (Gallagher, 2022). To evaluate its effectiveness, half of the learners were in a course that used the Make & Take Quiz strategy, while the other half served as a control group. In place of Make & Take time, those in the control group “participated in a brief (5–10 min), informal, instructor-led review” (Gallagher, 2022).
What actually is the Make & Take Quiz method? In the first week of the course, learners received 30 index cards - half magenta and half teal. The magenta cards served as the “make” mechanism, while the teal cards served as the “take.” Once a week, learners reviewed their course notes and developed 2 potential exam questions (1 multiple choice & 1 short answer); they were also required to provide the correct answers. The potential exam questions were written on a magenta card and turned in to the instructor - this served as the “make” portion. Learners received points if the magenta card was complete and accurate. The “make” activity encourages learners to use elaboration techniques and pulls from self-generation concepts! It is also distributed practice, since it was completed once per week (Gallagher, 2022). Now, to the taking! In the very next course session, the instructor displayed 5 learner-generated questions; generally, the questions were ones that “embodied a main idea or topic from the lesson, were applied in nature, and/or presented a novel example/illustration” (Gallagher, 2022). Learners had 5 minutes to respond to the prompts on their teal cards and turn them in. The cards were returned the next session, along with points for accurate responses. The “take” portion of this method leans heavily on retrieval practice. Make & Take Quizzing was done 12 times during the course (Gallagher, 2022).
To compare the M&T group with the control group, learners completed a study strategies questionnaire at the beginning and end of the course; performance on an end-of-course assessment was also compared. Results illustrated that learners in the M&T Quiz courses scored significantly higher on the assessment than those in the control group, with the M&T average at 85.4% and the control group average at 77.5% (Gallagher, 2022). Although study strategies did not change overall, 2 subscales showed significant improvement for learners in the M&T group: 1. “Less trouble deciding on the main ideas when studying” & 2. “More likely to put ideas into their own words when studying.” This change indicates that the M&T method aided learners in their ability to pull out a main idea and elaborate on it (Gallagher, 2022).
Some notes on this study, of course! First, I think follow-up work should be done to evaluate whether the elaboration and study technique improvements persist beyond this one course, and whether transfer occurs. Additionally, the M&T Quiz method requires some oversight from an instructor, so in its current state it isn’t applicable to asynchronous learning. However, I do think a lot of the underlying principles can be applied to a virtual setting! Honestly, if we want to take it a step further (and really - why not?), we can apply the ideas here to asynchronous learning as well - for instance, by having learners submit “potential exam questions” as a required part of a course, without moderating them. Of course, there’s no telling how it would pan out, and I don’t want to give the impression that it would have the same results, but I think it’s a new avenue worth exploring.
Key Takeaway(s): The “Make & Take Quizzes” pull on the ideas of distributed practice, elaboration, and retrieval - 3 well-studied aspects of memory and learning. This method yielded higher exam scores for learners and improvements in elaboration techniques.
Read More ($): Gallagher, K. M. (2022). Using “Make & Take Quizzes” to Improve Exam Performance and Engage Students in Effective Study Strategies. Teaching of Psychology, 49(2), 124-129.
“Instead of describing or demonstrating effective study strategies, what if we could actively engage students in spaced practice, retrieval practice, and elaboration in the classroom? Moreover, what if we could harness the additive power of these three evidence-based study strategies?” - Gallagher (2022)
Events (from our Inbox)
Reader Samara provided the following event:
"The DS4E coalition has launched the 2022 Commitments Campaign to help prepare all students for a data science future! The campaign spotlights major actions in building a strong talent pipeline and expanding data science education. Join leaders and commitment-makers from the State of Virginia, the Gates Foundation, San Diego USD, NetApp, and others by participating in our community-building event on June 22nd at 1 pm ET. Register and submit a commitment here."
Pets of Learning Science Weekly
Reader Clea M. provided us with this incredibly relatable photo of her pup, Cheyenne, as she “sits atop 'laundry mountain,' her shiny black fur gleaming in the morning sunlight.”
Thanks for sharing this beauty, Clea!
Send us your pet pics at firstname.lastname@example.org.
Wondering why we’re including animal photos in a learning science newsletter? It may seem weird, we admit. But we’re banking on the baby schema effect and the “power of Kawaii.” So, send us your cute pet pics -- you’re helping us all learn better!
The LSW Crew
Learning Science Weekly is written and edited by Katie Erhardt, Ph.D.
Have something to share? Want to see something in next week's issue? Send your suggestions: email@example.com