
Welcome, welcome! This week we’re chatting about assessments and testing. Specifically, our articles address the following:

  • Does feedback after practice tests improve metacognition and exam performance?
  • Can deadlines prevent cramming for tests?

Let’s jump into recent research on practice tests, metacognition, and cramming ⤵️

Practice makes progress 📈

Research has consistently shown that practice tests benefit later test performance. In fact, the testing effect has been replicated time and time again since Abbott’s study in 1909. As a side note, unless you are a huge research nerd like me, I don’t generally suggest reading work from the early 1900s - it’s a pain to get through, and there are plenty of modern articles summarizing it. If you want to *feel* the early 1900s, go for it! But I digress... Abbott’s (1909) testing effect essentially says that participating in practice tests, even without feedback, benefits learning through repeated retrieval. In addition to “working out” our retrieval muscles with new information, another reason practice tests may be useful is that they can prompt metacognition, or thinking about what we do and don’t know (check out LSW Issue #30 for more on metacognition). However, some aspects of what makes a “successful” practice test have yet to be fully explored. The researchers of this study aimed to evaluate the effects of feedback after practice tests on learner metacognition and final exam performance (Naujoks, Harder, & Händel, 2022). Does feedback help learners “monitor their performance,” or does the absence of feedback push students to “generate internal feedback”?

Learners were enrolled in a 14-week course on educational psychology. Approximately every three weeks, they were given the opportunity to participate in “an online practice test referring to the topic of the last weeks” (Naujoks et al., 2022). This procedure was followed with two separate cohorts of learners: the first cohort did not receive feedback on their practice tests, while the second cohort “received individual item-specific feedback” (Naujoks et al., 2022). Data regarding prior knowledge, number of practice tests taken, test performance, metacognitive judgments, and perceived usefulness were collected for both cohorts.

In both cohorts, the more practice tests learners participated in, “the better their performance on the posttest” (Naujoks et al., 2022). The benefits also extended to metacognition: in both cohorts, students who took practice tests showed higher metacognitive awareness and accuracy (Naujoks et al., 2022). In other words, participating in practice tests boosted their ability to estimate their own performance. Lastly, regarding learner perceptions: learners in the feedback cohort reported higher levels of engagement after the practice tests than those in the cohort without feedback, and they also rated the practice tests as more useful for performance monitoring. While learner perception may not always relate to outcomes, in this case feedback may be a helpful tool to encourage learners to attend practice sessions (Naujoks et al., 2022).

Before we round out this article, I want to address an aspect of this study that struck me as odd before I read the whole paper: the voluntary nature of the practice tests. Voluntary participation may have influenced the results, as learners who attend an “extra” session are likely to be high achievers, so the results should be considered in that context. However, one aspect of this study that I really liked is that four practice tests were offered, which allowed the authors to assess variation in participation (i.e., not all learners attended all four sessions). Overall, the results seem quite robust: practice tests, with or without feedback, are helpful in boosting overall exam performance and metacognition.

This finding is extremely relevant to any learning that happens prior to high-stakes tests, such as credentialing or licensing exams. Offering opportunities for learners to participate in practice tests has the potential to improve exam scores. Further, offering feedback may encourage learners to actually take part!

Key Takeaway: Practice tests can be an extremely valuable tool to improve assessment scores. While the authors found this to be true whether feedback was provided or not, learners perceived feedback to be helpful. Thus, offering feedback might encourage learners to engage with practice assessments.

Read More (Open): Naujoks, N., Harder, B., & Händel, M. (2022). Testing pays off twice: Potentials of practice tests and feedback regarding exam performance and judgment accuracy. Metacognition and Learning.

Deadlines & Cramming

Any time we talk about assessments or high-stakes testing, my mind wanders back to the idea of cramming. We’ve all been there - maybe you forgot, maybe you didn’t plan too well, maybe life got in the way. Whatever the reason, now you’re up at 4 am trying to shove as much information into your brain as you can. As a personal anecdote, my mother actually woke up at 4 am (one of those mysterious “early risers”), so my poorly planned teenage cram sessions were generally met with a disapproving look. Sorry, mom!

For instructors, finding a way to prevent cramming would be a great help. The authors of this study sought to understand how course modality and deadlines would impact cramming behaviors (Theobald, Bellhäuser, & Imhof, 2021). For this study, two cohorts of learners were evaluated in an educational psychology course. One cohort took a blended learning course (consisting of online and in-class instruction), while the other cohort took a purely online course.

The online course remained fully open and consisted of six modules with textbook readings, lectures, podcasts, and online self-tests. The blended course consisted of the same materials, but learners also had access to voluntary weekly in-class meetings held by the instructor. The blended course also used a flipped classroom approach (Theobald et al., 2021): learners were expected to engage with course materials and “work on a self-test” prior to attending the in-class sessions. Further, the online course had access to all materials from the start, while the blended course received materials successively (Theobald et al., 2021). Regarding learner measures, researchers assessed conscientiousness via the Big Five Inventory, as conscientiousness was expected to relate to distributed practice (Theobald et al., 2021). They also evaluated online time investment (overall time in the LMS), distributed practice (whether the LMS was accessed each week or access was “crammed” into a couple of weeks), self-testing, exam performance, and in-class attendance (Theobald et al., 2021).

Overall, learners in the blended course “used significantly fewer self-tests” than those in the online course, and this was, as expected, associated with lower exam performance (Theobald et al., 2021). Learners in both modalities also spent the most time studying in the last few weeks of the course. Ultimately, voluntary in-class meetings and deadlines did not prevent cramming behaviors (Theobald et al., 2021). One reason may be that the in-class sessions and self-tests were optional. Thus, the authors suggest that instructors incentivize self-tests in some way, e.g., with “bonus points” (Theobald et al., 2021).

However, the researchers did find that more conscientious learners employed more distributed practice, which predicted higher exam performance (Theobald et al., 2021). Learners who are lower in conscientiousness may need more support with learning strategies. Luckily, there are ways to increase conscientiousness - and metacognition is one!

Key Takeaway: While deadlines may seem helpful, they don’t seem to prevent cramming. Therefore, it’s helpful to keep those self-tests open for the whole course, not just a set timeframe. Learners low in conscientiousness may not engage with self-tests. If possible, incentivize self-assessments to build learner metacognitive skills, which can help improve assessment performance.

Read More ($): Theobald, M., Bellhäuser, H., & Imhof, M. (2021). Deadlines don’t prevent cramming: Course instruction and individual differences predict learning strategy use and exam performance. Learning and Individual Differences, 87.

Upcoming Event

Reader Hasmik D. sent the following blurb about an event that may be of interest!

"If you’re interested in research and experimentation, join an Ask Us Anything with Research Partnerships for Professional Learning (RPPL) on March 28 at 1pm ET. The conversation will focus on exciting new efforts to build a collaborative research consortium around teacher learning and professional development, to understand why effective programs work and whether their design principles can be replicated.

The event speakers are Sarah Johnson, CEO of Teaching Lab; Nate Schwartz, Professor of Practice at the Annenberg Institute at Brown University and a Fellow at The Policy Lab; and Emily Freitag, co-founder and CEO of Instruction Partners.

This online event is free and available to all members of the learning engineering community. Read more and register here today!"

Pets of Learning Science Weekly

Reader Sue G. brings us a true star this week - 8-month-old schnoodle puppy, Otto! Personally, I just feel like he “gets it,” you know?

Here we see that Otto is “obviously planning his next mischievous move! He loves chewing most things - lumps of wood from the firewood basket, paper from the recycling bin, fur-sister Poppy's ears...He's brought a new dimension to our lives - and we love it!”

Send us your pet pics at editor@learningscienceweekly.com.

Wondering why we’re including animal photos in a learning science newsletter? It may seem weird, we admit. But we’re banking on the baby schema effect and the “power of Kawaii.” So, send us your cute pet pics - you’re helping us all learn better!

The LSW Crew

Learning Science Weekly is written and edited by Katie Erhardt, Ph.D.

Have something to share? Want to see something in next week's issue? Send your suggestions: editor@learningscienceweekly.com