Welcome! This week, we’ll be chatting about a couple of ways to improve your assessment strategies. Our articles ask the following questions:

  1. Should learners have access to content during assessments, study sessions, and practice tests?
  2. Is restudying an effective technique for conceptual learning?

At the end, you’ll get a peek at a sweet furry friend! There’s a test on the articles at the end though, so be sure to read carefully 😉

Retrieval & Generative Learning Tasks

The first article this week looks at how we can improve learning with generative tasks. We know that generative tasks help improve comprehension, largely by linking new information to pre-existing knowledge. Generative tasks might include activities such as creating concept maps or timelines, which build connections between pieces of information and deepen comprehension of the learned material. Retrieval practice, meanwhile, is a well-researched strategy known to decrease forgetting (Adesope, Trevisan, & Sundararajan, 2017). In this study, retrieval is induced indirectly through a closed-book review style: the idea is that completing a generative task closed-book forces the learner to engage in retrieval while working. Thus, the researchers were interested in whether retrieval during a generative task improved learning outcomes (Waldeyer, Heitmann, Moning, & Roelle, 2020).

In this study, learners were randomly assigned to either a closed-book or open-book group. All participants were provided with a text to read on the topic of social influence. For the generative activity, four prompts were provided:

  1. What is the most important content (e.g., concepts or thoughts)?
  2. Try to highlight the most important content and connections.
  3. Try to illustrate the most important content by giving your own examples.
  4. Which content did you find interesting, useful, or convincing? Explain why.

After the generative task, participants were given a posttest one week later to assess learning outcomes. The results indicated that those in the open-book group covered more content in their responses, but those in the closed-book group elaborated more. As for learning outcomes, open-book learners scored higher on the posttest. Weighing the pros and cons of open- and closed-book styles, the researchers conducted a second study.

In a follow-up study, a third group called “closed/open switch” was added. The procedure was otherwise the same and again included closed- and open-book groups. The closed/open switch group was similar to the closed-book group, but learners could use a “show text” button to bring up the text. The findings indicated that the closed/open switch group outperformed the “pure” closed-book and open-book groups. The authors suggest this was likely due to enhanced retrieval relative to the open-book group, as well as the ability to create deeper connections relative to the closed-book group.

These findings are important when considering high-stakes testing, which is often used for training, certification, promotions, etc. If assessments are used in training, customer education, or employee education, it’s crucial to consider what types of studying or practice would be beneficial. In line with these findings, utilizing a generative task with a closed/open switch strategy can help to improve memory, elaboration, and overall learning!

Key Takeaway: For high-stakes assessments, integrating a generative task with a closed/open style can help to improve learning outcomes. When considering assisting learners, consider adding generative prompts (as above) with clickable access to the text.

Read More ($): Waldeyer, J., Heitmann, S., Moning, J., & Roelle, J. (2020). Can Generative Learning Tasks be Optimized by Incorporation of Retrieval Practice? Journal of Applied Research in Memory and Cognition, 9, 255-369.

“Not only is restudying an ineffective study strategy for learning, but it also gives students a false sense of mastery of the material.”
- Cho & Powers, 2019

Re-Study or Practice Tests?

The other article this week continues with the trend of testing, but with an emphasis on conceptual learning. A recent study published in the Journal of Applied Research in Memory and Cognition delves deeper into the testing effect and how it impacts learning transfer. When considering test-taking, people often assume that testing lends itself to rote learning rather than conceptual understanding. However, more recent research suggests that tests can improve conceptual learning and transfer (Cho & Powers, 2019). The current study expands on past work by evaluating categorical materials, specifically Chinese-English word translations. Further, previous studies told participants in advance that they would take a conceptual test, while the current study (in Experiment 3 only) did not. So, does testing enhance memory and transfer for Chinese-English translations?

To answer this question, participants completed two sessions. The first session was predominantly a study session, consisting of an initial study phase, two review phases, an encoding questionnaire, and a confidence questionnaire. During the review phases, participants either restudied or completed test practice. The restudy group repeated the study phase, while the test group practiced the translations in a test format (see example below).

(Cho & Powers, 2019)

The second session, the testing session, consisted of a multiple-choice test and a confidence questionnaire. Overall, final test performance was higher for the test groups than for the restudy groups. Interestingly, the restudy groups actually rated their confidence levels higher than the test groups! The figures really drive this difference home - see below for the comparison from the paper. Figure 2 illustrates that final test performance was higher for the test groups (black line), while Figure 3 illustrates that the restudy group rated their confidence as higher (gray line).

(Cho & Powers, 2019)

When looking at the encoding strategies students reported, those in the restudy group were significantly more likely to use rote rehearsal, while those in the test group were significantly more likely to use inter-item association. As mentioned in the first article, creating connections between pieces of information leads to deeper processing and better comprehension. Thus, the encoding strategy employed by the test groups would be more beneficial for conceptual learning. These results suggest that testing is an effective strategy for improving conceptual learning and transfer.

Key Takeaway: While learners may be drawn toward restudying, promoting testing as a strategy for meaningful learning is much more effective. Prior to assessments, providing practice tests will do more good than re-routing learners back to the readings.

For further information on transfer, feel free to check out an earlier newsletter - LSW Issue #37!

Read More ($): Cho, K., & Powers, A. (2019). Testing Enhances Both Memorization and Conceptual Learning of Categorical Materials. Journal of Applied Research in Memory and Cognition, 8, 166-177.

Pets of Learning Science Weekly

Still looking for the test? Congrats, you've passed! Here's an adorable dog as a reward!

Our friend Angela L. sent us some excellent information on a beautiful boy, Winston!

“Here is Winston - a needy, snorey, playful Boxer. Similar to many of us who have or developed anxiety during quarantine and the general state of things, he came to us from Irving Animal Shelter with some separation anxiety back in 2018. He has a great wigglebutt too when he sees someone or somedog he wants to play with.”

We hear ya, Winston, and we get it! Thanks for the quarantine LSW hangs :)

Send us your pet pics at

Wondering why we’re including animal photos in a learning science newsletter? It may seem weird, we admit. But we’re banking on the baby schema effect and the “power of Kawaii.” So, send us your cute pet pics -- you’re helping us all learn better!

The LSW Crew

Learning Science Weekly is written by Kaitlyn Erhardt, Ph.D. and edited by Julia Huprich, Ph.D.

Have something to share? Want to see something in next week's issue? Send your suggestions: