

Good day! Thanks for taking some time out of your busy schedule to engage with us about the science of learning. In this week’s newsletter, we’re tackling the following questions:

  1. How does an interactive peer review method compare to conventional peer review when it comes to learning?
  2. Does the order of materials, specifically virtual reality and verbal information, impact learning?

This will be our last week recording responses for our survey - we want to learn more about you so we can tailor our content. We'd appreciate your participation!

Responding to “Reviewer #3”

When I first saw this article, I immediately connected to the idea of responding to a review. Think back to a time when you got some form of your writing back with suggestions from an anonymous source. I imagine many of you thought: "if I could just respond, surely they would understand." A recent study in Computers & Education sought to assess just that - whether an interactive peer review method would be more beneficial than the conventional peer review method.

Many courses adopt a flipped learning approach, which entails engaging with instructional materials (reading, instructional videos, etc.) in the learner's own time and participating in activities during class time with the instructor (Bergmann & Sams, 2012). Flipped learning has previously been shown to engage students, improve comprehension, and promote active, in-depth learning (check out LSW issue #15 for more information on flipped classrooms). The current study evaluates the effectiveness of employing peer review within a flipped classroom (Lin, Hwang, Chang, & Hsu, 2021). Peer review not only allows learners to get feedback on their writing, but also enables them to see peers' work, which they can use to reflect on their own writing (Topping, 1998). Peer review has been shown to enhance learning performance, critical thinking, and metacognition (Zheng, Chen, Li, & Huang, 2016). However, in traditional peer review, assessees do not have a chance to respond to assessors. In contrast, an interactive peer review method enables assessees and assessors to communicate about the comments. Thus, the researchers of this study aimed to evaluate whether the interactive peer review model would be more effective than the conventional peer review model (Lin et al., 2021).

This study was conducted with nurse practitioners enrolled in a professional training course. Both the experimental and control groups watched an instructional video before class, attended an in-class session, completed a peer review session outside of class, and then took part in an in-class discussion. However, the experimental group used the interactive peer review model while the control group completed a conventional peer review.

Students in the interactive peer review group performed better on the Objective Structured Clinical Examination (OSCE) than the conventional peer review group. The OSCE is a standardized measure of health assessment skills. This result may indicate higher learning transfer, but more research is necessary to confirm it (Lin et al., 2021). In a questionnaire after the activities, the interactive peer review group also reported higher levels of critical and reflective thinking than the conventional peer review group. These results suggest that interactive peer review can promote "higher order thinking capabilities" and improve health assessment skills. The authors attribute the improvements to the logical thinking, evaluation, and perspective-taking involved in the interactive peer review process (Lin et al., 2021).

Key Takeaway: When possible, implementing an interactive peer review process within a flipped classroom can improve achievement in real-world settings more than conventional peer review. The social aspect of the interactive peer review process can also improve critical and reflective thinking.

Read More ($): Lin, H. C., Hwang, G. J., Chang, S. C., & Hsu, Y. D. (2021). Facilitating critical thinking in decision making-based professional training: An online interactive peer-review approach in a flipped learning context. Computers & Education, 173.

Does the order of information impact learning?

Virtual reality learning environments (VRLEs) have been growing in popularity, particularly because they have been shown to increase learner engagement (Allcoat, 2018). However, VRLEs show mixed results on learning outcomes. While some studies show VR as incredibly beneficial (see LSW Issue #28 on why this is best for training astronauts), others illustrate superior performance when learning from PowerPoint slides (Alfadil, 2020; Parong & Mayer, 2020). Largely, the debate regarding VRLEs and other forms of learning revolves around cognitive load. Since VRLEs are heavily visual, presenting information via auditory text may mitigate the cognitive load (Makransky et al., 2019). Various factors may impact how effective VRLEs are at improving learning, including the subject matter, VR training, and prompting (Parong & Mayer, 2018; Klingenberg et al., 2020). Thus, the current research aims to assess how the sequence of VR and text, as well as the presence or absence of cognitive prompting (a prompt that stimulates elaboration and learning strategies), impacts learning outcomes (Vogt et al., 2021).

The study varied the sequence of the materials (VR animation and auditory text) and the presence or absence of an elaboration prompt. Thus, 4 groups were created:

  1. Prompt with auditory text first
  2. Prompt with VR animation first
  3. No prompt and auditory text first
  4. No prompt and VR animation first

Learning materials were focused on information about robotics. Participants used the Oculus Go to position themselves in a kitchen environment while watching the REEM robot (see image below).

[Image: the REEM robot in the virtual kitchen environment (Vogt et al., 2021)]

The auditory text featured information on the REEM robot, including specifications and motion systems. The elaboration prompt asked participants to relate concepts to each other and to imagine explaining the robotic information. Learning outcomes were measured through a post-test that included 3 levels: knowledge, comprehension, and application (Vogt et al., 2021).

When looking at the presentation order of the materials, learners performed better at the knowledge level when the VR was presented first. However, the other 2 levels were not impacted. This means that groups with VR first "performed better in defining and describing basic concepts but had no advantage when answering comprehension or application questions" (Vogt et al., 2021). Turning to the prompt, results showed better performance at the application level when an elaboration prompt was presented. As above, this effect was specific to the application level, with no difference for knowledge or comprehension. The results of this study suggest that the best approach may vary depending on the specific learning goal, which is consistent with our previous summaries here at LSW.

Key Takeaway: When the aim is to learn basic concepts and definitions, presenting VR material before auditory text can improve knowledge. Further, presenting an elaboration prompt prior to the VR material may encourage the learner to process the content more deeply.

Read More (OA): Vogt, A., Babel, F., Hock, P., Baumann, M., & Seufert, T. (2021). Immersive virtual reality or auditory text first? Effects of adequate sequencing and prompting on learning outcome. British Journal of Educational Technology, 52(5), 2058-2076.

Interesting Events

There are two exciting events coming up from the Futures Forum on Learning. The first is a competition for innovative learning engineering tools. See below for details:

“The 2021 Learning Engineering Tools Competition is now open for submissions! The competition invites technologists, digital learning platforms, researchers, students and teachers from around the globe to propose innovative tools or technologies that address one of the pressing challenges in education while advancing the field of learning engineering. This year’s focus areas are accelerating learning in literacy and math, improving K-12 assessment, refining adult education, and developing new research tools.

The competition will award at least $3 million in prizes, making it one of the largest edtech competitions ever convened. The deadline for applications is October 1st, 2021.

You can learn more here. If you have any questions, reach out to toolscompetition@the-learning-agency.com.”

Additionally, they are hosting a weekly seminar series. Based on the questions that land in my inbox, I think many of the discussions may be of interest!

“Each week this September, the Futures Forum on Learning, hosted by Schmidt Futures, is facilitating a deep-dive discussion among leaders in education and technology on the immense learning gaps caused by COVID-19, and the opportunity to overcome them. The series includes:

  • Sept. 14 from 12-1pm ET: Innovative Assessment: Making High Quality Testing Affordable, Fast, & Effective

  • Sept. 21 from 12-1pm ET: Accelerating Learning: Helping Students Catch Up

  • Sept. 28 from 12-1pm ET: Testbed Technology: Building Platforms That Improve as More People Use Them

Each discussion will feature a panel of experts, surprise guests, and opportunities for networking among the attendees. All the discussions are free and open to the public. Find more details on the schedule and registration here.”

Thanks for sharing, Kent!

Pets of Learning Science Weekly

This week, I’m sharing the oldest of my fur babies, Ranger Squeaker! While he is a scholar in retirement, he was previously a park ranger. He enjoys long naps and “brushings for days” (his words, not mine). If you’re into Easter eggs, there’s another friend in this photo too!

Send us your pet pics at editor@learningscienceweekly.com.

Wondering why we’re including animal photos in a learning science newsletter? It may seem weird, we admit. But we’re banking on the baby schema effect and the “power of Kawaii.” So, send us your cute pet pics -- you’re helping us all learn better!

The LSW Crew

Learning Science Weekly is written by Kaitlyn Erhardt, Ph.D. and edited by Julia Huprich, Ph.D.

Have something to share? Want to see something in next week's issue? Send your suggestions: editor@learningscienceweekly.com