
Welcome to our last February issue of 2022! In honor of the (hopefully) fleeting cold weather here, we’re focusing on proximity (i.e., goodbye to the tight cold weather snuggles from my pups). Specifically, the research we’re covering this week looks at proximity in visual processing. The questions that we’re answering are:

  • Does the layout of slide/text multimedia impact learner retention?
  • When using a multi-touch device, can hand proximity impact learning?

Slide that text over ↔️

When presenting through slides, we often include pictures to illustrate concepts, boost learner engagement, and serve as a reference. However, including more materials means that our brains need to process more information (i.e., cognitive load; Sweller, 2011). Thus, cue the split-attention effect. The split-attention effect is also covered extensively in Mayer’s (2012) Handbook of Multimedia Learning, if you’re interested! Essentially, the split-attention effect tells us that learning from text/picture combinations is “more successful when the text is displayed near the corresponding area of the image, rather than when the text and pictures are placed at geographically separate positions” (Mutlu-Bayraktar, Ozel, Altindis, & Yilmaz, 2022). This article looked at learning outcomes and retention, and extended past research by evaluating eye-movement and brain wave patterns to better understand this phenomenon (Mutlu-Bayraktar et al., 2022).

Multimedia learners were split into two groups: focused-attention format (FAM) and split-attention format (SAM). Both groups completed a pre-test, a multimedia learning environment, and a retention test. The learning environment consisted of nine self-paced slides/scenes that included animation, narration, and/or text (Mutlu-Bayraktar et al., 2022). The FAM group received materials where “videos and text were integrated in such a way that the text was presented as audio narration simultaneously with the related scene. Furthermore, this approach focused attention by signaling on the visual presentation.” Meanwhile, in the materials for the SAM group, “the names of the concepts were not integrated into the image but added as a text below. Moreover, the concept described was presented using two separate visuals as an image and a video on the same scene” (Mutlu-Bayraktar et al., 2022). See the image below for the difference in one of the scenes.

(Mutlu-Bayraktar et al., 2022)

Results illustrated distinct differences between the FAM and SAM groups regarding the eye-tracking heat maps, EEG signals, and retention. For the sake of simplicity, we’re sticking with the heat maps and retention measurements (although the EEG results were interesting if you like neuroscience!). Regarding scanning of the scenes, the SAM group spent more time browsing, with their attention split between reading text, viewing pictures, and watching the video. Overall, they focused more on the images, while the FAM group centered on the text integrated into the images. The eye-tracking results show that the SAM group’s attention was divided (Mutlu-Bayraktar et al., 2022). So, how does this actually impact retention? The FAM group had significantly higher retention performance than the SAM group (Mutlu-Bayraktar et al., 2022). Ultimately, the research supports keeping text close to images, keeping text to a minimum when possible, and narrating rather than relying on heavy printed text.

Key Takeaway: If pictures and text are used together in presentations, the text should be in close proximity to the image to avoid the split-attention effect. Further, narration can be used in place of extra text to keep learners focused. Diminishing the split-attention effect leads to increased retention performance for learners.

Read More ($): Mutlu-Bayraktar, D., Ozel, P., Altindis, F., & Yilmaz, B. (2022). Split-attention effects in multimedia learning environments: eye-tracking and EEG analysis. Multimedia Tools and Applications.

“To prevent the emergence of a split-attention effect, the text should be integrated into the picture in designs.” - Mutlu-Bayraktar et al. (2022)

Is this too far?

Interactive multi-touch displays encompass a wide range of devices, from smartphones to large displays at museums. One of my personal favorite large multi-touch displays is in the Space Needle! They can also be found throughout online courses (e.g., tablets), in classrooms (e.g., “SMART boards”), in customer education, and more. These devices allow us to manipulate objects without a keyboard or mouse. But while we use touch devices daily, are they actually more beneficial for learning than non-touch alternatives? Does the proximity of your hands to the information matter?

An article published in Computers in Human Behavior investigated the effects of hand proximity on learning, specifically regarding multi-touch devices (Brucker, Brömme, Ehrmann, Edelmann, & Gerjets, 2021). The idea the authors present is that having one’s hands close to the materials may “free up” cognitive resources for learning (Brucker et al., 2021). For the study, learners were split into two groups based on hand proximity: direct (hands close) and indirect (hands far). Working with art history content, learners in the direct group resized, turned, and moved paintings into a gray box on the screen by touching the painting itself; learners in the indirect group went through the same process but touched a placeholder object rather than the painting (Brucker et al., 2021). See the image below for the hand proximity setup used in the task (Brucker et al., 2021).

(Brucker et al., 2021)

Learning outcomes were measured through information provided during the task, such as recognizing missing objects (visuospatial) and remembering the artist’s name (verbal). Hand proximity significantly affected learning outcomes, such that “visuospatial information was remembered better when it was processed near the hands” (Brucker et al., 2021). Note that this applies only to visuospatial information; verbal information was not impacted.

The researchers conducted a follow-up study to evaluate the potential impact of touch duration. In this experiment, the screen was split left/right rather than top/bottom to ensure that distance from the body as a whole did not impact the results. The results from the follow-up experiment supported the previous findings: hand proximity positively impacted learning outcomes. However, this only occurred when paired with a longer touch duration. The researchers suggested that a longer touch enabled “elaborated processing of the displayed information” (Brucker et al., 2021). Future research is needed to assess any impact on verbal information.

Key Takeaway: When trying to teach visuospatial information, hand proximity to the materials impacts learning outcomes. If possible, give learners the ability to manipulate the materials on a multi-touch device. In turn, this experience can improve retention.

Read More ($): Brucker, B., Brömme, R., Ehrmann, A., Edelmann, J., & Gerjets, P. (2021). Touching digital objects directly on multi-touch devices fosters learning about visual contents. Computers in Human Behavior, 119.

Pets of Learning Science Weekly

This week, reader Tami C. shared Chico with us! “Chico is a 26-year-old Peruvian Paso horse. He loves long walks in the park, cheeseburgers, and kids. He thinks he's a dog and loves to give kisses like one.” I’m not so sure that a presentation would do much for this sweet dude… but he sure seems like a brilliant learner!

Send us your pet pics at editor@learningscienceweekly.com.

We are once again short on animal pictures, meaning the inbox is So Sad! Please send us your beautiful babies!

Wondering why we’re including animal photos in a learning science newsletter? It may seem weird, we admit. But we’re banking on the baby schema effect and the “power of Kawaii.” So, send us your cute pet pics -- you’re helping us all learn better!

The LSW Crew

Learning Science Weekly is written and edited by Katie Erhardt, Ph.D.

Have something to share? Want to see something in next week's issue? Send your suggestions: editor@learningscienceweekly.com