Ready Player One

Thanks to reader Lisa B. for her suggestion to cover gamification and online learning! We found a great (and open!) article that discusses motivation, gamification user types, and associated gamification mechanics. In their study, the authors surveyed students to determine which user type they were (adapted from Marczewski’s Hexad Scale: Socializer, Free Spirit, Achiever, Philanthropist, or Player -- note that “Gunter” wasn’t an option) and which online learning tools (like forums, quizzes, wikis, and peer assessments) motivated each one. Table 5 on p. 11 provides a great overview of the online learning activities associated with the specific gamification mechanics that motivate different user types. We won’t recreate it here, but it’s worth checking out and/or bookmarking. This article does a good job of providing foundational information about the connection between gamification mechanics and learner motivation.

Key Finding: The heterogeneity of learners means that one size does not fit all when it comes to gamification. By understanding what motivates learners, instructional designers can create learning experiences and environments that leverage gamification in a meaningful way for all learners.

Read More (open access): Bovermann, K. & Bastiaens, T.J. (2020). Towards a motivational design? Connecting gamification user types and online learning activities. Research and Practice in Technology Enhanced Learning, 15(1).


Quality Control

How do you judge the instructional design quality of your online courses? Do you use a standardized evaluation instrument like the 5 Star Rubric or the Quality Matters framework? In a recent study, researchers used Merrill’s first principles of instruction to develop a Concise Course Scan (CCS) rubric to evaluate the instructional quality of 101 free, business-related Massive Open Online Courses (MOOCs). Three trainers used the CCS rubric, which included items related to activation, demonstration, application, and integration (see article p. 90/PDF p. 6), to evaluate the courses. Their findings? While MOOCs authored by academic institutions showed slightly higher instructional quality than those from nonacademic institutions, the overall instructional quality of the analyzed MOOCs was low. One limitation we’ll note is that the researchers didn’t take learning outcomes into consideration.

Practical Tip: If you’re creating an online course, leverage one of the existing tools to evaluate your class and ensure high instructional quality. Or, follow these researchers’ lead and create your own rubric. Whichever you choose, evaluating your courses before launching them is a good rule of thumb.

Read More (open access): Egloffstein, M., Koegler, K., & Ifenthaler, D. (2019). Instructional quality of business MOOCs: Indicators and initial findings. Online Learning, 23(4), 85-105.


Holy Learning Strategies, Batman

Thanks to reader Detlef H. for the suggestion of investigating holistic learning strategies. He asks, “We talk about ecosystems, build edtech systems and platforms to engage users, drive traffic, and measure NPS, but what does an effective digital learning strategy really entail?” Good question! I was reading Evidence-Informed Learning Design: Creating Training to Improve Performance from learning science royalty Mirjam Neelen and Paul A. Kirschner and came across this topic on p. 36.

Neelen and Kirschner recommend the following to create a holistic learning experience:

  1. Start with the performance problem
  2. Determine if learning can help
  3. Clarify what success looks like
  4. Determine learners’ needs

And, they note, all four of these steps should focus on learning tasks that are authentic, which Gulikers, Bastiaens, & Kirschner (2004) define as tasks that:

  • Consider the physical and social context;
  • Include an authentic assessment; and
  • Consider criteria and standards that incorporate job requirements.

According to Neelen & Kirschner, a good learning strategy is focused on whole and authentic tasks that ensure learning transfer and create a learning experience that is effective, efficient, and enjoyable (p. 38). We’ll talk more about this topic in future issues. Is there something that you’d like for us to cover? Let us know.

Read More (book, borrow from your local library or support your local bookstore): Neelen, M. & Kirschner, P.A. (2020). Evidence-Informed Learning Design: Creating Training to Improve Performance. London: Kogan Page.


Eyes Wide Shut

Reader Judith R. had a question about video lectures, and we dug a bit deeper to find a 2012 article that analyzed instructional YouTube videos and distilled a list of practical recommendations from that analysis. Here’s one gem: Seduce the viewer. (No, not like that.) “Make a big promise of something to be learned or accomplished, proceed incrementally by making and fulfilling small promises along the way, and then make good on the larger promise” (p. 204). Why? The researcher proposed that, by doing this, you’re helping the learner understand that they are about to watch something worth their time.

Key Takeaways:

  • Good instructional video begins with an introduction that frames the lesson to be learned.
  • Good instructional video spends more time demonstrating steps (doing and explaining) than either doing or explaining alone.
  • Good instructional video delivers content whose message is easy to locate and access, easy to understand and utilize, and is engaging and reassuring.

Read More (open access): Swarts, J. (2012). New modes of help: Best practices for instructional video. Technical Communication, 59(3).


Eyes Wide Shut, Part II

When it comes to instructional media, I always look to see what Richard E. Mayer has to say. In an article co-authored with Logan Fiorella, Mayer provided an analysis of best practices from a series of articles appearing in a 2018 issue of Computers in Human Behavior.

Key Takeaway: Their summary included the following effective methods:

  • Segmenting: breaking the video into parts and allowing students to control the pace of the presentation
  • Mixed perspective: videos that show the material from different viewpoints, including the learner’s own perspective, may increase engagement and prompt deeper processing

What doesn’t work in instructional videos? According to this summary:

  • Seeing the instructor’s face in conjunction with the content
  • Inserting pauses into the video
  • Matching the gender of the video instructor with the gender of the learner

Read More (paywall): Fiorella, L. & Mayer, R.E. (2018). What works and doesn’t work with instructional video. Computers in Human Behavior, 89, 465-470.

We’ll have more from Mayer in next week’s issue.


Stefan Rothschuh is a Ph.D. student in the Learning Sciences at the University of Calgary, where the mascot is the fierce and beloved Rex O’Saurus. Stefan’s research investigates technology-enhanced embodied learning designs for pre-calculus mathematics concepts and is supported by the Social Sciences and Humanities Research Council of Canada. He’s a former high school mathematics teacher who enjoys working with educators in design-based research settings.


Pets of Learning Science Weekly

Cassi (short for Cassiopeia) is a gorgeous German Shepherd-Staffordshire Terrier mix and companion of reader Janice S.O. What a face!

Send us your pet pics at editor@learningscienceweekly.com.

Wondering why we’re including animal photos in a learning science newsletter? It may seem weird, we admit. But we’re banking on the baby schema effect and the “power of Kawaii.” So, send us your cute pet pics -- you’re helping us all learn better!


The LSW Crew

Learning Science Weekly is edited by Julia Huprich, Ph.D. Our head of growth and community is Julieta Cygiel.

Have something to share? Want to see something in next week's issue? Send your suggestions: editor@learningscienceweekly.com