Online Learning – Michigan Virtual https://michiganvirtual.org

Have You Considered AI in Your Classroom? A Khanmigo Pilot Story https://michiganvirtual.org/blog/have-you-considered-ai-in-your-classroom-a-khanmigo-pilot-story/ Wed, 13 Aug 2025 20:17:10 +0000

In a two-phase pilot across Michigan schools, educators used Khanmigo, an AI-powered tutor and teaching assistant, to explore how AI might support teaching and learning. Their reflections surfaced both opportunities and challenges. The big takeaway? AI has potential, but only with intentional support.


A Pilot Rooted in Curiosity

What happens when AI becomes a teacher’s assistant, a student’s tutor, and a school’s data collector—all in one? We wanted to find out.

What would happen if teachers and students were given the opportunity to test an AI tool with structure, some support, and space to play? What might they discover?

Curious, Not Convinced: Why We Tried Khanmigo

Like many educators, we were hearing a lot about AI—its potential, its risks, and its growing presence in classrooms. But figuring out how to explore AI in a thoughtful, low-stakes way isn’t necessarily easy. 

We were curious about Khanmigo, an AI-powered tutor and teaching assistant developed by Khan Academy, because it offers both student- and teacher-facing tools. It has features for lesson planning and idea generation, and it provides real-time student support. 

What stood out was that Khanmigo isn’t focused on delivering quick answers. Instead, its prompts often encourage students to think through problems, explain their reasoning, and reflect as they arrive at the answer independently. Because it is integrated into Khan Academy’s content-rich platform covering subjects like math, history, coding, and computer science, it also offers some built-in structure for classroom use. 

A Glimpse at the Khanmigo Pilot

Over two pilot phases, Michigan Virtual’s Research and Development team partnered with several Michigan school districts to see what might happen if teachers and students explored Khanmigo together.

Pilot 1—Spring 2024:

  • 19 teachers, 687 students, 14 school districts
  • Focused on Algebra I (9th-grade)
  • Emphasis on Khanmigo as a math tutor and support tool

Pilot 2—Fall 2024-Spring 2025:

  • 24 teachers, 1102 students, 8 school districts
  • Open to all subjects and grade levels
  • Greater emphasis on teacher-facing tools, instructional use cases, and student behavior

We gathered pre- and post-survey data, facilitated professional learning sessions, and asked participants to reflect on their experience. Their feedback about Khanmigo and AI tools in general helped us understand how comfortable they were with using AI tools, how frequently they used them, and where AI tools had the biggest impact in their teaching and their classrooms. 

Real Feedback, Real Classrooms

So what did educators have to say? 

Teacher Tools? Surprisingly Helpful. 

Khanmigo proved especially valuable for brainstorming questions, writing lesson hooks, reviewing content, and suggesting student activities.

“I used it to get ideas for introducing new concepts and making learning more applicable to the real world. It felt like having a planning partner.”

While the survey participants differed, teachers who responded in Spring 2025 expressed more confidence in using AI for planning than those in Fall 2024. By the end of the second pilot, all responding teachers reported at least some confidence in lesson planning, with the number of teachers reporting they felt “very confident” increasing from pre- to post-survey across several categories, including:

  • Supporting diverse learners
  • Drafting emails & communication
  • Content creation & curation
  • Grading, assessment, & feedback
  • Lesson planning

Students? Curious…But Complicated. 

From the teacher’s perspective, student engagement with Khanmigo varied. Some students, particularly quieter students or those working independently, embraced it and used it productively. Others saw it as a workaround—a way to get answers quickly or to check their work, rather than a tool to deepen their understanding. 

“My students struggled to ask Khanmigo the correct question. They didn’t know what they didn’t know—where to start or what to ask.” 

Several teachers described this less as a problem with the tool itself and more as a reflection of student readiness and maturity. Still, many agreed that with clear modeling and structure, students were more likely to engage with it as intended. 

“We have to teach kids how to use it and how not to use it—and be clear about the intended purpose.”

What the Data Told Us

  • Comparing teacher responses from Fall 2024 to Spring 2025, reported student use of AI tools increased, moving from monthly to weekly use in most classrooms.
  • Teachers’ familiarity with AI tools grew, with a jump in the number who reported using AI several times a week, and one even reported daily use.
  • Professional learning priorities included personalizing learning, lesson planning, and supporting diverse learners.

Some Speed Bumps Along the Way 

Not everything worked perfectly. Technical challenges like typing equations or interpreting math prompts made things clunky at times. Some students were frustrated with how “chatty” Khanmigo’s AI chatbot tutor was. Others were quick to use it for shortcuts.

And some challenges weren’t technical—they had more to do with timing and student mindset. Teachers noted the importance of starting the year with an AI tool like Khanmigo, rather than trying to integrate it midway through the semester. They also emphasized setting clear expectations and guidance about when and why to use the tool, which they saw as key to ensuring students have the foundational knowledge to use Khanmigo and other AI tools appropriately.

“Students don’t want to use it how we would like them to. We have to model appropriate usage for them consistently.” 

“It worked better for older or more self-motivated students.”

“AI may need to be taught progressively, with different levels of understanding increased gradually from year to year in school.”

So, Where Do We Go From Here? 

Throughout the two pilots, one thing became clear: there is real potential, but only if we stay intentional.

Teachers told us they want help with things like:

  • Personalizing learning (top interest in both pilots)
  • Using AI for lesson planning and brainstorming
  • Supporting diverse learners with more targeted content

But they also flagged what’s missing:

  • Clearer policies and norms for student use
  • Tools that are more visual and interactive, especially in math
  • Better examples of what “good AI use” looks like

So, now we’re asking big questions:

  • Is AI a shortcut or a scaffold? 
  • How do we teach students to use it ethically and wisely? 
  • What kind of support do teachers need to make the most of it?

If you’re a district leader, teacher, or coach wondering how AI could support your work, know you’re not alone. These are big, complex questions. But trying something small, with proper support, can make a world of difference.

We’re not claiming Khanmigo, or AI in general, is the answer. But AI might be one exciting tool in a growing toolbox. And maybe, just maybe, it can help make learning a little more personal, creative, and supported—if we use it with care and intentionality. 

Your Turn: What Role Should AI Play? 

We’d love to hear how your school approaches AI—cautiously or creatively. We’re happy to share what we’ve learned, what we’d do differently, and what we’re still figuring out.

Because this conversation is just beginning.

Out of Order, Still Out of Reach: Variations in Pacing among World Language Students https://michiganvirtual.org/blog/variations-in-pacing-among-world-language-students/ Fri, 25 Jul 2025 12:31:03 +0000

Cuccolo & Green’s (2025) report highlighted the relationship between students’ assignment submission patterns and final course scores. Given that pacing has important implications for student performance, knowing what assignment submission patterns look like across schools with varying demographics could help prompt early identification and intervention. As such, this blog explores students’ assignment submission patterns based on school-level demographic information.


Pacing and progression in online learning

In virtual learning, students are often told they can learn and complete coursework “any time, any place, any pace.” However, previous research suggests that the timing of students’ assignment submissions (their pace), in fact, does matter (Kwon, 2018; Zweig, 2023). For example, students who submitted an assignment within the first week of a course had higher final course scores than students who missed this window (Zweig, 2023). 

In addition to the timing of assignment submissions, it is also important to consider the order in which students submit their assignments. Because the content in many courses is scaffolded, moving through a course sequentially should help students build foundational skills, receive timely feedback on their comprehension, and understand instructor expectations before moving on to increasingly complex topics. 

The impact of deviations from course pacing guides

Across two reports (linked below), researchers from the Michigan Virtual Learning Research Institute examined how the order of students’ assignment submissions was related to course performance by benchmarking student progress against Michigan Virtual’s course pacing guides, which are provided to help students stay on track in their courses. Both reports identified that as students become increasingly out of alignment with course pacing guides, final course scores tend to decline. 

Diving deeper into this relationship, researchers divided students into four equal groups based on how much they deviated from course pacing guides—students who deviated the least were in the first group, and students who deviated the most were in the fourth group. In the first report, which focused on Michigan Virtual’s STEM courses, researchers found a 9.5-point difference in final course score (out of 100) between students in group 1 (the least out of order) and group 4 (the most out of order). In the second report, which focused on Michigan Virtual’s World Language courses, this difference was 9.6 points. Effectively, this translates to about a letter grade difference. Across both reports, final course scores steadily decreased as students’ deviation from course pacing guides increased. 

It is important to identify student characteristics that may be related to virtual course outcomes, as this could help teachers more quickly recognize students who may need additional support. While pacing is associated with students’ final course scores, Michigan Virtual’s 2023-2024 Effectiveness Report highlights differences in virtual course pass rates by poverty level and race/ethnicity. For example, Freidhoff and colleagues (2025) note that the virtual pass rate for students in poverty was 58% while students not in poverty had a pass rate of 77%. Given that deviating from course pacing guides is associated with lower final course scores, understanding the extent to which groups with varying demographic characteristics complete assignments out of sequence could help inform proactive supports and interventions. As such, using data from the second report, this blog (part of a blog series exploring the impact of student assignment submission patterns) examines pacing guide deviation based on the demographic makeup of students’ home school buildings.

Methodology snapshot

Student-level assignment and performance data and building-level demographic data were analyzed for students enrolled in Michigan Virtual World Languages courses in Spring 2024. Two variables were created to measure how often and how far students submitted assignments out of alignment with course pacing guides. The “percentage of assignments completed out of order” variable reflects the proportion of a student’s submitted assignments that were out of the intended pacing guide order. The “average magnitude” variable reflects the average difference between the intended submission positions of consecutively submitted assignments, across all of a student’s submissions. For a complete description of the study methodology, please review the full report.
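To make these two variables concrete, here is a minimal Python sketch of one plausible operationalization. The function name, the one-step-forward definition of “in order,” and the example positions are illustrative assumptions; the study’s exact computation is described in the full report.

```python
def pacing_deviation(intended_positions):
    """Summarize pacing-guide deviation for one student.

    `intended_positions` lists the pacing-guide position of each
    submitted assignment, in the order the student submitted them
    (assumes at least two submissions).

    Returns (percentage of assignments out of order, average magnitude).
    This is an illustrative operationalization, not the study's code:
    a submission counts as "out of order" when its pacing-guide position
    is not exactly one step after the previous submission, and magnitude
    is how far each consecutive pair strays from that one-step expectation.
    """
    pairs = list(zip(intended_positions, intended_positions[1:]))
    out_of_order = sum(1 for prev, cur in pairs if cur != prev + 1)
    pct_out_of_order = 100 * out_of_order / len(intended_positions)
    avg_magnitude = sum(abs(cur - prev - 1) for prev, cur in pairs) / len(pairs)
    return pct_out_of_order, avg_magnitude

# A student who submits assignments at pacing-guide positions 1, 3, 2, 4
# deviates at every step, but each deviation is small:
pct, magnitude = pacing_deviation([1, 3, 2, 4])  # 75.0% out of order
```

A student who follows the pacing guide exactly scores zero on both measures; swapping even two adjacent assignments produces a small nonzero magnitude, which is why most students register at least some deviation.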

To get a better sense of how the poverty level of schools might be associated with pacing behaviors, schools were categorized based on the percentage of all learners at the school (not just virtual learners) who qualified for free or reduced-price lunch:

  • Low Poverty (≤25%)
  • Mid-Low Poverty (>25% to ≤50%)
  • Mid-High Poverty (>50%)¹
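These cut points amount to simple threshold bucketing; a hypothetical sketch (function and parameter names assumed, labels and thresholds taken from the list above):

```python
def poverty_category(pct_free_reduced_lunch):
    """Bucket a school by the percentage of all its learners who
    qualify for free or reduced-price lunch (cut points from the
    study's scheme; boundaries are inclusive on the upper side)."""
    if pct_free_reduced_lunch <= 25:
        return "Low Poverty (≤25%)"
    if pct_free_reduced_lunch <= 50:
        return "Mid-Low Poverty (>25% to ≤50%)"
    return "Mid-High Poverty (>50%)"
```

The racial/ethnic categorization described next follows the same pattern with its own labels and the percentage of Non-White students as the input.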

School-level poverty data were available for 1,674 students. Approximately 45% (n = 748) were students from “Mid-Low Poverty (>25% to ≤50%)” buildings. In contrast, 23% (n = 385) came from “Mid-High Poverty (>50%)” buildings. 

To better understand how school demographics may relate to pacing, schools were also categorized by the percentage of Non-White students.

  • Non-White School Population ≤25%
  • Non-White School Population >25% and ≤50%
  • Non-White School Population >50%²

Data on the Non-White School Population were available for 1,676 students. Approximately 68% (n = 1140) were from buildings where the Non-White School Population was ≤25%. Just under 10% (n = 164) of students came from buildings where the Non-White School Population was >50%.

Connecting pacing patterns to school demographics

Cuccolo and Green’s report (2025) revealed that most students (97%) deviate from course pacing guides at least once. When examining pacing guide deviations by the poverty level of students’ home schools, the percentage of students who submitted at least one assignment out of the intended order remained remarkably consistent (approximately 96-98%), varying by less than two percentage points. Further highlighting how common moving out of sequence is, almost 98% of students from Mid-High Poverty (>50%) buildings went out of sequence at least once. Review Table 1 for a detailed breakdown of sequencing behaviors by school poverty level.

Table 1. Pacing Groups by School’s Poverty Level

  Poverty Level                     n     In-Sequence   Out-of-Sequence
  Low Poverty (≤25%)                541   3.88%         96.12%
  Mid-Low Poverty (>25% to ≤50%)    748   2.41%         97.59%
  Mid-High Poverty (>50%)           385   2.34%         97.66%

A similar pattern was observed when analyzing pacing guide deviations by the racial/ethnic makeup of students’ home school buildings. About 97% of students attending schools where the Non-White population was ≤25% submitted at least one assignment out of order. While students from these schools submitted assignments out of sequence most frequently, this value is within two percentage points of those observed in the other categories. Further, the percentage of students who submitted at least one assignment out of order differed by less than one percentage point between schools where the Non-White student population was >25% and ≤50% and schools where it was >50%. Review Table 2 for more details.

Table 2. Pacing Groups by Percent of Schools’ Non-White Population

  %Non-White Category                          n      In-Sequence   Out-of-Sequence
  Non-White School Population ≤25%             1140   2.54%         97.46%
  Non-White School Population >25% and ≤50%    372    4.03%         95.97%
  Non-White School Population >50%             164    3.05%         96.95%

Connecting pacing patterns to school poverty level

Inspecting the average frequency of course pacing guide deviation by school poverty level revealed that the percentage of assignments submitted out of order was highest among students from Mid-High Poverty (>50%) buildings, on average (M = 47.15, SD = 24.39). This was approximately two to four percentage points higher than the other categories. Overall, the average percentage of assignments submitted out of order was fairly comparable across economic categories (approximately 43-47%).

The average magnitude variable provided a look at how “off” pace students were when they submitted assignments out of order. While average magnitude values were consistent across economic categories, students from Low Poverty (≤25%) buildings had the largest values on average (M = 3.74, SD = 3.12), while students from Mid-Low Poverty (>25% to ≤50%) buildings had the smallest (M = 3.39, SD = 2.95). It is worth noting the similarity of these means, which are within 0.35 assignments of each other. Taken together, across economic categories, the extent to which students are “off” pace is typically between three and four assignments. Review Table 3 for the average percentage of assignments submitted out of order and the average magnitude for each group of students.

Table 3. Out of Order Assignments and Average Magnitude by School’s Poverty Level

  Percentage Out of Order
  Economic Category                 Mean (SD)       Min    Median   Max
  Low Poverty (≤25%)                45.02 (26.17)   0.00   48.15    97.22
  Mid-Low Poverty (>25% to ≤50%)    43.55 (25.05)   0.00   46.11    97.70
  Mid-High Poverty (>50%)           47.15 (24.39)   0.00   50.00    95.38

  Average Magnitude
  Economic Category                 Mean (SD)       Min    Median   Max
  Low Poverty (≤25%)                3.74 (3.12)     0.00   2.93     13.89
  Mid-Low Poverty (>25% to ≤50%)    3.39 (2.95)     0.00   2.50     14.16
  Mid-High Poverty (>50%)           3.42 (2.81)     0.00   2.62     14.56

Connecting pacing patterns to the percentage of schools’ Non-White population

Breaking down the percentage of assignments submitted out of order by the school’s Non-White population suggested that students from school buildings where >50% of the population was Non-White submitted the greatest percentage of assignments out of order, on average (M = 48.99, SD = 26.91). On the other hand, students from buildings where the Non-White School Population was >25% and ≤50% had the lowest percentage of assignments submitted out of order, on average (M = 43.81, SD = 25.97). Overall, this was fairly similar to the trends observed across poverty levels, as the percentage of assignments submitted out of order varied by approximately one to five percentage points across ethnic/racial categories. 

There was remarkable consistency in magnitude values when looking across schools’ Non-White populations. The highest average magnitude values were noted among students whose school had a Non-White population of >50% (M = 3.76, SD = 3.01), only about 0.3 assignments greater than the values observed in the two remaining categories. Across buildings with varying Non-White populations, students were approximately three and a half to four assignments “off” pace on average. Review Table 4 for the average percentage of assignments submitted out of order and the average magnitude for each group.

Table 4. Out of Order Assignments and Average Magnitude by School’s Non-White Population

  Percentage Out of Order
  %Non-White Category                          Mean (SD)       Min    Median   Max
  Non-White School Population ≤25%             44.54 (24.81)   0.00   47.11    97.70
  Non-White School Population >25% and ≤50%    43.81 (25.97)   0.00   45.94    97.22
  Non-White School Population >50%             48.99 (26.91)   0.00   54.23    95.00

  Average Magnitude
  %Non-White Category                          Mean (SD)       Min    Median   Max
  Non-White School Population ≤25%             3.47 (2.98)     0.00   2.58     14.56
  Non-White School Population >25% and ≤50%    3.48 (2.94)     0.00   2.70     12.22
  Non-White School Population >50%             3.76 (3.01)     0.00   3.00     13.95

Key findings

On average, students from schools with varying economic and racial/ethnic makeups deviated from pacing guides by approximately 3-4 assignments and submitted just under half of the course content out of order. While this was a near-universal behavior, several patterns stood out:

  • High prevalence of out-of-sequence submissions: Over 95% of students from schools in every demographic group submitted at least one assignment out of order.
  • Pacing trends by poverty level: There was consistency in the percentage of assignments submitted out of order across poverty levels, with a difference of approximately four percentage points between the group with the lowest and highest values. On average, students submitted just under half of their assignments out of order, regardless of building type.
  • Pacing trends by percentage of schools’ Non-White population: There was consistency in the percentage of assignments submitted out of order across buildings with varying Non-White student population percentages—a difference of approximately five percentage points between the group with the lowest and the highest values. Across buildings with varying makeups, students submitted just under half of their assignments out of their intended order. 
  • Extent of deviation: Across building types, students were typically between three and four assignments “off” the intended assignment sequence, on average.
  • Performance thresholds: Cuccolo & Green (2025) found that a drop in final course scores may occur when students submit over 25% of assignments out of order or are more than one assignment “off” from pacing recommendations—on average, all demographic groups exceeded these thresholds.

Implications for educators

These trends suggest that pacing guide deviations are common, but not trivial, among students whose schools have a variety of demographic makeups. Since students who stray from their course pacing guide tend to earn lower grades, early identification is key. Mentors and instructors can support students by:

  • Actively monitoring gradebooks for early signs of pacing issues
  • Reinforcing pacing expectations clearly and consistently
  • Offering feedback and support targeted at helping students stay, or get back, on track

It is important to note that a variety of student, course, and school factors likely interact to contribute to students’ pacing behavior. Although school demographics do not cause pacing behaviors, understanding these patterns may help educators intervene sooner and do so more effectively.

You can check out the full reports below: 

References

Cuccolo, K. & DeBruler, K. (2024). Out of Order, Out of Reach: Navigating Assignment Sequences for STEM Success. Michigan Virtual. https://michiganvirtual.org/research/publications/out-of-order-out-of-reach-navigating-assignment-sequences-for-stem-success/ 

Cuccolo, K. & Green, C. (2025). Out of Order, Still Out of Reach: Navigating Assignment Sequences for MV World Language Courses. Michigan Virtual. https://michiganvirtual.org/research/publications/navigating-assignment-sequences-for-mv-world-language-courses/ 

Freidhoff, J. R., DeBruler, K., Cuccolo, K., & Green, C. (2025). Michigan’s K-12 virtual learning effectiveness report 2023-24. Michigan Virtual. https://michiganvirtual.org/research/publications/michigans-k-12-virtual-learning-effectiveness-report-2023-24/

Kwon, J. B. (2018). Learning trajectories in online mathematics courses. Michigan Virtual University. https://michiganvirtual.org/research/publications/learning-trajectories-in-online-mathematics-courses/

Zweig, J. (2023). The first week in an online course: Differences across schools. Michigan Virtual. https://michiganvirtual.org/research/publications/first-weeks-in-an-online-course/

  1. Due to low ns, the ‘High Poverty (>75%)’ category was combined with the ‘Mid-High Poverty (>50% to ≤75%)’ category to form the existing ‘Mid-High Poverty (>50%)’ category.
  2. Due to low ns, the ‘Non-White School Population >75%’ category was combined with the ‘Non-White School Population >50% to ≤75%’ category to form the existing ‘Non-White School Population >50%’ category.
Out of Order, Still Out of Reach: An Interview with a Researcher https://michiganvirtual.org/blog/out-of-order-still-out-of-reach-an-interview-with-a-researcher/ Wed, 04 Jun 2025 18:52:31 +0000

In this blog, MVLRI researchers synthesize the key findings from two research studies about student assignment submission patterns in Michigan Virtual online courses.


Self-paced asynchronous online courses offer students significant flexibility in when and where learning occurs. Recent research by the Michigan Virtual Learning Research Institute examined how student pacing, particularly the order in which they submit assignments, is related to online STEM and World Language course performance. Understanding students’ pacing behavior and its relationship to course performance can help inform the strategies educators and mentors use when working with students in self-paced online courses.

In the following interview from our “Interview with a Researcher” blog series, the lead researchers behind Michigan Virtual Learning Research Institute’s (MVLRI) STEM and World Languages reports synthesize some take-home messages about students’ assignment submission patterns. 

Why is it important to consider online learners’ assignment submission patterns? 

Assignment submission patterns are a part of a set of student behaviors called pacing—how students progress through a course. Pacing has traditionally been thought of as the timing of students’ assignment submissions. When conceptualizing pacing in this way, we often ask questions like, Are students waiting until the last minute to submit assignments? Are they submitting assignments early or late? Are they submitting a lot of assignments close together? It’s well-established that the timing of student assignment submissions is related to course outcomes. However, our team wanted to know more about the possible impact of out-of-order assignment submissions because, anecdotally, this was a pattern Michigan Virtual (MV) instructors were noticing within these asynchronous courses. Assignment sequencing is the term we gave to describe the order in which students submit their assignments. When we looked at this behavior across two domains (STEM and World Languages), we found evidence that it is related to course outcomes. Specifically, as students submit more assignments out of their intended course order, final course scores tend to decline.  

Why did you feel like it was important to look at assignment sequencing in World Language courses?

Great question! The original hypothesis that prompted this research was that submitting assignments out of their intended order would be detrimental to student performance because it would undermine the scaffolding built into the courses. So, based on this hypothesis, our first study on assignment sequencing used a sample of students enrolled in Michigan Virtual STEM courses since they are highly scaffolded. Of course, scaffolding is likely to vary by subject area and course, so we felt it was important to expand our research. After preliminary analyses of assignment sequencing in several other subject areas, World Language had a high percentage of students who moved out of alignment with course pacing guides and is a distinct subject area from STEM, making it an ideal choice for expanding our previous research. Looking across these two subject areas also allows us to understand the generalizability of our findings, compare and contrast key differences, and provide data-backed recommendations to instructors and mentors of students in these subject areas. 

What did students’ assignment submission patterns look like in World Language courses? Could you explain the relationship between students’ assignment sequencing and their final course scores?

We found that it was really common for students to deviate from course pacing guides! 97% of students submitted at least one assignment out of alignment with their course pacing guide. Among these students, approximately 45% of completed course assignments were submitted out of order. While the volume of assignments submitted out of order was fairly high, students were about three assignments “off” from the intended pacing guide order.

Looking across the spectrum of student performance, we observed that students’ final course scores steadily declined as their assignment submissions became increasingly out of order, both in terms of the number of assignments submitted out of order and how far “off” students were from the pacing guide expectation. To put this in perspective, students who submitted the fewest assignments out of order had average final course scores as much as a full letter grade higher than students who had the greatest number of assignments submitted out of order. 

You mentioned that the first study in this series looked at assignment sequencing in online STEM courses. Were there any notable differences between that study and this one? Did you see any patterns across these two studies?

The general pattern of results was similar across the two studies in that students’ assignment submission patterns had a relationship with final course scores. The biggest difference, however, was in the percentage of students who went out of order in each subject area. While both studies showed high rates of out-of-sequence behavior, 93% of students went out of order in the STEM study compared to 97% in World Languages. Across both studies, course scores steadily declined as students submitted a greater percentage of assignments out of order and strayed further from the intended assignment sequence. In STEM courses, there was a 9.5 point difference in average final course scores between students with the fewest and greatest number of assignments submitted out of order, whereas in World Language courses, there was a 9.6 point difference. The relationship between assignment sequencing and final course scores was really similar across the two studies, which suggests that monitoring and encouraging proper pacing is important for student performance in both subject areas.

Based on your findings across these two studies, what recommendations do you have for online instructors and mentors?

Our findings indicate that it is common for students to deviate from course pacing guides at least once during their time enrolled in MV online asynchronous courses. Some deviation is to be expected and is unlikely to negatively impact student performance, especially if that deviation is infrequent or small (e.g., within a unit). However, if students are consistently moving between units or submitting a high volume (more than 25%) of their assignments out of order, online instructors may want to flag these behaviors (and students) and monitor for performance declines. 

In general, adhering to best practices for online teaching and mentoring is recommended to help online learners be as successful as possible. Communicating course expectations early on (informing students of the structure, workload, pacing, and demands of self-paced online learning) may help students adjust their expectations and approach to their course(s). Regularly checking the gradebook and benchmarking student progress against course pacing guides can help teachers and mentors identify students who may be struggling with course pacing. Mentors and instructors should also communicate regularly about students’ progress and work collaboratively to address pacing issues.

It is also possible that submitting assignments out of order may have a greater impact on some students’ performance than others. For example, students with less content knowledge may miss key benefits of built-in scaffolding when submitting assignments out of order, which may negatively impact course performance. Further, because the design of these studies limits our ability to make cause-and-effect statements, it is likely that other factors interact with pacing to affect student performance. In particular, encouraging the development of metacognitive, time management, and self-regulated learning skills may help students reflect and make adjustments to their own learning behaviors. In this regard, providing students with personalized feedback may be useful.

Final Thoughts

Across two reports, analyses of the relationship between pacing and final course scores have consistently shown that scores decline as students fall increasingly out of alignment with their course pacing guides. Instructors and mentors can help students succeed by paying particular attention to students’ pacing within their online courses.

You can check out the full reports below: 

Out of Order, Out of Reach: Navigating Assignment Sequences for STEM Success

Out of Order, Still Out of Reach: Navigating Assignment Sequences for Michigan Virtual World Language Courses

In addition, this blog is part of a blog series exploring the impact of student assignment submission patterns.

]]>
Exploring Literacy Growth and Engagement: An 8-Week Pilot of Shoelace Learning in the Classroom https://michiganvirtual.org/blog/exploring-literacy-growth-and-engagement-an-8-week-pilot-of-shoelace-learning-in-the-classroom/ Mon, 17 Feb 2025 17:50:52 +0000 https://michiganvirtual.org/?p=93592

Introduction Michigan’s Top Ten Strategic Education Plan was adopted in August 2020 to guide K-12 education stakeholders in working toward a common set of goals. The second goal on the list is to “Improve early literacy achievement” (Michigan Department of Education, n.d.). In recent years, small declines in Michigan students’ performance on the reading portion...

]]>

Introduction

Michigan’s Top Ten Strategic Education Plan was adopted in August 2020 to guide K-12 education stakeholders in working toward a common set of goals. The second goal on the list is to “Improve early literacy achievement” (Michigan Department of Education, n.d.). In recent years, small declines in Michigan students’ performance on the reading portion of the National Assessment of Educational Progress (NAEP) (The Nation’s Report Card, n.d.) have emphasized the need for this goal, as well as the need for new interventions to boost students’ literacy skills. In Fall 2024, Michigan Virtual, a key stakeholder in the educational success of Michigan’s K-12 students, partnered with Shoelace Learning to run a pilot aimed directly at this goal.

Shoelace Learning is an educational technology company focused on building students’ reading comprehension skills through video games. Michigan Virtual worked with Shoelace Learning to implement their games in 13 elementary classrooms across Michigan to study whether these games would improve students’ confidence and literacy skills.

Literacy in the US

Literacy is an important skill with wide-reaching implications for all ages. A study by the Annie E. Casey Foundation noted that children who were not proficient in reading by the end of third grade were four times more likely to drop out of high school than reading-proficient peers (Hernandez, 2011). Those who continue to have poor literacy levels in secondary school often experience difficulty throughout and beyond school (Hakkarainen et al., 2016; The Children’s Reading Foundation, n.d.). Indeed, insufficient literacy skills in adulthood can have financial impacts, as adults with low literacy skills are more likely to be unemployed than adults with high literacy skills (The National School Boards Association, 2014). Educators and policymakers have recognized literacy’s importance and the need to help students develop strong skills early (Michigan Department of Education, 2024).

Over the years, many reading interventions have been implemented with the hope of improving K-12 literacy, with varied success. Interventions such as small-group reading instruction and motivational reading have been shown to positively impact students’ reading abilities (Hall & Burns, 2018; McBreen & Savage, 2021). Unrau et al. (2018) and Moon et al. (2017) have also highlighted that increasing reading enjoyment and self-efficacy may help students engage and persist in reading. Most importantly, early interventions appear crucial for struggling readers (Wanzek et al., 2018).

Gamification

Gamification in education, creating game-like experiences to engage learners with content and help them progress toward a goal, has become a popular intervention strategy in recent years (Dehghanzadeh et al., 2024). Gamification has shown promise in improving behavioral, affective, and cognitive/learning outcomes in K-12 settings, and this may, in part, be attributed to its ability to engage and motivate students (Dehghanzadeh et al., 2024; Huang et al., 2020; Prados Sánchez et al., 2021; Sailer & Homner, 2020). However, while gamification reliably engages students in the task at hand, evidence of its ability to improve student outcomes is mixed and can hinge on factors related to game design, the context in which it is delivered, and learner characteristics (Dehghanzadeh et al., 2024). For this reason, this pilot evaluated Shoelace Learning’s reading comprehension games both on their ability to engage and motivate students and, even more importantly, on their ability to impact student learning.

Study

Given the promise of gamification for engaging students, this pilot examined the impact that providing teachers with games designed to engage students in reading comprehension could have on literacy skills. The study was designed to assess teachers’ perceptions of Shoelace (both its impact on their students and its ease of use in their classrooms), student engagement with the platform, and the efficacy of the games in improving literacy skills. The pilot lasted eight weeks and involved 13 teachers and their classes, all from Michigan.

Design

The main component of the pilot was the eight-week period of play for the students. However, the teachers started their participation a couple of weeks in advance by first participating in a 60-minute orientation and introduction to Shoelace Learning. During this orientation, teachers were provided with an overview of the Shoelace platform and teacher dashboard, including examples of gameplay, assessments, reporting, and assignments.

While teachers had considerable autonomy over how they implemented Shoelace in their classrooms, they were asked to have their students play for a minimum of 30 minutes each week for the eight weeks. During this play period, Michigan Virtual researchers conducted roundtable discussions with many of the teachers and students, and Shoelace employees made in-class visits to a number of the classrooms.

Following the eight weeks of play, gameplay data was collected, and the teachers received a survey that assessed their perceptions of students’ reading comprehension, enjoyment, and confidence, and the usability of Shoelace in the classroom.

Participants

Thirteen teachers were recruited through the Michigan Elementary and Middle School Principals Association (MEMSPA). They came from six school districts with which Michigan Virtual had existing relationships. In return for participating, Michigan Virtual provided teachers with Shoelace access for two years (September 2024 to June 2026) and a $150 stipend upon completing the pilot.

Table 1 provides an overview of the 13 classes that participated. 

Table 1 – Grade level and number of students for each participating class.

Class | Grade | # of Students
1 | 2 | 23
2 | Reading Interventionist (multi-grade) | 15
3 | 4 | 30
4 | 4 | 25
5 | 4 | 19
6 | 3 | 20
7 | 4 | 27
8 | 3 | 24
9 | 3 | 21
10 | 3 | 24
11 | 3 | 25
12 | 3 | 23
13 | 4 | 24

Shoelace

Shoelace is an online platform that provides reading comprehension practice through game-based delivery and is intended for students in grades 3-8. In order to progress in the games, students must correctly answer reading comprehension questions. The reading comprehension questions cover over 100 different reading skills and are each assigned a grade and difficulty level. The questions may be delivered in conjunction with a short reading passage. These bundles (questions + passage) are used to evaluate students’ overall Reading Comprehension Level (RCL). Students also encounter standalone questions, which are questions that do not have a corresponding passage and instead focus on specific skill development.

There are two games that students can choose to play: Dreamscape or Dreamseeker Drift. Dreamscape is a strategy game that is similar to Clash of Clans and is built around a central “vision core,” a diamond-like structure that players must protect and level up to progress. Dreamseeker Drift is an endless runner in a similar vein to Subway Surfers, where players aim to achieve the longest “run” possible by avoiding obstacles. In both games, in order to engage with the game elements (for example, buy new avatars and skins, compete in challenges, or start new runs) students must correctly answer reading comprehension questions.

Gameplay data reported by Shoelace:

  • Learning Moments Delivered (LMD): The number of questions a player answered (regardless of correctness). 
  • Reading Comprehension Level (RCL): A leveling system based on passage (e.g., sentence structure, vocabulary) and question difficulty. The scale ranges from 1.0 to 8.9, with the first number representing grade level (e.g., 3.4 is grade 3) and the second indicating progress towards the next grade level (e.g., 0.4 indicates approximately halfway through the level).

Results

Participating classrooms were expected to have students use Shoelace for 30 minutes per week for eight weeks. Because time-on-task data was not available, we instead used prior Shoelace data showing that 30 minutes of quality play is equivalent to approximately 25 LMD. For the purpose of our evaluation, a student played with fidelity if they played for a minimum of seven weeks (to provide flexibility for absences) with an average of at least 25 LMD per week. For classroom fidelity, we modified this definition to require that 80% of the students in the class played with fidelity. As Table 2 shows, fidelity varied across participating classrooms. As shown in the next section, a class or student who did not meet these benchmarks should not be interpreted as one that did not play at all (or even one that played very little): while 58.7% of students played with fidelity, 91% of students participated for a minimum of five weeks.

Table 2 – Participation and fidelity levels by class. Classes marked with an * achieved class level fidelity.

Class | # of Students | % Who Participated | % Who Played at Least 7 Weeks | % Who Played with Fidelity (7 weeks + min average of 25 LMD)
Overall | 300 | 99.7% | 69.0% | 58.7%
1* | 23 | 100% | 100% | 100%
2 | 15 | 100% | 40.0% | 20.0%
3 | 30 | 96.7% | 0% | 0%
4 | 25 | 100% | 76.0% | 52.0%
5 | 19 | 100% | 0% | 0%
6 | 20 | 100% | 65.0% | 55.0%
7 | 27 | 100% | 77.8% | 66.7%
8* | 24 | 100% | 100% | 95.8%
9* | 21 | 100% | 85.7% | 85.7%
10* | 24 | 100% | 100% | 100%
11 | 25 | 100% | 52.0% | 8.0%
12 | 23 | 100% | 95.7% | 73.9%
13* | 24 | 100% | 100% | 100%

Student and Class Participation

A student was considered to have participated in a given week if they answered at least one question (a minimum of 1 LMD). Total class participation (active # of students / total students in class * 100) provided insight into what percentage of a class played. Students were free to play either game during the eight-week pilot period, so the following discussion is agnostic as to which game they played. Of the 300 students across the 13 classes, 299 (99.7%) played at least once.
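The participation metric described above can be sketched as a small calculation. This is an illustrative example only; the function name and sample data are ours, not Shoelace’s:

```python
def weekly_participation(weekly_lmd, class_size):
    """Percent of a class active in a given week.

    A student counts as active if they answered at least one
    question (>= 1 LMD) that week.
    """
    active = sum(1 for lmd in weekly_lmd if lmd >= 1)
    return active / class_size * 100

# Hypothetical class of 23 students, 20 of whom answered at least one question:
lmd_counts = [5, 0, 12] + [30] * 17 + [0, 0, 1]
print(round(weekly_participation(lmd_counts, 23), 1))  # 87.0
```

Averaging this value across the eight weeks would yield the “Average Participation Across Weeks” column reported in Table 3.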

Figure 1 shows the average percentage of active students per week. Weekly participation ranged from about 71% to 91%, with no clear trend: it fluctuated from week to week but stayed relatively high overall, between approximately 80% and 90%. The largest gain, 9.7 percentage points, occurred from week 2 to week 3, while the largest drop, about 6.7 percentage points, occurred between weeks 5 and 6.

Figure 1 – Percent of students who played by week.

At the class level, as seen in Table 3, participation fluctuated more widely from week to week; in most weeks, a class either had nearly everyone play or no one play. Class 3 was the main outlier, as its maximum participation never surpassed 60% in a given week.

Table 3 – Min, max, and average participation levels by class.

Class | Min Participation | Max Participation | Average Participation Across Weeks
1 | 91.3% | 100% | 98.4%
2 | 0.0% | 93.3% | 76.7%
3 | 0.0% | 60.0% | 40.8%
4 | 56.0% | 100% | 87.5%
5 | 0.0% | 100% | 71.7%
6 | 65.0% | 95.0% | 81.3%
7 | 77.8% | 96.3% | 89.8%
8 | 87.5% | 100% | 96.9%
9 | 81.0% | 100% | 93.5%
10 | 91.7% | 100% | 97.4%
11 | 48.0% | 96.0% | 76.0%
12 | 87.0% | 100% | 96.7%
13 | 95.8% | 100% | 99.5%

Learning Moments Delivered (LMD), Accuracy and Guessing

Learning Moments Delivered (LMD) represents a single question attempted by a student (regardless of whether they got it right or wrong). On average, students attempted 721 LMD over the eight-week period. Excluding the one student who did not participate, total LMD over the period varied widely, ranging from a low of 3 to a high of 6,858. Similarly, weekly averages ranged from a low of 1 to a high of 859. Figure 2 highlights the variation in student data.

Figure 2 – Percent of students grouped by their average weekly LMD counts. Target was ≥25 LMD/week.

In addition to looking at the total number of LMD, student accuracy was also examined. Accuracy was calculated by dividing the number of correctly answered questions by the total number of questions students attempted. Table 4 shows the min, max, and average accuracy by class and overall. Students averaged an accuracy rate of 50.2%, with a minimum of 26.7% and a maximum of 94.7%.

Table 4 – Min, max, average accuracy by class over the eight-week period. 

Class | Min Accuracy | Max Accuracy | Average Accuracy
Overall | 26.7% | 94.7% | 50.2%
1 | 30.4% | 74.4% | 40.4%
2 | 31.5% | 63.4% | 47.7%
3 | 28.6% | 86.4% | 53.0%
4 | 33.3% | 77.4% | 55.7%
5 | 29.5% | 76.0% | 43.2%
6 | 26.7% | 80.0% | 50.9%
7 | 27.9% | 72.2% | 42.0%
8 | 31.9% | 81.1% | 54.7%
9 | 28.9% | 76.4% | 50.0%
10 | 28.0% | 74.2% | 45.8%
11 | 31.3% | 72.0% | 54.2%
12 | 29.9% | 71.7% | 51.5%
13 | 32.8% | 94.7% | 61.0%

Prior Shoelace data shows that maintaining an accuracy rate of 50% or higher in the learning engine is suggestive of positive use and learning. Because the learning engine periodically introduces incrementally more difficult content to determine whether students are ready to engage with it, student accuracy rates are not akin to, and should not be compared with, those produced by summative assessments. Accuracy rates between 40% and 50% usually indicate students are struggling with the content and may be turning to guessing (something often seen with students whose reading comprehension is too low for the program). Accuracy rates below 40% are generally indicative of students who have primarily turned to guessing.
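A minimal sketch of how the accuracy measure and these interpretation bands could be computed. The code and names are our illustration, not Shoelace’s learning engine:

```python
def accuracy(correct, attempted):
    """Accuracy = correctly answered questions / total attempted, as a percent."""
    return correct / attempted * 100

def interpret_accuracy(pct):
    """Map an accuracy rate to the usage bands described above."""
    if pct >= 50:
        return "positive use and learning"
    elif pct >= 40:
        return "struggling, may be guessing"
    else:
        return "primarily guessing"

# A student who answered 101 of 201 attempted questions correctly:
print(interpret_accuracy(accuracy(101, 201)))  # positive use and learning
```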

Reading Comprehension Levels (RCL)

Reading Comprehension Level (RCL) is Shoelace’s leveling system based on passage and question difficulty. RCL thus provides important information about students’ progress and reading abilities over the course of their engagement with Shoelace. A student’s RCL can range from 1.0 to 8.9: the first digit represents the grade, and the value after the decimal represents progress toward the next grade. An RCL of 3.4 would mean the student is reading at a grade 3 level and is approximately 40% of the way to grade 4. Most students in a class had RCL data available (97.9% on average, SD = 0.04).
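The RCL encoding above can be unpacked with simple arithmetic. A sketch for illustration; `interpret_rcl` is our name, not part of Shoelace’s reporting:

```python
def interpret_rcl(rcl):
    """Decode a Reading Comprehension Level (RCL) value.

    The integer part is the grade level; the decimal part is the
    approximate percent of progress toward the next grade.
    """
    grade = int(rcl)
    progress_pct = round((rcl - grade) * 100)
    return grade, progress_pct

# An RCL of 3.4: reading at grade 3, about 40% of the way to grade 4.
print(interpret_rcl(3.4))  # (3, 40)
```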

The first RCL value that students receive is set by the placement test that is initiated upon a student’s first play session. The placement test is a quick assessment designed to adjust the starting point for the learning engine, and it results in a value between 1.0 and 8.5 in increments of 0.5. At the start of the pilot, the average RCL after the placement test was 2.0, with a range from 1.0 to 6.5. By the end of the eight weeks, the average RCL was 2.1, with a range of 1.0 to 6.6.

Figure 3 shows the average change by class. On average, classes showed an increase of 0.13 RCL, with a range of -0.1 to 0.4.

Figure 3 – Average start vs end RCL by class.

Table 5 shows how many students fall into specific RCL Change groupings based on whether they played with fidelity. RCL Change data was available for 295 of 299 (98.7%) students: 175 played with fidelity, while 120 did not. Of the 175 students who played with fidelity (59.3%), 97 (55.4%) saw an RCL change greater than 0.0, and 75.4% of those 97 saw a change greater than 0.2. Of the 120 students who did not play with fidelity (40.7%), 40 (33.3%) saw an RCL change greater than 0.0, and 20 of those (50.0%) saw changes greater than 0.2.

Table 5 – RCL changes by student based on fidelity of play.

RCL Change | Played with Fidelity (7+ weeks, ≥25 LMD/week) | Did Not Play with Fidelity (<7 weeks and/or <25 LMD/week)
< 0 | 25 (14.3%) | 33 (27.5%)
0 | 53 (30.3%) | 47 (39.2%)
> 0 | 97 (55.4%) | 40 (33.3%)
Total Students | 175 | 120

Teacher Perceptions

At the end of the eight-week play period, the 13 teachers filled out a survey about their perceptions of how playing Shoelace games impacted students’ reading confidence, comprehension, fluency, vocabulary, and enjoyment. The vast majority of teachers reported that they believed using Shoelace increased or significantly increased their students’ confidence in reading (n=10), reading comprehension (n=10), fluency (n=10), vocabulary development (n=12), and enjoyment of reading (n=12). All remaining teachers reported no change (n=1-3, depending on the measure). Table 6 provides a closer look at teachers’ ratings of student outcomes.

Table 6 – Teacher ratings of student growth.

Outcome | No Change | Increased | Significantly Increased
Confidence | 23.1% (n=3) | 69.2% (n=9) | 7.7% (n=1)
Comprehension | 23.1% (n=3) | 69.2% (n=9) | 7.7% (n=1)
Fluency | 23.1% (n=3) | 69.2% (n=9) | 7.7% (n=1)
Vocabulary | 7.7% (n=1) | 84.6% (n=11) | 7.7% (n=1)
Enjoyment | 7.7% (n=1) | 84.6% (n=11) | 7.7% (n=1)

Teachers were also asked how easy or difficult they found it to include Shoelace in their classroom and how well it aligned with their curriculum and daily teaching. Table 7 shows that the vast majority of teachers were satisfied or very satisfied with all aspects of Shoelace usability (n=11-13).

Table 7 – Teacher perceptions of Shoelace usability.

Aspect | Dissatisfied | Neutral | Satisfied | Very Satisfied
Curriculum | 7.7% (n=1) | 7.7% (n=1) | 61.5% (n=8) | 23.1% (n=3)
Daily Teaching | 7.7% (n=1) | 7.7% (n=1) | 61.5% (n=8) | 23.1% (n=3)
Training Support | 0.0% (n=0) | 0.0% (n=0) | 61.5% (n=8) | 38.5% (n=5)
Teaching Tool | 0.0% (n=0) | 15.4% (n=2) | 61.5% (n=8) | 23.1% (n=3)
Technical | 0.0% (n=0) | 7.7% (n=1) | 69.2% (n=9) | 23.1% (n=3)

Discussion

Class Participation and Engagement

Class-level engagement patterns appear consistent with teachers’ self-reported data about the usability and impact of Dreamseeker Drift and Dreamscape. Survey data collected from teachers indicated a positive experience overall, with 11 of the 13 teachers satisfied or very satisfied with integrating Shoelace into the curriculum, their daily teaching, and its use as a teaching tool. There was, however, considerable variation in engagement levels across classes and fluctuations in both the percentage of active students and average LMD over the eight-week pilot period. While only five of the classes reached the benchmark for class-level fidelity, and a few classes had very few students who hit the individual benchmark, overall 58.7% of students played with fidelity. This level of fidelity can be considered highly positive, as research by Stanhope and Rectanus (2015) has shown that, for most products, on average only 5.2% of student licenses reach full dosage amounts.

The LMD values across the pilot, at both the individual student and class level, revealed substantial variability, which likely points to both between- and within-class differences in game use. Survey results and roundtable discussions with teachers suggest this may partly reflect the learning curve teachers identified in implementing the games in their classrooms. That learning curve, along with the autonomy teachers had over their individual implementations, makes these results unsurprising. Similarly, the differences in weekly LMD averages and accuracy are likely influenced by factors such as available class time, other commitments, how teachers chose to implement Shoelace, ease of implementation, student perceptions, and students’ reading abilities.

The overall activation rate (students who played at least once) of 99.7% significantly exceeded the industry average student license activation rate of 63.4% across school and district sites (Stanhope & Rectanus, 2015). This exceptionally high activation rate may reflect positive teacher perceptions of the platform’s curricular alignment and/or teacher commitment to implementation fidelity, given that teachers were taking part in a paid research pilot. While outside motivators may have influenced activation and continued engagement with the platform, the volume of students who answered large numbers of LMD (71.3% attempted ≥200 LMD) shows that the games engaged the vast majority of students over the entire pilot period.

Impact on Literacy Skills

Shoelace assigns students their first Reading Comprehension Level (RCL) at the end of the initial placement test. After that, their RCL value is updated each time a student completes a passage and question bundle. How it changes (increasing, decreasing, or staying the same) depends on how the student performs (accuracy over the entire bundle), the difficulty of the passage, and the students’ current RCL. RCL values range from 1.0 to 8.9. For this study, changes to students’ RCL over the eight-week period were used as a proxy for the improvement of their literacy skills. 

While the classes in the study ranged from grades 2 through 5, the initial placement test results revealed that students’ abilities covered a much larger range, from 1.0 (equivalent to the start of grade 1) to 6.5 (equivalent to about halfway through grade 6), with an average of 2.0. This distribution reflects a challenge teachers frequently encounter in their classrooms: teaching students with widely varying skill levels.

By the end of the eight weeks, the range had widened slightly (1.0 to 6.6), but more notable was the increase in the average RCL to 2.1. While a 0.1 increase may sound minor, each 0.1 step on the RCL scale can be viewed as roughly equivalent to a month’s progress (given a 10-month school year). Digging further into the data, of the students who played with fidelity, 55.4% had at least a 0.1 increase and 42% had an increase of at least 0.3, or three or more months’ worth of growth in eight weeks. Of the students who did not reach fidelity, 33.3% had an increase of at least 0.1 and 16.7% showed three or more months of growth. These values suggest a relationship between consistent gameplay and positive change in RCL.

In addition to these results, the survey and conversations with teachers showed that teachers, too, perceived positive literacy growth. In the survey results, a minimum of 76.9% of teachers (10 of the 13) reported increases in students’ confidence, comprehension, fluency, vocabulary, and enjoyment.

Limitations of the Study

Classrooms were expected to play either of the Shoelace games for a minimum of 30 minutes a week for each of the eight weeks of the pilot. At the class level, a class was considered to have met this goal and played with fidelity if a minimum of 80% of its students played with fidelity (a reduced benchmark intended to provide flexibility). Even so, only 38.5% (5 of the 13) of participating classes met the fidelity threshold. At the individual level, students did slightly better, with 58.7% meeting the fidelity definition of playing a minimum of seven weeks and answering an average of at least 25 LMD per week.

There are many possible reasons for students and classes not hitting the fidelity benchmark, including (but not limited to) time limitations, unknown barriers to implementation (e.g., confusion around gameplay), and student (or teacher) absences. Overall, while the number of classes that achieved fidelity was low, the data above suggest there is good reason to set the bar for fidelity high and to push for this level of usage.

The irregular usage may also point to barriers to consistent implementation (e.g., time) or to perceptions about Shoelace’s usability or impact that were not captured by the survey data. Indeed, whole-class conversations revealed that many teachers perceived Shoelace as having something of a learning curve and that it required them to be “hands-on” while their students were engaged with the games. Students had overall positive perceptions of the games, with many talking enthusiastically about them; however, they also noted that more detailed in-game help would be beneficial. While a training session was provided to teachers at the start of the pilot and additional training materials were made available, conversations with the teachers who received in-person visits from the Shoelace team revealed that additional face-to-face (vs. asynchronous) training was extremely useful, especially once their classes had started to play and the teachers had a clearer understanding of where they needed further information and training. The time and effort required to implement Dreamseeker Drift and Dreamscape may have limited some teachers’ classroom use.

Conclusion

Students with low literacy skills struggle throughout their K-12 educational journey, but the impact does not end there: it follows them throughout their adult lives, affecting everything from the jobs they hold to their financial health. Addressing this problem is the second goal of Michigan’s Top Ten Strategic Education Plan. Michigan Virtual and Shoelace Learning partnered to run an eight-week pilot to examine whether using Shoelace games in Michigan classrooms would improve students’ confidence and literacy skills.

The results of this study indicate that Shoelace games can produce gains in student literacy and confidence, with four important notes:

  1. Fidelity of usage is important. The students who played most consistently across the eight weeks showed the greatest gains, with 42% of them seeing the equivalent of at least 3 months of reading growth. 
  2. There is a learning curve for teachers, and they need training and support to overcome it. An initial overview and introduction to the platform is not enough; teachers need follow-up once their students have started playing to address any questions and concerns that come up in the classroom. 
  3. Engagement was high. While the educators who chose to participate received a stipend and two-year Shoelace licenses for doing so, that does not discount just how high participation levels remained throughout the pilot and beyond. More than half the students answered over 300 LMD during the eight-week pilot, and more than 80% were still playing more than two months after its conclusion.
  4. Teachers’ positive perceptions of Shoelace and its impact on students’ skills were high. The vast majority of the teachers indicated that their students improved their confidence, comprehension, fluency, vocabulary, and overall enjoyment. A similar majority found that it met their curriculum and daily teaching needs. 

The combination of the pilot data and the teachers’ positive perceptions points to the overall utility of Shoelace as a tool for teachers looking to improve their students’ reading comprehension.

References

Hakkarainen, A. M., Holopainen, L. K., & Savolainen, H. K. (2016). The impact of learning difficulties and socioemotional and behavioural problems on transition to postsecondary education or work life in Finland: A five-year follow-up study. European Journal of Special Needs Education, 31(2), 171–186. https://eric.ed.gov/?id=EJ1095058

Hall, M. S., & Burns, M. K. (2018). Meta-analysis of targeted small-group reading interventions. Journal of School Psychology, 66, 54-66. https://doi.org/10.1016/j.jsp.2017.11.002

Hernandez, D. J. (2011). Double jeopardy: How third-grade reading skills and poverty influence high school graduation. Annie E. Casey Foundation. https://www.aecf.org/resources/double-jeopardy

Huang, R., Ritzhaupt, A. D., Sommer, M., et al. (2020). The impact of gamification in educational settings on student learning outcomes: A meta-analysis. Educational Technology Research and Development, 68, 1875–1901. https://doi.org/10.1007/s11423-020-09807-z

McBreen, M., & Savage, R. (2021). The impact of motivational reading instruction on the reading achievement and motivation of students: A systematic review and meta-analysis. Educational Psychology Review, 33(3), 1125-1163. https://doi.org/10.1007/s10648-020-09584-4

Michigan Department of Education. (2024, September 26). Child Literacy Would Improve Under Bills Passed by Michigan Lawmakers. https://www.michigan.gov/mde/news-and-information/press-releases/2024/09/26/child-literacy-would-improve-under-bills-passed-by-michigan-lawmakers

Michigan Department of Education. (n.d.). Michigan’s Top 10 Strategic Education Plan.
https://www.michigan.gov/mde/-/media/Project/Websites/mde/top10/top_10_mi_strategic_ed_plan_promising_practices_1_pager.pdf?rev=13e8d60cd2be4ab4aa6f3cd86bbcf532&hash=5002C7DD79CA084D872FC3C1053603C2

Moon, A. L., Wold, C. M., & Francom, G. M. (2017). Enhancing reading comprehension with student-centered iPad applications. TechTrends, 61, 187–194. https://doi.org/10.1007/s11528-016-0153-1

National School Boards Association: Center for Public Education. (2014, October). Beyond fiction: The importance of reading for information. https://cdn-files.nsba.org/s3fs-public/Beyond-Fiction-Full-Report-PDF.pdf

Prados Sánchez, G., Cózar-Gutiérrez, R., del Olmo-Muñoz, J., & González-Calero, J. A. (2021). Impact of a gamified platform in the promotion of reading comprehension and attitudes towards reading in primary education. Computer Assisted Language Learning, 36(4), 669–693. https://doi.org/10.1080/09588221.2021.1939388

Sailer, M., & Homner, L. (2020). The gamification of learning: A meta-analysis. Educational Psychology Review, 32, 77–112. https://doi.org/10.1007/s10648-019-09498-w

Stanhope, D., & Rectanus, K. (2015). Current realities of EdTech use: Research brief. Lea(R)n, Inc. https://www.ikzadvisors.com/wp-content/uploads/CurrentRealitiesOfEdTechUse_Infographic_ResearchBrief.pdf

The Children’s Reading Foundation. (n.d.). What’s the impact? https://readingfoundation.org/the-impact

The Nation’s Report Card. (n.d.). National achievement-level results.
https://www.nationsreportcard.gov/reading/nation/achievement/?grade=8

Unrau, N. J., Rueda, R., Son, E., Polanin, J. R., Lundeen, R. J., & Muraszewski, A. K. (2018). Can reading self-efficacy be modified? A meta-analysis of the impact of interventions on reading self-efficacy. Review of Educational Research, 88(2), 167-204.

Wanzek, J., Stevens, E. A., Williams, K. J., Scammacca, N., Vaughn, S., & Sargent, K. (2018). Current Evidence on the Effects of Intensive Early Reading Interventions. Journal of Learning Disabilities, 51(6), 612–624. https://doi.org/10.1177/0022219418775110

]]>
Project-Based Learning and Competency-Based Education Work Together at FlexTech: An Interview with an Educator https://michiganvirtual.org/blog/pbl-and-cbe-work-together-at-flextech/ Tue, 15 Oct 2024 20:40:07 +0000 https://michiganvirtual.site.strattic.io/?p=89605

By blending project-based learning with competency-based education, FlexTech aims to provide a personalized path to graduation, meeting both academic standards and students’ personal growth goals.

]]>

The K-12 education landscape is evolving, offering more options for students and their families seeking alternatives to the traditional model. Many desire flexible learning environments that better align with their needs and interests. At the same time, educators and school leaders are rethinking how to make learning more engaging and centered around essential skills or competencies that students must master.

Two instructional models driving this shift are project-based learning (PBL)—where students gain knowledge by tackling real-world and personally meaningful projects—and competency-based education (CBE)—where students advance by demonstrating mastery of competencies or key skills rather than completing a set number of classroom hours.

At FlexTech High School, these two approaches come together in a flexible model that enables students to incorporate their interests as they progress at their own pace toward graduation. To explore how FlexTech blends PBL and CBE, Michigan Virtual Learning Research Institute (MVLRI) researchers spoke with Dr. Sarah Pazur, director of school leadership at CS Partners, who oversees the FlexTech High School network in Michigan. The transcript of our conversation was edited for clarity and brevity. 

Can you tell us a little bit about FlexTech and why it was important for you to implement competency-based education?

FlexTech is a project-based, competency-based, blended-learning school offering face-to-face instruction Monday through Thursday with optional in-person student support offered on Fridays. All of our courses are accessible online through Google Classroom, allowing students to choose a learning model that works best for them—whether fully in-person, hybrid, or entirely online. Our advisory program, which is really focused on helping students find their purpose and make a post-secondary plan that suits their passions and strengths, is the cornerstone of FlexTech. Beginning in the ninth grade, every student is paired with an advisor who follows them through their senior year and serves as an advocate for that student. 

There are three FlexTech campuses, all located in Michigan: Brighton, Oakland, and Shepherd. Each campus is small by design, with around 175 students at both Brighton and Oakland and 80 at Shepherd. Our small size enables us to offer students many options, flexibility, and one-on-one instructional support. FlexTech was actually designed as a competency-based school from the start. It was born out of a desire to provide options for students to finish high school. Whether students needed to work during the day, faced life circumstances that made school difficult, or left their previous school because they weren’t successful, we wanted to provide them with an option. Our flexibility and personalized support are designed to serve a wide range of students, including those who may not have been successful in traditional models.

What challenges did FlexTech face in implementing competency-based education?

One of our biggest challenges has been finding an LMS (learning management system) and a gradebook that work well in a competency-based/project-based learning environment to account for all of the competency-based nuances (e.g., indicating the “rigor rating” of an assignment, coordinating with a student’s transcript). Many learning management systems are designed around the student’s schedule; however, we need one that focuses, instead, on the project. This allows us to more accurately track competencies and allow for flexibility in terms of mastery. 

Another challenge has been tracking student progress. Because we’re competency-based, students can progress at their own pace, so we need detailed tracking systems to monitor which competencies each student has mastered and which still need work. Because of seat time (instructional time) requirements and expectations that students finish a course in a semester or in a year, we’ve had to create many of our own internal systems using spreadsheets and Google Docs, which allow teachers to make notes from semester to semester and year to year.

One area where we’ve experienced some pushback is around our grading scale. In our system, a 4.0 indicates a student can apply a skill in a new and novel way. However, the ceiling for an assignment that only requires mastery of a lower-level skill might be a 2.0. We’ve had to work hard to help parents understand that these numbers don’t directly translate to traditional letter grades.

Tell me a little bit about what competency-based education looks like in practice at FlexTech.

At FlexTech, competencies are primarily subject-based. While some competencies are multidisciplinary, most are tied to specific courses to ensure students meet credit requirements. To develop our competencies, we worked with curriculum designers and a consultant to create a crosswalk that aligns our competencies with state standards (e.g., MMC, NextGen, CCSS). This alignment process is crucial because it ensures that our competencies truly reflect the big ideas we want students to take away from each course. We continue to diligently revisit and revise these competencies to ensure they are still reflective of the right big ideas and standards. 

PBL and CBE complement each other at FlexTech. Most courses are designed to be project-based; however, some are more authentically project-based than others, as we do offer some off-shoot traditional courses for students who need them (e.g., a basic math class). Typically, students either a) start a project based on a personal interest or current event, and then, with help, competencies are wrapped into the project; or b) teachers create the “frame,” outlining specific competencies for the project, and then students create the “focus” based on their interests. Projects run for 3-10 weeks and typically focus on 1-3 specific competencies. Milestones, formative feedback, and self-reflection are key components of every project. Our project-based approach helps avoid treating CBE as a checklist, keeping the focus on authentic learning experiences. While there is a general class pace, FlexTech offers flexibility through asynchronous learning for students who move faster or slower. Daily project support time and one-on-one appointments on Fridays provide additional individualized assistance to ensure students stay on track.

Grades at FlexTech focus purely on competency mastery, unlike traditional systems where behavior or extra credit might influence grades. Teachers provide ongoing formative feedback to help students gauge their progress. Each campus functions a little differently when it comes to its transcript. When I was the principal at the Oakland campus, we used a conversion scale because parents wanted a more traditional transcript. However, the Brighton and Shepherd campuses offer a fully competency-based transcript, which is accompanied by a one-page explanatory document for universities (and parents).
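A transcript conversion scale like the one described above can be as simple as a table of cut points mapping a 0–4 competency score onto a traditional letter grade. The sketch below is a minimal, hypothetical Python illustration; the interview does not publish FlexTech’s actual scale, so every cutoff and label here is an assumption.

```python
# Hypothetical conversion scale from a 0-4 competency mastery score to a
# traditional letter grade. Cut points are illustrative only; the interview
# does not state FlexTech's actual scale.
COMPETENCY_TO_LETTER = [
    (3.5, "A"),  # applies the skill in new and novel ways
    (3.0, "B"),
    (2.5, "C"),
    (2.0, "D"),  # mastery of a lower-level skill
]

def to_letter(score: float) -> str:
    """Map a 0-4 competency score onto a traditional letter grade."""
    for cutoff, letter in COMPETENCY_TO_LETTER:
        if score >= cutoff:
            return letter
    return "Not yet demonstrated"

print(to_letter(4.0))  # A
print(to_letter(2.0))  # D
```

A one-page explanatory document, like the one the Brighton and Shepherd campuses attach to their transcripts, would spell out exactly these cut points for universities and parents.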

How do you measure success with regard to competency-based education? 

We measure success in multiple ways. Because we’re project-based, students have the flexibility to demonstrate mastery of academic competencies in creative ways, which isn’t restricted by a rigid curriculum. Our smaller size allows us to track student progress very closely. Beyond academic competencies, we focus on students’ personal growth through the competencies outlined in our Portrait of a Graduate (e.g., passionate, problem-solver, growth mindset). Our Senior Chronicle serves as a culminating portfolio where students reflect on how they’ve grown throughout their time with us, including their ability to meet our Portrait of a Graduate competencies. 

We also measure success by hearing stories of FlexTech students who are relieved of the traditional school “pressures” (e.g., lack of flexibility, social anxiety, inability to catch up) and are now finding their way. A fundamental part of FlexTech is taking students who are not doing well and helping them, whether that means helping them for five or maybe even six years. However, maintaining our identity and inclusivity within an education system with a strict four-year finish policy is a challenge.

What advice would you give to other school leaders interested in implementing competency-based education?

Start by staying true to your values and beliefs about teaching and learning. Implementing CBE is not just about helping students finish faster—it’s about embracing a philosophy that requires a complete system-wide shift. This means rethinking everything from your instructional framework to your scheduling, course offerings, and even your LMS. Developing a shared vision is crucial, and so is being prepared to make some concessions along the way. 

It’s also essential to get everyone on board from the beginning, especially with the “why” behind CBE. We have a very personalized intake process, where we interview every family and explain how FlexTech functions. We also offer sessions during curriculum nights to educate parents and students about competency-based education and what it means for their learning journey.

Lastly, consider piloting the program on a small scale first to see how it fits within your school or district before going full-scale.

Final thoughts

FlexTech’s project-based and competency-based model is designed to give students more autonomy in how they meet academic standards and develop essential skills. While this approach provides opportunities for personalized support, it also presents challenges, such as navigating external expectations for a four-year graduation timeline and finding internal systems that can effectively track student progress in a non-traditional format. FlexTech continues to adapt its model in response to these challenges while maintaining a focus on individualized student growth.

]]>
https://michiganvirtual.org/wp-content/uploads/2024/10/PBL-CBE.jpghttps://michiganvirtual.org/wp-content/uploads/2024/10/PBL-CBE-150x150.jpg
Understanding What Motivates High School Students to Pursue Computer Science https://michiganvirtual.org/blog/understanding-what-motivates-high-school-students-to-pursue-computer-science/ Fri, 27 Sep 2024 12:36:41 +0000 https://michiganvirtual.site.strattic.io/?p=89468

As computer science (CS) continues to grow in importance in K-12 education, understanding what motivates students to pursue this field is becoming increasingly vital. In a study, Dr. Aman Yadav from Michigan State University and Dr. Kristen DeBruler from Michigan Virtual studied how students’ motivation – beliefs about their abilities (self-efficacy), the perceived challenges of learning CS (cost), and the perceived value of the subject (value) – shape their intentions to continue studying CS.

]]>

As computer science (CS) continues to grow in importance in K-12 education, understanding what motivates students to pursue this field is becoming increasingly vital. In a study, Dr. Aman Yadav from Michigan State University and Dr. Kristen DeBruler from Michigan Virtual studied how students’ motivation – beliefs about their abilities (self-efficacy), the perceived challenges of learning CS (cost), and the perceived value of the subject (value) – shape their intentions to continue studying CS. 

Connecting Motivation to Computer Science

When applying these concepts to computer science, it becomes clear why motivation is crucial. CS is often seen as challenging, requiring complex problem-solving skills and a significant time investment. This perception can either motivate students who see the effort as worthwhile or discourage those who find the challenge too daunting. Moreover, understanding how CS is applied in real-world careers, like data analysis, can enhance students’ appreciation for its utility and relevance.

The Study: High School Students’ Experiences in Online CS Courses

The researchers focused on 44 high school students enrolled in online AP Computer Science courses, examining how self-efficacy, cost, and utility influenced their intention to continue studying CS. Here’s what they found:

Self-efficacy initially appeared to be a significant factor in predicting students’ intent to pursue CS. This means those who felt more capable in their CS courses were more inclined to continue. However, when other factors (cost and utility) were included in the analysis, self-efficacy’s impact diminished.

Perceived cost had a surprising effect. Students who believed that studying CS would require significant effort were actually more likely to want to continue! This finding challenges the assumption that high perceived cost always discourages engagement. It suggests that students might associate CS with a meaningful challenge worth their time and effort.

Utility value showed an unexpected negative relationship with intent to pursue CS. Students who saw a higher utility in studying CS were less likely to want to continue. One possible explanation is that students may feel the subject’s relevance but find the commitment to learning it too demanding, especially in an online setting where support and guidance might be limited.

What Does This Mean for Teaching Computer Science?

The findings highlight the complex ways in which students’ perceptions influence their motivation to study computer science. The idea that high perceived cost can increase motivation suggests that students who view CS as a challenge are willing to tackle it if they see the effort as rewarding. However, the negative relationship between utility value and intent to pursue suggests that even if students understand the importance of CS, they might need more support to overcome perceived difficulties.

For educators, these insights are essential. As more high schools introduce CS courses, especially online options, it’s crucial to:

  • Provide support and resources to help students overcome the challenges of studying CS, ensuring they feel capable and confident.
  • Highlight the real-world applications of CS, clarifying the subject’s utility and emphasizing how students can succeed despite the challenges.

To learn more and explore related research, you can read the following papers:

Lishinski, A., & Yadav, A. (2021). Self-evaluation interventions: Impact on self-efficacy and performance in introductory programming. ACM Transactions on Computing Education. DOI: 10.1145/3447378  

Lishinski, A., Yadav, A., Good, J., & Enbody, R. (2016). Learning to program: Gender differences and interactive effects of students’ motivation, goals, and self-efficacy on performance. In Proceedings of the International Computing Education Research Conference (pp. 211-220). Melbourne, Australia: Association for Computing Machinery. DOI: 10.1145/2960310.2960328.

]]>
https://michiganvirtual.org/wp-content/uploads/2018/04/Focused-young-African-student-sitting-on-stairs-using-a-laptop-iStock-840243874.jpghttps://michiganvirtual.org/wp-content/uploads/2018/04/Focused-young-African-student-sitting-on-stairs-using-a-laptop-iStock-840243874-150x150.jpg
From Playtime to Purpose: How Hobbies Can Set Your Child on the Path to Success https://michiganvirtual.org/blog/from-playtime-to-purpose-how-hobbies-can-set-your-child-on-the-path-to-success/ Wed, 21 Aug 2024 12:51:40 +0000 https://michiganvirtual.site.strattic.io/?p=88832

Hobbies aren’t just fun—they help kids grow. From building confidence to developing real-world skills, hobbies play a key role in helping children discover who they are and what they care about. In this post, we explore how parents and educators can support hobby-based learning and create space for curiosity, creativity, and long-term success.

]]>

Do you remember the first time you picked up a hobby just for the fun of it? Maybe it was painting, coding, or playing the guitar. For me, it was writing stories in the margins of my school notebooks. That quiet joy of creating something, just because I loved it, stayed with me.

That’s the thing about hobbies. They’re more than just something to pass the time. They spark curiosity, build skills, and help shape who we become. And when we give kids the chance to explore their interests—especially outside of the classroom—we’re helping them discover parts of themselves they might not find otherwise.

Why Hobbies Matter

We’ve all seen the moment when a child lights up after trying something new and realizing they love it. Whether it’s cooking, sketching, building robots, or playing a team sport, these experiences help kids connect with their creativity and build confidence.

In the rush of school, homework, and screen time, hobbies can get pushed to the side. But carving out space for them is more important than ever. Hobbies offer kids something school can’t always provide: time to explore freely, take healthy risks, and express themselves without a grade attached.

They also build real-world skills. Through hobbies, kids learn how to problem-solve, stick with something challenging, manage their time, and work toward a goal. These are the kinds of skills that carry them through school and into adulthood.

In fact, research from the National Institutes of Health shows that leisure activities can reduce stress, boost brain function, and improve creativity. So yes, letting kids play, build, and explore can actually help them succeed in school, too.

Helping Kids Find a Hobby They Love

Not every child knows right away what excites them, and that’s okay. Exploration is part of the process. Here are a few ways to help them find something that sticks:

1. Offer a range of experiences

Let them try different things. Community centers and schools often offer low-cost classes or clubs that introduce kids to everything from music to robotics.

2. Notice what catches their attention

Do they draw in every notebook? Are they always building with blocks or tinkering with tech? Follow those clues and offer opportunities to go deeper.

3. Encourage group activities

Social hobbies like sports, theater, or coding clubs help kids connect with peers, build teamwork skills, and stay motivated.

4. Be flexible

It’s normal for kids to bounce between interests. Support the journey, not just the outcome. Over time, they’ll land on something that clicks.

5. Join in when you can

Some hobbies are more fun when they’re shared. Cook together. Build something as a family. In classrooms, invite students to showcase their interests and share their skills with others.

When Hobbies Become Pathways

Here’s something else to consider: hobbies can sometimes lead to future career paths. While it might be hard to imagine a child’s fascination with video games turning into something more, the reality is that many hobbies can translate into valuable career skills. For example, a love of gaming could lead to a career in game design or esports. Similarly, a passion for photography could evolve into a professional photography career.

With the rise of the gig economy and digital platforms, what starts as a hobby can often become a side hustle or even a full-time job. By supporting a child’s hobbies, we’re not just helping them develop as individuals—we’re also opening up future possibilities they might not have considered otherwise.

Resources to Explore

If you’re looking for resources to help your child discover new hobbies, there are plenty of places to start. Here are a few recommendations:

  • Common Sense Media: Offers reviews and recommendations for educational tools, apps, and games to help kids discover new interests.
  • Local Community Centers and Libraries: Many offer free or low-cost classes and events, especially after school or during school breaks.
  • YouTube & Online Platforms: From guitar tutorials to animation guides, there are thousands of kid-friendly resources available. Educators can help curate safe, high-quality options for independent learning.
  • Enrichment Programs: Courses like Wiz Kid Learning’s Roblox Game Design or other structured micro-courses offer students a chance to dive into their interests in a more guided way.

Supporting Hobbies Through Flexible Learning

Today’s learning environment gives us more flexibility than ever. With options like online enrichment courses, after-school clubs, and independent projects, students can explore their hobbies in ways that work with their schedule and learning style.

These flexible experiences help students build confidence and take ownership of their learning. And when a child sees their passion taken seriously, they’re more likely to stick with it and grow.

Final Thoughts

At the heart of it, hobbies are about more than just having fun. They’re tools for growth, connection, and self-discovery. Whether your child is learning an instrument, building games, or exploring nature, those experiences can shape who they are—and who they’ll become.

So keep encouraging their curiosity. Let them try new things. Celebrate the process, even when it’s messy or short-lived. Because those moments might be the start of something truly meaningful.

Want to help your child take the next step?

Explore enrichment courses designed to support student interests, build skills, and make learning feel like play.
Register Now

]]>
https://michiganvirtual.org/wp-content/uploads/2024/08/iStock-1620882724.jpghttps://michiganvirtual.org/wp-content/uploads/2024/08/iStock-1620882724-150x150.jpg
Key Strategies for Supporting Disengaged and Struggling Students: An Interview With A Researcher https://michiganvirtual.org/blog/key-strategies-for-supporting-disengaged-and-struggling-students-an-interview-with-a-researcher/ Fri, 21 Jun 2024 13:51:40 +0000 https://michiganvirtual.site.strattic.io/?p=87635

In an era where virtual learning is becoming increasingly prevalent, understanding the best practices for engaging students online is crucial. Researchers at the Michigan Virtual Learning Research Institute (MVLRI) have conducted a comprehensive study to uncover effective strategies used by virtual educators, particularly those that help disengaged and struggling students succeed. The following interview, part of our “Interview with a Researcher” blog series, shares some highlights from this research.

]]>

In an era where virtual learning is becoming increasingly prevalent, understanding the best practices for engaging students online is crucial. Researchers at the Michigan Virtual Learning Research Institute (MVLRI) have conducted a comprehensive study to uncover effective strategies used by virtual educators, particularly those that help disengaged and struggling students succeed. 

The following interview, part of our “Interview with a Researcher” blog series, shares some highlights from this research.

Why was it important to examine effective practices in virtual learning environments, especially for struggling students?

The shift to emergency remote instruction during the COVID-19 pandemic highlighted a significant disparity in the success of virtual learning implementations. Schools with pre-existing, well-established virtual teaching practices fared much better. We wanted to identify what made these programs successful and how new virtual teachers and administrators could adopt these practices to better engage all students, particularly those who are disengaged or struggling.

What were some of the key strategies identified for engaging disengaged or struggling students in virtual environments?

One of the most frequently used and effective strategies was providing frequent and specific feedback, which was reported by nearly 79% of educators. This type of feedback not only supports academic progress but also helps in building strong teacher-student relationships. Additionally, involving other adults, such as onsite mentors and parents, was identified as crucial. Around 69% of educators communicated with the student’s onsite mentor, and 61% encouraged parental involvement. These strategies help bridge the gap created by the lack of physical presence in virtual learning.

Communication seems to be a recurring theme. Can you elaborate on the importance of communication in virtual learning environments?

Absolutely. Communication is the backbone of virtual education. Effective communication strategies include maintaining regular contact through various channels like LMS messaging, phone calls, and web conferencing tools. Many educators also emphasized the importance of being available for students through scheduled office hours or drop-in times. Establishing clear communication channels helps ensure that students, parents, and educators are on the same page, which is vital for student engagement and success.

The study also looked at professional development for virtual educators. What sources of professional development were found to be most effective?

Our findings showed that optional opportunities provided by the virtual school or program were considered the most effective, with over 50% of educators endorsing them. Conferences and informal peer mentoring were also highly valued. These professional development sources are preferred because they are immediately applicable and foster a sense of community among educators, which is essential for sharing best practices and support.

What challenges did educators face in virtual teaching, particularly in connecting with disengaged students?

One of the biggest challenges is the lack of face-to-face interaction, which makes it difficult to read body language and establish personal connections. This physical separation also complicates identifying the specific reasons behind a student’s disengagement. Additionally, educators mentioned difficulties in effectively communicating with parents and guardians, who are crucial allies in supporting student engagement and progress.

Based on your research, what recommendations would you give to new virtual teachers working with disengaged or struggling students?

Focus on building strong relationships with your students from the beginning. Use frequent, specific feedback to show students that you care about their progress. Keep open channels of communication and be flexible with your teaching methods to accommodate diverse learning needs. Also, involve parents and onsite mentors whenever possible to create a supportive network around the student. Flexibility, patience, and a personalized approach are key.

The insights from this study underscore the importance of tailored strategies, consistent communication, and community support in virtual learning environments. By focusing on relationship-building, providing specific feedback, and involving parents and mentors, educators can significantly improve engagement and success for all students, especially those who struggle. As virtual learning continues to evolve, these findings offer a valuable roadmap for educators seeking to enhance their practices and better support their students in a digital age.

]]>
https://michiganvirtual.org/wp-content/uploads/2020/09/iStock-1215017775.jpghttps://michiganvirtual.org/wp-content/uploads/2020/09/iStock-1215017775-150x150.jpg
Solving the Pacing Puzzle: Course Design and Technical Considerations for Pacing in K-12 Online Learning https://michiganvirtual.org/blog/solving-the-pacing-puzzle-course-design-and-technical-considerations-for-pacing-in-k-12-online-learning/ Mon, 03 Jun 2024 17:32:48 +0000 https://michiganvirtual.site.strattic.io/?p=87043

Pacing is critical to student success in online learning, and supporting effective pacing is a team effort. This blog explores how Michigan Virtual staff leverages technology and course design principles to uplift student learning through proper pacing.

]]>

Introduction

Helping students succeed in their online courses is a team effort involving leveraging course design principles and technology to facilitate learning. Instructional designers use technology to design courses that optimize students’ and teachers’ experiences. Technology operations staff help implement and adapt systems to meet users’ needs. Pacing, or how students move through a course, is important to student success. Cramming and submitting assignments out of their intended order are associated with poor course performance (DeBruler, 2021; Cuccolo & DeBruler, 2024; Michigan Virtual Learning Research Institute, 2019). Because online courses are dynamic environments facilitating learning “anytime, anyplace,” it is crucial to leverage technology and course design principles to support students’ pacing to optimize their experience. In this blog, we’ll explore the pivotal roles of instructional designers and technology operations in students’ experiences with pacing in online courses.

Expert Interviews

Michigan Virtual Learning Research Institute researchers talked with Kim Garvison and Megan Riggers from Michigan Virtual’s Instructional Product Development (iPD) team, and with Kristen Crain, Senior Director of Technology Operations, about the interplay between technology and course design principles in addressing pacing. This blog highlights central themes that cut across our conversations. The transcript has been edited for clarity and brevity.

Functionality: Supporting Students, Teachers, and Guardians through LMS Features

Centering Students and Teachers

How is course pacing important within the context of course design?

We try to ensure the workload is evenly distributed. We consider when students might be doing these assignments and how we’re balancing intensity in terms of cognitive load for the student and the teacher. Another way we consider pacing is through how we distribute auto- and teacher-graded assignments. Even though we have a short turnaround time for teachers to grade assignments, balancing the type of assignment does help move students through their course without hitting roadblocks. 

What design choices have you found helpful for supporting students struggling with pacing in their online courses?

As designers, we have to think about our end users and how much is realistic for them to handle. What grade are they in? What’s their age? What’s their reading level? What can they handle? What experience might they have in an online setting? We try to create courses with predictable structures: each unit contains approximately the same number of lessons and follows the same general structure. That way, both teachers and students know what to expect. Predictability also helps mentors and parents support students more effectively during their courses.

We also ensure our content is accessible for students with different learning needs. For example, our LMS helps ensure we have appropriate alt text (descriptive text concisely conveying the meaning of an image). Having hurdles for students with different learning needs is a big deal for their pacing, and therefore is a big deal to us when designing a course.

Organizational Tools Aid in User Experience

Organizational tools provided by the LMS are critical for helping students navigate the course efficiently. For instructors, organization allows them to serve students more effectively by creating a centralized location where they can assess students’ progress.

How can technology or the LMS be leveraged to address pacing?

We are very intentional with setting up course navigation. We don’t want students thinking “What do I do next? How do I find the quiz? Where do I click?” That’s the background instructional design part that helps students work through their content. A good LMS ensures that any friction students encounter comes from the content and the learning, not from navigating their course. The technology is what allows us to not think about the technology; the less any of us have to think about it, the better sign that it’s working well for us.

Our LMS is heavily invested in simplifying the instructor and student experience. Brightspace centralizes everything so an instructor can see who is accessing content, who submitted content, and student grades in one view. LMSs also do a great job creating buckets of content, nesting units together, and keeping students moving through all the content linearly.

Reminders

Reminder tools within the LMS were viewed positively as a way to provide guardrails for students as they move through content at their own pace.

Is there anything that helps students adhere more closely to the pacing guides?

Instead of thinking, “Okay, on this date, we have to blast out this reminder,” for our courses, location is more important than time because we don’t know where the students will be within a course at a specific date.

In addition to strategically placed reminders within the LMS, data from the LMS can be used to create reminders that are pushed out to students and guardians.

LMSs can compile extensive amounts of data in a consumable way for a school, an instructor, and even parents. They can leverage that data in tools, widgets, calendars, emails, and other features that make the student experience more streamlined. For example, certain LMSs let parents opt into features such as automated daily emails telling them what’s due for their child.
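As an illustrative sketch of the daily-digest idea described above (the student name, assignment titles, and `daily_digest` helper are hypothetical, not an actual LMS API), such an email body could be assembled from due-date data the LMS exposes:

```python
# Hypothetical sketch: build a daily "what's due" digest for a parent
# from due-date records an LMS might expose.

from datetime import date

def daily_digest(child, items, today):
    """Return a one-line digest of the child's items due today."""
    due_today = [i["title"] for i in items if i["due"] == today]
    if not due_today:
        return f"{child} has nothing due today."
    return f"Due today for {child}: " + ", ".join(due_today)

# Hypothetical assignment records pulled from the LMS:
items = [
    {"title": "Unit 2 Quiz", "due": date(2024, 5, 17)},
    {"title": "Essay Draft", "due": date(2024, 5, 20)},
]
print(daily_digest("Alex", items, today=date(2024, 5, 17)))
# Due today for Alex: Unit 2 Quiz
```

In practice the LMS would schedule and send this as an email on an opt-in basis; the sketch only shows how the message content could be derived from the data.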

Leveraging LMS to Promote Sequential Course Progression

How do you encourage students to follow the logical progression of the course content? Do you use any specific tools?

We use LMS features designed to help students get back on track. For example, we include a checklist of assignments at the beginning of each unit. There are also LMS features at the end of lessons and units that point out incomplete assignments and remind students to go back. Our LMS also provides the option to put a password on an assignment; we do this for all final exams as another way to say, “Hey, this is important.”

We make sure everything is scaffolded so it’s clear to students that they have to go through certain assignments before they can do the next project. If students skip content, the teacher can point out that they don’t meet the rubric requirements because we develop rubrics that emphasize the lesson’s content.

We also have an internally designed pacing guide application that’s accessible by the student and the mentor so they can see a week-by-week breakdown of what students should accomplish including graded and non-graded activities.

Addressing Assignment Cherry-Picking

We’ve conducted research showing students tend to favor auto-graded and higher-point assignments. What are some potential workarounds for addressing this cherry-picking?

We try to anticipate it and consider the point spread and the ratio of auto-graded to teacher-graded assignments. We don’t want a student to be able to only take the auto-graded quizzes and pass the course. Usually, we go for approximately 40% auto-graded to 60% teacher-graded.
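The point-spread check described above can be sketched mechanically. This is a hypothetical helper (the point values and passing threshold are illustrative, not Michigan Virtual's actual figures) that verifies auto-graded assignments alone cannot earn a passing grade:

```python
# Hypothetical sketch: confirm that a course's auto-graded points alone
# are not enough to reach the passing threshold.

def can_pass_on_autograded(assignments, passing_pct=60.0):
    """assignments: list of (points, is_autograded) tuples."""
    total = sum(pts for pts, _ in assignments)
    auto = sum(pts for pts, is_auto in assignments if is_auto)
    return (auto / total) * 100 >= passing_pct

# Roughly the 40% auto-graded / 60% teacher-graded split mentioned above:
course = [(40, True), (60, False)]
print(can_pass_on_autograded(course))  # False: 40% of points < 60% needed
```

A course failing this check (returning `True`) would signal during design that the ratio should be rebalanced before students can cherry-pick their way to a passing grade.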

One way to address this is to incorporate conditional release, which can unlock specific content, units, or modules based on completion. We don’t use this tool often, but it can create different learning paths for students. One student may come into a course and say, “I want to learn today via video.” They then go into the content, choose the method they want, and it unlocks content that’s all video-based instruction.
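The completion-based unlocking mechanic can be sketched as a prerequisite check. This is an illustrative model (the unit names and `unlocked_units` helper are hypothetical, not a Brightspace API), showing how a unit becomes visible only once all of its prerequisites are complete:

```python
# Hypothetical sketch of completion-based conditional release:
# a unit unlocks only when all of its prerequisites are complete.

def unlocked_units(prereqs, completed):
    """prereqs: {unit: set of prerequisite units}; completed: set of units."""
    return {unit for unit, reqs in prereqs.items() if reqs <= completed}

prereqs = {
    "Unit 1": set(),                   # always available
    "Unit 2": {"Unit 1"},
    "Unit 3": {"Unit 1", "Unit 2"},
}
print(sorted(unlocked_units(prereqs, completed={"Unit 1"})))
# ['Unit 1', 'Unit 2'] -- Unit 3 stays locked until Unit 2 is done
```

The choice-based variant described above works the same way, except the "prerequisite" is the student's selected learning method rather than a completed unit.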

We leave everything open, though, as a transparency piece so the student can see the whole scope of the class and every graded item upfront. Sometimes waiting for a teacher to grade an assignment before being able to move on frustrates students. In addition, only allowing students to see one unit at a time (rather than the whole course) can add another layer of frustration. From an instructional design standpoint, it seems easy: just lock it until they pass. From the user side, however, that’s not always how it goes.

Thinking Ahead

What would an LMS that perfectly addresses students’ pacing needs look like?

One that automatically sets a student’s pace when they start the course, based on its duration, and is flexible enough to let teachers or administrators override, re-pace, or modify students’ pacing in bulk, since they deal with large course loads.

It would be neat if courses had their own AI bot that could provide students with reminders like, “Hey, you should be working on this assignment,” or “Hey, it looks like you’re behind here. Your new due date is [date].”

Course Pacing Blog Series

In our Course Pacing blog series, we discuss pacing and how it impacts student success with input from several different subject matter experts. Our hope with this series is to bring to light how different organizations and experts approach course pacing, share their insights and struggles, provide relevant research and resources, and determine areas for future research. Stay up to date on future blogs in this series by signing up for email notifications!

Solving the Pacing Puzzle: Supporting Student Progress in K-12 Online Programs https://michiganvirtual.org/blog/supporting-student-progress-in-k-12-online-programs/ Fri, 17 May 2024 18:44:09 +0000 https://michiganvirtual.site.strattic.io/?p=86994

In online learning, effective course pacing is crucial for student success. This blog explores how Michigan Virtual addresses course pacing challenges and develops effective pacing guides to support student learning.


One of the benefits of online learning is that students can work at their own pace. However, not all students have developed the time management skills to work through a course consistently. Perhaps unsurprisingly, research has shown that consistent course pacing results in higher student achievement (DeBruler, 2021). 

To explore course pacing through the lens of an online program administrator, researchers from the Michigan Virtual Learning Research Institute (MVLRI) interviewed Andrea McKay, Director of Instruction for Michigan Virtual, in April 2024. The transcript from our interview has been edited for clarity and brevity. During our conversation, we explored how intentional decisions by online program administrators, combined with support from online teachers and on-site mentors, guide students to stay on pace in their online courses.

Understanding Course Pacing and Developing Pacing Guides

How is course pacing currently addressed in Michigan Virtual courses? For example, how do you ensure students keep up without feeling rushed?

We provide pacing guides within our courses that show students a week-by-week breakdown of assignments to complete. Our teachers can access and adjust the pacing guide if students get behind and need a new plan to help them catch up. Once students have enrolled, their start date and the length of the term remain the same; however, we can adjust the pacing guide so students see which assignments they need to complete in the shorter working time that remains after falling behind.

How do you develop pacing guides for your courses? 

After a course is designed, the pacing guide is developed by distributing the content and assignments across the number of weeks in the term. Subject matter experts who understand the content and assignment expectations then evaluate the pacing guide to ensure the pace is realistic. In addition, our courses are regularly updated, which includes re-evaluating the pacing guide and making any necessary adjustments.
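The mechanical first pass described above can be sketched as a small helper. This is an illustrative draft, not Michigan Virtual's actual tooling; the assignment names are placeholders, and the even split is exactly what subject matter experts would then review and adjust:

```python
# Hypothetical sketch: draft a week-by-week pacing guide by spreading
# an ordered list of assignments evenly across the weeks in the term.

def draft_pacing_guide(assignments, weeks):
    """Return {week_number: [assignments]} preserving assignment order."""
    guide = {w: [] for w in range(1, weeks + 1)}
    per_week = len(assignments) / weeks
    for i, assignment in enumerate(assignments):
        week = min(int(i / per_week) + 1, weeks)
        guide[week].append(assignment)
    return guide

assignments = [f"Assignment {n}" for n in range(1, 11)]
guide = draft_pacing_guide(assignments, weeks=4)

# Re-pacing a student who fell behind is the same operation applied
# to the weeks that remain:
catch_up = draft_pacing_guide(assignments, weeks=3)
```

Note that re-pacing reuses the generator with fewer weeks, mirroring how the adjusted guide shows the same assignments compressed into the remaining working time.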

Tools for Pacing, Progress Tracking, and Data Utilization

Are there any tools or platforms that Michigan Virtual uses to help students keep pace and track their progress?

Teachers send monthly progress reports to the student, mentor, and parent so everyone understands the student’s course progress. Progress reports are personalized to each student by pulling data from our LMS (Brightspace) and student information system (SIS), such as the number of complete and incomplete assignments, the current grade, and the total number of assignments in the course. Some teachers add information such as personalized comments or a reminder of the course end date, for example, “Are you on track to complete the course by [course end date]?” With a few adjustments on the back end, this tool within our LMS allows teachers to send these personalized progress reports to their entire course roster. In addition, teachers use our SIS to sort students in a course by start date and, based on that, send timely communication to help keep students on pace. Sorting students within a course by start date is essential for accurately tracking progress because we offer many different course start dates within each semester so courses align with school districts’ calendars.
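To make the data flow concrete, here is a hypothetical sketch of assembling one such report (the student record, field names, and `progress_report` helper are illustrative assumptions, not the actual Brightspace or SIS fields):

```python
# Hypothetical sketch: assemble a personalized progress report from
# fields a teacher might pull from the LMS and SIS.

def progress_report(student):
    """Format a short progress summary for email to student/mentor/parent."""
    done = student["complete"]
    total = student["total_assignments"]
    return (
        f"Hi {student['name']}, you have completed {done} of {total} "
        f"assignments ({done / total:.0%}). Your current grade is "
        f"{student['grade']:.1f}%. Are you on track to finish by "
        f"{student['end_date']}?"
    )

student = {                      # hypothetical merged LMS/SIS record
    "name": "Jordan",
    "complete": 18,
    "total_assignments": 40,
    "grade": 87.5,
    "end_date": "2024-06-07",
}
print(progress_report(student))
```

The "few adjustments on the back end" mentioned above would amount to mapping each roster entry through a template like this one before sending.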

Online Program Course Pacing: Challenges and Solutions

What major challenges do you run into regarding course pacing in an online setting?

One major challenge is grade reporting. A common request from schools is to know a student’s current grade to determine whether that student is eligible to play sports. Unfortunately, the way our systems report students’ scores does not give them that information. In our courses, students start with zero points and build up their total course points with every assessment submitted. We provide an accurate display of their overall score at all times, but it’s not the same as a weekly grade that schools may be more used to. However, adjusting our grade reporting process would result in far fewer start and end date options and reduced flexibility in submission deadlines.

What do you think are the biggest pacing hurdles for students learning online? 

While the flexibility to work at your own pace is a common reason why many students take courses with us, some students need help to learn how to manage their time effectively. If a student gets behind, assignments add up quickly, resulting in a poor learning experience and a mountain of assignments to submit. 

What challenges do teachers face with pacing in their online courses? 

If instructors are overloaded with a flurry of assignment submissions at the end of a course, they cannot provide the same quality of feedback while also meeting grading turnaround expectations. That feedback is also less effective: students who procrastinate and turn in numerous assignments during the last few weeks are probably not thinking about whether they met a learning target, what content they might need to revisit, or how to grow from instructor feedback; they’re just trying to get through it. Another problem when students leave so many assignments for the end of the course is the temptation to take shortcuts and plagiarize, which becomes another huge headache at the end of a term. We have found the number of plagiarism incidents increases drastically in the last few weeks of a course. As a result, teachers communicate progress regularly and try to keep students on pace.

Given our recent finding that the extent to which students submit assignments out of order is associated with lower grades, what’s your take on getting students to stick to the order and pace you set out?

Teachers understand that students may choose to complete assignments based on which ones will earn them a higher score more quickly, otherwise known as cherry-picking. Unfortunately, this means that rather than working sequentially through courses designed to scaffold and build skills as students progress, students are sometimes more focused on points than learning. Our teachers recognize this and will reach out to students to get them back on track. 

Pacing: Mentor and Parent Roles

Do mentors and parents play a part in monitoring students’ pacing? What does that collaboration look like?

There is a triangle of student support between the teacher, mentor, and (ideally) the parent as they are a critical component in overall student success (Borup & Stimson, 2017; Borup et al., 2017). Both mentors and parents are copied on progress reports to remain aware of student pace. In addition, specific mentor reports and monitoring tools are available through our LMS, Brightspace, to help mentors support students most effectively. Teachers understand that parents and mentors may know students in ways that they do not, so we all value the roles and support they provide and work hard to establish and maintain open lines of communication. 

Looking Down the Road

Are there any big changes you would like to see or pacing challenges you are preparing for in K-12 online learning?

In addition to continuing to meet districts’ needs for different course start and end dates and providing the flexibility of an adjustable pacing guide, we would love to offer a different view of students’ scores based on current progress and performance, as school districts request it so frequently. This is a current limitation of our flexible scheduling options and how they interact with the LMS; however, we are working to find solutions. Although students will always have access to a pacing guide and will inevitably tally up points, my other hope is to reach a point where students think more about their learning than about the points.

Course Pacing Blog Series

In this Course Pacing Blog Series, we discuss pacing and how it impacts student success with input from several different subject matter experts. Our hope with this series is to bring to light how different organizations and experts approach course pacing, share their insights and struggles, provide relevant research and resources, and determine areas for future research. Stay up to date on future blogs in this series by signing up for email notifications!

References

Borup, J., & Stimson, R. (2017). Helping students be successful: Mentor responsibilities. Michigan Virtual University. https://michiganvirtual.org/research/publications/helping-online-students-be-successful-mentor-responsibilities/

Borup, J., Chambers, C. B., & Stimson, R. (2017). Helping online students be successful: Parental engagement. Michigan Virtual University. https://michiganvirtual.org/research/publications/helping-online-students-be-successful-parental-engagement/

DeBruler, K. (2021). Research on K-12 online best practices. Michigan Virtual. https://michiganvirtual.org/blog/research-on-k-12-online-best-practices

Acknowledgments

The author would like to thank Andrea McKay and Dr. Shannon Smith from Michigan Virtual’s Learning Services team as well as Dr. Kelly Cuccolo and Dr. Kristen DeBruler from the Michigan Virtual Learning Research Institute for their contributions and advice in developing this blog post.
