Summary

The first quarter of IXL usage saw the completion of Goal 1: universal screener. Goals 2 and 3 were not completed. Reviewing usage data at the school, teacher, and student levels made it apparent that a few students drove the school-wide usage averages and that many students had little interaction with IXL during quarter 1. As a result, the targets of complete diagnostic data and 30 questions per subject per student per week were unattainable. Continuing with IXL requires progress monitoring, leveraging grade books and discipline, and committing to common usage schedules. The following identifies lessons learned and an action plan for the successful implementation of IXL as our MTSS software platform.


Lessons Learned

School-level and teacher-level data obscure actual student engagement

School Level Data: Questions Per Student Per Week

[Figure]

Teacher Level Data: Questions Per Student Per Week

[Figure]

Student Level Data: Actual Usage

[Figure]

A few students clicking through large numbers of questions inflate the questions-per-week averages without authentic engagement.
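The distortion described above is a mean-versus-median effect. A minimal sketch with hypothetical question counts (the numbers below are illustrative, not from our IXL reports) shows how two heavy users can pull the average past the 30-question target while the median reveals that most students did almost nothing:

```python
# Hypothetical weekly IXL question counts for ten students.
# Two students clicking through rapidly dominate the total.
from statistics import mean, median

questions_per_week = [210, 180, 15, 5, 0, 10, 0, 5, 0, 5]  # illustrative data

avg = mean(questions_per_week)       # school "average" clears the target
med = median(questions_per_week)     # typical student is far below it
at_target = sum(1 for q in questions_per_week if q >= 30)

print(f"mean={avg}, median={med}, students at 30+ questions={at_target}/10")
```

Here the mean is 43 questions per week (above the 30-question goal), but the median is 5 and only 2 of 10 students actually met the target, which is why student-level data, not school or teacher averages, must drive progress monitoring.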

ILSs are unrealistic environments for engaging with IXL

Teachers and students use ILSs for work completion, an environment better suited to redirecting and assisting individual students with competing needs than to monitoring whole-group activities. For example, student A and student A’s teacher will prioritize graded homework over meaningful engagement with IXL, which fits the purpose of the ILS. This makes IXL sessions spontaneous and difficult to monitor, a problem compounded by differing expectations from one ILS to another.