Our Top 5 Lessons Learned About Sustainable Scale through ASSISTments’ EIR Grant

ASSISTments is funded in part by a grant from the U.S. Department of Education’s Education Innovation and Research (EIR) Program. The first two years of this grant have focused on three key “scale-up” mechanisms: areas where we can further develop and improve to achieve greater reach and impact. Thus far, we’ve engaged hundreds of our teacher users and learned valuable lessons about scaling sustainably. As we enter the third year of the grant, we want to share these lessons to support other nonprofits in the early, pivotal stages of growth.

ASSISTments was first developed in 2003 with a primary focus on enabling research in the learning sciences, shedding light on what improves math teaching and learning. In 2019, we launched a non-profit, The ASSISTments Foundation, to scale our research and support for schools across the US. Over the past two years, the non-profit has grown to 19 full-time employees who serve 21,000 teachers and 550,000 learners. A key accelerator for this scale has been an $8M grant from the U.S. Department of Education’s Education Innovation and Research (EIR) Program. Given ASSISTments’ demonstrated impact on student learning, the Department of Education was invested in supporting our next stage of growth, ensuring our product reaches even more teachers.

The grant was awarded to Worcester Polytechnic Institute (WPI), where the tool was first developed, and where we still maintain a team of innovators overseen by Dr. Neil Heffernan. The work of the grant is conducted in partnership with The ASSISTments Foundation, as well as the Center for Math Achievement at Lesley University and WestEd, our evaluation partner.

The first two years of this grant have focused on what we have named as three key “scale-up” mechanisms, areas where we can further develop and improve to achieve greater reach and impact:

  • Improve our product usability, making sure ASSISTments is intuitive for teachers and meets their needs. 
  • Create high quality Virtual Professional Learning Communities, as a low-cost and effective alternative to in-person coaching and teacher support. 
  • Develop an innovative new product feature, called Instructional Recommendations (IRs), providing teachers automated recommendations, written by expert teachers, for how to take action based on ASSISTments data.

In doing the above work, we’ve engaged hundreds of our teacher users and learned valuable lessons about scaling sustainably. As we enter the third year of the grant, we want to share these lessons to support other nonprofits in the early, pivotal stages of growth.

Top 5 lessons learned:

1. The importance of being obsessed with user wants and needs

When we established ourselves as a non-profit, we also built a strong team of product engineers and designers who could dedicate themselves to understanding our users’ needs and making changes accordingly. Over the past two years, we have run multiple focus groups and usability studies to inform our product roadmap, which led us to add functionality and features that align with teacher wants and needs. For example, we launched an integration with Canvas, expanding the market of teachers who can use ASSISTments, and significantly updated our UX/UI to allow teachers to search our content library by standards and customize assignments. We have also improved the student interface, making it easier for students to submit work on open-response problems. These improvements directly reflect what we learned from our teacher users during focus groups and usability studies. We will continue to release improvements to the teacher interface of ASSISTments, and we are currently designing a product for instructional leaders at the school and district level.

2. Piloting is invaluable, especially when paired with a strong formative assessment approach 

One of the key advantages of the way this grant program is structured is that it allowed for a period of testing, learning, and iterating with real users, which is essential for developing something useful.

As a key example, we were able to study and iterate on our virtual professional learning community (vPLC) design over four semesters, running different variations of cohorts with more than 50 7th grade math teachers from all over the country. We varied the number of teachers in each cohort as well as their experience with ASSISTments.

[Image: Iterations of the vPLC design]

Through our various iterations, we measured success by looking at teacher usage of ASSISTments, teacher engagement in conversations, and teacher satisfaction with their experience.

[Image: vPLC survey data]

Through this piloting, we have now finalized the scope, sequence, and structure of our eight vPLC sessions and between-session activities, and have identified the elements of a successful vPLC to share with others.

3. The potential for virtual PLCs to turn teachers from tired to inspired

Through our various iterations, we found success with our vPLCs by building a foundation around four characteristics.

  1. Establish a common connection to ground the vPLC

The common connection we established was centering our vPLC around 7th grade math teachers who were new to using ASSISTments. This allowed each session to focus on a new component of using ASSISTments successfully in the classroom and allowed teachers to build upon common experiences.

  2. Provide opportunities for authentic sharing

During each session, we shared authentic stories from teachers utilizing ASSISTments in their classroom and provided a space where participants could share their own stories with each other. This authentic sharing allowed our vPLC to be built upon real stories from real teachers and centered around our participants’ experiences. 

  3. Encourage equity of voice

We intentionally encouraged our participants to share their ideas by training our facilitators to call on participants, providing various ways participants could share in a way they felt comfortable, and allowing for multiple small-group shares before discussing in a whole-group setting.

  4. Provide time to apply

Since each session of the vPLC was focused on a component of using ASSISTments, we wanted to provide participants time to connect what was discussed in the vPLC to their own practice. Providing this time for participants to apply the knowledge gained ensured that the learning could be implemented. 

Using these four characteristics as the foundation of our vPLC allowed us to create a low-cost and effective alternative to in-person coaching and teacher support, and a space where teachers went from tired to inspired.

4. The promising potential of automated teacher supports within digital platforms

One of the most powerful parts of ASSISTments is the data it provides teachers on student progress. However, having access to data is only the first step; the real value comes when teachers use that data to modify instruction in ways that best support their students’ progress. It can be challenging for teachers to find the time to interpret the data and use it to plan instruction. We wanted to develop an innovative feature that provides teachers with high-quality instructional guidance, based on their students’ data, on supporting students who struggle with a concept. We want to make it as easy as possible for teachers to take action.

To develop this feature, we engaged 22 expert math teachers with strong familiarity with the specific curricula we offer within ASSISTments (Illustrative Math or Eureka Math). They examined data from thousands of students on individual problems within the curriculum, including the common wrong answers and points of confusion students have historically experienced. They then crafted brief instructional recommendations for teachers to consider when looking at their own assignment report data.

To give a simple example, let’s say a teacher assigns the below math problem to their students, and the majority of students select “Yes,” which is the incorrect answer.

[Image: example test question]

The teacher might then get a message that says: “If students enter yes, review the image in Grade 7 unit 1 lesson 3 activity 3 to remind students that if they use additive thinking, scaled figures won’t come out right.”
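The matching behind such a feature can be sketched in a few lines. This is a minimal illustration only, not ASSISTments’ actual implementation: the problem ID, the lookup-table structure, and the majority threshold are all assumptions made for the example.

```python
from collections import Counter

# Hypothetical lookup table: expert-written recommendations keyed by
# (problem ID, common wrong answer). IDs and structure are illustrative.
RECOMMENDATIONS = {
    ("g7.u1.l3.p2", "Yes"): (
        "If students enter yes, review the image in Grade 7 unit 1 "
        "lesson 3 activity 3 to remind students that if they use "
        "additive thinking, scaled figures won't come out right."
    ),
}


def recommend(problem_id, student_answers, threshold=0.5):
    """Return an expert recommendation if a common wrong answer dominates.

    A recommendation is surfaced only when the most frequent answer was
    given by at least `threshold` of students AND has a matching
    expert-written note in the table.
    """
    if not student_answers:
        return None
    # Find the single most common answer and how many students gave it.
    answer, count = Counter(student_answers).most_common(1)[0]
    if count / len(student_answers) >= threshold:
        return RECOMMENDATIONS.get((problem_id, answer))
    return None
```

For instance, if two of three students answer “Yes” on the illustrative problem above, `recommend("g7.u1.l3.p2", ["Yes", "Yes", "No"])` would surface the note; if most students answer correctly, it returns nothing and the teacher’s report stays uncluttered.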

We currently have over 1,000 high-quality Instructional Recommendations embedded in ASSISTments that are delivered to teachers when available. In the coming year, we plan to continue adding high-quality instructional recommendations, as well as leverage analytics from our platform and surveys to understand whether the recommendations are being used and are supporting increased use of our data to drive instruction.

Preliminary user research showed strong positive reactions from teachers at having this kind of resource at their fingertips, pointing to the potential for digital platforms to support effective use through automated instructional features.

5. The challenges of study recruitment and the need for system-level support

A key aspect of our grant is conducting a large-scale evaluation of ASSISTments. We conducted recruitment for our first study cohort between February and September 2021. The team developed recruitment materials and reached out via multiple channels. This outreach resulted in interest from 206 teachers and 12 other district personnel (representing 143 districts in 35 states). We found this to be an incredibly positive response and felt hopeful about reaching our goal of an initial 40-school cohort. However, follow-up conversations with schools and school districts told a different story.

In the end, only four schools in four districts agreed to participate in the study. Even with teacher and principal interest and support, many districts were unwilling to commit to research activities during the uncertain times related to the COVID-19 pandemic. Districts cited concerns about staff changes due to COVID, worries about how to address COVID-related learning loss, and an increased focus on streamlining systems and programs across the district as reasons they were hesitant to enroll one of their schools in a research study. Additionally, a few interested schools were not allowed to participate because their districts did not want to share student data.

Based on the above challenges from this year of recruitment, the project team has revised the recruitment approach to include states and districts in the initial outreach for the 2022-2023 study. We plan to recruit top-down instead of bottom-up in order to get state and district leaders on board before recruiting teacher interest. We believe this process shift will allow us to focus our early efforts on high-level leaders in order to recruit in districts where we know schools will be allowed to participate in the study. This will also allow us to recruit multiple schools in each interested district, which will result in fewer MOUs that cover more schools. Overall, we believe the top-down recruitment approach will allow for more successful and targeted outreach and will ultimately lead to more district commitment to the study.

We know now that the pandemic hit math education the hardest, leaving students on average five months behind and exacerbating the existing opportunity gap. Now is the time to implement resources that engage students and empower teachers. ASSISTments has risen to the occasion through this grant, and we wanted to share our lessons learned and best practices to support other nonprofits as we all grow together to improve math education.
