District Dashboard for Social-Emotional Learning Programs
Empowering district and school leaders to implement with data
The Situation
District leaders comprise over 80% of Second Step’s purchasers. Their role is often to support staff in implementing the Second Step program in each school. To help teachers and improve learner outcomes, leaders rely on implementation fidelity data to answer key questions.

However, the product was not set up with a district or school leader interface, and the implementation data was scattered across different areas of the Second Step platform.
User Context
Leaders want to see if teachers completed the initial setup steps.
Monitoring program fidelity is essential to supporting program implementation across schools.
District SEL Leaders had to download school reports individually, which made it hard to compare progress across school campuses.
Business Context
The district dashboard was the top feature requested by our sales team and surfaced in the SWOT analysis.
The product team wanted to position the Second Step program in the market as a district-wide solution. Yet the product contained no district-level experiences.
The organization wanted to deliver this feature in about 6 months, before the new school year.
Technical Context
The platform was not built to include a district model.
Teachers need to create classes before they can start teaching.
The dashboard would need to accommodate both school leader and district leader roles.
We lacked essential context data, such as "Where should teachers be?" and "How many classes should they have set up?" (a rough sketch of this data-model gap follows below).
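To make these constraints concrete, here is a minimal, purely hypothetical sketch of the hierarchy the dashboard implied: a district parent level over schools and classes, with leaders scoped to one level or the other. The type and field names are illustrative assumptions, not the platform's actual data model; the optional pacing field marks the context data we lacked.

```typescript
// Hypothetical sketch only: these types are illustrative assumptions,
// not the Second Step platform's actual data model.

type LeaderRole = "districtLeader" | "schoolLeader";

interface ClassRecord {
  id: string;
  grade: string;
  lessonsCompleted: number;     // observed from teacher activity
  expectedLessonsToDate?: number; // missing context: where teachers *should* be
}

interface School {
  id: string;
  name: string;
  classes: ClassRecord[];       // teachers must create classes before teaching
}

interface District {
  id: string;
  name: string;
  schools: School[];            // this parent level did not exist in the platform
}

// A leader's scope determines which slice of the hierarchy the dashboard shows.
interface Leader {
  role: LeaderRole;
  districtId: string;
  schoolId?: string;            // set only for school leaders
}
```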
Step 1
Assess the Research
Earlier in the fall, we had conducted interviews and research that gave us substantial insight into the most important tools and tasks that SEL leaders need and expect from the Second Step platform.

Step 2
Design Sprint
With such a tight timeline, I decided to lead a design sprint to learn what would be most valuable to district and school leaders and to test early data-visualization concepts in conceptual leader dashboard designs.
Below are the six design iterations I created and gathered feedback on from district and school leaders, all in under 10 days.




Step 3
Behavior-Change Mapping
The design sprint gave us plenty of inspiration and strong insights into which data would be most useful for leaders in different roles. We also identified the primary persona, an SEL coordinator: the role most likely to use data tools strategically and regularly.
The next step was to align the cross-functional team (Instructional Design, Research Scientist, Product Management, Engineering, Design) and integrate the learnings from the design sprint into the desired business and impact outcomes. We did this by surfacing our collective assumptions and then identifying a change in product-use behavior that we believed would lead to higher program implementation fidelity.



Step 4
Design Iterations & Validation Testing
The team identified the key behavior change as "take action within the expected six weeks of the program start date" and, from there, mapped out common barriers that could prevent leaders from acting on the data. These barriers included high-level implementation rates by school, unit, and grade being difficult to access, and the lack of a way to drill down into the specific classes that need support.
I explored different data designs for presenting district- and school-level data; in the end, users reported that a simple table of school data was easier to read than an overly aggregated at-a-glance district snapshot.

The Result
After data-design exploration and usability testing, we arrived at a flexible leader dashboard that accommodated both district and school leader roles and the functions leaders needed. To meet the behavior-change goal, we created a scannable table with an at-a-glance unit progress indicator, making it easy to analyze which schools need support.
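For illustration only, here is a rough sketch of the kind of roll-up that could sit behind that table, aggregating class-level lesson completion into one unit-progress percentage per school. The names and shapes are hypothetical assumptions, not the shipped implementation.

```typescript
// Hypothetical sketch of the roll-up behind the scannable school table.
// Field and function names are illustrative; this is not the shipped code.

interface ClassUnitProgress {
  schoolId: string;
  unit: string;              // e.g. "Unit 1"
  lessonsCompleted: number;
  lessonsInUnit: number;
}

interface SchoolUnitRow {
  schoolId: string;
  unit: string;
  percentComplete: number;   // the at-a-glance unit progress indicator
}

// Aggregate class-level lesson completion into one row per school per unit,
// so a leader can scan the table and spot schools that need support.
function rollUpBySchoolAndUnit(classes: ClassUnitProgress[]): SchoolUnitRow[] {
  const totals = new Map<string, { done: number; total: number }>();

  for (const c of classes) {
    const key = `${c.schoolId}::${c.unit}`;
    const t = totals.get(key) ?? { done: 0, total: 0 };
    t.done += c.lessonsCompleted;
    t.total += c.lessonsInUnit;
    totals.set(key, t);
  }

  return [...totals.entries()].map(([key, t]) => {
    const [schoolId, unit] = key.split("::");
    return {
      schoolId,
      unit,
      percentComplete: t.total === 0 ? 0 : Math.round((t.done / t.total) * 100),
    };
  });
}
```

One row per school per unit keeps the table scannable while still letting a leader drill down to the classes behind any low percentage.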


2 of the 3 product goals were met
MET: High customer satisfaction with usability
MET: Data is being used to take action and to support program impact
UNMET: Leader login rates stayed consistent and did not increase
High customer satisfaction (250 users surveyed):
85% agreed that it is "easy to understand the lesson progress data"
89% agreed that they can "use the lesson progress information to support staff"
An average rating of 4.5 for "easy to collect information for end-of-year implementation reports"
93% agreed that they used the new lesson completion reports to determine whether they "met their lesson completion goals"
73% of participants said they use that data to take action (this was the key behavior goal)
Internal team satisfaction:
The sales and support teams are happy too! The sales team is closing district purchases more easily with a demo of this suite of implementation-monitoring tools.
The customer success team reported that the data export and the unit breakdowns made supporting district leaders with their data impactful and strategic.
In reflection
Even with the challenge of displaying data without the context of where schools should be, we met the behavioral goals of the project. This was the most challenging part of the data design.
Once we have school settings and pacing data, our reporting capabilities can become more dynamic.
Due to the school-year schedule and the scope, we had to do a rolling release of features. This made it hard to see the whole design experience, and some features went live mid-year, which may have cost us some leader adoption.
Because we had to focus our energy on building out the interface in-platform, we have yet to take advantage of off-platform intervention opportunities, such as bi-weekly implementation data emails.
Huge thank you to the amazing team members, Sara (PDM), Anne (UI), Matthew (Dev Lead), and Karen (SME)!