Centering student perspectives in state assessments: progress and next steps

November 20, 2024 • Lars Esdal

A few weeks ago, Education Evolving released its latest report, A Beacon, a Barometer, and a Bridge. Drawing on conversations with students, educators, and community members across Minnesota, it presents nine recommendations for making statewide assessments and standards more equitable and student-centered.

Cover for 'A Beacon, a Barometer, and a Bridge' paper.

Now our work shifts to action. Over the coming months and years, we’ll work with our community, and in partnership with Minnesota’s state administrators and policymakers, to advance the recommendations our community called for.

Fortunately, many with authority to act have already shown interest in being responsive to our community and in moving recommendations forward. In particular, staff at the Minnesota Department of Education (MDE) have been doing great work listening to students.

We highlight that work below and share how it intersects with our report’s recommendations. We also suggest some opportunities for even more forward progress.

Piloting student feedback on assessment material

A theme we heard from students in interviews and focus groups for our report was that some test content felt out of touch with their lived experience—drawing on examples and metaphors that could feel like a quiz on context rather than mastery of material. One of our recommendations was to “involve youth in test development and prioritize cultural relevance.”

Fortunately, MDE is taking steps towards this end.

MDE has been working with the state’s “Technical Assistance Committee” or TAC (a group of assessment experts who advise the state on its assessment design) to build a pilot for getting student feedback on test questions.

They will be piloting that work this school year, visiting schools and holding conversations facilitated by a trusted adult the students know. MDE hopes to hear student feedback on specific test content, but also looks forward to discovering the unknown. Will clear trends emerge? Will different content areas or question types yield different types of feedback?

Involving youth in designing and reviewing test content ensures tests are valid measures of learning, without tripping up students on cultural or linguistic context they may not have. This requires intentionality; adults staffing this work should see themselves as coaches and guides, not just facilitators of a technical process.

Youth should be supported in applying the lens of cultural relevance so tests invoke examples and language reflective of their own backgrounds, perspectives, and lived experiences.

Student feedback on the information and reports they receive

Beyond the test material itself, MDE has spent the last year asking students, through surveys and focus groups, how they would like their results presented.

Currently, there is a real mix in how students learn their results—mirroring what we heard from students in our own research. Some say they see their results in an online portal or hear them from their teachers; others say they never see their results at all. MDE has been making changes based on what it heard—and plans to hold more student focus groups this fall.

A related theme we heard from students was that fall is a confusing—and too late—time to get last year’s MCA results. Students often take start-of-year tests (such as FastBridge or NWEA MAP) in early fall. When last year’s official MCA results arrive at the same time, it can be unclear which test the results refer to.

Our report recommended that assessment results be automatically imported into school and district data systems (MDE has, for several years, provided results to many software vendors during test administration and over the summer), or at least released in a format that makes importing data and sharing official results with families as easy as possible.

Students and families need results as quickly as possible. While MDE provides student-level results within approximately one hour of a student completing the assessment, that information is not consistently shared with students, families, and educators. MDE is making changes to those reports in an effort to improve distribution and communication.

Engagement beyond students—the role of teachers and community

While this blog post has focused on students, MDE also emphasizes the important role that teachers and community members currently play in assessment development.

Before a student ever sees content, groups of educators and community members—experienced in content areas and representing many forms of diversity—review all test content. They consider factors like bias stemming from context or lived experience, alignment with statewide academic standards, and grade-level appropriateness.

MDE encourages teachers and community members who are interested in participating to apply. Review committees form on an ongoing basis, typically meeting for two to four consecutive days during spring or summer. Participants are paid an honorarium and reimbursed for expenses; educators earn CEUs.

Moving forward

MDE is taking steps to listen to students and involve them directly in developing assessments. We applaud this wonderful progress.

We look forward to more steps forward—and to closer partnership as we work to act on what Minnesota’s students, families, and educators have called for. Our students deserve it.