I am beginning to see and feel the light at the end of the tunnel. It may just be the start of Spring, but the sense of focus is returning and I, as well as CRESST staff, welcome the renewal and optimism for what is to come. As the COVID-19 vaccine is being distributed, a new administration is populating the White House (including the March 1, 2021, confirmation of Miguel A. Cardona as US Secretary of Education), and public schools in our state are once again opening their doors, I am looking forward to continuing our pursuit of assessing and evaluating learning across the country and around the world. I have greatly missed one of the simplest and most stimulating joys of my profession — the impromptu in-person conversations and interactions with my colleagues and peers — without a screen or speaker between us.
In some ways it’s as if the world has been “on hold.” I often hear, “When things get back to normal” and “when this is all over,” but in the field of education research, our work continues and grows — in most cases in very different ways and with different goals and outcomes than we perhaps planned for — as we continue to explore the complex question, “What are the best ways for this student to show what they are learning and know?” I will use this blog post to discuss CRESST’s work with English learners, our progress, our findings, and our questions.
CRESST is home to two state-led projects focused on English learners (ELs) — ELPA21 and CAAELP — and is complementing our EL portfolio with this IES-funded research project with WestEd, the University of Oregon, and Oregon State University. Specifically, we are analyzing student-level, statewide data from four states to identify secondary ELs’ access to courses; how that access is related to their long-term educational outcomes; how malleable factors at the school level can increase secondary ELs’ course access and achievement; and how we may examine our English language proficiency assessments, such as ELPA21, under a new telescope that we call “feature analysis” to better articulate the ties between instruction, learning, and assessment. Here are two recent studies by CRESST on feature analysis: Using Feature Analysis to Examine Career Readiness in High School Assessments and Exploring Career-Readiness Features in High School Test Items Through Cognitive Laboratory Interviews. Our goal is to learn and share how to offer adolescent ELs quality opportunities to learn.
ELPA21, housed at CRESST for almost 5 years, offers a unique understanding of English language development. The ELP Standards describe language proficiency as interactive in nature and embedded in grade-appropriate rigorous content knowledge. The online assessment system reflects and informs enhanced instruction, and the assessments incorporate technological advances to measure, with precision, how students use language within academic contexts. The multifaceted nature of language learning and development is reflected in the multidimensional measurement and statistical models that undergird ELPA21. Our work is guided by the belief that a high-quality, comprehensive assessment system meets the needs of its users, maximizing student engagement and providing actionable results for classroom educators and school decision-makers. Recent developments by ELPA21 include a suite of professional learning opportunities for educators.
The Collaborative for the Alternate Assessment of English Language Proficiency (CAAELP) received USED funding in 2019 to develop the Alternate English Language Proficiency Assessment (Alt ELPA). Like ELPA21, CAAELP is a state-led collaborative housed at CRESST, designing and developing assessments for English learners with the most significant cognitive disabilities in Kindergarten through Grade 12 — a very diverse and historically underserved subgroup of students. Respecting each student’s unique needs, the standards-based Alt ELPA will measure English proficiency, enrich instruction, and inform professional learning.
CAAELP milestones reached in 2020 include finalizing assessment blueprints, completing foundational documentation reflecting a principled approach to assessment design, deciding to develop both an end-of-year and a through-year assessment model, and reporting student scores based on modality with domain sub-scores and an overall score. Soon to come are the finalization of individual state communities of practice to support the assessments at the school, district, and community levels; the altelpa.org website; pilot and field tests; and a learning management system. And we have, since proposal development, kept post-grant sustainability of the project at the forefront.
I look forward to once again seeing my colleagues in person and collaborating on our united goal of improving assessment and learning for all students. I welcome any and all inquiries and discussions on the aforementioned projects and new opportunities to work together and problem-solve. Here’s to 2021 and a new beginning.