Educational interventions in a variety of contexts have shown that students can learn the strategies professional fact checkers use to evaluate the credibility of online sources. Researchers conducting these interventions have developed new kinds of assessments—instruments that measure participants’ knowledge, behaviors, or cognitive processes—to test the effects of their interventions.
These new kinds of assessments are necessary because the assessments commonly used to measure outcomes in misinformation research offer limited insight into participants’ reasoning. Extant measures do not reveal whether students deploy effective evaluation strategies, nor do they capture whether students make common evaluative mistakes, such as judging a source by surface-level features (e.g., its top-level domain or appearance).
In this study, we investigated what new assessments revealed about how students evaluated online sources. Rather than replicate the findings of prior intervention studies, this study focused on understanding what these assessments revealed about students’ reasoning as they evaluated online information.
The findings showed that the assessments were effective in revealing patterns in students’ reasoning as they evaluated websites. Responses pointed to common challenges students encountered when evaluating online content and showed evidence of students’ effective evaluation strategies.
This study highlights possibilities for types of assessments that can be both readily implemented and provide insight into students’ thinking. Policymakers could use similar tasks to assess program effectiveness; researchers could utilize them as outcome measures in studies; and teachers could employ them for formative assessment of student learning.
LPI Research Shows How College Admissions That Utilize Authentic Student Work Can Advance Equity and Diversity

With the new school year on the horizon and in light of the recent Supreme Court decision on affirmative action, the topic of equity in college admissions has never been more relevant. LPI’s research shares examples of admissions processes that use student portfolios and performance assessments to inform effective and equitable admission, placement, and advising decisions.

Performance assessments are an approach to educational assessment that enables students to demonstrate what they know and are able to do through open-ended tasks. This demonstration can include, among other things, writing an analytical essay, conducting a science investigation, creating a curated portfolio of work, or presenting the results of an original research paper. The following research sheds light on the use of performance assessments in K-12 settings and in higher education admissions.
The Promise of Performance Assessments describes innovations in high school and higher education assessments, including the value of performance assessments in providing K-12 schools, colleges, and universities with insights about what students know and can do. The brief explores state and local policies that support the use of these assessments, along with emerging higher education efforts to incorporate them in college admission, placement, and advising. Importantly, performance assessments also have promise for better reflecting the achievements of historically underserved students, which in turn may help institutions identify promising candidates who might have been overlooked by traditional measures.
Assessing College Readiness Through Authentic Student Work describes the history, context, implementation, and early results of a unique college admissions pilot in which 25 colleges in the City University of New York (CUNY) system and high schools in the New York Performance Standards Consortium—which use performance-based assessments to assess student progress—have collaborated to add authentic evidence of student learning to the college admissions process. Early evidence showed that students in Consortium schools who began high school more educationally and economically disadvantaged than their peers were more likely to graduate from high school and attend college. Students admitted to CUNY through the Consortium–CUNY pilot on average achieved higher first-semester college GPAs and persisted in college at higher rates than peers from other New York City schools, even though the latter had higher SAT scores. These results suggest that a more holistic review of admission applications that include evidence of student work can help identify students with strong potential to succeed in college.
Authentic Student Work in College Admissions looks at the Ross School of Business at the University of Michigan and describes how it requests, collects, and reviews portfolios of student work along with traditional application materials as a part of the undergraduate admissions process. The case illuminates the use of student-generated portfolios as one possible model for other higher education systems seeking to evolve their holistic admission processes.
A platform that both tracks the delivery of SEL interventions and brings them together with other data into one whole-child view can add a measure of simplicity to educators’ lives
Although structured social-emotional learning (SEL) has been around since the mid-90s, schools’ focus on SEL has skyrocketed following the impact the COVID-19 pandemic had on education. As remote learning exacerbated feelings of isolation and uncertainty, and behavioral and mental health issues emerged, many educators shifted away from attainment goals to helping students cope and connect in an environment that suddenly lacked regular social interactions, academic expectations, and daily structure. SEL then became a foundational piece of the return to in-person learning and, by many accounts, remains an integral part of student needs a year into post-shutdown recovery.
According to a report from Tyton Partners and the Collaborative for Academic, Social, and Emotional Learning (CASEL), district spending on SEL programming between the 2019–20 and 2020–21 academic years grew from $530 million to $765 million. SEL also received a $160 million funding boost in the FY2022 Consolidated Appropriations Act earlier this year. Educators are investing in SEL on an individual level, too. Based on data from DonorsChoose, reports indicate that donation requests for supplies that help students develop SEL skills and improve mental health have almost doubled since 2020.
While SEL and mental health initiatives are different, when delivered as part of a multi-tiered system of supports (MTSS), SEL can play a significant role in promoting responsive relationships, emotionally safe environments, and skills development that support mental health or mitigate emerging issues. In fact, the American Academy of Child and Adolescent Psychiatry states that SEL screening instruments can be used both to help standardize the identification of anxiety concerns and to help facilitate early intervention.
As more districts integrate SEL into their curricula and expand SEL practices into their secondary schools, the collection and management of such data play an essential role in measuring student progress and program efficacy. That’s especially important because, as the Tyton Partners/CASEL report notes, quality in the SEL marketplace may not keep pace with demand. More broadly, an easily navigable student data insights platform gives educators more time to focus on how they’re incorporating SEL in their classrooms.
Visualizing data improves SEL strategies
SEL should not exist in a vacuum. It serves as a component of MTSS. Your data platform should allow educators not only to track and record SEL elements alongside academic performance, attendance, and behavior, but also to visualize them side by side within a single report. Each of these factors, individually and collectively, influences students’ social and emotional well-being. A complete portrait of a student, instead of a corner of the picture, gives educators the context they need to assign or adjust learning and supports across all areas of a student’s life.
Next, your platform should allow educators to drill into the details, preferably in one place, given the vast array of available SEL tools. Some tools may be free, while others come at a cost or as part of a larger assessment suite. A data insights platform flexible enough to gather information from all your SEL tools via integration, file upload or manual score entry broadens context while saving educators valuable time.
Finally, consider whether a platform can track and assign interventions in ways that fit your specific MTSS and SEL needs. Features like a centralized intervention bank that can be tailored for each school, the option to assign interventions individually or en masse, and the ability to split interventions and progress monitoring assessments by tier collectively build efficiency through a common language of practice. Together, these measures of student growth can surface additional insights, such as the time students spend in different interventions and the growth each intervention yields, giving leaders an easy overview of program effectiveness and return on investment.
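To make the “whole-child view” idea concrete, here is a minimal sketch in plain Python of how a platform might join SEL screener scores, attendance, and grades by student ID and apply a simple tiering rule. The field names, cutoffs, and tiering logic are illustrative assumptions for this sketch, not any vendor’s actual implementation.

```python
def whole_child_view(sel, attendance, grades):
    """Join three per-student data sources (dicts keyed by student ID)
    into one record per student. Missing values are kept as None so
    gaps in any single data source remain visible."""
    students = set(sel) | set(attendance) | set(grades)
    view = {}
    for sid in sorted(students):
        view[sid] = {
            "sel_score": sel.get(sid),            # e.g., SEL screener score
            "attendance_rate": attendance.get(sid),  # fraction of days present
            "gpa": grades.get(sid),
        }
    return view


def assign_tier(record, sel_cutoff=40, attendance_cutoff=0.90):
    """Illustrative MTSS-style tiering: each flagged indicator raises
    the support tier. Cutoff values are placeholders, not norms."""
    flags = 0
    if record["sel_score"] is not None and record["sel_score"] < sel_cutoff:
        flags += 1
    if record["attendance_rate"] is not None and record["attendance_rate"] < attendance_cutoff:
        flags += 1
    return 1 + flags  # Tier 1 = universal support; 2/3 = increasing support
```

In practice a platform would pull these sources from integrations, file uploads, or manual entry, but the underlying operation is the same join-and-flag pattern shown here.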
SEL data can elevate student and family perspectives
At its core, SEL is intended to help students develop and practice empathy, perspective, self-reflection, and active listening to build connections with others. By teaching those competencies, SEL affirms students’ identities, strengths, values, lived experiences, and culture. Thoughtful programming and meaningful assessments, combined with a flexible data platform such as Proliftic, allow educators to monitor and compare the development of these so-called “soft skills.” Visual representations of student progress also give educators an engaging way to start conversations with students, parents, and the community to make SEL programming more impactful.
When educators and students examine student SEL data together, it helps strengthen educator-student relationships while continuing to build SEL skills, including collaboration, self-efficacy, and goal setting. School leaders, teachers, and the SEL materials and data themselves may have “blind spots.” Including students, families, and community-based organization leaders in the selection of SEL programming may therefore help diminish those blind spots and lead to improved strategies that more fully reflect the lived experiences of students and families, while furthering the educators’ and leaders’ own SEL journeys.
Giving educators back their time
Remote learning brought both the shortcomings and strengths of the educational system to the forefront, including the long list of responsibilities that educators regularly take on. While surveys show that most educators believe there’s a great need for SEL in the classroom, and many have always provided it without an official framework, building out a formal SEL program with screening and progress monitoring requires training and support. A platform that can both track the delivery of SEL interventions and bring them together with other data into one whole-child view can add a measure of simplicity to educators’ lives, especially as educators report higher rates of stress and burnout.
Before investing in a data insights vendor, organization leaders should ask themselves the following questions:
Is the system flexible enough to handle a variety of data sources?
Are we going to be treated like a transaction or partner in the relationship?
Are the people who work for the vendor experts in their field who can guide us through the process?
Will the implementation include remote or in-person training and materials?
How much of our own human resource capital will we need to support the system once it’s up and running?
As SEL makes its way into more schools, data is crucial to refine the programming and interventions that work best for students. A data insights platform should be part of any SEL framework to guide decision-making, report on impact and give educators more time to model the SEL they’re teaching in their classrooms.
SEL data should be available in visual representations, side by side with academic performance data. — Read on eschoolmedia.com/esn/
Removing summative assessments from instructional practice can make class more interesting, engaging, and stress free for students.
— Read on www.edutopia.org/article/year-no-tests
Learning Loss Data: Stanford University’s CREDO (Center for Research on Education Outcomes) has released a study based on NWEA MAP assessment data from 20 states.
In their report they conclude with four implications: First, the learning losses are chilling. Second, new approaches will be needed. Third, diagnostic assessment and frequent progress checks are needed more than ever. Fourth, the assessments needed exceed local capabilities and should be provided nationally for all.
Educators and families across the country are observing and predicting learning loss in academics as well as in social-emotional learning. But what is our best course? Is it to spend some of the prized, limited school days on testing? During the pandemic, educators, administrators, and families have raised their voices asking to postpone testing, given the added pressure it can put on already stressed students and schools.
“At the end of the day, there’s going to be an asterisk around any 2020-21 [test] results if they’re given,” Stephen Pruitt of the Southern Regional Education Board told our colleague Sarah D. Sparks in July.

Even more than an asterisk, the pandemic should underscore that the traditional standardized tests mandated by federal law simply haven’t worked as designed, and should push the Biden administration and others to rethink the entire system, said Joshua Starr, the CEO of PDK International, a professional association of educators. “This is the time to actually challenge the assumption that the state testing regimes will give us what we want,” said Starr. “I have no confidence that state standardized tests this year will do that. I don’t know that they’ve ever done that, and they certainly won’t do it this year.”

Starr said that formative assessments, for example, could be useful to students and educators. But in general, he said, given the pandemic’s clear and disproportionate impact on underserved students and communities, education leaders should move straight into directing more resources and support to students and families in need, without depending on tests to do so.

The ability of tests to discern trend lines in a typical fashion has also been disrupted beyond the point of being useful, including for accountability, said Daniel Koretz, a research professor at the Harvard Graduate School of Education who focuses on assessments. More broadly, he said, potential disruptions for students at home and other factors unique to the pandemic present an environment that tests simply can’t control for.
American students are still navigating the most difficult year of learning in modern history. Between losing loved ones to COVID-19, being forced out of classrooms, adapting to distance learning, and missing out on a year of regular social interactions – they have had their worlds turned upside down. The last thing they need is to take a stressful, ineffective standardized test.
The AFT has written a resolution to address various aspects of the pandemic and school needs going forward. Regarding assessment, here is its stance:
RESOLVED, that the AFT will advocate for districts and states to develop systems of assessments that support teaching and learning by:

Seeking waivers on state summative assessments and the high-stakes consequences attached to them, as the upcoming year is a bridge period following prolonged coronavirus closures and re-established instruction in schools;
Conducting comprehensive reviews of all assessment programs to limit the loss of learning time to excessive testing;
Prioritizing assessments that support and help target teaching and learning, including reliable, nonintrusive and teacher-friendly diagnostics—both in-person and virtual; and
Supporting teacher use of authentic assessments wherein students are asked to perform real-world tasks that demonstrate meaningful application of what they have learned.
A November 2020 Education Week article, “States Push to Ditch or Downplay Standardized Tests During Virus Surge,” reflects those concerns.
Others feel that the pandemic raises the importance of testing so that we can ascertain educational needs. However, the federal ESSA-mandated standardized tests are not deemed accurate measures of “learning loss.”
Our new Education Secretary, Miguel Cardona, is standing firm in requiring states to conduct standardized tests.
There will be flexibility, but summative tests will be required.
“It is urgent to understand the impact of COVID-19 on learning,” Ian Rosenblum, acting assistant secretary in the office of elementary and secondary education, wrote to states. “We know, however, that some schools and school districts may face circumstances in which they are not able to safely administer statewide summative assessments this spring using their standard practices.”
Rosenblum said states would still have to publicly report data by student subgroups, as required. He also specifically encouraged states to extend the testing window for English-language proficiency tests.
Rosenblum did not give a deadline for when states would have to seek flexibility from accountability or other requirements. However he also said the department recognized that “individual states may need additional assessment flexibility based on the specific circumstances.” He added that in such cases, the department “will work with states to address their individual needs and conditions while ensuring the maximum available statewide data to inform the targeting of resources and supports.”