Learning Policy Institute Research on Using Authentic Student Work for College Admissions

LPI Research Shows How College Admissions That Utilize Authentic Student Work Can Advance Equity and Diversity
With the new school year on the horizon and in light of the recent Supreme Court decision on affirmative action, the topic of equity in college admissions has never been more relevant. LPI’s research shares examples of admissions processes that use student portfolios and performance assessments to inform effective and equitable admission, placement, and advising decisions.

Performance assessments are an approach to educational assessment that enables students to demonstrate what they know and are able to do through open-ended tasks. This demonstration can include, among other things, writing an analytical essay, conducting a science investigation, creating a curated portfolio of work, or presenting the results of an original research paper. The following research sheds light on the use of performance assessments in K-12 settings and in higher education admissions.

The Promise of Performance Assessments describes innovations in high school and higher education assessments, including the value of performance assessments in providing K-12 schools, colleges, and universities with insights about what students know and can do. The brief explores state and local policies that support the use of these assessments, along with emerging higher education efforts to incorporate them in college admission, placement, and advising. Importantly, performance assessments also have promise for better reflecting the achievements of historically underserved students, which in turn may help institutions identify promising candidates who might have been overlooked by traditional measures.

Assessing College Readiness Through Authentic Student Work describes the history, context, implementation, and early results of a unique college admissions pilot in which 25 colleges in the City University of New York (CUNY) system and high schools in the New York Performance Standards Consortium—which use performance-based assessments to assess student progress—have collaborated to add authentic evidence of student learning to the college admissions process. Early evidence showed that students in Consortium schools who began high school more educationally and economically disadvantaged than their peers were more likely to graduate from high school and attend college. Students admitted to CUNY through the Consortium–CUNY pilot on average achieved higher first-semester college GPAs and persisted in college at higher rates than peers from other New York City schools, even though the latter had higher SAT scores. These results suggest that a more holistic review of admission applications that include evidence of student work can help identify students with strong potential to succeed in college.

Authentic Student Work in College Admissions looks at the Ross School of Business at the University of Michigan and describes how it requests, collects, and reviews portfolios of student work along with traditional application materials as a part of the undergraduate admissions process. The case illuminates the use of student-generated portfolios as one possible model for other higher education systems seeking to evolve their holistic admission processes.




The Role of AI in Assisting Teachers and in Formative Assessments of Students — THE Journal

How can AI be developed to help advance teaching? For starters, it needs to put teachers front and center, according to a new report from the United States Department of Education.

Using data insight platforms to improve SEL (Social Emotional Learning) strategies – Dr. Delonna Darsow of SourceWell for eSchool News

A platform that both tracks the delivery of SEL interventions and brings them together with other data into one whole-child view can add a measure of simplicity to educators’ lives

Although structured social-emotional learning (SEL) has been around since the mid-90s, schools’ focus on SEL has skyrocketed following the impact the COVID-19 pandemic had on education. As remote learning exacerbated feelings of isolation and uncertainty, and behavioral and mental health issues emerged, many educators shifted away from attainment goals to helping students cope and connect in an environment that suddenly lacked regular social interactions, academic expectations, and daily structure. SEL then became a foundational piece of the return to in-person learning and, by many accounts, remains an integral part of student needs a year into post-shutdown recovery.

According to a report from Tyton Partners and the Collaborative for Academic, Social, and Emotional Learning (CASEL), district spending on SEL programming between the 2019–20 and 2020–21 academic years grew from $530 million to $765 million. SEL also received a $160 million funding boost in the FY2022 Consolidated Appropriations Act earlier this year. Educators are investing in SEL on an individual level, too. Based on data from DonorsChoose, reports indicate that donation requests for supplies that help students develop SEL skills and improve mental health have almost doubled since 2020.

While SEL and mental health initiatives are different, when delivered as part of a multi-tiered system of supports (MTSS), SEL can play a significant role in promoting responsive relationships, emotionally safe environments and skills development that improve or mitigate mental health issues. In fact, the American Academy of Child and Adolescent Psychiatry states that SEL screening instruments can be used to both help standardize the identification of anxiety concerns and help facilitate early intervention.

As more districts integrate SEL into their curricula and expand SEL practices into their secondary schools, the collection and management of such data play an essential role in measuring student progress and program efficacy. That’s especially important because, as the Tyton Partners/CASEL report notes, quality in the SEL marketplace may not keep pace with demand. Above all, an easily navigable student data insights platform gives educators more time to focus on how they’re incorporating SEL in their classrooms.

Visualizing data improves SEL strategies

SEL should not exist in a vacuum; it serves as a component of MTSS. Your data platform should allow educators not only to track and record SEL elements alongside academic performance, attendance, and behavior, but also to visualize them side by side within a single report. Each of these factors, individually and collectively, influences students’ social and emotional well-being. A complete portrait of a student, instead of a corner of the picture, gives educators the context they need to assign or adjust learning and supports across all areas of a student’s life.

Next, your platform should allow educators to drill into the details, preferably in one place, given the vast array of available SEL tools. Some tools may be free, while others come at a cost or as part of a larger assessment suite. A data insights platform flexible enough to gather information from all your SEL tools via integration, file upload or manual score entry broadens context while saving educators valuable time.
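The side-by-side, whole-child view described above is essentially a join across per-student data sources. As an illustration only — the tool names, column names, and scores below are hypothetical, not drawn from any particular SEL platform — the idea can be sketched with pandas:

```python
import pandas as pd

# Hypothetical per-student records from three separate tools; a real
# platform would pull these via integration, file upload, or manual entry.
academics = pd.DataFrame({"student_id": [1, 2], "reading_pct": [62, 88]})
attendance = pd.DataFrame({"student_id": [1, 2], "days_absent": [11, 2]})
sel = pd.DataFrame({"student_id": [1, 2], "sel_screener": [2.1, 3.6]})

def whole_child_view(*sources: pd.DataFrame) -> pd.DataFrame:
    """Merge per-student data sources into one side-by-side report."""
    view, *rest = sources
    for df in rest:
        # An outer join keeps students who appear in only some tools.
        view = view.merge(df, on="student_id", how="outer")
    return view

report = whole_child_view(academics, attendance, sel)
print(report)
```

The outer join matters: a student screened for SEL but missing from an attendance export still appears in the report, with the gap visible rather than silently dropped.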


Finally, consider whether a platform can track and assign interventions in ways that fit your specific MTSS and SEL needs. Features like a centralized intervention bank that can be tailored for every school, the option to assign interventions individually or en masse, and the ability to split interventions and progress monitoring assessments by tier collectively build efficiency through a common language of practice. Together, these measures of student growth can provide other insights, including the time students spend in different interventions; the growth each intervention yields gives leaders an easy overview of program effectiveness and return on investment.

SEL data can elevate student and family perspectives

At its core, SEL is intended to help students develop and practice empathy, perspective, self-reflection, and active listening to build connections with others. By teaching those competencies, SEL affirms students’ identities, strengths, values, lived experiences, and culture. Thoughtful programming and meaningful assessments, combined with a flexible data platform such as Proliftic, allow educators to monitor and compare the development of these so-called “soft skills.” Visual representations of student progress also give educators an engaging way to start conversations with students, parents, and the community to make SEL programming more impactful.

When educators and students examine student SEL data together, it helps to strengthen educator-student relationships while continuing to build SEL skills, including collaboration, self-efficacy, and goal setting. School leaders, teachers, and the SEL materials and data themselves may have “blind spots.” Therefore, including students, their families, and community-based organization leaders in the selection of SEL programming may help diminish those blind spots and lead to improved strategies that more fully reflect the lived experiences of students and families while furthering the educators’ and leaders’ personal SEL journeys.

Giving educators back their time

Remote learning brought both the shortcomings and strengths of the educational system to the forefront, including the long list of responsibilities that educators regularly take on. While surveys show that most educators believe there’s a great need for SEL in the classroom, and many have always provided it without an official framework, building out a formal SEL program, screening, and progress monitoring requires training and support. A platform that can both track the delivery of SEL interventions and bring them together with other data into one whole-child view can add a measure of simplicity to educators’ lives, especially as educators report higher rates of stress and burnout.

Before investing in a data insights vendor, organization leaders should ask themselves the following questions:

  1. Is the system flexible enough to handle a variety of data sources?
  2. Are we going to be treated like a transaction or partner in the relationship?
  3. Are the people who work for the vendor experts in their field who can guide us through the process?
  4. Will the implementation include remote or in-person training and materials?
  5. How much of our own human resource capital will we need to support the system once it’s up and running?

As SEL makes its way into more schools, data is crucial to refine the programming and interventions that work best for students. A data insights platform should be part of any SEL framework to guide decision-making, report on impact and give educators more time to model the SEL they’re teaching in their classrooms.


Learning Loss: How to Assess It? How to Address It?

Learning Loss Data: Stanford University’s CREDO (Center for Research on Education Outcomes) has released a study based on NWEA (MAP assessment) data on 20 states.

In their report they conclude with four implications:

  1. The learning losses are chilling.
  2. New approaches will be needed.
  3. Diagnostic assessment and frequent progress checks are needed more than ever.
  4. The assessments needed exceed local capabilities and should be provided nationally for all.

Report: “Estimates of Learning Loss in the 2019-2020 School Year.” CREDO VIDEO. CBS VIDEO

Educators and families across the country are observing and predicting learning losses in academics and in social-emotional learning as well. But what is our best course: is it to spend some of the prized, limited school days on testing? During the pandemic, voices have been raised by educators, administrators, and families asking to postpone testing, given the added pressure it can put on already stressed students and schools.

A November 2020 Edweek article, “States Push to Ditch or Downplay Standardized Tests During Virus Surge 2020,” reflects those concerns.

“At the end of the day, there’s going to be an asterisk around any 2020-21 [test] results if they’re given,” Stephen Pruitt of the Southern Regional Education Board told our colleague Sarah D. Sparks in July.
Even more than an asterisk, the pandemic should underscore that the traditional standardized tests mandated by federal law simply haven’t worked as designed, and push the Biden administration and others to rethink the entire system, said Joshua Starr, the CEO of PDK International, a professional association of educators.
“This is the time to actually challenge the assumption that the state testing regimes will give us what we want,” said Starr. “I have no confidence that state standardized tests this year will do that. I don’t know that they’ve ever done that, and they certainly won’t do it this year.”
Starr said that formative assessments, for example, could be useful to students and educators. But he said that in general, given the pandemic’s clear and disproportionate impact on underserved students and communities, education leaders should move straight into directing more resources and support to students and families in need, without depending on tests to do so.
The ability of tests to discern trend lines in a typical fashion has also been disrupted beyond the point of being useful, including for accountability, said Daniel Koretz, a research professor at the Harvard Graduate School of Education who focuses on assessments. And more broadly, he said, potential disruptions for students at home and other factors unique to the pandemic present an environment that tests simply can’t control for.


An NEA article urges that we “Cancel Standardized Tests During COVID-19.”
They write and ask us to sign a letter to the U.S. Department of Education.

American students are still navigating the most difficult year of learning in modern history. Between losing loved ones to COVID-19, being forced out of classrooms, adapting to distance learning, and missing out on a year of regular social interactions – they have had their worlds turned upside down. 
The last thing they need is to take a stressful, ineffective standardized test. 

The AFT has written a resolution to address various aspects of the pandemic and school needs going forward. Regarding assessment, here is their stance:

RESOLVED, that the AFT will advocate for districts and states to develop systems of assessments that support teaching and learning by:
Seeking waivers on state summative assessments and the high-stakes consequences attached to them as the upcoming year is a bridge period following prolonged coronavirus closures and re-established instruction in schools;

Conducting comprehensive reviews of all assessment programs to limit the loss of learning time to excessive testing;

Prioritizing assessments that support and help target teaching and learning, including reliable, nonintrusive and teacher-friendly diagnostics—both in-person and virtual; and

Supporting teacher use of authentic assessments wherein students are asked to perform real-world tasks that demonstrate meaningful application of what they have learned.


Others feel that the pandemic raises the importance of testing, so that we can ascertain educational needs. However, the federal ESSA-mandated standardized tests are not deemed to be accurate measures of “learning loss.”

Our new Education Secretary, Miguel Cardona, is standing firm in requiring states to conduct standardized tests.

There will be flexibility, but summative tests will be required.

“It is urgent to understand the impact of COVID-19 on learning,” Ian Rosenblum, acting assistant secretary in the office of elementary and secondary education, wrote to states. “We know, however, that some schools and school districts may face circumstances in which they are not able to safely administer statewide summative assessments this spring using their standard practices.”

Rosenblum said states would still have to publicly report data by student subgroups, as required. He also specifically encouraged states to extend the testing window for English-language proficiency tests.

Rosenblum did not give a deadline for when states would have to seek flexibility from accountability or other requirements. However he also said the department recognized that “individual states may need additional assessment flexibility based on the specific circumstances.” He added that in such cases, the department “will work with states to address their individual needs and conditions while ensuring the maximum available statewide data to inform the targeting of resources and supports.”

— Ian Rosenblum, Acting Asst. Sec. of Education

Assessing our Assessing!

Are we wasting students’ learning time, enthusiasm, and interest with multiple tests that bore them and often cause anxiety and discouragement?

As citizens we want to ensure effective public education for all. To determine if it is effective at the national, state, district, school, classroom, and student levels– we test.

Our students are tested in their classwork through daily and periodic tests, in district tests, and in state-mandated tests. Student learning time is devoted to testing in order to give us the information we need to achieve effectiveness and equity. Delivering multiple varieties of assessment often interferes with quality curriculum and instruction, both by taking time and by redirecting instruction to “cover” the test. In spite of the extraordinary expense of student and teacher time, test results often do not give teachers, students, or families usable information to improve learning.
Grade for our current assessment practices = D

Positive Assessment

CAST (the Center for Applied Special Technology), the developers of the Universal Design for Learning framework, use assessment not from a “deficit perspective” but as an integral part of the learning experience. We have been, and still are, in an urgent time when students have lost opportunities. We must make use of the time to closely examine what each student knows. Teachers are expert at this.

CAST Big Takeaways on Assessment Now

Coherent Balanced Assessment

In 2001, an important report, “Knowing What Students Know,” was published, calling for a coherent, balanced system of assessments that meets the specific needs of learners, families, teachers, schools, districts, states, and our nation. We have not succeeded in following its guidelines for coherence between tests. Since 2001, much has developed in technology, gaming, cognitive psychology, and our understanding of the needs of today’s and tomorrow’s careers and lives. Yet, for the most part, we still have not integrated the 2001 basics.

Executive Summary, 17 pages.
Full Report, 383 pages.

CAST Series on Assessment: Assess for Learning —
CAST is a multifaceted organization with a singular ambition: Bust the barriers to learning that millions of people experience every day. Barriers are in the environment, not in the student. We need to shift the definition of assessment from a negative reflection on what has not been accomplished to an integral part of learning.

What are “balanced assessment systems,” and why do we need standardized annual assessments in addition to formative assessments and classroom assessments?

From the National Center for the Improvement of Educational Assessment (aka Center for Assessment)

The Center for Assessment, a nonprofit in Dover, NH, focuses on supporting educators in understanding the variety and importance of balanced assessment. The Center has been working since 1998 on supporting coherent systems of assessment.

Click here to access Assessment Learning Modules that describe types and usages of various kinds of assessment.


Authentic Assessment

It may be time to design learning objectives that map to real work and to design authentic assessments (tasks) that enable students to show what they know. There is great interest in “authentic assessment,” also known as “performance assessment,” especially with the rise of PBL (project-based learning). Instead of taking time away from learning to test students, curriculum can be designed to spur activities (projects, writing, research, experimentation, service) that generate results. The outcomes or learning products (the bridge designed in a marsh, the play written and acted about history, etc.) are evaluated against the learning objectives or targets. The U.S. Department of Education has stated it will consider applications to use authentic assessment in place of standardized summative tests. The state of New Hampshire applied and was granted the right to proceed with the PACE assessment it had been using for five years.

Performance Assessment of Competency Education – NH Assessment Model

New Hampshire’s PACE is a first-in-the-nation accountability strategy that offers a reduced level of standardized testing together with locally developed common performance assessments.
Click here for a chart of the NH assessments used by subject at various grade levels.

Flyer summarizing the qualities of formative and summative assessment.

Should We Test During the Pandemic? How Should Test Results be Presented?

How Two Years of Pandemic Disruption Could Shake Up the Debate Over Standardized Testing -Edweek

The week the U.S. Department of Education told states it wouldn’t issue blanket waivers from mandated annual assessments, the creators of a national guide instructing parents on how to opt their children out of the standardized tests reported a spike in web traffic to the site.

“Parents are hopping mad,” said Bob Schaeffer, the interim executive director of FairTest, an organization that promotes testing opt outs and created the guide. “If schools don’t cancel the tests, parents will.”

Some of the questions educators face on standardized testing during the pandemic:

Should students be tested at this time?

What are we testing?

Advocates for testing — including civil rights organizations, a vocal group of lawmakers, and some educational leaders concerned about equity — say such suggestions may be overblown. But, after states cancelled tests entirely in 2020, some of those same advocates fear that two consecutive years of disruption in state testing — be it through opt outs, modifications, or complete cancellations — could amount to the country taking its foot off the gas in its commitment to broad assessments.


“What I fear is we just don’t know enough right now,” said Cheryl Oldham, vice president of education policy at the U.S. Chamber of Commerce, which has advocated for federal testing mandates. “We just won’t know the implications of this year for a while.”

Although President Joe Biden criticized high-stakes testing as a candidate, one of his Education Department’s first acts was to leave the tests in place, even as many testing opponents argued that an unprecedented year of disruption caused by the COVID-19 pandemic gave a good reason to cancel them.

In addition to concerns about the reliability of scores for tests given during the pandemic, schools will face a host of logistical challenges, like deciding how to test remote learners, finding off-site space that allows for greater social distancing, and finding adults to supervise testing at a time when staffing is already a challenge.

Testing supporters were heartened when the Education Department said in its Feb. 22 guidance that states must conduct tests this year. But the guidance provides a lot of wiggle room, allowing states to bypass requirements that they use the scores to rate schools, to delay when they are administered, and to forgo the requirement that 95 percent of students participate, a change that could open the door for more opt outs.


An alliance in support of assessment

Congress reaffirmed its commitment to test-based accountability when it passed the Every Student Succeeds Act in 2015 after years of debate. The federal education law maintained many of the testing mandates that were the hallmarks of its predecessor, the No Child Left Behind Act.

Under ESSA, states have to test students in reading and math in grades 3-8 and once in high school. They also have to break down the resulting test scores to track results for targeted populations, like English-language learners, various racial groups, and students from low-income families.

Critics of NCLB had argued that it led schools to place too much emphasis on tests and to use their results in punitive ways. Under ESSA, lawmakers aimed to address those concerns by giving states more flexibility in how scores are used and by allowing them to limit the amount of time schools spend testing. They also created a program to pilot new innovative assessments.

But an alliance that included civil rights groups, business organizations, and prominent congressional Democrats successfully pushed for Congress to reject more-dramatic changes to testing mandates when it passed ESSA.

Broad annual data are necessary to ensure schools are serving all students adequately, they insisted. And standardized tests offer consistency that other forms of feedback, like teacher observations and classroom assignments, may lack, they said.

I just think we have a moral responsibility to understand how all of our students are doing, where we are falling short…

Senator Patty Murray

If, as Murray says, the testing is about education equity and making sure we are getting education to each student, then we are taking the students’ learning time to evaluate, and thus remediate, our performance in education implementation and delivery.

Perhaps we should evaluate our efforts rather than test the students. We should find ways to let them explore, learn, and “show what they know.” Then we should find ways to evaluate and improve our performance.

A Plan for Standardized Test Scores During the Pandemic Has Gotten States’ Attention

There are three main elements of the proposal from Harvard’s Andrew Ho.

1. The first part is to report the percentage of students from this year’s state testing who have comparable previous test scores — indeed, he stressed that this, and not scores, should be the first issue states focus on when reporting testing data. This would mean looking at which students took the tests two years ago, and seeing whether they took the tests for this academic year. Remember: Last year states canceled their tests en masse, so there aren’t test scores from that time.

So, for example, states would report the percentage of students who, two years ago, took the tests in the 3rd grade and also took the tests in the 5th grade this year. This would require state data systems to track individual students.

Ho says that this amounts to conducting an educational census that would help states sort students into two important groups: one for which the state has comparable test score data, and the other for which there isn’t test score data.

“It instantly divides your attention into two deserving groups,” he said, calling this piece of his idea the “match rate.”

2. The second part would focus on the students for whom there is comparable test score data from two school years ago.

Ho proposes that for those students, states find their previous “academic peers.” In other words, states would identify students from 2017 and 2019 who performed at similar levels on the exams. Then they would study how the 2017 group performed on 2019 tests, and how the 2019 group performed on 2021 tests. (This is why Ho’s plan relies on states that have longitudinal data systems and “stable” testing systems dating back to the 2016-17 school year.)

This method would help people determine the extent to which the pandemic has affected students’ academic progress, compared to similar students from before COVID-19. Ho calls this a “fair trend” approach.

3. The third part focuses on students who don’t take the tests this year and who the system has lost track of, or what Ho calls the “equity check.”

For those students, Ho also proposes looking at the scores from those students in 2019, and looking at their academic peers from 2017, and then looking at the test scores from that peer group in 2019.

Ho admitted that this third piece of his plan “requires the most guesswork,” but could still tell a meaningful, descriptive story. Yet he also said this “equity check” would probably paint a best-case picture of where these missing students stand. Why? Because it “assumes academic learning rates for those who went missing from 2019 to 2021 are the same as those in 2017 to 2019,” Ho wrote.
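The “match rate” step in part 1 of the proposal amounts to joining two years of score records and asking what share of earlier test-takers reappear. As an illustration only — the student IDs and scores below are made up, and this is a sketch of the idea rather than Ho’s actual methodology — it might look like this:

```python
import pandas as pd

# Hypothetical score records; real states would draw these from
# longitudinal data systems.
scores_2019 = pd.DataFrame({"student_id": [1, 2, 3, 4],
                            "score_2019": [410, 455, 430, 470]})
scores_2021 = pd.DataFrame({"student_id": [1, 2, 4],
                            "score_2021": [405, 460, 480]})

# The "match rate": what share of 2019 test-takers have a comparable
# 2021 score? A left join keeps every 2019 test-taker and marks the
# missing ones, splitting students into the two groups Ho describes:
# matched (fair-trend analysis) and missing (equity check).
merged = scores_2019.merge(scores_2021, on="student_id", how="left")
matched = merged["score_2021"].notna()
match_rate = matched.mean()
print(f"match rate: {match_rate:.0%}")  # 3 of 4 students matched
```

The left join is the key design choice: it preserves the full 2019 roster, so students the system has lost track of remain visible as gaps rather than disappearing from the denominator.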

Andrew Ho, Edweek