Philly Schools May Not Be 'Following The Research' On School Safety

School District of Philadelphia superintendent Tony Watlington has made it his mission to turn city schools around by following “what the research tells us.” 

This commitment is bolstered by the recent Accelerate Philly five-year strategic plan, approved by the Board of Education in June 2023. It cites hundreds of academic papers and policy briefs to support current and future district initiatives, from overhauling curricula to replacing security equipment. 

But not all research is created equal.

As academic Steven Hoffman and journalist Julia Belluz write, “‘studies’ are not equally reliable, they all have different limitations, and they should not be acted on in the same manner — or even acted on at all.” 

In a district where more than 70% of students come from low-income families, effective teaching and administrative practices are key to fulfilling one of the main goals of public education: to give students a chance to improve their quality of life. As district officials evaluate progress on their current practices and deliberate over budgets for the next school year, it’s important to ask: Is the SDP’s current use of research aligned with the best standards of evidence? 

What Constitutes “Good Evidence,” Anyway?

A general consensus exists across many academic fields that meta-analyses and systematic reviews provide the strongest evidence for scientific claims, with randomized controlled trials (RCTs) considered the gold standard for individual studies. Meta-analyses and systematic reviews synthesize results from individual statistical studies, allowing for greater confidence in research findings. Randomized controlled trials allow researchers to infer causal effects of treatments by randomly assigning research participants into groups that receive and do not receive treatment. Establishing causality is key for research that policymakers use because it ensures that the effects seen in the study correspond to the policy in question and not other factors. When implementing policies costs significant amounts of money, it’s critical to know exactly what does and doesn’t work.

A Bird’s Eye View of the District’s Evidence

With 199 SDP students shot (33 fatally) in the past school year, it’s no surprise that Superintendent Watlington has made school safety and student well-being a major priority in the strategic plan. Strategies include expanding a community monitoring program to deter crime near schools, replacing security cameras, and evaluating various behavioral intervention programs. According to district spokeswoman Marissa Orbanek, trained researchers from the Office of Evaluation, Research, and Accountability compile evidence from existing research to make decisions about which programs and strategies to adopt. She noted that researchers “consider the quality of the evidence from the existing research” when making these assessments.

But a close analysis of the 65 unique sources cited by the SDP in the school safety portion of Accelerate Philly — the first of five priority areas in the plan — reveals that the district often relies on weak evidence. 

Out of the 65 cited sources, 44 are research papers, only four of which are either meta-analyses or RCTs. The type of study most commonly cited compares schools and/or students at a single point in time, otherwise known as cross-sectional studies. The units in cross-sectional studies are not randomly assigned to particular treatments, which makes evidence from such methods strictly correlational, not causal. 

Bottom line: this means that the district could be spending significant amounts of time and money on safety strategies that, at best, are unproven and, at worst, do not work. 

The amount and strength of evidence that the SDP provides across these initiatives varies widely. For instance, one strategic action the SDP plans to take to improve safety and well-being in Philadelphia schools is to audit its own social-climate programs. The evidence for this strategy included three meta-analyses and systematic reviews — stronger research methods than those cited for other school-safety initiatives. The papers that presented higher-quality evidence focused on specific types of interventions, like social and emotional learning (SEL). 

It’s clear that SDP officials have the capability to find research that uses high-quality methods. The district plans to spend $25,000 on the social-climate audit for the current school year.

But elsewhere in its action plan, the SDP fails to employ the same degree of rigor — even when millions of dollars are at stake. For instance, the SDP expects to spend nearly $14 million on security cameras and metal detectors to improve school security. To justify this expenditure, which constitutes more than half of the SDP’s overall budget for the school safety priority area, the district cited only three sources: a sponsored article by metal-detector manufacturer CEIA USA, a product description of CEIA’s detectors, and a single cross-sectional study.

Even setting aside the weaknesses of the cited study, it’s also not representative of the research on the effectiveness of school security technology. Another paper not cited by the district, which used stronger, quasi-experimental research methods, reached the opposite conclusion from the study the strategic plan cited.

“There’s little to no evidence that ‘target hardening’ approaches to school safety will reduce crime and violence,” said Ben Fisher, University of Wisconsin–Madison Associate Professor of Civil Society and Community Studies, and one of the authors of the uncited study. “But at the same time there’s a massive market that capitalizes on fear to get schools to spend millions and millions of dollars on equipment and strategies that have zero evidence of their effectiveness.”

Students are also skeptical of the effectiveness of these expensive security measures. Manal Hssain, a recent graduate of William L. Sayre High School in West Philadelphia, said that school officials focused too much on security measures like metal detectors instead of managing student behavior through disciplinary measures like suspensions. She described the school environment as chaotic, with lots of fights and lockdowns.  

The SDP has moved away from harsh suspension policies in recent years, confident that doing so is strongly supported by research. 

“The District favors restorative alternatives to punitive discipline for behavior management as decades of research show us the harms of suspending and expelling students for minor offenses,” Orbanek said. 

But here again, it seems that the SDP has failed to keep up with the research. A 2017 paper found that absences and serious misconduct increased while test scores decreased as a result of the SDP’s move away from harsh suspension policies in the early 2010s. This finding is corroborated by a recent paper on suspension reductions in Los Angeles, where the authors also noted that teacher attrition increased. They also found that suspension reductions largely benefit the minority of students who are suspended, while harming the remainder of students. 

As an alternative to zero-tolerance disciplinary policies, the SDP has integrated so-called restorative practices into administrative and classroom practice. Restorative practices emphasize discussion and mediation in response to behavioral infractions rather than punishment. The strategic plan justifies the district’s continuation of its restorative justice program by citing only a single cross-sectional study. When asked about this plan, Orbanek did not mention whether there was more compelling evidence in support of these restorative alternatives. The SDP also did not answer, when asked, whether the district considers the benefits and costs of its disciplinary policies on both instigators and victims of student misbehavior. 

What Should be Done?

An important first step the district could take to improve the quality of evidence for its initiatives would be to rely more on the expertise of academics skilled in evaluating research methods. The district says that it put together multiple teams, consisting of hundreds of members of the Philadelphia community and several K-12 experts, to review research for Accelerate Philly. Yet out of the three professors on those teams, not one focuses on evaluating research methods. 

This lack of expertise is surprising considering that one of the nation’s top schools of education, the Graduate School of Education (GSE) at the University of Pennsylvania, is within the district’s boundaries. Robert Boruch and Michael Gottfried, two GSE professors who specialize in evaluating the outcomes of educational interventions, said that they were neither consulted for Accelerate Philly nor for other interventions previously considered by the SDP. 

Another key step is to prioritize the use of cost-benefit analyses when evaluating educational interventions. This means that even if an intervention achieves its desired outcome in a study, the benefits to students must be weighed against the costs of the intervention. This is particularly relevant for expensive initiatives, such as the SDP’s $14 million investment in school security equipment.

“Schools should seriously consider working with researchers to understand if investments they choose to make are actually [a] net benefit to kids — willingness to engage in this type of evaluation is a foundation of evidence-based crime control,” Emily Owens, a Deans’ Professor of Criminology and Economics at the University of California–Irvine, said. 

Conducting this type of research is challenging and requires school districts to allow researchers access to the necessary information to answer complex questions. 

“I’ve found it helpful when schools and researchers approach the work as a partnership,” Fisher said. “There needs to be an open back-and-forth to identify research questions that are meaningful for the schools/districts and [can also be] studied in a way that will yield valid and trustworthy results.”

A district educator, who did not want to be named for fear of reprisal, questioned whether the district is sufficiently open to such research, particularly on school safety. The educator recalled from conversations between professors at a Philadelphia university and the district several years ago that research on student discipline was considered controversial because it “reflected illy on organizational precedents.” The educator is doubtful that attitudes have since changed, even with a research-focused superintendent.

Violence has plagued Philadelphia schools for years. It particularly affects low-income students, who attend schools that experience disproportionately frequent violence. The negative effects of school violence on children are clear — from depression to hindered economic mobility. Persistent violence leads to higher dropout rates, student absenteeism, and poor academic performance, all of which limit students’ abilities to obtain the skills needed to improve their financial situations. Given the powerful long-term negative effects of school violence on academic outcomes, school districts should use the highest-quality evidence to ensure that all students are able to learn in a safe environment. 
