The last week in August is also the first week of school for many K-12 students, but as more school districts use algorithms to place students in schools and enhance school safety, some experts worry these formulas may be causing more harm than good.

The Center for Democracy and Technology (CDT), a prominent D.C. tech think tank, released a report just in time for the start of the school year detailing ways school districts can take responsibility for the algorithms they use.

The placement algorithm grew out of game-theory research on matching markets: Stanford economist Alvin Roth adapted a matching procedure to school choice and shared the 2012 Nobel Prize in economics with Lloyd Shapley for that body of work. Several major cities, including Chicago, Denver, Indianapolis, New Orleans, New York, San Francisco and Washington, D.C., use this kind of algorithm to choose the schools students attend. Parents rank which schools they want their children to attend, and the algorithm attempts to place each student at their top choice.
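The mechanism behind this line of work is commonly known as "deferred acceptance" (Gale-Shapley): students provisionally apply down their ranked lists, and schools tentatively hold their highest-priority applicants until no one is displaced. A minimal sketch, with invented students, schools, priorities, and seat counts:

```python
# Sketch of student-proposing deferred acceptance (Gale-Shapley),
# the matching mechanism underlying Roth's school-choice work.
# All names, rankings, and capacities here are invented for illustration.

def deferred_acceptance(student_prefs, school_priorities, capacities):
    """student_prefs: {student: [schools in preference order]}
    school_priorities: {school: [students in priority order]}
    capacities: {school: number of seats}"""
    # precompute each school's priority rank for each student
    rank = {s: {stu: i for i, stu in enumerate(order)}
            for s, order in school_priorities.items()}
    next_choice = {stu: 0 for stu in student_prefs}  # next school to try
    held = {s: [] for s in school_priorities}        # tentative admits
    free = list(student_prefs)                       # students still seeking
    while free:
        stu = free.pop()
        prefs = student_prefs[stu]
        if next_choice[stu] >= len(prefs):
            continue  # student exhausted their list: left unassigned
        school = prefs[next_choice[stu]]
        next_choice[stu] += 1
        held[school].append(stu)
        # keep only the highest-priority students up to capacity;
        # holds are "deferred" -- a better applicant can still bump you
        held[school].sort(key=lambda x: rank[school][x])
        while len(held[school]) > capacities[school]:
            free.append(held[school].pop())  # lowest priority is displaced
    return {s: sorted(stus) for s, stus in held.items()}
```

With three students chasing one seat at "north", the highest-priority applicant wins it and the others settle at "south" — e.g. `deferred_acceptance({"ana": ["north", "south"], "ben": ["north", "south"], "cal": ["north"]}, {"north": ["cal", "ana", "ben"], "south": ["ana", "ben", "cal"]}, {"north": 1, "south": 2})` returns `{"north": ["cal"], "south": ["ana", "ben"]}`.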

The problem, CDT says, is these algorithms are black boxes, and for some cities — like New York — they may not work the way policymakers intend.

“Although they involve math, algorithms are not neutral decision-makers,” the report’s authors, senior technologist Hannah Quay-de la Vallee and policy analyst Natasha Duarte, wrote. “Subjective human judgments dictate the purpose, design, and function of an algorithm and influence its outcomes.”

The report addresses two key areas where schools are using algorithms: school placement and monitoring social media to predict which students “may be at risk of committing a violent act,” such as a mass shooting.

In response to recent shootings, some schools are hoping algorithms can detect problems before they start. But Quay-de la Vallee and Duarte worry some school districts aren’t properly evaluating the privacy risks, accountability questions, and limitations of these algorithms and databases before deploying them.

“Social media information may be part of a larger threat assessment database, as in the case in Florida, where the Marjory Stoneman Douglas High School Public Safety Act mandates that social media information from students be combined with data from other sources such as law enforcement and social services,” Quay-de la Vallee and Duarte wrote. “The database went live in August 2019, despite bureaucratic issues and unresolved privacy and governance concerns that delayed its implementation by eight months.”

The CDT also warned that algorithms monitoring social media are notoriously bad at predicting mass shootings or other violent acts because “these events are statistically rare” and, given “their small sample size and the complexity of factors surrounding them, they cannot be reliably predicted with algorithms.” Thus, these algorithms tend to “generate large numbers of false positives.”
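The false-positive problem is a matter of base rates: when the event being screened for is vanishingly rare, even an accurate-sounding detector mostly flags innocent students. A quick back-of-the-envelope calculation, with entirely invented rates chosen only to illustrate the arithmetic:

```python
# Why screening for rare events yields mostly false positives:
# a base-rate calculation with illustrative (invented) numbers.
prevalence = 1 / 100_000     # assumed rate of genuinely at-risk students
sensitivity = 0.99           # assumed: flags 99% of true cases
false_positive_rate = 0.01   # assumed: wrongly flags 1% of everyone else

students = 1_000_000         # a large monitored population
true_cases = students * prevalence                           # ~10 students
true_flags = true_cases * sensitivity                        # ~9.9 flagged
false_flags = (students - true_cases) * false_positive_rate  # ~10,000 flagged

# precision: what share of all flags point at a real case
precision = true_flags / (true_flags + false_flags)
print(f"{false_flags:.0f} false alarms for {true_flags:.1f} real cases")
print(f"Share of flags that are correct: {precision:.2%}")  # ~0.10%
```

Even with a 99%-sensitive detector, roughly a thousand students are wrongly flagged for every one correctly identified, because almost everyone screened is not a threat.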

The privacy ramifications of these false positives, the CDT notes, could be devastating for students.

“Even if a flagged student is ultimately cleared, she could still face negative repercussions if her records indicate that she was flagged as a safety concern,” Quay-de la Vallee and Duarte wrote. “Surveilling students’ speech risks dissuading them from expressing their views, engaging in political organizing, or discussing sensitive issues such as mental health.”

The CDT also worries about the parameters of such social media monitoring. If schools are monitoring students’ social media accounts outside of school hours, for example, then this is “overbroad surveillance” and could violate students’ First Amendment rights.

Because of these risks and limitations, the CDT strongly warned schools against using algorithms to monitor students’ social media accounts for safety threats. It advises schools that use these algorithms anyway to “conduct regular audits, keep humans in the loop, ensure legal compliance and ongoing communication with stakeholders.”

In addition to social media monitoring, many school districts use school placement algorithms to streamline placing children in schools, but again, the CDT and other experts worry these algorithms harm students instead of helping them.

Working with MIT, Senior Consultant Meg Benner and Senior Fellow Ulrich Boser at the Center for American Progress evaluated school placement algorithms to see how effective and fair they are. While these algorithms did streamline the school placement process, the researchers found that they also perpetuated racial and socioeconomic biases.

The researchers also found that due to flaws in the algorithms used by school districts, most students weren’t even getting placed in their top choice for school.

“Though it may seem like a technical detail, the design of the algorithm that matches students to schools significantly affects students’ chances of being placed at their preferred school,” they said in their 2018 report.

Affluent parents with more time on their hands also try to “game” the system to get their children into the top schools. If they know their true first choice is popular and likely to be oversubscribed, they rank a safer second-choice school first instead. By misreporting their preferences, they ensure their child gets into at least that second school, while many families who honestly ranked the popular school first miss out on both simply because there aren’t enough spots.
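This kind of gaming pays off under so-called “immediate acceptance” (Boston-style) mechanisms, where round-one assignments are permanent, so a popular school can fill up before an honest family’s second choice is ever considered. A sketch, with invented students, schools, and priorities, showing a strategic family beating an honest one:

```python
# Sketch of an "immediate acceptance" (Boston-style) mechanism,
# the kind of assignment rule the gaming described above exploits.
# Students, schools, and rankings are invented for illustration.

def immediate_acceptance(student_prefs, school_priorities, capacities):
    rank = {s: {stu: i for i, stu in enumerate(order)}
            for s, order in school_priorities.items()}
    seats = dict(capacities)
    assigned = {}
    rounds = max(len(p) for p in student_prefs.values())
    for r in range(rounds):
        # collect this round's applicants per school (r-th choices)
        applicants = {}
        for stu, prefs in student_prefs.items():
            if stu not in assigned and r < len(prefs):
                applicants.setdefault(prefs[r], []).append(stu)
        for school, stus in applicants.items():
            stus.sort(key=lambda x: rank[school][x])
            for stu in stus[:seats[school]]:
                assigned[stu] = school  # permanent: no later bumping
            seats[school] -= min(seats[school], len(stus))
    return assigned

# "ana" honestly ranks the popular magnet first; "ben" truly prefers the
# magnet too but strategically lists the backup school first instead.
prefs = {"ana": ["magnet", "backup"], "ben": ["backup"], "cal": ["magnet"]}
prios = {"magnet": ["cal", "ana", "ben"], "backup": ["ana", "ben", "cal"]}
result = immediate_acceptance(prefs, prios, {"magnet": 1, "backup": 1})
print(result)  # ana loses the magnet to cal, and by round two the
               # backup seat is already gone: ana ends up unassigned
```

Under deferred acceptance, by contrast, holds stay tentative until the end, so listing true preferences honestly cannot hurt a family — which is precisely why economists favor it for school choice.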

“It can be especially difficult for economically disadvantaged or disconnected families who may lack the time and information to play the game,” the researchers explained. “Furthermore, experience and social networks increase understanding of the application process, putting newcomers at a disadvantage.”

In addition to being unfair, these systems are inefficient, leaving many students without a match to any of their chosen schools.

The CDT recommends school districts develop a list of the district’s values to guide algorithm decision-making.

Examples include, “How far should students have to travel to get to school? Should districts optimize for school diversity, and what should diversity look like? How should priority be assigned to families’ choices? Should a student’s financial situation or other risk factors ever be considered in their school assignment, and if so, how? These are questions of policy, not math, and their answers might depend in part on the specific context and characteristics of the district.”

The Brookings Institution published a report on school algorithms in February, also stressing the need for school districts to clearly identify what problems they want their school placement algorithms to solve.

School placement algorithms “embody and implement the goals that policymakers have for student assignment,” Assistant Research Director Matt Kasman and Fellow John Valant write in the Brookings report. “Policymakers should be forthcoming about those goals—e.g., whether to integrate schools or give all families a neighborhood option—and make policy decisions that align with them. Specifically, we suggest creating a statement of values for student assignment. Some of these values might exist in tension with one another, but community members should know how policymakers approach these tensions.”

Follow Kate on Twitter