You wait in line with nervous excitement — you’ve been home for months and finally you’re out and about with your friends once again.

As you get to the front, the uniformed guard takes your temperature and asks to see your phone. You show the guard that you have downloaded a contact tracing app and have had it running for a few weeks. The app flashes a green QR code indicating that your Bluetooth tracker has not logged contact with a contagious person.

The guard scans the code, confirms your identity with biometric authentication and permits entry. Once you are inside, it feels almost normal again.

As the United States reopens its economy, something like the scenario above might become part of the new normal.

The use of mobile phone apps and other technologically enhanced surveillance might be necessary to take part in social life, including everyday activities such as going to the office, seeing concerts, eating at restaurants, visiting amusement parks, or even boarding airplanes and public transportation.

Governments in dozens of countries have hastened to deploy mobile phone surveillance programs as part of their response to the pandemic.

They have used mobile apps designed to track COVID-19 symptoms, map population movement, trace contacts and disease exposure, enforce quarantine orders, and validate a person’s health status.

This mobile surveillance promises to augment public health interventions for managing COVID-19, and the new capabilities might become technological fixtures that help countries worldwide prepare for the next virus outbreak.

But these programs collect sensitive health and behavioral data, such as your symptoms, location and contacts, that pose significant risks to personal privacy and civil liberties.

This data might enable governmental abuse, discrimination by employers and health insurers, social and reputational harms, and criminal fraud. Even narrowly tailored mobile apps that serve exclusively public health objectives likely contain software vulnerabilities that hackers can exploit.

Concerns about data privacy might also diminish trust in the programs and thereby discourage adoption of tools such as digital contact tracing apps, which require widespread use to be effective. Rushing these programs into everyday use raises serious questions about the robustness of their personal privacy protections.

What’s worse, the people hit hardest by the contagion and lethality of COVID-19, particularly marginalized and vulnerable communities such as Black and brown communities, are also the most at risk of harm from privacy intrusions. Many of these communities already face extensive government surveillance and have the least recourse to mitigate the harms.

Unfortunately, as anyone who has paused to actually read a mobile app privacy policy knows, understanding these policies is exceptionally hard. They are written in dense jargon, for lawyers rather than users. And each app has its own lengthy privacy policy, which makes comparing them nearly impossible.

What is needed now more than ever is a concise, transparent and standardized way to understand how the proliferating public health surveillance programs affect personal privacy.

Taking inspiration from the well-known Consumer Reports, we have developed a simple privacy scorecard that summarizes the most important privacy criteria for these surveillance programs. The criteria we identify include transparency about the program, a narrow purpose focused on public health, informed and revocable consent, time limits on the program and the data it collects, and data management safeguards against cybersecurity threats.
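To make the idea concrete, here is a minimal sketch in Python of how one scorecard entry might be recorded and tallied. The five fields mirror the criteria listed above, but the field names, the example values and the simple pass/fail tally are illustrative assumptions, not the scorecard's actual rubric.

```python
from dataclasses import dataclass, fields


@dataclass
class PrivacyScorecard:
    """A hypothetical scorecard entry for one COVID-19 mobile program.

    Each boolean field corresponds to one criterion described above;
    True means the program satisfies that criterion.
    """
    program: str
    transparent: bool           # program and its data practices openly documented
    narrow_purpose: bool        # limited strictly to public health objectives
    revocable_consent: bool     # informed consent that users can later withdraw
    time_limited: bool          # sunset dates for the program and its data
    secure_data_handling: bool  # safeguards against cybersecurity threats

    def summary(self) -> str:
        """Report how many of the five criteria the program meets."""
        criteria = [f.name for f in fields(self) if f.name != "program"]
        met = sum(1 for name in criteria if getattr(self, name))
        return f"{self.program}: meets {met} of {len(criteria)} privacy criteria"


# Example with made-up values for a fictional contact tracing app.
if __name__ == "__main__":
    app = PrivacyScorecard(
        program="ExampleTrace",
        transparent=True,
        narrow_purpose=True,
        revocable_consent=False,
        time_limited=False,
        secure_data_handling=True,
    )
    print(app.summary())  # ExampleTrace: meets 3 of 5 privacy criteria
```

A real scorecard would capture more nuance than pass/fail, but even a simple tally like this makes side-by-side comparison of dozens of programs straightforward.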

To prototype and demonstrate this scorecard approach, we scored more than 40 COVID-19 mobile programs worldwide.

We believe these scores can be used by everyone to assess programs, and also by the government officials who administer and implement them. Even when a program does not meet every privacy criterion, officials have the opportunity to explain how and why, with explicit justification grounded in public health.

There is also a need to ensure that these technological interventions actually serve the public health effort. At this stage, there is limited evidence that these programs are useful. State and local governments could draw on public health expertise to identify where these technologies might add value.

They could also consult with key stakeholders, including the vulnerable and marginalized communities already hit hardest by COVID-19, to ensure that these approaches are sufficiently privacy-preserving.

As the world becomes ever more dependent on mobile apps, understanding their implications for privacy is critical.

A standardized, concise and transparent privacy scorecard is one way to get there.