Since the onset of the pandemic, people have looked to technology for ways to lead us out of this crisis, only to be derailed by privacy and trust issues. Recently, a well-intentioned use of algorithms to assign exam grades in the U.K. exposed biases affecting the placement of hundreds of thousands of students. Controversy surrounds certain countries’ use of geo-location services for contact tracing. COVID-19 shows that issues of data privacy, safety, and civil rights are more than legal concepts: they are connected, and they affect people in tangible and deeply personal ways.

With the proper approach, is it possible to embrace technology to fight the pandemic without sacrificing trust or safety? To answer that question, Salesforce President and Chief Legal Officer Amy Weaver hosted a recent Leading Through Change discussion with two of the leading minds on the responsible and ethical use of technology: Future of Privacy Forum CEO Jules Polonetsky and BSA | The Software Alliance President and CEO Victoria Espinel. Together, they explored how to build and maintain trust while using new technologies to rebuild our communities and regain some sense of normalcy.

Following are highlights from the conversation with Polonetsky and Espinel, in their own words, lightly edited for clarity.


The connection between privacy and trust

Polonetsky (JP): Privacy in data is a human right. This is really about your dignity. Data is power. All of us who hold data, whether we're managing our website or providing major services, need to recognize that we want a trusting relationship. That takes the humility to ask, “How do I show that I'm truly listening and understanding how what I do [with the data] affects people? How do I earn your buy-in by listening and then shaping what I do around the goal of helping the entire community?” You're not going to be trusted unless you go that extra length to actually get buy-in from your stakeholders.

If we share information in a nuanced, balanced way, respecting limits and boundaries in a way that doesn't leave a person hungry or discriminated against, we can get it right. It just takes careful thinking and appreciation for the particular context.

Espinel (VE): There's commonality between privacy and how you build trust. It often comes down to three things: being straightforward and transparent about how and why decisions are made, listening to people's concerns and really trying to understand them, and sticking to what you say. The guiding principle for all of this has to be “no surprises.”


Lessons on the role of technology during the pandemic

JP: Some countries plunged into contact tracing very quickly, thinking tech would solve the problem and Silicon Valley was going to invent the solutions. It wasn't surprising to see a little bit of a backlash, but there are lessons from some of the early snafus.

This isn't about technology; this is about communicating with the public and understanding what the public health authorities and epidemiologists need. If data is being collected, it should be the minimum [needed]. And most important, it should not be used in an adverse way. We want to know that we are in control, and that we, the community, the school, have bought into the way the data is used.

VE: This moment demonstrates how critical software and technology are in making sure students can go to school or workers can do their jobs remotely. The software industry can play a huge role in partnering with governments to make sure digital skills and literacy are being upgraded.

Also, legacy IT and outdated systems have been a real barrier to governments’ ability to have first responders communicate effectively, or to process the loans companies desperately need in order to keep going. Talking to governments about how they can use technology to deliver the services citizens need and want is a big priority.


Making the case for a national privacy law

JP: [The United States is] one of the only democracies in the world, and may soon be the only major economy in the world, that doesn't have a comprehensive privacy law. More than ever, we need data moving around the world. If we're going to run clinical trials and make sure vaccines work, we need to understand whether they work for different kinds of populations. We're more dependent than ever [on each other] because of the acceleration needed to respond to COVID, yet we end up with this gap: some laws covering some data, and other laws covering other data.

I'm hoping Congress prioritizes putting in place a comprehensive privacy law so people know that if they share data, companies, governments, and organizations are bound not only by good policy and practices but by a law backing it up. It'll actually accelerate trust if we can promise people that they're protected by law.

VE: The pandemic has made people realize how necessary data is in order to solve major societal challenges, not only trying to find a cure or vaccine for COVID-19, but also trying to ameliorate all of its negative economic impacts. It makes no sense to have a system in the United States where your level of data protection is completely dependent upon where in the country you happen to be located.


Takeaway

This is a golden opportunity to discuss privacy in a way people can understand, because of the immediate and significant impact it has on our ability to recover safely from this pandemic. While technologies that have been or will be developed raise valid concerns about privacy, civil liberties, and bias, this is also an unprecedented time for organizations to earn the trust of their stakeholders by keeping human rights and privacy top of mind.

To learn more, watch the full interview with Jules Polonetsky and Victoria Espinel at the link below.

This conversation is part of our Leading Through Change series, providing thought leadership, tips, and resources to help business leaders manage through crisis. Prior video interviews include: