You wouldn’t mail sensitive information on a postcard; you’d send it in a securely sealed envelope. But when you send the same sensitive information digitally, do you encrypt your message? Chances are, the answer is no.
That observation prompted Laura Brandimarte, assistant professor of management information systems in the Eller College of Management at the University of Arizona, and her co-authors to write their latest analysis, “How Privacy’s Past May Shape Its Future.”
“While privacy decisions seem to be quite easy to make in the physical world, they appear to be more difficult once we switch to similar situations in the online world,” says Brandimarte. “So we started to wonder whether there is maybe an evolutionary story behind these differences in offline versus online behaviors.”
The analysis focuses on the idea of a privacy mismatch, which Brandimarte defines as the idea “that people are not (yet) equipped to make informed decisions in the online world when it comes to privacy and security.”
Humans evolved to have senses that help us detect possible dangers in the physical world. Using certain cues, we’re able to evaluate risks and stay away from threats. In the online world, we don’t receive the same types of cues and thus cannot as clearly detect potential risks. For instance, “We do not see Google leaning over our shoulders to track our sensitive searches; we do not hear the US National Security Agency stepping closer to listen to our videoconferences,” the analysis notes.
In fact, we can be tricked online into thinking we’re safe, which may lead us to make poor decisions about our privacy, like revealing sensitive information.
One predominant approach to privacy management in the United States is notice and consent. However, Brandimarte argues that this approach is inadequate, as it places the burden of privacy protection on individuals who aren’t always trained to protect their information properly.
Instead, the authors advocate for policy changes. As the automobile industry developed and driving became more dangerous, regulators mandated safety measures such as airbags and anti-lock braking systems. In the same way, laws and regulations should be enacted to guarantee individuals a baseline level of privacy. These measures could include privacy-enhancing technologies such as privacy-preserving data analytics, Brandimarte notes.
Effecting change takes time, but for now, what can we learn from this analysis? “Don’t trust your senses too much when you are online,” Brandimarte says. “The next dangerous deepfake or deceiving dark pattern is just around the corner!”
The research, co-authored by Alessandro Acquisti of Carnegie Mellon University’s Heinz College and Jeff Hancock of Stanford University, is published in Science.