Redefining Privacy Invasion to Involve a Human Element on Both Ends

On October 24th, Yafit Lev-Aretz presented on “Privacy & The Human Element” in the weekly Privacy Research Group (PRG) meeting. There is no doubt that interest in privacy violations has skyrocketed ever since the Cambridge Analytica scandal, which caused a huge uproar regarding Facebook’s privacy measures. However, despite this incident, most people continue to use Facebook without paying much heed to their privacy.

Lev-Aretz explained this phenomenon as the “privacy paradox”: people claim to care about privacy in theory, but act as if they don’t really value it. For example, some people think privacy is dead (or at least not as valuable as innovation, saving lives, etc.), or believe that privacy doesn’t affect them personally because “I’ve got nothing to hide.”

Lev-Aretz then suggested that the privacy paradox is perhaps caused by a mismatch in what people mean by “privacy violation,” and proposed that a privacy violation should be defined as including a human element on both ends (i.e., a human or humans being violated by another human or humans).

She proposed that two thoughts, taken together, make people act as if they do not really care about privacy: first, that “nobody is watching me,” and even if something is, it’s an algorithm; and second, that whatever is watching doesn’t know that it is “me,” but rather sees a data point. For example, Lev-Aretz posed the question: “would you put a human instead of an Alexa in your bedroom?” The answer is probably no. This demonstrates that “computation and algorithms have replaced the human element in the loop.”

Importantly, Lev-Aretz proposed that when

  • No human is in the loop, there is no privacy violation
    • Example: an algorithm collects data points on multiple individuals
  • A human is involved on only one end, there is no privacy violation
    • Example: a person puts an Amazon Alexa in her bedroom and it collects and analyzes her data
  • Humans are involved on both ends, there is a potential privacy violation, depending on the context
    • Example: a Facebook employee uses Facebook data to stalk women online

Lev-Aretz explained that redefining privacy to include a human element on both ends is significant for several reasons. First, it matters for conceptual accuracy: the redefinition would dissolve the privacy paradox, because when people say they care about privacy they are thinking of “privacy” as involving a human element on both ends, while when they act as if privacy is unimportant, they are often thinking of situations with no human element, or only one, in the loop. Second, the redefinition is strategically useful: it would ameliorate privacy exhaustion by communicating more efficiently, and a narrower definition of privacy would not distract from other pressing informational concerns.

While including the human element on both ends plausibly explains the privacy paradox, another possibility is that people are unaware of the harm involved when only one human is in the loop. The irksome feeling a person gets when a human observes her in a space where she has an expectation of privacy is enough for her to know, or at least feel, the harm. That harm is far less obvious when a machine or algorithm is doing the observing. In other words, people may assume there is no harm when an algorithm collects their data, or when their information goes to a company they trust, such as Facebook or Google. Yet harms still occur in such situations: without proper privacy measures, that information could fall into the hands of the wrong people or company, leading to immense harm, as the Ashley Madison scandal made evident. Perhaps if a human element on no end, or on only one end, were characterized as the first step toward a human element on both ends, people would also behave in ways that show they care about privacy invasion, eliminating the privacy paradox.

Kyung Song is a J.D. candidate, 2020, at NYU School of Law.