Privacy Papers, Part 14 – What About Trust?

The Privacy Papers were released over a two-week period of emails by Michael Kasdan, who has generously given us permission to post them in their entirety over several posts.

You can search Twitter for #PrivacyPapers to find the content and to share comments.

14. Privacy Papers – What About Trust?

From: Sanjay
Sent: Sunday, August 25, 2013 1:24 AM
To: privacy-papers@googlegroups.com
Subject: Privacy Papers: what about trust?

Privacy is always a fascinating topic 🙂  Thanks for bringing us together, Mike; I’m not sure I’m qualified to hang with this group.

My background: I’ve been dealing with data and analytics since the dot-com boom.  I’ve built data-driven solutions for a whole bunch of companies, ranging from healthcare to government to retail to oil & gas to software.  I just launched a new venture in big data in April with my co-founder John Akred, who is a data scientist by training.

My view on privacy (at least in the consumer sphere) has always been driven by one question: what is the economic value of trust?  I’m probably on the cynical side of Pandora’s box on this one: privacy as we knew it is pretty much dead.  With the access to information and the massive amounts of “digital exhaust” that each of us puts off, knowingly or unknowingly, it’s pretty much impossible to function in the modern world and protect all of your data.  This doesn’t mean we should take it lying down, but rather that we should take it as a given and figure out what to do next.  Looking at most of the responses so far, it feels like most of you have similar views.

Going back to trust, the question that I think most people consider (subconsciously, most of the time) is: do I trust the vendor/service/company I’m dealing with to manage my data in a positive way?  Realistically, we have no idea (how many of us click through the terms and conditions without reading them?).  Rather, we use trust (or the lack thereof) as a proxy to decide whether we’ll share our personal data.

Back in the 2003 timeframe, we did a bunch of consumer studies to figure out how much people value their data.  Unsurprisingly, the answer is: not much.  It’s hard to understand how much your own personal data is worth; the direct marketing applications are easy, but everything else is hard to trace back to value.  So the consumer isn’t equipped to function in a marketplace where value is assigned or exchanged based on their data.  As most of us do now, we give away our personal data (email, location, financial transactions) for not much value in exchange.

I’ll take a shot at explaining my thoughts further soon.  But if any of you are EXTREMELY bored and want to read an opinion piece some colleagues and I wrote back in 2003, I’ve attached it to this email.  Warning: it was written in my consulting years, so expect consulting-speak overload in the document.

From: John
Sent: Sunday, August 25, 2013 9:29 AM
To: privacy-papers@googlegroups.com
Subject: Re: Privacy Papers: what about trust?

Sanjay, thank you for your thoughts; your paper was anything but boring.  I’ve also been fascinated by the economics of trust for years.  I wrote a piece for Mashable a while back called “Why Social Accountability Will Be the New Currency of the Web” because I wanted to explore ways, beyond influence, that people would begin judging one another in digital networks.  I see the economics of trust evolving rapidly in this regard.  For instance, I just interviewed a woman at Allstate in charge of their Drivewise app, which lets customers get discounts on their premiums after they’ve driven safely for a certain amount of time.  This feels pretty basic to me:

Drive safe = demonstration of trust/accountability with driving = economic benefits.

After enough of these apps/methodologies come into vogue (most of them are here already; we just need to aggregate them), a portrait of consumers and their trustworthiness will emerge.  In this sense, I think the CSR (corporate social responsibility) that companies have to deal with will come into play for consumers too.  Pretty soon our actions will have a score, and other people will ask us, “why do you keep buying clothes made in Chinese sweatshops?”

As for the Drivewise app, what I’ve been telling people is that privacy has cultural ramifications along with economic ones.  I’m a dad.  Right now, a big part of my life (thanks to my amazing wife, Stacy) is arranging carpools.  Today, we judge whether or not someone should drive our kids by their personality and general trustworthiness.  Trusting that they mean well for our kids, I never put them through a driving test of any kind.  If the Drivewise mentality becomes pervasive, I firmly believe that not only will a person’s driving record become public, we’ll also start using other apps to check people’s overall driving score.  So when the “want to get a carpool going?” conversations happen, first I’ll check my “do you drive like a jerk” app before putting my kids in your car.  Wrote about that here, fyi.

One thing about Sanjay’s paper – I’m amazed that you wrote it in 2003!  Most of it read like it could be (and should be) written and implemented today.  In that sense, though, I was saddened to think that these issues still need addressing a decade later.  It’s very hard for me not to feel reactionary and think there’s some form of conspiracy going on to keep people cavalier about their data.  But that is how I feel.  Terms and conditions language has not gotten shorter or easier to read for non-lawyers over the past decade.  Why is that?  This is not anti-lawyer, mind you – my point is that an unreadable agreement has apparently been one of the primary ways to get people to just click and ignore what happens to their data.

That said (and I’m a bit rambly this morning; I need more coffee), it’s time for consumers to be accountable for their/our actions.  Like the CSR for average people that’s coming into play, I think the call to action needs to be for simpler-to-read-and-comprehend T&C agreements, along with data vaults, etc.

A book recommendation: “Who Owns the Future?” by Jaron Lanier.  It’s a great description of how the economic side of data/privacy has grown over the past decade or so, written by an insider (he works at Microsoft and basically created virtual reality).  It’s not an angry book (a place where I tend to go too quickly), but it does speak to the urgency of dealing with this stuff.

Thanks for all the great thoughts.  This group has been very helpful for me already.

Cheers,
JCH

From: privacy-papers@googlegroups.com [mailto:privacy-papers@googlegroups.com] On Behalf Of Michael Kasdan
Sent: Sunday, August 25, 2013
To: privacy-papers@googlegroups.com
Subject: RE: Privacy Papers: what about trust?

1.  Um, you used the term “digital exhaust” and sent us a copy of a ten-page article you authored called “The Economic Value of Trust.”  You’re qualified to hang with the group, but thanks for your modesty.

2.  Good stuff all.  I’m glad we’ve moved from the original questions to just having a conversation.

3.  I will do my best to synthesize the past week and lead us in further discussion to the extent this group needs that.  Just keep talking…

4.  Of course, your unwillingness to rank privacy on a scale of 1 to Spinal Tap will make my computation of the meaningless average across the group quite difficult….

MK

From: privacy-papers@googlegroups.com [mailto:privacy-papers@googlegroups.com] On Behalf Of Michael Kasdan
Sent: Sunday, August 25, 2013
To: privacy-papers@googlegroups.com
Subject: RE: Privacy Papers: what about trust?

One response to the point made by John that “Terms and conditions language has not gotten shorter or easier to read for non-lawyers over the past decade.”  This was a small point within your very interesting larger post about the economics of trust.

But on the point about Terms of Service, I don’t think that’s quite right.  From what I’ve seen, there is a trend toward plainer-English Terms of Service; check out Foursquare’s Privacy Policy, for example, or Twitter’s, which has plain-English annotations alongside the more formal legal sections of its Privacy Policy.

I talked a little bit about this on pg. 34 of this article (see the section on “Plain English vs. Legalese”).

Like I said, a very small point in your larger post, so apologies for the lawyerly response…

MK

From: John
Sent: Sunday, August 25, 2013 4:35 PM
To: privacy-papers@googlegroups.com
Subject: Re: Privacy Papers: what about trust?

That’s encouraging. Thanks.

From: Kevin
Sent: Sunday, August 25, 2013 4:54 PM
To: privacy-papers@googlegroups.com
Subject: Re: Privacy Papers: what about trust?

I was going to make a similar point.  Yeah – I’m an attorney… I geek out over these things.  My experience is that, generally, good companies focus on minimizing legal text as much as possible.

There are a few of us who read every terms of use and privacy policy we’re presented with, but I think part of the problem is that for most people it’s an annoyance, something keeping them from the app or feature they want to use, and when presented with an option, they dismiss/accept without reading thoroughly.  They have the opportunity and they “agree,” but ask them 60 seconds later what they agreed to and they don’t know.  I think this is the case no matter how “plain English” you make the terms or privacy policy.  Until people care more about the terms than about just getting that app started, I don’t know how you even get to the trust/value question.

CSR is an interesting way to look at things… if companies got enough pressure, would there be industry standards and company ratings that meant enough that a company would feel its business was at risk if it didn’t raise its privacy ranking?  It’ll be interesting to see how that sort of public shaming works.