Transparency is More Than a Privacy Policy
Privacy laws and regulations lag behind current technology’s ability to capture and process personal information, and they’ll never catch up. Technology moves too fast and the political process is too slow. At the same time, the public is becoming more aware that when they divulge their personal information, it often benefits the company gathering that information significantly more than it benefits them. Recognition of this inequity is making the public increasingly hesitant to share personal data.
The inherent tension between these two trends is difficult to tackle, but if a company is open about what it does with the information it collects (i.e., transparent), the tension can be ameliorated. Transparency is often required to comply with laws and regulations, but, if executed properly, it can also build trust with users. And that bond of trust encourages users to share their personal information more freely.
While that sounds simple enough, the problem is that transparency is difficult to execute in a manner that makes sense to users. Users aren’t willing to read the long, detailed explanations required to describe what a modern organization does with personal information. Transparency via a long-winded legal document might gain a company compliance with the law, but it does not meet the intent of the regulations if users don’t comprehend the consequences of their actions and choices.
To address this problem, we can leverage aspects of human nature to educate users in a reasonable and effective manner. I’ve written about this in the past, but I’ve run across another way to think about the matter that dovetails nicely with what I’ve previously covered.
Decision Systems
I’m currently reading Thinking, Fast and Slow by Daniel Kahneman, and it’s an interesting study of how and why we make decisions. The part that is germane to the difficulty of transparency is the revelation that we prefer to make decisions intuitively rather than put deliberate effort into them.
Or maybe that isn’t a revelation, but the book describes it in great detail. In short, psychologists describe two distinct systems we use for thinking, cleverly named System 1 and System 2. System 1 is our intuitive or “automatic” thinking, whereas System 2 requires more thought and effort. What is 2+2? The answer is almost reflexive; that is System 1. What is 17x24? Your System 2 kicked in to try to solve that. The important aspect of this dichotomy is recognizing that we like using System 1. We are lazy. We prefer not to use System 2 because it involves too much work.
Therefore, when educating users, strive to make it as intuitive as possible. Remember, people won’t bother to read a privacy policy despite their concern over what may happen to their personal data. That is our natural aversion to System 2 at work.
Consider the standard privacy policy. Do you think understanding that long and complex text requires System 1 or System 2? Right, System 2. So in a typical situation we aren’t making it easy for someone to understand what we are doing, or going to do, with their personal data.
Solution
What if we broke that policy apart and informed users of what we’re doing in small chunks? For example, when we ask for someone’s birthdate we could say, right next to the field asking for that information, “In compliance with COPPA we are required by law to verify that you are 13 years of age or older. Please provide your date of birth:” That’s certainly easier to understand than if we buried it in a policy somewhere, but we can do better. How about saying, “You must be 13 years of age or older. What is your birthdate?” (A rough sketch of this inline approach follows the examples below.)
Now we’re speaking in a language System 1 understands. Think about this for other important points we need to communicate. Here are some other examples:
Don’t say: “We don’t sell or rent your personal information to third parties for their marketing purposes without your explicit consent.”
Say: “We don’t share your personal information without your permission.”
Don’t say: “We store and process your personal information on our computers in the US and elsewhere in the world where our facilities or service providers are located, and we protect it by maintaining physical, electronic and procedural safeguards in compliance with applicable US federal and state regulations.”
Say: “We store your data securely.”
Don’t say: “We may share your personal information with law enforcement, government officials, or other third parties when: we are compelled to do so by a subpoena, court order or similar legal procedure; we need to do so to comply with law or credit card rules; we believe in good faith that the disclosure of personal information is necessary to prevent physical harm or financial loss, to report suspected illegal activity or to investigate violations of our User Agreement.”
Say: “We may be legally obligated to share your personal information with law enforcement.”
The “Don’t say” examples above are from PayPal’s privacy policy, which is a well-written and comprehensive document but is utterly incomprehensible to most consumers.
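To make the earlier birthdate example concrete, here is a minimal sketch of inline, just-in-time transparency on a sign-up form. It assumes a plain browser/DOM environment with TypeScript; the element IDs, the exact copy, and the renderBirthdateField and isAtLeast helpers are illustrative names of my own, not part of any real product or API.

```typescript
// Minimal sketch: show the plain-language explanation right next to the field
// that collects the data, instead of burying it in a privacy policy.
// Assumes a browser/DOM environment; all names and copy here are illustrative.

const MIN_AGE = 13; // COPPA-style age gate from the example above

function renderBirthdateField(container: HTMLElement): void {
  const label = document.createElement("label");
  label.htmlFor = "birthdate";
  // System 1 friendly copy: short, plain, and adjacent to the input.
  label.textContent =
    "You must be 13 years of age or older. What is your birthdate?";

  const input = document.createElement("input");
  input.type = "date";
  input.id = "birthdate";

  const hint = document.createElement("p");

  input.addEventListener("change", () => {
    const dob = new Date(input.value);
    hint.textContent = isAtLeast(dob, MIN_AGE)
      ? "Thanks! We only use your birthdate to confirm your age."
      : "Sorry, you must be 13 or older to sign up.";
  });

  container.append(label, input, hint);
}

// Returns true if someone born on `dob` is at least `years` old today.
function isAtLeast(dob: Date, years: number): boolean {
  const cutoff = new Date();
  cutoff.setFullYear(cutoff.getFullYear() - years);
  return dob <= cutoff;
}

renderBirthdateField(document.body);
```

The point of the sketch is the placement, not the particular code: the explanation of why we want the data appears at the moment we ask for it, in language System 1 can absorb without effort.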
All of this should be self-evident, but somehow we fail at this task over and over again. Understand that it may still be necessary to spell things out in detail for those who want that level of information, so you should maintain a comprehensive privacy policy; however, those folks are outliers. Consider a typical user’s desire to understand their privacy choices in combination with their preference for intuition (System 1).
To sum up: Break information into small and easily understood chunks. If you can execute transparency in a meaningful way, you will allay users’ fears and earn their trust.