Presentation at InformaticaWorld
Note: What follows is a transcript of the presentation I gave at InformaticaWorld 2014. The intent was to give data professionals a reason to consider privacy as part of their roles and responsibilities. I will follow this up in a couple of days with my thoughts on the InformaticaWorld conference.
I want to spend a couple of minutes taking you on a quick trip. I’m going to start with what you already know about privacy and hopefully end up connecting it right into your everyday work responsibilities. I’m going to cover a lot of ground really quickly, so buckle up. Let’s start: for consumers, privacy is a top-of-mind issue.
If you need evidence of this, then look no further than an ongoing online poll from the Web We Want Project. It asks one simple question: What kind of web do you want? The viewer chooses from six answers. I want a web that:
- Safeguards privacy
- Is available to all
- Inspires learning
- Creates opportunity
- Puts me in control
- Promotes freedom
When the user selects an answer, they are taken to the results page. This page shows a map of the world, broken down into regions, with each region colored to match the most popular answer there. As you can see, the most popular answer, in every corner of the globe, is privacy.
We are seeing an explosion of new products and services, from startups and established companies alike, born from the public’s desire for privacy protections. And organizations are touting privacy sensitivity or privacy protection as part of their brand identity or their main value proposition.
Take Facebook, for example. They have made a number of moves recently to appear more privacy sensitive, including a big announcement two weeks ago at their developers’ conference. They made a big splash announcing a login function for third-party sites, similar to what they’ve offered for a long time, except this version lets users log in with their Facebook credentials anonymously.
Or look at scroogled.com. This is a site owned and operated by Microsoft for the sole purpose, at launch, of hammering Google for its lack of sensitivity to privacy issues and, of course, of positioning Microsoft as a better alternative.
It’s not just the tech giants, either. There are a growing number of startups, some very well funded, with privacy as their main value proposition. For example, Blackphone. These guys are creating an Android-based smartphone with encryption turned on by default and a promise never to sell data to third parties.
Now you might be thinking: this is all consumer-facing stuff, what does this have to do with me? I’m getting there, I promise, but first recognize that these efforts are a response to consumer demand. And coincidentally, they are also in line with the intent of most privacy laws and regulations around the world. These, at their heart, demand corporate transparency and encourage users to retain control over their data.
But the concepts of transparency and user control, commonly referred to as notice and consent, again a fundamental underpinning of most privacy laws and regulations and the foundation of most of our compliance work, are fundamentally flawed: they don’t work.
There are lots of reasons for this, but if you want evidence of how pervasive this belief is, look no further than last week’s Big Data and Privacy report from the President’s Council of Advisors on Science and Technology. That is a formal report presented at the direct request of the President. In this report of 80 or so pages, they all but flat-out say that notice and consent is dead.
“Notice and consent is the practice of requiring individuals to give positive consent to the personal data collection practices of each individual app, program, or web service. Only in some fantasy world do users actually read these notices and understand their implications before clicking to indicate their consent.”
… And the regulators are not blind to this fact. We are seeing clear evidence of it in their increasingly aggressive enforcement actions.
Edith Ramirez, head of the U.S. Federal Trade Commission, in testimony at a recent U.S. Senate hearing, expressed her concerns about how personal information has been collected and vowed to sue companies that collect large amounts of data and misuse it. Ramirez likened herself to a “lifeguard at the beach.”
And the FTC, along with state AGs, has done just that. It is important to note that they are not just bringing actions against companies that are deceptive (you promise to do one thing, but you do another); they are also going after companies that fail to meet consumers’ expectations for security and privacy.
And it is not just here in the US. Globally, Europe is struggling to rewrite its aging Data Protection Directive, Singapore has a new privacy law, Australia and Canada just strengthened their already mature regulations, and Hong Kong, Japan, and Argentina all have laws in place as well. The trend is clear: privacy protections are getting more robust, and that trend will continue for the foreseeable future.
So we have regulators getting more aggressive about enforcement, extracting multi-million-dollar settlements, and asking for, and getting, more restrictive regulations. And, here in the US, they are not relying solely on notice and consent; they are gauging whether a company violates social norms. What does all this mean? It means that the enterprise is now much more accountable for its actions than it has been in the recent past. And to properly mitigate privacy risks, the enterprise needs to do much more than just comply with applicable laws and regulations in the way it secures sensitive data.
And who is best positioned in the enterprise to know what sensitive data is being collected, where it is being stored, how it is being stored, who has access to it, and for what purposes? The kinds of things regulators are scrutinizing? It’s you. Privacy is migrating out of the silos of legal and compliance and into the domains of technologists at all levels, particularly those who manage our increasingly large caches of data.
Privacy is a bit of a wild west right now: there is little clear indication of exactly what constitutes acceptable use or reasonable security. Yet the risks of, and the penalties for, committing a privacy violation are real. Just securing data from unauthorized use and complying with policies is not enough. The enterprise is being held accountable, and ultimately that means you are being held accountable for keeping an eye on how your data is moving around, into, and out of your organization.
In this digital economy, the game has changed. If you are in charge of managing data security, you need to play a key role in mitigating privacy risks, and to do that you need to understand the privacy landscape and you need to know your data. In the event of a breach, you will be held accountable.