When Policies and Practices Don’t Match

The Center for Digital Democracy (CDD) recently filed a complaint with the FTC alleging that 30 U.S. companies are failing to comply with the U.S.-EU Safe Harbor Agreement. The companies are all data brokers of some sort – either as their primary business, such as Acxiom, or as a by-product of what they do, as with Salesforce. The filing is 100+ pages of background, opinion and evidence, but the whole thing can be boiled down to one simple allegation: these companies say they do one thing in their policies, but in practice they do something else.

This filing is not unique. The FTC settled similar allegations with twelve companies earlier this year and, over the years (and with increasing frequency), it has brought a number of similar cases against individual companies. Almost all of these cases boil down to someone doing something with personal information that their policies either didn't permit or didn't clearly explain.

Unfortunately for most of these companies, there rarely appeared to be any nefarious intent; rather, it was just a case of laziness, sloppiness or ignorance. Yet the cost in terms of losses to the bottom line, damage to brand reputation, and the monumental distraction from core business needs is non-trivial. This raises the question: how could this situation be avoided? The good news is that the answer is simple: employ someone who understands laws, regulations and policy as well as technology.

The bad news is that this kind of expertise is rare. There are plenty of smart, capable lawyers who deeply understand privacy laws and regulations, and there are plenty of technologists who understand data and the tools and techniques used for analytics, distribution and protection, but very few who know both. Yet, as is apparent in the CDD filing, companies that collect, store or process personal information need a role that can audit, issue-spot, and direct the policies, procedures and development efforts within the company to mitigate the risk of violating privacy.

The IAPP recently published a sample Chief Privacy Officer job description. That lengthy description includes requirements to understand law, policy, technology and trends, but the devil is in the details. I know some of the privacy officers for companies listed in the CDD filing, and they are bright and capable individuals, but I fear the vast complexity of the technology they are required to understand is beyond the scope of their training (most come from a legal background). And the departments these folks run are generally under-funded and under-staffed, leaving them without adequate internal technical help to keep track of projects and interpret the technology for them. They face a near-impossible task and, as we see today, the risks of failure are real.

I'm betting that over the next two to three years we are going to see a huge investment in individuals who come from a technology background and can also interpret laws and regulations, specifically to mitigate privacy risks. As is clearly evident in the CDD filing, the market is demanding it.
