
The Privacy Paradox

We increasingly hear about digital privacy. Commentary across the Internet keeps growing, and nearly every media outlet has discussed it at some point in the recent past. We’ve seen public concern rise, and governments around the world take action by passing regulation on data security and use. Despite all of this, companies still struggle with data privacy. For a topic that gets so much visibility, with the very real risk of heavy fines, why do companies struggle so much? In this blog post, I hope to explore this a bit, with the five common reasons I’ve seen for the disconnect between what should happen and what does happen.

Issue 1: “Not My Problem”

In companies that lack a compliance function covering privacy, the primary issue is often a result of imperfect information. Nothing gets done, because every relevant group assumes another group is handling it. I should clarify: I don’t think this is intentional, but the mechanics at work produce the same outcome. Specific groups rarely have the capability or authority to handle all relevant work when it comes to digital privacy, even if certain teams, such as marketing or analytics, face an outsized impact when the company gets it wrong. Further, due to possible conflict with other internal groups, one group may not understand or surface changes with enough time for the company to react properly.

In the realm of marketing, privacy concerns tend to arrive via one of three avenues.

Marketers may get notices about privacy changes in their advertising platforms, such as those Google and Facebook communicated in light of Apple’s App Tracking Transparency enforcement. Unless the marketer is technically inclined, they may reasonably assume that the engineering team responsible for mobile app development is handling things and that they themselves have little or no action to take. They may not fully understand the impact of such changes and may fail to communicate them correctly to the appropriate parties.

Engineers are rarely dedicated to reviewing beta builds and release notices for breaking changes, and even where such a process exists, the engineer evaluating those notices may not recognize the impact of specific changes made in the name of privacy, or how those changes may cascade through an organization’s processes. They may fail to notify relevant groups of technical changes unless some external force prompts a discussion. This is often the case with browser privacy changes, which can cause dramatic fluctuations in reporting but rarely cause a specific metric to flatline. Analytics or Marketing teams may notice the impact in reporting, but debugging a drop in data may take a backseat to other efforts that engineering is prioritizing. This lack of communication can force outsized efforts in those other groups to understand what is going on in the reporting, efforts that could have been avoided had engineering communicated.

Finally, a company’s legal team may become aware of regulation changes that need to be addressed. Depending on the team’s risk tolerance, they can either force discussions with other groups such as engineering, or delay acting until case law starts resolving ambiguity in the law in question.

Legal is the group most likely to reach out to others such as leadership, engineering, and marketing, but only in organizations that actively monitor privacy changes and have dedicated legal resources. For brands that employ ad-hoc legal counsel, I would expect the likelihood of such notices to drop dramatically, leaving it to the company to kick off discussions.

Essentially, there is a very high chance for privacy concerns to fall through the cracks in the absence of a dedicated effort to address them. This requires leadership buy-in, and making one or more people responsible for compliance within their respective domains.

Issue 2: Languishing Lexicons

The next issue is one of communication. Even if the various groups share a common core language, such as English, it’s unlikely their specific lexicons will be understood by the other groups.

Rarely do specific groups receive cross-training in adjacent disciplines. For example, a lawyer may not understand engineering well enough to properly gauge the risk of specific activities. Likewise, an engineer may not have the training to translate law or regulation into the actionable steps engineering must take to comply.

This tends to delay progress on determining the scope of work and the steps required to mitigate or address privacy concerns, as the various groups work to develop a common vocabulary and understanding of the topic in question. Further, it increases the risk of misunderstandings between the groups, which can lead to costly results later down the line (see process below).

Issue 3: Competency Conundrums

While the first two sections discussed problems in the interaction (or lack thereof) between groups, this one covers problems that can occur within a specific group.

Generally, being asked to work on privacy issues without some sort of training is equivalent to being tossed into the deep end of a pool and told to figure out how to swim before you drown. Why is this the case? I see three reasons.

Training

Digital privacy concerns have really only risen to the forefront for most companies in the past few years. When I attended TrustWeek a few months ago, in a room of 100 privacy professionals during a session on international data transfers, the speaker asked how long the audience had worked on privacy compliance issues. Fewer than ten had been in the field longer than two years, and only two longer than five years. Those two were in healthcare and had to deal with the USA’s Health Insurance Portability and Accountability Act (HIPAA).

While this is a personal anecdote, it tracks with my general experience. I believe we effectively have an emergent industry where privacy expertise is the exception rather than the rule in most verticals. But surely critical roles such as engineering and legal get training in their schooling? Not so much.

When we look at engineering, very few programs exist which are dedicated to privacy engineering. Examples of those that do are the Master’s in Privacy Engineering from Carnegie Mellon and the Certified Information Privacy Technologist from the International Association of Privacy Professionals. It is far more likely for a given engineer to have no privacy training than even a passing understanding of its complexity. Only in the past few years have many degree programs started including a privacy module, so even fresh graduates may have only a broad overview of what is a rapidly changing field. This is to say nothing of the vast array of self-taught engineers, many of whom never get exposed to data privacy in their self-selected training.

What about law? Law is also a rapidly changing field when it comes to digital privacy, and a lawyer would often have to specialize in data privacy in order to develop an in-depth understanding of this complex, multi-domain space. It wasn’t until June of 2022 that the State of New York became the first state to issue rules requiring attorneys to complete at least one credit of cybersecurity, privacy, and data protection training as part of their continuing legal education requirements, beginning in July of 2023. Thus, again, it’s more likely that a given lawyer will have no or limited training when it comes to data privacy.

As a result of this chasm between the education that is needed and what is provided, teams assigned to privacy concerns often have to start from scratch when new privacy activities arise, which delays compliance work and increases the risk of mistakes in reaching compliance goals. This is doubly true if the staff assigned to these projects rotates between projects depending on availability.

Rate of Change

The rate of change is another factor that hurts the development of expertise in the digital privacy space. Platforms, browsers, and regulations change at a frantic rate. The competing concerns arriving in a single calendar year can be enough to give most people whiplash. This results in confusion across the industry as various court cases get litigated, prompting reevaluation of current processes in light of any new legal guidance in an effort to further reduce legal risk.

This never-ending cycle can be very hard for organizations trying to stand up a compliance program, check all the boxes, and handle often-competing information accurately, because the privacy demands never seem to stop. It can easily lead to burnout, or the feeling that you’re retreading previous efforts as you employ yet more solutions for whatever the privacy goal is this week / month / quarter.

Personnel Issues

It should be noted that privacy discussions often require senior staff to determine the best path forward. A privacy engineer may have to review plans submitted by an architect and, when issues arise, offer recommended solutions. This effectively means the privacy engineer must understand the technical system in depth and be able to apply privacy techniques and constraints to it while keeping the solution viable, or be able to explain to non-technical staff why a change of direction may be required. This is commonly reflected in job descriptions alongside a five-to-seven-year experience requirement. More junior people may turn out to have a skill set misaligned with the needs of the role (see training above).

Lastly, it needs to be understood that not every engineer, marketer, or lawyer has an interest in privacy matters. It has been my experience that even among senior people, you’ll get better performance and engagement from those who have an interest in the field. Since privacy work never ends, it’s a good idea to ensure that the staff developing these hard-to-acquire skills actually want to keep them long term. Otherwise you may find yourself starting over, training new personnel in a specific cross-domain mix such as privacy engineering, as the people you previously trained switch jobs to better align with something they are actually interested in.

Issue 4: Process Problems

This next section relates to how companies act to resolve privacy concerns. When it comes to process, the issues show up in three ways.

In the first scenario, privacy efforts may be handled via a one-off project instead of a dedicated team. Here, not only does everything discussed in the sections above likely come into play, but so does the difficulty of slotting work into the existing workstreams of multiple groups so that the deliverables complete around the same time. If the effort lacks a leadership sponsor, or that leader fails to see the importance of cross-functional team efforts, there may be conflict over timing or securing key personnel. In my experience, the odds of this increase for privacy-specific issues, as I have seen very few companies plan for significant privacy investment outside of landmark laws such as Europe’s General Data Protection Regulation.

In the second scenario, once slated to a release timeline, and without a strong internal advocate, privacy issues may only get a review once a feature is fully built. This, in my opinion, is too late, and more expensive than including privacy reviews at key checkpoints during the feature design cycle, where a requested change would cost less and be trivial to adjust. It becomes significantly more expensive to attempt to ‘bolt on’ privacy practices to an existing product in which none of those concerns were factored in. Further, once deployed, any privacy breach may result in external investigations and fines, not to mention brand reputation damage (see below).

New to many companies is the need to document key privacy-related decisions in a framework such as a Data Protection Impact Assessment. Given that the goal of these documents is to identify privacy risks and develop methods to mitigate them, running this process at the end of a development effort doesn’t make much sense. You want to have these discussions as early in the development process as possible, so that issues can be addressed while it is still inexpensive to do so.

In the final scenario, there is likely a need to modify existing processes in light of data privacy work, which may be overlooked. For example, the common request to add a marketing pixel to a site may now be required to go through a data governance review, a privacy review, and a Data Protection Impact Assessment before deployment. This elongates the workflow, but must be done to avoid potential legal issues with supervisory bodies. It is this process for ensuring alignment between what a company says it does and what it actually does that drives the recent warnings from the Federal Trade Commission.

Issue 5: Operational Opportunities

Lastly, I want to talk a bit about the tendency of companies to treat privacy concerns as capital investments: once a given amount of capital is expended, the problem is considered ‘solved’. The danger of this belief centers on the rate of change in the field identified above. The world is changing with regard to data privacy, and in most scenarios we need to stop treating privacy work as a capital expenditure, because it doesn’t actually end with a project’s completion.

It is my belief that more companies would be better served by considering privacy an operational expense. Standing up a compliance or privacy program, with the required staff, training, processes, and technology, is a non-trivial investment. However, given the rate of change, the need to sustain some level of investment for the foreseeable future remains. Treating it this way also enables the brand to keep the project team together, and considering the skills gap often present in this work, that can result in significant time savings as privacy work continues.

Lastly, leadership would be well served to consider privacy concerns as critical to brand safety. Several hospitals and Meta found this out the hard way last month when they were sued over alleged HIPAA violations due to marketing pixel placement. The fact that a blogger or investigative reporter can damage a brand (as with the recent tracking revelations regarding TikTok and Facebook) over perceived privacy failings should be a cautionary warning to other brands who may not have their own house in order.

Conclusion

Privacy is a very complex topic, and a mismatch of skills, priorities, and understanding of its importance can occur at all levels of a brand’s hierarchy. This is why brands will continue to stumble as they move toward compliance goals, and why the brands that solve these challenges (often via a dedicated compliance program) will likely be better shielded from legal pressure in the near future, and enjoy the benefits of increased consumer trust and the advantages that brings to the bottom line in the long term.
