Privacy vs. Transparency

One can easily imagine that there might be tension between the interests of transparency and personal privacy.
  • Where do the two overlap or interfere?
  • Are these interests really in competition with each other, and if so, is it a winner-take-all prospect, or are there solutions that maximize the interests of each?
  • Is there a definable boundary between the two?

What Privacy Wants

Privacy wants to be free. That is, the interests of privacy seem to me to be:
- no transparency about what someone is doing or exactly when
- no transparency about the condition of someone’s body or what they put into it
- no transparency about someone’s thoughts or feelings, except what they choose to share
- no transparency about someone’s location
- no transparency about someone’s beliefs about anything that cannot be known as fact
- no transparency about someone’s personal preferences in products or services or media or information sources
- no transparency about sexual orientation
- no transparency about one’s “triggers” or “hot buttons” or “pet peeves” or other vulnerabilities
- no transparency about one’s values
- no transparency about one’s spiritual nature

Sufficient privacy would certainly allow someone to commit a crime unnoticed if that was their goal. That should not be construed as a reason to oppose privacy, any more than the fact that a weapon can be used to harm someone should be construed as a reason to oppose the personal ownership of weapons. When you consider how many things could be used as a weapon, reductio ad absurdum suggests that the opportunity to commit a crime in private is not a problem with privacy; it is a problem with people’s choices.

Privacy wants to change its mind freely.

Privacy wants to give itself up only on its own terms. It cannot be taken alive. Privacy wants to feel safe, and when it doesn’t, we hurt inside. We might not even understand why, but even small children understand privacy and know when it has been violated, even if they don’t understand when they violate the privacy of others.

What Transparency Wants

Transparency wants to know everything that can be known. That is, the interests of transparency seem to me to be:
- sufficient information to understand what was done, when and by whom
- sufficient information to understand the basis of decisions and the values behind them to the extent practical or possible or allowed
- sufficient information to assess whether assertions made are true
- sufficient information to understand whether changes made actually happened
- sufficient information to assess whether the agreed-upon process was followed
- sufficient information to understand whether a change yielded the desired results
- sufficient information to understand whether a change also had undesired results, and what they were
- sufficient information to permit a root-cause analysis to begin

Sufficient transparency would make it practically impossible to commit a crime unnoticed or anonymously. That should not be construed as the reason for transparency any more than creating an opportunity to see the public naked is the reason for airport body scanners.

Transparency just wants to know. Transparency doesn’t judge, but people do sometimes. In fact, transparency lays bare the details of our actions and choices and makes it easy to judge us for them. If we are to be judged, let it at least be for what we actually are and not for people’s misconceptions of us.

Transparency doesn’t understand boundaries. It believes all information wants to be free.

The Boundary between Privacy and Transparency

First, transparency is nothing without an observer. That is, if the information can never or will never be seen or considered, it isn’t really transparency. It is unobserved information and it has no meaning until it is observed.

Without an observer, privacy reigns supreme.

So, the observer must exist, and like all individuals, observers have their own interests, responsibilities, vulnerabilities, and so on. There must be limits to transparency because without them there can be no privacy. And yet, the needs of the observer can only be satisfied by sufficient transparency. So, what is sufficient, and what is an unnecessary intrusion on privacy?

I believe there is a boundary between privacy and transparency. I would describe the boundary like this:
- transparency ends where the information it would provide isn’t material to the actions or choices being observed
- privacy begins with a person’s preferences, feelings and thoughts, as long as they are not asserted by that person as a cause or reason for actions or choices
- privacy always reigns in matters of belief and spirituality. It is as silly to measure the unknowable by the rules of what can be known as it is to measure what is known by the rules of the unknowable.
- privacy always reigns in matters of personal sexuality, so long as nobody is directly harmed and the sexuality is consensual
- privacy always reigns in matters of personal choice about one’s body, so long as those choices do not interfere with others’ ability to use shared areas or facilities

It comes down to the goals of transparency in any given instance. If the goals require people to give up personal privacy, the goals are questionable. If the goals do not require it, then there should be little conflict between transparency and privacy.

Transparency is a mechanism: a tool for understanding the actions of people and systems.

People’s goals might be subjective, irrelevant or important to success. They might be appropriate or not. They might be anything, but using transparency as a mechanism says nothing at all about the virtue of the goals underlying that transparency. One can use masterful techniques for transparency for all the wrong reasons and cause terrible harm.

So, to break the conflict between privacy and transparency, one must investigate the goals behind transparency, using transparency. Poetic, no?

A thorough exploration of the goals of transparency goes a long way toward defining its limits and establishing the boundary beyond which lies privacy.

Why Do You Need to Know That?

If transparency could be characterized as a list of questions, the interests of privacy are met by challenging each item with the question:

Why do you need to know that?


There either is or isn’t a good reason to need to know that, or else the goodness or badness of the reason is not an assessment shared by the people discussing it. Of course, power was never a reason to think someone is right. The ability to enforce a decision has nothing whatsoever to do with the merits of that decision. Too often power corrupts, but it is also true that there is no fate: people can make good choices. We are not consigned always to be self-interested.

If someone doesn’t have a good reason to need information, we must err on the side of privacy. To do otherwise would be to allow the willful ignorance of one party to entirely strip away the privacy of the other. People have an intrinsic right to privacy unless there is an adequate reason they should not. Where this is not true, privacy becomes a privilege.

Nothing strikes at the ambiguity or conflict between transparency and privacy so much as understanding the motivations for transparency and challenging any that aren’t valid. But, that is only the beginning.

Just as transparency doesn’t bless the goals behind it, blessed goals do not intrinsically bless the methods by which the information is collected. Imagine giving a blood sample with every code checkin. There must be a less egregious way to authenticate the person checking in changes than a drop of their blood.

It is not sufficient that a given goal is deemed valid. The information collected for that goal must be material to the goal and the manner of collecting the information should make it easy to provide. Compliance should not be an issue or it works against everyone’s interests, consumes time unnecessarily and reduces the quality of the information collected. It is in everyone’s interest to make the mechanism of transparency as simple as possible. It is sometimes possible for it to be completely automatic, such as when using a wiki or SCM system. This is the ideal: compliance requires no effort.
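
To make the point about zero-effort compliance concrete, here is a minimal sketch of how an SCM already records who did what and when as a byproduct of normal work. The sample log text and pipe-delimited field layout below are invented for illustration; in real use the same records could come from something like `git log --pretty=format:'%an|%ad|%s'`.

```python
# Hypothetical sketch: an SCM records who changed what, and when,
# as a side effect of ordinary work, so this form of transparency
# costs contributors no extra compliance effort.
# The sample below stands in for output of a command such as
# `git log --pretty=format:'%an|%ad|%s'` (illustrative data only).
sample_log = """\
alice|2024-05-01|Fix null check in parser
bob|2024-05-02|Add retry logic to uploader
alice|2024-05-03|Document the retry policy"""

def parse_log(text):
    """Turn pipe-delimited log lines into (author, date, subject) records."""
    records = []
    for line in text.splitlines():
        # maxsplit=2 keeps any '|' characters in the subject intact
        author, date, subject = line.split("|", 2)
        records.append({"author": author, "date": date, "subject": subject})
    return records

# "Who did what, and when" -- gathered with no effort from anyone.
for r in parse_log(sample_log):
    print(f"{r['date']}: {r['author']} - {r['subject']}")
```

The design point is that the data here is material to the work itself, so collecting it imposes no burden and raises no compliance question.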

When compliance requires effort, you can be fairly sure that the observer doesn’t bear this extra effort.

Sometimes resistance to transparency is directed more at the mechanisms themselves than at the information collected.

If the goal is right and the mechanism isn’t a burden, privacy and transparency do not compete. They need not even overlap, and if they do by accident it is abundantly clear in which direction the transgression has been made.


More than Coexistence

Transparency and privacy need to do more than coexist separated by a demilitarized zone. They must be good neighbors and must each help maintain the fences between them. Encroachment by one should not be considered a victory, because any variance from a healthy balance weakens both, despite any sense of short-term gains. The reason is that when privacy and transparency are in balance people are least affected as they do their work or live their lives, and the goals of transparency are met. When privacy or transparency is unsatisfied, some of the energy of workers and observers is consumed in something other than working or observing and using what was observed. That time might feel justified when it is spent; however, we cannot overlook the fact that it is time spent doing something other than working on the product or service being delivered. Arguing about how to make widgets doesn’t itself make any widgets.

Transparency and privacy are the check and balance that keep each other healthy. Any boundary issue can be considered in terms of their needs to help break a complicated situation into smaller considerations that are easier to understand and compare. A change in one precipitates a change in the other often enough that they should be considered together. The relationship between the two can be healthy or not. In an unhealthy relationship, fear or unshared interests distorts the needs on one side at the expense of the other.

It is easier to damage privacy than to defend it. Therefore, the burden of proof should lie with anyone seeking to damage the privacy of others. Their reasons must be fair and reasonable. That is how privacy keeps transparency in check.

Healthy transparency keeps privacy in check by allowing the assessments of the merits of submitted work to be made on fairly collected, objective data. In other words, it creates a social contract wherein someone agrees to the manner and means by which their performance will be judged, and there is a reasonable belief that only the agreed upon manner and means shall be used in that assessment. If someone is instead judged on other grounds that aren’t material, they have every right to feel misjudged. The criteria for evaluation should be clear and consistently understood by all parties.

Without some means to assess performance, the law of the jungle fills the void that would otherwise be filled by rational evaluation of the performance of individuals. Without some means to assess performance, the Peter Principle is sure to play out painfully. Survival becomes a race to the bottom for such a company.

If we can agree that some means to assess performance is appropriate, and if we can agree on the terms, then healthy transparency sufficient to gain the data needed to assess those performance goals cannot really be perceived as an invasion of privacy. If it is, then there cannot have been agreement on the terms of personal evaluation, and they should be renegotiated.

If we agree that the transparency is not perceived as an invasion of privacy, I think we have reached a quiet place that should be savored.

Objective Performance Evaluation

The goal is that the transparency sufficient to gain the data needed to assess performance against agreed-upon measures is not perceived as an invasion of privacy.

In that rarefied place, data is willingly provided by participants. It is more accurate, and its use is valued by both the providers and the observers of the information.

The only way to reach that place is to negotiate the terms of evaluation openly and fairly and to collect only as much information of relevant kinds as is necessary to assess performance.

The social contract around personal evaluation is the core boundary layer between transparency and privacy, and crossing it is a serious matter. Nowhere can more damage be done so swiftly as in the speedy misjudgment of another. Transparency reduces the chances of misjudgment, but can’t eliminate it. The fewer means of evaluating performance, the more likely that subjective evaluations will affect the outcome. The more independent means of evaluating performance, the less likely that subjective evaluations will affect the outcome. Of course, there are diminishing returns. Evaluating someone by rating them on one hundred different performance measures indicates mainly that nobody wants to take responsibility for making decisions or evaluating performance. On the other hand, having only three performance goals leaves a lot to be inferred about how the rest of the person’s time was spent and whether it was well-spent or well-appreciated.

The social contract for performance assessment is a manifestation of the boundary between privacy and transparency. If it is an unhealthy social contract, you can expect either privacy or transparency to suffer. If there is no such social contract then privacy may have no champion and transparency may be a predator.

In the end, mutual success should be the goal of all the members of a company. If they don’t share that goal, there is clearly something wrong. The goal requires mutual trust, and an invasion of privacy undermines trust quickly. In other words, collecting more data than an organization needs can lead people to feel disenfranchised and less invested in its success. It doesn’t matter that the goal of collecting the information was to increase the success of the company. If it makes people care less about success, it seems more dangerous to collect the data than not to.

The data may not be the problem. It may just be the manner in which it is collected. Disambiguating the two helps. Ask people how they would like to provide the data, to determine whether they object to the means or to the measures. If it is the measures, reconsider the goals. If it is the means, consider adopting different means if you value the data being collected.

If you value the data being collected, don’t you also care whether people care enough to enter accurate data? It is impossible to legislate willingness to participate constructively, and there are countless ways data can be inaccurate. Willing participation in the data’s collection is an important factor in the accuracy of the data, unless its collection is completely automated.

Therefore, I believe that honest, open negotiation about performance measures, and about the transparency in place to collect the data upon which those assessments are made, defines a healthy balance between transparency and privacy.

Performance reviews are, of course, just a form of judgment. We hope the judgment is fair and good and based on objective criteria.

On the face of it, transparency, privacy and judgment seem unrelated. But their interdependence guarantees that they behave as a system, and to master a system, one must understand its parts. If you do not understand them, you will be surprised, and will suffer, when they do not behave as expected. Therefore, to understand transparency, privacy or judgment, you must also understand the others, because they exist in relation to each other.

Neither transparency, privacy nor judgment should be allowed to be a tyrant.