Privacy isn't about hiding things; it's about control over, and participation in, how information is processed, interpreted, and used.
Privacy is a plurality of related issues.
Unrestrained collection of data without relevant reference points can lead to errors of aggregation. Let's take a relatively harmless example: you purchase a book on cancer and, several hours later, a wig. The assumption would be that you have cancer and will undergo treatment for it. If this is true (and there are other, equally valid interpretations), perhaps you don't mind sharing the information - but you'd certainly want to control how and when and to whom you reveal it. And if it's not true, why should you have to deal with the misinterpretation of the data and the consequences of that misinterpretation? Perhaps you are preparing for the role of a cancer patient; perhaps you are purchasing the items for a friend. Perhaps you merely have an interest in the disease, and the wig is for a party. Perhaps they are each gifts for different people. Or any of a number of other equally probable scenarios. Now, consider actions that are equally innocent but carry far more sinister misinterpretations.
You see? This isn't about hiding things; it's about controlling what is shared - and when, and how, and with whom - so that ambiguities and misinterpretations are less likely.
When the government collects random data - lots of random data, with no context and no framing - it's even easier to misinterpret it according to the paranoia of the interpreters. This is compounded when the government collects it from its citizens without allowing the citizenry to access and correct errors, without the citizenry even being aware the data is being collected. This is a structural problem that leads to an imbalance of power: where the power should rest with the republic, it is concentrated in the hands of those who may have a vested interest in twisting that data as negatively as possible, if only to secure their jobs. Truth becomes irrelevant in the overwhelming face of misinterpreted and misrepresented data.
This leads to data exploitation and distortion.
Without limits or controls on data collection, to what secondary uses will this data be put? How long will it be stored? Who gets access to it? Who makes sure the data remains uncorrupted? Who makes sure the data is correct and free of errors? This is a huge issue in identity theft.
And in the case of errors and distortion - data can never reflect the entirety of a person. Because it must be stored in standardized formats, many details, relevant details, often get left out. Data collection is reductive by nature, and in the reduction, the picture becomes distorted. With distortion, errors creep in. It's worse than playing The Rumor Game. Let's use another harmless example. Suppose you buy a couple of books on the manufacture of methamphetamine. Without any further data, this can be interpreted in several ways - and government officials, fed for so long on a diet of terrorists and conspiracies and paranoia, will most likely choose the interpretation that says you are building a meth lab. The truth might be that you are a member of a neighborhood watch program and wanted hard facts to bring to a watch meeting, or that you were asked to teach neighbors what to look for. Or perhaps you are writing a novel or a poem. Or perhaps the books are a gift for a friend in medical school. Or you want to win a particularly tough trivia game. Or maybe you just like knowing things.
In all of those scenarios, you don't have anything to hide - at least, not if our government truly has our best interests at heart. But the government is hopped up on paranoia and righteousness, filled with the spirit of "Gov knows best" and unwilling to admit a mistake, and suddenly your innocent purchase places you in a world of hurt.
The "nothing to hide" argument is itself a form of denial. It views privacy in a very narrow, troublingly particular, deeply partial way. It looks for a "dead bodies" type of harm, insisting that privacy is invaded only if something deeply embarrassing or discrediting is revealed, particularly if that information is taken out of context, as it so often is. If the only way we recognize a privacy invasion is through blood and dead bodies, then the true problems of privacy invasion won't be recognized; indeed, they will be denied.
Privacy is rarely lost in one egregious swoop. It is threatened in bits and bytes, through the slow accretion of a series of minor revelations, where a small error or a false assumption creeps in and skews everything that follows - and even shadows earlier bits of data - until suddenly someone with "nothing to hide" is denied the right to fly, denied loans, placed on watch lists, put under deeper surveillance, their accounts frozen, and their "nothing to hide" lives are shattered. The trauma and expense of finding and correcting that little error, that small misinterpretation, can last years. And if the wrong sort of person accesses that huge database of information, you could be the victim of identity theft and all the problems that entails. We all know just how insecure those data banks are: they are always getting hacked, or some disgruntled former employee is releasing the information, or some employee is selling it to get rich. What if a stalker gains access to that data? Safety, sanity, health, and wealth are all compromised when we narrow our debate on privacy to just "nothing to hide".
It truly isn't about "nothing to hide"; it's about personal safety. Perhaps our government doesn't want to hurt us. That doesn't mean we can't be hurt inadvertently, by carelessness, or by individual, rather than governmental, intent.
The "nothing to hide" argument, in the end, is a harmful, short-sighted, smug one that fails abysmally to address the full depth and breadth of privacy concerns.