
On Privacy & Big Data Part 1: Manipulation at Scale calls for Critical Ethical Revival

Updated: Dec 11, 2019

I recently watched "The Age of Surveillance Capitalism", a documentary named after the book by American author and scholar Shoshana Zuboff, released in October last year. Its title is a provocative description of our era's paradox: data is the new gold, mined on the backs of people who naively give away the right to be monitored by commercial parties. The documentary discussed things I did not even know about. For instance, that Pokémon Go was actually a Google-founded "start-up" (disguised as Niantic Labs) designed to lure people to places of commercial interest by making sought-after Pokémon appear there.


There are numerous examples in the book of how organizations are building systems designed to constantly monitor and predict our behaviour. Whether the goal is purely commercial, for public safety or personal health: the more they know about you, the better.



I have been thinking about this topic a lot lately and therefore wanted to dedicate two blogs to Privacy & Big Data. Because at the very least, I think it is high time for us to stand up for our rights to freedom and privacy against corporate conglomerates that exploit our weaknesses and vulnerabilities to make us buy more. That is, to openly discuss how we want to deal with privacy and the possibilities of big data. Also, to battle cynicism and think of constructive solutions.


It starts with a common understanding of what (y)our data is used for. How could it be a problem to let Google track which webpages you visit, when and from where? How could it be a problem to let Facebook store what you talk about with friends? What are the risks if governments rely solely on data to ensure public safety? And then: what are potential solutions to manage this irreversible trend in an ethical, responsible way?


To structure my thoughts:

  • Part 1: "Manipulation at scale calls for critical ethical revival" will talk about the consequences of big data on a social and philosophical level.

  • Part 2: "How to deal with manipulation in the digital age" addresses what solutions people are already working on, and the additional solution directions I think are needed to deal with the new era of digital manipulation.

If you are ready... Let's go!


Definitions... What is this fuss about?


Privacy is both a legal and a psychological term. Let's first take a look at some different definitions of privacy.


Duhaime's law dictionary says:

"A person's right to control access to his or her personal information."

Whereas the Cambridge English dictionary says:

"[1] someone's right to keep their personal matters and relationships secret; [2] the state of being alone"

Both definitions treat the ability to control your personal information as a fundamental right, which makes the term primarily a legal one. Interestingly, the Cambridge dictionary adds "personal relationships" as well. To make this very practical right away: you have the right to prevent services like WhatsApp, Facebook and Google from gaining insight into your telephone contacts. However, Facebook ultimately gets to know your personal relationships anyway, based on which other people on Facebook you interact with most, whom you explicitly list as best friends or family members, and the people you share "moments" with. And if the people you are Facebook friends with share more information about themselves in their profiles than you do, Facebook still has a good proxy for what kind of person you are, too.
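To make the "proxy" idea concrete, here is a minimal, purely illustrative Python sketch. All names, numbers and the simple weighted-average rule are invented for illustration - this is not Facebook's actual method - but it shows how a user who shares nothing can still be profiled from the friends they interact with most:

```python
# Illustrative sketch of homophily-based inference: estimate attributes of a
# user who shares no profile data by averaging over their closest contacts.
# All profiles and interaction counts below are invented.

friend_profiles = {
    "alice": {"age": 29, "interest_in_sports": 0.9},
    "bob":   {"age": 31, "interest_in_sports": 0.7},
    "carol": {"age": 45, "interest_in_sports": 0.1},
}

# Interaction volume (likes, comments, messages) with the target user.
interactions = {"alice": 40, "bob": 35, "carol": 5}

def infer_profile(profiles, weights):
    """Weighted average of friends' attributes, weighted by interaction volume."""
    total = sum(weights.values())
    attrs = next(iter(profiles.values())).keys()
    return {
        attr: sum(profiles[f][attr] * w for f, w in weights.items()) / total
        for attr in attrs
    }

profile = infer_profile(friend_profiles, interactions)
print(profile)  # estimated age ~31, strong estimated interest in sports
```

The point is not the arithmetic but the asymmetry: refusing to share your own data barely matters once your social graph shares theirs.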


Now what turns out to be essential is that what we may classify as personal information or personal matters is not necessarily the most relevant part for predictive models that profile our digital identity. The question then becomes: what do we define as "personal"?


Via what is termed "rest data" (which Prof. Zuboff refers to as "behavioural surplus"), such as browsing history, time stamps, interaction patterns on web pages and click rates, organizations can still profile you based on your online behaviour. For instance, by applying natural language processing (NLP) techniques, it is possible to know which topics you are interested in, and when. Logs of what you search for on Google also indicate what you are concerned about or want to know more of. When you are online may even reveal whether you are more of a morning or a night person (corrected for your geographic location). The possibilities for profiling you as a potential user or customer are simply endless.
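To show how little is needed, here is a toy Python sketch on an invented activity log. It uses only timestamps and search queries - "rest data" - with crude keyword counting standing in for real NLP, yet already yields a morning/night signal and a topic profile:

```python
# Toy sketch: profiling from "rest data" alone (timestamps + search queries).
# The log, the topic keyword lists and the thresholds are all invented.
from collections import Counter
from datetime import datetime

# Hypothetical activity log: (local timestamp, search query)
log = [
    ("2019-12-01 07:15", "best running shoes"),
    ("2019-12-02 06:50", "marathon training plan"),
    ("2019-12-03 08:05", "mortgage calculator"),
    ("2019-12-04 07:30", "running injury knee"),
]

# Morning vs. night person: share of activity before noon.
hours = [datetime.strptime(ts, "%Y-%m-%d %H:%M").hour for ts, _ in log]
morning_share = sum(h < 12 for h in hours) / len(hours)

# Naive stand-in for NLP topic modelling: count keyword hits per topic.
topics = {
    "fitness": {"running", "marathon", "injury"},
    "finance": {"mortgage", "loan", "savings"},
}
hits = Counter()
for _, query in log:
    words = set(query.split())
    for topic, keywords in topics.items():
        hits[topic] += len(words & keywords)

print(morning_share)        # 1.0 -> all activity before noon: a morning person
print(hits.most_common())   # fitness clearly dominates finance
```

Four log lines already suggest an early riser who is training for a run and thinking about a mortgage. Real systems apply far richer models to millions of such lines.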


Unless you install workarounds that make it difficult for organizations like Google or Facebook to know when and where you are doing what via their medium, you are pretty much never really "alone". Basically, when you are online, Google is the Big Brother constantly watching you and using you to make money.

Secondly, the Cambridge dictionary offers a more psychological take on what privacy means: "the state of being alone". It is being by yourself, not bothered by anyone or anything else. When you use the toilet or take a shower, for instance, you usually want that to be a private occasion.


Now what happens when, in the future, we have smart toilets, smart homes or even smart cities based on sensor technology for the Internet of (Humans and) Things (Io(H)T)? Any organization with access to data about your movements may know how often you go to the toilet, whether it is a number one or a number two based on the duration of the visit, when and where you have sex, which people you meet where, what times you work, when you feel tired, and so on. There will simply be no privacy anymore. Even if your data were only processed by the government to serve public safety, would you be willing to sacrifice your privacy for the public good? What other factors would that depend on?


Why businesses want Big Personal Data: To know whom to target when and how


The rise of big data, platform business models and, with that, personalized services at scale may be great from a technological business standpoint. However, the insights and actions derived from big data usually do not enrich us as individuals in society. They stay with Google, Facebook and whatever third party your data is sold to. Data from potential customers (you and me) are sold B2B (business to business) at profit margins that we, the individual data sources, never see a share of.


Advanced analytics on big data about people (hence, Big Personal Data) allows for personal profiling. Thorough personal profiling means that whoever can see your data knows exactly what you like, what you dislike, your habits and insecurities, your strengths and weaknesses. It means being able to predict when you will be most vulnerable to buying certain products or most susceptible to reading certain messages.


You would be surprised how much information about you can be derived from just a few days of your online browsing behaviour. This hands power to the marketing and sales departments of typically large companies whose products and services are not even what you really need. But when you see their ads, with such compelling imagery, you almost cannot resist...


In defense of marketers and salespeople: when you have a (small) business and want to gain people's attention, it is extremely easy to use social media platforms like YouTube and Facebook to grow your business quickly. This works because these platforms are built on data that tells them exactly who and where your potential customers are. Have you ever wondered how they know this? Products and services that "help you provide better service to customers" amount to the same as "if you pay us a small fee for leveraging the data we have collected from people over the past decades, we will help you market whatever you want to promote". Be it your business, political motives or anything else you seek attention for: it comes down to the same principle, just with very different objectives.


Not all doom and gloom: Other applications of personal profiling


This capacity to build a personal profile of anyone who is connected to the internet, and even of the people physically surrounding that person, is a true technical accomplishment. As a psychologist, I can say this is interesting purely from a scientific point of view. For instance, can we use online behaviour as a proxy to better understand people's personalities? (I reckon the answer is most likely "yes".) Personal profiling is also needed for improvements in personalized healthcare, as well as for developing personalized education that adapts precisely to your learning curve and interests. It is then rather ironic that it has taken at least a decade for electronic health records to be fully implemented in Dutch and British hospitals. The level of discretion around patient data is a ginormous contrast to the ease with which people agree to let commercial organizations process their data.


Money blurs our moral compass


I will go on a (seemingly) little tangent here to make my point.


In recent years, we have seen various (inter)national collaborations to regulate privacy and the use of data, as well as separate initiatives to define principles for the related topic of ethical AI. This certainly shows high-level concern for these issues from important policymakers such as Marietje Schaake, a Dutch politician and Member of the European Parliament. That is absolutely great, though it also shows how we as societies and organizations are still figuring out how we want to deal with the new technological possibilities of using personal data.


For instance, the European General Data Protection Regulation (GDPR) seems to contain a few loopholes that need revision to further regulate the transparency of data chains and prevent data leakage. China is notorious for its governmental processing of citizens' data at a national scale, but restricts such processing for businesses. Moreover, the social credit system that punishes those exhibiting unwanted behaviour is now also being applied to businesses.


As non-Chinese citizens, we can debate whether it is right for all these data to become "state-owned". We can also certainly draw inspiration from China's approach of strictly differentiating data processing rights in favour of governmental organizations rather than businesses. Bear in mind that all this happened only in recent years, while the large-scale exploitation of big (personal) data for commerce in the West has been going on for at least ten years without proper regulations limiting it.


The bottom line is: as long as corporate data practices remain invisible and non-transparent, it is extremely difficult for anyone to decide what is right to do. Meanwhile, we keep agreeing to revised terms and conditions to use free online services from Google and social media, in return for letting them monitor our personal lives through our data. Quid pro quo, anyone?



Evidently, the problems around big (personal) data and privacy are not primarily of a technological nature. They are rather symptoms of our postmodern society's fixation on money and a diluted moral compass. It is today's capitalist system that seems to promote perverse instant gratification, mass consumerism and superficial materialism to fill a spiritual emptiness that a nihilist would call human nature.


In working toward any objective - be it money, power or love - it may sometimes be easier to do an unethical thing for short-term gain than to take long-term responsibility by deciding not to. Organizations still have the power to choose whether or not to exploit personal data from loyalty programs, A/B testing, cookies or other loopholes. The question here is: what moral values do they base that decision on?


Don't get me wrong: it is fine to want to make money. I am just principally opposed to any irresponsible entity that is blinded by profit margins and tolerates the obscuring of factual knowledge about what is and is not right for sustaining people and planet. Yet the power of large corporations goes further than just influencing our spending. Take the cases of Cambridge Analytica, corporate funding of US presidential rallies and Russian fake news accounts on social media, to name a few.


Ultimately, the essential issue of Privacy & Big Data is manipulation at scale, driven by both commercial and political power incentives.


Capitalizing on big personal data means manipulation at unprecedented scale


Manipulation has always existed and will continue to exist. The question is how to defend our independent thought when manipulation is at once so invasive and so invisible.


When we look around us in the "developed world", advertisements and brands are everywhere. On any device connected to the internet, on the material items we use, on the buses that pass us on our way to work, on our best friend's shopping bag. Companies are now capable of presenting products to you that they know you will like, at times when you are most vulnerable to their hawking. They know exactly when you are most likely to check your phone. When you talked about that nice blue suit you saw on a colleague the other day. How likely you are to experience a major life event soon, like getting married, being pregnant or buying a house. What your most likely political preferences are. How is it fair that they know all this from your data while you do not? They know, so they can capitalize on these "customer insights" and make you buy their products.


Wars have been fought over territorial power, political showdowns, religious beliefs and trade, and soon we may need one to fight over which moral values we want to align with. It seems we are again arriving at a historic crossroads that divides humans by philosophical truth.


Where throughout history humans have always worshipped supernatural entities - be it the pharaohs in Egypt, the Greek gods, Hindu gods or local forest gods - these days our main structures of power seem to be corporations rather than governments. Or so it seems, at least in modern democratic nations. We humans have a spiritual instinct that is not satisfied in a secular postmodern world dominated by global capitalism. Instead, we are becoming slaves to commerce, with money as our sole incentive. Yet even when they have it, it is still not enough for billionaires like Zara owner Amancio Ortega and the still-current POTUS Donald Trump to be transparent about paying their fair share of taxes. That is why, like secularization and the separation of church and state, I believe we need to oppose non-ethical commercial organizations and mass consumerism by finding a sustainable alternative. To put it in Agile Scrum terms: would this be another epic in the saga of Homo sapiens?



In conclusion so far...


After reading this blog, one may ask: "why am I, or why is the majority of society, not demonstrating for improved privacy rights?" Because people currently do not suffer any perceptible consequences (e.g. pain) from exposing their data. Why? Because of a lack of knowledge. Because of other priorities. It is an invisible, psychological game that paralyzes our sense of morality and spirituality, and whose long-term effects cannot easily be predicted.


On the bright side: I am glad to observe that people are slowly waking up to this important issue of our times. Knowing how extremely difficult it is to change anything in an existing system, what I can do is be vocal about my concerns, initiate relevant discussions with you and share my thoughts on what else we can do to battle (and conquer) manipulation. This will be central to part 2 of my Privacy & Big Data post.


Until then!
