In part 1, I wrote about the various concerns we as a society should have about the non-transparent use of big personal data. The main issue at hand is an invisible but increasingly invasive form of manipulation: personal profiling based on constant monitoring of our online behaviour. How do we make sure that we as individuals do not become slaves of marketing, and stay critical about what we perceive in a connected world?
One thing is certain. In a world where many different interests compete for attention in limited time, (attempted) manipulation will remain as long as we exist. It is inherent to human nature to want to affect others' behaviour, be it for affection, adoration, religion, business, political beliefs, cultural values, and so on. Likewise, we will not be able to stop the exploitation of big (personal) data to sell stories we want others to believe in, too. But, as with anything in life, actions can be undertaken in an ethical, responsible and environmentally sustainable way. In this post I will therefore elaborate on ideas for dealing with manipulation in the digital age, knowing that avoiding the digital world altogether is not a realistic solution in a developed society. Unless you retreat to a WiFi-free area and become a self-sufficient farmer, which is perfectly fine, too.
Manipulation: To skilfully control or utilize someone or something to serve your goal
Consulting the Cambridge English and Merriam-Webster English dictionaries for a definition of manipulation, I found each of them to contain the following aspects:
control;
a person or object to control;
advantage to oneself;
often, but not necessarily, disadvantage to the person or object of control;
skilfulness (deliberate "art").
To achieve power in this world, you need to master the art of manipulation. Be it to convince others of your ideas, your (product's) capabilities, or your (product's) added value. Grand masters of philosophy and strategy like Aristotle, Sun Tzu and Machiavelli wrote literary instructions on how to achieve various aspects of this. In other words, manipulation is as old as civilization and can be considered a cognitive technology whose valence depends on what we use it for.
Although there may be a negative connotation to the term, it is an important skill that any great leader expresses at least to some extent. Without it, I believe we would not have built the great inventions we know today.
For the topic of Privacy & Big Data, we focus on the manipulation of people through skilful use of people's data to serve the goal of the client. In GDPR terminology, the client could be either the Data Processor or the Data Controller. In everyday terms, this could be anyone like you and me with access to (big) personal data, used for instance to develop new machine learning models that predict sales for a product your company sells. Here, the goal would be to optimize the company's cost-revenue balance.
A blunt opinion on privacy and the use of big data would be to allow all uses of your personal data as long as they contribute to your own vitality. That is, using any of your data to more accurately predict your health status. Or any of your relevant data to make you learn faster or any of your relevant data to warn you about financial difficulties in the near future - to name a few examples. These ideas beg the larger question of whether or not we need organizations to monitor what products or services we need to acquire — and how soon we need to acquire them — to maintain our vitality based on our (personal) data.
This highly individualistic approach relies on the premise that each individual has sufficient knowledge and judgment to decide what use of their data they allow. That is why, typically for the GDPR, we need to consent individually to the cookies and privacy policies of each organization we encounter online. This can only work if every individual has the cognitive ability and stamina to really understand the impact of the policies at hand. One may question whether this is fair to assume for the majority of people.
Whether these decisions concern individual retirement savings plans, voting to leave or stay in the EU, accepting cookies that enable "personalized advertisements", choosing stocks and bonds to invest in, or allowing personal data to be shared with third parties: we are accumulating insights from the behavioural sciences showing that people are not purely rational decision-makers, certainly not all the time. Hence, we should not keep shifting more responsibility to individuals for such impactful decisions. It should be the responsibility of law- and policymakers to either protect individuals from decisions they are not equipped to make, or to ensure that all information required to take a decision is as clear and understandable as possible to a layperson.
Therefore, what I suggest are three major solution dimensions for dealing with issues of privacy and manipulation around the use of big (personal) data:
Educate & empower individuals with the right digital and cognitive tools
Legislative measures to hamper corporate manipulative practices
Data-driven systemic change to incentivize ethical and responsible behaviour
1. Educate & empower individuals with the right digital and cognitive tools
Like all behavioural changes, it starts and ends with the individual. The most important answer to digital manipulation by mass exploitation of big personal data therefore lies in equipping people with the right knowledge and tools to take informed decisions on the matter. Thus, here are three ways to empower yourself and others to manage your online privacy and to get your "return on data".
1. Educate: Make cognitive and computer science topics part of the standard primary & secondary school curricula
As we are becoming more and more reliant on computers in our daily lives, everyone should have a basic understanding of what a computer is, does and what it could do in the future. Particularly the new young generations need to be prepared for the manipulative potential of everything digital. Children already have access to a wealth of information through the internet by themselves. Thus, we urgently need a basic framework for everyone to understand what being online means and adapt our primary and secondary school curricula accordingly.
To do so, we need to draw more on insights from philosophy, psychology and informatics to understand the basis and consequences of our own and others' (online) behaviour, as well as the actual possibilities and risks of digital space. This does not mean we all need to become full stack developers. It means that we, including children, need a basic affinity with statistics, data analysis and programming. This knowledge can then be applied to individual cybersecurity measures, for instance in your browser settings to manage cookies and third-party data storage.
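To make the browser-settings idea concrete, here is a minimal, purely illustrative hardening sketch in Firefox's `user.js` format. The preference names are real Firefox settings (as documented for older Firefox versions; some have since been superseded by newer protection features), but the chosen values are my own suggestions, not official recommendations:

```javascript
// user.js - an illustrative privacy-hardening sketch for Firefox.
// Preference names are documented Firefox settings; values are suggestions.
user_pref("network.cookie.cookieBehavior", 1);        // 1 = block third-party cookies
user_pref("network.cookie.lifetimePolicy", 2);        // 2 = keep cookies only for the session
user_pref("privacy.donottrackheader.enabled", true);  // send the Do Not Track header
user_pref("privacy.trackingprotection.enabled", true); // enable built-in tracking protection
```

Placing such a `user.js` file in your Firefox profile folder applies these settings at every start-up, so your choices survive updates and accidental changes in the settings menu.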
A resourceful starting point from the European Interactive Digital Advertising Alliance (EDAA) is their satellite website called "Your Online Choices". Here you can find more in-depth information about privacy and online ads, including easy-to-understand explanations of what cookies and the different types of online advertisements are. It also contains a list of known companies that use online behavioural tracking data for targeted marketing, which you can disable to a certain extent via their website, as well as five easy tips to "help you manage your online privacy".
2. Enforce all personal data processors to return insights gained from your personal data & provide tools to manage your digital behaviour
For the past few years, Spotify has provided annual insights to all its registered users: how much time they spent listening to music, which artists and genres they listened to most, during which time of year, and so on. This is a beautiful example of returning insights from user data to the actual users. As a loyal Spotify subscriber, I have loved seeing my own results over the years, presented in a visually appealing and user-friendly way.
This type of personalized data visualization is a simple thing to do for any organization that processes data you generated. Think of annual trends in your healthcare spending provided by your health insurer, or insights into your diet through your grocery expenses. These types of information may now only be shared between organizations' data science teams and marketing or sales departments, whereas they could be at least as valuable to you and me as end users, for instance to see where we can cut costs or improve other aspects of our lives. Moreover, insights gained from personal data should not be treated as intellectual property belonging solely to the organization, since we as individuals provided the data in the first place.
Furthermore, individuals should be much better enabled to monitor their own online choices. Especially since the default web browser feature to disallow any webpage from tracking your online behaviour is gone. Instead, we now need to give explicit consent, per unique web domain, for what types of cookies we allow the web host to store.
I could not find official research statistics or estimates of how many unique websites an average person visits. A rough estimate based on online forum answers, though, would say an average person in the connected world visits about 30 unique web domains every month. The total number of consents you need to decide on every month can easily grow too cumbersome to manage. Thus, we need tools to track and visualize our own online behaviour and manage our digital decisions, as well as to track content changes in organizations' online policies. There are quite a few cookie tracker extensions available already. For example, if you are a Mozilla Firefox user, check out Cookie Quick Manager and Forget Me Not - Forget cookies & other data. However, they are not yet very extensive and, moreover, not built in by default.
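To see how quickly these consent decisions pile up, here is a back-of-the-envelope calculation. Both input figures are rough assumptions (the 30 domains from the forum-based guess above, the toggles-per-banner count is my own estimate), not measured statistics:

```python
# Back-of-the-envelope estimate of yearly cookie-consent decisions.
# Both constants are rough assumptions, not measured statistics.
NEW_DOMAINS_PER_MONTH = 30   # unique web domains visited per month (forum-based guess)
TOGGLES_PER_BANNER = 5       # consent categories per cookie banner (assumed)

banners_per_year = NEW_DOMAINS_PER_MONTH * 12
decisions_per_year = banners_per_year * TOGGLES_PER_BANNER

print(banners_per_year)    # 360 consent banners a year
print(decisions_per_year)  # 1800 individual yes/no decisions a year
```

Even under these conservative assumptions, that is well over a thousand privacy micro-decisions a year, which makes the case for tooling rather than manual management.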
3. Empower: Increase your own customer intelligence
Given that marketing and propaganda will not cease to exist, the best generic solution is to stay informed, critical and self-conscious about what you see or buy, and why. Whenever you see an interesting product in a targeted ad or in the physical world, ask yourself a very simple question: do I really need this? If the answer is yes, ask yourself whether the product's quality is really worth your money. If yes, are you 100% sure that is the case? If no, let it go. The same applies to political slogans or other belief statements. Ask yourself: why am I seeing this? Then check the source of the message and be critical about what these statements were based on.
Especially in times of mass commercialization, these simple questions are extremely important behavioural tricks to help you take better care of your wallet and to become more conscious of your own values when it comes to the social, economic and environmental impact of what we create. Here, governmental and non-governmental bodies should aid as much as possible in reducing the manipulative powers of marketing and propaganda.
When it comes to the ethical use of (personal) data, we can draw a parallel to the recent decline of fast fashion brands. You and I as consumers have the collective power to demand responsibly, ethically and sustainably produced goods. For instance, by not clicking on ads for products that we know do not align with our values; by no longer buying from fast fashion stores; or by refusing policies that enable third parties to steal our data. Through these collective efforts, we send a clear signal to the world that we will not tolerate the mass exploitation of our data for things that do not align with our values.
2. Legislative measures to hamper corporate manipulative practices
Behaviour is a constant interaction between our internal processes and the environment. We may not want to fully control where people move, but we can create environments that facilitate ethical and responsible behaviour as much as possible. Now that we have seen some solution directions at the individual level, this is where regulatory authorities come into play to facilitate more ethical use of big (personal) data and reduce the invasiveness of digital manipulation from a legal and policy perspective.
1. Introduce a "fair data use" label that reinforces ethical use of data
Akin to ISO certifications, energy labels and the Fairtrade quality mark, we need standardization of ethical data practices. Particularly now that data are increasingly regarded as commodities. This entails, amongst others, fairness checks on data representations, source validation of datasets used for instance in machine learning models, and compliance with regulations such as the GDPR. We could even say this "fair data use" label should align with basic human rights and the sustainable development goals (SDGs). Hence, I look to a governing body such as ISO, or even to audit firms, to audit organizations' data practices and put their ethical stamp of approval on them.
2. Use blockchain technology to validate authentic data quality and break open opaque data chains
To prevent further large-scale data trade without people's consent - similar to trade in complex financial products or opaque supply chains in fashion - we can apply blockchain technology to enforce transparency of data chains, prevent data leaks and validate changes in organizations' data policies. A paper published two years ago, worth reading, already discusses a similar idea for tracking ownership of and adjustments to data (termed provenance tracking). It shows the technical feasibility of using blockchain to this end (albeit some more fundamental issues with blockchain still need to be solved).
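The core mechanism behind provenance tracking is simpler than the word "blockchain" suggests: every event about a dataset is stamped with a hash that includes the hash of the previous event, so any later alteration of the history becomes detectable. A minimal sketch in Python (the event names and actors are made up for illustration; a real system would add signatures and distributed storage):

```python
import hashlib
import json

def record_hash(body: dict) -> str:
    """Deterministic SHA-256 over a canonical (sorted-keys) JSON encoding."""
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append_record(chain: list, event: str, actor: str) -> None:
    """Append a provenance event, linking it to the previous record's hash."""
    prev = chain[-1]["hash"] if chain else "genesis"
    body = {"event": event, "actor": actor, "prev": prev}
    chain.append({**body, "hash": record_hash(body)})

def verify(chain: list) -> bool:
    """Recompute every link; any tampering with past records breaks the chain."""
    prev = "genesis"
    for rec in chain:
        body = {"event": rec["event"], "actor": rec["actor"], "prev": rec["prev"]}
        if rec["prev"] != prev or rec["hash"] != record_hash(body):
            return False
        prev = rec["hash"]
    return True

chain: list = []
append_record(chain, "dataset created", "hospital A")           # hypothetical events
append_record(chain, "retention policy changed 5y -> 2y", "hospital A")
append_record(chain, "shared with research group B", "hospital A")
assert verify(chain)

chain[1]["event"] = "retention policy changed 5y -> 10y"  # tamper with history
assert not verify(chain)                                  # tampering is detected
```

The point of the sketch is the audit property: an organization cannot quietly rewrite what happened to your data or when its policy changed, because every subsequent hash would stop matching.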
3. Use default ratios of 25% "personalized" ads : 75% randomized items and 50% local organizations : 50% corporate ads
A simple regulatory measure for when you do not fill in any online preferences could be to set default ratios for how many of the advertisements presented on any web page may be personalized, and how many may come from large organizations versus local small and medium-sized organizations. This way, you prevent people from seeing the very same recommended content over and over again, while also stimulating the local economy.
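As a toy illustration of how such default quotas could be applied to a page's ad slots (the ratios are the proposal above; the function and its rounding choices are my own hypothetical sketch):

```python
def default_ad_mix(n_slots: int) -> dict:
    """Split a page's ad slots using the proposed defaults:
    25% personalized / 75% randomized, and 50% local SME / 50% corporate.
    Integer division rounds in favour of the randomized and corporate shares."""
    personalized = n_slots // 4           # 25% may be personalized
    randomized = n_slots - personalized   # remaining 75% are randomized items
    local = n_slots // 2                  # 50% reserved for local SMEs
    corporate = n_slots - local           # 50% left for large organizations
    return {"personalized": personalized, "randomized": randomized,
            "local": local, "corporate": corporate}

print(default_ad_mix(20))
# {'personalized': 5, 'randomized': 15, 'local': 10, 'corporate': 10}
```

In practice such a quota would be enforced by the ad exchange rather than the publisher, but the sketch shows how mechanically simple the default itself would be.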
3. Data-driven systemic change to incentivize ethical and responsible behaviour
The above six shields against digital manipulation, from an individual and a governance point of view, are relatively easy solutions that could be implemented in the near future. However, we can also think more fundamentally about changing the system itself to prevent unethical data usage and large-scale manipulation for purely commercial objectives in the first place.
1. Use big data to compute actual need for products & services
A basic question in economics is how to calculate the demand function for a product or service. Given the natural accumulation of real-world data on our economies, we can presumably predict much more accurately what the actual demand function for each type of product or service is. By combining knowledge of human psychology and the human body with the ability to join many more data sources about the physical world at an unprecedented scale, we should be able to estimate how much any individual marginally needs to live a comfortable life, and adapt supply to that. Based on this, we could theoretically cap how much of each type of product or service can be produced, to better balance actual physical resources and redistribute them to what people realistically require to stay vital. The target result: less overall waste across all industries and healthier people.
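To make the idea of "computing" a demand function slightly more tangible, here is the textbook starting point: fitting a linear demand curve q = a - b·p to observed price/quantity pairs with ordinary least squares. The data are made up and real demand estimation is far messier (endogeneity, many covariates), so treat this as a sketch of the principle only:

```python
def fit_linear_demand(prices, quantities):
    """Ordinary least squares fit of a linear demand curve q = a - b*p.

    Returns (a, b): intercept and (positive) price sensitivity.
    Pure Python, no libraries; toy illustration only.
    """
    n = len(prices)
    mean_p = sum(prices) / n
    mean_q = sum(quantities) / n
    cov = sum((p - mean_p) * (q - mean_q) for p, q in zip(prices, quantities))
    var = sum((p - mean_p) ** 2 for p in prices)
    slope = cov / var              # negative for a normal good: demand falls as price rises
    intercept = mean_q - slope * mean_p
    return intercept, -slope       # report b as a positive sensitivity

# Fabricated observations where demand falls as price rises.
prices = [1.0, 2.0, 3.0, 4.0]
quantities = [90.0, 70.0, 50.0, 30.0]
a, b = fit_linear_demand(prices, quantities)
print(a, b)  # 110.0 20.0  ->  estimated demand curve q = 110 - 20*p
```

The "big data" version of this exercise replaces four toy points with millions of observations and many more explanatory variables, but the underlying question - recovering how demanded quantity responds to price and circumstances - is the same.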
2. Adapt social and public policies to what we know from behavioural sciences
Fundamental to the current capitalist system is the intrinsic drive to capitalize on wealth and property. It is a system run on the presumption of a free market and rational economic market players. To what extent these presumptions reflect actual human behaviour is heavily questioned by growing evidence from fields like behavioural economics that people are not always rational economic decision-makers. Rather, our brains are biased and use heuristics to let us function in the world. I also recommend reading this reposted blog from Kerry-Ann Mendoza on why true free markets do not exist.
When I worked as a researcher in the pension strategy and marketing department of a large Dutch insurance company, I advised on various evidence-based solution directions to make people start planning their retirement savings in time. The major insight was that people's biases regarding retirement savings kept them from actually doing something about their retirement plans. What got people to take action was communicating the information in a completely different, more comprehensible and appealing way, taking into account people's myopia bias and behavioural attitudes.
With these new insights into people's flaws and vulnerabilities, we can systematically check assumptions about human (economic) behaviour and adapt policies to protect us from our own cognitive imperfections. For instance, encouraging environmentally friendly behaviour by providing every household with a complete toolkit to correctly separate waste. Equally effective would be a national "verified" web browser that ships with default privacy settings protecting people from unwanted data usage, and that only opens without warning those web pages carrying the "fair data use" label described in 2.1.
3. Adapt or design an alternative to capitalism
In an effort to create a system that encourages organizations to behave ethically and responsibly towards people and planet, we need to critically appraise the issues in today's capitalist systems. We then need to reinvent the system dynamics that currently grant irresponsible and unethical manipulative power to typically large, wealthy organizations. One way is to impede organizations' marketing exposure (except for negative exposure) when they behave inappropriately.
Currently, our manner of controlling organizations' behaviour is to punish unwanted actions through fines or other economic sanctions. What if we also encouraged ethical and responsible decisions through positive reinforcement in the short term? In a way, the Dutch national Climate Agreement provides a framework to do so, for instance through subsidies or governmental compensation for companies that invest in green energy or reduce their CO2 emissions. The same principle could be applied to reinforce ethical use of data. However, instead of direct monetary incentives, we could decide to restrict the marketing of unethical or irresponsible organizations via any channel. Or legally block these organizations from using any online behavioural data or digital platform.
Final remarks
I started writing this blog with the main goal of provoking your thoughts about the issues and opportunities around Privacy & Big Data. Rather than only sketching the paradoxes I see today - the ease of giving uninformed consent for instant gratification versus the long-term implications of doing so - I also wanted to come up with possible resolutions in this second part. Why? Because I feel that people's concerns about the topic are not yet met by satisfactory practical solutions.
Although I know some of you may still truly question why people like me make such a big deal out of privacy and the use of big data, what we all need to keep in mind is the sophistication with which we can be manipulated en masse, given recent developments in data processing technology (including AI). From the solution directions presented here, it should be clear that manipulative practices are of all times and cannot be eradicated. However, we as individuals, caregivers, legislators, societies, businesses and organizations can take a clear stance by implementing measures that protect independent thought.
To close off, here is a summary of the solution dimensions and ideas in this blog. I would love to hear your opinions in the comments or feel free to leave me a personal message. Until next time!