Insights

Data Responsibility: The Difficulties of Achieving Transparency

Author: Joanna van der Merwe, Privacy and Protection Lead

It is no secret that in the realm of data responsibility, there is much debate about transparency and how to achieve it. What does true transparency look like? How do you communicate about data to ensure data subjects understand what is happening? And in the end, how does this relate to the legal obligations around informed consent, transparency and the ability of users to protect their privacy?

At the Centre for Innovation, we have been exploring this subject specifically in relation to privacy policies. Whether you rely on “legitimate interest” or “informed consent” as the basis for processing personal data on websites, mailing lists, projects, etc., a privacy policy is a foundational document that provides transparency for the user/data subject.

Privacy policies provide insight and clarity into what data is being collected, for what purpose, under what legal justification, and who can be contacted regarding this data. The only problem is that these privacy policies are usually written in language accessible only to highly skilled experts in this area and/or lawyers and regulators, which calls into question whether they actually help achieve transparency.

In 2018, the BBC explored the difficulties of simply reading the privacy policies of major social media sites. They found that to understand the text of the policies, you had to have at least a university-level education, not to mention the length of time it would take to read them even with that education. This is not unique to social media; education technology companies face similar challenges. In a study conducted by Common Sense Media, the majority of edtech companies’ privacy policies and terms of service documents scored above the average adult reading level on various readability tests.
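
To give a sense of what such readability tests measure, the short Python sketch below estimates the Flesch-Kincaid grade level (one widely used test) of a piece of policy text. The syllable counter is a rough heuristic and the sample sentence is hypothetical, so treat the output as indicative only, not as a reproduction of the studies cited above.

```python
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: count groups of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    # Split the text into sentences and words with simple regexes.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Standard Flesch-Kincaid grade-level formula.
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

# Hypothetical policy-style sentence, for illustration only.
sample = ("We may share your personal information with affiliates, "
          "service providers and other third parties as described herein.")
print(f"Approximate US grade level: {flesch_kincaid_grade(sample):.1f}")
```

A score above roughly 12 corresponds to text that requires more than a high-school education to read comfortably, which is the kind of threshold the studies above report most policies exceeding.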

That is not to say that companies are not exploring ways to address this challenge. Twitter has gamified its privacy policy, whilst Facebook has provided a simplified version that includes short explainer videos. However, as Geoffrey A. Fowler highlights, this still takes time, and understandability is no longer the main challenge we face regarding privacy policies. It matters little how understandable any single policy is when truly understanding everything about your data as a user would mean reading hundreds of documents (or playing games, or watching videos) for every app and website you use, not to mention all their sub-processors. If you would like to check how much time you would need to dedicate to this, Fowler conveniently provides a tool for the calculation.
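
As a back-of-the-envelope illustration of the scale of that time commitment (this is not Fowler's actual tool, and the word counts and reading speed below are assumptions), a few lines of Python make the point:

```python
# Assumed average reading speed, in words per minute.
WORDS_PER_MINUTE = 240

def reading_time_hours(policy_word_counts: list[int]) -> float:
    """Total hours needed to read every listed policy once."""
    return sum(policy_word_counts) / WORDS_PER_MINUTE / 60

# Hypothetical example: 50 apps/websites, each with an 8,000-word policy.
print(f"{reading_time_hours([8000] * 50):.1f} hours")  # roughly 28 hours
```

Even under these modest assumptions, reading every policy once amounts to several full working days, before any sub-processor documents are counted.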

At the Centre for Innovation, we provide a “condensed” version of our privacy policy focussed on ensuring accessible language and an “extended” version written in “legalese”. This has helped us achieve a higher level of transparency and accessibility, both in terms of language and the time needed to read the documents. However, the main foundation of our ability to protect data subjects’ rights is data minimisation: because we constantly aim to collect only the bare minimum of data we need to continue our work, we do not have to include lengthy policies. For social media companies and others providing apps, user-friendly interfaces that allow users to easily find and control data collection are a solution that needs to be added on top of simplified policies.

However, to truly address the issues around users controlling how much and what data is collected about them, the fundamental business models need to be challenged. Twitter, Meta (which owns Facebook, WhatsApp and Instagram), Google and others make their money from user data whilst their services are free. They make that money through targeted advertising, which depends on their ability to build granular profiles of their users, profiles that are then matched with the audience wishes of companies paying these data behemoths for advertisement placements. The more data they have, the better the user profiles and the better the advertising service they can provide. Furthermore, much of the focus is on the Big Tech companies who collect the data, but there is also a need to shine a light on the selling and purchasing of data by data brokers in the United States.

Adding to this, the market dominance of these companies makes it difficult for a small organisation or company to avoid using them or to establish an alternative business model. Big Tech is often key to these small businesses’ success, whether they are looking to sell items on Facebook Marketplace or simply to understand the visitors to their website (via Google Analytics).

Without challenging these business models and exploring other options, transparent privacy policies and terms of service do little to achieve the true aim: giving users the ability to decide what level of privacy they wish to have in the digital world. And as digital technologies become more entrenched and new technologies emerge that can collect new forms of data, finding a solution is becoming increasingly urgent.