Nov 02

Why adtech and data protection don’t get along

The adtech industry has turned its focus from billboards, newspaper ads and TV commercials to the internet. What started off as innocent ads shown identically to every visitor of a page has grown into a behemoth of highly targeted ads that are auctioned off between hundreds, if not thousands, of companies within split seconds.

The process of selling and serving ads has become so complicated that it is a black box for most people. It is known as real-time bidding (RTB) and has been a hot topic in the tech industry over the past decade. It went from an exciting new frontier of marketing to a privacy conundrum that we haven’t yet figured out how to solve.

It is impressive how organisations can reach an exact audience with their ads, but the process itself has become so opaque that it presents numerous privacy obstacles that need to be tackled to be legally compliant (not to mention the ethical concerns…).

How the Real-Time Bidding process works

In very simplistic terms, the real-time bidding process works like this: while the website we are visiting is loading, the website publisher puts the ad space on that page up for auction, and it is bought by the highest-bidding advertiser.

But there is a lot more happening under the hood. The actual auction takes place at external advertising exchanges that connect with supply-side platforms (SSPs) and demand-side platforms (DSPs), which help publishers and advertisers exchange information about the ad space, the desired audience and you, the visitor. Data management platforms (DMPs) are used to analyse and combine data about us from various sources.
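As a rough illustration of the auction step (all DSP names and prices below are invented for this sketch, not real market data), the exchange can be thought of as running a second-price auction over the bids it receives:

```python
# Minimal sketch of an RTB auction at an ad exchange (hypothetical data).
# Each DSP submits a bid for the impression; in a classic second-price
# auction the highest bidder wins but pays the runner-up's bid.

def run_auction(bids):
    """bids: dict of {dsp_name: bid_in_eur}. Returns (winner, price_paid)."""
    if len(bids) < 2:
        raise ValueError("need at least two bids for a second-price auction")
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    _, second_price = ranked[1]
    return winner, second_price

winner, price = run_auction({"dsp_a": 2.40, "dsp_b": 3.10, "dsp_c": 1.75})
print(winner, price)  # dsp_b wins and pays the runner-up's bid of 2.40
```

Real exchanges layer far more on top of this (timeouts, floor prices, header bidding), but the core mechanism is this simple ranking, executed in milliseconds while the page loads.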

The data points analysed include, among many others, our:

  • unique ad tracking ID (an ID that identifies us across the web to advertisers)
  • online searches
  • pages visited
  • browsing behaviour on those pages (clicks, scrolls, session times)
  • activity on social media platforms
  • location
  • language
  • details about the devices we use

These data points are used to categorise us and are then compared with the audience the advertiser wants to reach. The more information the system has about us, the more suitable the ads it can serve. The price advertisers are willing to pay for the ad space we see will be higher the better we fit their audience criteria, since they assume the ad will then be closer to our interests and therefore more successful.
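To make that pricing logic concrete, here is a hypothetical sketch of how a DSP might score a visitor profile against an advertiser's target audience and scale its bid accordingly (all attributes, weights and prices are invented for illustration):

```python
# Hypothetical sketch: a DSP scores a visitor profile against targeting
# criteria and raises its bid the better the visitor matches.
# Attributes and numbers are invented, not any real platform's logic.

def audience_match(profile, target):
    """Fraction of targeting criteria the visitor profile satisfies."""
    hits = sum(1 for key, wanted in target.items() if profile.get(key) == wanted)
    return hits / len(target)

def bid_for(profile, target, base_bid):
    """Scale the base bid by how well the visitor fits the audience."""
    return round(base_bid * (0.5 + audience_match(profile, target)), 2)

profile = {"language": "de", "location": "Berlin", "interest": "cycling"}
target = {"language": "de", "location": "Berlin", "interest": "cooking"}
print(bid_for(profile, target, base_bid=2.00))  # 2/3 match -> bid of 2.33
```

The privacy problem is visible even in this toy version: the bid only improves because the DSP already holds a profile of the visitor to match against.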

Due to legislative pressure in recent years, an additional actor was added to the mix – consent management platforms (CMPs). Publishers use these platforms to ask for and manage user consent to ad tracking, and they are often connected to large frameworks such as IAB Europe’s Transparency and Consent Framework, which includes thousands of companies.

Adtech + data protection = issues

Let’s look at why ads nowadays are problematic in the eyes of the law. In the EU we have two main pieces of legislation that are concerned with this topic: The ePrivacy Directive (soon to be replaced by the ePrivacy Regulation) and the GDPR. The former deals with the confidentiality of communications and the rules regarding tracking and monitoring, while the latter is concerned with the processing of personal data by organisations generally.

Cookies and the issue of consent

The ad industry relies on cookies and similar technologies. The ePrivacy Directive tells us that consent is required for all non-essential cookies – i.e. cookies that are not strictly necessary for providing an online service or for carrying out or facilitating the transmission of communications over a network. Cookies used for ads are considered non-essential, which means consent must be obtained from the website visitor, typically through a cookie banner. Since the introduction of the GDPR, the standard for obtaining valid consent has been raised.

Regardless of whether cookies process anonymous data or personal data, it is always necessary to obtain valid consent to satisfy the ePrivacy Directive requirements.

In addition to the consent required by the ePrivacy Directive, the GDPR requires organisations to determine a suitable legal basis whenever personal data is processed. The definition of personal data is far-reaching and also covers non-personal data types that can be combined with other data to identify an individual. European supervisory authorities have determined that consent is the valid GDPR legal basis for processing personal data via marketing cookies. So even if a provider offers your organisation the option of relying on legitimate interest for the personal data obtained through ad tracking technologies, you should opt for consent instead. If you believe your organisation has good reasons to rely on legitimate interest, you should conduct a thorough legitimate interest assessment.
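In practice this boils down to a simple gate: personal data may only be processed for a purpose the user has specifically consented to, and the absence of a record must mean "no". A minimal sketch (the data model and purpose names here are assumptions for illustration, not any real CMP's API):

```python
# Sketch of a server-side consent gate (hypothetical data model, not a
# real CMP API): personal data is processed for a purpose only if the
# user gave consent for exactly that purpose via the cookie banner.

consent_records = {
    # user_id -> set of purposes the user consented to
    "user-123": {"analytics"},
    "user-456": {"analytics", "marketing"},
}

def may_process(user_id, purpose):
    """Consent must be specific: default to False when no record exists."""
    return purpose in consent_records.get(user_id, set())

def serve_ad(user_id):
    if may_process(user_id, "marketing"):
        return "targeted ad"     # valid consent for marketing cookies
    return "contextual ad"       # fall back to non-personalised content

print(serve_ad("user-123"))  # contextual ad (only consented to analytics)
print(serve_ad("user-456"))  # targeted ad
```

Note the fallback path: refusing consent must not break the service, so the compliant default is a non-personalised ad, not no ad at all.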

Transparency and information to end-users

One of the main issues with the current ad serving process is the lack of transparency. The process behind serving ads has become so complex that it is hard to strike the right balance between conveying a good level of understanding to affected users and getting lost in technical details. Oversimplifying a process that involves so many different parties and systems doesn’t do justice to all the processing that occurs behind the scenes, and it risks giving individuals a picture that bears little resemblance to what actually happens with their data.

Even for most companies in the ecosystem it is unclear how the whole process works, which makes it incredibly hard for individuals to get a clear picture of which of their data is collected, how it is combined with data that other companies along the chain might hold about them, and how it is auctioned off to thousands of advertisers.

Many voices (including the ICO) consider this a tough target to reach and push for the ecosystem to be changed and simplified. The consequences of the data processing behind ads are largely unforeseeable for individuals, since it involves complex technology and a vast number of third parties.

Processing of special categories of personal data

Another major issue with the current ad system is that it often involves the processing of what Art. 9 GDPR defines as special categories of personal data (e.g. information about health, race, religion or sexual orientation). This data is often combined with ‘regular’ personal data to segment individuals further.

Since processing special category data implies more severe risks to individuals’ rights and freedoms, it enjoys special protection and is subject to stricter rules on when and how it can be processed. In addition to determining a legal basis under Art. 6 GDPR, one of the special conditions in Art. 9 GDPR must be met. The only condition seen as capable of lifting the general prohibition on processing this data for the purpose of serving ads is explicit consent (an enhanced version of consent). Current consent management frameworks don’t fulfil this high standard and therefore cannot be relied on for processing special category data.

Risk assessments

While we should always carry out some form of risk assessment before processing personal data, the EDPB has provided a list of criteria that indicate whether a processing activity is likely to result in a high risk to individuals’ rights and freedoms. If that is the case, the controller is required to conduct a more extensive risk assessment, known as a Data Protection Impact Assessment (DPIA). Introducing real-time bidding requires a DPIA since several criteria are present, such as the fact that it is a new technology involving the large-scale processing of personal data, including special category data used for profiling individuals.

Other issues that trigger the need for a DPIA are that the processing has no safeguards in place to protect vulnerable individuals and children, and that it mostly occurs in a hidden way, since the transparency principle often cannot be fulfilled adequately (as described above).

So before we start using RTB, we should have a thorough look at the setup, consider the rights and freedoms of individuals, and see what measures we can put in place to mitigate the concerns.

Data minimisation

It shouldn’t come as a surprise that when we collect data and combine it with a lot more data from various sources, we end up with so much data that it becomes questionable whether the principle of data minimisation is complied with. This data protection principle requires us to process the least amount of personal data needed for the purpose of the processing. Serving highly individualised ads to make them more successful inherently implies having a good picture of the person being targeted. In this sense, the desired outcome is clearly at odds with a person’s privacy and with the rights and freedoms granted by the spirit of data protection laws.

What can be done?

There are a couple of initiatives that try to untangle this complex system and make it more privacy-friendly. For one, most major browsers have introduced mechanisms to block third-party cookies, which are commonly used for ad tracking. The most widely used browser, Chrome, announced it would follow suit in 2022. This might sound like a great success; however, cookies are an ageing technology that companies have been moving away from for years, and more advanced tracking methods have taken their place.

It is clear that publishers need to make a living, and ads are a great way to earn money for content that is often offered free of charge. Merely blocking one technology after another at the browser level cannot be the (only) answer to fixing this broken system. More drastic changes are needed.

The most prominent example of a reform comes from Google itself. As the biggest player in the ad market, and one that has already received fines for a lack of GDPR compliance, it wants to develop the replacement for the current model before someone else does. Its first proof of concept is called FLoC and is part of the Chrome Privacy Sandbox. FLoC is centred on the idea that the segmentation of users into audiences happens in the local browser instead of on companies’ servers. The audience segments need to include a minimum number of people to avoid possible identification; if a so-called ‘cohort’ drops below this threshold, it is combined with similar cohorts. The idea of anonymising users before their information leaves the browser is a good one in principle. Nonetheless, it has received a lot of backlash over flaws concerning further monopolisation of the ad market and the risk of companies combining cohort information with further data (as is done today), leading to the re-identification of individuals. Following this backlash, Google announced the end of the trial and that it would go back to the drawing board.
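The cohort-size safeguard mentioned above can be sketched roughly like this (the threshold, data and merge strategy are simplifications for illustration, not Google's actual algorithm):

```python
# Rough sketch of FLoC-style cohort thresholding (not Google's actual
# algorithm): cohorts smaller than a minimum size are merged into a
# pre-computed similar cohort so that no small group can be singled out.

MIN_COHORT_SIZE = 3  # illustrative; real deployments used far larger minimums

def merge_small_cohorts(cohorts, similar):
    """cohorts: {cohort_id: set_of_users}; similar: {small_id: neighbour_id}."""
    merged = {cid: set(users) for cid, users in cohorts.items()}
    for cid in list(merged):
        if len(merged[cid]) < MIN_COHORT_SIZE:
            neighbour = similar[cid]            # nearest larger cohort
            merged[neighbour] |= merged.pop(cid)
    return merged

cohorts = {"c1": {"u1", "u2"}, "c2": {"u3", "u4", "u5", "u6"}}
result = merge_small_cohorts(cohorts, similar={"c1": "c2"})
print(sorted(len(users) for users in result.values()))  # [6]
```

The criticism described above applies even to this toy version: a cohort ID that is safe in isolation can still become identifying once it is joined with other data a company already holds about the user.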

Looking forward

Ads have become omnipresent on the web, and they subconsciously influence the decisions of billions of people on a daily basis. But despite their importance in the digital ecosystem, we’re just beginning to see the reform required to keep up with people’s expectations around privacy. FLoC is only one example of a possible way forward, but all of these initiatives are still in their infancy and will require a lot of time and effort to replace the status quo. Until then we can only try to do our best with what we have today… or turn our backs on real-time bidding and return to a simpler way of serving non-targeted ads.
