Global update: Ground-breaking new EU law to regulate ad targeting and advertising transparency
The World Federation of Advertisers (WFA) reported that the European Union has agreed on a new law – the Digital Services Act (DSA), which aims to tackle illegal content online as well as drive responsibility and accountability in how online platforms moderate content.
The Council and European Parliament are expected to formally adopt the DSA in July 2022.
The DSA, which will apply to intermediary service providers including online platforms, also introduces measures to promote online advertising transparency including restricting the use of sensitive personal data for advertising purposes.
The new rules will apply to intermediary service providers, including online platforms such as social media, online marketplaces and search engines. However, the toughest requirements are reserved for so-called very large online platforms (VLOPs): those with at least 45 million monthly users in Europe. If the text is adopted and published in July, VLOPs could be designated as early as October 2022, with requirements to be met by February 2023. Requirements on other platforms might become applicable from January 2024.
The WFA provides a helpful summary of the main measures relevant to advertisers below:
Promoting online advertising transparency
- The DSA will require online platforms to ensure that consumers can easily identify ads (including influencer marketing) and obtain meaningful information about why they have been shown an ad and what targeting criteria were used to do so.
- The regulation will oblige very large online platforms to put in place public databases of all ads published on their websites over the last year. These databases will include information such as how many people were shown the ad, which groups of recipients the ad was intended to be displayed to and how many recipients were targeted in each EU member state. Although WFA had called for the repositories to be available only to public authorities and vetted researchers, the latest compromise language indicates that the ad repositories will remain publicly available and searchable.
Regulating targeted advertising based on personal data
- The DSA will ban online platforms from showing ads to minors which are targeted on the basis of their personal data. There is still a lack of clarity regarding how minors are defined and how platforms should proceed if they are unsure about whether a user is a minor.
- According to final reports, online platforms will also have to gather user consent before showing advertising which is targeted using sensitive personal data such as religion, health information, sexual orientation and trade union membership.
Prohibiting dark patterns
- Online platforms will be prohibited from using deceptive design techniques (so-called ‘dark patterns’) to manipulate users into making certain choices. The European Commission will provide further information as to which practices would be banned.
Ensuring privacy, safety and security by design for minors
- Online platforms will be required to put in place targeted measures to ensure a high level of privacy, safety and security by design for minors. The European Commission will likely be tasked with issuing guidance as to which measures are appropriate.
Tackling illegal content online
- Online platforms will be required to take steps to remove illegal content on their sites and be more transparent about their content moderation practices. Very large online platforms will also be required to carry out annual risk assessments of their services, put in place measures to mitigate risks, and be subject to independent audits. These measures will support advertisers’ brand safety efforts and reduce the risk that advertisements shown to European consumers appear next to, or inadvertently fund, illegal content.
- Online marketplaces will also be required to verify the identity of their traders before they can promote or sell via the platform. Although WFA called for this obligation to be extended so that all platforms are also required to verify the identity of all advertisers, the latest compromise language indicates that this suggestion was not taken up during negotiations.
The DSA sets an important global precedent as one of the world’s first laws introducing sweeping rules aimed at tackling illegal content online and new restrictions on ad targeting which go beyond existing EU data protection regulation. Similar laws are under consideration across the world, including in the UK, where regulators are currently exploring new rules to tackle illegal and harmful content in order to protect user safety online.
Australian election and policy directions
The two major Australian political parties have announced policies relevant to advertisers in the online space. The Labor party wants to: bring in tough new industry codes for banks, telecommunications providers, social media providers and government agencies to clearly define responsibilities for protecting consumers and businesses online; review the penalties for perpetrators and the remedies for consumers currently in place for online fraud, misleading conduct and deceptive practices; and ensure that technology platforms that profit from the sale of online advertising are made responsible for the prompt removal of scam advertising from their sites.
The Liberal Party has announced additional financial support for the eSafety Commissioner to further expand coordination with other regulatory and law enforcement agencies and make it easier to report online harms. It has also pledged further protection for children online by requiring tech companies to install enhanced parental controls on smartphone and tablet devices that are easier to find and activate (particularly when first setting up a device) and harder for kids to bypass, including website blockers and filters, app store permissions, screen time limits and parental access. According to the announcement, the government will regulate if industry does not act within 12 months.
Meanwhile, the review of the Australian Privacy Act, which includes the Exposure Draft of an Online Privacy Bill, is still ongoing. Read more about the AANA’s response in a recent article. The AANA’s submissions stress the importance of general location data (i.e. an area or postcode) for serving relevant advertising, and the need to allow targeted advertising in a privacy-safe way.
The AANA has also argued that the proposed privacy changes may risk consent fatigue as has happened in the EU, and any new privacy restrictions should not hamper an advertiser’s ability to track and audit their digital advertising spend to ensure they get what they pay for.