Apr 10 2024
Kali Baldino and Zach Moore

Key Takeaways for Digital Advertisers From the IAB’s Public Policy & Legal Summit


In recent years, the digital advertising industry has come face to face with a barrage of new policies and regulations. With concerns mounting over data privacy, consumer protection, and AI, new laws have sprung up at a variety of levels—from state to federal to global. This regulatory frenzy underscores a complex balancing act between commercial interests and consumers’ needs.

Consumers accept ads’ presence in our digital ecosystem, with 95% saying they would prefer ads to paying higher costs for an ad-free online experience. A further 88% say they want ads that are personalized to their interests and needs—personalization that is largely dependent on the personal information that consumers do (or do not) share.

That said, consumers and regulators alike still want greater transparency and consent around data collection and usage, a desire rooted in deep distrust of companies’ data practices: 81% of Americans say they are concerned about how companies use the data collected about them, and 67% admit they have little to no understanding of what companies do with that data once they collect it.

Advertisers, then, are faced with a daunting task: Prioritize consumer privacy and adapt to new and ever-evolving regulation, while simultaneously delivering personalized digital advertising experiences that resonate with audiences.

This challenge was at the forefront of conversations at the IAB’s recent Public Policy & Legal Summit, where industry leaders explored this juxtaposition and shared critical considerations for advertising teams. Whether in discussions around bias in AI, presentations on state-specific privacy legislation, or conversations on how to address kids’ safety online, navigating this complex landscape demands not only attention, but conscious action.

Consumer data regulation is picking up

While federal regulation has remained largely in limbo, there’s been a flurry of enacted legislation at the state level. In 2023, new consumer data privacy acts took effect in California, Connecticut, Colorado, Utah, and Virginia. By early 2026, comprehensive state privacy laws will be in effect in 14 states.

In the absence of a federal framework, advertising teams are left with a patchwork of regulation—and one that varies significantly from state to state. These laws range from relatively baseline (for instance, Virginia’s VCDPA) to enhanced (like the Colorado Privacy Act) to business-friendly (such as the Utah Consumer Privacy Act).

With such significant variation, a one-size-fits-all approach will not suffice. Many advertisers have attempted to identify the strictest regulation—namely, California’s CCPA and CPRA—and simply adhere to that in the hope it will cover all their bases. However, regulation is evolving so rapidly—and there are so many different types of consumer data—that trying to find and adopt the “strictest” laws will likely hinder teams and create unnecessary self-imposed limits. Instead, the organizations that prioritize flexibility, bolster their legal and technical teams, and take the time to truly understand these different policies and regulations will be best positioned for success.

AI regulation is in the limelight as it continues to evolve

Generative AI was, unsurprisingly, another major topic of conversation. The technology has garnered significant attention since its public debut in late 2022, generating considerable buzz within the digital advertising ecosystem. However, its oversight and regulation pose challenges, given the rapid pace at which this technology is evolving and becoming accessible.

Though the US has yet to enact widespread laws governing its use, President Biden signed an executive order in late 2023 aimed at addressing the “safe, secure, and trustworthy development and use of Artificial Intelligence.” Additionally, the House introduced a bill that would create a commission to spearhead AI regulation.

Despite the lack of codified regulations, FTC leaders shared a few primary focus areas that should be top of mind for advertising leaders as they navigate AI. First, they encouraged teams to conduct AI-focused risk assessments and to ask their vendors to do the same, so that they can evaluate and mitigate potential risks around privacy, security, and bias. They also flagged that advertisers need to be particularly attuned to the risk of bias in AI, since the data and content these models are trained on are generated by humans—and humans, inherently, have biases. Though these tools can prove useful across many aspects of digital advertising, it’s crucial that they be consistently and critically evaluated.

As the technology continues to develop, regulators are certain to prioritize its oversight to ensure that AI is employed in ways that protect human safety and prioritize trust.

Data regulation goes beyond privacy

Regulation of the advertising industry appears to be focused on simultaneously protecting consumers’ privacy and ensuring their safety and security online. But regulators are also using these laws to address larger societal risks and issues that inevitably arise in an increasingly digital world. This convergence of privacy, trust, and safety was a major theme throughout the summit, and advertisers must recognize the significance of this overlap as they navigate today’s complex regulatory landscape.

The balance of power around user data in marketing appears to be swinging away from corporations and toward consumers, and companies will be well-served to take notice and act accordingly. Take, for example, the FTC’s recent action against Amazon: The agency’s complaint relates specifically to consumer data, but it goes beyond standard privacy protections and accuses the company of using data to manipulate people into unwittingly spending more than they intend, alleging that Amazon used “manipulative, coercive, or deceptive user-interface designs known as ‘dark patterns’” to essentially trick customers into auto-renewing their Prime subscriptions. The move indicates a new regulatory outlook where protecting consumers’ data isn’t enough, and companies must also ensure that data isn’t being used in a way that harms consumers or goes against their best interests.

This shift in power between consumers and corporations is also evident in regulators’ approach to children’s data—a particularly pressing issue, given that one in three internet users globally is a child under 18 years old. Marketing leaders must remember that kids’ data is, inherently, sensitive data, and that there are strict regulations aimed at safeguarding both this data and the kids themselves. This has been especially evident in social media, where regulators are concerned not only with the collection of children’s user data, but also the digital environment that data is subsequently used to create. Social media algorithms, in particular, have come under fire for their role in increasing mental health concerns, with some states passing legislation aimed specifically at restricting algorithms that target young users.

The FTC has also proposed changes to the Children’s Online Privacy Protection Rule (the COPPA Rule) that would shift the onus for ensuring digital spaces are safe and secure for children from parents to providers. Though these are only proposed changes at present, they can serve as useful guidelines as advertising teams consider how they’re collecting, storing, and using kids’ data to create experiences for these users online.

Intentionality and collaboration are key

Navigating ever-evolving regulation can be challenging, though there are certain strategies that can help.

First, advertising teams should regularly assess their relationships with vendors and third-party partners and review the processes they have in place to ensure they’re complying with all applicable laws and regulations. The IAB recently launched a new tool, the IAB Diligence Platform, which aims to guide these assessments by providing a set of standardized privacy diligence questions for professionals across the digital advertising industry.

It also helps to have a strong legal team that can stay abreast of new regulatory developments and craft internal guidance and best practices. When regulations inevitably change, these legal teams can assess the implications for your organization, compare them with existing protocols to develop updated guidance, and share these changes in a way that is clear, consistent, and accessible.

Additionally, marketing and agency leaders should invest in internal training and education to ensure their teams comply with any new rules or regulations. Leaders should establish clear communication channels to relay regulatory changes affecting their teams, conduct thorough briefings on updated legal guidance, and clearly outline the implications of these changes for day-to-day operations.

Advertisers should also be deliberate when selecting vendors and partners, ensuring they share similar values and priorities around data privacy and regulatory adherence. At the same time, organizations cannot simply rely on their vendors and assume they’re checking all the compliance boxes. Regulatory compliance is a shared responsibility, and teams that acknowledge and embrace that can meet today’s regulatory demands proactively and effectively.

Looking forward

In today’s complex digital ecosystem, prioritizing consumer privacy and safety isn’t simply a best practice—it’s a necessity. Balancing consumer data privacy and online safety with the delivery of personalized advertising experiences poses a unique challenge. However, by staying informed on the latest regulation and legislation, maintaining flexibility, and committing to making decisions centered around consumer needs, advertising teams can cultivate trust while still delivering tailored messaging that resonates with their target audiences.