Sep 27 2024
Megan Reschke

How New Regulations Are Reshaping Social Media Advertising


Social media has come a long way since its early days. What was once a space primarily for connecting with friends has morphed into a powerful cultural and economic force. Social media platforms have evolved into highly sophisticated ecosystems where personalized algorithms, driven by user data and AI, shape everything from content recommendations to targeted ads, curating experiences, amplifying (and sometimes stifling) voices, and even influencing public opinion.

With billions of users spending significant time on these platforms, it’s no surprise that social media has become an essential channel for advertisers. Yet as social media—and social media advertising—has grown, so too have concerns about its impact on users’ online privacy and mental health, especially among children and teens. Amidst these concerns, recent regulatory efforts in the United States have intensified, with both lawmakers and platforms taking action to safeguard younger audiences. For advertising leaders, staying ahead of these changes is key to understanding the current state of the industry, as well as how it may evolve in coming years.

No Longer the Wild West

In the early days of social media, the landscape was akin to a “wild west” of digital communication: Platforms were growing and changing rapidly, and both users and advertisers operated with significant freedom and minimal oversight. But as social media’s influence has grown, so too has the need for regulation—particularly given the proliferation of mis- and disinformation on social media sites and the technological advancements that make these platforms both highly personalized and addictive for users.

Many of regulators’ concerns are focused on children and teens, because while social media is used across generations, it is especially popular among young people: US teenagers spend an average of 4.8 hours per day on social media platforms, and US children averaged two hours per day on TikTok and over an hour per day on Instagram in 2023.

Social media’s impact on youth mental health makes the need for regulation especially urgent: Adolescents who spend more than three hours per day on social media are twice as likely to experience mental health problems like anxiety and depression, and half of 13-to-17-year-olds say social media makes them feel lonely or isolated. Earlier this year, the US Surgeon General released an advisory exploring social media’s negative effects on youth mental health and published a piece in the New York Times calling for a warning label on social media platforms, akin to the labels found on tobacco products.

Given both social media’s popularity among young users and its potential to create negative health impacts, regulatory bodies are increasingly focusing on protective measures for younger audiences. These regulations impact how data is collected and used, restrict certain types of ad targeting and content, and attempt to safeguard young social media users’ mental health and wellbeing. For advertisers, this shift means adapting to a more structured environment, but also offers an opportunity to engage with audiences more responsibly.

An Uptick in Regulations Focused on Safeguarding Children and Teenagers

The FTC’s Landmark Decision on Social Media

In July 2024, the Federal Trade Commission (FTC) took action against the social media app NGL: ask me anything, banning its parent company and founders from offering the app to anyone under the age of 18. The FTC, along with the Los Angeles District Attorney’s Office, said the app unfairly and actively targeted children and teens to encourage sign-ups, misrepresented the safeguards in place for filtering out harmful content, and used deceptive practices to lure young users into paying for subscriptions.

This represents the first time the FTC has ever banned minors from such a platform, but it isn’t the first time the agency has taken aim at a social media giant: In 2023, it proposed banning Meta from monetizing children’s and teens’ data. Though it remains to be seen whether the groundbreaking decision on NGL: ask me anything will set a precedent for similar cases, it marks a significant shift in regulatory action.

Recent Developments in Federal Legislation

In late July 2024, the US Senate took decisive action to protect children’s safety online, overwhelmingly passing the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) and the Kids Online Safety Act (KOSA). COPPA 2.0 prohibits companies from collecting personal data from anyone under the age of 17 without consent, bans targeted advertising to children and teens, and makes it easier for parents to delete their children’s personal information online. KOSA, on the other hand, focuses primarily on social media platforms, requiring that they protect minors’ information, “disable addictive product features,” and make it easier for minors to opt out of personalized algorithms.

Despite garnering bipartisan support in the Senate, this legislation has yet to pass the House. However, the strong momentum behind these bills indicates a growing consensus on the need for enhanced protections for young internet users.

State-Level Regulations on the Rise

Beyond this federal-level action, many individual states are taking social media regulation into their own hands.

In 2023, several states passed legislation requiring age verification and parental consent for minors to access social media platforms. And in September 2024, a Texas law took effect that imposes requirements on social media platforms and other digital service providers aimed at protecting minors. Under this law, social media providers must clearly disclose how they use algorithms when sharing information and content with minors, offer parental tools for supervising minors’ use of the platform, limit the collection and use of minors’ personal information, and more.

Additionally, New York recently passed two measures, the Stop Addictive Feeds Exploitation (SAFE) for Kids Act and the New York Child Data Protection Act, which significantly limit how social media companies can interact with minors. The SAFE for Kids Act prevents social media platforms from serving algorithmic (and highly addictive) feeds to young users, and the New York Child Data Protection Act restricts websites from collecting, using, sharing, or selling minors’ personal data unless informed consent is given or the data is strictly necessary for the site’s operation.

Social Media Platforms Take Action

Though most of these protective measures have come from lawmakers and regulators, some social media platforms are taking proactive steps to protect young audiences. TikTok, in the face of an impending ban in the US, recently announced that it is restricting ad targeting for users under the age of 18. Other platforms, like Facebook and Instagram, already restrict certain types of targeting for teenage users, such as targeting by gender, by locations smaller than cities, or by interests, behaviors, and demographics.

The Impact on Advertisers

This burst of regulatory action around social media isn’t happening in a vacuum: It’s part of a larger trend of regulators cracking down on the digital advertising industry. In recent years, the industry has faced heightened regulatory scrutiny, with significant developments in data privacy laws, antitrust lawsuits against tech giants like Google, and new legislation aimed at protecting consumers.

Of course, advertisers must work with their legal teams to ensure compliance with regulations that apply to them, such as COPPA 2.0. But it’s also important for advertising leaders to keep track of new regulations focused specifically on social media platforms and providers. By understanding the sentiment and goals behind these regulations, leaders gain valuable insight into how social media advertising may develop in the coming years, as well as how the industry is evolving more generally when it comes to consumer safety, choice, and privacy.

Wrapping Up

The social media advertising landscape is changing, driven by a surge in regulation aimed at enhancing the safety and privacy of young internet users. Both federal and state-level actions underscore the increasing demand that social media platforms protect the minors who use their applications. This shift is not only transforming how such platforms operate, but also reshaping social media advertising more broadly.

For advertisers, understanding the changing regulatory landscape is essential to navigating the current social media advertising environment. As the social media regulatory framework continues to develop, advertising leaders who stay informed and agile will be better equipped to maintain compliance and anticipate how the digital advertising industry will evolve in the coming years.