Social media platforms engaged in ‘vast surveillance’ and failed to protect young people, FTC finds

Social media apps and video streaming services are still facing scrutiny from U.S. regulators over youth safety and privacy concerns. (Photo illustration by Jonathan Raa/NurPhoto via Getty Images)

Major social media platforms and video streaming services that gather a “staggering” amount of data from their users have failed to protect young people and safeguard online privacy, the Federal Trade Commission said Thursday.

“These surveillance practices can endanger people’s privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking,” FTC Chair Lina Khan said in a statement. “Several firms’ failure to adequately protect kids and teens online is especially troubling.”

The agency, which is focused on protecting consumers and enforcing antitrust law, released a 129-page report analyzing how some of the world’s largest social media platforms, including Instagram, TikTok and YouTube, collect and use the vast troves of data they gather from users. The findings highlight the mounting scrutiny online platforms face from regulators and lawmakers seeking to combat technology’s potential harms as it becomes more deeply intertwined with people’s daily lives.

Politicians and consumer advocates have long been critical of how companies such as Facebook compile information on users and use it to target ads at people based on their interests, location, gender and other characteristics. There also has been alarm over teens grappling with the potential downsides of social media, from the sale of illegal drugs to the pressure of comparing themselves with their peers.

The report stems from information the FTC ordered the largest social media and video streaming platforms to turn over in 2020. Those companies include Snap; Facebook, now Meta; Google-owned YouTube; Twitter, now X; ByteDance, which owns TikTok; Discord; Reddit; and Meta-owned WhatsApp.

The responses showed how companies collected information on unwitting consumers, including their household income, their activity elsewhere on the internet, their location and more. Tech platforms gather this information through ad-tracking technology, from data brokers and from users who engage with posts online, giving companies a glimpse into their interests. Some companies failed to delete data on people who had requested its removal, the report said.

Even though most social media platforms require users to be at least 13 to create accounts, people can easily lie about their age, and the platforms collect data from teens in the same way they do from adults, according to the report.

In an attempt to fend off the ongoing criticism, social media companies have rolled out features aimed at giving parents more control over their children’s online experience. This week, Meta said it would make accounts for teens younger than 18 private by default, stop sending notifications to minors during certain times and provide more parental controls. Snap, which held its annual conference Tuesday, said it was partnering with Common Sense Media to develop a program so families know more about potential online harms.

Lawmakers, including in California, have been trying to address data privacy and youth safety concerns by passing new laws. But they’ve also faced legal hurdles because of Section 230 of the Communications Decency Act, the federal law that shields online platforms from being held legally liable for user-generated content.

Meta and Snap declined to comment on the report. Meta is a member of the Interactive Advertising Bureau, which said in a blog post that the group was “disappointed” by the FTC’s characterization of the digital ad industry as one that engages in mass surveillance.

Discord, which lets users communicate through text, video and voice calls, tried to distinguish itself from other social media platforms, noting that it doesn’t encourage people to scroll endlessly through a running feed of comments.

“The FTC report’s intent and focus on consumers is an important step. However, the report lumps very different models into one bucket and paints a broad brush, which might confuse consumers and portray some platforms, like Discord, inaccurately,” Kate Sheerin, the head of U.S. and Canada public policy at Discord, said in an email.

The platform, which is popular among people who play games, had shied away from ads but started to run them this year.

Google, which owns YouTube, said in a statement that it had the “strictest privacy policies” and outlined several measures it takes to protect children, including not allowing personalized ads for users younger than 18.

In a statement, a spokesperson for X said the company has made “tremendous strides in protecting users’ safety” since the FTC requested the information in 2020. The company said only about 1% of X’s U.S. users are between the ages of 13 and 17.

“X takes user data privacy seriously and ensures users are aware of the data they are sharing with the platform and how it is being used, while providing them with the option of limiting the data that is collected from their accounts,” the statement said.

Other companies listed in the report didn’t immediately respond to a request for comment.

The FTC noted that its findings have limitations because technology and a company’s practices can change. The companies’ responses to the FTC reflected their practices from 2019 to 2020, according to the report.

The agency included recommendations for companies and urged Congress to enact a law that would protect user privacy and grant consumers rights over their data. Companies should take steps to minimize potential risks, the FTC said, such as collecting only the data they need, being more transparent about their practices and offering stronger default protections for teens and young people.

“As policymakers consider different approaches to protecting the public, focusing on the root causes of many harms — and not just the symptoms — is key,” the report said.
