Major websites routinely engage in "mass surveillance" of their users, especially children and teens, to squeeze profits out of their personal information, a new report from the Federal Trade Commission indicates.
“The report lays out how social media and video streaming companies harvest an enormous amount of Americans’ personal data and monetize it to the tune of billions of dollars a year,” said FTC Chair Lina M. Khan. “While lucrative for the companies, these surveillance practices can endanger people’s privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking.”
The report found that the companies collected, and could retain indefinitely, troves of data about both users and non-users of their platforms, including information obtained from data brokers.
"Woefully inadequate"
The report said the companies' data collection, minimization and retention practices were “woefully inadequate.” In addition, it found that some companies did not delete all user data when users asked them to do so.
The report also found that many companies engaged in broad data sharing that raises serious concerns regarding the adequacy of the companies’ data handling controls and oversight.
The companies' business models encouraged mass collection of user data to monetize, especially through targeted advertising, which accounts for most of their revenue; these profit incentives often posed risks to user privacy. Some companies deployed privacy-invasive tracking technologies, such as pixels, to facilitate advertising to users based on preferences and interests, the FTC report found.
The report found that users and non-users had little or no way to opt out of how their data was used by the sites' automated systems, and that there were "differing, inconsistent, and inadequate" approaches to monitoring and testing the use of automated systems.
Children unprotected
The staff report concluded that the social media and video streaming services didn’t adequately protect children and teens on their sites. The report cited research that found social media and digital technology contributed to negative mental health impacts on young users.
Based on the data collected, the staff report said many companies asserted that there were no children on their platforms because their services were not directed to children or did not allow children to create accounts.
The report said that this was an apparent attempt to avoid liability under the Children’s Online Privacy Protection Act Rule, and said that the social media and video streaming services often treated teens the same as adult users, with most companies allowing teens on their platforms with no account restrictions.
Sites studied
The staff report is based on responses to what are called 6(b) orders issued in December 2020 to nine companies, including some of the largest social media and video streaming services:
- Amazon.com, Inc., which owns the gaming platform Twitch;
- Facebook, Inc. (now Meta Platforms, Inc.);
- YouTube LLC;
- Twitter, Inc. (now X Corp.);
- Snap Inc.;
- ByteDance Ltd., which owns the video-sharing platform TikTok;
- Discord Inc.;
- Reddit, Inc.; and
- WhatsApp Inc.
The orders asked for information about how the companies collect, track and use personal and demographic information, how they determine which ads and other content are shown to consumers, whether and how they apply algorithms or data analytics to personal and demographic information, and how their practices impact children and teens.
Congress urged to act
The staff report makes recommendations to policymakers and companies based on staff’s observations, findings, and analysis, including:
- Congress should pass comprehensive federal privacy legislation to limit surveillance, address baseline protections, and grant consumers data rights;
- Companies should limit data collection, implement concrete and enforceable data minimization and retention policies, limit data sharing with third parties and affiliates, delete consumer data when it is no longer needed, and adopt consumer-friendly privacy policies that are clear, simple, and easily understood;
- Companies should not collect sensitive information through privacy-invasive ad tracking technologies;
- Companies should carefully examine their policies and practices regarding ad targeting based on sensitive categories;
- Companies should address the lack of user control over how their data is used by automated systems as well as the lack of transparency regarding how such systems are used, and should implement more stringent testing and monitoring standards for such systems;
- Companies should not ignore the reality that there are child users on their platforms, should treat COPPA as representing the minimum requirements, and should provide additional safety measures for children;
- Companies should recognize that teens are not adults and provide them greater privacy protections; and
- Congress should pass federal privacy legislation to fill the gap in privacy protections provided by COPPA for teens over the age of 13.
The Commission voted 5-0 to issue the staff report.