Franchise News

Social Media Platforms Engaged in 'Extensive Surveillance,' Failing to Protect Minors, FTC Finds


Photo: the logos of Google, Facebook, Twitter, TikTok, Snapchat, and Instagram on a computer screen in Lille, October 21, 2020. AFP via Getty Images/DENIS CHARLET

A recent report from the US Federal Trade Commission (FTC) has raised serious concerns about how social media companies handle user data, especially for children and teenagers.

The report emphasizes that these companies gather, share, and manage vast amounts of user information, yet they offer users little transparency into, or control over, how that data is used, particularly by artificial intelligence systems.

FTC Raises Concerns Over Social Media Platforms' Privacy Protections

The FTC's analysis focused on major companies including Meta Platforms (which owns Facebook and Instagram), TikTok, and Amazon's livestreaming platform Twitch. The findings revealed that many of these companies have "woefully inadequate" policies for managing and retaining user data.

The report also included information from platforms such as YouTube, X (formerly Twitter), Snap, Discord, and Reddit, though the specifics about individual companies were kept anonymous.

In response to the report, Discord stated that the FTC grouped different business models together, which may not accurately represent its platform, especially since it did not use advertising when the study was conducted.

The FTC report outlines how social media firms collect data through various methods. These include using tracking technologies in online advertising, purchasing data from brokers, and other techniques.

FTC Chair Lina Khan emphasized the particular concern regarding the failure of many companies to adequately protect young people online, the LA Times reported.

Currently, the US House of Representatives is considering legislation that has already been approved by the Senate. This proposed law aims to address the impact of social media on younger audiences.

In a proactive step, Meta has launched new accounts for teenagers that include improved parental controls.

The FTC reported that many companies also collect data about users' ages, genders, and even make educated guesses about their income and family status.

Interestingly, the report revealed that some companies collected information about people who didn't even use their services, and many failed to fully disclose how they gather and use data.

This lack of clarity has drawn criticism from various industry groups, who argue that consumers benefit from services supported by advertising. David Cohen, CEO of the Interactive Advertising Bureau, expressed disappointment in how the FTC has portrayed the digital advertising industry, according to Inc.


How Companies Respond to Ongoing Concerns

Although social media platforms generally require users to be at least 13 years old, many users can easily misrepresent their ages. As a result, platforms often gather data from younger users just as they do from adults.

In response to ongoing criticism, several social media companies have introduced features aimed at enhancing parental control. Meta recently announced plans to make accounts for users under 18 private by default.

Additionally, the company will stop sending notifications to minors during certain hours and will introduce more parental control features. Snap also announced a partnership with Common Sense Media to educate families about online dangers.

Legislators, particularly in California, are pursuing new laws to address data privacy and the safety of young users. However, they face challenges from a federal law that shields online platforms from liability for content created by users.

Google, which owns YouTube, highlighted its commitment to maintaining high privacy standards and has implemented initiatives to protect children, such as banning personalized advertisements for users under 18.

Meanwhile, X claims it has made significant strides in user safety since the FTC requested information in 2020, noting that only about 1% of its US users are between the ages of 13 and 17.


© 2024 Franchise Herald. All rights reserved.
