States Could Let Parents Sue Big Tech for Addicting Kids. Here’s What That Really Means. – TIME

Posted: March 27, 2022 at 10:12 pm

Lawmakers across the country are increasingly looking for ways to crack down on the likes of Instagram, TikTok and YouTube, with claims that the platforms employ addictive social media algorithms and exploit children. Last week, legislators in California and Minnesota made strides on proposed legislation that would hold companies accountable for the toll their platforms take on young people's mental health. The bills coincide with calls in Washington to implement meaningful oversight of Big Tech to help keep kids safe.

The California bill would let parents sue companies that don't take steps to avoid addicting children. The proposal is the first of its kind in the U.S. and the most aggressive state-level effort to rein in Big Tech over its use of algorithmic tools that draw on children's personal data to generate recommendations and other techniques intended to increase their engagement. It would hold social platforms legally accountable for features that are designed to be addictive to children, such as like buttons and endless scroll. Violators could face civil penalties of up to $25,000 per child, or damages that could include $1,000 or more per child in a class-action suit, according to the University of San Diego School of Law Children's Advocacy Institute, a co-sponsor of the bill.

Still, if passed, this type of liability law likely wouldn't be very successful at reining in Big Tech, says Abbey Stemler, an associate professor of business law and ethics at Indiana University who specializes in internet law, regulatory theory, and Big Tech data. "This law isn't really saying anything," she tells TIME. "It's too vague to actually be actionable."

Dubbed the Social Media Platform Duty to Children Act, the proposal was advanced in the California Assembly on March 15 by a bipartisan pair of lawmakers, Republican Jordan Cunningham of Paso Robles and Democrat Buffy Wicks of Oakland, with support from the Children's Advocacy Institute. Cunningham told the Los Angeles Times that some of these companies intentionally design their apps to keep children coming back for more. "Who should pay the social cost of this?" he asks. "Should it be borne by the schools and the parents and the kids, or should it be borne in part by the companies that profited from creating these products?"

California's bill came on the same day that another state, Minnesota, made strides on its own measure aimed at protecting young people from social media. A state committee voted to advance a proposed law that would prohibit social media platforms from using algorithms to recommend content to anyone under the age of 18. The state's House Judiciary Finance and Civil Law Committee will now vote on the measure on March 22. If passed, companies would be liable for damages and a civil penalty of $1,000 for each violation of the law. "The bill would require anyone operating a social media platform with more than one million users to require that algorithm functions be turned off for accounts owned by anyone under the age of 18," the bill summary reads.

While these types of proposals are intended to force social platforms to bear some responsibility for the damage inflicted by their algorithms, Stemler says a more effective strategy would be to enact measures that address companies' ability to access the data that fuels those algorithms in the first place.

"The reason why algorithms work is because they suck in as much data as possible about what these young people are doing," she says. "And once they have that data, they can use it. So instead of saying, 'Hey, don't create addictive systems,' we really should be focused on [preventing platforms from] learning that data. The easiest way to do that is just to limit access to the data itself."

Another bill put forth by Cunningham and Wicks in February, the California Age-Appropriate Design Code Act, takes a similar angle. The proposal would restrict social platforms' collection of children's personal and location data.

Congress has also moved forward with federal legislation designed to help reduce the dangers that kids face online. In February, Senators Richard Blumenthal and Marsha Blackburn introduced the Kids Online Safety Act, a bipartisan measure that provides kids and their parents with options to protect their information, disable addictive product features, and opt out of algorithmic recommendations (platforms would be required to enable the strongest settings by default).

The push for these regulatory efforts is driven by continued fallout over company documents leaked by Facebook whistleblower Frances Haugen. Those documents showed that Meta, the parent company of Facebook and Instagram, downplayed its own research on the harmful effects of its platforms on young people: issues that included eating disorders, depression, suicidal thoughts, and more. This led to a series of Congressional hearings and growing calls for social media's biggest players to face accountability for how they keep young users scrolling through content for as long as possible.

Features that encourage endless scrolling are among the most harmful to young people, according to the company's own research. "Aspects of Instagram exacerbate each other to create a perfect storm," one report leaked by Haugen read.

The algorithmic recommendation systems used by popular video platforms like TikTok and YouTube have also drawn criticism. The New York Times reported in December that the inner workings of TikTok's algorithm were leaked by a source who was disturbed by the app's push toward sad content that could induce self-harm.

As state and federal efforts grow, Stemler says it's crucial that lawmakers get it right, and fast.

"My concern for this generation's mental health is serious," she says. "There are deep problems coming from the pandemic and isolation; tech has become the way that young people interact with the world."

Write to Megan McCluskey at megan.mccluskey@time.com.
