Instagram LLC head Adam Mosseri's Dec. 8 Senate testimony may spur lawmakers to expand existing child protection laws or adopt broader legislation against social media business models, experts say.
Mosseri will testify before the Senate Commerce Committee's Subcommittee on Consumer Protection, Product Safety and Data Security. The hearing comes after leaked documents from Facebook whistleblower Frances Haugen highlighted how Instagram-parent Meta Platforms Inc. knew the platform worsened the mental and physical health of some of its users, notably teenage girls.
Instagram head Adam Mosseri will face questions from senators on Dec. 8. Source: Meta
In the wake of Haugen's revelations, "Parents are deeply concerned about the product designs and powerful algorithms that push content to kids and create addiction-like behaviors," the hearing notice states. Senators aim to address what Instagram knows about its impacts on young users, its commitments to reform and potential legislative solutions.
One potential solution could be to reform or expand the Children's Online Privacy Protection Act, or COPPA, according to Daniel Lyons, senior fellow at the American Enterprise Institute.
"COPPA has done a pretty good job over the years in policing the use of information gathered about children online," Lyons said.
But the law only applies to children under 13 years of age. Lyons noted lawmakers could raise that age cap.
Raising the age cap would complicate Meta's efforts to attract younger users, a demographic the company has struggled to capture on its legacy Facebook platform for years.
"Facebook, in particular, has shown an interest in going after younger users because there is a perception that its brands are for older people," Lyons said.
Instagram had planned to launch an "Instagram Kids" app designed for children under the age of 13, but the company paused that work in September amid Haugen's revelations.
On Dec. 7, just a day before the hearing, the platform rolled out its Take a Break feature, which alerts users when they have been scrolling for a set amount of time.
Meta has said Haugen's leaked documents mischaracterized the company's research and argued its social platforms do more good than harm.
Notably, any changes to COPPA would not just affect Meta and Instagram but would also impact other platforms such as Snap Inc.'s Snapchat or ByteDance's TikTok.
A comprehensive legislative approach to tech policy would be the best outcome Congress could deliver following the hearing, said James Steyer, founder and CEO of Common Sense Media, a nonprofit that advocates for safety in children's media and technology.
"We've known about these issues for a decade," Steyer said in an interview, citing an accumulation of external research about social media harms to young people. "You can't just look at that in a vacuum," he added.
Lawmakers in both the House and Senate have already introduced legislation targeting social platforms' recommender algorithms to reduce the amount of harmful content shown to users. However, some experts told S&P Global Market Intelligence last month that such changes could be ineffective or may end up backfiring.
AEI's Lyons said the Federal Trade Commission may also step in to regulate social platforms under its authority to police unfair or deceptive business practices.
The FTC declined to comment on specific enforcement matters but referred Market Intelligence to its published guidance about fairness measures in the use of artificial intelligence. AI and machine learning techniques can bolster product effectiveness, but the technologies can also lead to "troubling outcomes," the guidance states.
While specific legislative approaches remain to be seen, one thing is for certain: the concerns about Instagram are bipartisan and bicameral.
"Wednesday should be Instagram's day of reckoning," Common Sense's Steyer said. "It's about time that Adam and the other senior executives at Facebook are held to account."