Instagram head Adam Mosseri received a broadside of complaints before a Senate committee on Wednesday over what lawmakers see as the company’s lack of action to protect teens using its platform.
The Meta-owned site, along with other popular platforms like TikTok and Snapchat, has been at the center of lawmakers’ hearings in recent months as Congress considers legislation in areas such as consumer privacy, algorithm transparency and Section 230 reform. The latter has to do with the broad liability protection that tech companies have had for third-party content posted on their sites.
Mosseri released advance testimony that included a call for an industry body to determine best practices “for how to verify age, design age-appropriate experiences and how to build parental controls.” But senators made clear they were unconvinced.
“I believe the time for self-policing and self-regulation is over,” said Sen. Richard Blumenthal (D-CT), who was chairing the hearing before a Senate Commerce subcommittee on consumer protection, product safety and data security.
The top Republican on the subcommittee, Sen. Marsha Blackburn (R-TN), said that she was “just a little bit frustrated” that “nothing changes. Nothing.”
“This is the fourth time in the past two years that we have spoken with someone from Meta … and I feel like the conversation repeats itself ad nauseam,” she said.
Although a slew of bills has been proposed over the past few years as Congress has vowed to rein in Big Tech, only a series of measures to bolster antitrust laws has advanced, and that was in the House Judiciary Committee. Their future prospects are uncertain.
But congressional lawmakers have in recent months zeroed in on the need for legislation to address the use of social media by kids. A whistleblower, Frances Haugen, provided documents to The Wall Street Journal showing that Facebook researchers conducted studies on the potentially harmful impact the platform had on teen girls’ body image and other issues like anorexia and mental health.
Mosseri said that he also backed a proposal in which tech platforms would not get a certain level of liability protection unless they adhered to a set of best practices.
But Blumenthal was skeptical of the idea that an industry body could take an oversight role, and instead suggested an independent one, with work by outside researchers. He pressed Mosseri on whether the company would support a legal requirement that tech platforms provide access to otherwise private data and algorithms.
“I will be happy to have my office work with you on that,” Mosseri told him.
Mosseri also would not commit to ending plans for Instagram Kids, a version of the platform aimed at 10-to-12-year-olds. The project was put on hold in September. Instagram requires users to be at least 13 years old to create an account; on the kids’ version, a parent would manage it.
In advance of the hearing, on Tuesday, Instagram unveiled a set of new features, like controls that would help parents limit their children’s use, and others to remind users to take breaks.
Blumenthal, however, suggested that the U.S. should impose measures similar to those in the UK that restrict the use of addictive design.
“Respectfully, I don’t believe the research suggests that our products are addictive,” Mosseri said.
Representatives from TikTok, YouTube and Snapchat also appeared before the Senate Commerce Committee in a hearing in October.
Facebook faced the first immediate fallout of the Wall Street Journal series in late September, when its global head of safety appeared before the committee and defended Instagram. At Wednesday’s hearing, Blumenthal said that “what really stuns me is the lack of action” since then. He said that on Monday, his office again conducted an experiment by setting up a fake teen girl’s account to see what recommendations came to the user. The results, he said, still included posts promoting eating disorders.