Instagram ‘warning screens’ for illegal content are target of Senate demands for details



Republican Sen. Ted Cruz of Texas, left, speaks with Chairman Dick Durbin, a Democrat from Illinois, as Judge Ketanji Brown Jackson speaks during the Senate Judiciary Committee confirmation hearing on her nomination to become an associate justice of the U.S. Supreme Court, on Capitol Hill, March 23, 2022.

Jim Watson | AFP | Getty Images

WASHINGTON — The Democratic chairman of the Senate Judiciary Committee and one of its most senior Republicans escalated their demands Monday for information from Meta CEO Mark Zuckerberg about Instagram’s now-shuttered “warning screens” for child sexual abuse material, according to a letter first obtained by CNBC.

Sen. Dick Durbin, of Illinois, the committee’s chairman, and Republican Sen. Ted Cruz, of Texas, gave Zuckerberg 11 days to produce a trove of records and responses related to a still-murky chapter in Instagram’s history, and instructed the Facebook founder to preserve any records related to how the Meta platforms’ algorithms handled child sexual abuse material.

The letter from Durbin and Cruz follows a contentious and emotional Judiciary Committee hearing in January about social media and child sexual abuse material. There, Zuckerberg was repeatedly pushed into the spotlight by Republicans and Democrats alike.

Cruz used part of his allotted questioning time to grill the Meta CEO about a since-removed Instagram feature known as a “warning screen,” which users had to either heed or bypass before the platform would show them search results for terms likely to produce images of child sexual abuse.

The warning screen option was removed in June of last year, but only after The Wall Street Journal reported on it and pressed the company for details about why it permitted the abusive content on the platform in the first place.

At the time, Instagram refused to tell the Journal when the warning screen option was first created, or why, or by whom.


The black screen notified viewers that forthcoming search results “may contain images of child sexual abuse” and then noted that viewing such images is a crime. Nonetheless, at the bottom of the warning widget there was another option: “See results anyway.”

A Meta spokesperson replied to CNBC’s request for comment on the senators’ letter by noting that the warning screen button isn’t shown anymore.

At the Senate hearing, Cruz pressed Zuckerberg to reveal how many times the warning screen had been displayed, as well as how many times users who saw it clicked the “See results anyway” option. Zuckerberg said he did not know the figures and promised to “personally look into” it and follow up with answers.

Now, almost two weeks later, Cruz is following up with a formal congressional request for records.

The letter Monday also asked Zuckerberg to detail whether Meta ever conducted further investigations into the users who clicked “See results anyway,” and how many minors’ profiles had been viewed behind the warning screen.

A detailed explanation of Meta’s decision to remove the warning screen was also on the list of demands, as well as all documents related to Meta’s development of the screen and the decision to display it.

The letter comes as both Republicans and Democrats have vowed to pass legislation to hold social media companies more accountable for child sexual abuse content that appears on their platforms.

While there is strong bipartisan support for several bills that would do this, a packed legislative calendar and looming presidential and congressional elections make the odds of any action on the issue this year 50/50 at best.

Meanwhile, Meta and other social media platforms are waging a ferocious lobbying battle with the app store giants Google and Apple over where and how age verification should take place online.

Platform providers such as Meta and ByteDance, which owns TikTok, want any age verification online to take place at the app store level, with parental approval required for users under 16 who want to download apps.

Apple and Google, by contrast, want the social media apps themselves to be individually responsible for verifying the ages of their users, and for obtaining parental consent for minors when appropriate.
