Instagram CEO faces senators amid anger over potential harm
Instagram CEO says protecting children from harm online ‘is not just about one company’ as he faces questions from angry senators over allegations it put profit before safety
- Adam Mosseri appeared before senators to answer questions about whether Instagram was doing enough to protect young users
- It follows whistleblower revelations that Facebook, which owns the photo sharing platform, put profits before safety
- ‘Keeping young people safe online is not just about one company,’ he said
The CEO of Facebook’s Instagram faced tough questioning from lawmakers on Wednesday over revelations that the popular photo-sharing platform can harm young users, amid demands that the company commit to making changes.
Adam Mosseri gave evidence at a Senate hearing as Facebook, whose parent company is now named Meta Platforms, has been roiled by public and political outrage over the disclosures of a former Facebook employee.
Sen. Richard Blumenthal, who chairs the Senate’s consumer protection subcommittee, opened proceedings by accusing him of creating addictive products.
He said his office had set up a fake account on Monday posing as a teenager, which followed users promoting content about eating disorders.
‘Within an hour all of our recommendations promoted pro-anorexia and eating disorder content,’ he said.
The answer, said Blumenthal, was an independent regulator.
‘Some of the big tech companies have said “Trust us.” That seems to be what Instagram is saying in your testimony,’ Blumenthal said.
‘But self-policing depends on trust. The trust is gone.’
But Mosseri hit back at the blame leveled at his company and would not back an independent body to oversee big tech.
‘The reality is that keeping young people safe online is not just about one company,’ he said.
Instead he proposed the creation of an industry body that would set out best practices, such as how to verify a user’s age.
‘The standards need to be high and the protections universal. And I believe that companies like ours should have to earn some of the Section 230 protections by adhering to those standards,’ he said, citing legal protections that some lawmakers want to abolish or overhaul.
But he faced a tough time. New protections rolled out a day earlier – including prompts suggesting users take breaks, more controls for parents, and an option to bulk-delete photos – were given short shrift.
It comes after former Facebook staffer, Frances Haugen, made the case before lawmakers in the U.S., Britain and Europe that Facebook’s systems amplify online hate and extremism and that the company elevates profits over the safety of users.
Haugen, a data scientist who had worked in Facebook’s civic integrity unit, buttressed her assertions with a massive trove of internal company documents she secretly copied and provided to federal securities regulators and Congress.
A Senate Commerce Committee panel has examined Facebook’s use of information from its own researchers that could indicate potential harm for some of its young users, especially girls, while it publicly downplayed the negative impacts.
For some of the Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts, the research detailed in the Facebook documents showed.
The revelations in a report by The Wall Street Journal, based on the documents leaked by Haugen, set off a wave of recriminations from lawmakers, critics of Big Tech, child-development experts and parents.
At a subcommittee hearing in September, senators of both parties were united in condemnation of the social network giant and Instagram, the photo-sharing juggernaut valued at some $100 billion that Facebook acquired for $1 billion in 2012.
The lawmakers accused Facebook of concealing the negative findings on Instagram. The panel grilled Antigone Davis, Facebook’s head of global safety, who defended Instagram’s efforts to protect young people using its platform. She disputed the way the Wall Street Journal outlined the research.
Sen. Richard Blumenthal, D-Conn., the subcommittee’s chairman, had called for Meta CEO Mark Zuckerberg to appear before the panel to testify on the Instagram situation. For now, it will be Mosseri fielding those questions.
‘After bombshell reports about Instagram’s toxic impacts, we want to hear straight from the company’s leadership why it uses powerful algorithms that push poisonous content to children driving them down rabbit holes to dark places, and what it will do to make its platform safer,’ Blumenthal said in a prepared statement Wednesday.
Facebook’s public response in September to the outcry over Instagram was to put on hold its work on a kids’ version of the platform, which the company says is meant mainly for children aged 10 to 12.
On Tuesday, Instagram introduced a previously announced feature that urges teenagers to take breaks from the platform. The company also announced other tools that it says are aimed at protecting young users from harmful content.
Already in July, Facebook said it was working with parents, experts and policymakers when it introduced safety measures for teens on its main Instagram platform. In fact, the company has been working with experts and other advisers for another product aimed at children – its Messenger Kids app that launched in late 2017.
Beyond changes by the company, senators are pressing Mosseri to support legislative remedies for social media.
Among the legislative proposals put forward by Blumenthal and others, one bill proposes an ‘eraser button’ that would let parents instantly delete all personal information collected from their children or teens.
Another proposal bans specific features for kids under 16, such as video auto-play, push alerts, ‘like’ buttons and follower counts.
Also being floated is a prohibition against collecting personal data from anyone aged 13 to 15 without their consent. And a new digital ‘bill of rights’ for minors that would similarly limit gathering of personal data from teens.