Instagram executives have said they are “heartbroken” over the reported suicide of a girl in Malaysia who had posted a poll to its app.
The 16-year-old is thought to have killed herself hours after asking other users whether she should die.
But the technology company’s leaders said it was too soon to say whether they would take any action against account holders who took part in the vote.
The Instagram chiefs were questioned about the matter in Westminster.
They were appearing as part of an inquiry by the UK Parliament’s Digital, Culture, Media and Sport Committee into immersive and addictive technologies.
Reports indicate the unnamed teenager killed herself on Monday, in the eastern state of Sarawak.
The local police have said that she had run a poll on the photo-centric platform asking: “Really important, help me choose D/L.” The letters D and L are said to have represented “die” and “live” respectively.
This took advantage of a feature introduced in 2017 that allows users to pose a question via a “sticker” placed over one of their photos, with viewers invited to tap on one of two possible responses. The app then tallies the votes.
At one point, more than two-thirds of respondents had been in favour of the 16-year-old dying, said district police chief Aidil Bolhassan.
“The news is certainly very shocking and deeply saddening,” Vishal Shah, head of product at Instagram, told MPs.
“There are cases… where our responsibility around keeping our community safe and supportive is tested and we are constantly looking at our policies.
“We are deeply looking at whether the products, on balance, are matching the expectations that we designed them with.
“And if, in cases like the polling sticker, we are finding numerous evidence where it is not matching the expectations… we are looking to see whether we need to make some of those policy changes.”
His colleague Karina Newton, Instagram’s head of public policy, told the MPs the poll would have violated the platform’s guidelines.
The platform has measures in place to detect “self-harm thoughts” and seeks to remove certain posts while offering support where appropriate.
For example, if a user searches for the word “suicide”, a pop-up appears offering to put them in touch with organisations that can help.
But Mr Shah said that the way children expressed mental-health issues was constantly evolving, posing a challenge.
Damian Green, who chairs the committee, asked the two if the Facebook-owned service could adapt some of the tools it had developed to target advertising to proactively identify people at risk of self-harm and reach out to them.
“Would it not be feasible, where there are cases of people known to have engaged with harmful content and [who] may have been at risk, that analysis could be done to see what other users share similar characteristics?” the MP asked.
Ms Newton replied that there were privacy issues to consider but that the company was seeking to do more to address the problem.
Mr Green also asked if Instagram might consider suspending or revoking the accounts of those who had encouraged the girl to take her life.
But the executives declined to speculate on what steps would be taken.
“I hope you can understand that it is just too soon. Our team is looking into what the content violations are,” said Ms Newton.
Under Malaysian law, anyone found guilty of encouraging or helping the suicide of a minor can be sentenced to death or up to 20 years in jail.
It follows the earlier case of Molly Russell, a 14-year-old British girl who killed herself, in 2017, after viewing distressing material about depression and suicide that had been posted to Instagram.
The social network vowed to remove all graphic images of self-harm from its platform after her father accused the app of having “helped kill” his child.
If you’ve been affected by self-harm, eating disorders or emotional distress, help and support is available via the BBC Action Line.