Facebook did not understand that Mohsin was going to take his own life on Live

Ayhan Uzun of Turkey came on Facebook Live and said that his daughter was getting married without his consent, and that he could not accept it in any way. Shortly afterwards, on camera, the 54-year-old Uzun shot himself dead with a pistol.

That happened in October 2017. In March of that same year, Facebook had announced that it would use artificial intelligence to detect suicidal tendencies on the platform and act on them, so that suicides could be prevented. That is not what happened in Ayhan Uzun’s case. Uzun was live for some time and spoke of his frustration, yet Facebook neither detected the danger nor took immediate action.

At the time, a Facebook spokesperson told the New York Post, “It is virtually impossible to watch all live videos on a social network with two billion users.” Five years on, the situation does not seem to have improved much. The latest example is the suicide of Abu Mohsin Khan, a businessman from Dhaka. On Facebook Live, he spoke about the frustrations of his personal life for more than 16 minutes before shooting himself in the head.

Meanwhile, Additional Inspector General of Police Mahbubur Rahman, chief of the Criminal Investigation Department (CID), said on Wednesday afternoon that Facebook had not understood that Mohsin was going to commit suicide on Live, because at the start of the broadcast his words seemed normal to its systems. Facebook informed the CID of this.

Because Facebook did not understand this, we have seen at least two harms. First, the opportunity that might have prevented Mohsin’s suicide never materialised. Second, the spread of the video could not be stopped.

Although Facebook’s initiative to reduce suicide among its users was praised at first, experts have grown increasingly concerned. In Europe, the use of such scanning algorithms has long been blocked on privacy grounds, and fears deepened after artificial intelligence was introduced.

How the algorithm works
Before artificial intelligence came in, a post was checked by Facebook’s moderators (human staff) only if a user ‘reported’ it. With the introduction of AI, an algorithm automatically scans posts of every type from all users. If a word or image signals a risk of suicide, the post goes to the moderators.

Moderators then view selected excerpts from the posts flagged by other users or by the AI; they do not have to read or watch the whole thing. If a moderator finds the excerpt alarming, they alert the local authorities with the user’s location or try to contact people on the user’s friend list. That is the process.
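
For illustration, the flow just described might be wired together as in the minimal Python sketch below. Everything in it — the risk terms, the threshold, the function and queue names — is a hypothetical stand-in, not Facebook’s actual system.

```python
# Hypothetical sketch of the two-stage triage flow described above.
# All names, terms and thresholds are illustrative, not Facebook's code.
from dataclasses import dataclass
from queue import Queue

RISK_TERMS = {"goodbye forever", "end my life", "can't go on"}  # placeholder list

@dataclass
class Post:
    user_id: str
    text: str
    reported_by_user: bool = False  # the pre-AI path: manual reports

def risk_score(post: Post) -> float:
    """Crude stand-in for the automatic scan of every post."""
    text = post.text.lower()
    return min(1.0, sum(term in text for term in RISK_TERMS) / 2)

moderator_queue: "Queue[Post]" = Queue()

def triage(post: Post, threshold: float = 0.5) -> None:
    # A post reaches human moderators either through a user report
    # (the old path) or through the automatic scan (the AI path).
    if post.reported_by_user or risk_score(post) >= threshold:
        moderator_queue.put(post)

def moderator_decision(post: Post, alarming: bool) -> str:
    # Moderators see only a flagged excerpt; if it alarms them,
    # they escalate to local authorities or the user's friends.
    if alarming:
        return f"escalate: notify authorities / contact friends of {post.user_id}"
    return "no action"

triage(Post("u1", "I can't go on, goodbye forever"))
print(moderator_queue.qsize())  # -> 1
```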

The main reason behind Facebook’s 2017 initiative was that a number of suicides had by then been broadcast on Facebook Live. In such a video, the person explains the reasons for the act before carrying it out, and in many cases has the chance to present those reasons as justified. The videos then went viral, spreading to ever more people.

By then, researchers had worked out ways to tell from the words and emoji in social media posts whether a user was at risk of suicide. Facebook incorporated these techniques into machine learning algorithms.
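
As a toy illustration of that approach, the sketch below trains a bag-of-words classifier in which emoji are kept as tokens alongside words. The four posts and their labels are invented placeholders; a real system would learn from large labelled datasets.

```python
# Minimal sketch of a word-and-emoji text classifier.
# Training data and labels are invented placeholders.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "had a great day with friends 😊",
    "i just want it all to end 😢",
    "excited about the new job!",
    "nobody would miss me anyway",
]
labels = [0, 1, 0, 1]  # 1 = at-risk, 0 = not at-risk (placeholder labels)

# token_pattern keeps emoji as tokens alongside ordinary words
model = make_pipeline(
    CountVectorizer(token_pattern=r"[^\s]+"),
    LogisticRegression(),
)
model.fit(posts, labels)

# probability that a new post belongs to the at-risk class (toy model)
print(model.predict_proba(["i can't go on anymore 😢"])[0][1])
```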

In addition to the content of the post itself, comments from other users are taken into account. Facebook’s algorithm looks for questions in the comments such as ‘Is everything all right?’, ‘Do you need any help?’ or ‘Where are you now?’
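
Such comment signals could, in principle, be picked up with pattern matching as simple as the sketch below, where concerned questions from commenters raise a post’s priority. The patterns and scoring here are illustrative guesses, not Facebook’s actual rules.

```python
# Sketch of the comment-signal heuristic: worried check-ins from
# friends in the comments raise the post's priority. The phrases
# are the ones mentioned above; the scoring is illustrative.
import re

CONCERN_PATTERNS = [
    re.compile(p, re.IGNORECASE)
    for p in (
        r"\bis everything (al?l ?right|ok(ay)?)\b",
        r"\bdo you need (any )?help\b",
        r"\bwhere are you( now)?\b",
    )
]

def comment_concern_score(comments: list[str]) -> int:
    """Count comments that look like worried check-ins from friends."""
    return sum(
        any(p.search(c) for p in CONCERN_PATTERNS) for c in comments
    )

comments = ["Is everything alright?", "nice pic", "Where are you now??"]
print(comment_concern_score(comments))  # -> 2
```
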
Why experts are concerned

First, suicide is an extremely sensitive issue, a matter of life and death. Second, the algorithm scans nearly every Facebook post and thereby collects information about users’ mental health; in other words, a record of people’s states of mind is being stored on Facebook’s servers. Given the data leaks Facebook has been responsible for in recent years, how can anyone feel confident handing such intimate information over to the company?

In response to a question from Business Insider in 2019, a Facebook spokesperson said that even posts initially judged not to require human review were stored for up to 30 days. Asked why, the spokesperson gave no clear answer.

Moreover, most of these moderators are contract workers without adequate training to handle such sensitive material, so there is little reason to rely on them.

Let us return to what Facebook told the CID: it did not understand that Mohsin was going to commit suicide on Live. Five years after the technology was introduced, we asked Meta Platforms, Facebook’s parent company, why it still fails to detect such broadcasts automatically and act immediately.

The company did not respond directly, but a Meta spokesperson told Prothom Alo, “We attach the highest importance to the safety of the users of our apps. Content that shows suicide, or incites others to it, violates the community standards of both Facebook and Instagram. We have removed the suicide video in question and taken the necessary steps to prevent it from being uploaded again. We are constantly improving our technology to ensure that we can quickly remove content that violates our community standards. Alongside this, we regularly update our policies in consultation with mental health experts from Bangladesh and other countries. Our work is ongoing so that users can easily reach mental health care and counselling organisations in Bangladesh and around the world.”

One more point is worth mentioning here. The technology Facebook uses to automatically detect suicidal tendencies works by analysing the content of posts. “One of our goals is for our employees to be able to respond globally in any Facebook-supported language,” said Guy Rosen, the company’s vice president of integrity, at the launch of the suicide detection technology in 2017.

However, in documents submitted to the US Securities and Exchange Commission in October last year, former Facebook employee Frances Haugen noted that Facebook had failed to stop hate and violence in India because its systems cannot properly analyse Hindi and Bengali content. The Meta spokesperson did not answer whether a similar limitation lay behind the failure to detect the suicidal tendency in this case.

The video is still on Facebook
The High Court has directed the Bangladesh Telecommunication Regulatory Commission (BTRC) to take immediate steps to remove the video and still images of Abu Mohsin Khan’s suicide from social media. Meta, for its part, said, “We have removed the suicide video in question and taken the necessary steps to prevent it from being uploaded again.” Yet a copy of the video could still be found on Facebook on Wednesday evening.

Mohsin Khan’s is not the only case: a Google search turns up more reports of Bangladeshis announcing their suicides in Facebook posts. Tanjir Rashid and Sheikh Mohammad Shariful Islam published a case study on such incidents in March 2020 in Global Mental Health, a journal published by Cambridge University Press. Asked what might drive people to take their own lives on Facebook Live, psychiatrist Tanjir Rashid said, “It is difficult to say. Many may see it as an effective way to convey their feelings and grief to their relatives, and at the same time as a last plea for help. It also gives them a chance to present what they are doing as justified to a great many people. Often there is also a fantasy of glorifying one’s own death, and a desire to be talked about even after death.”

However, when suicide videos spread widely on social media, the negative effects are anything but small.
