Delhi panel seeks list of content flagged on FB around Delhi riots

New Delhi: The Delhi Assembly's Peace and Harmony Committee on Thursday examined Shivnath Thukral, the public policy director of Facebook India (Meta Platforms), continuing its hearings on the role of social media in possibly fuelling last year's deadly Delhi riots. The committee sought the user reports (complaints) filed against posts on Facebook, and the action taken on them, in the month before the riots and the two months after they concluded.
Hearing Thukral's representation in the Assembly, committee chairman Raghav Chadha asked for these records and also sought details of the specific people and teams responsible for moderating content during the period of the riots.
Thukral told the Committee that Facebook had appointed an India-based Grievance Officer only recently, as required under the IT Rules, but said the company had one during the riots as well, even though the law did not mandate it at the time. When the committee sought details of this grievance officer, however, Thukral invoked the Supreme Court order in the Ajit Mohan case to say he had the right to decline to answer, as the question pertained to the Delhi riots, a law and order issue.
This was not the only time Facebook's Thukral exercised this right. During the two-hour hearing, every time the committee asked the social media giant about the specific teams, procedures, and people in charge of the platform's content regulation in the period around the riots, the public policy director cited the Supreme Court order, saying the matter concerned law and order as well as an ongoing investigation.
Thukral was aided in the hearing by Saanjh Purohit, Associate General Counsel, Facebook, who read out the relevant portion of the order permitting the company not to respond on matters of law and order.
But Chadha quoted the same order to point out that the Supreme Court had gone on to say that the company could not use this right to "stonewall and frustrate" the committee.
The Supreme Court, in 'Ajit Mohan & Ors. v. Legislative Assembly, NCT of Delhi', had upheld the committee's power to summon both members and non-members, including representatives of Facebook, to its hearings.
The committee also quizzed the Facebook official about the company's organisational structure, complaints redressal mechanism, community standards and definitions of hate speech. It asked whether the company had ensured sufficient representation of religious and other minorities in India, and sought to know how diverse its content moderation and public policy teams in the country are.
At one point in the hearing, the committee also delved into how Facebook's community standards are framed and whether they are based on any citable Indian law.
The committee further asked whether Facebook ever takes responsibility for reporting content that shows a crime or violates the law to enforcement authorities. Thukral replied that Facebook is not a law enforcement agency but has a mechanism to cooperate with such agencies whenever required, adding, "When things happen in the real world, they reflect on our platform as well. We do not want hate on our platform. There are some bad actors that need to be worked on."
At this point, Chadha interjected: "I am not sure whether hate hurts you because you are a business and virality of hate posts bring you revenue."
In his opening statement, Thukral said, "I want to address the recent global debate that is going on around our company. We believe that we should be scrutinised in good faith; criticism helps us get better. But we cannot be subjected to unfair trials based on selectively leaked documents which paint a false picture of our company."
Thukral said Facebook has 40,000 people working on content management, including 15,000 dealing with content moderation. Any content found to violate the platform's community standards is taken down immediately, he added.
Thukral said the platform has 10 fact-checking partners in India, covering content in 11 of 20 languages. Besides these, automated tools and independent social organisations help identify problematic content and take action against it.
In a statement issued later, the committee said the official representing Facebook India acknowledged that the company's tools "are not always perfect" and that there is scope for improvement.
The committee will deliberate further and decide whether to recall Facebook officials for another round of proceedings and examination, Chadha said.