What FB got right and what it didn't in the 2019 elections – Hindustan Times

Posted: November 28, 2021 at 10:01 pm

Coming off a bruising scandal in the US involving political consultant Cambridge Analytica using its data to manipulate voter behaviour, Facebook pulled out all the stops to get it right in India's 2019 general elections. It managed to sail through the biggest elections in the world, involving 900 million voters, without a scratch. But information that has recently come to light shows that despite its eagerness to remain blameless, its efforts were sometimes lacking.

As the polls scheduled to begin in April 2019 drew close, Facebook (now Meta Platforms) added resources to monitor and manage information flow through its platform, putting together 40 cross-functional teams with 300 members based in Delhi, Singapore, Dublin, and at its headquarters in Menlo Park, California. It wanted to avoid another scandal at any cost. Although India was the big one, the teams were also looking at elections in Indonesia and to the European Parliament.

Over two years beginning January 2017, Facebook closely studied India and drew up a list of priorities for its Civic Integrity, Business Integrity, Misinformation, and Community Integrity teams. The efforts were not in vain. The company, according to internal documents reviewed by The Intersection and Hindustan Times, was thrilled that it stayed out of the headlines and even managed some good press. In a post-election internal review, one Facebook official wrote, "In spite of this being coined a 'WhatsApp election', the team's proactive efforts over the course of a year paid off, leading to a surprisingly quiet, uneventful election period."

In reality, former Facebook officials told The Intersection and HT, Facebook's priority was to avoid flak should anything go wrong in the elections. Also not known until now: Facebook's carefully erected systems could not capture many violations, including those later revealed by the Wall Street Journal and The Economic Times.

Nevertheless, Facebook did take down large volumes of bad content around election misinformation, and acted against attempts at voter suppression, internal documents show.

These excerpts are from disclosures made to the Securities and Exchange Commission (SEC) and provided to the US Congress in redacted form by whistleblower Frances Haugen's counsel. The redacted versions received by Congress were reviewed by a consortium of news organisations, including The Intersection. The Intersection is publishing these stories in partnership with HT. This is the second in a series of stories.

What Facebook enforced

With the first day of polling 10 days out, Facebook went public about what it called coordinated inauthentic behaviour (CIB) and civic spam on the platform. It shut down accounts and took down pages and groups run by the Pakistani spy agency Inter-Services Intelligence (ISI) targeting the Indian electorate. It shut down 687 pages and accounts that engaged in CIB and were allegedly linked to individuals associated with an IT cell of the Indian National Congress, and removed 15 pages, groups and accounts that, it said, were linked to a technology firm, Silver Touch, which managed several pages supporting the ruling Bharatiya Janata Party.

"Initial press coverage drew parallels between the INC and Pakistan, though later reports were more balanced," the Facebook official wrote, assessing the impact of Facebook releasing the takedown data.

The platform viewed the CIB takedown as proactively shielding election integrity. A former Facebook official, speaking on condition of anonymity, said it had an element of playing to the gallery: there was an expectation that Facebook would do something about elections in general, and by going public with the CIB, the company was showing that it was transparent.

It prepared for a second CIB takedown in the midst of the elections. "As we prepared for a second round of CIB in the midst of the elections, the focus was on protocols and what constituted action under CIB. Also the question over whether there was a need to distinguish between foreign and domestic interference in these cases," the Facebook official wrote in the memo titled "India Elections: Case Study (Part 2)".

At the time, the company also paused civic spam takedowns globally because it could not clearly define violations of civic spam rules. Civic spam, in Facebook-speak, is the use of fake accounts, multiple accounts with the same names, impersonation, the posting of malware links, and content deluges that drive traffic to affiliated websites to make money.
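For illustration, here is a minimal, rule-of-thumb sketch of what screening for those signals might look like. It is hypothetical: the account fields, the posting threshold, and the flagging logic are assumptions made for this story, not Facebook's actual schema or rules.

```python
from dataclasses import dataclass

# Hypothetical signals drawn from the civic-spam definition above;
# the field names and thresholds are illustrative only.
@dataclass
class Account:
    display_name: str
    failed_identity_check: bool   # proxy for "fake account"
    impersonates_someone: bool
    posts_per_day: float
    malware_links_posted: int

def looks_like_civic_spam(account: Account, all_display_names: list) -> bool:
    """Rule-of-thumb flags mirroring the definition: fake accounts,
    duplicate names, impersonation, malware links, content deluge."""
    duplicate_name = all_display_names.count(account.display_name) > 1
    content_deluge = account.posts_per_day > 100  # arbitrary threshold
    return (
        account.failed_identity_check
        or duplicate_name
        or account.impersonates_someone
        or account.malware_links_posted > 0
        or content_deluge
    )

# Usage: two accounts sharing one display name would both be flagged.
names = ["Jan Seva News", "Jan Seva News", "Daily Updates"]
acct = Account("Jan Seva News", False, False, 12.0, 0)
print(looks_like_civic_spam(acct, names))  # True, via the duplicate name
```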

The second CIB takedown was never publicly disclosed or reported, lending more credence to the former Facebook official's observation that the first was a show for the public. CIB round two was exclusively domestic, financially motivated (FMO) and politically motivated (PMO), and enforcement was blocked for India. This meant no enforcement on any domestic-only (no foreign nexus) CIB case. The block was lifted a few weeks later.

Facebook proactively took down over 65,000 pieces of content aimed at voter suppression from the start of polling. As polls progressed, the company took down posts claiming that the indelible ink used to mark fingers was made out of pig blood and that Muslims should therefore skip voting to avoid its use. It also took down posts that listed incorrect polling dates, times and locations, according to the Facebook official's memo.

A Meta spokesperson, in response to The Intersection and Hindustan Times' questionnaire, said, "Voter suppression policy prohibits election-related and voter fraud, things that are objectively verifiable, like misrepresentation of dates and methods for voting (e.g., text to vote). The content that requires additional review to determine if it violates our policy may be sent to our third-party fact-checkers for verification."

"A constant theme throughout the election was misinformation regarding the failure of electronic voting machines (EVM)," the official wrote in the memo. "While there were legitimate EVM failures that required re-polling in a few constituencies, there was also misinformation in the form of out-of-context videos claiming vote rigging... In total, Market Ops removed over 10,000 pieces of EVM malfunctioning misinformation."

The mess that was verification

To strengthen the verification process, Facebook originally put in place a mechanism to mark political advertisers. This would typically include a mandatory disclosure for advertisers with a "paid for" or "published by" label. In February 2019, it also announced an offline verification process, with boots on the ground and a one-time password (OTP) sent to the advertiser's postal address. Facebook was to hire a third-party vendor for this. "These were clearly not scalable solutions, even if the intent was right," said a Facebook official aware of the matter.

Facebook later relied on phone-based verification, a person familiar with the matter said. But this reduced oversight. Some advertisers would get verified using burner phones, and there would be no follow-up verification despite it being part of the company's transparency plans. Internally, questions were raised about how frequently to re-check advertisers to catch these hacks; once verified, the phones would simply go unanswered.
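What periodic follow-up verification might have looked like can be sketched in a few lines. This is hypothetical code, not Facebook's system: the advertiser record, the 30-day cadence, and the OTP stub are all assumptions for illustration.

```python
from datetime import datetime, timedelta
from typing import Optional

REVERIFY_INTERVAL = timedelta(days=30)  # assumed cadence, for illustration

class Advertiser:
    def __init__(self, name: str, phone: str):
        self.name = name
        self.phone = phone
        self.verified_at: Optional[datetime] = None
        self.can_run_ads = False

def otp_confirmed(phone: str) -> bool:
    """Placeholder for an SMS one-time-password round trip. A real
    system would send a code and wait for it to be entered; returning
    False here simulates a burner phone going unanswered."""
    return False

def reverify(advertiser: Advertiser, now: datetime) -> None:
    """Suspend ads for any advertiser whose verification has gone stale
    and who no longer answers on the registered phone."""
    stale = (
        advertiser.verified_at is None
        or now - advertiser.verified_at > REVERIFY_INTERVAL
    )
    if not stale:
        return
    if otp_confirmed(advertiser.phone):
        advertiser.verified_at = now
        advertiser.can_run_ads = True
    else:
        advertiser.can_run_ads = False  # burner-phone accounts drop out here

# Usage: an advertiser verified once in April is re-checked in June;
# the burner phone goes unanswered, so their ads are suspended.
adv = Advertiser("Example Media Pvt Ltd", "+91-XXXXXXXXXX")
adv.verified_at = datetime(2019, 4, 1)
adv.can_run_ads = True
reverify(adv, datetime(2019, 6, 1))
print(adv.can_run_ads)  # False: verification went stale and was not renewed
```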

Multiple former Facebook officials confirmed that the verification process was a mess, while also highlighting Facebook's struggles in executing things well globally. One of them said, "People wanted ad transparency, but Facebook couldn't get it out in time for the election and have all the things worked out."

The BJP benefited from this loophole, according to a Wall Street Journal report from August 2020. Facebook declined to act after discovering that the BJP was circumventing its political ad transparency requirements, the report said, quoting sources. In addition to buying Facebook ads in its own name, the BJP was also found to have spent hundreds of thousands of dollars through newly created organisations that didn't disclose the party's role. Facebook took down neither the ads nor the pages.

One of the officials The Intersection and HT spoke to said the company has since taken some steps, including mandatory verification using government-issued identification documents. "The biggest problem in India is that there are no standardised address formats," the official said. According to another former official, the Election Commission of India should ideally maintain a digitised database of who is allowed to run political ads, which a platform like Facebook could use to verify people; anyone not in the database couldn't run the ads.

The Meta spokesperson added, "In India, based on learnings from the US and other countries, we tightened the disclaimer options available to advertisers and require additional credentials to increase their accountability. E.g. in case of an escalation, if we discover that the phone, email or website are no longer active or valid, we will inform the advertiser to update them. If they do not, they will no longer be able to use that disclaimer to run ads..."

To disable or not to disable: That is the question

To prevent India from creating fresh legal obligations for social media companies, Facebook led the conversation around the need for a voluntary code of ethics during the silent period, the 48 hours before the polling date when canvassing is prohibited. Honouring that code would have meant disabling all ads for two days in every phase.

Instead, it shifted the onus of reporting ads that violated the code to the Election Commission of India (ECI) and did not proactively disable ads, as it had in the US. It took down only those ads flagged to it by the ECI; others slipped through and remained live on the platform.

"It on-boarded ECI on to the Government Casework channel for escalating content which violated election laws," the Facebook official noted in the memo. This channel, people familiar with the matter said, was primarily for flagging illegal content, although it did include some advertising. A Huffington Post investigation in May 2019 revealed that a total of 2,235 advertisements worth approximately ₹1.59 crore ran in violation of the silent period in the first four phases.

Product and other teams (presumably in charge of revenues) at Facebook clashed over whether to block ads during the silent period. Facebook erred on the side of free speech, contending that ads were another way for people to express opinion. Parties, too, wanted them running, and Facebook believed keeping them live was only fair to smaller parties. Internally, the firm considers political ads "high risk, low reward" because they bring in little money compared with other types of ads people run on its platforms.

Blocking would have required carving out the right geographical regions as per polling dates, which were spread over a month, and building digital fences around them to dynamically change the visibility of the ads. "Facebook hates being told how to build products," said one of the former company officials The Intersection and Hindustan Times spoke to.

Nayantara Ranganathan, an independent researcher and co-founder of Persuasion Lab, a project interrogating new forms of propaganda, told The Intersection and Hindustan Times, "In choosing to serve an advertisement between two potential viewers, Facebook optimises for goals of the advertiser, engagement of users and growth of the platform. It is not such a stretch to expect Facebook to optimise for compliance with laws." She added, "Ultimately, ads delivery is something that Facebook algorithms control, and it is very much possible to exclude by geolocation and dates."
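Her point is mechanically simple. Below is a minimal, hypothetical sketch of how ads could be excluded by geolocation and date during phased polls; the phase schedule, constituency names and ad fields are assumptions made for illustration, not Facebook's systems.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical phase schedule mapping each polling date to the
# constituencies voting that day (two phases shown; 2019 had seven).
PHASE_SCHEDULE = {
    datetime(2019, 4, 11): {"Baramulla", "Ghaziabad"},
    datetime(2019, 4, 18): {"Srinagar", "Mathura"},
}

SILENT_PERIOD = timedelta(hours=48)  # canvassing ban before the polling date

@dataclass
class Ad:
    ad_id: str
    is_political: bool
    target_constituencies: set

def silenced_constituencies(now: datetime) -> set:
    """Constituencies currently inside their 48-hour silent period."""
    silenced = set()
    for polling_date, constituencies in PHASE_SCHEDULE.items():
        if polling_date - SILENT_PERIOD <= now <= polling_date:
            silenced |= constituencies
    return silenced

def deliverable_regions(ad: Ad, now: datetime) -> set:
    """Where the ad may still be served at this moment."""
    if not ad.is_political:
        return ad.target_constituencies
    return ad.target_constituencies - silenced_constituencies(now)

# Usage: a political ad targeting a phase-one seat and a later-phase seat,
# checked the day before the April 11 polls.
ad = Ad("ad-001", True, {"Baramulla", "Srinagar"})
print(deliverable_regions(ad, datetime(2019, 4, 10, 12, 0)))
# -> {'Srinagar'}: Baramulla is silenced; Srinagar's phase is a week away.
```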

Venkat Ananth is a co-founder at The Intersection, published by The Signal (http://www.thesignal.co).
