
Facebook and the Normalization of Deviance

When the sociologist Diane Vaughan coined the term “the normalization of deviance,” she was referring to NASA administrators’ disregard of the flaw that caused the Challenger space shuttle to explode, in 1986. The idea was that people in an organization can become so accepting of a problem that they no longer consider it to be problematic. (In the case of the Challenger, NASA had been warned that the shuttle’s O-rings were likely to fail in cold temperatures.) Consider Facebook: for years, its leadership has known that the social network has abetted political polarization, social unrest, and even ethnic cleansing. More recently, it has been aware that its algorithms have promoted misinformation and disinformation campaigns about COVID-19 and vaccines. Over the past year, the company has made piecemeal attempts to remove false information about the pandemic, issuing its most comprehensive ban in February. An analysis last month by the nonprofit group First Draft, however, found that at least thirty-two hundred posts making unfounded claims about COVID-19 vaccines had appeared after the February ban. Two weeks ago, the top post on Facebook about the vaccines featured Tucker Carlson, of Fox News, “explaining” that they don’t work.

Over the years, Mark Zuckerberg, Facebook’s C.E.O., has issued a cascade of apologies for the company’s privacy breaches, algorithmic biases, and promotion of hate speech, among other issues. Too often, the company seems to change course only after such issues become public; in many cases, it had been made aware of those failures long before, by Facebook employees, injured parties, or objective evidence. It took months for the firm to acknowledge that political ads on its platform were being used to manipulate voters, and then to create a way for users to find out who was paying for them. Last December, the company finally reconfigured its hate-speech algorithm, after years of criticism from Black groups that the algorithm disproportionately removed posts by Black users discussing racial discrimination. “I think it’s more useful to make things happen and then, like, apologize later,” Zuckerberg said early in his career. We’ve witnessed the consequences ever since.

Here’s what Facebook’s normalization of deviance has looked like in the first few months of 2021: In February, internal company e-mails obtained by ProPublica revealed that, in 2018, the Turkish government demanded that Facebook block posts, in Turkey, from a mainly Kurdish militia group that was using them to alert Syrian Kurdish civilians to impending Turkish attacks against them, and made clear, according to Facebook, “that failing to do so would have led to its services in the country being completely shut down.” Sheryl Sandberg, Facebook’s C.O.O., told her team, “I’m fine with this.” (Reuters reported that the Turkish government had detained almost six hundred people in Turkey “for social media posts and protests criticizing its military offensive in Syria.”)

On April 3rd, Alon Gal, the chief technology officer of the cybercrime-intelligence firm Hudson Rock, reported that, sometime prior to September, 2019, the personal information of more than half a billion Facebook users had been “scraped” and posted to a public Web site frequented by hackers, where it is still available. The stolen data included names, addresses, phone numbers, e-mail addresses, and other identifying information. But, according to Mike Clark, Facebook’s product-management director, scraping data is not the same as hacking data—a technicality that will be lost on most people—so, apparently, the company was not obligated to let users know that their personal information had been stolen. “I have yet to see Facebook acknowledging this absolute negligence,” Gal wrote. An internal memo about the breach was inadvertently shared with a Dutch journalist, who posted it online. It stated that “assuming press volume continues to decline, we’re not planning additional statements on this issue. Longer term, though, we expect more scraping incidents and think it’s important to . . . normalize the fact that this activity happens regularly.” On April 16th, the group Digital Rights Ireland announced that it plans to sue Facebook over the breach, in what it calls “a mass action,” and Ireland’s privacy regulator, the Data Protection Commission, has opened an investigation to determine whether the company violated E.U. data rules. (Facebook’s European headquarters are in Dublin.)

On April 12th, the Guardian revealed new details about the experience of Sophie Zhang, a data scientist who posted an angry, cautionary farewell memo to her co-workers, before she left the company, last August. According to the newspaper, Zhang was fired for “spending too much time focused on uprooting civic fake engagement and not enough time on the priorities outlined by management.” “In the three years I’ve spent at Facebook, I’ve found multiple blatant attempts by foreign national governments to abuse our platform on vast scales to mislead their own citizenry,” Zhang wrote in the memo, which, the Guardian reports, Facebook tried to suppress. “We simply didn’t care enough to stop them.” A known loophole in one of Facebook’s products enabled corrupt governments to create fake followers and fake “likes,” which then triggered Facebook’s algorithms to boost their propaganda and legitimacy. According to the Guardian, when Zhang alerted higher-ups about how this was being used by the government of Honduras, an executive told her, “I don’t think Honduras is big on people’s minds here.” (A Facebook spokesperson told the newspaper, “We fundamentally disagree with Ms Zhang’s characterization of our priorities and efforts to root out abuse on our platform.”)

On April 13th, The Markup, a nonprofit, public-interest investigative Web site, reported that Facebook’s ad business was monetizing and reinforcing political polarization in the United States by allowing companies to target users based on their political beliefs. ExxonMobil, for example, was serving liberals with ads about its clean-energy initiatives, while conservatives were told that “the oil and gas industry is THE engine that powers America’s economy. Help us make sure unnecessary regulations don’t slow energy growth.” How did ExxonMobil know whom, specifically, to target? According to the report, through Facebook’s persistent monitoring of users’ activities and behaviors on and off the platform, and its delivery of these “custom audiences” to those willing to pay for ads.

On April 19th, Monika Bickert, Facebook’s vice-president of content policy, announced that, in anticipation of a verdict in the trial of Derek Chauvin, the company would remove hate speech, calls to violence, and misinformation relating to that trial. That accommodation was a tacit acknowledgment of the power that users of the platform have to incite violence and spread dangerous information, and it was reminiscent of the company’s decision, after the November election, to tweak its newsfeed algorithm in order to suppress partisan outlets, such as Breitbart. By mid-December, the original algorithm had been restored, prompting several employees to tell the Times’s Kevin Roose that Facebook executives had reduced or vetoed past efforts to combat misinformation and hate speech on the platform, “either because they hurt Facebook’s usage numbers or because executives feared they would disproportionately harm right-wing publishers.” According to the Tech Transparency Project, right-wing extremists spent months on Facebook organizing their storming of the Capitol, on January 6th. Last week, an internal Facebook report obtained by BuzzFeed News confirmed the company’s failure to stop coördinated “Stop the Steal” efforts on the platform. Soon afterward, Facebook removed the report from its employee message board.

Facebook has nearly three billion users. It is common to compare the company’s “population” with the population of countries, and to marvel that it is bigger than the biggest of them—China’s and India’s—combined. Facebook’s policy decisions often have outsized geopolitical and social ramifications, even though no one has elected or appointed Zuckerberg and his staff to run the world. The Guardian article about Zhang’s experience, for instance, concludes that “some of Facebook’s policy staff act as a kind of legislative branch in Facebook’s approximation of a global government.”

It’s possible to see Facebook’s Oversight Board, a deliberative body composed of twenty esteemed international jurists and academics, which the company created to rule on contentious content decisions, as another branch of its self-appointed parallel government. Indeed, when Zuckerberg first proposed the board, in 2018, he called it “almost like a Supreme Court.” Soon, the board will issue what is likely to be its most contentious ruling yet: whether to uphold the ban on Donald Trump, which Facebook instituted after the January 6th insurrection, on the ground that, as Zuckerberg put it at the time, “We believe the risks of allowing the President to continue to use our service during this period are simply too great.” That decision will not be a referendum on Trump’s disastrous Presidency, or on his promotion of Stop the Steal. Rather, it will answer a single, discrete question: Did Trump violate Facebook’s policies about what is allowed on its platform? This narrow brief is codified in the Oversight Board’s charter, which says that “the board will review content enforcement decisions and determine whether they were consistent with Facebook’s content policies and values.”

As events of the past few months have again demonstrated, Facebook’s policies and values have normalized the kind of deviance that enables a disregard for regions and populations who are not “big on people’s minds.” They are not democratic or humanistic but, rather, corporate. Whichever way the Trump decision—or any decision made by the Oversight Board—goes, this will still be true.
