Technology Has Social Consequences


July-August 2018, Volume 106, Number 4, Page 194

DOI: 10.1511/2018.106.4.194

In 2016, Facebook teamed up with Myanmar’s state-owned telecommunications operator, Myanma Posts and Telecommunications, in a move that allowed Myanmar residents to use Facebook without time spent on the site counting against their mobile phone data plans. Not surprisingly, the number of Facebook users in Myanmar skyrocketed, growing from 2 million in 2014 to 30 million in 2017.

However, something else skyrocketed—something the social media company apparently didn’t anticipate: the spread of violent hate speech and false information about the country’s persecuted Rohingya Muslim minority group. Many Myanmar residents now use Facebook as their primary source of news, and many have believed the anti-Rohingya propaganda that has gone viral on the site, some of it also shared on official government and military pages. The fake information included unfounded rumors of attacks and doctored photos, such as images falsely showing Rohingya torching their own homes.

In October 2017, the New York Times reported that Facebook’s rapid adoption by first-time internet users, whose inexperience with online information makes them far less likely to be skeptical of what they see, has made the site a particularly potent instrument for spreading hoaxes. Although Facebook has made some efforts to educate consumers in Myanmar about misinformation, activists have protested the company’s lackluster commitment to these efforts, noting that the company does not maintain a permanent office or staff in Myanmar. CEO Mark Zuckerberg did not publicly discuss the company’s plans to address the malicious messages until April 2018. In the meantime, conditions for Myanmar’s Rohingya population became more dire. Since August 2017, almost 700,000 Rohingya have fled their homes to escape mass killings, and the number of dead could be in the tens of thousands. Although the spread of propaganda was just one of several factors fueling the violence, a more immediate, rigorous intervention by social media companies could have reduced the kindling at a key time.

The abuse of technology for the purpose of terrorizing and controlling a population certainly precedes the existence of social media. Virginia Eubanks, a political scientist who studies the societal impacts of technology, points out that the serial numbers tattooed onto the forearms of inmates of the Nazi concentration camp at Auschwitz began as punch-card identification numbers. But as Eubanks explains in “High-Tech Homelessness,” automated algorithms have become ubiquitous in all of our lives, and we cede important decision-making to them whether we realize it or not. Such algorithms can have life-or-death consequences in the United States as well. Eubanks describes how poor and working-class people who need housing assistance must now apply through an automated system to access services. They must disclose frighteningly personal data that are entered into a system demonstrated to lack adequate data security—a system from which law enforcement can also request their information without probable cause.

In an era when data are a lucrative commodity, and our personal data define our identities and establish our ability to live normal lives, new technologies that aim to harness data need far greater scrutiny before they become entrenched. Designers of such systems, argues Eubanks, may have the best intentions, but they still must be held accountable for not reasonably anticipating, preventing, and counteracting abuses of their systems, or for not stemming breaches with sufficient alacrity. Facebook has been accused of being an “absentee landlord” in Myanmar, but the same criticism could be leveled at nearly every automated predictive or decision-making system currently in use, including those that Eubanks describes. As Eubanks points out, the poor and working class often unwittingly serve as a proving ground for new technologies that are eventually applied to the wider public. The only way to ensure that such systems are equitable is to build them that way from the beginning.—Fenella Saunders (@FenellaSaunders)
