From Adrienne LaFrance's December 15, 2020, Atlantic article, "Facebook Is a Doomsday Machine":
People tend to complain about Facebook as if something recently curdled. There’s a notion that the social web was once useful, or at least that it could have been good, if only we had pulled a few levers: some moderation and fact-checking here, a bit of regulation there, perhaps a federal antitrust lawsuit. But that’s far too sunny and shortsighted a view. Today’s social networks, Facebook chief among them, were built to encourage the things that make them so harmful. It is in their very architecture [...].
The social web is doing exactly what it was built for. Facebook does not exist to seek truth and report it, or to improve civic health, or to hold the powerful to account, or to represent the interests of its users, though these phenomena may be occasional by-products of its existence. The company’s early mission was to “give people the power to share and make the world more open and connected.” Instead, it took the concept of “community” and sapped it of all moral meaning. The rise of QAnon, for example, is one of the social web’s logical conclusions. That’s because Facebook—along with Google and YouTube—is perfect for amplifying and spreading disinformation at lightning speed to global audiences. Facebook is an agent of government propaganda, targeted harassment, terrorist recruitment, emotional manipulation, and genocide—a world-historic weapon that lives not underground, but in a Disneyland-inspired campus in Menlo Park, California [...].
I recalled Clinton’s warning a few weeks ago, when Zuckerberg defended the decision not to suspend Steve Bannon from Facebook after he argued, in essence, for the beheading of two senior U.S. officials, the infectious-disease doctor Anthony Fauci and FBI Director Christopher Wray. The episode got me thinking about a question that’s unanswerable but that I keep asking people anyway: How much real-world violence would never have happened if Facebook didn’t exist? One of the people I’ve asked is Joshua Geltzer, a former White House counterterrorism official who is now teaching at Georgetown Law. In counterterrorism circles, he told me, people are fond of pointing out how good the United States has been at keeping terrorists out since 9/11. That’s wrong, he said. In fact, “terrorists are entering every single day, every single hour, every single minute” through Facebook.
The website that’s perhaps best known for encouraging mass violence is the image board 4chan—which was followed by 8chan, which then became 8kun. These boards are infamous for being the sites where multiple mass-shooting suspects have shared manifestos before homicide sprees. The few people who are willing to defend these sites unconditionally do so from a position of free-speech absolutism. That argument is worthy of consideration. But there’s something architectural about the site that merits attention, too: There are no algorithms on 8kun, only a community of users who post what they want. People use 8kun to publish abhorrent ideas, but at least the community isn’t pretending to be something it’s not. The biggest social platforms claim to be similarly neutral and pro–free speech when in fact no two people see the same feed. Algorithmically tweaked environments feed on user data and manipulate user experience, and not ultimately for the purpose of serving the user. Evidence of real-world violence can be easily traced back to both Facebook and 8kun. But 8kun doesn’t manipulate its users or the informational environment they’re in. Both sites are harmful. But Facebook might actually be worse for humanity.
“What a dreadful set of choices when you frame it that way,” Geltzer told me when I put this question to him in another conversation. “The idea of a free-for-all sounds really bad until you see what the purportedly moderated and curated set of platforms is yielding … It may not be blood onscreen, but it can really do a lot of damage.”
In previous eras, U.S. officials could at least study, say, Nazi propaganda during World War II, and fully grasp what the Nazis wanted people to believe. Today, “it’s not a filter bubble; it’s a filter shroud,” Geltzer said. “I don’t even know what others with personalized experiences are seeing.”
To read the rest of the article, click HERE.