The past year, especially since the pandemic, has been one giant demonstration of the consequences of inaction; the consequences of ignoring the many, many people who have been begging social media companies to take the meme-making extremists and conspiracy theorists that have thrived on their platforms seriously.
Facebook and Twitter acted to slow the rise of QAnon over the summer, but only after the pro-Trump conspiracy theory was able to grow relatively unrestricted there for three years. Account bans and algorithm tweaks have long been too little, too late to deal with racists, extremists and conspiracy theorists, and they have rarely addressed the fact that these powerful systems were working exactly as intended.
For a story in October, I spoke with a small handful of the people who could have told you this was coming. Researchers, technologists, and activists told me that major social media companies have, for the entirety of their history, chosen to do nothing, or to act only after their platforms caused abuse and harm.
Ariel Waldman tried to get Twitter to meaningfully address abuse there in 2008. Researchers like Shafiqah Hudson, I'Nasah Crockett, and Shireen Mitchell have spent years tracking exactly how harassment works and finds an audience on these platforms. Whitney Phillips talked about how she's haunted by laughter, not just from other people but also her own, back in the earliest days of her research into online culture and trolling, when overwhelmingly white researchers and personalities treated the extremists among them as edgy curiosities.
Ellen Pao, who briefly served as CEO of Reddit in 2014 and stepped down after introducing the platform's first anti-harassment policy, was astonished that Reddit had only banned r/The_Donald in June 2020, after evidence had built for years to show that the popular pro-Trump message board served as an organizing space for extremists and a channel for mob abuse. Of course, by the time it was banned, many of its users had already migrated away from Reddit to TheDonald.win, an independent forum created by the same people who ran the previous version. Its pages were filled with dozens of calls for violence ahead of Wednesday's rally-turned-attempted-coup.
Facebook, Twitter, and YouTube didn't create conspiracy thinking, or extremist ideologies, of course. Nor did they invent the idea of dangerous personality cults. But these platforms have, by design, handed those groups the mechanisms to reach much larger audiences much faster, and to recruit and radicalize new converts, even at the expense of the people and communities those ideologies target for abuse. And crucially, even when it was clear what was happening, they chose the minimal amount of change, or decided not to intervene at all.