Summary
Elon Musk’s presence has loomed over Twitter since he announced plans to purchase the platform, and in the few weeks that he has been in charge, many concerns have proven justified. Musk laid off 3,700 employees, then 4,400 contractors. He is firing those who are critical of him. The verification process, perhaps one of Twitter’s most trusted features, has been unraveled. He offered severance to those who don’t want to be part of an “extremely hardcore” Twitter. Following the results of a Twitter poll, he reinstated the account of Donald Trump, who was suspended from the platform for his role in inciting the January 6th attack.

So, what happens now? What of the many social movements that manifested on Twitter? While some movements and followings may find new homes on other platforms, not everything will be completely recreated. For example, as writer Jason Parham explains, “whatever the destination, Black Twitter will be increasingly difficult to recreate.”

In this episode of Community Signal, Patrick speaks with three experts about the current state and future of Twitter: Sarah T. Roberts, associate professor in the Department of Information Studies at UCLA; trust and safety consultant Ralph Spencer; and Omar Wasow, assistant professor in UC Berkeley’s Department of Political Science and co-founder of BlackPlanet. They dissect the realities facing the platform today, including content moderation, loss of institutional knowledge, and uncertainty about Twitter’s infrastructure, while also emphasizing the importance of Twitter as a social utility for news and more.

This episode also touches on:

The reality of moderating a platform like Twitter
What platforms actually mean when they say they’re for “free speech”
How Musk tanked the value of verification on Twitter

Big Quotes

On the future of content moderation at Twitter (8:28): “There’s no way possible with the cuts [Musk has] made that he’s going to be able to do any type of content moderation.
… [He] isn’t going to have anybody who remotely begins to know how to do that [legal compliance and related work].” –Ralph Spencer

Sarah T. Roberts’ moderation challenge for Elon Musk (11:19): “I want Elon Musk to spend one day as a frontline production content moderator, and then get back to this [Community Signal] crew about how that went. Let us know what you saw. Share with us how easy it was to stomach that. Were you able to keep up with the expected pace at Twitter? Could you … make good decisions over 90% of the time, over 1,000, 2,000 times a day? Could you do that all the while seeing animals being harmed, kids being beat on, [and] child sexual exploitation material?” –@ubiquity75

Bumper sticker wisdom doesn’t make good policy (15:46): “Everything [Musk has said about free speech] has had the quality of good bumper stickers but is totally divorced from reality, and that doesn’t bode well, obviously.” –@owasow

The responsibility in leading a social media platform (19:41): “One thing that we are seeing in real time [at Twitter] is what a danger there is in having one individual – especially a very privileged individual who does not live in the same social milieu as almost anyone else in the world – one very privileged individual’s ability to be the arbiter of … these profoundly contested ideological notions of something like free speech, which again is continually misapplied in this realm.” –@ubiquity75

Musk’s peddling of conspiracy theories (20:29): “[Musk is] running around tweeting that story about Nancy Pelosi’s husband, the false article about what happened between him and his attacker. What kind of example is that to set?
… What it is to me is like this kid who has way too much money, and he found a new toy he wants to play with.” –Ralph Spencer

Leading with humility (21:23): “[If you’re running a site like Twitter,] you have to have a ‘small d’ democratic personality, which is to say you really have to be comfortable with a thousand voices flourishing, a lot of them being critical of you, and that’s not something that you take personally.” –@owasow

There are always limits on speech (23:50): “When you declare that your product, your site, your platform, your service is a free speech zone, there is always going to be a limit on that speech. … [CSAM] is the most extreme example that we can come up with, but that is content moderation. To remove that material, to disallow it, to enforce the law means that there is a limit on speech, and there ought to be in that case. If there’s a limit on speech, it is by definition not a free speech site. Then we have to ask, well, what are the limits, and who do they serve?” –@ubiquity75

“Free speech” platforms are not a thing (25:25): “When I hear people invoke free speech on a for-profit social media site, not only does that not exist today, it never ...