• Kinks vs. Crimes and Gender-Inclusive Content Moderation at Grindr

  • 2023/05/01
  • Duration: 27 minutes
  • Podcast


  • Summary

  • Bodies aren’t moderated equally on the internet. Content moderation efforts, especially those at large, mainstream platforms, can suffer from policy-based bias that results in moderation centering a cisgender gaze. This reinforcing of heteronormativity can leave some of your most vulnerable community members – and potential community members – feeling alienated, ostracized, and simply unwelcome. Last year, in her role as CX escalations supervisor at Grindr, Vanity Brown co-authored a whitepaper, Best Practices for Gender-Inclusive Content Moderation. Insightful, with a straightforward approach to making content moderation just a bit better, I found that it was also a validation of good, thoughtful moderation that has been going on for a long time. Vanity joins the show to talk about these efforts, which are tempered by a realistic acknowledgement of the limitations of this work, and how our need to be in other places (like app stores) can often slow down the progress we’d like to make.

We also discuss:

  • Why it’s not our job to guess the gender of our members
  • The state of AI trust and safety tools
  • ChatGPT, Midjourney, and how much to worry about them

Big Quotes

How bodies are moderated differently online (2:16): “We want folks to express themselves and their sexuality joyfully, without judgment. Of course, without any harm. But what does that look like? … There traditionally are [community] guidelines for females and guidelines for males, but the world is changing and folks are becoming more in tune with who they are, and we want to be able to treat them equally and let folks, especially I emphasize our trans users, who are uploading photos … and if they are showing the top, then they’re considered a woman if they have female-presenting breasts versus male. There are just a lot of nuances there that we saw as we were moderating content from a community who is very fluid with their gender expression.” -Vanity Brown

When do kinks create a moderation issue? (6:38): “[Kinks vs. crimes get] sticky when the kink looks like a crime. … Everything is about sex and kinks at Grindr. With this mass of kinky stuff, which of these things are harmful? I often echo that, in my work, I’m always driven … to do no harm. At the end of the day, are we harming someone? … Do we have a responsibility to protect them and keep them safe? As we continue to build trust with the community, we have to realize that folks are adults, too.” -Vanity Brown

Empathy sits at the core of good moderation (14:38): “If you can’t be empathetic for the things you are not … then you’re not really doing good thoughtful community moderation, trust and safety work. … Ultimately, if you want to be truly great at this work, you have to protect the people who aren’t you.” -Patrick O’Keefe

What can community pros learn from dating apps? (24:23): “[Community, moderation, trust, and safety pros] can learn from dating apps on the level of how personal and sensitive dating apps are in the content you’re sending back and forth. Folks using dating apps, a lot of times their heartstrings are attached, and their heartstrings are attached on a dating app, but not necessarily Amazon or shopping at Macy’s. … It’s just important to look at folks with a microscope and treat them with kindness as those in dating apps hopefully are doing when they’re handling their customers.” -Vanity Brown

About Vanity Brown

Vanity Brown is the CX escalations supervisor for Grindr, where she has worked in trust and safety for over 2 years, following more than 7 years at eHarmony. Vanity manages an escalations team of specialists devoted to handling the most complex cases that come through Grindr’s support channels.

Related Links

  • Vanity on LinkedIn
  • Grindr, where Vanity is CX escalations supervisor
  • Best Practices for Gender-Inclusive Content Moderation whitepaper, co-authored by Alice Hunsberger, Vanity, and Lily Galib, which I found via Juliet Shen
  • Grindr’s community guidelines
  • OpenAI’s efforts to identify AI-generated text, which were only able to identify “likely” AI-written text 26% of the time, a bit more than the approximately 10% I mentioned during the show
  • Love Light Community, a youth choir founded by Vanity, dedicated to “enriching the lives of youth and families in underserved communities through the transforming power of music and the arts”
  • Love Light Community on Instagram

Transcript

View transcript on our website

Your Thoughts

If you have any thoughts on this episode that you’d like to share, please leave me a comment or send me an email. If you enjoy the show, we would be so grateful if you spread the word and supported Community Signal on Patreon.

