• she/her🏳️‍⚧️🏳️‍🌈

🎞️📷
i climb a lot.
learning doctor. i help people make computers do things.

but here it’s mostly manga

💖 @kaybee 💖



papers
@papers

Abstract

Content Warning: This paper contains examples of stereotypes and associations, misgendering, erasure, and other harms that could be offensive and triggering to trans and nonbinary individuals.

Gender is widely discussed in the context of language tasks and when examining the stereotypes propagated by language models. However, current discussions primarily treat gender as binary, which can perpetuate harms such as the cyclical erasure of non-binary gender identities. These harms are driven by model and dataset biases, which are consequences of the non-recognition and lack of understanding of non-binary genders in society. In this paper, we explain the complexity of gender and language around it, and survey non-binary persons to understand harms associated with the treatment of gender as binary in English language technologies. We also detail how current language representations (e.g., GloVe, BERT) capture and perpetuate these harms and related challenges that need to be acknowledged and addressed for representations to equitably encode gender information.

BibTeX

@inproceedings{dev-etal-2021-harms,
    title = "Harms of Gender Exclusivity and Challenges in Non-Binary Representation in Language Technologies",
    author = "Dev, Sunipa  and
      Monajatipoor, Masoud  and
      Ovalle, Anaelia  and
      Subramonian, Arjun  and
      Phillips, Jeff  and
      Chang, Kai-Wei",
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
    month = nov,
    year = "2021",
    address = "Online and Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.emnlp-main.150",
    doi = "10.18653/v1/2021.emnlp-main.150",
    pages = "1968--1994",
    abstract = "Gender is widely discussed in the context of language tasks and when examining the stereotypes propagated by language models. However, current discussions primarily treat gender as binary, which can perpetuate harms such as the cyclical erasure of non-binary gender identities. These harms are driven by model and dataset biases, which are consequences of the non-recognition and lack of understanding of non-binary genders in society. In this paper, we explain the complexity of gender and language around it, and survey non-binary persons to understand harms associated with the treatment of gender as binary in English language technologies. We also detail how current language representations (e.g., GloVe, BERT) capture and perpetuate these harms and related challenges that need to be acknowledged and addressed for representations to equitably encode gender information.",
}

catball
@catball

so I made this mockup account @papers the other day to see what it could look like

