Emily M. Bender


Emily Menon Bender (born 1973) is an American linguist who is a professor at the University of Washington. She specializes in computational linguistics and natural language processing. She is also the director of the University of Washington's Computational Linguistics Laboratory.[5][6] She has published several papers on the risks of large language models and on ethics in natural language processing.[7]

Born: 1973 (age 51–52)
Known for: Research on the risks of large language models and ethics of NLP; coining the term "stochastic parrot"; research on the use of Head-driven phrase structure grammar in computational linguistics
Spouse: Vijay Menon[1]
Mother: Sheila Bender[2]
Alma mater: UC Berkeley and Stanford University[3][4]
Thesis: Syntactic variation and linguistic competence: The case of AAVE copula absence (2000)[3][4]
Doctoral advisors: Tom Wasow, Penelope Eckert[4]
Discipline: Linguistics
Institutions: University of Washington

Education

Bender earned an AB in Linguistics from UC Berkeley in 1995. She received her MA from Stanford University in 1997 and her PhD from Stanford in 2000 for her research on syntactic variation and linguistic competence in African American Vernacular English (AAVE).[8][3] She was supervised by Tom Wasow and Penelope Eckert.[4]

Career

Before joining the University of Washington, Bender held positions at Stanford University and UC Berkeley and worked in industry at YY Technologies.[9] She has been on the faculty of the University of Washington since 2003 and holds several positions there, including professor in the Department of Linguistics, adjunct professor in the Department of Computer Science and Engineering, faculty director of the Master of Science in Computational Linguistics,[10] and director of the Computational Linguistics Laboratory.[11] Bender is the Howard and Frances Nostrand Endowed Professor.[12][13]

Bender was president of the Association for Computational Linguistics in 2024.[14][15][16] She was elected a Fellow of the American Association for the Advancement of Science in 2022.[17]

Contributions


Bender has published research papers on the linguistic structures of Japanese, Chintang, Mandarin, Wambaya, American Sign Language and English.[18]

Bender constructed the LinGO Grammar Matrix, an open-source starter kit for the development of broad-coverage precision HPSG grammars.[19][20] In 2013, she published Linguistic Fundamentals for Natural Language Processing: 100 Essentials from Morphology and Syntax, and in 2019, she published Linguistic Fundamentals for Natural Language Processing II: 100 Essentials from Semantics and Pragmatics with Alex Lascarides; both books explain basic linguistic principles in a way that makes them accessible to NLP practitioners.[citation needed]

In 2021, Bender presented the paper "On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? 🦜", co-authored with Google researcher Timnit Gebru and others, at the ACM Conference on Fairness, Accountability, and Transparency.[21] Google tried to block the paper from publication, part of a sequence of events that led to Gebru's departure from Google, the details of which are disputed.[22] The paper concerned ethical issues in building natural language processing systems using machine learning from large text corpora.[23] Since then, she has worked to popularize AI ethics and has taken a stand against hype over large language models.[24][25]

The Bender Rule, which originated from a question Bender repeatedly asked at research talks, is research advice for computational scholars to "always name the language you're working with".[1]

She draws a distinction between linguistic form and linguistic meaning.[1] Form refers to the structure of language (e.g. syntax), whereas meaning refers to the ideas that language represents. In a 2020 paper, she argued that machine learning models for natural language processing that are trained only on form, without connection to meaning, cannot meaningfully understand language.[26] On this basis, she has argued that tools like ChatGPT have no way to meaningfully understand the text that they process or the text that they generate.[citation needed]

Selected publications

Books

  • Bender, Emily M. (2000). Syntactic Variation and Linguistic Competence: The Case of AAVE Copula Absence. Stanford University. ISBN 978-0493085425.
  • Sag, Ivan; Wasow, Thomas; Bender, Emily M. (2003). Syntactic theory: A formal introduction. Center for the Study of Language and Information. ISBN 978-1575864006.
  • Bender, Emily M. (2013). Linguistic Fundamentals for Natural Language Processing: 100 Essentials from Morphology and Syntax. Synthesis Lectures on Human Language Technologies. Springer. ISBN 978-3031010224.
  • Bender, Emily M.; Lascarides, Alex (2019). Linguistic Fundamentals for Natural Language Processing II: 100 Essentials from Semantics and Pragmatics. Synthesis Lectures on Human Language Technologies. Springer. ISBN 978-3031010446.
  • Bender, Emily M.; Hanna, Alex (2025). The AI Con: How to Fight Big Tech's Hype and Create the Future We Want. Harper. ISBN 9780063418561.[27]

Articles

  • Bender, Emily (2000). "The Syntax of Mandarin Bă: Reconsidering the Verbal Analysis". Journal of East Asian Linguistics. 9 (2): 105–145. doi:10.1023/A:1008348224800. S2CID 115999663 via Academia.edu.
  • Bender, Emily M.; Flickinger, Dan; Oepen, Stephan (2002). The Grammar Matrix: An open-source starter-kit for the rapid development of cross-linguistically consistent broad-coverage precision grammars. Proceedings of the 2002 workshop on Grammar engineering and evaluation. Vol. 15.
  • Siegel, Melanie; Bender, Emily M. (2002). Efficient deep processing of Japanese. Proceedings of the 3rd workshop on Asian language resources and international standardization. Vol. 12.
  • Goodman, M. W.; Crowgey, J.; Xia, F; Bender, E. M. (2015). "Xigt: Extensible interlinear glossed text for natural language processing". Lang Resources & Evaluation. 49 (2): 455–485. doi:10.1007/s10579-014-9276-1. S2CID 254372685.
  • Xia, Fei; Lewis, William D.; Goodman, Michael Wayne; Slayden, Glenn; Georgi, Ryan; Crowgey, Joshua; Bender, Emily M. (2016). "Enriching A Massively Multilingual database of interlinear glossed text". Lang Resources & Evaluation. 50 (2): 321–349. doi:10.1007/s10579-015-9325-4. S2CID 254379828.
  • Bender, Emily M.; Gebru, Timnit; McMillan-Major, Angelina; Shmitchell, Shmargaret (2021). On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? 🦜. FAccT '21: Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency. doi:10.1145/3442188.3445922.
