Alexander von Humboldt Institute for Internet and Society (HIIG)


How to identify bias in Natural Language Processing

Why do translation programs and chatbots so often reproduce discriminatory tendencies regarding gender or race? Here is an easy guide to understanding how bias in natural language processing works, and why sexist technologies such as search engines are not just an unfortunate coincidence.



What is bias in translation programs?

Have you ever used machine translation to translate a sentence from Estonian? In some languages, like Estonian, pronouns and nouns do not indicate gender. When translating into English, the software has to make a choice: which word becomes male and which female? Often this choice is grounded in stereotypes. Is that just a coincidence?
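You can probe this yourself by feeding a translation model a few Estonian sentences that use the gender-neutral pronoun "ta" and checking which English pronoun it picks. Below is a minimal sketch using the Hugging Face transformers library; the model name and the example sentences are illustrative assumptions, not a specific system discussed in this article.

# Minimal sketch of probing a translation model for gender bias.
# The model name "Helsinki-NLP/opus-mt-et-en" and the example
# sentences are assumptions chosen for illustration.
from transformers import pipeline

# Estonian uses the gender-neutral pronoun "ta" for both "he" and "she".
sentences = [
    "Ta on arst.",             # "Ta" is a doctor.
    "Ta on lasteaiaõpetaja.",  # "Ta" is a kindergarten teacher.
]

# Load an Estonian-to-English translation pipeline (model name assumed).
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-et-en")

for sentence in sentences:
    result = translator(sentence)[0]["translation_text"]
    # Inspect which gendered English pronoun was chosen for the neutral "ta".
    print(f"{sentence} -> {result}")

If the model systematically renders "ta" as "he" for some professions and "she" for others, that pattern is exactly the kind of stereotype-driven choice described above.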


Authors:

Freya Hewett, Researcher: AI & Society Lab

Sami Nenno, Student Assistant: AI & Society Lab
