Nature Podcast
Covert racism in AI chatbots, precise Stone Age engineering, and the science of paper cuts
- Author: Various
- Narrator: Various
- Publisher: Podcast
- Duration: 0:20:40
Synopsis
In this episode:

00:31 Chatbots make racist judgements on the basis of dialect

Research has shown that large language models, including those that power chatbots such as ChatGPT, make racist judgements on the basis of users' dialect. When asked to describe a person, many AI systems responded with racist stereotypes to text written in African American English (a dialect spoken by millions of people in the United States and associated with the descendants of enslaved African Americans) compared with text written in Standardized American English. The findings show that such models harbour covert racism even when they do not display overt racism, and that conventional fixes intended to address biases in these models had no effect on this issue.

Research Article: Hoffman et al.
News and Views: LLMs produce racist output when prompted in African American English
Nature News: Chatbot AI makes racist judgements on the basis of dialect

07:01 How ancient engineers built a megalithic structure

The 6,000-y