Google removed the offensive phrase from the word’s entry shortly after receiving criticism from groups like the Council on American-Islamic Relations (CAIR). “Stereotyping often results in the type of bias that negatively impacts all minority communities. We welcome the swift resolution of this issue and hope measures will be implemented to ensure that translation services do not produce such stereotypical results for any language,” said Nihad Awad, the national executive director of CAIR.

“Google Translate is an automatic translator, using patterns from millions of existing translations as well as user queries to help decide on the best translation and autocomplete suggestions for our users,” Google said in a statement apologizing for the most recent error. “Unfortunately, some of those patterns can lead to unintentional autocomplete suggestions.”

Such bias, while concerning, is not entirely the developers’ fault: because machine translation systems are trained on huge datasets of human language, they may replicate human biases. Experts on the technology have stated that developers must consider the consequences of replicating human bias and work toward reducing it as much as possible. A similar problem is present in large language models, which have been the subject of debate for their potential to amplify hate speech.
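To see how pattern-based suggestions can reproduce bias without anyone intending it, here is a minimal sketch in Python. This is not Google’s system; `suggest_completion` and the toy corpus are hypothetical, and the skewed frequencies simply stand in for real-world data skew.

```python
from collections import Counter

def suggest_completion(corpus, prefix):
    """Return the most frequent continuation of `prefix` in the corpus.

    A frequency-based suggester simply mirrors its training data:
    if a stereotyped phrasing dominates the corpus, it dominates
    the suggestions, with no explicit rule encoding the bias.
    """
    continuations = [
        phrase[len(prefix):].strip()
        for phrase in corpus
        if phrase.startswith(prefix)
    ]
    if not continuations:
        return None
    return Counter(continuations).most_common(1)[0][0]

# Toy corpus: the majority phrasing wins, whatever it happens to be.
corpus = [
    "nurses are women", "nurses are women", "nurses are women",
    "nurses are skilled",
]
print(suggest_completion(corpus, "nurses are"))  # → "women"
```

The point of the sketch is that the suggestion logic itself is neutral; the bias enters entirely through the distribution of the training text, which is why cleaning or reweighting the data matters as much as the algorithm.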