We all know automatic translation programmes aren’t perfect. But as with humans, the mistakes they make can be revealing. In some cases, they aren’t even mistakes – just a choice of words that reveals a particular bias.
For example, try using Google Translate to render “physics teacher” or “maths teacher” in German. You’ll get Physiklehrer and Mathe-Lehrer, both male forms of the word. Unlike English, where “teacher” is a gender-neutral term, German distinguishes der Lehrer (male) from die Lehrerin (female).
This type of gender bias isn’t uncommon in language. For example, it’s still common to use “he” in English to refer to “he or she”. But try “French teacher”, “nursery teacher” or “cooking teacher”, and Google Translate comes up with the feminine version!
That’s not the only time the programme shows a somewhat old-fashioned attitude to career choices. Engineer, doctor, journalist and president are all regarded as “jobs for the boys”; nurse, flight attendant, receptionist and childminder are “jobs for the girls”. An “author” is male, but a “children’s author” is female. Some of the same biases are evident in French, Spanish and Italian.
Of course, this reflects a weakness in this type of statistical machine translation. Google Translate bases its results on huge volumes of documents, but it has no awareness of context, no recognition of bias (and no knowledge of equal opportunities laws!). This means that if its data shows “French teacher” is usually translated with a female noun, that’s what it comes up with. And if most doctors in its corpus are male, it will produce the male version.
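The frequency-driven behaviour described above can be sketched in a few lines of Python. This is a toy illustration with invented counts, not Google’s actual algorithm: the system simply returns whichever translation it has seen most often, with no notion of context or gender.

```python
from collections import Counter

# Toy parallel-corpus data: how often each German rendering of a phrase
# appears. The counts are invented for illustration only.
observed_translations = {
    "physics teacher": ["Physiklehrer"] * 80 + ["Physiklehrerin"] * 20,
    "french teacher": ["Französischlehrerin"] * 70 + ["Französischlehrer"] * 30,
}

def most_likely_translation(phrase: str) -> str:
    """Return the translation seen most often in the corpus.

    Note what is missing: no awareness of context, no check for
    gender bias - whichever form dominates the data simply wins.
    """
    counts = Counter(observed_translations[phrase])
    return counts.most_common(1)[0][0]

print(most_likely_translation("physics teacher"))  # masculine form wins
print(most_likely_translation("french teacher"))   # feminine form wins
```

Because the choice is purely statistical, shifting the counts in the corpus would flip the output: the model reproduces whatever imbalance its training data contains.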
This problem is more evident in languages such as Hebrew, which have different verb forms depending on whether the subject is male or female. Usually, the male form is used when the sex is unknown. But some Hebrew users have complained that the machine translation tool renders phrases such as “I don’t know how to drive a car”, “I wash the floor” and “I go shopping” in the female form!
Google Translate also suggests the most likely phrase in some cases, with similarly gendered results.
Google staff have said that these results stem from the statistical nature of the programme, and that the machine logic sometimes needs improvement. They say they are constantly working to improve the system, and encourage users to “contribute a better translation”.
But this type of unintentional bias is one reason why it’s best not to rely on web-based machine translation tools – especially for external documents. And it’s always best to check the results carefully – if only to avoid offending any male nurses or female engineers who might be reading your text!