How to Create and Deliver Intelligent Information

Machine translation: on the path to redemption?

There was a big stir in 2018 when, apparently out of nowhere, the quality of machine translation engines improved significantly. The term “neural machine translation” took not only the translation sector but also the general public by storm. Buzzwords like Artificial Intelligence, Machine Learning, Deep Learning and Data Mining were everywhere. Suddenly everyone was an expert, and opinion split into two camps: “Machine translation is useless and always will be” versus “Machine translation is the future, and translators should start looking for a new job right now”.

What is true today? How has the topic developed over the last two years, and how can companies successfully implement machine translation (MT) in their organizations?

Solution in sight

A classic claim in the MT industry is that the “translation problem” will be a thing of the past within five years. It has been that way since the 1950s, but we have now reached a point where even the most skeptical of MT skeptics will admit that the results are better – even useful! But what do we do with this information? How do we best implement machine translation in practice? This is where the experts become scarce.

One caveat up front: don’t be blinded by statements like “three out of four translators prefer our engine” or “our BLEU score is much better than the competition’s”. In the vast majority of cases, this is just successful marketing. Microsoft’s research report on supposed human parity for Chinese-English translations should also be treated with caution. For those interested in the topic, I recommend Tommi Nieminen’s assessment.
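For context on the BLEU claim above: BLEU is a corpus-level score between 0 and 100 that measures n-gram overlap between MT output and human reference translations, and it is only comparable when computed on the same test set. Here is a minimal sketch of how such a score is produced, using the open-source sacrebleu library (my choice for illustration, not necessarily what any vendor uses) and invented example sentences:

```python
# A minimal BLEU computation with sacrebleu; the sentences are invented examples.
import sacrebleu

hypotheses = [  # raw MT output, one entry per test sentence
    "The machine translates the sentence fluently.",
    "The quality depends on the training data.",
]
references = [[  # one human reference per test sentence
    "The machine translates the sentence fluently.",
    "Quality depends on the training data used.",
]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {bleu.score:.1f}")  # corpus-level score, 0-100
```

A vendor’s 45 on its own in-house test set simply cannot be compared with a competitor’s 42 on a different test set, which is exactly why such claims belong in the marketing drawer.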

Generic or individualized machine translation?

Google Translate, DeepL, Microsoft Translator, Amazon Translate and others are generic MT systems. These systems are trained on large amounts of data from many different domains. As a result, the translations read very fluently, but the terminology does not fit every specialist area, or terms are translated incorrectly because training data for the particular domain is lacking. Generic systems are therefore better suited to companies that do not use highly specialized terminology.
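To make “generic” concrete: these systems are typically queried through a simple web API, with no company-specific customization involved. The sketch below uses DeepL’s public v2 REST API as documented at the time of writing; the API key is a placeholder, and error handling is kept minimal:

```python
# Querying a generic MT engine via DeepL's public v2 REST API.
# The API key is a placeholder; the endpoint is the documented free-tier URL.
import requests

DEEPL_URL = "https://api-free.deepl.com/v2/translate"
API_KEY = "your-api-key"  # placeholder

def translate(text: str, target_lang: str) -> str:
    """Send one segment to the generic engine and return the raw translation."""
    response = requests.post(
        DEEPL_URL,
        data={"auth_key": API_KEY, "text": text, "target_lang": target_lang},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["translations"][0]["text"]

# Fluent output is likely; correct company terminology is not guaranteed.
print(translate("Vent the pump housing before commissioning.", "DE"))
```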

Individualized systems, on the other hand, are trained with customer-specific data to ensure that both the terminology and the corporate language are taken into account in the translations. The result is engines whose raw translations are of higher quality and require less post-editing.

Which of the two options is better, and whether machine translation is viable for the company at all, depends on many factors: What priority is given to data security in the company? How large is the company’s translation memory? How much money should be invested in the new technology? The introduction of an MT system is a long process that does not guarantee a short-term return on investment. If a custom engine is to be used, it can easily take a year to set up, train and test it properly. A good provider or consultant will make you aware of this fact and will not promise you the moon.

Do I need to train the machine translation engine at all?

In the last two years, DeepL has caused quite a sensation. Now, many company decision-makers think that machine translation can be connected to the translation management system within a day and that translation costs can be halved immediately. Or, even better: that the translation management system has become superfluous altogether, because DeepL and its competitors can translate entire documents anyway. This is tricky territory.

Yes, it can be done. And yes, at first glance the results are sometimes very good. The crucial factor for high MT quality is the subject area of the text to be translated. Emails, customer reviews and social media posts no longer pose a challenge for MT.

However, if you try to translate a marketing brochure or an annual financial statement with a generic engine, clear differences in quality become apparent. Bear in mind, too, that output quality can vary greatly depending on the language pair: English-German or English-French translations are often very good, but with more “exotic” language pairs, quality quickly suffers. For some use cases, the output quality of generic engines combined with post-editing may be perfectly sufficient – and that’s a good thing!

The best part of the development over the last two years is the ever-increasing number of MT providers – now more than 100. A company that wants to integrate machine translation into its localization workflow has plenty of choice.

To find out whether the output quality of a generic engine is sufficient or whether an engine should be customized, it is best to test extensively with several engines on your own content and then make an informed decision.
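What such testing can look like in practice: collect raw output from each candidate engine for a held-out, in-domain test set with trusted human references, then score and compare. The sketch below again uses sacrebleu (chrF alongside BLEU, since chrF is often more robust for morphologically rich target languages); the engine names and sample sentences are invented:

```python
# Comparing candidate engines on a company-specific test set.
# Engine names and sentences are invented; scores are for ranking only.
import sacrebleu

references = [[  # trusted human translations, held out from any training data
    "Das Pumpengehäuse muss vor der Inbetriebnahme entlüftet werden.",
    "Bei unsachgemäßem Gebrauch erlischt die Garantie.",
]]

engine_output = {  # raw MT output collected from each candidate engine
    "engine_a": [
        "Das Pumpengehäuse muss vor Inbetriebnahme entlüftet werden.",
        "Bei unsachgemäßer Verwendung erlischt die Garantie.",
    ],
    "engine_b": [
        "Das Pumpenhaus muss vor dem Start belüftet werden.",
        "Die Garantie endet bei falschem Gebrauch.",
    ],
}

for engine, hypotheses in engine_output.items():
    bleu = sacrebleu.corpus_bleu(hypotheses, references)
    chrf = sacrebleu.corpus_chrf(hypotheses, references)
    print(f"{engine}: BLEU {bleu.score:5.1f}  chrF {chrf.score:5.1f}")
```

Automatic scores should only narrow the field; the final decision belongs with the people who will actually post-edit the output.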

The challenge of the training data

In computer science, the phrase “Garbage In, Garbage Out” is well-known. The same is true for machine translation. In order to individualize an MT engine, large topic-specific corpora are necessary. Globalese, a platform on which individualized engines can be created, specifies a minimum size of 100,000 segments (approx. 1 million words) per language pair and domain.
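Whether a translation memory clears that threshold is easy to check. The sketch below counts translation units and source-side words in a TMX export (the standard XML exchange format for translation memories); the file name is a placeholder, and the limits are the Globalese figures quoted above:

```python
# Checking a TMX export against the corpus minimum quoted above.
# The file path is a placeholder; thresholds are the Globalese figures.
import xml.etree.ElementTree as ET

MIN_SEGMENTS = 100_000
MIN_WORDS = 1_000_000  # approx., source-side words

root = ET.parse("company_tm_en-de.tmx").getroot()  # placeholder path

segments = 0
words = 0
for tu in root.iter("tu"):          # one <tu> per translation unit
    segments += 1
    seg = tu.find("tuv/seg")        # first language variant = source side
    if seg is not None:
        words += len("".join(seg.itertext()).split())

print(f"{segments:,} segments, {words:,} source words")
if segments < MIN_SEGMENTS or words < MIN_WORDS:
    print("Below the quoted minimum for this language pair and domain.")
```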

Does this mean that machine translation is only suitable for companies with large translation memories? The answer is: yes and no. A good partner will show you various ways of obtaining additional training data. However, your own translation memories and terminology databases remain essential for individualization.

In the past, the rule of thumb was that a machine translation system gets better the more data it is fed. With neural systems, this is no longer entirely true: a lot of data is still required, but it must also be high-quality and domain-specific.
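“High-quality and domain-specific” translates into concrete filtering steps before training: dropping empty or untranslated segments, exact duplicates, and obviously misaligned pairs. A sketch of such a cleaning pass, with illustrative (not prescriptive) thresholds:

```python
# Typical quality filters applied to (source, target) pairs before training.
# Thresholds are illustrative; tune them to your language pair and content.
def clean_corpus(pairs):
    seen = set()
    cleaned = []
    for src, tgt in pairs:
        src, tgt = src.strip(), tgt.strip()
        if not src or not tgt:
            continue  # empty segment on either side
        if src == tgt:
            continue  # untranslated segment left in the TM
        if (src, tgt) in seen:
            continue  # exact duplicate
        ratio = len(src.split()) / max(len(tgt.split()), 1)
        if not 0.5 <= ratio <= 2.0:
            continue  # suspicious length ratio, likely misaligned
        seen.add((src, tgt))
        cleaned.append((src, tgt))
    return cleaned

corpus = [
    ("Open the valve.", "Öffnen Sie das Ventil."),
    ("Open the valve.", "Öffnen Sie das Ventil."),  # duplicate
    ("Torque: 12 Nm", ""),                          # empty target
]
print(clean_corpus(corpus))  # only the first pair survives
```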

Translators are indispensable

The project can only succeed if there is close cooperation with the translators or post-editors at the end of the supply chain. After all, when machine translation is used, post-editing is needed to ensure the quality of the texts.

Post-editing is not a skill that a translator masters automatically. To work productively, a post-editor must process approximately 7,000 words per day, compared to an average of 2,000 words per day in “classic” technical translation. That takes practice. It therefore makes sense to offer post-editing training to your regular translators. After all, the company is investing a lot of money in a new technology; without the right expertise at the end of the supply chain, the new opportunities cannot be fully exploited.

A further prerequisite for success is that freelance translators are paid appropriately. There is one major reason why translators are reluctant to accept post-editing jobs: inadequate payment relative to the quality of the raw translation. For the past two years, companies and language service providers have increasingly produced translations with generic engines and sent them, unedited, straight to post-editing. Because of the high error rate, such translations often have to be created from scratch – at about one third of the usual word price. Make sure you pay a reasonable fee, and lower the per-word or per-line rate for post-editing gradually over several months. This gives your translators time to get used to the new way of working. Also, listen to their feedback on output quality and pay them accordingly.
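A quick back-of-the-envelope calculation with the throughput figures above shows why “one third of the usual word price” is only fair once a translator has actually reached full post-editing productivity. The base rate here is a placeholder; substitute your own:

```python
# Break-even check based on the 2,000 vs. 7,000 words/day figures above.
# The base word rate is a placeholder; substitute your actual rates.
BASE_RATE = 0.15         # EUR per word, "classic" technical translation
TRANSLATION_WPD = 2_000  # words per day, classic translation
POSTEDIT_WPD = 7_000     # words per day, experienced post-editor

daily_income = BASE_RATE * TRANSLATION_WPD  # 300.00 EUR per day
break_even = daily_income / POSTEDIT_WPD    # about 0.043 EUR per word

print(f"Break-even post-editing rate: {break_even:.3f} EUR/word")
```

That is indeed roughly a third of the base rate, but only for someone who already manages 7,000 words a day on decent raw output. Paying a third from day one, on unedited generic output, cuts the translator’s income instead of sharing the savings.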

Conclusion

Machine translation can be a good tool for saving translation costs in the medium to long term. From DeepL to large customized solutions, there is now a solution for almost every use case. However, it should be noted in advance that not every type of text is equally suitable for machine translation.

For implementation to succeed, a substantial project must be set up, with additional resources and manpower made available. A prerequisite for success is good cooperation with the translators: without it, there is no post-editing and no feedback on the machine translations – feedback that is essential for the further development and improvement of the engine. In addition, the MT engine should be connected to the internal translation management system via an interface in order to exploit the strengths of both systems.

