
Meta's AI model, called NLLB-200, can translate between 200 languages, a significant expansion over previous systems. The model covers roughly three times as many low-resource languages as high-resource ones, including many under-resourced languages that previously had little or no machine-translation support, and it performs 44% better than pre-existing translation systems. This advance can help speakers of rarely translated languages access the internet and other technologies, and it can broaden education by making more books and research articles available in those languages.

Marta Costa-jussà and the No Language Left Behind (NLLB) team employed a cross-language approach to enable translation of low-resource languages: the neural machine translation model learns to translate low-resource languages by building on its pre-existing ability to translate high-resource ones. Because knowledge transfers across languages within a single shared model, it can produce useful translations for low-resource languages even when little training data is available.
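The cross-language idea described above can be illustrated with a common multilingual-NMT convention: a single shared model is trained on sentence pairs from many languages at once, with a target-language token prepended to each source sentence so the model knows which language to produce. The sketch below is a simplified, hypothetical illustration of that data preparation, not Meta's actual pipeline; the sentences, language codes, and the helper `make_training_pair` are invented for the example.

```python
# Illustrative sketch of multilingual NMT data preparation (not NLLB's code).
# One shared model sees examples from all language pairs, with a token such
# as "<2fra>" prepended to tell it the desired target language. Training
# high- and low-resource pairs together lets representations learned on
# high-resource data transfer to low-resource directions.

def make_training_pair(src_text, src_lang, tgt_lang, tgt_text):
    """Return (tagged_source, target) for a shared multilingual model.

    The source is prefixed with a target-language token, so a single
    model can serve every translation direction.
    """
    return (f"<2{tgt_lang}> {src_text}", tgt_text)

# High-resource pair (English -> French): abundant parallel data.
high = make_training_pair("Hello, world.", "eng", "fra", "Bonjour le monde.")

# Low-resource pair (English -> Chichewa): scarce data, but it shares the
# same parameters and vocabulary, so it benefits from what the model
# learned on high-resource pairs.
low = make_training_pair("Hello, world.", "eng", "nya", "Moni dziko lapansi.")

print(high[0])  # "<2fra> Hello, world."
print(low[0])   # "<2nya> Hello, world."
```

Because both examples flow through the same model, gradients from the plentiful high-resource pairs shape the shared representations that the low-resource pairs then reuse, which is the essence of the transfer described above.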