
PaLM 2 performs better in Indonesian, Korean, Swahili, and Telugu than it does in English!

Google hinted that they trained PaLM 2 on far more multilingual data than before. On certain translation tasks, PaLM 2 exceeds Google Translate's performance.
How will you distinguish between an LLM and any other transformer model? Google Translate has used a transformer since 2020. https://ai.googleblog.com/2020/06/recent-advances-in-google-translate.html?m=1
@RyanMoulton thanks for pointing this out. I'll have to read this post and draw up better resolution criteria.
Edit: this market is specifically about LLMs.
@firstuserhere What criteria would you use to differentiate an LLM from any other kind of pretrained transformer?
@NoaNabeshima Looked it up: "Google Translate uses a hybrid model of transformers and RNNs to get good performance."