[Foundation Model][Large Language Model] mT5-xxl

mT5-xxl: https://huggingface.co/google/mt5-xxl (google/mt5-xxl · Hugging Face)

Google's mT5 is an encoder-decoder model based on the T5 architecture, pretrained on the mC4 corpus covering 101 languages, including Afrikaans, Albanian, Amharic, Arabic, Armenian, Azerbaijani, Basque, Belarusian, Bengali, Bulgarian, Burmese, Catalan, Cebuano, Chichewa, Chinese, Corsican, Czech, Danish, Dutch, English, and more.
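Below is a minimal sketch of how the checkpoint might be loaded with the Hugging Face transformers library. The example prompt and generation settings are illustrative only; mT5 is released as a pretrained-only checkpoint (span corruption, no supervised fine-tuning), so in practice it should be fine-tuned before use on a downstream task.

```python
# Minimal sketch: loading google/mt5-xxl with transformers (illustrative only).
# mT5-xxl is the largest mT5 variant (~13B parameters) and needs substantial
# GPU memory; the prompt below is just an example of the span-corruption format.
from transformers import MT5ForConditionalGeneration, AutoTokenizer

model_name = "google/mt5-xxl"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = MT5ForConditionalGeneration.from_pretrained(model_name)

# T5-style sentinel token <extra_id_0> marks the span the model should fill in.
inputs = tokenizer("The capital of France is <extra_id_0>.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```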


Original link: [Foundation Model][Large Language Model] mT5-xxl