[Foundation Model][Large Language Model] MPT-7B, MPT-7B-Instruct

MPT-7B (https://huggingface.co/mosaicml/mpt-7b) is a decoder-style transformer pretrained from scratch on 1T tokens of English text and code. The model was trained by MosaicML and is part of the MosaicPretrainedTransformer (MPT) family of models, which use a modified transformer architecture.


Original link: [Foundation Model][Large Language Model] MPT-7B, MPT-7B-Instruct