[Foundation Model][Large Language Model] GPT-NeoX-20B

GPT-NeoX-20B (https://huggingface.co/EleutherAI/gpt-neox-20b) is a 20-billion-parameter autoregressive language model trained on the Pile using the GPT-NeoX library. Its architecture intentionally resembles that of GPT-3 and is almost identical to that of GPT-J-6B. Its training dataset contains a multitude of English-language texts, reflecting the general-purpose nature of the model. It is a Transformer (decoder)-based model.
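The "20B" in the name can be sanity-checked with a back-of-the-envelope count. A minimal sketch, assuming the hyperparameters reported for GPT-NeoX-20B (44 layers, hidden size 6144, vocabulary padded to 50432, untied input and output embeddings):

```python
# Rough parameter count for a GPT-NeoX-20B-shaped decoder-only Transformer.
# Hyperparameters below are assumptions taken from the published model
# description; biases and LayerNorm parameters are ignored as negligible.
n_layers = 44
d_model = 6144
vocab = 50432

embeddings = 2 * vocab * d_model      # untied input + output embedding matrices
per_layer = 12 * d_model ** 2         # 4*d^2 attention (Q, K, V, out) + 8*d^2 MLP (4x expansion)
total = embeddings + n_layers * per_layer

print(f"{total / 1e9:.1f}B parameters")
```

This lands at roughly 20.6B, consistent with the model's name and its commonly cited size.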


Original post: [Foundation Model][Large Language Model] GPT-NeoX-20B