GPT-J - Open-source cousin of GPT-3, everyone can use it

by Aditya

GPT-J-6B, a 6 billion parameter model trained on the Pile, is now available for use with our new codebase, Mesh Transformer JAX.


Replies

Zoya Brant
Game-changer. Can't wait to try it and compare it with GPT-3.
Angsuman Chakraborty
Can we just run the pre-trained model on a 1080 Ti?