
OpenAI Set to Unveil Open Source GPT Models Soon

Big news alert! OpenAI may be poised to drop a bombshell in the AI world, with hints that it could release its GPT models as open source.
Is OpenAI prepping to release its GPT models as open source? Judging by some interesting bits Andrej Karpathy slipped into his comments, it might seem so. Karpathy is an established authority in the field of deep learning and has been part of the OpenAI team from the very beginning.
The conversation around the potential open sourcing of GPT models began on Twitter, where users asked Karpathy why he was working on Llama 2 instead of building Jarvis for OpenAI. In response, he intriguingly mentioned, “If/when OpenAI was to release models as weights (which I can neither confirm nor deny!) then most of the code here would be very relevant.”
Karpathy’s ongoing exploration of running large language models (LLMs) on a single computer, particularly with Baby Llama (llama2.c), has further fueled speculation. His experiments were inspired by Meta’s release of Llama 2, and he has been actively loading and running inference on Meta’s models using llama2.c.
The technical details shared by Karpathy shed light on the promising performance of this approach. He showed that the smallest Llama 2 model, at 7B parameters, can be inferred at approximately 3 tokens per second on 96 OMP threads on a cloud Linux box, and he is optimistic that the speed will soon reach around 300 tokens per second.
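For a sense of how a figure like that is typically obtained, here is a minimal sketch of timing a llama2.c-style CPU generation loop and reporting tokens per second. The forward() function below is a stand-in (a single OpenMP-parallelized matrix-vector product), not Karpathy’s actual code, and the dimensions, token count, and thread count are illustrative assumptions.

```c
/* Sketch: measuring tokens/sec for a CPU inference loop, in the spirit of llama2.c.
 * forward() is a placeholder for one transformer step; all sizes are assumptions.
 * Compile: gcc -O3 -fopenmp bench.c -o bench   Run: OMP_NUM_THREADS=96 ./bench */
#include <stdio.h>
#include <stdlib.h>
#include <omp.h>

#define DIM 4096      /* hidden width, roughly 7B-class */
#define N_TOKENS 32   /* tokens to "generate" for the benchmark */

/* Stand-in for one forward pass: y = W * x, parallelized over rows. */
static void forward(const float *W, const float *x, float *y, int n) {
    #pragma omp parallel for
    for (int i = 0; i < n; i++) {
        float acc = 0.0f;
        for (int j = 0; j < n; j++) acc += W[(size_t)i * n + j] * x[j];
        y[i] = acc;
    }
}

int main(void) {
    float *W = malloc(sizeof(float) * (size_t)DIM * DIM);
    float *x = malloc(sizeof(float) * DIM);
    float *y = malloc(sizeof(float) * DIM);
    for (size_t i = 0; i < (size_t)DIM * DIM; i++) W[i] = 0.001f;
    for (int i = 0; i < DIM; i++) x[i] = 1.0f;

    double start = omp_get_wtime();
    for (int t = 0; t < N_TOKENS; t++) forward(W, x, y, DIM);  /* one "token" per pass */
    double elapsed = omp_get_wtime() - start;

    printf("threads: %d, tokens/sec: %.2f\n", omp_get_max_threads(), N_TOKENS / elapsed);
    free(W); free(x); free(y);
    return 0;
}
```

On a many-core machine, OMP_NUM_THREADS is the main knob that moves the reported throughput, which is why the thread count features so prominently in Karpathy’s numbers.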
One of the most remarkable aspects of Karpathy’s approach is the ability to achieve highly interactive rates even with reasonably sized models containing just a few million parameters. He has successfully trained a 15-million-parameter model on the TinyStories dataset. This opens up new possibilities and efficiencies, enabling smooth transitions from “scratch-trained micromodels” to a “LoRA fine-tuned 7B base model,” all within the code of the minimal llama2.c repository. This development holds significant potential, even with less training data.
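To see why a TinyStories-scale model lands in that range, the back-of-the-envelope sketch below counts parameters for a Llama-style configuration. The Config fields mirror the kind of hyperparameters llama2.c works with, but the exact values are illustrative assumptions chosen to land near 15M, not a claim about Karpathy’s checkpoint.

```c
/* Sketch: rough parameter count for a small Llama-style model.
 * Values are illustrative assumptions; norms and biases are ignored as negligible,
 * and the output head is assumed to share weights with the token embedding. */
#include <stdio.h>

typedef struct {
    long dim;        /* transformer width */
    long hidden_dim; /* FFN inner width */
    long n_layers;   /* number of transformer blocks */
    long vocab_size; /* tokenizer vocabulary size */
} Config;

/* Per block: 4 attention matrices (wq, wk, wv, wo) plus a 3-matrix SwiGLU FFN
 * (w1, w2, w3); plus one shared token-embedding table. */
static long count_params(Config c) {
    long attn  = 4 * c.dim * c.dim;
    long ffn   = 3 * c.dim * c.hidden_dim;
    long embed = c.vocab_size * c.dim;
    return c.n_layers * (attn + ffn) + embed;
}

int main(void) {
    Config tiny = { .dim = 288, .hidden_dim = 768, .n_layers = 6, .vocab_size = 32000 };
    printf("approx. parameters: %.1fM\n", count_params(tiny) / 1e6);  /* ~15.2M */
    return 0;
}
```

The arithmetic shows that at this scale most of the parameter budget sits in the token embedding, so shrinking the width and layer count quickly pushes a model into territory where CPU inference feels interactive.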
Open sourcing GPT models could very well signal that OpenAI is returning to its roots as a non-profit, open-source organization. With Andrej Karpathy, one of the original founding members, actively contributing to the open source community, this development becomes even more significant.
As of now, OpenAI has made no official statement regarding this buzz. However, with Karpathy’s hints and his ongoing experiments with LLMs, the AI world eagerly awaits OpenAI’s next move.
