Microsoft has introduced a new AI model called Orca, which can learn to imitate the reasoning process of large foundation models (LFMs) with minimal human intervention.
The recently released open-source model is designed to learn by emulating the reasoning of larger AI models such as GPT-4, the model that powers ChatGPT. With 13 billion parameters, Orca is far smaller than GPT-4 or GPT-3.5, but it is tailored for specific use cases. It can learn from step-by-step instructions as well as from larger language models, imitating their logic and reasoning, and Microsoft says it is using diverse imitation data to train the model.
What to expect?
Microsoft Orca is intended to overcome the limitations of smaller models by emulating huge language models. Because Orca is smaller in size, it requires far fewer computing resources to run and operate. The Orca AI tool is said to be on a par with large foundation models (LFMs) such as GPT-4. Microsoft Orca can be optimised for specific tasks and trained using large language models like GPT-4. The model has sparked debate among users over whether Orca will compete with OpenAI's popular AI product, ChatGPT.
Here are the main features of Orca:
- 13 billion parameters
- Smaller in size than large models like GPT-4 or GPT-3.5
- Tailored for specific use cases
- Ability to learn from step-by-step instructions and larger language models
- Can be optimized for specific tasks and trained using large language models like GPT-4
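The step-by-step imitation described above boils down to collecting a larger model's reasoning as training targets for the smaller one. Here is a minimal sketch in Python of what one such imitation-data record might look like; the field names and the helper function are hypothetical, not Microsoft's actual data schema.

```python
# Hypothetical sketch of an imitation-learning training record: a prompt
# plus a larger "teacher" model's step-by-step answer, which a smaller
# "student" model like Orca would be fine-tuned to reproduce.

def make_imitation_record(system_instruction, user_query, teacher_response):
    """Bundle a prompt and a teacher model's step-by-step answer into one
    training example for a smaller student model (illustrative only)."""
    return {
        "system": system_instruction,   # e.g. asks for step-by-step reasoning
        "prompt": user_query,           # the task posed to the teacher model
        "target": teacher_response,     # the teacher's reasoning and answer
    }

record = make_imitation_record(
    system_instruction="You are a helpful assistant. Think step by step.",
    user_query="If a train travels 60 km in 1.5 hours, what is its speed?",
    teacher_response=(
        "Speed = distance / time. "
        "60 km / 1.5 h = 40 km/h. The speed is 40 km/h."
    ),
)

# During training, the student model would be tuned to produce
# record["target"] when given record["system"] and record["prompt"].
print(record["target"])
```

The key idea is that the target captures the teacher's reasoning steps, not just its final answer, which is what lets the smaller model imitate the larger one's thought process.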
In a statement, Microsoft said, “Orca is designed to be a more efficient and effective way to train AI models, and we believe it will be a valuable tool for researchers and developers working in the field of AI.”