Microsoft has been working actively in artificial intelligence (AI) in collaboration with OpenAI, and it has unveiled a range of AI-powered tools that are integrated into its products and services. The tech company recently released a new AI language model, ‘Orca’, which can be optimised for specific tasks and is trained on the outputs of large language models such as GPT-4. Here is what you need to know about Orca, a model that learns by imitating larger language models.
What can Orca do?
Orca is a language model designed to address the limitations of smaller models by imitating the reasoning processes of GPT-4. Because it is smaller, it needs far fewer computing resources to run and operate, yet it is said to perform on par with large foundation models (LFMs) such as GPT-4. In fact, since its release it has sparked a debate among users over whether it could compete with OpenAI’s popular AI product, ChatGPT. Research has shown that, with the help of GPT-4, Orca is capable of learning from explanations, step-by-step thought processes, and other complex instructions. “Orca is a 13-billion parameter model that learns to imitate the reasoning process of LFMs,” says Microsoft. That is why it is also called Orca 13B, after the number of machine learning parameters it has.
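To make the idea of "learning from explanations" concrete, here is a minimal, hypothetical sketch of how a smaller student model could be fine-tuned on instruction-and-explanation pairs produced by a larger teacher model. The model name, the toy dataset, and all hyperparameters below are illustrative assumptions using the Hugging Face Transformers library; this is not Microsoft's actual Orca training pipeline.

```python
# Sketch of "explanation tuning": fine-tune a small student model on
# instruction/response pairs whose responses contain a teacher model's
# step-by-step explanation, so the student learns the reasoning trace,
# not just the final answer. All names and data here are placeholders.
from torch.utils.data import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          Trainer, TrainingArguments)

STUDENT = "gpt2"  # stand-in for a 13B student model

# Toy example in the spirit of (instruction, teacher explanation).
examples = [
    {
        "instruction": "Explain step by step: what is 17 * 6?",
        "explanation": "17 * 6 = (10 * 6) + (7 * 6) = 60 + 42 = 102. The answer is 102.",
    },
]

tokenizer = AutoTokenizer.from_pretrained(STUDENT)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(STUDENT)


class ExplanationDataset(Dataset):
    """Concatenates each instruction with the teacher's explanation."""

    def __init__(self, rows, tokenizer, max_len=256):
        self.encodings = [
            tokenizer(
                f"Instruction: {r['instruction']}\nResponse: {r['explanation']}",
                truncation=True, max_length=max_len,
                padding="max_length", return_tensors="pt",
            )
            for r in rows
        ]

    def __len__(self):
        return len(self.encodings)

    def __getitem__(self, idx):
        enc = self.encodings[idx]
        input_ids = enc["input_ids"].squeeze(0)
        attention_mask = enc["attention_mask"].squeeze(0)
        labels = input_ids.clone()
        labels[attention_mask == 0] = -100  # ignore padding in the loss
        return {"input_ids": input_ids,
                "attention_mask": attention_mask,
                "labels": labels}  # standard causal-LM objective


trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="orca-style-student",
                           per_device_train_batch_size=1,
                           num_train_epochs=1),
    train_dataset=ExplanationDataset(examples, tokenizer),
)
trainer.train()
```

The design choice the sketch highlights is that the training targets include the teacher's intermediate reasoning rather than only its final answers, which is what lets a much smaller model imitate how a model like GPT-4 works through a problem.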
Original source: yourstory.com