
Allen Institute for AI unveils OLMo model for AI development


In a significant stride toward transparency in artificial intelligence, the Allen Institute for AI (AI2) has launched OLMo, a large language model (LLM) challenging the closed-source norm. Founded by Microsoft co-founder Paul Allen in 2014, AI2 aims to provide a genuine alternative to restrictive models.

Unlike its predecessors, OLMo stands out by offering a comprehensive suite of AI development tools to the public. It includes training code, training data, model weights, and evaluation toolkits, accompanied by an Apache 2.0 licence. This move distinguishes OLMo from models like OpenAI's GPT-4 and Anthropic's Claude.

This release coincides with a broader trend in open source AI, with other players like Mistral and Meta making strides in performance. OLMo's unique selling point lies in its 'completely open' tools, providing transparency in pretraining data, training code, model weights, and evaluation.

Hanna Hajishirzi, OLMo project lead, emphasised the significance of transparency, stating, "Without access to training data, researchers cannot scientifically understand how a model is working."

Nathan Lambert, an ML scientist at AI2, pointed to OLMo's potential, saying, "OLMo will represent a new type of LLM enabling new approaches to ML research and deployment."

The response from the open source AI community has been positive. Jonathan Frankle, chief scientist at MosaicML and Databricks, called OLMo’s release “a giant leap for open science.” Meta chief scientist Yann LeCun echoed this sentiment, noting, “Open foundation models have been critical in driving innovation around generative AI.”
