Alibaba and Baidu add Meta’s Llama 3 to cloud platforms
Chinese tech firms Alibaba Group Holding and Baidu have rushed to add support for Meta’s Llama 3 large language model (LLM) to their cloud computing platforms, after the technology used to train chatbots like ChatGPT was released last week.
E-commerce giant Alibaba’s cloud computing unit has added Meta’s Llama 3 LLM to ModelScope, its open-source artificial intelligence (AI) model community, which offers developers access to a range of open-source AI models.
Separately, Alibaba Cloud extended support for Meta’s LLMs on its Bailian platform, offering free training, inferencing and deployment services for a limited time, the company said in a post published on its official WeChat account yesterday, without elaborating on the time frame. Alibaba owns the Post.
Bailian is an LLM service platform that provides a suite of tools and services to assist clients in building and training their own models and applications using Alibaba’s cloud computing services.
Alibaba’s move follows an announcement by search engine giant and AI pioneer Baidu that it would extend support for Llama 3, made shortly after the Meta model’s release last week.
Last Friday, a day after Meta debuted the third iteration of its Llama series of models, Beijing-based Baidu became the first among major Chinese technology firms to step up, offering training and inferencing services for Llama 3 on its Qianfan model-as-a-service platform.
Qianfan was launched to help corporate clients build, train and scale AI models catering to their needs.
It offers a wide selection of models, ranging from Baidu’s proprietary Ernie family to third-party open-source models from local and overseas companies, such as Meta’s Llama series.
Qianfan currently hosts 79 AI models, according to the company, and its 85,000 clients have built more than 14,000 models and 190,000 applications on the platform.
With a development platform that consolidates tasks including data management, model fine-tuning, model assessment and optimisation, and inference service deployment, Qianfan users would be able to build new models capable of surpassing the underlying foundation models at a much lower cost, Baidu said.