AI supercomputer company Cerebras Systems, AI company Petuum, and Mohamed bin Zayed University of Artificial Intelligence (MBZUAI) have launched LLM360, a framework for creating open-source large language models (LLMs). Developed in partnership with MBZUAI’s Institute of Foundation Models, LLM360 gives developers detailed insights and methodologies, with the aim of making LLM development simpler, faster, and cheaper.
Two open-source large language models are being released: Amber, a 7-billion-parameter English-language model trained on 1.2 trillion tokens, and CrystalCoder, a 7-billion-parameter model trained on 1.4 trillion tokens for English-language and coding tasks. Both models are released under the Apache 2.0 license. A third model, Diamond, with 65 billion parameters, is set for release soon. The models were trained on the Condor Galaxy 1 supercomputer, built by G42 and Cerebras Systems.
Both models are built on Meta’s LLaMA architecture, and Amber is reported to perform on par with LLaMA-7B and OpenLLaMA-v2-7B while outperforming Pythia-6.7B.
Source: LLM360 Blog
CrystalCoder is trained on a carefully balanced blend of text and code data to make it effective in both domains. Notably, code data is introduced early in the pretraining stage, distinguishing it from Code Llama 2, which relies solely on code data while fine-tuning Llama 2. CrystalCoder is also specifically trained on Python and web programming languages, a deliberate choice to strengthen its capabilities as a programming assistant.
UAE Heading Towards AI Dominance
With these recent AI developments, the UAE is working towards becoming an AI superpower. Following TII’s Falcon and the Arabic-focused Jais large language model, the UAE has also been rallying behind open-source models to promote research initiatives. And with AI company AI71, launched a few weeks ago, the UAE looks to take on even AI giant OpenAI.