The Technology Innovation Institute (TII) has upped its generative AI credentials with the launch of “Falcon LLM,” a foundational large language model (LLM) with 40 billion parameters.
Falcon LLM was built by the AI Cross-Centre Unit of the AI and Digital Science Research Centre (AIDRC), the team behind Noor, the world’s largest Arabic language model. The 40-billion-parameter model outperforms GPT-3.
The Falcon LLM 40B model was trained on one trillion tokens, using only 75% of GPT-3’s training compute, 40% of Chinchilla’s, and 80% of PaLM-62B’s.
LLMs can be used in a wide range of applications, such as chatbots, virtual assistants, language translation, content generation, and sentiment analysis. They can help businesses streamline their customer service operations by providing efficient and effective responses to customer inquiries. The technology is delivering significant savings for companies and countries, improving efficiency, reducing labour costs, and opening up new revenue streams, simply by being deployed across businesses and departments.
Dr Ray O Johnson, CEO of TII, said: “The year 2023 is turning out to be the year of AI. Falcon LLM is a landmark announcement for us, but this is just the beginning. By the end of the year, we will be sharing news on a huge increase in capabilities in this space.
“We understand that this is the start of a momentous journey,” Johnson continued. “We will press on to give this region its own AI success stories, well-aligned with the UAE’s National AI Strategy.”
Prof Mérouane Debbah, Chief Researcher, AI and Digital Science Research Centre, said: “We are thrilled to advance the world’s understanding of the power and benefits of these LLMs which will have an important impact in various fields such as education, healthcare, film production, and CGI. As the country continues to develop and diversify its economy, this is an important milestone in the field. Falcon LLM model is just the start of a new journey.”
Dr Ebtesam Almazroui, Director of the AI Cross-Centre Unit at AIDRC and lead of the LLM project, who plays a pivotal role in building LLMs and stepping up the UAE’s capability in this space, said: “This achievement serves as a testament to the UAE’s forward-thinking approach and the importance it places on innovation and technology. The development of this model is part of our journey towards realising the country’s economic vision and strategic plan ‘We the UAE 2031’.
“Our Falcon LLM 40B model significantly outperforms models like BLOOM and GPT-3, even at a fraction of their size. Thanks to a state-of-the-art data pipeline, Falcon LLM also improves upon new-generation models. For instance, it matches the performance of Chinchilla (from DeepMind) and PaLM-62B (from Google) at considerably lower training cost. Falcon LLM was trained following the scaling laws of Hoffmann et al., and the model was kept relatively modest in size with unprecedented data quality.”

-- TradeArabia News Service
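The scaling laws cited above come from Hoffmann et al.’s 2022 “Chinchilla” paper, which found that compute-optimal training uses roughly 20 tokens per model parameter. A minimal sketch of that rule of thumb (the 20:1 ratio is the paper’s approximation, not a TII figure) shows why one trillion tokens is in line with a 40B-parameter model:

```python
# Hedged sketch of the Chinchilla (Hoffmann et al., 2022) rule of thumb:
# compute-optimal training uses roughly 20 tokens per model parameter.
# The ratio of 20 is an approximation from the paper, not a TII number.

def compute_optimal_tokens(n_params: int, tokens_per_param: int = 20) -> int:
    """Approximate compute-optimal training-token count for a model size."""
    return n_params * tokens_per_param

falcon_params = 40_000_000_000  # Falcon LLM: 40 billion parameters
optimal = compute_optimal_tokens(falcon_params)
print(f"~{optimal / 1e12:.1f} trillion tokens")  # ~0.8 trillion
```

By this heuristic, a 40B model calls for roughly 0.8 trillion training tokens, so the reported one trillion tokens sits close to the compute-optimal regime the quote refers to.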