Channel: Analytics India Magazine

UAE’s TII Announces ‘Powerful’ Small AI Model Falcon 3 


Technology Innovation Institute (TII), a research institute based in Abu Dhabi, UAE, has unveiled a new family of small language models called Falcon 3. The models range from 1 billion to 10 billion parameters, in both base and instruct versions. Falcon 3 is available as an open-source model under TII's Falcon License 2.0.

The institute also released benchmark results comparing Falcon 3 with other leading models in its category. Both the Falcon 3 7B and 10B variants outperformed models like Qwen 2.5 7B and Llama 3.1 8B on several benchmarks.

TII is a global research institution based in Abu Dhabi and funded by the Abu Dhabi government. It was established in May 2020 and focuses on research in AI, quantum computing, robotics, and cryptography. 

Falcon 3 employs Grouped Query Attention (GQA), a technique in which groups of query heads share the same key and value projections. This shrinks the key-value (KV) cache, reducing memory demands and thereby lowering latency during inference.
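GQA's memory saving can be illustrated with back-of-the-envelope KV-cache arithmetic. The sketch below uses hypothetical layer counts, head counts, and head dimensions for a 7B-class model (not Falcon 3's published configuration): because cached keys and values are stored per KV head, reducing the number of KV heads shrinks the cache proportionally.

```python
def kv_cache_bytes(layers: int, kv_heads: int, head_dim: int,
                   seq_len: int, batch: int = 1, bytes_per_elem: int = 2) -> int:
    """Approximate KV-cache size in bytes for one generation pass.

    The factor of 2 accounts for storing both keys and values;
    bytes_per_elem=2 assumes fp16/bf16 activations.
    """
    return 2 * layers * kv_heads * head_dim * seq_len * batch * bytes_per_elem

# Hypothetical 7B-class config: 28 layers, 28 query heads, head_dim 128, 4K context.
mha_cache = kv_cache_bytes(layers=28, kv_heads=28, head_dim=128, seq_len=4096)
gqa_cache = kv_cache_bytes(layers=28, kv_heads=4, head_dim=128, seq_len=4096)

print(f"MHA cache: {mha_cache / 2**20:.0f} MiB")
print(f"GQA cache: {gqa_cache / 2**20:.0f} MiB")
print(f"Reduction: {mha_cache / gqa_cache:.0f}x")  # 28/4 = 7x smaller
```

With full multi-head attention every query head has its own KV pair; here GQA with 4 KV heads cuts the cache sevenfold, which is what makes low-latency inference feasible on memory-constrained hardware.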

“The initial training was followed by multiple stages to improve reasoning and math performance with high-quality data and context extension with natively long context data,” read the announcement. 

The model was trained in four languages: English, Spanish, Portuguese, and French.

All variants of the Falcon 3 models are available for download on Hugging Face.

In August, TII launched the Falcon Mamba 7B model, which outperformed Meta's Llama 3.1 8B and Llama 3 8B, as well as Mistral 7B, on benchmarks. In May, it launched Falcon 2, an 11B text-and-vision model.

Small Models on the Rise

Are small language models finally delivering on their promise? A few days ago, Microsoft announced its latest Phi-4 model. With just 14B parameters, the model outperformed much larger models like Llama 3.3 70B and GPT-4o on several benchmarks.

There have also been discussions about the relevance of pre-training and of the brute-force approach of improving models simply by scaling up their size. Ilya Sutskever, former OpenAI chief scientist, weighed in on this debate in his presentation at NeurIPS 2024.

“Pre-training as we know it will unquestionably end,” he said, referring to the lack of available data. “We have but one internet. You could even go as far as to say that data is the fossil fuel of AI. It was created somehow, and now we use it,” he added. 

He also suggested that inference-time compute and synthetic training data are key techniques that may help researchers overcome the problem.

That said, if small models can leverage such innovative techniques to deliver high performance on resource-constrained devices, the smartphone market will be the one to watch in 2025.

The post UAE’s TII Announces ‘Powerful’ Small AI Model Falcon 3  appeared first on Analytics India Magazine.

