It is generally accepted that when it comes to large language models (LLMs), bigger is better. This belief is a key reason why the likes of OpenAI, Google, and Anthropic have been pouring billions of dollars into their AI models. That assumption was first shaken by the arrival of Chinese AI startup DeepSeek, which claimed to have built a top-tier AI model at a fraction of the cost spent by big tech. Now, another entrant seems to have stunned the tech world. Samsung, with a small team at its AI lab in Montreal, has introduced the Tiny Recursive Model (TRM), which challenges the view that performance scales with parameter count.
Reportedly, Samsung's TRM is 10,000x smaller than most prominent AI models. The model has only seven million parameters and has shown reaso