Tag: small language models

Phi-4 AI Models: Revolutionizing Multimodal Capabilities

In a groundbreaking move, Microsoft has unveiled the Phi-4 family of AI models, heralding a new era of efficiency and capability in artificial intelligence. Designed to seamlessly process text, images, and speech using far less computing power than their predecessors, the Phi-4 models represent a significant step forward in the development of small language models (SLMs).

Small Language Models Outperform Larger Language Models

In the evolving landscape of artificial intelligence, recent research from the Shanghai AI Laboratory reveals a surprising twist: small language models (SLMs) can outperform their much larger counterparts on reasoning tasks. With just 1 billion parameters, an SLM can beat a colossal 405 billion parameter large language model (LLM) on complex math benchmarks, thanks to test-time scaling (TTS) techniques.
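
For illustration only, here is a minimal sketch of one common test-time scaling strategy, best-of-N sampling with a verifier: the small model spends extra compute at inference by generating many candidate solutions, and a scoring model picks the best one. The generate and score functions below are hypothetical placeholders, not the specific method used in the Shanghai AI Laboratory work.

```python
import random

def generate_candidate(question: str, rng: random.Random) -> str:
    """Placeholder for sampling one reasoning trace from a small language model."""
    return f"candidate answer {rng.randint(0, 9)} for: {question}"

def score_with_verifier(question: str, candidate: str) -> float:
    """Placeholder for a reward/verifier model scoring a candidate solution."""
    return random.random()

def best_of_n(question: str, n: int = 16, seed: int = 0) -> str:
    """Spend extra test-time compute: sample N candidates, keep the highest-scoring one."""
    rng = random.Random(seed)
    candidates = [generate_candidate(question, rng) for _ in range(n)]
    scored = [(score_with_verifier(question, c), c) for c in candidates]
    return max(scored)[1]

if __name__ == "__main__":
    # Larger n trades more inference compute for a better chance of a correct answer.
    print(best_of_n("What is 17 * 24?", n=8))
```

The intuition is that a small model's single attempt is often wrong, but with enough sampled attempts and a reliable verifier, the chance that at least one candidate is correct rises sharply, which is how extra test-time compute can substitute for extra parameters.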