IBM researchers, in collaboration with ETH Zürich, have unveiled Analog Foundation Models (AFMs), a new class of AI models designed to bridge large language models (LLMs) and Analog In-Memory Computing (AIMC) hardware. AIMC promises ultra-efficient AI by performing matrix-vector multiplications directly inside memory arrays, bypassing the traditional von Neumann bottleneck and enabling high-throughput, low-power inference on edge devices.
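The core operation AIMC accelerates is the matrix-vector multiply, computed directly where the weights are stored rather than shuttled through a CPU. The toy NumPy sketch below (not IBM's implementation) contrasts an exact digital MVM with a hypothetical analog one perturbed by read noise, an assumed non-ideality of analog hardware, to illustrate why models targeting AIMC must tolerate imperfect arithmetic:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))   # weights resident in the memory crossbar
x = rng.standard_normal(8)        # input activations applied to the array

# Exact digital matrix-vector multiply (von Neumann path)
y_digital = W @ x

# Hypothetical analog read-out: same MVM plus additive Gaussian noise
# (noise_std is an illustrative assumption, not a measured hardware figure)
noise_std = 0.05
y_analog = y_digital + noise_std * rng.standard_normal(4)

# The analog result tracks the digital one up to a small perturbation
print(np.max(np.abs(y_digital - y_analog)))
```

Because the multiply happens in memory, the dominant cost of moving `W` across a bus disappears, which is the source of AIMC's efficiency claim.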
By Super Admin | September 24, 2025