Smarter AI, Less Memory: Liquid's Groundbreaking Debut
Massachusetts Institute of Technology (MIT), Cambridge, USA
Tue Oct 01 2024
Liquid AI, a startup co-founded by former researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), has made a significant breakthrough in artificial intelligence. Instead of relying on the transformer architecture, Liquid has developed a new class of AI models, called Liquid Foundation Models (LFMs), designed to be more efficient and adaptable.
The first LFMs, available in three sizes and variants, have already demonstrated superior performance to other transformer-based models of comparable size. The smallest model, LFM 1.3B, has outperformed Meta's Llama 3.2-1.2B and Microsoft's Phi-1.5 on many leading third-party benchmarks, including the Massive Multitask Language Understanding (MMLU) test.
What sets Liquid's models apart is their ability to process sequential data, including video, audio, text, time series, and signals, while using significantly less memory than traditional transformer-based models. This makes them ideal for deployment on edge devices, such as smartphones and smart home devices, where memory is limited.
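The article does not explain where the memory savings come from, but one plausible reading, based on how attention works, is that a transformer's key-value cache grows with every token it processes, while a recurrent or dynamical-systems model folds its entire history into a fixed-size state. The toy NumPy sketch below illustrates that contrast; the dimensions and update rule are invented for the example and are not Liquid's architecture.

```python
import numpy as np

d_model, seq_len = 512, 4096

# A transformer must cache a key and a value vector for every past token,
# so its working memory grows linearly with sequence length.
kv_cache = []

# A recurrent / dynamical-systems model instead folds the whole history
# into one fixed-size state vector.
state = np.zeros(d_model)
W = np.random.randn(d_model, d_model) * 0.01

for t in range(seq_len):
    x = np.random.randn(d_model)           # stand-in for a token embedding
    kv_cache.append((x.copy(), x.copy()))  # attention: cache grows every step
    state = np.tanh(W @ state + x)         # recurrence: state size never changes

print(f"KV-cache entries after {seq_len} tokens: {len(kv_cache)}")
print(f"Recurrent state size: {state.size} floats (constant)")
```

On an edge device, that fixed-size state is what keeps the memory footprint flat no matter how long the input stream runs.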
Liquid's approach to training post-transformer AI models is built on a blend of "computational units deeply rooted in the theory of dynamical systems, signal processing, and numerical linear algebra." This has allowed the company to develop models that are not only more efficient but also more adaptable, able to adjust in real time without requiring additional computational power.
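Liquid has not published the internals of LFMs, so the following is only a rough illustration: the founders' earlier academic work on liquid time-constant (LTC) networks expresses each unit as an ordinary differential equation whose time constant depends on the input, which is one concrete sense in which such models "adjust in real time." This NumPy sketch of a single LTC-style Euler step is hypothetical; all names, shapes, and constants are illustrative, not Liquid's code.

```python
import numpy as np

def ltc_step(x, u, W_in, W_rec, b, tau, A, dt=0.05):
    """One explicit-Euler step of a liquid time-constant (LTC) cell:

        dx/dt = -(1/tau + f(x, u)) * x + f(x, u) * A

    where f is a sigmoid gate of the current state and input. Because f
    depends on the input, the effective time constant of every neuron
    shifts as the signal changes -- the "adapting in real time" idea.
    """
    f = 1.0 / (1.0 + np.exp(-(W_rec @ x + W_in @ u + b)))
    dx = -(1.0 / tau + f) * x + f * A
    return x + dt * dx

rng = np.random.default_rng(0)
n_hidden, n_in = 32, 8
x = np.zeros(n_hidden)
W_in = rng.normal(0.0, 0.1, (n_hidden, n_in))
W_rec = rng.normal(0.0, 0.1, (n_hidden, n_hidden))
b = np.zeros(n_hidden)
tau = np.ones(n_hidden)             # base time constants, one per neuron
A = rng.normal(0.0, 1.0, n_hidden)  # per-neuron equilibrium targets

# Stream a toy signal through the cell; memory use is just the state vector.
for _ in range(200):
    x = ltc_step(x, rng.normal(0.0, 1.0, n_in), W_in, W_rec, b, tau, A)
print(x[:4])
```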