Mixture of Experts (MoE) Architecture: A Deep Dive and Comparison of Top Open-Source Offerings

By Bala Kalavala, Chief Architect & Technology Evangelist

The Mixture-of-Experts (MoE) architecture is a groundbreaking innovation in deep learning that has significant implications for developing and deploying Large Language Models (LLMs). In esse…
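To make the idea concrete, the sketch below shows what a standard top-k gated MoE layer can look like in PyTorch. It is not code from the article; the names (MoELayer, d_model, d_hidden, num_experts, top_k) are illustrative assumptions. A small gating network scores every expert for each token, only the top-k experts actually run, and their outputs are combined using the renormalised gate weights.

```python
# A minimal sketch of a top-k gated Mixture-of-Experts layer (assumed
# standard design, not code from the article). Requires PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoELayer(nn.Module):
    """Routes each token to the top-k of num_experts feed-forward experts."""

    def __init__(self, d_model, d_hidden, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Gating network: one score per expert for every token.
        self.gate = nn.Linear(d_model, num_experts, bias=False)
        # Each expert is an ordinary position-wise feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(), nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):
        # x: (batch, seq_len, d_model); flatten to individual tokens for routing.
        tokens = x.reshape(-1, x.size(-1))
        scores = self.gate(tokens)                        # (n_tokens, num_experts)
        top_w, top_idx = scores.topk(self.top_k, dim=-1)  # keep only the k best experts per token
        top_w = F.softmax(top_w, dim=-1)                  # renormalise the kept gate weights

        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            # Find which (token, slot) pairs were routed to expert e.
            token_ids, slot_ids = (top_idx == e).nonzero(as_tuple=True)
            if token_ids.numel() == 0:
                continue
            out[token_ids] += top_w[token_ids, slot_ids].unsqueeze(-1) * expert(tokens[token_ids])
        return out.reshape_as(x)


# Only top_k experts run per token, so per-token compute stays close to a
# single dense feed-forward block while total parameters grow with num_experts.
layer = MoELayer(d_model=512, d_hidden=2048, num_experts=8, top_k=2)
y = layer(torch.randn(2, 16, 512))   # y has shape (2, 16, 512)
```

Production MoE implementations typically add load-balancing losses and per-expert capacity limits on top of this basic routing, but the sparse top-k dispatch above is the core mechanism.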
