Ollama Now Runs Faster on Macs Thanks to Apple's MLX Framework - MacRumors


TL;DR

Rapid advancements do seem to be trickling into LLMs much faster now. Ollama, a runtime for running large language models on a local computer, has introduced support for Apple's open-source MLX machine-learning framework.
