Ollama Now Runs Faster on Macs Thanks to Apple's MLX Framework
Ollama, the popular app for running AI models locally, has released an update that takes advantage of Apple's own machine learning framework, MLX. The result is a hefty speed boost on Macs with Apple silicon: according to Ollama, the new version processes prompts around 1.6 times faster.