Really nice - ChatMLX 1.0.0 by Apple MLX https://t.co/AovZ1gbzaK
.@awnihannun Is there a *good* Apple Messages clone (UI/UX) that we could use to talk to our MLX-based LLM? We want something that's an app (built with Swift, etc.) that just calls an API endpoint. Ideally the app would render markdown and code. Thanks!
ChatMLX 1.0.0 powered by Apple MLX is out and the new UI is simply amazing! Great job @JohnMai_IT 🚀 https://t.co/MKEJcYhp1A
The MLX ecosystem has seen several significant updates and new releases. Prince Canuma announced a 23.24% overall speedup for Qwen2-VL on MLX, with prompt processing up 17.97% and generation up 26.17%. The latest version of mlx-vlm, v0.0.14, includes a refactored KVCache and support for Qwen2-VL. ChatMLX 1.0.0, powered by Apple MLX, has launched with a new user interface. Additionally, MLX Chat, a chat interface for on-device language model use on Apple Silicon built on FastMLX, has been introduced. Planned future updates include a trainer (LoRA) and batch processing.
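For a sense of what the Qwen2-VL support in mlx-vlm looks like in practice, here is a minimal sketch using the package's `load`/`generate` helpers. The exact argument names have shifted between releases and the quantized model repo name is an assumption, so check the mlx-vlm README for the version you have installed.

```python
# Minimal sketch of running Qwen2-VL through mlx-vlm on Apple Silicon.
# The model repo name and keyword arguments are assumptions; consult the
# mlx-vlm README for the exact API of your installed version.
from mlx_vlm import load, generate

# Load a quantized Qwen2-VL conversion from the mlx-community hub (assumed repo name).
model, processor = load("mlx-community/Qwen2-VL-2B-Instruct-4bit")

# Run a single image + prompt pair through the model.
output = generate(
    model,
    processor,
    prompt="Describe this image.",
    image="path/to/image.jpg",
    max_tokens=100,
    verbose=True,
)
print(output)
```

The same kind of call is what a front end like MLX Chat sits on top of, with FastMLX exposing the model behind an HTTP endpoint instead of an in-process call.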