I read over at the other place that Apple has released OpenELM, a decoder-only transformer open language model that can run on device.
The premise behind the project is something I find quite interesting, and it gives us some insight into the strategy Apple wants to take with ML. Not least with respect to M4: it's increasing evidence that Apple will likely invest any additional available transistor budget in bolstering ML (beyond all the existing marketing / hype / rumors we're hearing about iOS 18 and smart(er) Siri).
https://arxiv.org/pdf/2404.14619
Apple releases OpenELM, a slightly more accurate LLM
It's not the fastest machine learning model, but you can't have everything
www.theregister.com