I don’t know if Apple is working on LLMs; they probably are, and they probably work on improving Siri too, if such a thing is even possible with its current fundamentals. Judging by how quickly other companies have introduced AI features built largely on LLMs, I don’t expect it would be so hard for Apple to do the same with Siri, but only if Apple accepts a ChatGPT back end as a short-term solution. In my mind, this could be a transitory path. Because Apple being Apple, they would probably want to put their own twist on this: better privacy protection, for example. They like to control the whole stack. That’s perhaps why they are, apparently, investing massively in their own training infrastructure, which would mean accepting that on-device training is too limited, even with powerful Apple silicon. It could prove to be a long journey. I don’t expect too much from next year’s OSes. It will be interesting to see where Apple is headed with this AI thing next year.

Meanwhile, when I ask Siri something today, I can’t help but feel the tech is antiquated compared to what we can already do with Whisper and the like.

Image: Dall-E.