Mike Rockwell, the Apple Vision Pro chief, has replaced John Giannandrea as the executive in charge of Siri, in a leadership shakeup meant to rescue Apple's flailing AI efforts.
i suspect we are going to have a semantic disagreement on what “understanding” means here.
ChatGPT has absolutely been trained on the concept of calendars, on the fact that iPhones have calendars, and on how they work. It doesn’t need to “understand” what a calendar is on a deep epistemological level to process requests about them. If you ask chatGPT a question about calendars, it’ll answer you. So in that shallower sense LLMs absolutely “understand” what you mean, and that’s enough for chatGPT to help siri.
No, they really don’t. They answer you with words that are commonly found in conjunction with calendars. There is code in Siri that understands calendars, but the LLM part ain’t it. I have, ahem, firsthand knowledge of how Siri does this.
There is code in Siri that understands calendars, but the LLM part ain’t it.
yes i know, you are getting hung up on my colloquial use of “understand”. the LLM doesn’t need to “understand” it on that level because siri does. the LLM is there to parse the language and hand off to Siri. that’s all i’m saying.
Fair. I think it’s important from an “Is this True AGI?” sense to distinguish these, but yes, in the colloquial sense I guess the system could be said to understand, even if it’s not strictly the actual LLM part that does it.
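For what it's worth, the division of labor being described in this thread can be sketched roughly as follows: the LLM turns a free-form request into a structured intent, and conventional code on the Siri side, which is the part that actually knows what a calendar is, validates and acts on it. Everything below is made up for illustration; the names, types, and JSON shape are hypothetical and are not Apple's actual architecture or API.

    import Foundation

    // Hypothetical shape of what an LLM front end might hand off:
    // the model turns free-form text into a structured intent, and
    // deterministic code does the actual work with calendars.
    struct CalendarIntent: Codable {
        let action: String      // e.g. "create_event"
        let title: String
        let startDate: Date
    }

    // Stand-in for the LLM step: parse natural language into structured JSON.
    // In a real system this would be a model call, not a hard-coded result.
    func parseWithLLM(_ utterance: String) -> CalendarIntent? {
        // Pretend the model returned this JSON for
        // "add lunch with Sam tomorrow at noon".
        let json = """
        {"action": "create_event", "title": "Lunch with Sam",
         "startDate": "2025-01-02T12:00:00Z"}
        """
        let decoder = JSONDecoder()
        decoder.dateDecodingStrategy = .iso8601
        return try? decoder.decode(CalendarIntent.self, from: Data(json.utf8))
    }

    // Stand-in for the Siri-side code that actually "understands" calendars:
    // it validates the intent and would call into the calendar store
    // (e.g. EventKit) to create the event.
    func handleCalendarIntent(_ intent: CalendarIntent) {
        guard intent.action == "create_event" else { return }
        print("Creating event '\(intent.title)' at \(intent.startDate)")
    }

    if let intent = parseWithLLM("add lunch with Sam tomorrow at noon") {
        handleCalendarIntent(intent)
    }

On this reading, both sides of the thread are consistent: the LLM never needs any model of what a calendar is beyond producing the structured handoff, and the code that receives it is where the "understanding" lives.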