• acosmichippo@lemmy.world
    2 days ago

    i suspect we are going to have a semantic disagreement on what “understanding” means here.

    ChatGPT is absolutely trained on the concept of calendars, on the fact that iPhones have them, and on how they work. It doesn’t need to “understand” what a calendar is on a deep epistemological level to process requests about one. If you ask ChatGPT a question about calendars, it’ll answer you. So in that shallower sense LLMs absolutely “understand” what you mean, and that’s enough for ChatGPT to help Siri.

    • djehuti@programming.dev
      1 day ago

      No, they really don’t. They answer you with words that are commonly found in conjunction with calendars. There is code in Siri that understands calendars, but the LLM part ain’t it. I have, ahem, firsthand knowledge of how Siri does this.

      • acosmichippo@lemmy.world
        1 day ago

        There is code in Siri that understands calendars, but the LLM part ain’t it.

        yes i know, you are getting hung up on my colloquial use of “understand”. the LLM doesn’t need to “understand” it on that level because siri does. the LLM is there to parse the language and hand off to Siri. that’s all i’m saying.
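        The division of labor being described — the LLM turns free-form language into a structured intent, and separate deterministic code does the actual calendar work — can be sketched roughly like this. All names here are hypothetical illustrations, not Apple’s actual API, and the parser is a toy stand-in for a real model call:

        ```python
        # Hypothetical sketch of the "LLM parses, backend executes" split
        # discussed above. parse_intent stands in for the language model;
        # CalendarBackend stands in for the deterministic code that actually
        # "understands" calendars (dates, conflicts, storage).
        from dataclasses import dataclass

        @dataclass
        class CalendarIntent:
            action: str   # e.g. "create_event"
            title: str
            when: str     # left as raw text; a real backend would normalize it

        def parse_intent(utterance: str) -> CalendarIntent:
            # Toy stand-in for the LLM: maps free-form language to a
            # structured intent. A real system would call a model here.
            if "add" in utterance.lower() or "schedule" in utterance.lower():
                return CalendarIntent("create_event", "lunch with Sam", "tomorrow 12:00")
            return CalendarIntent("unknown", "", "")

        class CalendarBackend:
            # The part that enforces calendar semantics; the LLM never sees this.
            def __init__(self):
                self.events = []

            def handle(self, intent: CalendarIntent) -> str:
                if intent.action == "create_event":
                    self.events.append((intent.title, intent.when))
                    return f"Created '{intent.title}' at {intent.when}"
                return "Sorry, I didn't get that."

        backend = CalendarBackend()
        print(backend.handle(parse_intent("add lunch with Sam tomorrow at noon")))
        # prints: Created 'lunch with Sam' at tomorrow 12:00
        ```

        The point of the split: the model only has to emit a structured request; correctness about what a calendar *is* lives entirely in the backend.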

        • djehuti@programming.dev
          21 hours ago

          Fair. I think it’s important from an “Is this True AGI?” sense to distinguish these, but yes, in the colloquial sense I guess the system could be said to understand, even if it’s not strictly the actual LLM part that does it.