• Silicon_Knight@alien.topB
    10 months ago

    X to doubt. Apple “usually” plans these changes ahead of time so other devices can take advantage of them. Not saying that’s necessarily the case here, but there have been times when it was.

    • mamimapr@alien.topB
      10 months ago

      ChatGPT is only a year old, and it has only recently become feasible to run LLMs locally on-device. Apple couldn’t really have planned for that hardware-wise.

      A common requirement for these LLMs and Stable Diffusion models is a large amount of memory, which Apple has always been stingy with. Only with the newer hardware do I foresee them increasing base memory, if they want to make AI more accessible.
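
      As a rough back-of-the-envelope illustration of the memory point (my own numbers, not from the comment above; the 7B parameter count, quantization levels, and ~20% overhead are assumptions):

      ```python
      # Rough RAM estimate for holding an LLM's weights on-device.
      # Parameter count, bit widths, and overhead factor are illustrative
      # assumptions, not figures stated in the thread.

      def model_ram_gb(num_params: float, bits_per_weight: int, overhead: float = 1.2) -> float:
          """Approximate RAM for the weights, with ~20% extra for the
          KV cache, activations, and runtime buffers."""
          weight_bytes = num_params * bits_per_weight / 8
          return weight_bytes * overhead / 1e9

      params_7b = 7e9  # a typical "small" open-weight model
      for bits in (16, 8, 4):
          print(f"{bits}-bit: ~{model_ram_gb(params_7b, bits):.1f} GB")

      # 16-bit: ~16.8 GB  -> won't fit alongside the OS in an 8 GB base machine
      #  8-bit: ~ 8.4 GB
      #  4-bit: ~ 4.2 GB  -> feasible on-device only with aggressive quantization
      ```

      Even heavily quantized, a modest model eats most of an 8 GB base configuration once the OS and apps are accounted for, which is why base memory matters here.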