• ctrl_alt_esc@lemmy.ml
    18 days ago

    This means absolutely nothing. It scanned a large amount of text and found something. Great, that’s exactly what it’s supposed to do. Doesn’t mean it’s smart or getting smarter.

    • 柊 つかさ@lemmy.world
      18 days ago

      People often dismiss AI capabilities because “it’s not really smart”. Does that really matter? If it automates everything in the future and most people lose their jobs (just an example), who cares whether it is “smart”? If it steals art and GPL code and turns a profit on them, who cares whether it is actually intelligent? What matters is the impact AI has on the world, not semantics about what counts as intelligence.

      • nyan@sh.itjust.works
        18 days ago

        It matters, because it’s a tool. That means it can be used correctly or incorrectly… and most people who don’t understand a given tool end up using it incorrectly, and in doing so, damage themselves, the tool, and/or innocent bystanders.

        True AI (“artificial general intelligence”, if you prefer) would qualify as a person in its own right, rather than a tool, and would therefore be able to take responsibility for its own actions. LLMs can’t do that, so the responsibility for anything done by models of this type lies either with the person using the model (or requiring its use) or with whoever advertised it as fit for some purpose. And that’s VERY important, from a legal, cultural, and societal point of view.