• @[email protected]
    19 days ago

    That’s the thing. It’s a tool like any other. People who just give it a 5-word prompt and then use the raw output are doing it wrong.

    • @[email protected]
      19 days ago

      But you have the tech literacy to know that. Most non-tech people who use it do not, and just blindly trust it, because the world is not used to the concept that the computer is deceiving them.

      • @[email protected]
        18 days ago

        You mean like that woman who drove into a lake because her satnav told her to?

        Maybe we should ban satnavs then! Too dangerous

    • Tar_Alcaran
      19 days ago

      It takes a lot of skill and knowledge to recognise a wrong answer that is phrased like a correct answer. Humans are absolutely terrible at this; it’s why con artists are so successful.

      And that skill and knowledge are not formed by using LLMs.

      • @[email protected]
        19 days ago

        Absolutely.

        And you can’t learn to build a fence by looking at a hammer.

        My point all over, really. Tools and skills develop together and need to be seen in context.

        People, whether for or against, who describe AI or any other tool in isolation, ignoring detail and nuance, are not helpful or informative.