• Tar_Alcaran
    19 days ago

    It takes a lot of skill and knowledge to recognise a wrong answer that is phrased like a correct answer. Humans are absolutely terrible at this; it’s why con artists are so successful.

    And that skill and knowledge are not formed by using LLMs.

    • @[email protected]
      19 days ago

      Absolutely.

      And you can’t learn to build a fence by looking at a hammer.

      My point all over, really. Tools and skills develop together and need to be seen in context.

      People, whether for or against, who describe AI or any other tool in isolation and ignore detail and nuance are not helpful or informative.