• Leon@pawb.social · 3 months ago

That’s a misrepresentation of what LLMs do. You feed them a fuckton of data and, to oversimplify a bit, they place the concepts in that data onto a multi-dimensional map. Then, given an input, the model estimates an output by referencing said map. It doesn’t search for anything; it’s just mathematics.

This is particularly easy to demonstrate with image models, where you can take two separate concepts, like, say, “eskimo dog” and “daisy”, and add them together.
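If you want to poke at the “map” idea yourself, here’s a rough sketch in Python using a small open text-embedding model. The model name and the candidate list are just placeholders I picked, not anything these products actually use:

```python
# Rough sketch: map concepts into a vector space, then add two of them.
# Model name and candidates are arbitrary examples.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

def embed(text: str) -> np.ndarray:
    v = model.encode(text)
    return v / np.linalg.norm(v)  # unit length, so dot product = cosine similarity

# Two separate concepts, placed on the same map and added together...
combo = embed("eskimo dog") + embed("daisy")
combo /= np.linalg.norm(combo)

# ...and the blend lands closest to related concepts, no searching involved.
candidates = ["a white fluffy dog", "a flower", "a car engine", "a spreadsheet"]
scores = {c: float(np.dot(combo, embed(c))) for c in candidates}
for c, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{s:.3f}  {c}")
```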

When you query ChatGPT for something and it “searches” for it, either the model is fitted well enough that it can reproduce a link directly, or it calls a script that performs a web search (likely using Bing) and compiles the results for you.
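Nobody outside OpenAI knows the exact plumbing, but the general tool-calling pattern looks something like the sketch below. `llm()` and `web_search()` here are stand-ins, not real APIs:

```python
# Hedged sketch of the "calls a script" path. Not ChatGPT's actual plumbing.
import json

def llm(messages: list[dict]) -> dict:
    # Stand-in for a chat-completion call. First turn: decide to search.
    # After tool results arrive: compile them into an answer.
    if messages[-1]["role"] == "tool":
        return {"content": "Answer compiled from: " + messages[-1]["content"]}
    return {"tool_call": {"name": "web_search",
                          "arguments": json.dumps({"query": messages[-1]["content"]})}}

def web_search(query: str) -> list[dict]:
    # Stand-in for hitting a real search API (Bing, SearxNG, whatever).
    return [{"title": "Example result", "url": "https://example.com"}]

def answer(question: str) -> str:
    messages = [{"role": "user", "content": question}]
    reply = llm(messages)
    if "tool_call" in reply:  # model asked the wrapper to run a search
        query = json.loads(reply["tool_call"]["arguments"])["query"]
        messages.append({"role": "tool", "content": json.dumps(web_search(query))})
        reply = llm(messages)  # second pass: model writes up the results
    return reply["content"]

print(answer("latest numpy release"))
```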

      You could do the same, just using an actual search engine.

Hell, you could build your own “AI search engine” with an open-weights model and a little bit of time.
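For example, and this is just one possible setup (it assumes a local SearxNG instance for search and an open-weights model served through Ollama; the URLs and model name are whatever you run locally):

```python
# Minimal "AI search engine": search locally, summarize with a local model.
# URLs and model name are assumptions about your own setup.
import requests

SEARX_URL = "http://localhost:8080/search"      # your SearxNG instance
OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "llama3.1"                              # any open-weights model you've pulled

def search(query: str, n: int = 5) -> list[dict]:
    r = requests.get(SEARX_URL, params={"q": query, "format": "json"})
    r.raise_for_status()
    return r.json()["results"][:n]

def ai_search(query: str) -> str:
    results = search(query)
    context = "\n".join(f"- {r['title']}: {r.get('content', '')} ({r['url']})"
                        for r in results)
    prompt = (f"Using only these search results, answer the question "
              f"and cite URLs.\n\n{context}\n\nQuestion: {query}")
    r = requests.post(OLLAMA_URL, json={"model": MODEL, "prompt": prompt,
                                        "stream": False})
    r.raise_for_status()
    return r.json()["response"]

print(ai_search("what is an eskimo dog"))
```

Same “search then compile” loop as above, just with parts you control end to end.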