• @[email protected]
    link
    fedilink
    English
    41
    edit-2
    9 days ago

    AI AI AI AI

    Yawn

    Wake me up if they figure out how to make this cheap enough to put in a normal person’s server.

    • @[email protected]
      link
      fedilink
      English
      529 days ago

      “normal person’s server.”

      I’m pretty sure I speak for the majority of normal people when I say we don’t have servers.

      • Rose · 21 points · 9 days ago

        Yeah, when you’re a technology enthusiast, it’s easy to forget that your average user doesn’t have a home server - perhaps they just have a NAS or two.

        (Kidding aside, I wish more people had NAS boxes. It’s pretty disheartening to help someone find old media and they show a giant box of USB sticks and hard drives. On a good day. I do have a USB floppy drive and a DVD drive, just in case.)

        • @[email protected]
          link
          fedilink
          English
          119 days ago

          Hello fellow home labber! I have a home-built Xpenology box, a Proxmox server with a dozen VMs, a Hackintosh, and a workstation with 44 cores running Linux. Oh, and a USB floppy drive. We are out here.

          I also like long walks in Oblivion.

          • MrPistachios · 5 points · 9 days ago

            Man, Oblivion walks are the best until a crazy woman comes at you trying to steal your soul with a fancy sword.

        • @[email protected]
          link
          fedilink
          English
          69 days ago

          Lol yeah, the Lemmy userbase is NOT an accurate sample of the technical aptitude of the general population 😂

        • @[email protected]
          link
          fedilink
          English
          39 days ago

          “It’s pretty disheartening to help someone find old media and they show a giant box of USB sticks and hard drives.”

          Equally disheartening is knowing that both of those have a shelf-life. Old USB flash drives are more durable than the TLC/QLC cells we use today, but 15 years sitting unpowered in a box doesn’t have very good prospects.

      • @[email protected]
        link
        fedilink
        English
        99 days ago

        You… you don’t? Surely there’s some mistake, have you checked down the back of your cupboard? Sometimes they fall down there. Where else do you keep your internet?

        Apologies, I’m tired and that made more sense in my head.

      • fmstrat · 4 points · 8 days ago

        “Normal person” is a modifier of “server”. It does not state any expectation that every normal person has a server; instead, it sets the expectation that they are talking about servers owned by normal people. I have a server. I am norm… crap.

    • @[email protected]
      link
      fedilink
      English
      49 days ago

      You can get a Coral TPU for 40 bucks or so.

      You can get an AMD APU with an NN-inference-optimized tile for under 200.

      Training can be done with any relatively modern GPU, with varying efficiency and capacity depending on how much you want to spend.

      What price point are you trying to hit?

      • @[email protected]
        link
        fedilink
        English
        89 days ago

        “What price point are you trying to hit?”

        With regards to AI? None, tbh.

        With this super fast storage I have other cool ideas, but I don’t think I can get enough bandwidth to saturate it.

        • @[email protected]
          link
          fedilink
          English
          09 days ago

          “With regards to AI? None, tbh.”

          TBH, that might be enough. Stuff like SDXL runs on 4 GB cards (the trick is using ComfyUI; think 5-10 s/it), and reportedly smaller LLMs do too (haven’t tried, not interested). And the reason I’m eyeing a 9070 XT isn’t AI, it’s finally upgrading my GPU; still, it would be a massive fucking boost for AI workloads.
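          At the quoted 5-10 s/it, a quick back-of-the-envelope estimate of per-image time (the 20-step count is my assumption for a typical SDXL sampler setting, not from the comment):

```python
# Rough per-image generation time at the comment's quoted 5-10 s/it.
# 20 sampling steps is an assumed typical value, not from the comment.
steps = 20
for secs_per_it in (5, 10):
    total = steps * secs_per_it
    print(f"{secs_per_it} s/it x {steps} steps = {total} s (~{total / 60:.1f} min)")
```

          So at those speeds, a 4 GB card would land somewhere around 1.7 to 3.3 minutes per image.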

        • @[email protected]
          link
          fedilink
          English
          -29 days ago

          You’re willing to pay $none to have hardware ML support for local training and inference?

          Well, I’ll just say that you’re gonna get what you pay for.

          • @[email protected]
            link
            fedilink
            English
            99 days ago

            No, I think they’re saying they’re not interested in ML/AI. They want this super fast memory available for regular servers for other use cases.

              • caseyweederman · 1 point · 8 days ago

                I have a hard time believing anybody wants AI. I mean, AI as it is being sold to them right now.

                • @[email protected]
                  link
                  fedilink
                  English
                  38 days ago

                  I mean, the image generators can be cool, and LLMs are great for bouncing ideas off at 4 AM when everyone else is sleeping. But I can’t imagine paying for AI, don’t want it integrated into most products, and wouldn’t put a lot of effort into hosting a low-parameter model that performs way worse than ChatGPT without a paid plan. So you’re exactly right: it’s not being sold to me in a way that would make me want to pay for it, or invest in hardware resources to host better models.

      • @[email protected]
        link
        fedilink
        English
        18 days ago

        I just use pre-made AIs and write some detailed instructions for them, and then watch them churn out basic documents over hours… I need a better laptop.