• Someone@sopuli.xyz · +26/-2 · 6 days ago

    You see how Google is going insane and stupid? Idk wtf is driving big companies that way; are they brainwashed or something? Wtf is going on in this world?

  • turdas@suppo.fi · +24/-16 · 6 days ago

    The climate costs aren’t “insane”. One billion devices receiving the push (probably an overestimate) represents about 0.02% of global internet traffic.

    The guy kind of proves this in his own post. The annual emissions of 13 000 cars (which is what this would equate to on 1 billion devices) is fuck all on a global level. One city pushing for bike-friendly infrastructure will have 10x that effect.

    This is not to say this isn’t kind of a stupid update, but the only thing insane about the climate costs is how insanely contrived bringing them up here is.

    • Jhex@lemmy.world · +25/-1 · 6 days ago

      Downloading? The clear issue is running a billion mini Geminis nobody asked for.

      • turdas@suppo.fi · +4/-12 · 6 days ago

        The energy spent running it is going to be even more negligible than the bandwidth.

        • BarqsHasBite@lemmy.world · +9/-1 · 6 days ago

          Everyone is freaking out about energy use of AI data centers, not the energy use of ISPs. It’s the energy used to run AI that’s the issue.

          • turdas@suppo.fi · +7/-3 · 6 days ago

            It’s a local model. It uses a fraction of the power a cloud AI query uses, and cloud AI queries already use much less power than you obviously think they do (it is AI training – specifically training frontier models – that burns power like crazy).

            • BarqsHasBite@lemmy.world · +8/-4 · 6 days ago

              Whether cloud or local, it takes CPU/GPU use. That’s what takes power. It’s not magically less because it’s on a personal PC rather than a data center.

              • turdas@suppo.fi · +5/-1 · 6 days ago

                Yes it is. Small models like this are on the order of 100x more efficient than the big models backing ChatGPT or Gemini proper.

                • BarqsHasBite@lemmy.world · +2 · edited · 6 days ago

                  Press X to doubt.

                  But in any case allow me to amend my statement:

                  Whether cloud or local, it takes CPU/GPU use. It’s not magically ~~less~~ free because it’s on a personal PC rather than a data center.

                  That’s still what takes power. This is AI use that isn’t needed, and multiplied across hundreds of millions of devices, it’s a shit ton of energy.

              • Zetta@mander.xyz · +2 · edited · 6 days ago

                Like the other guy said, it is magically more efficient because it’s magically significantly smaller. This model is likely a few billion parameters and frontier models are in the 1 - 3+ trillion parameter range.

                Yeah, people’s mobile phones that run this model might die slightly faster, but playing a mobile game or doing any type of hardware intense process will kill your battery faster. It’s no different.

        • 4am@lemmy.zip · +4/-2 · 6 days ago

          Do the math please. Go on, I’ll wait. I want you to see your own process and why you’re wrong.

          • turdas@suppo.fi · +4/-2 · 6 days ago

            If it is not immediately obvious to you how negligible the cost is going to be, you have no clue how little compute small models like this require. Apply a bit of common sense: this is a model designed to run locally on smartphones. If it used a lot of power, the phone would run out of battery.

            It’s hard (if not impossible) to find power usage figures for Gemini Nano, because they’re going to depend on the efficiency of the device it’s running on. If it’s on a phone (where most Chrome installs are), that phone likely has an NPU, in which case the power draw will be negligible. If it has to run on the CPU, it’ll be more.

            So let’s instead assume every user will be using a model comparable to ChatGPT, for which we do have reasonable estimates. According to this estimate, 500 output tokens would use about 0.3 Wh of energy. 500 output tokens is about 400 words, which is probably more than the average user will be using Gemini Nano for (it is intended for small tasks), but let’s assume that as the average daily use. 1 billion users times 0.3 Wh is 300 MWh. Fuck all on a global scale, about 0.0015% of the world’s energy production (20 TWh per day).

            Keep in mind that figure is for the full ChatGPT, which runs on 1500-watt GPUs. Gemini Nano runs on chips that draw more like 1.5 W, and on devices that physically cannot draw more than 15 W. It’s thus reasonable to estimate that it is on the order of 100x more efficient.
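
            The arithmetic above can be double-checked in a few lines. Every input is an estimate from this comment (0.3 Wh per 500 output tokens, 1 billion users, 20 TWh/day of world production), not a measured figure:

```python
# Back-of-the-envelope check of the ChatGPT-scale worst case above.
# All inputs are this comment's assumptions, not measured data.
wh_per_user_per_day = 0.3     # ~500 output tokens on a ChatGPT-class model
users = 1_000_000_000         # assumed 1 billion daily users

total_wh = wh_per_user_per_day * users
total_mwh = total_wh / 1_000_000
print(f"{total_mwh:.0f} MWh/day")             # 300 MWh/day

world_twh_per_day = 20                        # figure used above
share = total_wh / (world_twh_per_day * 1e12)
print(f"{share:.4%} of daily production")     # ~0.0015%
```

            And that’s before applying the roughly 100x discount for running a small model on an NPU instead of a datacenter GPU.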

            • qqq@lemmy.world · +2 · edited · 6 days ago

              Their estimate of energy use is based only on FLOPs, but I’d assume that for real-world energy usage the KV cache would be very impactful, if not eventually dominant. It’s probably also a bit unfair of them to ignore the internet traffic, and likely all the extra network traffic behind the load balancer.

              I’m not a fan of their analysis, but I wonder if it’s close to accurate for this deployment? I can’t imagine they’re having large contexts and ballooning caches on a model meant for a phone.

              • turdas@suppo.fi · +1 · 6 days ago

                > but I’d assume for real world energy usage the KV cache would be very impactful if not eventually dominant.

                They talk about this in the appendix where they go over the (estimated) effects of large amounts of input tokens (up to 100k). This isn’t really relevant for Gemini Nano because it only has a max 32k context window, and the deployment in Chrome probably caps it at far less than that.

                I’m inclined to believe the main analysis is reasonably accurate. The numbers are similar to what I get on my local machine with local models. Granted, I tested with smaller models (7b parameter Mistral in this case) on weaker hardware (AMD 6700XT), but on a quick test I get about 50 tok/s locally at 180 W power use, which is about 0.5 Wh for 500 tokens. AMD GPUs suck for AI, so I think it’s plausible that dedicated compute hardware would get basically the same energy efficiency on a frontier model.
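
                The 0.5 Wh figure falls straight out of the throughput and power numbers (assumed values from my quick test: 50 tok/s at 180 W):

```python
# Energy for 500 tokens at the local benchmark's rate.
# 50 tok/s at 180 W are this comment's numbers, not general figures.
tokens = 500
tok_per_s = 50
watts = 180

seconds = tokens / tok_per_s   # 10 s to generate 500 tokens
joules = watts * seconds       # 180 W * 10 s = 1800 J
wh = joules / 3600             # 1 Wh = 3600 J
print(wh)                      # 0.5
```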

                Gemini Nano on a phone NPU is obviously going to be far more efficient – by all accounts it gets the same or better tok/s than I am getting, at something like 1/50th the TDP.

    • Rhaedas@fedia.io · +21 · 6 days ago

      The real meat of the article is the total abuse of every machine it’s installed on. Illegal abuse. No consent. Hiding the evidence, reinstalling itself if it gets removed. And he barely touches the other concern: once it’s there, the AI will be used for anything you do in Chrome. The download impact is the least of the crimes here, but no one seemed to read the rest of the article.

      • turdas@suppo.fi · +7/-12 · 6 days ago

        That’s ridiculous. How is it “illegal abuse” for an application to install new features on your computer? If you don’t like the feature then uninstall the application. This is how it works for all software.

        It’s a local model so it doesn’t even have the privacy concerns a cloud model would have. Not that that really matters because Chrome is a privacy concern in and of itself already.

        • Rhaedas@fedia.io · +14/-1 · 6 days ago

          I mean, you could read about it. It’s all in the linked article, no reason for me to repeat it all.

          • turdas@suppo.fi · +3/-7 · 6 days ago

            I did read the article. You clearly read more into it than I did, so perhaps you should explain what your interpretation of it was.

            Like I said, the argument presented in the article is ridiculous. Obviously the law does not forbid applications from installing new features in updates. I can’t believe I have to explain this to an adult.

            • Rhaedas@fedia.io · +6 · 6 days ago

              Well, you read it, so you read the Directive as well, which is specific about what is and isn’t okay. I guess you’re fine with Google and whomever else just using your computer for whatever purpose they need. The law is written to keep such activity narrow and for the application’s intended purpose, period. Now perhaps the EULA (that no one reads) is written to allow a lot of flex, and that’s where they think they can keep it legal.

              It reinstalls itself if you try to remove it normally. It’s named so as not to be easy to find (especially since there wasn’t any prompt letting you know it was being installed, or asking permission). It is more than a local AI; it’s connecting outside and being used beyond a controlled action.

              As an adult (since we had to go there and not just discuss it like adults), that all sounds not very trustworthy or legal to me. And while I don’t use Chrome because of its other problems, a lot of people do, people that aren’t going to be the wiser because of this “normal” update.

              • turdas@suppo.fi · +1/-4 · 6 days ago

                Yeah, and like I said, claiming that the directive forbids software from installing its features on your computer is patently ridiculous. The directive is trying to forbid tracking cookies and doesn’t cover anything the user explicitly requests to install. When the user installs Google Chrome, they’re explicitly installing Google Chrome and its features, including the AI features, and so the directive does not apply. If the user doesn’t like what the software does, they can choose to uninstall it.

                This interpretation of the directive would also make all automatic updates illegal, be it Chrome extensions, Chrome itself, Windows Update, Steam’s game updates, etc. Which, you know, is obviously not the case. So I must belabour the point: the argument is absolutely ridiculous.

      • turdas@suppo.fi · +3/-2 · 6 days ago

        Yeah, bloating the install size is the main problem with this. Users running out of storage space is inconvenient, but has no real bearing on climate or privacy.

  • lovingisliving@anarchist.nexus · +4/-27 · 6 days ago

    Breaking news, people use the internet for things, it emits carbon, carbon is bad. AI is not a factor in this equation.

    • Rhaedas@fedia.io · +14 · 6 days ago

      The article is far more than just “Chrome downloaded something”. And they want you to think exactly what you thought: this isn’t AI-related at all, nothing to see here.