I use an LLM regularly as a coding aid, and it works fine. Yesterday I had to turn a math formula into code. My math knowledge is somewhat rusty, so I just pasted the formula into the LLM and asked for an explanation and an example of how to implement it in code. It worked perfectly; it was exactly right. I understood the formula and could proceed with the code.
The whole process took seconds. If I had gone down the rabbit hole of searching until I figured out the formula by myself, it could have taken maybe a couple of hours.
It’s just a tool. Properly used, it’s useful.
And don’t try to hit me with the “AI is bad for the environment” argument, because I stopped traveling abroad by plane more than a decade ago to reduce my carbon emissions. If regular people want to reduce their carbon footprint, the first step is giving up vacations in faraway places. I have run LLMs locally, and the energy consumption is similar to gaming, so there’s no case to be made there, imho.
“AI bad” is obviously stupid.
“Current LLMs are bad” is very true. The methods used to create them are immoral, and arguably illegal. In fact, some of the AI companies are pushing to make what they themselves did clearly illegal. How convenient…
And I hope you understand that an LLM running locally consuming the same amount as gaming is completely missing the point, right? The training, and the ongoing retraining it requires, is what makes it so wasteful. That is like saying eating bananas in winter in Sweden doesn’t generate much CO2 because the distance to the supermarket isn’t that far.
I don’t believe in intellectual property. I’m actually very much against it.
But if you believe in it for some reason, there are models trained exclusively on open data. The Spanish government recently released a model called ALIA; it was done 100% with open data, and none of the data used for it was proprietary.
Training energy consumption is not a problem because it happens so infrequently. It’s like complaining about animated movies because rendering takes months and uses a lot of power. It’s an irrational argument. I don’t buy it.
I’m not necessarily for intellectual property, but as long as they want to hold IP on their own stuff, they should respect everyone else’s. That is what is immoral.
How does it happen infrequently? The training time for, e.g., GPT-4 was about 4 months: ChatGPT (GPT-3.5) was released in November 2022, and GPT-4 was released in March 2023. How many months is that? Oh, look at that… They train their AI 24/7. For GPT-4’s training, they reportedly consumed about 7,200 MWh. The average American household consumes a little less than 11,000 kWh per year. So in a third of a year, they consumed about 654 times the annual energy of an average American household; over a full year, that’s around 2,000 average households’ worth of electricity. And that is just training, and just electricity. We haven’t even talked about the water. We’re also ignoring that they are scaling up, so this assumes they would use the same resources to train their next models, which they didn’t.
Edit: side note, in 2024, ChatGPT was projected to use 226.8 GWh.
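For what it’s worth, the arithmetic in that comment checks out under its own assumptions. A minimal sketch; the 7,200 MWh and 11,000 kWh inputs are the thread’s claims, not verified figures:

```python
# Back-of-the-envelope check of the figures quoted above.
# These inputs are the commenter's claims, not official numbers.
TRAINING_ENERGY_MWH = 7_200        # claimed GPT-4 training energy
HOUSEHOLD_KWH_PER_YEAR = 11_000    # rough avg US household electricity
TRAINING_MONTHS = 4                # Nov 2022 -> Mar 2023

training_kwh = TRAINING_ENERGY_MWH * 1_000
households_during_training = training_kwh / HOUSEHOLD_KWH_PER_YEAR
annualized = households_during_training * (12 / TRAINING_MONTHS)

print(int(households_during_training))  # -> 654
print(int(annualized))                  # -> 1963, i.e. roughly 2,000
```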
Two thousand times a household’s usage, taking your approximations as correct, for something that’s used by millions, or potentially billions, of people is not bad at all.
Probably comparable with 3D movies or many other industrial computing uses, like search indexers.
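To put “2,000 households” in per-user terms, here is a sketch under an assumed user count; the 100 million figure is purely illustrative, not a number from the thread:

```python
# Illustrative per-user share of the claimed annualized training energy.
# The 100 million user count is an assumption for scale, not a fact.
HOUSEHOLDS_EQUIVALENT = 2_000       # from the comment above
HOUSEHOLD_KWH_PER_YEAR = 11_000
ASSUMED_USERS = 100_000_000         # hypothetical round number

total_kwh_per_year = HOUSEHOLDS_EQUIVALENT * HOUSEHOLD_KWH_PER_YEAR
per_user_kwh = total_kwh_per_year / ASSUMED_USERS
print(per_user_kwh)  # -> 0.22 kWh per user per year, for training
```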
Yeah, but then they start “gaming”…
I just edited my comment, so no wonder you missed it.
In 2024, ChatGPT was projected to use 226.8 GWh. You see, if people are “gaming” 24/7, it is quite wasteful.
Edit: just in case it isn’t obvious: the hardware needs to be produced, the data collected, and they are scaling up. So my point was that even if you occasionally run a little bit of LLM locally, more energy is consumed than just the energy used for that one prompt.
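Measuring the projected 2024 figure with the same household yardstick used earlier in the thread (again, both inputs are the thread’s claims):

```python
# The projected 2024 ChatGPT consumption (226.8 GWh, as quoted in the
# thread) expressed in average-US-household equivalents.
PROJECTED_GWH = 226.8
HOUSEHOLD_KWH_PER_YEAR = 11_000

projected_kwh = PROJECTED_GWH * 1_000_000
print(int(projected_kwh / HOUSEHOLD_KWH_PER_YEAR))  # -> 20618 households
```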
So many tedious tasks that I can do but don’t want to; now I just say a paragraph and make minor corrections.
What are you doing to reduce your fresh water usage? You do know how much fresh water they waste, right?
Do you? Also, do you know what the actual issues with fresh water are? Do you actually think the cooling of some data center is relevant? Because I, data in hand, really think it’s not. It’s just part of the dogma.
Stop eating vegetables that need irrigation and come from areas without a lot of rain; that’s a much better approach if you care about that. Eat what people in your area ate a few centuries ago if you want to be water-sustainable.
Are you serious? Do you not know how they cool data centers?
https://e360.yale.edu/features/artificial-intelligence-climate-energy-emissions
https://www.forbes.com/sites/cindygordon/2024/02/25/ai-is-accelerating-the-loss-of-our-scarcest-natural-resource-water/
https://cee.illinois.edu/news/AIs-Challenging-Waters
That’s nothing compared with intensive irrigation.
A diet appropriate to your region has a massively bigger impact on water than some cooling.
Also, not every place on Earth has fresh water issues. Some places have them; some are pretty OK. Not using water in a place where it’s plentiful does nothing for people in a place where fresh water is scarce.
I should know, as my country is pretty dry. Supercomputers, such as the one used for our national AI, have had no visible impact on the water supply.
You read all three of those links in four minutes?
Also, irrigation creates food, which people need to survive, while AI creates nothing that people need to survive, so that’s a terrible comparison.
I’m already familiar with industrial and computing uses of water. As I said, very little impact.
Not all food is needed to survive. Any vegan would probably make this argument better than me. But the choice of food is important, and choosing one food over another is not a matter of survival but a matter of joy, a tertiary necessity.
Not to sound like a boomer, but if this is such a big worry for you, a better action may be to stop eating avocados in a place where avocados don’t naturally grow.
As I said, I live in a pretty dry place, where water cuts due to scarcity are common. Our very few supercomputers have no impact on it. And supercomputers in China are certainly 100% irrelevant to our water scarcity issue.
I don’t know why you keep bringing up luxuries like avocados. I don’t remember the last time I ate an avocado, or a banana, or any exotic fruit for that matter.
I somehow feel like you won’t listen to LLM criticism from me either, so it’s a disingenuous argument.
The main issue is that the business folks are pushing it to be used far beyond actual demand, as they see dollar signs if they can pull off a grift. If this or anything else pops the bubble, then the excessive footprint will subside, even as the technology persists at a more reasonable level.
For example, according to some report I saw, OpenAI spent over a billion on ultimately failed attempts to train GPT-5 that had to be scrapped: essentially trying to brute-force their way to better results when we may have hit the limits of their approach. Investors tossed more billions their way to keep trying, but if the bubble pops, that money is no longer available and they can’t waste resources on this.
Similarly, with the pressure off, Google might stop throwing every search at AI. For every person asking for help translating a formula into code, there are hundreds of people accidentally running a model due to a Google search.
So the folks for whom it’s sincerely useful might get their benefit with a more reasonable impact as the overuse subsides.
Same, I’m not going back to not using it. I’m not good at this stuff, but AI can fill in so many blanks. When installing things from GitHub, it can read the instructions and follow them, guiding me through the steps for more complex stuff, helping me launch and do things I would never have thought of. It’s opened me up to a lot of hobbies that I’d otherwise find too hard.