After avoiding AI/LLMs like the plague, I’ve been asking Duck.ai some fairly detailed questions, and have been, admittedly, pretty impressed with the results (after checking its references, of course). For an oldster like me, it turns the search process on its head: a regular search turns up x hits, I try to choose what appear to be the most reputable sites among them, and then I dig through the text to find an answer; now I’m given the answer right away, and I just have to check that it’s actually correct.
Anyway, Mrs. Erinaceus keeps reminding me that AI sucks up huge amounts of energy (and water, too, apparently), much of it not from renewable sources. So my question is, is there any “acceptable” AI that has a good reputation for not serving slop and is also easy on the environment? Should I even be using AI at all? Mainly concerned about the environment here, I like to think I’ve been round long enough to have relatively well-developed bullshit detectors. Obviously want to avoid Big Tech LLMs too, unless there’s some way of using them without them getting any benefit from it.
The real power consumption cost is from the model training. The marginal cost of running a query is small. Unless you use LLMs heavily, you probably use more power in many other ways, like if you forget to put your computer to sleep.
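As a rough back-of-envelope check (both numbers below are loose assumptions you can tweak, not measurements; public per-query estimates vary a lot):

```python
# Back-of-envelope comparison of one forgotten "sleep" vs. LLM queries.
# Both figures are assumptions: public estimates for a single query range
# roughly from 0.3 to 3 Wh, and 50 W is just a typical idle desktop draw.
WH_PER_QUERY = 3.0         # assumed upper-end estimate, Wh per query
IDLE_DESKTOP_WATTS = 50.0  # assumed idle draw of a desktop left on overnight
HOURS_AWAKE = 8.0          # hours the machine sits on instead of sleeping

overnight_wh = IDLE_DESKTOP_WATTS * HOURS_AWAKE
print(f"Computer left on overnight: {overnight_wh:.0f} Wh")
print(f"Equivalent number of queries: {overnight_wh / WH_PER_QUERY:.0f}")
```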
More training generally means better models. If you want to minimize your share of that, look for smaller models on Hugging Face that are tightly focused on whatever task you have at hand.
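For instance, a small distilled model from the Hub can handle a focused task like summarization entirely on your own machine. A minimal sketch using the `transformers` library; the model name is just one example of a compact model, so swap in whatever suits your task:

```python
# Sketch: run a small, task-specific model locally with Hugging Face transformers.
# "sshleifer/distilbart-cnn-12-6" is just an example of a compact distilled
# summarizer; it downloads once, then runs entirely on your own hardware.
from transformers import pipeline

summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

article = (
    "Large language models are trained in huge data centres, but many everyday "
    "tasks such as summarization can be handled by much smaller distilled models "
    "that run comfortably on a laptop CPU."
)

result = summarizer(article, max_length=40, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```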
Or, you could just do most of your queries when your local electricity mix has more renewables.
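Some grids even publish a public carbon-intensity feed you can check before firing off a batch of queries. A rough sketch using the UK National Grid’s free API as an example (other regions have similar services, e.g. Electricity Maps, usually behind an API key):

```python
# Sketch: check current grid carbon intensity before running heavy queries.
# Uses the UK national grid's free, keyless API as an example; adapt the
# endpoint and field names for whatever service covers your region.
import requests

resp = requests.get("https://api.carbonintensity.org.uk/intensity", timeout=10)
resp.raise_for_status()
intensity = resp.json()["data"][0]["intensity"]

forecast = intensity["forecast"]  # gCO2 per kWh
index = intensity["index"]        # e.g. "low", "moderate", "high"

print(f"Grid carbon intensity: {forecast} gCO2/kWh ({index})")
if "low" in index:
    print("Good time to run your queries.")
else:
    print("Maybe wait for a greener hour.")
```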
If you run it on your own hardware, then you know exactly how much energy it’s using. Some models can even run on an average computer, though the quality isn’t as good as the big hosted ones.
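As an illustration, a small quantized open-weight model in GGUF format will run on an ordinary laptop through the llama-cpp-python bindings. This is just a sketch; the model path is a placeholder for whatever model file you’ve actually downloaded:

```python
# Sketch: run a quantized open-weight model entirely on your own machine.
# The model path is a placeholder; a small model quantized to 4 bits
# typically fits in a few GB of RAM and runs on a laptop CPU.
from llama_cpp import Llama

llm = Llama(model_path="./models/some-small-model.Q4_K_M.gguf", n_ctx=2048)

output = llm(
    "Q: Name three renewable energy sources. A:",
    max_tokens=64,
    stop=["Q:", "\n\n"],
)
print(output["choices"][0]["text"].strip())
```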
The problem we have right now is that everyone is trying to use generative AI for everything, all the time, even for basic tasks such as googling for a fact or simple math that could be done on a calculator. They’re also often using the latest and greatest AI models, which are powerful but burn a lot of processing power on every request. Running the servers to answer all of that takes a large amount of electricity, and the servers are then cooled with water; that’s where the water usage comes from, as I understand it.
So if you use a simpler / more efficient model, and only use it for tasks where it’s actually better than conventional tools, you’ll do much better in terms of power usage.
Thanks! 👍
BTW, are the Chinese way ahead of the US in this like they are with renewables and electric cars? They’ll probably figure out a way to have AI use significantly less power before “we” can.
They kinda did with DeepSeek, out of necessity, to work around the import restrictions on AI chips. But that model is very pro-China (it has a lot of censorship built in).