Illusion — Why do we keep believing that AI will solve the climate crisis (which it is facilitating), get rid of poverty (on which it is heavily relying), and unleash the full potential of human creativity (which it is undermining)?
iirc, there were some statements from companies (Microsoft?) that we won’t have to worry about AI’s effect on climate change because it’ll also come up with the solutions
We’ve had the tech to drastically cut power consumption for a few years now; it’s just a matter of adapting existing hardware to include it.
There’s a company, MythicAI, which found that using analog computers (ones built specifically to sift through .CKPT models, for example) drastically cuts down energy usage while staying consistently 98-99% accurate. It works by taking a digital request, converting it to an analog signal, processing that signal in the analog domain, then converting the result back to a digital signal that’s sent to the computer to finish the task.
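For intuition, here’s a toy sketch of that round trip in Python. This is just a simulation of the idea, not MythicAI’s actual hardware or API: the analog multiply is approximate (which is where the 98-99% accuracy figure comes from), but the expensive matrix math happens outside the digital pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def to_analog(x, bits=8):
    """Quantize a digital vector down to the limited precision of a DAC."""
    scale = (2 ** bits - 1) / max(np.abs(x).max(), 1e-12)
    return np.round(x * scale) / scale

def analog_matvec(weights, signal, noise_std=0.01):
    """Simulate an analog crossbar multiply: exact math plus analog noise."""
    result = weights @ signal
    noise = rng.normal(0.0, noise_std * np.abs(result).max(), result.shape)
    return result + noise

def from_analog(y, bits=8):
    """Quantize the analog result back through an ADC."""
    return to_analog(y, bits)

W = rng.standard_normal((256, 256))   # stand-in for one layer of a model
x = rng.standard_normal(256)          # stand-in for an activation vector

digital = W @ x                                       # reference: full digital compute
analog = from_analog(analog_matvec(W, to_analog(x)))  # DAC -> analog multiply -> ADC

# Relative error stays at the few-percent level: the 98-99% accuracy
# trade-off in miniature.
print(np.linalg.norm(digital - analog) / np.linalg.norm(digital))
```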
In my experience, AI only draws 350+ watts while it’s actually sifting through the model; it ramps up and down consistently based on when the GPU’s CUDA cores and VRAM are being utilized, i.e., while the program is processing an image or a text response (Stable Diffusion and KoboldAI, in my case). Outside of that, you can keep Stable Diffusion open all day idle and the power draw is only marginally higher, if it even is.
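If you want to verify that ramping on your own card, a minimal logger using the pynvml bindings works (assuming an NVIDIA GPU with the `nvidia-ml-py` package installed; device index 0 assumes a single-GPU machine):

```python
import time
import pynvml

# Log GPU power draw once a second to watch the idle-vs-load ramping.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # reported in milliwatts
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        print(f"power: {power_w:6.1f} W  gpu: {util.gpu:3d}%  vram: {util.memory:3d}%")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

Run it in one terminal, kick off a Stable Diffusion generation in another, and you can watch the wattage spike during inference and fall back to near-idle afterwards.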
So according to MythicAI, the groundwork is there. Computers just need an analog computer attachment that offloads that workload from the GPU.
The thing is… I’m not sure how popular it will become:

1) These aren’t widely available; you have to order them from the company and get a quote, and who knows if you can even order just one.

2) If you do get one, it’s likely not just going to pop into a basic user’s Windows install running Stable Diffusion; it probably expects server-grade hardware (which is where the majority of the power consumption comes from, so good for business, but consumer availability would be nice).

3) Most importantly, NVIDIA has sunk so much money into GPU-powered AI. If throwing 1,000 watts at CUDA stops making strides, they may try to obfuscate this competition. NVIDIA has a lot of money riding on the AI wave, and if word gets out that another company can cut both the cost of the hardware and the cost of running it, removing the need for multiple 4090s (or whatever is best) while delivering more accuracy per watt, that’s a real threat to them.
Oh, and 4) MythicAI is specifically geared towards real-time camera AI tracking, so they’re likely an evil surveillance company. The hardware itself also isn’t geared towards general-purpose AI; it’s built with specific models in mind. That isn’t inherently an issue, it just circles back to point 2): it’s not just the hardware that will be a hassle, but the models themselves too.