Wondering if anyone has a feel for the power efficiency of older server hardware. I’m reading posts from people who say their R710 with lots of hard drives IDLES at 160W with 8 drives. So… if you take the hard drives out of the equation, it’s probably still around 120W. Is that just how inefficient old computers are? Kinda like incandescent bulbs are less efficient than LED bulbs? How efficient is the R730 compared to the R710?
My 6-year-old desktop idles at 60W with a GPU, and 30W without it. Seems like a huge difference. It’s something like $70 more per year to run an R710 than my old desktop with a GPU. Is that correct?
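For what it’s worth, the $70 figure is in the right ballpark if you assume a constant idle draw and a typical US residential electricity rate (the rate here is an assumption, not from the post):

```python
# Back-of-envelope annual cost of the extra idle draw.
# Assumptions: R710 minus drives ~120 W vs. desktop with GPU ~60 W, running 24/7.
RATE_PER_KWH = 0.13  # assumed electricity price in $/kWh; adjust for your utility

def yearly_cost(watts, rate=RATE_PER_KWH, hours=24 * 365):
    """Annual electricity cost in dollars for a constant load."""
    return watts / 1000 * hours * rate

extra = yearly_cost(120) - yearly_cost(60)
print(f"~${extra:.0f}/year extra")  # ~$68/year at $0.13/kWh
```

At higher rates (say $0.25/kWh in parts of the EU or California), the same 60W gap costs over $130/year, which changes the math on old server hardware quite a bit.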
You have to understand the priorities of the rack server market.
#1 is dependability. It needs to keep running no matter what, so everything is built around redundancy and overbuilding: more cooling, dual CPUs, dual power supplies, lots of drives in RAID…
#2 is size. Colo space is expensive! So keep it small. So everything is densely packed, which is bad for airflow. And you get stacked small fans running at the speed of sound.
#3 is performance. Yeah, you would think it was first, but it ain’t. Still, it means 10k and 15k RPM spinning drives, and those are loud!
Way down the list is power… When you consider the cost of the hardware new, the cost of the colo space, and the cost of the people maintaining it, the power bill is next to nothing. The only thing they care about less than power consumption is noise, which isn’t even on the list…
Now, compare that with workstations. They share a lot of the same components, like Xeon CPUs, lots of RAM, RAID… But they sit on a desk, so noise, heat, and power are real concerns. And they’re often overlooked in the used and refurb market. So for less money, you get server-like components and performance in a quieter, more power-friendly form factor.