how much power does your system need?
from GravitySpoiled@lemmy.ml to selfhosted@lemmy.world on 25 Dec 23:40
https://lemmy.ml/post/24029065
I wonder if my system is good or bad. My server needs 0.1kWh.
There are some really efficient systems out there, but power requirements depend a lot on what is run.
A simple website is very different than a photo gallery running content ID, for example.
I think at max 200w? It runs a collection of fedi/self service stuff.
I also run a pi with a couple of apps on a pi 3 that sips power.
It’s a legitimate issue because it’s 50+ cents per kilowatt hour where I live, so power is very expensive…
That seems really high. I think power where I live is about 12-14 cents per kilowatt hour. What makes it so expensive where you live?
Mostly just that they can. It’s more expensive per tier actually.
pge.com/…/residential-electric-rate-plan-pricing.…
Take a look, this is the old pricing. They just voted to up it again.
There’s legislation that is moving along to charge people with solar because…idk.
They’re charging for solar because PGE is a greedy fuck.
Yes. There was talk locally for local government to take control of the power but it’s just talk…
It gets over 110 where I live in the summer…so air conditioning can make it very expensive.
Wait, this is in the US? How? This is even more expensive than Hawaii, and they have obvious reasons for power to be more expensive there.
Yep. And they are talking about a couple more price hikes next year. Significant ones.
PGE serves Northern California. They keep raising rates like 10-15% each year to cover their losses after all the wildfires a couple years ago and because of the greed.
Take note, folks. Watch this trend spread.
Damn, I wish ours was that cheap. We’re roughly $.30/kwh, mostly because our local poco is a reseller of SCE and we’re in a rural area.
0.50 $/kWh is a normal price in Europe now
Holy shit. I’m paying less than 10c per kwh even in the “high usage” tier.
I wish that was ours…
that’s insane, i pay like 5¢ a kWh
Want to switch?
Do you mean 0.1kWh per hour, so 0.1kW or 100W?
My N100 server needs about 11W.
The N100 is such a little powerhouse and I’m sad they haven’t managed to produce anything better. All of the “upgrades” are either not enough of an upgrade for the money or just more power hungry.
To my understanding 0.1kWh means 0.1 kW per hour.
It’s the other way around. 0.1 kWh means 0.1 kW times 1 h. So if your device draws 0.1 kW (100 W) of power for an hour, it consumes 0.1 kWh of energy. If your factory draws 360 000 W for a second, it consumes the same amount of 0.1 kWh of energy.
Thank you for explaining it.
It does not yet make sense to me. It just feels wrong. I understand that you may normalize 4W in 15 minutes to 16Wh because it would use 16W per hour if it would run that long.
Why can’t you simply assume that I mean 1kWh per hour when I say 1kWh? And not 1kWh per 15 minutes.
kWh is a unit of energy consumed. It doesn’t say over what time period that energy was used, and you can’t assume one. That wouldn’t make any sense. If you want to say how much power a device consumes, just state how many watts (W) it draws.
Thanks!
A watt is 1 Joule per second (1 J/s), i.e. every second your device draws 1 Joule of energy. This energy per unit of time is called “power” and is a rate of energy transfer.
A watt-hour is (1 J/s) * (1 hr)
This can be rewritten as (3600 J/hr) * (1 hr). The “per hour” and “hour” cancel themselves out which makes 1 watt-hour equal to 3600 Joules.
1 kWh is 3,600 kJ or 3.6 MJ
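If it helps to see those conversions run end to end, here’s a tiny sketch (plain Python, purely for illustration):

```python
JOULES_PER_KWH = 3_600_000  # 1 kWh = 3.6 MJ

def kwh(power_watts: float, hours: float) -> float:
    """Energy used by a constant load: power multiplied by time."""
    return power_watts * hours / 1000

print(kwh(100, 1))             # 0.1 kWh: 100 W sustained for one hour
print(kwh(360_000, 1 / 3600))  # ~0.1 kWh: 360 000 W for a single second
print(0.1 * JOULES_PER_KWH)    # 360000.0 J in that same 0.1 kWh
```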
0.1kWh per hour can be written as 0.1kWh/h, which is the same as 0.1kW.
Thanks. Hence, in the future I can say that it uses 0.1kW?
If this was over an hour, yes. Though you’d typically state it as 100W ;)
Yes. Or 100W.
You might have your units confused.
0.1kWh over how much time? Per day? Per hour? Per week?
Watt-hours refer to the total energy used to do something, from a starting point to an ending point. It makes no sense to say that a device needs a certain amount of Wh, unless you’re talking about something like charging a battery to full.
The power being used by a device (like a computer) is just watts.
Think of the difference between speed and distance. Watts are how fast energy is being used; watt-hours are how much has been used, or will be used.
If you have a 500 watt PC, for example, it uses 500Wh per hour, or 12kWh in a day.
kWh is the stupidest unit ever. kWh = 1000 J/s × 60 × 60 s = 3.6×10^6 J, so 0.1 kWh = 360 kJ.
I forgive 'em cuz watt hours are a disgusting unit in general
Work over time, × time, is just work! kWh are just joules (J) with extra steps! Screw kWh, I will die on this hill!!! Raaah
Could be worse, could be BTU. And some people still use tons (of heating/cooling).
Power over time could be interpreted as power/time. Power x time isn’t power, it’s energy (=== work). But otherwise I’m with you. Joules or gtfo.
Whoops, typo! Fixed c:
A maximum of 500 watts. Fortunately your PC doesn’t actually max out your PSU or your system would crash.
the boxes i have running 24/7 use about 20w max each, and about half that at idle or ‘normal’ loads.
About 700 watts, it makes for a decent space heater in the winter.
I’m right around the same level, and it actually keeps my server room / workshop at comfortable temperature during the winter. I also have my gaming PC mounted in my server rack; when that’s running, there are times where my AC will still kick in even when it’s 40 degrees outside.
For two servers (one with a lot of spinning rust), two switches, and a few other miscellaneous network appliances. My server rack averages around 600-650W. During periods of high demand (nightly backups, for instance), that can peak at around 750W.
Wow, that sounds like I have rookie numbers
I’m idling at 120W with eight drives, but I’m currently looking into how to lower it.
My server with 8 hard drives uses about 60 watts and goes up to around 80 under heavy load. The firewall, switch, access points and modem use another 50-60 watts.
I really need to upgrade my server and firewall to something about 10 years newer; it would reduce my power consumption quite a bit and I would have a lot more runtime on UPS.
80-100 watts at idle which is most of the time. Two OS drives, two fast drives, two spinners, lots of networking and always syncing with the rest of the cluster.
9 spinning disks and a couple SSD’s - Right around 190 watts, but that also includes my router and 3 PoE WiFi AP’s. PoE consumption is reported as 20 watts, and the router should use about 10 watts, so I think the server is about 160 watts.
Electricity here is pretty expensive, about $.33 per kWh, so by my math I’m spending $38/month on this stuff. If I didn’t have lots of digital media it’d be worth it to get a VPS probably. $38/month is still cheaper than Netflix, HBO, and all the other junk I’d have to subscribe to.
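If anyone wants to redo that math, a quick sketch (Python, purely illustrative, using the ~160 W server figure and $0.33/kWh from above):

```python
HOURS_PER_MONTH = 24 * 30  # ~720 h

def monthly_cost(avg_watts: float, price_per_kwh: float) -> float:
    """Monthly cost of a constant average draw."""
    kwh_used = avg_watts * HOURS_PER_MONTH / 1000
    return kwh_used * price_per_kwh

print(monthly_cost(160, 0.33))  # ~38 dollars/month for the server alone
```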
That’s true. And the children in my family see no ads, which is priceless. Yet I am looking into ways to cut costs in half by adding a lower-powered mini PC that is always on, with the main computer only running in the evening - maybe.
Same here. 300w with 12 disks, switches, and router. But electricity only costs $.12/kwh. I wouldn’t trust having terabytes of data in the cloud.
My 10 year old ITX NAS build with 4 HDDs used 40W at idle. Just upgraded to an Aoostart WTR Pro with the same 4 HDDs, uses 28W at idle. My power bill currently averages around US$0.13/kWh.
0.1kWh per hour? Day? Month?
What’s in your system?
Computer with a GPU and 50TB of drives. I will measure the computer on its own in the next couple of days to see where the power consumption comes from.
You are misunderstanding the confusion, Kwh is an absolute measurement of an amount of power, not a rate of power usage. It’s like being asked how fast your car can go and answering it can go 500 miles. 500 miles per hour? Per day? Per tank? It doesn’t make sense as an answer.
Does your computer use 100 watt hours per hour? Translating to an average of 100 watts power usage? Or 100 watt hours per day maybe meaning an average power use of about 4 watts? One of those is certainly more likely but both are possible depending on your application and load.
You’re adding to the confusion.
kWh (as in kW*h) and not kW/h is for measurement of energy.
Watt is for measurement of power.
Lol thank you, I knew that. I don’t know why I wrote it that way; in my defense it was like 4 in the morning.
They said kilowatt hours per hour, not kilowatts per hour.
kWh/h = kW
The h can be cancelled, resulting in kW. They’re technically right, but kWh/h shouldn’t ever be used haha.
Yeah but tbh it’s understandable that OP got confused. I think he just means 100W
He might, but he also might mean that he has a power meter displaying kWh since last reset: he plugged it in, checked it again after an arbitrary time period once everything was set up, and it was either showing the lowest non-zero value it could display or a number accumulated over several hours.
Which GPU? How many drives?
Put a kill-o-watt meter on it and see what it says for consumption.
AiBot post. Fuck this shit.
Can you please explain?
My servers (an old desktop overstuffed with drives and an old dell laptop), networking gear and a 50 gal aquarium all run on the same outlet. As long as the aquarium heater is off, the outlet pulls about 200 watts. The aquarium heater spikes that to 400 watts when it kicks in.
Idle: 30 Watts. Starting all docker containers after reboot: 140 Watts.
It needs around 28 kWh per month.
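Going the other way (a monthly kWh total back to an average draw), a small illustrative sketch:

```python
def average_watts(kwh_per_month: float, hours_per_month: float = 24 * 30) -> float:
    """Average power draw implied by a monthly energy total."""
    return kwh_per_month * 1000 / hours_per_month

print(average_watts(28))  # ~38.9 W average draw
```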
Last I checked with a Kill-a-Watt, I was drawing an average of 2.5kWh after a week of monitoring my whole rack. That was about three years ago, and the following was running in my rack.
I also have two battery systems split between high priority and low priority infrastructure.
Ugh, I need to get off my ass and install a rack and some fiber drops to finalize my network buildout.
Would love to add more fiber to my diet, if I had the time and money! The next four years are going to get pricey, so I’m solidifying my stack now with backup hardware and planning for failures.
The Brocade is running my private LAN since it’s the most important; it’s physically cut off from public access. I just upgraded most of my servers to 10GbE and would love to run fiber to my office about 60-70 feet away.
The Brocade I’m using was unlocked by the eBay seller I got it from, so it can theoretically transfer up to 40G. It would be great for the AI rig I keep in the office.
That doesn’t seem right; averaged over a week, that’s only ~15W. Each one of those systems alone will exceed that at idle running 24/7. I’d expect 1-2 orders of magnitude more.
IDK, after a week of runtime it told me 2.5kWh average. Could be average per hour?
Highest power bill I ever saw was summer of 2022: $1800. Temps outside were into the 110-120 range and it was the hottest ever here.
maybe I’ll hook it back up, but I’ve got different (newer) hardware now.
If it gives you kWh as a measure for power, you should toss it because it’s obviously made by someone who had no idea what they were doing.
My whole setup, including 2 Pis and one fully specced-out AM4 system with 100TB of drives, an Intel Arc, and 4x 32GB ECC RAM, uses between 280W and 420W. I live in Germany and pay 25ct per kWh, and my whole apartment uses 600W at any given time and approximately 15kWh per day 😭
17W for an N100 system with 4 HDD’s
Which HDDs? That’s really good.
Seagate Ironwolf “ST4000VN006”
I do have some issues with read speeds but that’s probably networking related or due to using RAID5.
That’s pretty low with 4 HDD’s. One of my servers uses 30 watts. Half of that is from the 2 HDD’s in it.
@meldrik @qaz I've got a bunch of older, smaller drives, and as they fail I'm slowly transitioning to much more efficient (and larger) HGST helium drives. I don't have measurements, but anecdotally a dual-drive USB dock with crappy 1.5A power adapter (so 18W) couldn't handle spinning up two older drives but could handle two HGST drives.
Around 18-20 Watts on idle. It can go up to about 40 W at 100% load.
I have an Intel N100; I’m really happy with the performance per watt, to be honest.
kWh is a unit of energy, not power
I was really confused by that, and by the decision not to just state it in W (even 0.1 kW would be weird).
Wh shouldn’t even exist tbh, we should use Joules, less confusing
At least in the US, the electric company charges in kWh, computer parts are advertised in terms of watts, and batteries tend to be in amp hours, which is easy to convert to watt hours.
Joules just overcomplicates things.
Do you regularly divide/multiply by 3600? That’s not something I typically do in my head, and there’s no reason to do it when everything is denominated in watts. What exactly is the benefit?
I pay my electric bill by the kWh too, and I don’t live in the US. When it comes to household and EV energy consumption, kWh is the unit of choice.
No, if you’re going to lecture people on this, at least be right about facts. 1W is 1J/s. So multiply by an hour and you get 1Wh = 3600J
It’s not literally the same thing. The two units are linearly proportional to each other, but they’re not the same. If they were the same, then this discussion would be rather silly.
Finally, something I can agree with. But that’s only because physics is so undervalued in most educational systems.
I did a physics degree and am comfortable with Joules, but in the context of electricity bills, kWh makes more sense.
All appliances are advertised in terms of their Watt power draw, so estimating their daily impact on my bill is as simple as multiplying their kW draw by the number of hours in a day I expect to run the thing (multiplied by the cost per kWh charged by the utility company, of course).
Watt hours makes sense to me. A watt hour is just a watt draw that runs for an hour, it’s right in the name.
Maybe you’ve just whooooshed me or something, I’ve never looked into Joules or why they’re better/worse.
Joules (J) are the official unit of energy. 1W=1J/s. That means 1Wh=3600J or that 1J is kinda like “1 Watt second”. You’re right that Wh is easier since everything is rated in Watts and it would be insane to measure energy consumption by seconds. Imagine getting your electric bill and it says you’ve used 3,157,200,000J.
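Just to show how unwieldy that gets, a one-liner sketch converting that bill figure back to kWh (Python, for illustration):

```python
JOULES_PER_KWH = 3_600_000  # 1 kWh = 3.6 MJ

bill_joules = 3_157_200_000
print(bill_joules / JOULES_PER_KWH)  # 877.0 kWh, the number a bill would actually show
```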
Thanks for the explainer, that makes a lot of sense.
Or just 3.1572GJ.
Which apparently is how this Canadian natural gas company bills its customers: www.fortisbc.com/about-us/…/how-gas-is-measured
I guess it wouldn’t make sense to measure energy used by gas-powered appliances in Wh since they’re not rated in Watts. Still, measuring volume and then converting to energy seems unnecessarily complicated.
lemmy.world/comment/14146864
Wasn’t it stated as the usage during November? 60kWh for November. Seems logical to me.
Edit: forget it, he’s saying his server needs 0.1kWh which is bonkers ofc
Only one person here has posted their usage for November. The OP has not talked about November or any timeframe.
Yeah, mixed up posts, thought one depended on another because it was under it. Again, forget my post :-)
My server rack has
All together that draws… 0.1 kWh… in about 327s.
In real terms, measured at the UPS, I have a steady-state load of 900-1100W depending on what’s under load. I call it my computationally efficient space heater because it generates more heat than my apartment needs in winter except on the coldest days. It has a dedicated 120V 15A circuit.
Good lord, how much does electricity cost where you are? Combined with the air conditioning to keep the space livable, that would be prohibitively expensive for me
It’s always wild reading the power draws people post here.
I know it’s because this is a US & Europe centric site and many people’s homelabs actually run enterprise-size rigs, but my 4-member household runs on 2kW for the entire house lol, and 75% of that is just the A/C we use at night.
My household of 7 averages 900 watts year-round.
On cold winter days, we can average 6kW over 24h, but peak is more like 10 to 13. Not talking just about my space heaters with embedded computing power and TBs of storage, but the whole household.
Yeah it’s a bit of a chonk. I don’t remember the exact itemization on the power bill and I don’t have one in front of me.
Idles at around 24W. It’s amazing that your server only needs .1kWh once and keeps on working. You should get some physicists to take a look at it, you might just have found perpetual motion.
.1kWh is 100Wh
Good point. Now it does make sense. I know the secret to the perpetual motion machine now.
This is a factual but irrelevant statement
I ate sushi today.
50W-ish idle? Ryzen 1700, 2 HDDs, and a GTX 750ti. My next upgrade will hopefully cut this in half.
Running an old 7th gen Intel. It has a 2070 and a 1080 in it, six mechanical hard drives, and 3 SSDs. Then I have an eighth gen laptop with a 1070 Ti mobile, but the laptop’s a camera server so it’s always running balls to the wall. Also running a UniFi Dream Machine Pro and 24-port, 16-port, and 8-port PoE switches.
Because of the overall workload and the age of the CPU, it burns about 360 watts continuous.
I can save a few watts by putting the disks to sleep, but I’m in the camp where spinning the disks up and down causes more wear than continuous running.
Edit: cleaned up the slaughter from the dictation, after I cleaned up my physical space from Christmas festivities.
I use Unraid with a 5950X and it wouldn’t stop crashing until I disabled C-states.
So with that plus 18 HDDs and 2 SSDs, it sits at 200 watts 24/7.
(screenshot: https://lemmy.wtf/pictrs/image/c1da9f53-6483-4446-b366-170dc3ee2e42.png)
For the whole month of November: 60kWh. This is for all my servers and network equipment. On average, it draws around 90 watts.
How you measuring this? Looks very neat.
Looks like home assistant
Shelly plug, integrated into Home Assistant.
My home rack draws around 3.5kW steady-state, but it also has more than 200 spinning disks
What are you hosting?
I think I would go over 10 kW if I fired up everything. Only about 80 spinny plates of rust though.
Mate, kWh is a measure of electricity volume, like gallons are to liquid. Also, 100 watt hours would be a much more sensical way to say the same thing. What you’ve said in the title is like saying your server uses 1 gallon of water: it’s meaningless without a unit of time. Watts are a measure of flow rate (pun intended), similar to a measurement like gallons per minute.
For example, if your server uses 100 watts for an hour, it has used 100 watt hours of electricity. If your server uses 100 watts for 100 hours, it has used 10,000 watt hours of electricity, aka 10kWh.
My NAS uses about 60 watts at idle, and near 100w when it’s working on something. I use an old laptop for a plex server, it probably uses like 50 watts at idle and like 150 or 200 when streaming a 4k movie, I haven’t checked tbh. I did just acquire a BEEFY network switch that’s going to use 120 watts 24/7 though, so that’ll hurt the pocket book for sure. Soon all of my servers should be in the same place, with that network switch, so I’ll know exactly how much power it’s using.
Pulling around 200W on average.
The PC I’m using as a little NAS usually draws around 75 watts. My Jellyfin and general home server draws about 50 watts while idle but can jump up to 150 watts. Most of the components are very old. I know I could get the power usage down significantly by using newer components, but I’m not sure the electricity savings would outweigh the cost of sending them to the landfill and creating demand for more new components to be manufactured.
I came here to say that my tiny Raspberry Pi 4 consumes ~10 watts, but then after noticing the home server setups of some people and the associated power consumption, I feel like a child in a crowd of adults 😀
we’re in the same boat, but it does the job and stays under 45°C even under load, so I’m not complaining
I have an old desktop downclocked that pulls ~100W that I’m using as a file server, but I’m working on moving most of my services over to an Intel NUC that pulls ~15W. Nothing wrong with being power efficient.
Quite the opposite. Look at what they need to get a fraction of what you do.
Or use the old quote, “they’re compensating for small pp”
I’m using an old laptop with the lid closed. Uses 10w.
All in, including my router, switches, modem, laptop, and NAS, I’m using 50watts +/- 5.
It does everything I need, and I feel like that’s pretty efficient.
80-110W
With everything on, 100W, but I don’t have my NAS on all the time, and in that case I pull only 13W since my server is a laptop.
Is there a (Linux) command I can run to check my power consumption?
Get a Kill-a-Watt meter.
Or smart sockets. I got multiple of them (ZigBee ones), they are precise enough for most uses.
If you have a laptop/something that runs off a battery,
upower
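For battery-powered machines there’s also sysfs; here’s a rough sketch (Python; the BAT0 name and which files exist vary by kernel and hardware, so treat it as an assumption rather than a guarantee):

```python
from pathlib import Path

# Many laptops expose the battery's instantaneous draw in microwatts;
# others only expose current_now (µA) and voltage_now (µV).
# Only meaningful while actually running on battery.
bat = Path("/sys/class/power_supply/BAT0")

power_file = bat / "power_now"
if power_file.exists():
    watts = int(power_file.read_text()) / 1_000_000
else:
    current_ua = int((bat / "current_now").read_text())
    voltage_uv = int((bat / "voltage_now").read_text())
    watts = current_ua * voltage_uv / 1e12  # µA x µV -> W

print(f"~{watts:.1f} W")
```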
If you have a server with out-of-band/lights-out management such as iDRAC (Dell), iLO (HPe), IPMI (generic, Supermicro, and others) or equivalent, those can measure the server’s power draw at both PSUs and total.
My server uses about 6-7 kWh a day, but it’s a dual CPU Xeon running quite a few dockers. Probably the thing that keeps it busiest is being a file server for our family and a Plex server for my extended family (so a lot of the CPU usage is likely transcodes).
Between 50W (idle) and 140W (max load). Most of the time it is about 60W.
So about 1.5kWh per day, or 45kWh per month. I pay 0,22€ per kWh (France, 100% renewable energy) so about 9-10€ per month.
Are you including nuclear power in renewable or is that a particular provider who claims net 100% renewable?
Net 100% renewable, no nuclear. I can even choose where it comes from (in my case, a wind farm in northwest France). Of course, not all of my electricity comes from there at all times, but I have the guarantee that renewable energy certificates equivalent to my consumption will be bought from there, so it is basically the same.
Thanks. I buy Vattenfall but make net 2/3rds of my own power via rooftop solar.
Mine runs at about 120 watts per hour.
Please. The watt is an SI unit of power, equivalent to a Joule per second. The watt-hour is a non-SI unit of energy (1Wh = 3600 J). Learn the difference and use it correctly.
45 to 55 watts.
But I make use of it for backup and firewall. No cloud shit.