N100 Mini PC w/ 3xNVMe?

helenslunch@feddit.nl to Selfhosted@lemmy.world – 21 points –

Not sure why this doesn't exist. I don't need 12TB of storage. When I had a Google account I never even crossed 15GB. 1TB should be plenty for myself and my family. I want to use NVMe since it is quieter and smaller. 2230 drives would be ideal. But I want 1 boot drive and 2 x storage drives in RAID. I guess I could potentially just have 2xNVMe and have the boot partition in RAID also? Bonus points if I can use it as a wireless router also.

Before anyone loses their minds, imagine you get the i3-8300T model, which peaks at 25 W; that's about $0.375 a month to run the thing, assuming a constant 100% load that you'll never actually hit.

Not sure how you came to that conclusion, but even in places with very cheap electricity, it does not even come close to your claimed $0.375 per month. At 25 W you would obviously consume about 18 kWh per month. Assuming $0.10/kWh you'd pay $1.80/month. In Europe you can easily pay $0.30/kWh, so you would already pay more than $5 per month or $60 per year.
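
For anyone who wants to plug in their own numbers, here's a quick sketch of that arithmetic (constant draw, running 24/7; the wattage and rates are simply the figures above):

```python
# Electricity cost of a machine drawing a constant 25 W, 24 hours a day.
watts = 25
kwh_per_month = watts * 24 * 30 / 1000        # 18 kWh

for rate in (0.10, 0.30):                     # $/kWh: cheap vs. typical European price
    monthly = kwh_per_month * rate
    print(f"${rate:.2f}/kWh -> ${monthly:.2f}/month, ${monthly * 12:.2f}/year")
# $0.10/kWh -> $1.80/month, $21.60/year
# $0.30/kWh -> $5.40/month, $64.80/year
```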

Just used cpubenchmark.net.

Well, what they are stating is obviously wrong then. No need to use some website for that anyway, since it is so easy to calculate yourself.

Okay, you're onto something: they're assuming 8 hours/day at $0.25 per kWh and 25% CPU load by default. Still, if we tweak those into more reasonable numbers, or even take your $60/year, it will still be cheaper than a cloud service... either way, those machines won't run at 25 W at idle, more like 7 W.
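
For what it's worth, those defaults do reproduce the $0.375 figure; a quick sketch, assuming the calculation really is peak wattage times average load times hours per day:

```python
# Reconstructing the $0.375/month figure from the defaults named above:
# 8 h/day, $0.25/kWh and 25% average load on a part that peaks at 25 W.
peak_watts = 25
load_factor = 0.25
hours_per_day = 8
rate = 0.25                                    # $/kWh

kwh_per_month = peak_watts * load_factor * hours_per_day * 30 / 1000   # 1.5 kWh
print(f"${kwh_per_month * rate:.3f}/month")    # $0.375/month
```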

Sure, cloud services can get quite expensive, and I agree that used hardware (as long as it is at least somewhat modern) is a viable option for self-hosting.

I just wanted to make sure the actual cost is understood. I find it quite helpful to calculate this for the systems I run. Sometimes it can actually make sense to replace some old hardware with newer stuff, simply because of the electricity cost savings of using newer hardware.

Sometimes it can actually make sense to replace some old hardware with newer stuff, simply because of the electricity cost savings of using newer hardware.

Yes, and we usually see that with very old server-grade hardware vs. new-ish consumer hardware. Once the price difference is around $100 or so, we're talking about years before break-even, and it may not make much sense.
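
As a rough illustration of that break-even math (all numbers here are made up for the example: a 25 W idle box replaced by a 7 W one at $0.30/kWh):

```python
# Rough break-even estimate for replacing old hardware with a more efficient box.
# All figures are illustrative; plug in your own idle draws and electricity rate.
old_watts, new_watts = 25, 7                   # assumed idle draw, old vs. new
rate = 0.30                                    # $/kWh
price_difference = 100                         # $ extra for the newer hardware

saved_per_year = (old_watts - new_watts) * 24 * 365 / 1000 * rate   # ~$47/year
print(f"Break-even after ~{price_difference / saved_per_year:.1f} years")  # ~2.1 years
```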