Looking to build my first PC in almost 30 years; What should I be on the lookout for?

CosmicTurtle@lemmy.world to Selfhosted@lemmy.world – 143 points –

It looks like !buildapc community isn't super active so I apologize for posting here. Mods, let me know if I should post there instead.

I built my first PC when I was I think 10-11 years old. Built my next PC after that and then sort of moved toward pre-made HP/Dell/etc. My last PC's mobo just gave out and I'm looking to replace the whole thing. I've read over the last few years that prefabs from HP/Dell/etc. have gone to shit and don't really work like they used to. Since I'm looking to expand comfortably, I've been thinking of giving building my own again.

I remember when I was a young lad, that there were two big pain points when putting the rig together: motherboard alignment with the case (I shorted two mobos by having it touch the bare metal of the grounded case; not sure how that happened but it did) and CPU pin alignment so you don't bend any pins when inserting into the socket.

Since it's been several decades since my last build, what are some things I should be aware of? Things I should avoid?

For example, I only recently learned what M.2 SSDs are. My desktop has (had) SATA 3.5" drives, only one of which is an SSD.

I'll admit I am a bit overwhelmed by some of my choices. I've spent some time on pcpartpicker and feel very overwhelmed by some of the options. Most of my time is spent in code development (primarily containers and node). I am planning on installing Linux (Ubuntu, most likely) and I am hoping to tinker with some AI models, something I haven't been able to do with my now broken desktop due to its age. For ML/AI, I know I'll need some sort of GPU, knowing only that NVIDIA cards require closed-source drivers. While I fully support FOSS, I'm not an OSS purist and fully accept that using closed-source drivers on Linux may not be avoidable. Happy to take recommendations on GPUs!

Since I also host a myriad of self hosted apps on my desktop, I know I'll need to beef up my RAM (I usually go the max or at least plan for the max).

My main requirements:

  • Intel i7 processor (I've tried i5s and they can't keep up with what I code; I know i9s are the latest hotness but don't think the price is worth it; I've also tried AMD processors before and had terrible luck. I'm willing to try them again but I'd need a GOOD recommendation)
  • At least 3 SATA ports so that I can carry my drives over
  • At least one M.2 port (I cannibalized a laptop I recycled recently and grabbed the 1TB M.2 card)
  • On-board Ethernet/NIC (on-board wifi/bluetooth not required, but won't complain if they have them)
  • Support at least 32 GB of RAM
  • GPU that can support some sort of ML/AI with DisplayPort (preferred)

Nice to haves:

  • MoBo with front USB 3 ports but will accept USB 2 (C vs A doesn't matter)
  • On-board sound (I typically use headphones or bluetooth headset so I don't need anything fancy. I mostly listen to music when I code and occasionally do video calls.)

I threw together this list: https://pcpartpicker.com/list/n6wVRK

It didn't matter to me if it was in stock; just wanted a place to start. Advice is very much appreciated!

EDIT: WOW!! I am shocked and humbled by the great advice I've gotten here. And you've given me a boost in confidence in doing this myself. Thank you all and I'll keep replying as I can.


GPUs these days use a whole lot of power. Ensure your power supply is specced appropriately.

And make sure it's an actually good PSU too.

In gaming, and possibly other loads, Nvidia 40-series cards, and especially 30-series cards, love transient spikes which can easily exceed 2x the nominal power consumption. Make sure your PSU can handle those spikes, both in terms of their brevity and their current.

That's what finally did in my 10 year old Corsair. I was technically within specs on wattage with my new 4070 but certain loads would cause it to trip the over current protection anyway.
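To make the headroom point concrete, here's a rough sizing sketch. The numbers and the 2x transient factor are assumptions drawn from the discussion above, not measured figures; always check your actual parts' spec sheets.

```python
# Rough PSU sizing sketch (hypothetical numbers -- check your actual parts'
# spec sheets). Idea: sum nominal draw, then leave headroom for GPU transient
# spikes, which can briefly hit ~2x the GPU's rated power.

def recommended_psu_watts(cpu_w, gpu_w, other_w=75, transient_factor=2.0, margin=1.2):
    """Estimate a PSU rating that survives GPU transient spikes.

    cpu_w / gpu_w: nominal (TDP-ish) power of the CPU and GPU.
    other_w: rough allowance for drives, fans, RAM, and motherboard.
    transient_factor: how far above nominal the GPU can briefly spike.
    margin: extra safety margin on top of the spike-inclusive total.
    """
    spike_total = cpu_w + gpu_w * transient_factor + other_w
    return round(spike_total * margin)

# Example: a 125 W CPU with a 200 W GPU (roughly 4070-class)
print(recommended_psu_watts(125, 200))  # 720
```

This is why a PSU that looks fine on paper (125 + 200 + 75 = 400 W nominal) can still trip over-current protection on a lower-rated unit.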

Well, let's see:

  • You no longer have to set jumpers to "master" or "slave" on your hard drives, both because we don't put two drives on the same ribbon cable anymore and because the terminology is considered kinda offensive.

  • Speaking of jumpers, there's a distinct lack of them on motherboards these days compared to the ones you're familiar with: everything's got to be configured in firmware instead.

  • There's a thing called "plug 'n play" now, so you don't have to worry about IRQ conflicts etc.

  • Make sure your power supply is "ATX", not just "AT". The computer has a soft on/off switch controlled through the motherboard now -- the hard switch on the PSU itself can just normally stay on.

  • Cooling is a much bigger deal than it was last time you built a PC. CPUs require not just heat sinks now, but fans too! You're even going to want some extra fans to cool the inside of the case instead of relying on the PSU fan to do it.

  • A lot more functionality is integrated onto motherboards these days, so you don't need nearly as big a case or as many expansion slots as you used to. In fact, you could probably get by without any ISA slots at all!

While I love this list, it is more applicable to the turn of the century than to a decade ago. I was half expecting to see “ram no longer has to be installed in pairs” on the list.

ETA: Talking about EDO memory not dual channel

I think you may have misread OP's post. They haven't built a PC since shirtly after they were 10-11, which was almost 30 years ago. So developments since the turn of the century are in fact relevant here, heh.

They haven't built a PC since shirtly after they were 10-11…

Well, they’d need a much bigger shirt now than when they were 10 or 11.

I’ll see myself out.

Wait RAM doesn’t need to be installed in pair? I am an old apparently

Compared to those pain points, building a modern PC should be a breeze. CPUs go in Zero Insertion Force sockets, so as long as you remember to lift the little lever you won’t bend any pins. People don’t even wear static discharge wrist bands anymore (although it couldn’t hurt) or worry about shorting things out. And power connectors only fit one way, unlike the AT power connector.

Speaking of breeze your only pain point might be making sure you have enough air circulation for cooling all that gear.

I remember working on a PC back in my Geek Squad days that had a lever.

For air circulation, what should I be on the lookout for? Making sure I have clearances, of course, but should I buy more fans than I need?

Cases usually have fans preinstalled that should be fine. Just pay attention to the direction: have them all blow air front to back. There is usually an arrow indicating which way a fan moves air.

Run a benchmark after building the PC and check temperatures.

Don't just look at temperatures though; look at clock speed too. 95°C+ is normal for modern high-end CPUs (AMD 7000-series chips actively try to run at that temperature under full load). What you want to make sure is that it's not throttling.

If this is a server and you don't want your thermal paste to be toast in a year then I'd suggest lowering the maximum temperature in the bios if it lets you.
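A minimal sketch of the throttle check mentioned above, assuming a Linux box: `/proc/cpuinfo` exposes a `cpu MHz` line per core, so you can sample clocks while a benchmark runs and compare them against the chip's rated boost.

```python
# Parse the "cpu MHz" lines from /proc/cpuinfo-style text. Run this in a loop
# during a benchmark; if clocks sag well below the rated boost at 95C+, the
# CPU is throttling.

def core_clocks(cpuinfo_text):
    """Return per-core clock speeds (MHz) from /proc/cpuinfo text."""
    clocks = []
    for line in cpuinfo_text.splitlines():
        if line.startswith("cpu MHz"):
            clocks.append(float(line.split(":")[1]))
    return clocks

# Sample text in the /proc/cpuinfo format (two cores near 4.7 GHz):
sample = "processor\t: 0\ncpu MHz\t\t: 4690.012\nprocessor\t: 1\ncpu MHz\t\t: 4688.500\n"
print(core_clocks(sample))
```

On a real system you'd feed it `open("/proc/cpuinfo").read()`; tools like `lm-sensors` give you the matching temperatures.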

That, and you need to decide how much positive or negative pressure you want in there as well. You could always do some calculations: treat your case as an open control volume where mass can transfer across the boundaries. Then the sum of air going into and out of the case must equal the rate of change of air in the case. Assuming the volume of air in your case is constant, this term would be zero. So you can look at the rated volume flow rate for each fan (CFM, i.e. cubic feet per minute) and see if this summation is positive or negative. A positive value would mean "positive pressure" and a negative value "negative pressure".

The only problem is if the fans are not running at max RPM and/or the rated CFM value, which is the case if you have your fans plugged into the motherboard (regardless of whether you're using PWM or 3-pin). In that case, you would have to calculate the volume flow rate of each individual fan as a function of the RPM. This may not be a linear function and would probably require taking some data and coming up with a regression for it. That would be way harder to do.

tldr: add up the CFM going into the case, subtract the CFM leaving the case. If the value is positive you have "positive pressure"
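The tldr as code, under the same caveat that rated CFM only holds at full fan speed. The example fan numbers are hypothetical.

```python
# Net case pressure is just intake CFM minus exhaust CFM (assuming fans
# actually run at their rated flow, which they won't if the motherboard
# is curving them down).

def net_case_pressure(intake_cfm, exhaust_cfm):
    """Positive result -> positive pressure; negative -> negative pressure."""
    return sum(intake_cfm) - sum(exhaust_cfm)

# Hypothetical build: two 50 CFM front intakes, one 60 CFM rear exhaust
print(net_case_pressure([50, 50], [60]))  # 40 -> positive pressure
```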

Some thoughts:

Ubuntu, most likely

I'd encourage you to take a look at Linux Mint; it alleviates some of the Ubuntu fuckiness. And if you want to join the "I use arch btw" crowd, maybe check out EndeavourOS (which is built on Arch but lowers the barrier to entry a little) if you're feeling braver than the Ubuntu variants.

i9s are the latest hotness but don’t think the price is worth it

Take a look at the last generation to soften the blow to your wallet. E.g., instead of looking at a 14900K, look at the 13 or even 12 series. In fact, this is a useful strategy all around if you're being price-conscious: go one gen older.

GPU that can support some sort of ML/AI with DisplayPort

Probably going to want to go with a discrete card, rather than just integrated. Other major consideration is going to be nvidia vs AMD, for which you'll need to decide if CUDA should be part of your calculus or not. I'll defer to any data science engineers that might wander through this post.

The rest of your reqs pretty much come as regular stock options when building a pc these days. Though another nicety for my latest builds, is multi-gig nics (though 2.5Gb was my ceiling, since you'll also need the network gear to utilize it). Going multi-gig is nice for pushing around a fuckton of data between machines on my lan (including a NAS).

Very last thing that I've found helpful in my last 3 builds spanning 15 years: I use Newegg for its reviews of items, specifically so I can search for the term "linux" in any given product's reviews. Oftentimes I can glean quick insight into how friendly (or not) hardware has been for others' Linux builds.

And I lied, I just remembered about another linux hardware resource: https://linux-hardware.org/?view=search

You can see other people that have built with given hardware. Just remember to do a scan too once your build is up to pay it forward.

Good luck, and remember to have fun!

I have to agree here. I use PopOS mostly, but most Ubuntu derivatives nowadays beat the living crap out of Ubuntu: PopOS, Zorin, Mint, etc. Like many others, Ubuntu was my gateway to Linux, but I grew out of it in less than a year. Started spinning Mandriva (damn, I'm old), then Debian itself, and I've tried Ubuntu a few times over the years, mostly in VMs now, since I hold no hope that it'll ever go back to what it was.

Used EndeavourOS for a few years too but switched to Fedora Workstation recently. EndeavourOS is still great but I like Fedora more now since it's just easier. A lot of stuff I did manually before, like swapping ext4 for BTRFS, enabling compression and switching to Pipewire, is done by default (also LUKS for full disk encryption, which I was too lazy to set up before), and I can update my system and install most software through GNOME Software without having to use the CLI. It's also very easy to get OpenCL and HIP working; it's just one package each you need to install. Only downside for me is that it's not as easy to install stuff from COPR as it is from the AUR, because you first have to enable the repo for each package you want to install from COPR. I think COPR is more secure tho, especially for someone like me who never looked at the PKGBUILD when installing from the AUR.

And if you want to join the “I use arch btw” crowd...

I may be a linux nerd and pedantic, but not that pedantic. 😅 I've looked into Linux Mint and I'm not opposed to a distro switch. I've been very happy with Ubuntu over the years. My first distro was Slackware, then Fedora. Settled on Ubuntu and haven't turned back.

if CUDA should be part of your calculus or not.

Probably not, if my cursory google search is correct. But happy to be convinced otherwise.

Though another nicety for my latest builds, is multi-gig nics (though 2.5Gb was my ceiling, since you’ll also need the network gear to utilize it)

I've had the benefit of laying my own CAT-5e in my house. Given the distances, CAT-6 was going to cost twice as much for a negligible increase in bandwidth. That said, I'm restricted by the narrowest straw, which is wifi (when streaming media to my phone) and my ISP (which taps out at around 300 Mb/s). My current PC has a 1 Gb/s card and I've only occasionally had issues.

I use newegg for its reviews of items, specifically so I can search for the term “linux” in any given product’s reviews.

Oh that's a good tip!


There is no need for a separate sound card now; it's built in.

Get an Nvidia GPU for AI, period.

Read the manual for the motherboard you want and make sure that the M.2 slot supports NVMe rather than SATA. (Also, learn to tell NVMe from SATA drives.) M.2 slots that are SATA usually share a SATA lane with the SATA connectors, and if you populate the M.2 slot you might lose a connector.

Another thing to read up on is whether populating a given M.2 slot reduces the speed of one of the PCIe slots. Same reason (shared lanes), but with PCIe instead of SATA. These things should be spelled out in the manual next to the M.2 connectors.

NVMe drives in Linux have /dev/nvme* designations not /dev/sd*.
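A tiny sketch of that naming convention, handy when checking which bus each drive came up on after install (note SATA and USB drives both appear as `/dev/sd*`):

```python
# Classify Linux block-device names by their naming convention:
# NVMe drives appear as /dev/nvme*, SATA and USB drives as /dev/sd*.

def drive_bus(dev):
    """Return 'nvme' or 'sata/usb' based on the device name."""
    name = dev.rsplit("/", 1)[-1]   # strip the /dev/ prefix if present
    if name.startswith("nvme"):
        return "nvme"
    if name.startswith("sd"):
        return "sata/usb"
    return "unknown"

print(drive_bus("/dev/nvme0n1"))  # nvme
print(drive_bus("/dev/sda"))      # sata/usb
```

In practice `lsblk` gives you the same overview at a glance.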

For AI/ML workloads, VRAM is king.

As you are starting out, something older with lots of VRAM would be better than something faster with less VRAM for the same price.

The 4060 Ti is a good baseline to compare against, as it has a 16GB variant.

"Minimum" VRAM for ML is around 10GB; the more the better. Less VRAM could be usable, but with sacrifices in speed and quality.
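As a rough sketch of where that ~10GB floor comes from: model weights dominate VRAM use at (parameter count x bytes per parameter), plus overhead for activations and cache. The 20% overhead figure here is an assumption for illustration, not a measured number.

```python
# Back-of-envelope VRAM estimate for running an LLM locally.
# weights ~ params * bytes-per-param; overhead factor is an assumption.

def vram_needed_gb(params_billion, bits_per_param, overhead=1.2):
    weights_gb = params_billion * (bits_per_param / 8)  # 1B params at 8 bits ~ 1 GB
    return round(weights_gb * overhead, 1)

# A 7B model quantized to 4 bits fits comfortably in ~10 GB of VRAM:
print(vram_needed_gb(7, 4))   # 4.2
# The same model at 16 bits would not fit on a 10 GB card:
print(vram_needed_gb(7, 16))  # 16.8
```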

If you're still into that stuff in a couple of months, you could sell the GPU you buy now and swap it for something bigger like a 4090.

For AMD, support is confusing, as there is no official ROCm support for mid-range GPUs on Linux, but some people say it works anyway.

There is also the new ZLUDA project, which enables running CUDA workloads on ROCm:

https://www.xda-developers.com/nvidia-cuda-amd-zluda/

I don't have enough info to recommend AMD cards.

I got ROCm to work on a 7800XT after fixing some Python errors. It was quite unstable though.

AMD has been the gold standard for general user PCs for the last 5+ years. Intel simply cannot compete at the same energy expenditure per unit of performance. At the same or close price/performance, Intel either burns a small thermonuclear power plant's worth of energy to deliver comparable performance, or is simply worse compared to similar Ryzens.

Ryzens are like aliens compared to what AMD used to be before them

So I'd go with them

As for the GPU, if you want to use Linux forget Nvidia

lots of good advice here. I just want to restate: do yourself a favor and migrate your HDDs over to any solid state drive. Whether that means "classic" SSDs with a SATA port or M.2 drives is your prerogative, but in either case you'll start wondering how you could ever stand that spinning noise, the vibrations, and the slow, slow data transfer.

Honestly any parts you buy today probably won't be much good in 30 years.

Honestly any parts you buy today probably won't be much good in 30 years.

At least not when AGP comes to town.

I'd definitely go with an M.2 SSD; you can get 1TB for 50€ and 2TB for 100€ now, and they're much faster, more reliable, and take up way less space.

For ML/AI stuff, you might be just fine using an AMD GPU. AMD GPUs are a lot easier to use on Linux and are also a good bit cheaper. I use Fedora with an AMD GPU and I just installed the packages for OpenCL and HIP and now I can run LLMs on my PC using my GPU. I've also used Stable Diffusion with that GPU on Linux before. If there's something specific you want to do regarding that, I'd look up first if you need an Nvidia GPU for that but from my experience AMD GPUs work just fine.

I'd take a look at AMD CPUs again. Last time I checked they were even cheaper (including mobo price) than Intel even though they're also more efficient (faster and less power draw). Prices might have changed tho. You should probably use a Ryzen 5, a Ryzen 7 will only make sense if you use all cores because game performance is pretty much the same. A Ryzen 3 is more of a budget option tho, I wouldn't use that. If it's in your budget, you should also use the newest generation that uses the AM5 socket because you'll be able to upgrade your CPU without needing a new mobo. I think it also only supports DDR5 RAM, which is more expensive than DDR4. If you use a Ryzen generation that uses the AM4 socket, it's gonna be cheaper but if you want to upgrade you'll need a new mobo with AM5 and new DDR5 RAM in addition to the new CPU.

As for Linux distros, my recommendations are Linux Mint if you want something very easy, EndeavourOS if you want something Arch-based or Fedora if you want something that's not quite as easy as Mint but more up-to-date. I personally use Fedora but I used EndeavourOS before. I detailed why I switched to Fedora in a reply here somewhere.

I for one would not purchase any Intel hardware as long as AMD is around. Not that they're bad or anything, but AMD gives me much more "bang for the buck". To future-proof your rig, I strongly suggest you go with the latest socket (be it Intel or AMD, doesn't matter), make sure you get DDR5 RAM and PCIe Gen 4, and then have at it.

Getting an 80 Plus Gold power supply is always nice too.

And then there's the cooling. I see you went with a radiator and fan, but I strongly suggest getting some type of liquid cooling. The prices are not that bad anymore (unlike about 10 years ago, which was insane).

As for the board, you'll get all kinds of different suggestions. Some people swear by Asus, I'd rather go with Gigabyte (love the Aorus line), so it'll come down to brand trust at the end of the day.

As for the card, I hear a lot of crap given to Nvidia about being closed source, and I sort of agree that's messed up, but ATI cards (while pretty good) are always a step behind Nvidia. Plus, most distros have them working out of the box.

It can be intimidating after so many years, but it's way simpler than it was back then.

Good luck man, you got this, there's nothing to fear but fear itself.

I for one would not purchase any Intel hardware as long as AMD is around. Not that they’re bad or anything, but AMD gives me much more “bang for the buck”.

If you have a processor line in mind, let me know. Happy to give them another look, given my experience with AMD is 30 some years old.

And then there’s the cooling. I see you went with a radiator and fan, but I strongly suggest getting some type of liquid cooling. The prices are not that bad anymore (unlike about 10 years ago, which was insane).

I'm not tied to the cooling solution I picked. I just picked something that looked affordable and did what I wanted. I'd love to do liquid cooling so long as it isn't a pain. I helped my friend back in high school do liquid cooling and it was a proper mess. We came close to shorting his entire rig.

As for the board, you’ll get all kinds of different suggestions. Some people swear by Asus, I’d rather go with Gigabyte (love the Aorus line), so it’ll come down to brand trust at the end of the day.

I have zero brand loyalty here. The boards I'm looking at right now all have embedded wifi with the annoying antenna...I really want bluetooth embedded so it seems like I'll have to have wifi but just not use it.

May I ask why water cooling? It's just louder and more expensive afaik; they just look awesome.

Decent air coolers are cheap, silent and easier to install. When I was overclocking an i5 9600K, temperature was not an issue at all. Is it different with CPUs today?

Simple: the difference in cost is negligible compared to the benefit of keeping the CPU at way lower temperatures, extending its life and better avoiding throttling. And they are not louder than a regular fan with a heatsink, since the fans spin at lower RPMs most of the time; they don't need to ramp up because the CPU is already running cooler. And if you add a high-end GPU, that's way louder and will drown out the noise of any other fan in the rig when it kicks in.

AMD graphics are terrible at any kind of media encoding or decoding. That probably won't affect most people but it can be a problem for self hosting.

I also find that Intel CPUs are much easier to find than AMD when it comes to used hardware.

They are not terrible, they just don't hold a candle to an Nvidia card of the same tier. And the OP is buying new, not used. Not once did I say anything bad about any brand; like the OP, I'm not married to any of them, and only speak from personal experience. I don't make money from them; they make money out of me.

I take it you're talking about Intel? AMD GPUs tend to be a pretty good deal, as there are tons of them used on eBay.

Maybe, I've never bought PC parts on eBay, or used for that matter. Too risky from my perspective. And yes, I'm talking about AMD GPUs. They are very good, but still behind Nvidia in every aspect.

Except for Linux support. Nvidia is awful on Linux compared to Intel (best) and AMD (solid).

Name one example in which an Nvidia card performed equal to or worse than an AMD GPU from the same gen, regardless of the driver. Why do you think that even manufacturers focused on hardware for Linux choose Nvidia over AMD GPUs? Cost? Unlikely, since Nvidia is usually more expensive.

Cost-wise I believe they are close. For instance, according to Tom's Hardware, an RTX 3070 is the same cost and performance as a 6700 XT.

https://www.tomshardware.com/features/nvidia-geforce-rtx-3070-vs-amd-radeon-rx-6700-xt

What you're saying doesn't align with what I've seen and read online. Admittedly I'm not a GPU expert, so maybe I'm just out of touch. Anyway, I wouldn't buy Nvidia because the free software drivers are still being worked on. We are seeing a lot of progress with NVK but it's nowhere near complete.

To be honest with you I mostly use Intel integrated graphics which works very well and can even do some light gaming.

By all means, unless I wanted to play some AAA games, I wouldn't get any dedicated GPU. AMD's and Intel's integrated graphics are more than adequate for most use cases of daily computer use, and work great in any Linux distro. For example, my work PC runs on a Ryzen 7 7735HS with the integrated 680M iGPU. I usually have 3 or 4 workspaces open at once, each running a different browser (Vivaldi, Brave, Librewolf and Mullvad) with anywhere between 4 and 20+ tabs open in each, OnlyOffice on one workspace and LibreOffice Calc on another, a Flatpak of Teams for work, FreeTube, a bunch of different dashboards, and an instance of Sins of a Solar Empire (I play while I work). I have yet to see my CPU go past 20%. Now, for more powerful needs, I have a laptop running an Intel 11th-gen i7 with a 3070 Ti Nvidia card. Great laptop (System76 Gazelle16), but I barely ever use it, since I don't really need that much power anyway. The days of needing a dedicated GPU to avoid CPU lag in daily computing are long gone. Unless you're going to push things to the limit with heavy video rendering, AAA gaming or AI, any integrated GPU will suffice.

Some value software freedom more than performance, and the open source Nouveau Nvidia driver isn't quite there yet on performance.

I absolutely agree with your statement. However, the point of his question is performance, because of his work with AI. I'd rather Nvidia open-sourced their drivers (which has been in the works for a while); I think probably every Linux user wants that to happen already. But that doesn't change the fact that, even if proprietary drivers are needed for the best performance, Nvidia is still ahead of anything else out there. Like you, though, I'm not a fan of having proprietary crap on my devices.


It is not exactly what you asked, but if it is customization and expandability you are looking for, did you consider custom-build sites like cyberpowerpc.com? You can select from a wide variety of components; they build and test it for you, and make sure that everything is compatible and working. You do pay a bit of a premium for the assembly and tests, but it is not that much, and you save your time and have peace of mind that everything works.

Oooh, I had not considered that but thank you for the recommendation. The only thing I don't like about these PCs is that they all have RGB lighting. I really don't need this and I don't get their appeal.

I really don't need this and I don't get their appeal [of RGB].

It costs manufacturers pennies to include but gives them an excuse to mark the price up by dollars, so every single one of them shoves it down all our throats.

The only PC fan manufacturer that has not gone RGB at all is Noctua (premium). Their fans are poop-brown and beige, or black for the consumer line and grey for industrial, but they are great in terms of noise-to-cooling performance. If noise is important, there are videos of people comparing fans so you can pick a tone that is subjectively best.

I enjoyed the days of one-color LEDs. Couldn't beat a Tron blue or a Matrix green.

They have tons of cases, including those with minimal RGB, but yes, those are gaming PCs.

A lot of great suggestions here already. But nobody is mentioning that if you really want to future-proof, you should go fully quantum.

I actually did. And the quantum twin that succeeded is now solving global warming.

I am the twin that didn't succeed.

According to global trends, your twin did not succeed either. Sorry.

I went through the same process myself a couple years ago, first PC build in a while. The biggest shock for me was finding out hard drives (SSD, HDD, etc.) were outdated: it's all about NVMe drives, which look like a stick of RAM and plug directly into the motherboard.

Unless you want a bunch of storage and modularity. The benefit of SATA is that it is much more flexible, and SATA SSDs are cheaper and can be put in a ZFS raid to increase maximum speeds.

I've gone back and forth on whether I need RAID locally. Giving up a third of your storage capacity (assuming a three-drive RAID 5) for the off-chance that your hard drive dies in 3-4 years seems like a high price to pay. I had two drives fail in the lifespan of my current desktop, and I had enough warning from SMART that I could peel off the data before the drives bricked. I know I got lucky, but still...

If you're practicing 3-2-1 backups then you probably don't need to bother with RAID.

I can hear the mechanical keyboards clacking; Hear me out: If you're not committed to a regular backup strategy, RAID can be a good way to protect yourself against a sudden hard drive failure, at which point you can do an "oh shit" backup and reconsider your life choices. RAID does nothing else beyond that. If your data gets corrupted, the wrong bits will happily be synced to the mirror drives. If you get ransomwared, congratulations you now have two copies of your inaccessible encrypted data.

Skip the RAID and set up backups. It can be as simple as an external drive that you plug in once a week and run rsync, or you can pay for a service like backblaze that has a client to handle things, or you can set up a NAS that receives a nightly backup from your PC and then pushes a copy up to something like B2 or S3 glacier.
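The "external drive plus rsync" option is about as simple as it sounds. Here's a sketch; the paths are hypothetical, and the flags are standard rsync options.

```python
# Sketch of the "plug in a drive once a week, run rsync" backup.
# --archive preserves permissions/timestamps; --delete mirrors removals so
# the copy matches the source. Paths below are hypothetical examples.
import subprocess

def backup_cmd(src, dest, dry_run=False):
    cmd = ["rsync", "--archive", "--delete", "--human-readable"]
    if dry_run:
        cmd.append("--dry-run")   # preview what would change first
    cmd += [src, dest]
    return cmd

print(" ".join(backup_cmd("/home/me/", "/mnt/backup/home/", dry_run=True)))
# To actually run it:
# subprocess.run(backup_cmd("/home/me/", "/mnt/backup/home/"), check=True)
```

Doing a `--dry-run` first is a cheap sanity check before `--delete` touches anything.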

The only thing I'll add is that RAID is redundancy. Its purpose is to prevent downtime, not data loss.

If you aren't concerned with downtime, RAID is the wrong solution.
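To put numbers on the capacity trade-off being discussed: usable space for the common RAID levels is easy to compute. A sketch, assuming equal-size drives:

```python
# Usable capacity for common RAID levels, assuming n equal-size drives.

def usable_tb(level, n_drives, drive_tb):
    if level == "raid0":            # striping, no redundancy
        return n_drives * drive_tb
    if level == "raid1":            # mirroring: one drive's worth, total
        return drive_tb
    if level == "raid5":            # one drive's worth of parity
        return (n_drives - 1) * drive_tb
    if level == "raid6":            # two drives' worth of parity
        return (n_drives - 2) * drive_tb
    raise ValueError(f"unknown level: {level}")

# Three 4 TB drives in RAID 5: 8 TB usable out of 12 TB raw,
# i.e. you give up a third -- less with more drives.
print(usable_tb("raid5", 3, 4))  # 8
```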

I know that this is the self-hosted community but I very much agree. The way I run my desktop is that I can, in most cases, lose my primary hard drive and I'll survive. It won't be pretty and I might have a few local repos that I haven't synced in a while but overall, it ain't bad.

Now, that doesn't mean I don't want my primary hard drive restored if I can do it. I've been lucky enough to be able to restore them from the drive. But if I can't, the most I lose is some config files, which I should start to version control but I get lazy.

I can't back up my media. It's just too big. But yar.

My greatest fear is losing my porn collection. 😅 But not enough to RAID.


I've been learning the same. Though, I don't get the sense that SATA is going out of style. I could be wrong though.

You're right, SATA isn't going anywhere for a very long time. If you have a need for 4+ TB of total storage there is nothing at all wrong using HDDs or 2.5" SSDs.

SATA would be used more for secondary storage or for systems set up as network-attached storage. The NVMe M.2 form factor for SSDs is more convenient for users, as it's both smaller and does not require the user to wire two cables to use it.

is that really a concern? 2 cables vs. pushing a card into the mobo?

The other poster said it's about convenience, but that's not really true. The claim to fame for NVMe drives is speed: while SATA SSDs top out around 550 MB/s, the latest NVMe drives can hit 7000+ MB/s.

It's for this reason that you should pay attention to which NVMe drive you choose (if speed is what you're after). SATA-based M.2 drives exist -- and they run at SATA speeds -- so if you see a cheap M.2 drive for sale it's probably SATA and intended for bulk storage on laptops and SFF PCs without room for 2.5" drives. Double check the specs to be sure what you're getting.

I just checked the specs for the M.2 NVMe drive that I pulled from an old laptop. Its read speed is 3000 MB/s, so it looks like I'm good there. Thanks for the heads up though.
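For context, those speed tiers line up with PCIe generations. A quick sketch of the per-lane math (the per-lane figures are the commonly quoted approximate throughputs after encoding overhead):

```python
# Rough PCIe bandwidth math, for sanity-checking drive spec sheets.
# Approximate usable throughput per lane, after encoding overhead:
PER_LANE_MBS = {3: 985, 4: 1969, 5: 3938}   # PCIe gen -> MB/s per lane

def max_throughput_mbs(gen, lanes=4):
    """Theoretical ceiling for an SSD using `lanes` lanes of PCIe `gen`."""
    return PER_LANE_MBS[gen] * lanes

# A 3000 MB/s drive is a typical PCIe 3.0 x4 part (cap ~3940 MB/s);
# 7000+ MB/s drives need PCIe 4.0 x4 (cap ~7876 MB/s).
print(max_throughput_mbs(3))
print(max_throughput_mbs(4))
```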

Double-check the physical size of your NVMe drive. Some laptops use a smaller M.2 form factor than desktops.

This is the reason why most of us have moved to NVMe. The speed, when compared to SATA, is ludicrous. But SATA is not going anywhere any time soon.

It's more about the convenience factor. This also doesn't count the hard-drive mounting mechanism that you'd spend time on as well, which limits case choice. With an M.2 drive, the drive can be installed outside of the case trivially.


You state i5s can't keep up with what you code. What do you code?

I typically code a lot of back-end and processor intensive workloads. The issue I have with i5s is that they don't seem to be as "snappy" as i7s. I've worked with both for good long periods of time. When I had an i5 laptop, I had to off-load a good majority of my development to the cloud because I couldn't do containers and listen to music and run two monitors at the same time. I never had the same issue with i7 processors, even on a laptop.

Seems like you got your answers already, but the pcmasterrace community also exists.

They sadly don't have 3.5" [floppy] drives anymore, and both the ISA and PCI busses are nowhere to be found 😔

I used pcpartpicker for my latest build; it's a good help when assembling and can help you avoid incompatible parts.

Technically 3.5" SSDs are still out there, but they're massive (16-64 TB) and target enterprise use (with a price to match).

And 3.5" is still the standard for platter HDDs, which are still the more economical option if you need large amounts of storage.

Now if you meant no more 3.5" floppy disk drives, then yes, those are definitely gone. ;)

I meant to write 3.5" floppy drives, and yes, the 3.5" and 2.5" form factors are still going strong, even if NVMe will probably reduce the use of 2.5".

Someone might have already mentioned it, but M.2 is just a physical connector. You can have M.2 SATA or M.2 NVMe drives. Prefer NVMe (a modern motherboard should support it, but older ones only do SATA).

Could you buy both an AMD GPU and an Nvidia GPU? You could pass the Nvidia GPU into a VM with VFIO for AI, and then you could daily-drive the AMD card with FOSS drivers. (AMD is in a little less demand.) There is also the option of Intel GPUs, as they should work pretty well under Linux.

I personally would avoid Ubuntu due to snaps, as there are many other options. Do what you feel comfortable with, but you could also go with Linux Mint or Fedora, neither of which has snap.

For AI I'm less experienced but I would use containers as that will make the setup much nicer.

Two GPUs? Is that a thing? How does that work on a desktop? Honestly, if it wasn't for my curiosity into AI, I'd just go with the onboard video though given my need for specific resolutions, I find comfort in having a dedicated card.

I've been using ubuntu exclusively for 10 some years and don't use snap at all. tbh, not even sure what snap is.

If it's not apt, then I don't use it.

You do use it; a bunch of apt packages on Ubuntu (Firefox, for example) automatically install the snap instead.
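Easy to check for yourself, by the way (works on any distro):

```shell
# See whether any snaps have snuck in; on stock Ubuntu, installing some
# packages through apt quietly pulls in the snap version instead
if command -v snap >/dev/null 2>&1; then
  snap list
else
  echo "snapd not installed, no snaps here"
fi
```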

For Nvidia I still think passthrough is the best option, as it isolates the Nvidia issues to a VM instead of the host. If you aren't going to spend a bunch of time on AI, then you can just use the CPU as long as you have enough RAM.

Two GPUs? Is that a thing? How does that work on a desktop?

GPUs these days aren't like your old Voodoo, with its daisy-chained VGA port and one-way, fixed-function graphics pipeline. They can actually send the results of their calculations back to the CPU over the PCIe bus instead of only out to the monitor!

(In all seriousness though, you don't actually need two GPUs.)

Snap is just proprietary packaging by Canonical. Basically the same as Flatpak, but fully controlled by Canonical, store and all. Integrated graphics will give you as much resolution as most GPUs, albeit it won't render at dedicated-GPU speeds. But unless you're actually rendering very heavy video, integrated graphics matched with one or more CUDA TPUs will cover it, and YOU set the limits.

If you don't need a lot of GPU horsepower besides the AI stuff then you could just use the integrated graphics and have a dedicated GPU for the AI stuff.

Having multiple GPUs in your system isn't really that special. Plug HDMI into GPU1 to make GPU1 drive your display/play games. Plug HDMI into GPU2 to have GPU2 do stuff. If you're doing AI work then you don't need anything connected to the GPU, the program just needs to know it's there and to use it.

The only thing to look out for when using the iGPU and a dGPU is that the BIOS doesn't turn off the iGPU when it detects the dGPU. If you have two dGPUs it shouldn't matter, outside of maybe the BIOS wanting to use the first one.
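For what it's worth, you can see every GPU the kernel knows about, iGPU included, with `lspci` (assuming pciutils is installed):

```shell
# Enumerate all display adapters, integrated and discrete
lspci -nn | grep -Ei 'vga|3d|display' || echo "no GPU devices found"
```

If the iGPU disappears from this list after you install the dGPU, that's the BIOS setting to look for.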

Linus Tech Tips recently made a huge PC build guide video that you might benefit from watching.

https://m.youtube.com/watch?v=BL4DCEp7blY&pp=ygUbbGludXMgdGVjaCB0aXBzIGJ1aWxkIGd1aWRl

AM5 sockets are now BGA like Intel. AM4 was the last PGA socket, so bent pins on the chip are a thing of the past. Make sure to leave the socket cover in place while installing the CPU. Now the fear is bending a pin on the MoBo.

I think you mean LGA (Land Grid Array), meaning the pins are on the motherboard. Ball Grid Array (BGA) is used for embedded, non-removable CPUs.

I did. This should teach me not to try and cook dinner and post at the same time. I'm NOT that good at multitasking...

Btw, there are fast and beautiful M.2/NVMe to USB cases, for your SSD.

Asrock's DeskMeet or DeskSlim series could fit your bill in a small form factor.

The responsiveness difference between a hard drive and an SSD is night and day. NVMe is even faster, but not noticeably so unless you move a hell of a lot of data around. Motherboards with at least one M.2 NVMe slot are common, so installing the OS on one is an option. Hard drives give you more storage per dollar, but unless space is a significant factor I suggest using SSDs (also quieter than a spinning disk!). More info on storage formats in this video
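If you want to put a rough number on the difference, a crude sequential-write test with `dd` works (just a sketch; a tool like `fio` gives far better data, and the filename here is arbitrary):

```shell
# Write ~256 MB with fdatasync so the page cache doesn't flatter the result;
# run it from a directory on the drive you want to measure
dd if=/dev/zero of=ddtest.bin bs=1M count=256 conv=fdatasync 2>&1 | tail -n 1
rm -f ddtest.bin
```

The last line of output includes a MB/s figure; expect it to differ by an order of magnitude or more between a spinning disk and an NVMe drive.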

Recent generations of motherboards use DDR5 RAM, which was very expensive at release. I think the price has come down, but I'm not up to date on this generation. You may be able to save money building a DDR4 system, but you'll be stuck on a less-supported platform.

AMD had like ~10 years of bad/power-hungry processors while Intel stagnated, re-releasing 4-core processors over and over. AMD made a big comeback with their Ryzen series becoming the best bang for the buck, then even overtaking Intel. I think it's pretty even now.

If you don't intend to game or do certain compute workloads, then you can avoid buying a GPU. Integrated GPUs have come quite far (still low-end compared to a dedicated GPU). Crypto mining, Covid, and now AI have made the GPU market expensive and boring. Nvidia has more higher-end cards, mid-range is way more expensive for both, and the low end sucks ass. On Linux, AMD GPU drivers come with the OS, but for Nvidia you have to get their proprietary drivers (Linux gaming has come a long way).

DDR5 has gone down dramatically compared to launch. You can get 64 GB with a very fast bus for under 200 dollars now; at launch, 32 GB would easily set you back 250+. AMD has made a killing with Ryzen. Never mind the new naming convention that Intel came up with, making it even more complicated to choose the right CPU for your use case. Ridiculous.

As for Nvidia GPU drivers, at the end of the day they just work, regardless of their proprietary-driver philosophy (which, again, I agree sucks). But if the OP is going to be doing AI development, machine learning and all that cool stuff, he'd be better served by getting a few CUDA TPUs. You can get those anywhere from 25 dollars to less than 100, and they come in all types (USB, PCI, M.2): https://coral.ai/products/#prototyping-products

I have one USB Coral running the AI on my Frigate docker for 16 cameras, and my CPU never reaches more than 12% while the TPU itself barely touches 20% utilization. You put 2 of those bad boys together, and the CPU would probably not even move from idle 🤣
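In case it helps anyone, pointing Frigate at the USB Coral is only a few lines of config (a sketch of the relevant fragment; adapt it to your own config file):

```yaml
# Frigate config fragment: use the USB Coral as the object detector
detectors:
  coral:
    type: edgetpu
    device: usb
```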

Hold on a second, how come every time I look for TPUs I get a bunch of not-for-sale Nvidia and Google cards, but this just exists out there and I never heard of it?

I found out about those only about 6 months ago, and it was by chance while going over the UnRaid forum for Frigate, so I decided to do some research. It took me almost 4 months to finally get my paws on one. They were seriously scarce back then, but have been available for a couple of months now; I only finally got mine at the end of November. They seem to follow an availability trend similar to Raspberry Pis.

I was really hoping to cannibalize the 32 GBs of DDR3 RAM but I couldn't find a MoBo that supports it anymore. Then I saw DDR5 is the latest!

I don't really do any gaming. If I wasn't going to tinker with AI, I'd just need a card for dual DisplayPort output. I can support HDMI but...I prefer DP

The 4070 TI will give you quite a few years out of it for sure. You could also completely forego the GPU and get a couple of CUDAs for a fraction of the cost. Just use the integrated graphics and you're golden.

Are CUDAs something that I can select within pcpartpicker? Or is this like a cloud thing?

I misspoke, and I apologize. I could not recall the term TPU, so I just went with the name of the protocol (CUDA). Nvidia has various devices that use CUDA (the K80, for example). TPUs (Tensor Processing Units) are coprocessors designed to run some GPU-intensive tasks without the expense of an actual GPU. They are not a one-to-one replacement, as they perform calculations in completely different ways.

I believe you would be well served by researching a bit and then making an informed decision on what to get (TPU, GPU or both).

Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I've seen in this thread:

| Fewer Letters | More Letters |
|---|---|
| NAS | Network-Attached Storage |
| NVMe | Non-Volatile Memory Express interface for mass storage |
| PCIe | Peripheral Component Interconnect Express |
| PSU | Power Supply Unit |
| RAID | Redundant Array of Independent Disks for mass storage |
| SATA | Serial AT Attachment interface for mass storage |
| SSD | Solid State Drive mass storage |
| ZFS | Solaris/Linux filesystem focusing on data integrity |

[Thread #529 for this sub, first seen 20th Feb 2024, 23:55] [FAQ] [Full list] [Contact] [Source code]

I went through the comments briefly and didn't see anything about cases. I highly recommend spending some money on a good case with good cable-management slots. It may not seem important, but it will make life so much easier. Fractal cases are good budget-friendly ones. I usually prefer large, bulky cases (I like ATX) to ensure good ventilation and ease of assembly. Having plenty of space to move around also helps a lot while cleaning.

Another thing: you will see a lot of articles about positive- versus negative-pressure fan arrangements. TBH, I really don't think that matters a lot. Just regular cleaning with a cheap rocket air blower will do, and more than 4 fans aren't really needed; the gain is negligible. But make sure you have a good dedicated fan or water cooler for the CPU. I recommend Noctua for air cooling.

Oh, regarding backing up your data: make sure you plug in your SSD once in a while to avoid charge depletion. Nowadays the claim is that SSDs have better retention and you don't need to keep them active, but I'm not really sure.

The thing about case pressure actually does matter a lot for dust management. With positive pressure, air only flows in through the filtered intake and flows out through the exhaust and whatever openings or gaps it can find, so the case builds up far less dust; dust can only get in by making it past the filter.