deadbeef

@deadbeef@lemmy.nz
0 Posts – 44 Comments
Joined 1 year ago

Not sure a short summary will cut it.

They had no competition for a long period and ended up with an accountant CEO that caused their R&D to stagnate massively. They struggled and failed to deliver in most areas, and they wombled about releasing CPU generations with ~4% performance uplifts, probably saving a few bucks in the process.

AMD turned back up again with Ryzen and Epyc models that were pretty good, with an impressive pace of improvement ( like ~14% generational uplifts ), which gave Intel such a fright that they figured out they had to ditch the accountant.

Pat Gelsinger was asked to step up as CEO and fix that mess. They axed some obviously defective folks in their structure and rushed about to release 12th generation products with decent gains by cranking the power levels of the CPUs to absurd levels. That was risky, and it kind of looks like they are being bitten by it now.

Server CPU sales are way down because they are just plain uncompetitive. They have missed out on the chunk of money they could have got from the AI bubble because they never had a good GPU architecture they could leverage. They have been shutting down unprofitable and troublesome divisions like the Optane storage and NUC divisions to try and save money, but they are in a bad way.

The class actions mentioned elsewhere in the thread are probably coming because the rush to make incremental improvements to 13th and 14th generation CPUs resulted in issues with power levels and other problems that seem to be causing those CPUs to crash and sometimes fail altogether.

I got a pretty nice Yamaha bluray player that was an appropriate match to my home theatre amp.

Put a bluray in it, got a piracy warning, a few unskippable ads for other movies, an obnoxious excessively drawn out animated menu screen that stuttered like hell and was laggy to use.

Pulled the bluray back out of it, stuck it back in the DVD drawer and proceeded to download a copy of the movie to watch. Been doing that ever since.

1988 Nissan Skyline GT with an RB20DET.

It was abandoned by my uncle at our place when he moved overseas and subsequently my sister drove it around a bit. Eventually it leaked coolant from the water pump, overheated and blew a head gasket because she wasn't paying attention.

I was unemployed and bored and I decided to pull it apart and bought all the bits to fix it. I didn't really know anything about mechanical stuff at the time, but I am good at logic and try not to be useless at practical stuff even though I'm really a computer geek. I drove it around for a bunch of years after that until I was earning enough money that I could buy something I wanted, which was a Mitsubishi EVO 1.

So to answer the question, favorite thing was that I rescued it from oblivion even though I didn't know much about cars or engines at the time.

The situation is mostly reversed on Linux. Nvidia has fewer features, more bugs and stuff that plain won't work at all. Even onboard intel graphics is going to be less buggy than a pretty expensive Nvidia card.

I mention that because language model work is pretty niche and so is Linux ( maybe similar sized niches? ).

I build ISP and private data networks for a living.

A contention ratio for residential circuits of 3 to 1 isn't bad at all. You'd have to get pretty unlucky with your neighbors being raging pirates to be able to tell that was contended at all. Any data cap should scare the worst of the pirates away, so you probably won't be in that situation.

If you can feel the circuit getting worse at different times of the day then the effective contention ( taking into account further upstream ) is probably more like 30 to 1 than 3 to 1.
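
As a rough worked example with made-up numbers, the worst case share of a 1000 Mbit/s upstream when every subscriber on the segment is flat out at the same time looks like this:

# Hypothetical 1000 Mbit/s shared upstream, every subscriber saturating at once
awk 'BEGIN { up=1000; print "3:1 worst case: ", up/3, "Mbit/s each"; print "30:1 worst case:", up/30, "Mbit/s each" }'

In practice you almost never see the worst case on a 3 to 1 segment, which is why it is basically invisible to the users on it.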

I got banned from Reddit while I was on a work trip to the USA. Hadn't posted in over a year at the time. I'm absolutely mystified as to why.

They refer you to the content policy but they won't tell you which post might violate it. I asked a few times what it was that caused the ban, but they either just referred me to the content policy or, once, said "Repeated violations". I actually requested my user data so I could stick up my complete post history publicly and see if anyone else could figure it out. My username was / is sirdeadbeef on reddit.

I haven't put any work into displaying them nicely; this is the format you get them in if you request your data:

https://db.osoal.org.nz/reddit/

It isn't something that is in the distro vendors' control. Nvidia do not disclose programming info for their chipsets. They distribute an unreliable proprietary driver that is obfuscated to hell so that no one can help out fixing their problems.

If you use an AMD card it will probably work fine in Windows and Linux. If you use an Nvidia card you are choosing to run windows or have a bad time in Linux.

Please drink a verification can to continue.

A couple of seagulls made their nest in the cooling vent for the radiator of one of our backup generators. I caught it on our security cameras and mentioned it to management which resulted in folks being dispatched to evict them and clean up the giant pile of sticks and other junk they had dragged in.

Not sure what would have happened next time the thing started, so it was probably for the best. I still felt bad.

The Samsung TV that I bought for my son has this annoying overlay thing that pops up when you turn it on, showing all the different inputs and nagging about various things it thinks are wrong with the world. It is plugged into an Nvidia Shield that we do most things on, but you can't use the Shield until the overlay calms the fuck down and disappears.

It'd be great if you could just have the thing turn on and display an input like our older TVs do.

I learned what a tankie is, which is fun.

I've been commenting a bit, whereas on reddit I would only post a comment a few times a year when I could be bothered dealing with the likely burst of negativity that would come as a response to it.

Kind of feels a bit more like Web 1.9 or so from about 2003, which I think was about the sweet spot for minimal rage bait and craziness while still having a decent bit of user interaction and scale.

It would be about perfect if you could chop out a few of the folks trying to shoehorn in politics to every little thing.

The support for larger numbers of monitors and mixed resolutions and odd layouts in KDE vastly improved in the Ubuntu 23.04 release. I wouldn't install anything other than the latest LTS release for a server ( and generally a desktop ), but KDE was so much better that it was worth running something newer with the short term support on my desktops.

We aren't too far off the next LTS that will include that work anyway I guess. I'm probably going to be making the move to debian rather than trying that one out though.

The most impressed I've been with hardware encoding and decoding is with the built in graphics on my little NUC.

I'm using a NUC10i5FNH which was only barely able to transcode one vaguely decent bitrate stream in software. It looked like passing the hardware transcoding through to a VM was too messy for me so I decided to reinstall linux straight on the hardware.

The hardware encoding and decoding performance was absolutely amazing. I must have opened up about 20 jellyfin windows that were transcoding before I gave up trying and called it good enough. I only really need about 4 maximum.

The graphics on the 10th generation NUCs is the same sort of thing that is on the 9th gen and 10th gen desktop CPUs, so if you have an Intel CPU with onboard graphics give it a try.

It's way less trouble than the last time I built a similar setup with NVidia. I haven't tried a Radeon card yet, but the jellyfin docs are a bit more negative about AMD.
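
If you want to sanity check that the transcodes are actually landing on the iGPU rather than falling back to software, something along these lines works with the Intel VAAPI stack ( the device path, input file and bitrate are just example values ):

# List the codecs the iGPU can encode / decode via VAAPI
vainfo --display drm --device /dev/dri/renderD128

# One-off hardware transcode test with ffmpeg using the h264_vaapi encoder
ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 -hwaccel_output_format vaapi \
  -i input.mkv -c:v h264_vaapi -b:v 6M -c:a copy output.mkv

You can also keep intel_gpu_top open while Jellyfin is transcoding and watch the video engine load climb.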

Last week I did an install of Debian 12 on a little NUC7CJYH to use for web browsing and ssh sessions into work and ended up with wayland by default. Seems to work great.

From what I have experienced, it goes great with intel integrated graphics, great with a radeon card and can be made to work with Nvidia if you are lucky or up for a fight.
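
If you're ever unsure which one a given install ended up with, it's a one-liner to check from a terminal:

# Prints "wayland" or "x11" for the current desktop session
echo $XDG_SESSION_TYPE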

I have two AMD Radeon cards for Linux that I'm pretty happy with that replaced a couple of Nvidia cards. They are an RX6800 and an RX6700XT. They were both ex mining cards that I bought when the miners were dumping their ethereum rigs, so they were pretty cheap.

If I had to buy a new card to fill that gap, I'd probably get a 7800XT, but if you don't game on them you could get a much lower end model like an RX7600.

Haha, 144p @ 60hz is fricking hilarious.

Reminds me of seeing completely rubbish resolution real player videos embedded in websites back in the late 90s and me thinking, "Well that isn't ever going to take off".

This video is perfectly applicable, the rot that sets in in a large company when you have no competition to counteract it is exactly what has happened here.

It might help for the folks here to know which brand and model of SSDs you have, what sort of SATA controllers the SATA ones are plugged into and what sort of CPU and motherboard the NVMe one is connected to.

What I can say is Ubuntu 22.04 doesn't have some mystery problem with SSDs. I work in a place where we have in the order of 100 Ubuntu 22.04 installs running with SSDs, all either older intel ones or newer samsung ones. They go great.
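
If it's handy, this is roughly the info that would help, all available from a shell ( the device names are just examples, and smartctl needs the smartmontools package ):

# Drive models, sizes and transports
lsblk -o NAME,MODEL,SIZE,TRAN

# Which SATA / NVMe controllers the drives hang off
lspci | grep -iE 'sata|ahci|nvme'

# Identity details per drive
smartctl -i /dev/sda
smartctl -i /dev/nvme0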

Sorry to hear about that mess.

I posted here https://lemmy.nz/comment/1784981 a while back about what I went through with the Nvidia driver on Linux.

From what I can tell, people who think Linux works fine on Nvidia probably only have one monitor or maybe two that happen to be the same model ( with unique EDID serials FWIW ). My experience with a whole bunch of mixed monitors / refresh rates was absolutely awful.

If you happen to give it another go, get yourself an AMD card, perhaps you can carry on using the Nvidia card for the language modelling, just don't plug your monitors into it.

My day job is building ISP networks. It's been about 20 years since I had a home connection that I didn't configure up both ends of myself.

I've got a 1G / 500M tail into home where I am right now, not that that is particularly impressive. One of the jobs I've been putting off at work is standardising our usage of the 10G GPON platform available here in NZ; when I do that I'll get one of the >1G tails to use at home.

Usually the answer is how ever much I can be bothered building, but my usage is pretty low.

I had a Brother black and white laser (I think an HL1240?) for almost 10 years, and then we started having to print a ton of education related stuff for our kid and colour made sense, so I got the closest thing that I could to the colour one that I use at work, which ended up being a DCPL3551CDW.

Printing a little in Windows and Linux, but more often from apps on my Android phone and my partner's iPhone.

I absolutely hate printers, but they have been fine.

This damnable prison of log and ice eats away at my fibre. I find the lack of culture astonishing.

Steam can do pretty well filling a tail circuit, probably better on average. But a torrent of a large file with a ton of peers, when your client has a port forwarded back to it, absolutely puts more pressure on a tail circuit. More flows make the shaping work harder.

Sometimes we see an outlier in our reporting and it's not obvious if a customer has a torrent or a DDoS directed at them for the first few minutes.

I've been using Linux for something like 27 years, I wouldn't say evangelical or particularly obsessed.

I started using it because some of the guys showing up to my late 90's LAN parties were dual booting Slackware and it had cool looking boot up messages compared to DOS or Windows at the time. The whole idea of dual booting operating systems was pretty damn wild to me at the time too.

After a while it became obvious to me that Slackware '96 was way more reliable than DOS or Windows 95 at the time; a web browser like Netscape could take out the whole system pretty easily on Windows, but when Netscape crashed on Linux, you opened up a shell, killed off whatever was left of it and started a new one.

I had machines that stayed up for years in the late 90's and that was pretty well impossible on Windows.

I've been running Linux for 100% of my productive work since about 1995. Used to compile every kernel release and run it for the hell of it from about 1998 until something like 2002, and worked for a company that sold and supported Linux servers as firewalls and file servers etc.

I had used ET4000s, S3 968s and Trio64s, the original i740, Matrox G400s with dual CRT monitors and tons of different Nvidia GPUs throughout the years and hadn't had a whole lot of trouble.

The Nvidia Linux driver made me despair for desktop Linux for the last few years. Not enough to actually run anything different, but it did seem like things were on a downward slide.

I had weird flashing of sections of other windows when dragging a window around. Individual screens that would just start flashing sometimes. Chunky slideshow window dragging when playing video on another screen. Screens re-arranging themselves in baffling orientations after the machine came back from the screen being locked. I had crap with the animation rate running at 60hz on three 170hz monitors because I also had a TV connected to display network graphs ( that update once a minute ).

I must have re-set up the panels on Cinnamon, or later on KDE, a hundred times because they would move to another monitor, sometimes underneath a different one, or just disappear altogether when I unlocked the screen. My desktop environment at home would sometimes just freeze up if the screen was DPMS blanked for more than a couple of hours, requiring me to log in from another machine and restart X. I had two different 6GB 1060s and a 1080 Ti in different machines that would all have different combinations of these issues.

I fixed maybe half of the issues that I had. Loaded custom EDID on specific monitors to avoid KDE swapping them around, did wacky stuff with environment variables to change the sync behaviour, used a totally different machine ( a little NUC ) to drive the graphs on the TV on the wall.
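
For anyone wanting to try the EDID workaround, the rough shape of it ( the connector and file names here are made up for illustration ) is to drop a known-good EDID binary under /lib/firmware/edid/ and point the kernel at it:

# Copy the captured EDID for the flaky monitor into the firmware path
cp dell-u2720q.bin /lib/firmware/edid/dell-u2720q.bin

# Then add this to the kernel command line so that connector always reports that EDID
# drm.edid_firmware=DP-1:edid/dell-u2720q.bin

The kernel needs CONFIG_DRM_LOAD_EDID_FIRMWARE enabled for that parameter to do anything.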

Because I had been bitten pretty hard by the Radeon driver being a piece of trash back in something like 2012, I had the dated opinion that the proprietary Nvidia driver was better than the Radeon driver. It wasn't until I saw multiple other folks adamant that the current amdgpu driver is pretty good that I bought some ex-mining AMD cards to try them out on my desktop machines. I found out that most of the bugs that were driving me nuts were just Nvidia bugs rather than xorg or any other Linux component. KDE also did a bunch of awesome work on multi monitor support which meant I could stop all the hackery with custom EDIDs.

A little after that I built a whole new work desktop PC with an AMD GPU ( and CPU FWIW ). It has been great. I'm down from about 15 annoying bugs to none that I can think of offhand running KDE. It all feels pretty fluid and tight now without any real work from a fresh install.

I have an A1502 Macbook that I have been using for work since it was new in 2014. It triple boots Windows, Linux and OSX, but I only really use Linux.

Mine has the same CPU, an i5-4308U, but 16GB of memory. I think it was a custom order at the time.

If I recall I did the regular bootcamp process you would do to install Windows, installed Windows on a subset of the free space and Linux on the rest.

I've got Linux Mint 21 on it currently, but I have had vanilla Ubuntu at different times. I can't think of anything on it that doesn't just work offhand.

I swear allegiance to the only one true storage vendor, Micropolis. The Micropolis 1323A being the embodiment of perfection in storage, basking in the glow of the only holy storage interconnect, MFM.

I wait patiently for the return of Micropolis so that I may serve as their humble servant.

Your bridge isn't bridging properly. If Router B is sending a destination unreachable then the packets are being handled on it further up the stack at layer 3 by some sort of routing component rather than by a layer 2 bridging one.
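
If it helps narrow it down, this is the sort of thing I'd check on Router B ( br0 and the interface names are just placeholders ):

# Which interfaces are actually enslaved to the bridge and whether they are forwarding
bridge link show

# Whether the bridge interface itself has addresses / routes hanging off it,
# which is what lets traffic get picked up at layer 3 instead of just being bridged
ip addr show dev br0
ip route show

# Watch whether the destination unreachables are being generated locally
tcpdump -ni br0 icmp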

If you haven't already, try running hdparm on your drive to get an idea of whether the drives are at least doing large raw reads straight off the disk at an appropriate performance level.

This is output from the little NUC I'm using right now:

# lsblk
NAME   MAJ:MIN RM   SIZE RO TYPE MOUNTPOINTS
sda      8:0    0 465.8G  0 disk 
├─sda1   8:1    0   512M  0 part /boot/efi
├─sda2   8:2    0 464.3G  0 part /
└─sda3   8:3    0   976M  0 part [SWAP]

# hdparm -i /dev/sda

/dev/sda:

 Model=Samsung SSD 860 EVO 500GB, FwRev=RVT02B6Q, SerialNo=S3YANB0KB24583B
...

# hdparm -t /dev/sda

/dev/sda:
 Timing buffered disk reads: 1526 MB in  3.00 seconds = 508.21 MB/sec

If your results are really poor for this test then it points more at the drive / cable / controller / linux controller driver.

If the results are okay, then the issue is probably something more like a logical partitioning / filesystem driver issue.

I'm not sure what a good benchmark application for Linux that also tests the filesystem layer would be, other than bonnie++, which has been around forever. Someone else might have a more current idea of something to use for this.
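
fio might be worth a look too, since it can exercise the filesystem layer; a minimal run looks something like this ( the target directory, sizes and runtimes are just example values ):

# Sequential 1M reads through the filesystem, direct I/O to avoid the page cache
fio --name=seqread --directory=/mnt/test --rw=read --bs=1M --size=1G --direct=1 --group_reporting

# Random 4k reads, which tend to show up controller / driver problems sooner
fio --name=randread --directory=/mnt/test --rw=randread --bs=4k --size=1G --direct=1 --runtime=60 --time_based --group_reporting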

We did an address check when we could first order it, about a year and a half ago, and about a third of the folks in the office could get it. I know the majority of the address checks that we do for commercial locations in tenders come up positive now.

It is not cheap to get an off the shelf router that does a solid job of forwarding multiple gigabits and the vast majority of folks ( me included ) probably will rarely notice the difference outside of speed tests. The last firewall build that I did for home was with a pair of virtual Linux boxes with 10G interfaces just so I could do a 2G or 4G GPON upgrade later on without having to throw everything out.

In New Zealand it seems like 10G GPON services are mostly cannibalizing high quality lit ethernet services at 1G and 10G subrate rather than replacing consumer tails. So more likely a business is going from spending $1500 a month on uncontended 1G to spending $400 a month on contended 4G, rather than a residential user going from spending $150 on 1000/500 to $280 on 2000/2000.

I started using reddit during the digg migration. I lurked, replied to stuff and tried to upvote sanity in technical threads occasionally.

A year or so ago during a work trip to another continent I found that my account had been banned for "violation of the content policy". I worked their process to try and figure out why, but the replies were totally vague and either bot-like or possibly written by someone with English as a second language.

It turned out that at the point I was banned I hadn't actually posted anything in over a year, so I really didn't have anything to go off. It is still a total mystery to me. I created a new account ( which I know they could consider ban evasion ) so I could copy my subscribed subreddits over and I was just lurking for the last year or so until the noise from their API changes pointed me at all the current alternatives. So here I am checking out the alternatives.

Which workflows? Asking because I'd like to experiment with some edge case stuff.

I'm running KDE with wayland on multiple different vintage machines with AMD and intel graphics and it would take a lot for me to go back to the depressing old mess that was X.

The biggest improvement in recent times was absolutely pulling out all my Nvidia cards and putting in second hand Radeon cards, but switching to wayland fixed all the dumb interactions between VRR ( and HDR ) capable monitors of mixed refresh rates.

Even the little NUC that drives the three 4k TVs for the security cameras at work is a little happier with wayland, running for weeks now with hardware decoding, rather than X crashing pretty well every few days.

I just read the update to the post saying that the issue has been narrowed down to the NTFS driver. I haven't used NTFS on linux since the NTFS fuse driver was brand new and still wonky as hell something like 15 years ago, so I don't know much about it.

However, it sounds like the in kernel driver was still pretty fresh in 5.15, so doing as you have suggested and trying out a 6.5 kernel instead is a pretty good call.
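
If it's useful, a quick way to see which driver a given mount is actually using ( the mount point is just an example ):

# FSTYPE shows "ntfs3" for the new in-kernel driver, "fuseblk" for the old ntfs-3g FUSE driver
findmnt /mnt/windows

# Or check whether the ntfs3 module is loaded at all
lsmod | grep ntfs3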

If you go back a bit further, multi monitor support was just fine. Our office in about 2002 was full of folks running dual ( 19 inch tube! ) monitors running off Matrox G400s with Xinerama on Red Hat 6.2 ( might have been 7.0 ). I can't recall that being much trouble at all.

There were even a bunch of good years of the proprietary Nvidia drivers; the poor quality is something that I've only really noticed in the last three or so years.

Appreciate the reply. Which desktop environment are you using?

My only experience with Wayland is also with KDE. Whereas for the 27-ish years before that I've used all sorts of stuff with X.

I've scripted the machine that drives the frontend for our video surveillance system to place windows exactly where I want them when it comes up.

I use a couple of dbus triggers that make the TV on the wall in my garage go to sleep from the shell, perhaps not tested via ssh though. They were pretty well the functional equivalent of some xset dpms commands that I used to use. Not sure if that is what you were meaning. I think I also had something working that disabled the output altogether. I think that was pretty clunky as it used some sort of screen ID that would occasionally change. Sorry I'm hazy on the details, I'm old.
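
If I remember right, newer Plasma releases also expose the same thing through kscreen-doctor, which might be less fragile than poking at dbus directly ( I haven't verified it over ssh, so treat this as a guess ):

# Blank / wake all outputs via KDE's screen management
kscreen-doctor --dpms off
kscreen-doctor --dpms on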

I'll try it all out when I get home, I've got to find some old serial crap for a coworker in the garage anyway.

Oh yeah. That video of Linus Torvalds giving Nvidia the finger linked elsewhere in this thread was the result of a ton of frustration around them hiding programming info. They also popularised a dodgy system of LGPL'ing a shim which acted as the licensing go-between for the kernel driver API ( drivers are supposed to be GPL'd ) and their proprietary obfuscated code.

Despite that, I'm not really that anti them as a company. For me, the pragmatic reality is that spending a few hundred bucks on a Radeon is so much better than wasting hours performing arcane acts of fault finding and trial and error.

The context of this post is Linux on AMD cards, is there any support at all for raytracing or upscaling of any sort on Linux on either AMD or Nvidia? Serious question.

Agreed, it seems like they should have put just a little bit more in the standard feature set so every little window manager doesn't have to reinvent the wheel.

Often on Linux group membership changes only take effect on login. So you could try logging out of your session and logging back in after your group changes to test that theory out.
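
A quick way to check, and to pick up a new group without a full logout ( the group name here is just an example ):

# What the running session actually has vs what the group database says you should have
id -nG
groups $USER

# Start a subshell with the new group active, handy for testing before logging out
newgrp render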

I replaced mythtv with tvheadend on the backend and kodi on the frontend like 5 or 6 years ago.

The setup and configuration at the time on mythtv was slanted towards old ( obsolete ) analog tuners and static setup, whereas tvheadend was like a breath of fresh air in comparison: you could point it at a DVB mux or two and it would mostly do what you wanted without having to fight it.

I'm not sure how much longer I'll want something that can tune DVB-S2 and DVB-T though. Jellyfin and friends handle everything other than legacy TV better than kodi these days.