Screens keep getting faster. Can you even tell? | CES saw the launch of several 360Hz and even 480Hz OLED monitors. Are manufacturers stuck in a questionable spec war, or are we one day going to wonder how we ever put up with 'only' 240Hz displays?

L4sBot@lemmy.world (mod) to Technology@lemmy.world – 124 points
Screens keep getting faster. Can you even tell?
theverge.com



On one hand, 360hz seems imperceptibly faster than 240hz for human eyes.

On the other hand, if you get enough frames in, you don't have to worry about simulating motion blur.

I never worry about motion blur, because I turn it off. The stupidest effect ever. If I walk around I don't see motion blur. Cameras see motion blur because of shutter speed, not the human eye.

Umm, well, there is something like motion blur experienced by humans. In fact, your brain creates the time-bending effect based on picture 1 and picture 2:

https://www.abc.net.au/science/articles/2012/12/05/3647276.htm

There is a trick where you watch a clock that counts seconds, quickly turn your head away and then back (or something like that), and you will see that the rate of the seconds seems inconsistent.

See "1. CHRONOSTASIS" https://bigthink.com/neuropsych/time-illusions/

Alright. I didn't know, thanks. Though the human motion blur is vastly different to camera blur in my experience. And games that have motion blur look really unnatural.

More realistic blur smudges things based on how the object is moving rather than how the camera is moving. For example, Doom Eternal applies some blur to the spinning barrels and the ejected shells on the chaingun while it's firing, but doesn't blur the world while you're sprinting.

Yup this is called per-object motion blur and is more common in modern games. I'm still not that big of a fan but I've heard good things about it from other high framerate enjoyers

On the other hand, humans don't see in defined frames. The signals aren't synchronized. So a big part of perceived blurring is that the succession of signals isn't forming a single focused image. There isn't really a picture 1 and 2 for your brain to process discretely. And different regions in your vision are more sensitive to small changes than others.

A faster refresh rate is always "better" for the human eye, but you'll need higher and higher panel brightness to have a measurable reaction time difference.

But hitting really high refresh rates requires too many other compromises on image quality, so I won't personally be paying a large premium for anything more than a 120hz display for the time being.

Motion blur in games gives me bad motion sickness and garbles what I'm seeing. I already have a hard enough time processing information fast enough in any kind of fast-paced game; I don't need things to be visually ambiguous on top of that.

Wave your hand in front of your face and tell me it's not blurry

That also depends on the person. Save for really fast-moving things, I can barely tell the difference between 30 and 60fps, and I cap out at 75 before I can't notice a difference in any situation. For one of my friends, anything less than 75 gives them headaches from the choppiness.

Yeah, personally playing games at 30fps feels disruptively laggy at least for the first few minutes. 60 is good, but the jump to 120 is night and day. I was shocked that going from 120 to 240 was just as noticeable an improvement as the last to me, especially when so many people say they don't notice it much. Hard to find newer games that give me that much fps though.

I'd much rather they invest efforts into supporting customisable phones. Instead of just releasing a few flavours of the same hardware each year, give us a dozen features we can opt into or not. Pick a base size, then pick your specs. Want a headphone jack, SD card, FM radio, upgraded graphics performance? No problems, that'll cost a bit extra. Phones are boring now - at least find a way to meet the needs of all consumers.

Not exactly what you are talking about, but slightly related: the company Fairphone makes phones with parts that can easily be replaced. The philosophy is that you will not have to buy a new phone every 3 years. They do have some customization options as well (i.e. RAM, storage, models), but it's limited.

But going fully customizable with phones, laptops and tablets, similar to a desktop, is just incredibly hard due to the lack of space in the device for the components. As such it makes more sense to offer a wide variety of models with some customizable options, and then have the user pick something.

On Fairphone, they flat out refuse to even discuss adding a headphone jack (check the posts in their forums - it's a "hands over ears" no) so I'm sticking with Sony/ASUS (the latter atm as they've been slightly less anticompetitive recently but I'd much rather go to a decent company) until they do... It's not like you notice a phone being 1mm thicker when you have a 3mm case on it anyway

Damned asus. Refusing to have a microsd card slot on a single rog phone, ever.

It seems we all have our dealbreakers. I love the Zenfone but the lack of software updates is a hard no from me.

Their answer is buying the USB-C to 3.5mm adapter. If you keep that connector in your bag, or connected to your headphones, you should be fine most of the time. Unless you would like to charge and listen to audio at the same time.

To me, that feels like a solid design choice, but yes we all have our dealbreakers.

Solid in the same way the designers heads are solid bone I guess...

A 3.5mm adapter is not an answer, as it causes wear on the USB-C port in ways it's not designed for (whereas a 3.5mm port is, since the connector is circular, so the cable rotates and breaks before the port does), and it's hard to get a good DAC that isolates the power noise when using a combined charging/listening adapter that's also that small.

My problem with Fairphone is that they use old hardware.

I never replaced any parts on my old phone, and only replaced the phone with a new one because it was getting really slow. I replaced the XR with an iPhone 15.

So my concern with the Fairphone is that I'll replace it with faster hardware more frequently than I would have replaced a non-repairable phone that's faster.

I wish a company would build 4.5"-5.5" and 5.5"-6.5" flagship phones, put as many features that make sense in each.

Then when you release a new flagship the last flagship devices become your 'mid range' and you drop the price accordingly, with your mid range dropping to budget the year after.

When Nokia had 15 different phones out at a time it made sense because they would be wildly different (size, shape, button layout, etc...).

These days everyone wants as large a screen as possible on a device that is comfortable to hold, we really don't need 15 different models with slightly different screen ratios.

These days everyone wants as large a screen as possible

iPhone mini gang rise up!

Here's the summary for the wikipedia article you mentioned in your comment:

Project Ara was a modular smartphone project under development by Google. The project was originally headed by the Advanced Technology and Projects team within Motorola Mobility while it was a Google subsidiary. Google retained the ATAP group when selling Motorola Mobility to Lenovo, and it was placed under the stewardship of the Android development staff; Ara was later split off as an independent operation. Google stated that Project Ara was being designed to be utilized by "6 billion people": 1 billion current smartphone users, and 5 billion feature phone users.

Under its original design, Project Ara was intended to consist of hardware modules providing common smartphone parts, such as processors, displays, batteries, and cameras, as well as modules providing more specialized components, and "frames" that these modules were to be attached to. This design would allow a device to be upgraded over time with new capabilities and upgraded without requiring the purchase of an entire new device, providing a longer lifecycle for the device and potentially reducing electronic waste. However, by 2016, the concept had been revised, resulting in a base phone with non-upgradable core components, and modules providing supplemental features.

Google planned to launch a new developer version of Ara in the fourth quarter of 2016, with a target bill of materials cost of $50 for a basic phone, leading into a planned consumer launch in 2017. However, on September 2, 2016, Reuters reported that two non-disclosed sources leaked that Alphabet's manufacture of frames had been canceled, with possible future licensing to third parties. Later that day, Google confirmed that Project Ara had been shelved.

^to^ ^opt^ ^out^^,^ ^pm^ ^me^ ^'optout'.^ ^article^ ^|^ ^about^

You're talking about Project Ara. Google bought it out and killed it roughly ten years ago.


Well no, because most people aren't getting them. It's nice, but it's difficult to justify spending hundreds on a slightly better screen.

This tech trickles down to mainstream in a few years. That's always how it is.

Cool, so in a few years we'll have a screen which isn't better in any noticeable way?

Don't be so negative, imagine a phone screen at 480 Hz. It'll be great for when you have too much charge left in your battery and need to drain some.

Well, it's more like you'll get the usable parts without a huge premium. There was a time when monitors faster than 60hz were premium, but now it's pretty common to see 120hz and beyond on even basic monitors.

There's still diminishing returns as you go higher, but there's definitely a noticeable difference between 60hz and 120hz, as well as a less noticeable but still noticeable difference between 120hz and 240hz. 240hz is becoming more standard now on regular high-end monitors and beginning to trickle down too.

Beyond that, in terms of response times, you might not notice a difference between 240hz and 360hz, but image clarity will be better because you'll get less ghosting just by virtue of the pixels changing so quickly, so it's not entirely useless.

Part of the reason you're seeing this is because they can. The panel technology (OLED in this case) is super fast due to its design, so it's not too costly to add the necessary hardware to drive those speeds. For LCD tech, you have to drive the panels faster and harder; that's why older screens required shitty TN panels to get those refresh rates, but everything else has been around for a while.

Reminiscent of the hi-res audio marketing. Why listen at a measly 24bit 48khz when you can have 32/192?!

These have an actual perceivable difference, even if subtle. Hi-res audio, however, is inaudible to humans.

I tend to agree, but the audiophiles always have an answer to rebut it with.

I'm into audio and headphones, but since I've never been able to reliably discern a difference with hi-res audio, I no longer let it concern me.

I've bought pretty expensive equipment, tube amplifier, many fancy headphones, optical DACs. A library full of FLAC files. I even purchased a $500 portable DAP. I've never been able to reliably tell a difference between FLAC and 320k MP3 files. At this point, it really doesn't concern me anymore either, but I at least like to see my fancy tube amp light up.

I will say, though, $300 seems to be the sweet-spot for headphones for me.

I’ve never been able to reliably tell a difference between FLAC and 320k MP3 files

I just keep FLAC around so I can transcode them to new lossy formats as they improve. And so I can transcode aggressively for my mobile when I'm streaming from home, and don't need full transparency.

Blackmail -- Evon. That's the one song where I ever heard a difference, though that was ogg, dunno what bitrate I used back then but it was sufficient for everything else. Listening on youtube yep that's mushy. The noisy goodness that kicks in at 0:30, it's crisp as fuck on CD.

...just not the kind of thing those codecs are optimised for I'd say. Also it still sounds fine, just a bit disappointing if you ever heard the uncompressed thing. Which is also why you should never try electrostatic headphones.

Imo the biggest bump is from mp3 to lossless. The drums sound more organic on flacs whereas on most mp3s they sound like a computer MIDI sound.

The biggest bump for me was the change in headphones. It made my really old aac 256kbps music sound bad.

320kbps cbr and v0 vbr mp3 are audibly transparent. Most likely, 250kbps and v2 are too.

Tried flac vs 192 vorbis with various headphones. E.g. moondrop starfield, fiio fa1, grado sr80x...

Can't tell a difference. Kept using vorbis.

They have tests you can take to see if you can hear the difference. A lot of people fail! Lol

Usually percussion is where it's easiest to notice the difference. But typically people prefer the relatively more compressed sound!

I'd thought I could hear a difference in hires audio, but after reading up on it I'm starting to think it may have been some issue with the tech I was using, whether it be my headphones or something else, that made compressed audio sound veeeery slightly staticky when high notes or loud parts of the track played.
Personally though, even if it wasn't, the price for the equipment wasn't worth it for a difference that was only perceptible if I was listening for it. Not to mention it's near impossible to find hires tracks from most bands. Most claiming to be hires are just converted low res tracks and thus have no actual difference in sound quality, the only difference being the file is way larger for no good reason.

So when are 144hz, 1440p, hdr oleds going to come down in price?

I splurged on a 4k 144hz monitor when I worked constant night shifts in covid times and I don't think I will ever need something else.

What is the idea behind 144? It seems too particular a number to be arbitrary. 24, 60 and 120 seem to be based on other techs and related media.

I found people online saying it's because it's 24 frames (the standard film frame rate) higher than 120, meaning it can be used to watch movies using integer scaling (a 1:6 ratio of frame rate rather than 1:5.5 or something strange). Take that with a massive grain of salt though, because lots of people say there are other reasons.

If consuming media with integer scaling is the main concern, then 120Hz would be better than 144Hz, because it can be divided by 5 to make 24Hz (for movies) and divided by 2 or 4 to make 30/60Hz (for TV shows).

144Hz only cleanly divides into 24Hz by dividing it by 6. In order to get to 60Hz you need to divide by 2.4, which is not an integer.

And with either refresh rate, 25/50Hz PAL content still doesn't divide in by a nice round integer value.
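The divisibility argument above is easy to check with a quick script (`clean_multiples` is a hypothetical helper written for this thread, not anything official):

```python
# Which common content framerates divide evenly into a given
# refresh rate? An even division means every source frame is shown
# for the same whole number of refresh cycles (no judder).
def clean_multiples(refresh_hz, content_rates=(24, 25, 30, 50, 60)):
    return [fps for fps in content_rates if refresh_hz % fps == 0]

print(clean_multiples(120))  # [24, 30, 60]
print(clean_multiples(144))  # [24]
```

Which matches the point made above: 120Hz handles film and NTSC-style TV content cleanly, while 144Hz only handles 24fps film.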

Yeah, as I said, take what I said with a massive grain of salt; some people are saying it's because of a limit on HDMI bandwidth, so it could be that.

Oh man those maths didn't click with me, of course it's just another 24 frames.

I think it's because 120hz + overclock can get to 144, so someone probably started selling factory-OC'd 120hz screens at 144hz and then it caught on. Then someone did the same to native 144hz and we got 165hz. I'm more curious about why 165 was chosen; it's not a nice number like 144. Maybe since VRR is widespread now they didn't need nice numbers.

All I want is a 27/28 inch oled 4k monitor with good hdr. I don't care about the refresh rate as long a it's 60Hz+

Minimum for me would be 120hz. I've been using 120hz since 2012 (12 years... man) and anything less feels like a massive step backwards. My old S10+ and my cheapie laptop feel sluggish in any animation/transition scenario.

I'm sticking out with IPS until MicroLED matures enough for me to afford.

OLED was never designed to be used as a computer monitor and I don't want a monitor that only lasts a couple years.

Researchers just designed a special two-layer OLED (thicker than current OLED) that doubles the lifespan to 10,000 hours at 50% brightness without degrading.

I'm totally with you on good HDR though. When it works, it's as night-and-day as 60 -> 144hz felt for me.

Why would oled only last 2 years?

It doesn't only last for two years; however, the blue subpixels begin to degrade after a year of illumination, which reduces the color accuracy.

However, OLEDs are also very bad at color accuracy across their brightness range. Typically at lower brightness their accuracy goes out the window.

This isn't as bad on smartphones (smartphones also apply additional mitigations such as subpixel rotation), but desktop computers typically display static images for much longer and do not use these mitigations, afaik.

I don't need or want a phone over 90hz, or a PC screen over 180hz. A phone over that is a waste of battery, and a PC screen over that is a waste of money.

Then don't buy them? With better screens coming out the ones you do want to buy get cheaper.

Back in the day 144hz screens cost a premium, now you can have them for cheap.

I stopped buying tvs from 2000 until like two years ago, when i saw them on sale for like $200. Been living off of projectors & a home server. I skipped so many "innovations" like curve, flat, HD, 4K, trueColor.

Weird that it has an OS; that was a shocker.

I look forward to what TVs bring in 2040.

I mean OLEDs are damn amazing image quality wise, but I'm also not a fan of "smart" TVs. The apps can be useful (like native Netflix, Amazon video and so on), but 90% of the time I use my PC over HDMI.

I use my chromecast dongle for my smart TV. My smart TV will never get to have an internet connection.

I didn't say I wanted them disallowed from being made. Just that it's dumb to buy them.

I think there's an argument to make screens faster. Graphics have hit a point where resolution isn't going to give anything of substance... It's now more about making lighting work "right" with ray tracing... I think the next thing might be making things as fluid as possible.

So at least in the gaming space, these higher refresh rates make sense. There's still fluidity that we as humans can notice that we're not yet getting. e.g. if you shake your mouse like crazy, even on a 144hz the mouse will jump around to different spots it's not a fluid motion (I've never seen a 180hz but I bet the same applies).

You can see it moving a mouse super quick on a static background, but I never notice it happening in games. There's probably something there a touch noticeable in some fps online games if you really paid attention and could lock your max fps at 120fps with a 240hz monitor, but that would be about it, and I don't competitively play fps games. I'm perfectly happy with running 60fps at 120hz for myself.

It won't matter until we hit 600. 600 integer scales to every common media framerate so frametimings are always perfect. Really they should be focusing on better and cheaper variable refresh rate but that's harder to market.
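The "600" figure above checks out as the least common multiple of the usual integer content framerates, which a one-liner can verify (a quick sanity check, not from the thread):

```python
from math import lcm

# 600 is the smallest refresh rate that every common integer
# content framerate (film, PAL, NTSC-ish) divides evenly into.
common_rates = (24, 25, 30, 50, 60)
print(lcm(*common_rates))  # 600
```

Note this only covers the integer rates; fractional broadcast rates like 59.94fps are a separate problem.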

Well, not really, because television broadcast standards do not specify integer framerates. E.g. North America uses ~59.94fps. It would take insanely high refresh rates to be able to play all common video formats including TV broadcasts. Variable refresh rate can fix this, but only for a single fullscreen app.

I mean the 240 I use already does that. So would 360 or 480. No clue why you fixate on 600.

How can screens be fast when our eyes are slow?

Maybe this is what Jaden Smith meant when he famously stated:

How Can Mirrors Be Real If Our Eyes Aren't Real

Wow, still blown away...

The bigger the screen, the more you notice, because it covers more of your field of view. I would say 240Hz is the sweet spot. You can definitely feel the improvement from lower rates, but rates above it start to be barely noticeable. However, I am fine with 144-165Hz if I want to save money and still get a great experience. Below 120Hz is unusable for me. Once you go high refresh, you cannot go back, ever. 60Hz feels like a slideshow. For gaming 60 is fine, but for work use and scrolling around I can't have 60. Yes people, high refresh rate is useful even outside of gaming.

Funny thing is, while gaming, even if my monitor and PC can do it, I rarely let my fps go above 120-140. I limit them in the game. The PC gets much quieter, uses less power, heats up less, and it's smooth enough to enjoy great gameplay. I will never understand people who get a 4090 and play with unlocked fps just to get 2000 fps in Minecraft while their PC is screaming for air. Limit your fps at least to your Hz, people; have some care for your hardware. I know you get less input lag, but you are not Shroud; that extra 0.000001ms of input lag will not make a difference.

I went from 1080p60 as my standard for literal decades to 3440x1440 @144hz over the last 2 years and I can't go back, mostly for non-gaming activities, find the ultrawide better than multi monitor for me, would love a vertical e-ink display though for text. I also limit my fps to 120, I don't like feeling like my PC is going to take off and the place I rent is older so the room I use for my office is smaller, heats up quickly.

Minecraft actually gets better FPS when you don't limit its FPS. When I play at 60 fps, it usually dips into the 50's and 40's, while when it's unlimited, there are no noticeable dips.

I get less motion sickness with higher refresh rates. But anything above 120hz makes no tangible difference. I’m more interested in OLED and color accuracy.

For some, but it's just that it's diminishing returns as you go up. 60 going to 120 is about as visually distinct as going 120 to 240, and then 240 to 480.
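One way to quantify the diminishing returns mentioned above is in absolute frame times; each refresh-rate doubling halves the per-frame savings (a quick illustration, numbers not from the thread):

```python
# Absolute frame-time improvement for each refresh-rate doubling.
# 1000 / Hz gives the duration of one frame in milliseconds.
for low, high in [(60, 120), (120, 240), (240, 480)]:
    delta_ms = 1000 / low - 1000 / high
    print(f"{low} -> {high} Hz: frames get {delta_ms:.2f} ms shorter")
```

So 60 -> 120 shaves about 8.3ms per frame, 120 -> 240 about 4.2ms, and 240 -> 480 only about 2.1ms, even though each step doubles the rate.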

I have had 120hz, 144hz, and 165hz monitors in my house, and although the 120-144 jump was pretty much undetectable to me, the 120-165 jump was smoother, but not like when I first moved to 120.

I'm colorblind though, so color accuracy doesn't always do much for me until my wife points out people's skin looking weird if I've switched to a color palette that's easier to track for me.

It's complicated. Certain slow and continuously moving objects would be perceived as moving more smoothly even at a fast 500hz, but due to the nature of displays showing discrete frames, certain other types of motion would show no improvement. For me, 144hz looks the same as 240hz for most games, but not for others.

This is the best summary I could come up with:


After all, it wouldn’t be the first time manufacturers have battled over specs with debatable benefit to customers, whether that’s the “megahertz myth” or megapixel wars of the ‘00s or, more recently, smartphone display resolution.

You can read an in-depth breakdown of the reasoning in this post in which they argue that we’ll have to go beyond 1000Hz refresh rates before screens can reduce flicker and motion blur to a level approaching the real world.

Higher refresh rate monitors might be smoother, with better visual clarity and lower input latency for gamers — but at what point does it stop making sense to pay the price premium they carry, or prioritize them over other features like brightness?

All of this also assumes that you’ve got the hardware to play games at these kinds of frame rates, and that you’re not tempted to sacrifice them in the name of turning on some visual eye candy.

But even as a person who’s been enjoying watching the monitor spec arms race from afar, I’m not looking to imminently replace my 100Hz ultrawide LCD, which I’ve been using daily for over half a decade.

Or, to use an even sillier example, it’s like drinking bad coffee after a pandemic spent obsessing over brewing the perfect cup at home.


The original article contains 1,095 words, the summary contains 214 words. Saved 80%. I'm a bot and I'm open source!

I still use a 75hz desktop and 60hz on my Laptop. I can’t tell the difference.

Even looking at the iPhone 15 pro next to a 15 looks the same to me.

To be fair, 60Hz to 75Hz is barely noticeable, even by those who are used to noticing these things. If you can't tell the difference, it's understandable.

Then, the bigger the screen, the more noticeable a high refresh rate is, because it covers more of your field of vision. So on a small iPhone screen it's not easy for everyone to notice (and then there is the fact that iPhones rarely go up to 120Hz anyway, which is an absolute mess by itself, but that's another topic, so your 15 Pro probably rarely makes a noticeable Hz jump anyway).

However, if you get a 144Hz or above, 24" and above monitor, you will IMMEDIATELY see the difference against your 60Hz monitor. Even moving the mouse feels more responsive and accurate. Makes targeting stuff with your mouse easier. Reading text while scrolling is possible also. It unlocks a new world, it's not only for gaming. Casual and work use also benefit a lot from high refresh rates.

When it comes to my phone, I can tell the difference between 90 and 60 Hz. I prefer 90 Hz but I would be okay with 60. Ideally I'd like the highest refresh rate I can get inside my budget when I have to buy a new device with a screen. One annoying limitation is that all the video is played in 24-60 Hz. I notice it when things move with any kind of speed or when I'm looking at background object and the camera moves.

All things considered, it's a minor annoyance. My quality of life wouldn't change if I were limited to 30 Hz.

Now I need to find a 144hz monitor. I’m not gaming outside of my steamdeck and don’t do online, so I don’t care about that. But I’m always working with text.

I can’t even be bothered with 4k, never mind this. I guess least it gives them something to do…

I just want low key-input-to-display latency in games, and I'll be happy with as low as 40 fps. Unless it's VR; then I also need way more stuff than I can afford (140+hz, lenses and FOV something something, panel tech something also, etc.) in order to not get sick.

My screen is still the same 144hz. How do you make it go faster?

Honestly, as funny as it is, I've been able to notice differences between 120 and 144 hz displays pretty consistently.

What a dumb premise.

No, the content isn't there on the media side, and most folks don't have 3080-class hardware to get to 120, much less higher, at any decent resolution and fidelity.