What was wrong with them? They served their purpose just fine for many years
They weighed a ton, they were limited in size, their resolution was terrible, they sucked down electricity...
Their screens were curved the wrong way until manufacturers released flat-screen TVs
A 4:3 aspect ratio meant you lost some of the content from movies, or you watched them with black bars
Except movies keep changing, so now if you want IMAX at home you need 4:3.
Whatever isn't available at home is what movies will switch to in order to keep themselves unique.
Widescreen has been the movie industry standard for how many decades now? IMAX is its own beast, but most movies aren't shot in true IMAX anyway, and digital IMAX is basically 1.90:1, which is close to a standard 16:9 TV...
Movies used to be roughly 4:3 before TV; it's called the Academy ratio. Movies now do 1.85:1 and even 2.39:1. A few even do anamorphic 2.76:1. Anything but the dominant home format.
Major movie studios have mostly used widescreen since the 1950s, and every ratio you mentioned except 4:3 is better watched on a widescreen TV than on a 4:3 TV.
A 4:3 aspect ratio also means that a lot of good shows will never be watchable in the proper 16:9 format
No, it means 4:3 IS the proper format
We had four channels and loved it!
And most people were lucky to have a TV. You were lucky to have a HOUSE! We used to live in one room, all hundred and twenty-six of us, no furniture. Half the floor was missing; we were all huddled together in one corner for fear of FALLING!
Have you compared NES games on a CRT with the same games on a modern screen?
CRTs just look miles better.
EDIT: OK, it's ackchually not technically "resolution" per se, I get it. :p
That's because the graphics were tailored to CRT resolution - which is to say, [things that just so happened to have] low/outright bad resolution.
CRTs have advantages over more modern stuff but that's mostly about latency.
It's not so much about resolution as about exploiting the quirks of CRTs. Artists usually "squished" sprites horizontally (because CRT screens would stretch them) and used the now-famous "half dot" technique to get more subtle shading than was actually possible at the pixel level. So if you just display the original sprites with no stretch and no "bleed" between pixels, they don't look as good as they should.
I had never heard of this. Fascinating!
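If anyone wants to see roughly what that looks like, here's a quick Python sketch of the idea using Pillow (just a sketch: "sprite.png" is a placeholder filename, and the 8:7 stretch and the amount of horizontal blur are ballpark guesses, not exact console or CRT numbers):

    # rough approximation of the CRT-era look of a raw sprite dump:
    # chunky upscale -> horizontal stretch -> a little horizontal "bleed"
    from PIL import Image   # pip install pillow

    SCALE = 4              # integer upscale so the effect is visible
    PIXEL_ASPECT = 8 / 7   # NES-ish pixel aspect ratio; treat as a ballpark

    src = Image.open("sprite.png").convert("RGB")   # placeholder input file

    # 1) hard-edged integer upscale (what a naive modern display does)
    big = src.resize((src.width * SCALE, src.height * SCALE), Image.NEAREST)

    # 2) stretch horizontally, the way a 4:3 CRT effectively did
    wide = big.resize((round(big.width * PIXEL_ASPECT), big.height), Image.BILINEAR)

    # 3) cheap stand-in for phosphor bleed: squeeze and re-expand along X only,
    #    which smears adjacent "half dots" into each other
    bled = wide.resize((wide.width // 2, wide.height), Image.BILINEAR)
    bled = bled.resize((wide.width, wide.height), Image.BILINEAR)

    bled.save("sprite_crt_ish.png")

Not remotely accurate to real NTSC output, but it gets across why the raw pixel data alone doesn't tell the whole story.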
That’s because the graphics were tailored to CRT resolution - which is to say, low/outright bad resolution.
No, it's because the graphics were tailored to the analog characteristics of CRTs: things like having scanlines instead of pixels and bleed between phosphors. If they were only tailored to low resolution they'd look good on a low resolution LCD, but they don't.
I admit I'm quibbling, but the whole thread is that, so...
As a pedant, that is impressive work. Fair enough.
Thanks, but @Sylvartas' reply (which I didn't read until after writing mine) did a much better job, TBH.
CRTs themselves don't have a concept of resolution.
Maybe ones before the '80s didn't, but they almost all do after that. Exactly how this worked varied between manufacturers and the display type, but they tended to have some kind of mask that pushes it toward a preferred resolution.
http://filthypants.blogspot.com/2020/02/crt-shader-masks.html
CRTs don't have pixels, so the resolution of the signal isn't that important. It's about the inherent softness you get from the technology. It's better than any anti-aliasing we have today.
CRTs do have pixels. If they didn't, you could run an SVGA signal (800x600 at 60 Hz) directly into any CRT. If you tried this, it would likely damage the tube beyond repair.
The exact mechanism varied between manufacturers and types: http://filthypants.blogspot.com/2020/02/crt-shader-masks.html
I certainly saw aliasing problems on CRTs, though usually on computer monitors that had higher resolution and better connection standards. The image being inherently "soft" is related to limited resolution and shitty connections. SCART with RGB connections will bring out all the jagginess. The exact same display running on composite will soften it and make it go away, but at the cost of a lot of other things looking like shit.
CRTs do have pixels. If they didn’t, you could run an SVGA signal (800x600 at 60 Hz) directly into any CRT. If you tried this, it would likely damage the tube beyond repair.
Would it, though? I'm skeptical.
If it did, it wouldn't be because they have "pixels," though; it would be because overdriving the deflection yoke with higher-frequency signals would generate too much heat for the TV to handle.
Otherwise (if it didn't overheat), it should "work." The result might look weird if the modulation of the signal didn't line up with the apertures in the shadow mask right, but I don't see any reason why sweeping the beam across faster would damage the phosphors. (Also, I'm not convinced a black & white TV would have any problem at all.)
It will tend to turn the beam on when it's off to the side, outside the normal range of the screen. X Windows users in the mid 90s had to put in their exact scanline information or else the screen could blow up. That went away with a combination of multiscan monitors and monitors being able to communicate their preferred settings, but those came pretty late in the CRT era.
Edit: in any case, color screens need to have at least bands of red/green/blue phosphor. At a minimum, there will be breaks along either the horizontal or vertical lines, if not both.
When you say "blow up" do you mean the tube would literally explode, it would burn through phosphors, a circuit board would let the magic smoke out, or something else?
I remember configuring mode lines in X. Luckily, I never found out the hard way what happened if you got it wrong.
Literally blow up the tube in the worst cases.
Interesting, TIL!
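For anyone curious about those modelines: the "exact scanline information" boils down to simple arithmetic. The pixel clock and the total line/frame sizes (including blanking) fix the horizontal and vertical scan rates the monitor physically has to keep up with. A quick Python sketch, using the standard VESA 800x600@60 timing as the example numbers:

    # scan rates implied by an X modeline
    # Modeline "800x600@60"  40.00  800 840 968 1056  600 601 605 628  (VESA standard timing)
    pixel_clock_hz = 40_000_000   # 40.00 MHz dot clock
    h_total = 1056                # pixels per scanline, including blanking
    v_total = 628                 # scanlines per frame, including blanking

    h_freq = pixel_clock_hz / h_total   # ~37.88 kHz horizontal scan rate
    v_freq = h_freq / v_total           # ~60.3 Hz vertical refresh

    print(f"hsync ~ {h_freq / 1000:.2f} kHz, vsync ~ {v_freq:.1f} Hz")

Feed a fixed-frequency CRT timings outside the range its deflection circuitry was built for and you get the horror stories above.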
CRTs were jagged and blurry. A misconverged pixel isn't good anti-aliasing.
https://en.m.wikipedia.org/wiki/Missile_Command#/media/File%3AA5200_Missile_Command.png
That image is a digital rendering of the raw data, not a photo of how a CRT would render it.
CRTs were nothing if not the opposite of jagged.
I grew up on the 2600 on a TV in the '70s. Computer graphics on CRTs were incredibly jagged. If you used a magnifying glass on a pixel, it was a blurred, misconverged spot because it didn't hit the shadow mask exactly on target.
Look at that rope: https://www.deviantart.com/gameuniverso/art/Review-of-Pitfall-Atari-5200-761326088
"Blurred" is the opposite of "jagged," though.
The jaggedness of the 2600 wasn't because the TV itself was jagged; it was because the 2600 was so low-resolution (160x192, maximum) that it had to be upscaled -- naively, with no antialiasing! -- even just to get to NTSC (480 scanlines, give or take).
So yeah, when each "pixel" is three scanlines tall, of course it's going to look jagged even after the CRT blurs it!
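To make that concrete, here's a tiny numpy sketch of that kind of naive nearest-neighbor blow-up (the repeat factors and the fake framebuffer are just for illustration, not the 2600's actual timing):

    # naive nearest-neighbor upscale: every source pixel is just repeated,
    # so every edge stays a staircase -- no antialiasing anywhere
    import numpy as np

    frame = np.zeros((192, 160), dtype=np.uint8)   # stand-in 160x192 framebuffer
    for i in range(120):                           # draw a rough diagonal line
        frame[30 + i, 20 + i // 2] = 255

    # repeat each pixel 3x vertically and 4x horizontally (illustrative factors)
    upscaled = np.repeat(np.repeat(frame, 3, axis=0), 4, axis=1)

    print(frame.shape, "->", upscaled.shape)   # (192, 160) -> (576, 640)

The diagonal is already a staircase at 160x192; repeating the pixels just makes each step taller and wider, and the CRT's blur only rounds off the corners of steps that were already there.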
A high-resolution LCD with anti-aliasing will do a better job than a low-resolution CRT. CRT shadow masks defined the limits of the pixels, and it wasn't good even on computers that could output higher resolutions than the 2600.
CRT filters exist now, and with HDR output (or just sending an HDR-enable signal to get TVs to use the full brightness range) and 4K displays it's honestly just as good at this point. Or better, because the only good CRTs you can get now are the pretty small PVMs/BVMs, and my TV is much bigger than those.
There are plenty of upscalers with minimal latency that fix that.
There also isn't just "CRT" in this space. Professional video monitors give a very different picture than a consumer TV with only the RF converter input.
If one more under-25 retro fan tells me that RF tuners are the "true experience", I'm going to drink myself to death with Malort.
Edit: please don't tell me you believe CRTs have zero latency. Because that's wrong, too.
Compare a PS5 on a modern-day large-screen 4K TV vs a CRT of your favorite brand from any year.
If your only use case is playing old consoles, there are filters for current emulators that fill that need adequately.
They make a high-pitched whine
[This comment has been deleted by an automated system]
Use better power supply
Is there a better power supply than cable + wall socket?
Better than the one soldered into the main circuit board of the TV?
Yes! Hire an electrical engineer to improve it for you, you pleb! \s
If you got your TV from a crapufacturer, then yes.
Sony Trinitrons had a whine to them, and those were basically the top consumer display back then. I think my JVC PVM has power-supply whine.
Are you serious?
Curved (the wrong way)
Massively heavy
Noise (just from the unit itself)
Very low resolution
Noticeably hot (might be a benefit in the winter)
Small picture, especially relative to weight
Depending on how far back you go, no/shitty remote and only one video port
[This comment has been deleted by an automated system]
Sometimes I think about how some technologies could have evolved if they hadn't fallen out of fashion. I always thought it's a bit unfair to compare products made decades ago with new ones and treat that as a verdict on the whole technology.
In the case of CRTs, it would be totally possible to make them with modern aspect ratios and resolutions. The greatest challenges would probably be size, weight, and power consumption.
Very low resolution
For TVs, that's just because they didn't need any more: the signal they were displaying was 480i (or even worse, in the case of things like really old computers/video game consoles).
My circa-2000 19" CRT computer monitor, on the other hand, could do a resolution that's still higher than what most similarly-sized desktop flat screen monitors can manage (it was either QXGA [2048x1536] or QSXGA [2560x2048], I forget which).
And then, of course, there were specialized CRT displays like oscilloscopes and vector displays that drew arbitrary strokes with the electron beam instead of scanning a raster, and therefore had effectively infinite "resolution."
Point is, the low resolution was not an inherent limitation of CRT technology.
They were great until you had to move them. They were clunkier than a sofa because they had nowhere to grip, and they weighed as much as a refrigerator.
They did break, you know? My father fixed those things; it's just that they were actually fixable back then, and that was cool.
Or maybe it was just Russian tech that broke; we lived in one of those USSR satellite countries.