Compression artifacts will exist as long as we use lossy video compression. You can, however, eliminate artifacts visible to the eye with a high enough bitrate, but that has always been the case.
Considering the ungodly size of raw video or even video with lossless compression, we will need lossy compression for the next century.
It will only change if bandwidth and storage become practically free, which would require some unforeseen breakthrough in technology.
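To put a number on "ungodly size," here's a back-of-the-envelope sketch; the resolution, frame rate, and bit depth are assumed for illustration, not taken from the thread:

```python
# Rough data rate for uncompressed video (assumed: 4K, 10-bit 4:2:0, 60 fps).
width, height, fps = 3840, 2160, 60
bits_per_pixel = 10 * 1.5  # 10-bit samples with 4:2:0 chroma subsampling
raw_bps = width * height * fps * bits_per_pixel

print(f"raw stream:  {raw_bps / 1e9:.1f} Gbit/s")          # ~7.5 Gbit/s
print(f"one hour:    {raw_bps * 3600 / 8 / 1e12:.1f} TB")  # ~3.4 TB
# A typical 4K streaming encode runs ~15-25 Mbit/s: hundreds of times smaller.
```

Even lossless compression only buys roughly a 2x reduction, so the gap lossy codecs have to close stays enormous.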
It's so rare to see someone speak the voice of reason on future technology matters. People think we'll be able to do anything, when there are physical limits to just how far we can advance current technology.
Even if we invented something new, you still have to deal with the size restrictions of atoms. Silicon has an atomic radius of 1.46 Å, gold 1.35 Å, and the most advanced manufacturing process in development is 2 nm, or 20 Å, although that number doesn't mean much, since the actual measurements (metal pitch) are closer to 20 nm. There are experiments dating back about a decade in which someone created a transistor out of a single phosphorus atom. We're a lot closer to the end than we might realize.
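For a sense of scale, here's simple arithmetic on the numbers quoted above, comparing the silicon atomic radius with a ~20 nm metal pitch:

```python
# How many silicon atom widths span a "2 nm"-node metal pitch of ~20 nm?
si_radius_A = 1.46       # silicon atomic radius in angstroms (quoted above)
metal_pitch_A = 20 * 10  # 20 nm expressed in angstroms
atoms_across = metal_pitch_A / (2 * si_radius_A)
print(f"~{atoms_across:.0f} atom widths per pitch")  # ~68
```

A few dozen atoms is not much headroom before physics, not engineering, becomes the hard limit.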
As you mentioned, chip lithography is hitting a wall. They're not going to be able to make things much smaller with current technology. People have been working on new ways to build these mechanisms, researching ways to do things at the quantum level; that's some seriously sci-fi tech.
Yes, compression will always be required. The raw video they work with in production takes up enormous amounts of drive space. As things become less restricted, it just means we'll be able to use less compression and higher bitrates. Though you can use fairly liberal bitrates with local storage already. I have zero issues with local videos, but I can run into compression artifacts when streaming from sites with less-than-optimal bitrates.
Compression, yes, but lossy compression will eventually die, even for video.
I already stopped downloading MP3s. Storage space is cheap and abundant enough now that I replaced almost my entire music collection with FLACs.
The technology for film photography (which was later used for video) existed for a century before digital was popular ¯\_(ツ)_/¯
There's also '90s VHS degradation, and a lot of plug-ins for video editors to choose from.
People are already doing that
> artists
Deep-fried memers are artists now?
People definitely already add some of the tracking and other artifacts from VHS to videos. It's decently common on YouTube
We do something similar over at !mavica@normalcity.life, but with photos. Of course, we're using old floppy disk cameras, so the compression, aberration, and CCD weirdness is indeed authentic.
Holy shit I haven't heard that word in a very long time. My gifted teacher in elementary had one. We used it to take pictures at the dam. I remember bringing a few spare floppies for it. Good times!
Although I'm not so sure whether the intentional ones are compression artifacts or corrupted-digital-video artifacts. The video itself is low quality, with unintentional compression artifacts.
Or the new Ratchet and Clank that just came out on PC? There's literally a weapon called the "Pixel Gun" that turns enemies into Doom sprites.
They could.
But why wouldn't they just actually compress the video?
In digital/streaming video, there is no physical tape which produces grain, so there is no choice but to add it in post.
But if you're serving up digital video, then you can simply serve up authentic compression/bitrate/resolution.
I suppose maybe players in the future won't support the codecs we use today?
The client might even have a local AI that extrapolates away visible artifacts from old compressed videos.
How would the AI distinguish between accidental artifacts due to compression, and intentionally introduced artifacts?
It won't
They might do that, in a dynamic fashion. However, if it's being used artistically, they'll likely want a particular effect. Having greater control over the artifacts might be useful: e.g. lots of artifacts, but none happening to affect the face of the actor while they are actually speaking.
They might also want only one type of artifact but not another, e.g. blocking, but not black compression.
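That kind of per-region control is easy to sketch. Here's a hypothetical pure-Python illustration (not any real tool's API): fake blocking artifacts by flattening 8x8 blocks of a grayscale frame, while skipping a protected box, say, the actor's face:

```python
def selective_blocking(frame, block=8, protect=None):
    """Simulate blocking artifacts by averaging each block of a 2D
    grayscale frame, skipping blocks that overlap the protected
    (x0, y0, x1, y1) box."""
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            if protect:
                x0, y0, x1, y1 = protect
                # Leave blocks overlapping the protected region untouched.
                if bx < x1 and bx + block > x0 and by < y1 and by + block > y0:
                    continue
            ys = range(by, min(by + block, h))
            xs = range(bx, min(bx + block, w))
            cells = [frame[y][x] for y in ys for x in xs]
            avg = sum(cells) // len(cells)  # flatten the block to its mean
            for y in ys:
                for x in xs:
                    out[y][x] = avg
    return out

# Usage: pixelate a 16x16 gradient everywhere except the top-left 8x8 box.
frame = [[x + y for x in range(16)] for y in range(16)]
blocked = selective_blocking(frame, block=8, protect=(0, 0, 8, 8))
```

A real grading tool would do the same thing with a tracked face mask instead of a fixed rectangle, but the principle, artifacts everywhere except where you forbid them, is the same.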
I mean, you'll always be able to, the same way you could use actual VHS recorders or film to get those effects. But it's far easier to apply the filters. Likely, yes, such codecs won't be readily available, and it might be more effort to do the real thing than to add a filter. Who knows.
Nah, they'll keep using interlacing effects for centuries after the last cathode-ray tube burns out...
Have a jpg slider.
The Pixelator setting
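In essence, JPEG already has that slider: the quality setting just scales the quantization tables. A minimal sketch of the libjpeg-style quality-to-scale mapping (illustrative, not a full encoder; `base_step` stands in for a real quantization-table entry):

```python
def quality_to_scale(quality):
    """libjpeg-style mapping from a 1-100 quality slider to a
    quantization-table scaling factor (lower scale = finer steps)."""
    quality = max(1, min(100, quality))
    if quality < 50:
        return 5000 // quality
    return 200 - 2 * quality

def quantize(value, base_step, quality):
    """Quantize one DCT coefficient: coarser steps at low quality
    are what produce the familiar blocky JPEG artifacts."""
    step = max(1, (base_step * quality_to_scale(quality) + 50) // 100)
    return round(value / step) * step

# Dragging the slider down makes coefficients snap to coarser values:
for q in (95, 50, 10):
    print(q, quantize(137, 16, q))
```

Expose that one parameter in a video player and you'd have your artifact dial.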
People are already doing this. Example: the band Vulfpeck has been doing it for at least 5-6 years. https://www.youtube.com/watch?v=le0BLAEO93g&pp=ygUJZGVhbnRvd24g
That video is technically film grain. A compression artefact would look more like this.
Thanks! TIL.
Soooo... Shitposts
In the future?
Have you ever played WATCH_DOGS?
https://youtu.be/5dItY-4G8QU
Cyberpunk 2077 also uses similar effects.
I had not.
The future is now.