Are movie and show file sizes more efficient than they were years ago?

lemonlemonlemon@lemm.ee to Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ@lemmy.dbzer0.com – 115 points –

This may be a stupid question, but I just got back into pirating shows and movies and realized that many of the QxR files are much smaller than what I downloaded in the past. Is it likely that I'm sacrificing a noticeable amount of quality if I replace my files with the smaller QxR ones?

For example, I have Spirited Away from 2017 at 9.83 GB, but I see the QxR release is only 6.1 GB. I also have The Office from 2019, and the entire show (no bonus content) is about 442 GB, while the QxR version is only 165.7 GB. The dates are just the file dates on my hard drive, so I can't speak to their actual origin, but they would've been from RARBG. (Edit to add: I also can't really speak to the quality of the downloads; back then I was just grabbing whatever was available at a reasonable size, so I wasn't deliberately seeking out high-quality movies and shows. A simple 1080p in the listing was enough for me.)

I did some side-by-side comparisons on episodes of The Office (on my PC with headphones, nothing substantial), and I don't notice any differences between the two.

Thoughts on this? Are people better at ripping/compressing/whatever now that they can do so at a smaller size without sacrificing noticeable quality?


Newer codecs are more efficient. H.265 and AV1 are often 2/3 to 1/2 the size of an H.264 file for the same quality.
Of course, there are also people uploading lower-quality files.

As an editor I loved/hated H.265 until like… a year ago. Some NLEs dragged their feet on support for some odd reason.

Licensing, probably. H.265 is very much not open, and you have to pay the MPEG piper to actually use it.

I wonder what the implications would be for us consumers & pirates.

Nothing. The licenses are for content providers and equipment manufacturers. Obviously, in the end you pay for the license when purchasing the goods, but the amount is small.

They have licensed countless other codecs, and tons of cameras adopted it before they supported it. These aren't some FOSS hobbyist projects; these are professional NLEs for Hollywood-level work. There's no excuse, if you ask me. Hell, Resolve had it like 2-3 years prior, I believe.

Agreed, proper H.265 support came way too late for some NLEs.

I really didn't get it! When I got my GH5 I was pumped to do 10-bit 4:2:2 H.265. Really wanted to see the latitude we could get at that compression. Premiere and FCPX in particular went "lol no" for like 5 years.

Yeah, beats me, I was surprised as well. Like, why do we still have to work with AVC clips, why can't I just import an HEVC clip... Premiere: nope, that ain't happenin'.

While I have you here: I recently had a project where they shipped me three separate FedEx packages of loose, unlabeled SD cards, all .mts files. What on God's earth happened on that shoot?

Oh, and the footage was all interlaced, I shit you not.

Yeah, I can relate to that 😔. I work at a TV station in a country that had the PAL standard, and... well, every fucking shot is interlaced 🤦. Not only that, but the output from the station is as well 🤦. Why? Backwards compatibility... what in the actual fuck 🤦... it's a stream, the decoder doesn't care about that, it can decode any frame rate, any frame type, it's all digital now 🤦. Tried explaining this; nope, we're still doing interlaced.

And, of course, after the cable companies have their way with the signal, the output is shit... not to mention they also archive the material as interlaced 🤦... and then people from outside the station complain about the material being garbage... and they still don't budge.

Just goes to show you what management is all about these days. They have the power, so they're gonna use it any way they see fit. Why? Cuz they're THE BOSS GOD DAMN IT 😠.

JUST CHANGE THE SETTINGS AHHHHHHH WHY BOSS WHY IT TAKES 3 SECONDS AND CHANGES NOTHING FOR YOU!!

Lol 😂. Actually, it doesn't take 3 seconds (well, 3 seconds for the stream output, yeah), there are workstations that need to be adjusted as well, but over the course of a day or two, yeah, you could change the whole thing 👍. You could do the cameras last; that's not such a big problem, they can shoot interlaced for a few days, you edit it that way and export it non-interlaced.

H.265 on my phone is not 50% the size. Maybe ~25% less at maximum.

It completely depends on the specific video file. HEVC and AV1 are more efficient in general, but most of their benefits become apparent with 4K video, which they were specifically designed to be better at handling than AVC. It also depends on your phone's software and hardware, as it might not be fast enough to encode in real-time with higher compression settings (and you don't get to use things like 2-pass encoding which can drastically lower bitrate without sacrificing visual quality).

It can be. Heavily depends on the bitrate, which as a shooter (depending on the camera) I can often control and with a wide array of options at that.

I assume you mean H.265 recorded on your phone? That is live-encoded in a single pass, which doesn't compress as well. When you give a system more time than the real-time playback of the video, it can encode things more efficiently.

Same movie. 1080p. 2 h. 6,000 kbps bitrate. AAC 5.1 audio.

  • H264: 8 GB
  • H265: 5 GB
  • AV1: 3 GB

You can't just compare the file sizes without looking at the quality. Each will have different quality loss depending on the exact encodings used.

That makes no sense. The bitrate is how many actual bits per second the data uses after compression, so at the same bitrate all codecs would produce the same file size.

The bitrate is the rate of the video, not the size of the file. Think of different codecs as different types of compression, like rar vs zip vs 7z.

I'm not saying it is the size of the file. I'm saying the bitrate multiplied by the number of seconds determines the size in bits of the file. So for a given video duration and a given bitrate, the total size (modulo headers, container format overhead, etc.) is the same regardless of compression method. Some codecs can achieve better perceived quality for the same number of bits per second. See, e.g., https://veed.netlify.app/learn/bitrate#TOC1 or https://toolstud.io/video/bitrate.php

If it's compressed to 6,000 kilobits per second, then ten seconds of video will be 60,000 kilobits, or 7.5 megabytes, regardless of whether it's compressed with H.264, H.265, or AV1.
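
To make that concrete, here's the arithmetic as a quick Python sketch (the numbers are the ones from this thread):

```python
# Stream size follows directly from bitrate and duration, whatever the codec:
# size_bits = bits_per_second * duration_seconds
def stream_size_mb(bitrate_kbps: float, duration_s: float) -> float:
    """Approximate stream size in megabytes (ignores container overhead)."""
    return bitrate_kbps * 1000 * duration_s / 8 / 1_000_000

print(stream_size_mb(6000, 10))        # 7.5 MB for ten seconds at 6,000 kbps
print(stream_size_mb(6000, 2 * 3600))  # 5400 MB (~5.4 GB) for a 2 h movie
```

(So a true 6,000 kbps stream for two hours is ~5.4 GB of video before audio, whichever codec produced it.)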

Well, we're talking about fully compressed videos in this thread.

Yes, we are. And my point stands. The bitrate is the number of bits per second of video, as measured on the fully compressed video.


Yeah, my data is definitely an oversimplification. Raw bitrate doesn't mean the same between them because they compress differently. I tried to control for that as best I could, so it wasn't the bitrate that was saving file size but the efficiency of the codec.

It’s like a fuzzy start line 🤷‍♂️

As I've said elsewhere, raw bitrate means exactly the same between them, because the bitrate is the number of bits per second of video after compression. What you mean is that you set a target bitrate and the different codecs have varying success in meeting that target. You can use two-pass encoding to improve the codec's accuracy.

But what matters is the average bitrate required by each codec to achieve the desired level of video quality, as perceived by you. The lower bitrate you need for the quality you want, the better the codec is.
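
For reference, two-pass encoding with ffmpeg and x264 looks roughly like this (a sketch; the file names and bitrate are placeholders, and x265 uses -x265-params pass=1/pass=2 rather than -pass):

```python
import subprocess

# Pass 1: analyze the whole video and write a stats file; the encoded output
# itself is discarded via the null muxer (use NUL instead of /dev/null on Windows).
subprocess.run([
    "ffmpeg", "-y", "-i", "input.mkv",
    "-c:v", "libx264", "-b:v", "6000k",
    "-pass", "1", "-an", "-f", "null", "/dev/null",
], check=True)

# Pass 2: encode for real; the stats let the encoder distribute bits so the
# average bitrate lands on the 6,000 kbps target much more accurately.
subprocess.run([
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "libx264", "-b:v", "6000k",
    "-pass", "2", "-c:a", "aac", "output.mkv",
], check=True)
```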


OP is describing the source video file bitrate, not the target codec bitrate. 6,000 kbps compresses to different amounts depending on the codec and quality used. OP doesn't mention the quality factor for the codecs, so this is less than helpful.

You choose the output bitrate by adjusting the quality. If you ask for a 3 GB file, you get a 3 GB file.

Unless you switch to using CRF, which tries to give a consistent quality level, damn the file size.

Of course there are different ways to select the quality level. CRF numbers don't mean anything in absolute terms, though; they're just higher or lower relative to each other within a given codec.
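
For comparison, a CRF encode with ffmpeg looks something like this (a sketch; file names and the CRF value are arbitrary):

```python
import subprocess

# Constant-quality (CRF) mode: you pick a quality level instead of a bitrate,
# and the file size lands wherever it lands. Lower CRF = higher quality and a
# bigger file. The scale is codec-specific (x264 and x265 both use 0-51, but
# a given number doesn't mean the same quality in both).
subprocess.run([
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "libx265", "-crf", "22",
    "-c:a", "copy", "output.mkv",
], check=True)
```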


Your old stuff is most likely in the x264 video codec, while x265/HEVC, and in rare cases AV1, are the standard today, especially at higher resolutions. But it also depends on the specific release and how many streams (audio tracks, subtitles) are included.

Be warned, though: some x265 stuff out there, particularly at 1080p and lower, is a re-encode of an x264 source file. So lower file size, but also slightly lower quality. Scene regulations say only higher resolutions should be x265.

I still prefer it; HDDs aren't free, and I personally really can't tell the difference (my TV kinda sucks anyway).

Thanks for all of the replies, they were very insightful!

The move from H.264 to H.265 appears to account for the majority of the differences I was seeing in file sizes.

The lower visual noise of modern movies and series also helps a lot.

Yup, people love film, but film grain can double or triple the bitrate you need for a movie.

H.265 has a film grain mode (and AV1 makes it a core feature): the encoder can strip out the film grain, signal in the stream that there was grain, and the player synthesizes fake film grain back in on playback.

Also, it's common for anime to be encoded in 10-bit color rather than 8-bit, which can also be used to encode files more efficiently.

Compression and encodings are not my forte at all: why is 10-bit color more efficient to encode than 8-bit?

I'm no expert either; this is just what I know from experience and from people smarter than me on the internet. I've put together some resources below for you. Apologies for the reddit links; it's just where this content lives.

Encoding in 10-bit reduces the number of truncation errors during encoding, which leads to a much smaller file size despite the two additional bits. Even though it stores more information, the reduction in truncation errors offsets the size gain, to the point where you get similar or even smaller sizes than 8-bit encodes.

source: https://www.reddit.com/r/animepiracy/comments/m98u30/comment/grm8uq9/

https://yukisubs.files.wordpress.com/2016/10/why_does_10bit_save_bandwidth_-_ateme.pdf

https://www.reddit.com/r/x265/comments/e08tfu/ultimate_encoding_test_results_for_animation/
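
If you want to try it yourself, a 10-bit x265 encode with ffmpeg looks something like this (a sketch; it assumes an ffmpeg build with 10-bit libx265, and the file names and CRF value are placeholders):

```python
import subprocess

# -pix_fmt yuv420p10le forces the 10-bit pixel format. Even with an 8-bit
# source, the extra internal precision reduces the truncation errors described
# above, which can offset (or beat) the cost of carrying two extra bits.
subprocess.run([
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "libx265", "-pix_fmt", "yuv420p10le", "-crf", "22",
    "-c:a", "copy", "output.mkv",
], check=True)
```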

Counterintuitive but very interesting. Thanks for the explanation!

10-bit color is HDR and has a bigger file size. You probably mean something different.

My understanding is that HDR formats utilise 10-bit color formats rather than being strictly equivalent.

From Wikipedia:

HDR video refers to video encoded in an HDR format. These HDR videos have a greater bit depth, luminance and color volume than standard dynamic range (SDR) video, which uses a conventional gamma curve.

As that alludes to, HDR is more than just a 10-bit color space. Encoding anime in 10-bit doesn't take advantage of the greater luminance or dynamic range; it does take advantage of the color space to minimise banding.

It's quite remarkable, really. A single-layer DVD stores 4.7 GB for a movie at up to 576p (MPEG-2, a.k.a. H.262). A while later those videos could be compressed using DivX or Xvid (MPEG-4 Part 2) down to 700 MB to fit on a standard CD, though full quality was more like 2 GB.

The Blu-ray standard came along with 25 GB per layer and 1080p video, stored in H.262 or H.264.

Discs encoded in MPEG-2 video typically limit content producers to around two hours of high-definition content on a single-layer (25 GB) BD-ROM. The more-advanced video formats (VC-1 and MPEG-4 AVC) typically achieve a video run time twice that of MPEG-2, with comparable quality. MPEG-2, however, does have the advantage that it is available without licensing costs, as all MPEG-2 patents have expired.

H.265 is even more efficient than H.264, so now you could fit a full 1080p movie onto a 4.7 GB DVD. Ultra HD Blu-ray discs are only slightly larger per layer (33 GB), but they store 4K video by supporting the H.265 codec. I'd guess that by now a 720p video encoded in H.265 could make a decent copy on a 700 MB CD.
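
Back-of-the-envelope for that last claim (pure arithmetic, no codec specifics):

```python
# What average bitrate fits a two-hour movie on a 700 MB CD?
size_bits = 700 * 1_000_000 * 8       # 700 MB in bits
duration_s = 2 * 3600                 # two hours
print(size_bits / duration_s / 1000)  # ~778 kbps for video + audio combined
```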

You're right, except for that last part. The newer, smaller-file-size video codecs are really only effective on higher-resolution video. So a 720p movie encoded with H.265 to fit on a 700 MB CD isn't going to look much better, if at all, than older codecs (maybe better than DivX). H.265 really shines at 4K and up, but does offer some benefit at 1080p.

That is interesting. Of course, there aren't any HDMI CD video players, so it doesn't much matter. But it would be interesting to see how a 4.7 GB DVD in H.262 compares to a 1080p copy of the same movie in H.265.

I wonder if there's a lot of room for encoders to improve the quality per byte without changing the format; jpeg vs. mozjpeg, for instance.

There's so much room that format specifications don't tell you how to encode, only how to decode. Designing the best encoder is a huge research project.