I recently started building a movie/show collection again on my home NAS.

I know that H.265 files can generally get away with 25-50% less bitrate than H.264 at the same or better quality. But what's the golden zone for each codec? 10 Mbps for a 1080p H.264 movie? Would something like 5 Mbps for 1080p H.265 be on par with that? And what about 4K?

For file size: would around 25 GB for a 2-hour 1080p movie be near or at original Blu-ray/digital quality?
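
For context, here's the back-of-envelope math I'm using to relate file size and average bitrate (treating 1 GB as 10^9 bytes, and ignoring that audio eats some of it):

```
# Average bitrate for a 25 GB file with a 2-hour runtime
# 25 GB * 8 = 200,000 Mb; 2 hours = 7,200 s
echo "scale=2; 25 * 8 * 1000 / (2 * 3600)" | bc   # 27.77 -> roughly 28 Mbps
```

So 25 GB over two hours works out to roughly 28 Mbps total, which is in the ballpark of an actual Blu-ray's bitrate.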

  • SamSausages@alien.topB · 10 months ago

    It's very hard to give one simple answer, because:

    It varies greatly based on the content. Animation compresses vastly differently than an action movie.

    Varies greatly based on the encoder: NVENC vs. CPU (software), etc.

    Varies greatly based on encoder options, e.g. bitrate-targeted (-b:v, -minrate, -maxrate) vs. quality-targeted (-rc vbr with -qmin, -qmax, -cq values), etc. (see the ffmpeg sketch at the end of this comment).

    Varies greatly based on who is watching, the TV they use and their tolerance and experience.

    Savings are greater at 4K than at 1080p. But once you start adding HDR into the mix, you're in a whole new world.

    Even people with very discerning eyes can't agree on everything related to this topic. I wish I could just tell you to do X… but you'll have to test various methods and determine what you're happy with.
    Or, if you just want some space savings, use some default setting that cuts the size in half and forget about it.
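
    To make the options point concrete, here's a rough ffmpeg sketch of the two rate-control styles I mean. The filenames, bitrates, and quality values are just placeholders to illustrate the shape of the commands, not recommendations; assume an ffmpeg build with libx265 and NVENC support:

    ```
    # Bitrate-targeted: predictable file size, quality varies scene to scene
    ffmpeg -i input.mkv -c:v libx265 -b:v 5M -maxrate 6M -bufsize 12M -c:a copy bitrate_target.mkv

    # Quality-targeted (software CRF): steadier quality, file size varies with content
    ffmpeg -i input.mkv -c:v libx265 -preset slow -crf 22 -c:a copy crf_target.mkv

    # Quality-targeted on NVENC: -rc vbr with a -cq target and qmin/qmax clamps
    ffmpeg -i input.mkv -c:v hevc_nvenc -rc vbr -cq 24 -qmin 18 -qmax 32 -b:v 0 -c:a copy nvenc_target.mkv
    ```

    Same source, three very different size/quality outcomes, which is exactly why there's no single magic bitrate.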

    • corruptboomerang@alien.topB · 10 months ago

      Varies greatly based on the encoder: NVENC vs. CPU (software), etc.

      I'm not asserting this isn't the case, I've just not noticed it, and I can't see why it would matter for the actual encoding. For decoding I've seen it make a difference, but that's mostly the pre-Skylake iGPUs using a poor implementation of QuickSync.

      • AshleyUncia@alien.topB · 10 months ago

        I'm not asserting this isn't the case, I've just not noticed it, and I can't see why it would matter for the actual encoding. For decoding I've seen it make a difference, but that's mostly the pre-Skylake iGPUs using a poor implementation of QuickSync.

        No, it's totally a fact. Software encoding gives you better results in terms of 'quality per megabyte' than hardware encoding, unless you're using some really bad, sloppy software encoding settings. If size efficiency matters more than anything, you use software encoding, or you're basically leaving money on the table. Of course, the downside is that hardware encoding is a whole heck of a lot faster.
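
        If you want to see it for yourself, the usual test is to encode the same clip both ways and compare size and quality. A minimal sketch, assuming an ffmpeg build with libx265 and NVENC, with the clip name and quality values as placeholders:

        ```
        # Software HEVC: slow, but the best quality-per-megabyte
        ffmpeg -i clip.mkv -c:v libx265 -preset slow -crf 22 -c:a copy sw_x265.mkv

        # Hardware HEVC on an NVIDIA card: much faster, but needs more bits for the same look
        ffmpeg -i clip.mkv -c:v hevc_nvenc -preset p7 -rc vbr -cq 22 -b:v 0 -c:a copy hw_nvenc.mkv
        ```

        Note that -crf and -cq aren't the same scale, so match them by eye (or with something like VMAF) rather than by number, then compare the resulting file sizes.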