...

Our_Benefactors

Karma: 760
Created: 2022-09-06

Recent Activity

  • The CPU is an 11700K, so it’s not the beefiest. Maybe the high-end AMD chips would make this a more viable choice.

  • The encoding speed with my CPU was less than real time, which was unacceptable performance to me.

    Maybe my CPU settings weren't right?

  • Yes, I wrote a Python script that uses FFmpeg to detect the bitrate of the file and determine approximately what CQ to use. If the original file has a low bitrate, re-encoding it with a high CQ can actually increase the file size (lol).

    Normal CQ = 28, aggressive CQ = 34.

      # Thresholds to detect ultra-low-bitrate inputs (use more aggressive encoding)
      # (width, height) → min sensible bitrate (bps). If stream <= threshold, use aggressive profile.
      LOW_BITRATE_THRESHOLDS = [
          ((3840, 2160), 13_000_000),  # 4K  ≤13 Mbps → use aggressive encoding
          ((2560, 1440),  8_000_000),  # 1440p
          ((1920, 1080),  5_000_000),  # 1080p
          ((1280, 720),   2_500_000),  # 720p
          ((854, 480),    1_200_000),  # 480p
      ]
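    The selection logic might look something like the sketch below. The helper name `pick_cq` is my own illustration, not the commenter's actual function, and the thresholds are restated so the snippet is self-contained:

```python
# Hypothetical sketch of the CQ selection described above.
NORMAL_CQ = 28
AGGRESSIVE_CQ = 34

# (width, height) -> minimum sensible bitrate in bps for that resolution.
LOW_BITRATE_THRESHOLDS = [
    ((3840, 2160), 13_000_000),  # 4K
    ((2560, 1440),  8_000_000),  # 1440p
    ((1920, 1080),  5_000_000),  # 1080p
    ((1280, 720),   2_500_000),  # 720p
    ((854, 480),    1_200_000),  # 480p
]

def pick_cq(width: int, height: int, bitrate_bps: int) -> int:
    """Return the aggressive CQ when the input is already low-bitrate."""
    for (w, h), min_bps in LOW_BITRATE_THRESHOLDS:
        if width >= w and height >= h:
            return AGGRESSIVE_CQ if bitrate_bps <= min_bps else NORMAL_CQ
    # Inputs smaller than 480p are treated as low-bitrate outright.
    return AGGRESSIVE_CQ
```

    For example, a 1080p file at 6 Mbps would get CQ 28, while the same resolution at 4 Mbps falls below its threshold and gets CQ 34.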

  • I use “slow” GPU encoding. I optimized my pipeline for minimal quality loss, not maximum file-size savings. I wouldn’t recommend using consumer-grade CPUs to re-encode, because to get a reasonable speed you’ll need “fast” presets, which tank the quality. Anecdotally, the fast CPU method saves closer to 60–70% of the file size.
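    A “slow” GPU re-encode of the kind described could be driven from Python roughly as follows; the flag choices are an assumption based on FFmpeg’s `hevc_nvenc` encoder, not the commenter’s actual pipeline:

```python
# Sketch: build an FFmpeg command for NVIDIA hardware HEVC encoding with a
# slow preset and a constant-quality target. Illustrative, not the author's script.
def nvenc_command(src: str, dst: str, cq: int = 28) -> list:
    return [
        "ffmpeg", "-i", src,
        "-c:v", "hevc_nvenc",          # NVIDIA hardware HEVC encoder
        "-preset", "slow",             # slower preset: better quality per bit
        "-rc", "vbr", "-cq", str(cq),  # constant-quality rate control
        "-c:a", "copy",                # pass the audio through untouched
        dst,
    ]

# The command could then be run with subprocess.run(nvenc_command(...), check=True).
```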

  • It’s such a shame, as H.265 is such an amazing codec breakthrough. I’m in the process of converting my library to save space, and the H.265 files are literally 50% of the original size (give or take), with an imperceptible quality difference. I can re-encode around 100–200 GB/day typically, using a 3090.
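    Taking those figures at face value (the midpoint of the quoted 100–200 GB/day and the ~50% size ratio, both the commenter’s rough estimates):

```python
daily_input_gb = 150   # midpoint of the quoted 100-200 GB/day throughput
size_ratio = 0.5       # H.265 output is ~50% of the original size
saved_per_day_gb = daily_input_gb * (1 - size_ratio)
print(saved_per_day_gb)  # → 75.0 GB of storage reclaimed per day
```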

HackerNews