Hello Everyone,
I’m currently looking for a way to further reduce the size of the video files I store; it’s mainly anime and TV series episodes.
Currently what I do is: get the video on my main computer (5800X3D + 1080 Ti), convert it using HandBrake to H.265 with NVENC, then store it on my server, which is basically an Ubuntu server (i7-8700, no graphics card) with mainly a ZFS pool.
I’m looking to improve this by getting the video directly on my server, converting it there using HandBrake (I’m considering getting an Intel Arc card to switch to AV1 instead of H.265), and storing it in place.
The issues I’m currently facing mainly concern HandBrake itself: hardware-encoding support on Linux looks limited. I still have to do more testing, but I was wondering if anyone has a better solution than what I’m currently envisaging.
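For reference, here’s roughly what I plan to test on the Linux side; the qsv_av1 encoder name is an assumption on my part (it should be there in HandBrake 1.6+ builds with working Intel drivers, but I haven’t verified it on my hardware yet):

    # List the encoders this HandBrake build actually exposes; hardware
    # encoders only show up when the drivers are working:
    HandBrakeCLI --help 2>&1 | grep -A 20 -- '--encoder '

    # Sketch of an AV1 encode on an Arc card (qsv_av1 assumes HandBrake 1.6+
    # with Intel QSV available):
    HandBrakeCLI -i input.mkv -o output.mkv -e qsv_av1 -q 30 --all-audio -E copy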
Use Tdarr! Grab the server, then use your main PC and any other PCs you have around as nodes. It’s fully automated once you set it up: just point it at the libraries and it scans and converts. It completes an Arrs/Plex/Jellyfin setup.
I don’t really care how slow HandBrake is on Linux, because I first use MakeMKV to rip the Blu-rays to disk, and then run HandBrakeCLI in a bash for loop against the *.mkv files.
The hardest part is putting the Blu-rays in the machine; the bash for loop is easy, you just come back a week later and collect the *.mp4 files.
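Not my exact script, but the loop is basically this shape; “Fast 1080p30” here is just a stock HandBrake preset to illustrate:

    # Re-encode every ripped MKV in the current directory to MP4, one at a time.
    for f in *.mkv; do
        HandBrakeCLI --preset "Fast 1080p30" -i "$f" -o "${f%.mkv}.mp4"
    done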
Store it as it is, no reencoding. Storage is cheap.
Considered doing this a few months ago. Depending on the size of your library, it’s a mixed bag. Some concerns:
- People on the Plex communities say an H.265 conversion is never as good, quality-wise, as an encode from a primary or secondary source. I’m not a quality purist, but it might be an issue for some pieces for you. Worth considering.
- I ran a test on a season of a TV show I had in 1080p, and the results were about a 70% decrease in file size for each episode, which was huge. The problem for me was that it took about 6 hours per episode (on an admittedly very old gaming desktop). Having a machine churn like that for the time a whole library requires introduces a whole host of issues re: hardware fatigue, heat, electricity, sound, and tying up your machine during the whole process, plus plenty more I’ve forgotten to mention, all of which might limit the value of embarking on such a task.
- It’s generally considered to be much cheaper (and a better use of all of your resources) to just spend what you’d spend doing that on more storage. 16 TB drives running ~$200 will last anyone except the most extreme 2% of DataHoarders very well for media storage for quite a while.
Anyways, I hope any of this helps. I ended up holding off. The storage space savings are tempting, but the path there is expensive (in more than just $) and probably not worth the investment for most.
My target was Emby. While Emby can and does support AV1, it needs some hardware decoding to shift it back down for most devices, because most devices can’t handle AV1.
I picked HEVC (H.265). I can still throw cores at it to CPU-transcode down to the target, AVC (H.264), which is the best most devices support, so I don’t need to finagle a GPU into my R710.
For the encode, I wrote a script to scour my movie and TV collection for anything huge (movies > 4 GiB, TV episodes > 1.5 GiB) that wasn’t already in HEVC, checking the format with

    mediainfo --Inform="Video;%Format%" "$file"

Once I had the list, it was batch processing: copy the file over to the desktop, ffmpeg re-encode, tag it in [] for Emby, copy it back into place, and finally delete the original file. It took the better part of 3 months to complete.
However, my collection went down in size from ~38TiB to ~26TiB. That’s a win in my book.
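The scan step looked roughly like this; the paths and thresholds below are illustrative, not my exact script:

    # Collect movies over 4 GiB whose video stream isn't already HEVC.
    # /mnt/media/movies is an example path.
    find /mnt/media/movies -type f -size +4G -print0 |
    while IFS= read -r -d '' file; do
        fmt=$(mediainfo --Inform="Video;%Format%" "$file")
        [ "$fmt" != "HEVC" ] && printf '%s\n' "$file" >> to_encode.txt
    done

    # Re-encode each candidate; CRF 22 / preset medium are illustrative values.
    while IFS= read -r file; do
        ffmpeg -i "$file" -map 0 -c:v libx265 -crf 22 -preset medium \
               -c:a copy -c:s copy "${file%.*}.x265.mkv"
    done < to_encode.txt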
I rewrote the scripts to be manual triggers plus folder watching now, with

    inotifywait -q -q -e close_write

on the shared folder. It just fires whenever I drop shit in that folder and spits out much smaller, still high-quality HEVC MKVs for me to pick up later :)
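The watcher is basically this shape; WATCH_DIR and the encode settings are placeholders, not my actual values:

    # Watch a drop folder and re-encode anything fully written into it.
    WATCH_DIR=/srv/watch    # placeholder path
    inotifywait -m -q -q -e close_write --format '%w%f' "$WATCH_DIR" |
    while IFS= read -r file; do
        case "$file" in
            *.mkv|*.mp4)
                ffmpeg -i "$file" -c:v libx265 -crf 22 -c:a copy \
                       "${file%.*}.x265.mkv"
                ;;
        esac
    done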
I use HandBrake and my own secret-sauce preset with Tdarr. Hardware encoding is good, but I prefer using the CPU and squeezing out the maximum quality and smallest size for H.265. Tdarr is what you are looking for; all these questions have been asked and answered there.
Any chance you would share your secret sauce with us? I’m interested in converting H.264 to HEVC without losing quality, if at all possible.
You wouldn’t like it. I scale everything down to 576p. The result is a small movie file, but there is definitely a loss of quality. I tuned it to the limit as far as trading file size against quality goes. 576p is technically standard definition, but if done right the result looks like HD. At least to me.
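Not the actual preset, obviously, but for anyone who wants to try the idea, the 576p downscale itself looks something like this in HandBrakeCLI terms (the quality value is illustrative):

    # Downscale-to-576p idea only; -q 24 is an illustrative quality value,
    # not the commenter's secret-sauce setting.
    HandBrakeCLI -i input.mkv -o output.mkv \
                 -e x265 -q 24 --maxWidth 1024 --maxHeight 576 \
                 --all-audio -E copy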
I’ve been debating this as well, and I decided to go with H.265, since AV1 decoding hardware is years out for most client devices that access Plex. I am going to use HandBrake CLI on Linux to run the conversion of all of my media for space savings. I am using the Super HQ preset with audio passthrough for all resolution ranges (except 4K).
I have about 18 TB of media to churn through, and I don’t want it taking all winter or tying up my desktop/server for that duration. My plan is to distribute the transcoding tasks across a SLURM compute cluster I built out of old Dell desktops. I still need to do the math on how long it might take, but I have an 8-thread/8 GB VM living on every PC in the house as a node for the cluster, 14 in total. I wrote a couple of Python scripts to create the transcode jobs, run them, and handle file movement.
I’m thinking I will see about 50% storage savings on all transcoded media. If you want more details or the scripts, let me know and I will put them on GitHub.
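For a sense of shape, one transcode job as a SLURM batch script might look like the sketch below; the preset matches what I described, but the paths and SLURM directives are illustrative (my actual glue is the Python scripts):

    #!/usr/bin/env bash
    #SBATCH --job-name=transcode
    #SBATCH --cpus-per-task=8
    #SBATCH --mem=8G
    # One transcode job; the input file is passed as the first argument:
    #   sbatch transcode.sh /mnt/media/show/ep01.mkv
    in="$1"
    out="${in%.*}.x265.mkv"
    HandBrakeCLI --preset "Super HQ 1080p30 Surround" \
                 --aencoder copy \
                 -i "$in" -o "$out"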
If you’re game for a lot of setup, you might like Tdarr.
It supports remote workers for encoding, so you can store your stuff on the server but still use the main computer’s hardware acceleration.