The processing power would need to be beefed up by a lot more than the storage capacity. If you are already storing 4K high bitrate, make a version in 720p with a more moderate bitrate and call it a day.
Usually it should be just one stream; occasionally there might be two, but that won't be too often. My 4K library as it stands now is about 3 TB, but at this point I'm only investing in 4K video content, so it's liable to grow fairly quickly.
At 1-2 streams, a Ryzen 1600 will struggle; a 1700X should be able to handle two 4K streams.
That said, if you downscale a copy to 1080p, you'll have the 4K file for original playback and the 1080p file for downscaled playback. Plex will transcode from the 1080p file, so if you need to downscale further, it will take significantly less power.
Keeping a 1080p copy of your 4K footage is probably going to cost about 1 TB at your current capacity. I'm assuming you've got 120 GB Blu-ray rips; if that's the case, each one will come out to around 40 GB at 1080p.
My advice: make a copy at 10 Mbps 1080p, or get a Ryzen 1700X.
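As a quick sanity check on that 10 Mbps figure (a rough sketch, assuming a hypothetical 2-hour movie and decimal GB):

```python
# Rough size of a 10 Mbps 1080p copy of a movie.
# The 2-hour runtime is a hypothetical example, not from the thread.
bitrate_mbps = 10
runtime_s = 2 * 60 * 60                         # 7200 seconds
size_gb = bitrate_mbps * runtime_s / 8 / 1000   # Mbit -> MByte -> GB

print(f"{size_gb:.0f} GB")                      # 9 GB
```

So call it roughly 9 GB per 2-hour movie at that bitrate.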
Another edit: this is from the Plex forums:
PMS recommends a CPU with a PassMark score of 2000 for a 10 Mbps video, so an 80 Mbps video would need a score of 16,000. This also assumes H.264 video. H.265 takes a lot more power, so figure at least double the score needed if it is H.265.
80 Mbps is more or less what your 4K Blu-ray rips are going to be.
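The forum rule of thumb above scales linearly with bitrate, so it can be written as a tiny helper (a sketch; the H.265 doubling is the forum's rough guidance, not a measured number):

```python
def passmark_needed(bitrate_mbps, hevc=False):
    """Plex forum rule of thumb: ~2000 PassMark per 10 Mbps of H.264,
    and roughly double that if the source is H.265."""
    score = 2000 * bitrate_mbps / 10
    return 2 * score if hevc else score

print(passmark_needed(80))             # 16000.0 for an 80 Mbps H.264 rip
print(passmark_needed(80, hevc=True))  # 32000.0 if the same rip is H.265
```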
Yeah, I didn’t really think through my math. OP will probably need another ~400GB for that quality.
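That ~400 GB estimate roughly checks out if you assume the whole 3 TB library is ~80 Mbps 4K and the copies are made at 10 Mbps (all numbers here are rough assumptions, not measurements):

```python
library_mb = 3 * 1_000_000      # 3 TB library, in decimal MB (assumed all 4K)
src_mbps, copy_mbps = 80, 10    # assumed source and 1080p-copy bitrates

hours = library_mb / (src_mbps / 8 * 3600)        # total runtime in the library
copy_gb = hours * (copy_mbps / 8 * 3600) / 1000   # size of all 10 Mbps copies

print(round(hours), round(copy_gb))   # ~83 hours -> ~375 GB of 1080p copies
```

~375 GB, i.e. in the ballpark of the ~400 GB figure.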
This is true, but CPU requirements go through the roof when you switch to H.265. At H.264 you're going to need a 16,000 PassMark score for each 4K stream; for H.265 you'll need 25,000 per 4K stream.
This is greatly lessened with the use of a hardware decoder (decoding only; I don't know if this will aid transcoding). Ryzen has no iGPU, so expect 30-40% CPU load for a 120 Mbit stream; an i7-7700K will do this at about 5% load.
The problem with Intel Quick Sync is that the quality of the transcode is shit. I have hardware encoding on my Plex server disabled, because when I used it, the output came out terribly.
That's the problem: we need a dedicated hardware solution for high-bitrate files. Quick Sync makes sense, but it needs to be better, I'll agree with that. Software encoding is too slow, and nobody has the power for a 25K PassMark score. It's best to never encode and just direct play, but when we're away we need to transcode, because either the device can't handle the file or the internet is too damn slow. Plex has just started using hardware acceleration, and I feel it's the way to go in the future.
it’s at a completely unacceptable quality right now.
It's definitely the way to go in the future. At the moment, the only good solution for hardware acceleration is NVENC; a GTX 1050 is enough to transcode a 120 Mbps stream.
Agreed, I really dislike the hardware-accelerated space right now. Intel has bad quality but unlimited streams; Nvidia has decent quality but is limited to two streams; AMD is limited to only decoding or only encoding (can't remember which), so still limited. I would love it if ffmpeg worked well with AMD graphics. A Raven Ridge chip would be a great way to build a low-power server with a good amount of horsepower.
That's true, but I think Nvidia is the way to go for hardware encoding right now. If you need more than two transcode streams, you're doing it wrong. More importantly, the 1050 should be able to handle two of any stream, it's a cheap GPU, and it only draws 75 W, which can come entirely from the motherboard slot.
H.264 is everywhere, but at 4K you get huge files. H.265 is better, but hardware decoding isn't always available. On Linux, software-decoding a 4K stream can fully load six threads on my Ryzen 1700, so it's a real hit if you don't have hardware decode.
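To put the file-size difference in numbers (a sketch assuming the common rule of thumb that H.265 reaches similar quality at roughly half the H.264 bitrate; the 2-hour runtime is hypothetical):

```python
def size_gb(bitrate_mbps, hours):
    # Decimal GB for a constant-bitrate stream.
    return bitrate_mbps / 8 * 3600 * hours / 1000

h264_gb = size_gb(80, 2)   # 80 Mbps H.264 4K rip, 2-hour movie
h265_gb = size_gb(40, 2)   # ~half the bitrate for similar quality in H.265

print(h264_gb, h265_gb)    # 72.0 GB vs 36.0 GB
```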
I think GPU transcoding was a beta feature for Plex Pass; I don't know if it's currently working. Also, on the client end, the GPU itself will decode the video on playback.
I forget whether it's limited to Plex Pass or not, but yeah, there's GPU transcoding. It's an option in the server settings that you just toggle on, and it will automatically use whatever GPU is available, if supported.