TV operating system question

This is what I have historically done. In 2014 I built an HTPC for my parents to attach to their TV so they could watch films stored on a PC upstairs over the network, in addition to Netflix and Amazon Prime. It was literally a Pentium G3258 using its iGPU with Kubuntu LTS installed, in a really nice 19" Silverstone case… It's still more than sufficient for the job and cost about £300 all in to build. If I were building that system now, I'd probably look at an Athlon-based APU, but a Pentium would still be fine.

I do a similar thing with my TV in my flat. It's not connected to any kind of cable or satellite; it gets all of its content from either my main PC or a console. It may as well be a 40" PC monitor, and honestly, if money were no object, it's these "dumb" BFDs (big format displays) that I'd actually be looking at instead of a TV.

The only downside is that you need Intel and Windows to reliably decode DRM-protected 4K content, for no reason other than that DRM has always been about extorting money from corporations and has nothing to do with piracy.


Yeah, the problem with LineageOS would be that Widevine doesn’t work on it though…
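If you want to verify that on an actual device rather than guess, the Widevine security level can be queried from any Android app through the framework's MediaDrm API ("securityLevel" is the vendor property Widevine exposes; the function name here is just for illustration). Rough sketch:

```kotlin
import android.media.MediaDrm
import java.util.UUID

// Standard Widevine DRM scheme UUID
private val WIDEVINE_UUID: UUID =
    UUID.fromString("edef8ba9-79d6-4ace-a3c8-27dcd51d21ed")

fun widevineSecurityLevel(): String = try {
    val drm = MediaDrm(WIDEVINE_UUID)
    // "L1" = hardware-backed key handling and decode path (what the big
    // streaming apps demand before serving HD/4K); "L3" = software-only,
    // which is typically all an unlocked or custom-ROM device reports.
    val level = drm.getPropertyString("securityLevel")
    drm.release() // deprecated in favour of close() on API 28+, but still works
    level
} catch (e: Exception) {
    "Widevine not available (${e.message})"
}
```

Anything reporting L3 gets capped at SD/HD by Netflix and friends, which is the practical problem with running LineageOS on a streaming box.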

So, it's probably the Shield or Apple TV. I will have to check how the preinstalled app performs, but I'd still prefer to have everything on one device.

Yeah, DRM sucks…
Anyway, why do you need Intel? As far as I know, AMD does support DRM.

4K Blu-ray DRM is only supported by the iGPU of, iirc, 8th- and 9th-gen Intel CPUs, because the DRM stack requires Intel's SGX. Not only is it unsupported on AMD and Nvidia, it isn't even supported on newer Intel CPUs, which have dropped SGX. The platform is basically dead on PC for this reason. This is what happens when companies like Intel make both the hardware and the DRM standards.
