I have a TV which says it supports 4K 144Hz. Right now I run an old laptop as a media server/desktop connected to it, which can only handle 1080p. I want to switch to some NUC/mini PC that runs a Linux desktop in 4K and plays media flawlessly.

There are two things that I get confused about trying to find something that suits my wishes:

  1. How do I properly find out if the hardware can handle this? For example, an RPi 5 can handle 4K video playback with something like LibreELEC, but a 4K desktop distribution is laggy and slow on it. Is a CPU alone enough, or do I need a dedicated GPU? Should I be looking at Intel's Core Ultra series, and does it have good Linux support?
  2. Do my wishes require HDMI 2.1-level throughput, which may not work on Linux? Reading about HDMI 2.1, it says the HDMI Forum forbids open-source support for it. Does that mean there are binary blobs for Linux that will work?

Is there anything else I’m missing? If you run a Linux media server, what hardware and distro are you running?

  • doodoo_wizard@lemmy.ml · 12 hours ago

    You find out if the hardware can handle it by looking up its video decoding capabilities on Wikipedia and checking that it’s capable of the resolution and codec you want. If you’re buying new hardware, then a chip from Intel or AMD that supports the resolution and codec you expect will do the job. It doesn’t need to be the latest and greatest thing.
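
    If you’d rather script that check than trust a spec sheet, here’s a rough sketch (assuming an Intel or AMD iGPU with a working VA-API driver and vainfo from libva-utils installed; the profile names are standard VA-API ones) that asks the chip itself what it can decode in hardware:

    ```python
    # Rough sketch: ask the VA-API driver which hardware decode profiles exist.
    # Assumes vainfo (libva-utils) is installed and the iGPU driver is working.
    import subprocess

    WANTED = {
        "H.264":       "VAProfileH264High",
        "HEVC 10-bit": "VAProfileHEVCMain10",
        "AV1":         "VAProfileAV1Profile0",
    }

    out = subprocess.run(["vainfo"], capture_output=True, text=True).stdout

    for label, profile in WANTED.items():
        # vainfo prints one "<profile> : <entrypoint>" pair per line;
        # VAEntrypointVLD on a line means that profile has hardware decode.
        ok = any(profile in line and "VAEntrypointVLD" in line
                 for line in out.splitlines())
        print(f"{label:12s} hardware decode: {'yes' if ok else 'no'}")
    ```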

    If you can’t be content with 4K60 over HDMI, then you either need to use proprietary drivers or a different cable. Your TV may very well have a DisplayPort input, and that’ll sidestep the problem. I don’t have problems with proprietary drivers, but you may.

  • juipeltje@lemmy.world · 16 hours ago

    I don’t really have any hardware recommendations, since I just use a dedicated server built from spare parts and stream to an Android TV with Jellyfin, but I can say that when it comes to HDMI 2.1, as far as I’m aware it’s only an issue on AMD. Intel and Nvidia have their own workarounds, but for some reason AMD is just… sitting there, not doing anything, it seems. I bought a UGREEN DisplayPort-to-HDMI adapter for my TV and it seems to work just fine, so that could be a workaround.

  • Majestic@lemmy.ml · 17 hours ago

    If you’re going Intel, you can check the ark.intel pages for the processors in the devices you’re looking at. Intel does pretty good documentation, so it’ll show you what integrated graphics they have and all that.

    Ideally you want a chip that can do hardware decoding of common codecs (and, if possible, encoding, if you’re serving media to others and intend for it to transcode rather than direct-play), so you’re not eating a massive power bill, generating tons of heat, or getting bogged down in resource utilization.

    AV1 is the only tricky part when it comes to hardware decode support. Maybe you don’t use it yourself, but typically only the newer chips support hardware decode of AV1 files. Something to consider if you have or plan to have lots of AV1-encoded files. (Though there is software decode, of course.)
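
    If you want to know how much that matters for your library, a quick sketch like this (assuming ffprobe from the ffmpeg package; /srv/media is just a placeholder path) will count how many of your files are actually AV1:

    ```python
    # Rough sketch: count AV1-encoded files in a library with ffprobe (ffmpeg).
    # /srv/media is a placeholder path; point it at your own library.
    import subprocess
    from pathlib import Path

    def video_codec(path: Path) -> str:
        # Print only the codec name of the first video stream.
        out = subprocess.run(
            ["ffprobe", "-v", "error", "-select_streams", "v:0",
             "-show_entries", "stream=codec_name",
             "-of", "default=noprint_wrappers=1:nokey=1", str(path)],
            capture_output=True, text=True)
        return out.stdout.strip()

    files = [p for p in Path("/srv/media").rglob("*")
             if p.suffix in {".mkv", ".mp4", ".webm"}]
    av1 = [p for p in files if video_codec(p) == "av1"]
    print(f"{len(av1)} of {len(files)} files are AV1")
    ```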

    The Intel N150 can do a 4K desktop. You won’t be doing 4K gaming on it at all, but it can handle the desktop and video playback, and it’s a low-power-consumption chip. It should be able to support at least 2-3 4K transcodes as well. A lot of enthusiasts use it for just this purpose, in fact, and it’s fairly snappy for uses like these.
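
    If you want to sanity-check the transcode claim on whatever box you end up buying, a rough benchmark sketch like this (assuming Intel Quick Sync via ffmpeg’s qsv codecs; sample_4k.mkv is just a placeholder file from your own library) gives you a speed number to judge by:

    ```python
    # Rough sketch: measure hardware (Quick Sync) 4K transcode speed with ffmpeg.
    # sample_4k.mkv is a placeholder; any 4K HEVC/AV1 file of your own works.
    import subprocess

    cmd = [
        "ffmpeg", "-hide_banner", "-benchmark",
        "-hwaccel", "qsv",                 # hardware decode
        "-i", "sample_4k.mkv",
        "-c:v", "h264_qsv", "-b:v", "8M",  # hardware encode at a typical remote-stream bitrate
        "-f", "null", "-",                 # discard the output; we only care about speed
    ]
    subprocess.run(cmd)
    # ffmpeg prints a speed=N.NNx figure; roughly, speed >= 2-3x suggests that
    # many simultaneous 4K transcodes should be realistic on that chip.
    ```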

    Anything more powerful than an N150 will be fine as well for 4K video viewing, transcoding, a 4K desktop, etc., so if you want to spend more and get a more powerful Intel chip you can. Just avoid the 13th/14th-generation Core i series (i5/i7/i9), especially used: bad design caused hardware damage in those, and there are a lot of messed-up ones floating around from people trying to offload them.

    144Hz may be the really tricky part. Lots of these mini boxes are capped at 60Hz, so definitely double-check that. There’s always the option of a DisplayPort-to-HDMI cable too, if the box has a DP output that supports the necessary 4K frame rate. The N150 might struggle driving that, to be honest.
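
    One way to double-check what a box will actually drive is to look at the modes the connected display advertises. A minimal sketch, assuming an X session with xrandr available (Wayland compositors have their own mode-listing tools):

    ```python
    # Rough sketch: list which refresh rates are advertised for 3840x2160.
    # Assumes an X session with xrandr installed; Wayland needs its own tooling.
    import subprocess

    out = subprocess.run(["xrandr"], capture_output=True, text=True).stdout

    for line in out.splitlines():
        line = line.strip()
        if line.startswith("3840x2160"):
            # xrandr prints the mode followed by its refresh rates, e.g.
            # "3840x2160  60.00*+  144.00  120.00  30.00"
            print(line)
    ```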

    Oh, and be aware of thermal throttling. Lots of manufacturers stuff Ultra 9 series chips into things like laptops and minis with inadequate cooling, and they thermal throttle like crazy, so you pay $800 and get something with the same performance as a properly cooled Ultra 7 or 5.

    To loop back around to whether you need a dedicated GPU: you have to ask yourself whether you’re transcoding streams for others or it’s mostly direct-play without transcoding. An integrated GPU on the CPU die should be good enough unless you have an awful lot of streams going at once or some other pressing need.

    You can run whatever distro you want. There are extremely specialized distros like OSMC (https://osmc.tv/), which is basically Kodi running on Debian without a desktop environment (extremely media-center focused).

  • afk_strats@lemmy.world · 21 hours ago

    1. You can look at manufacturers’ info pages and see what they support. Intel integrated chips usually list their capabilities, and you’ll want to double-check with your mini PC or motherboard manufacturer to make sure they support it too. I think any i5 or better from the past 5 years with integrated graphics should be able to play/decode 4K media (someone correct me if this sounds crazy). For sure my Core Ultra 265 can. As far as codec support, I’m not familiar with the compatibilities, but I’m sure everything CAN be played on recent-ish hardware. Encoding is out of my wheelhouse.

    2. I’ve used HDMI 2.1 HDR 4K120 on Linux with Nvidia, AMD, and integrated Intel. AMD will be the best experience, especially on cards from the past 5 years. Nvidia, with proprietary drivers, on 3000 series or newer should be good for a few more years; I heard the 2000 series will be dropped from support soonish. Intel HDMI 2.1 is a pain on Linux, and I’ve only been able to get HDR 4K120 using a special DP-to-HDMI cable.

  • lsjw96kxs@sh.itjust.works · 20 hours ago

    Personally, I have a GUI-less server connected to my LAN running Jellyfin server, and an Nvidia Shield plugged into the TV with an alternative launcher to get rid of ads. That way, the server’s power consumption is low while always on, and you get a good interface and a remote for the TV.

    If you really want a Linux device plugged into the TV, you could look into Plasma Bigscreen, a KDE interface made for TVs.