The TL;DR is that the organization that controls the HDMI standard won’t allow any open source implementation of HDMI 2.1.
So the hardware is fully capable of it, but they’d get in trouble if they officially implemented it.
Instead it’s officially HDMI 2.0 (which maxes out at 4K @ 60Hz), but through a technique called chroma subsampling they’ve been able to push that up to 4K @ 120Hz (rough math sketched below).
However, this causes some minor reductions in picture quality, and the whole thing would be much easier if the HDMI Forum were more consumer friendly.
In the meantime, the Steam Machine also has DisplayPort as a completely issue-free display option.
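For the curious, here’s the rough math behind the chroma subsampling trick (a back-of-the-envelope sketch assuming 8-bit color and ignoring blanking overhead; HDMI 2.0’s usable data rate is roughly 14.4 Gbit/s):

```python
# Back-of-the-envelope video bandwidth estimate.
# Assumptions: 8-bit color, blanking intervals ignored, so numbers are approximate.
def video_bitrate_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

# Full 4:4:4 chroma (24 bits/pixel) at 4K 120Hz blows past the HDMI 2.0 budget:
print(video_bitrate_gbps(3840, 2160, 120, 24))  # ~23.9 Gbit/s

# 4:2:0 chroma subsampling (12 bits/pixel) squeezes under it:
print(video_bitrate_gbps(3840, 2160, 120, 12))  # ~11.9 Gbit/s
```

The quality hit comes from that reduced chroma resolution, which is mostly invisible in video but can make small colored text on a desktop look fuzzy.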
Capitalism is so cool dude I love having inferior transit of 1s and 0s because some group of leeches in California own the shape that those 1s and 0s pass through
I so wish they’d included only a DisplayPort and not an HDMI port.
Fuck HDMI! All my homies use DisplayPort.
The main feature of HDMI has always been DRM.
As a Bazzite user, with it connected to my living room TV that only has HDMI ports, yeah, this was obviously why Valve said 2.1 isn’t supported at the Steam Machine reveal.
I have an HDMI splitter, like a 5-input 1-output thing. I haven’t used it in a while. Does HDMI pass through the DRM or is the DRM in the splitter?
The source device (the Steam Machine in this case) will check with the display to see the highest HDMI standard they both support. It may also check whether your splitter supports it, but I suspect the splitter is just a passthrough device.
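If you want to see what your own display advertises in that handshake, the kernel exposes the EDID it read. A minimal sketch for Linux (the connector name card1-HDMI-A-1 is just an example; check /sys/class/drm/ for yours):

```python
# Minimal sketch: dump the EDID the kernel read from the connected display.
# The connector name varies per machine -- look under /sys/class/drm/ for yours.
from pathlib import Path

edid = Path("/sys/class/drm/card1-HDMI-A-1/edid").read_bytes()
print(f"{len(edid)} bytes of EDID")  # 0 bytes means nothing is connected on that port
print(edid[:8].hex())                # a valid EDID starts with 00ffffffffffff00
# Pipe that file through the `edid-decode` tool to see the supported
# resolutions, refresh rates, and HDMI capabilities in plain text.
```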
I figured. I also used the wrong term, it is a switch.
I know some HDMI switches will, some won’t, and others will strip the DRM and let the picture through. I had to try several to get a conference room TV to work with an HDMI auto switch. Funny, it was the cheaper model on Amazon lol
AMD already spent a significant amount of effort implementing HDMI 2.1 in their open driver in such a way that it would be compliant. The suits from the HDMI consortium still said no.
https://www.phoronix.com/news/HDMI-2.1-OSS-Rejected
AMD Linux engineers have spent months working with their legal team and evaluating all HDMI features to determine if/how they can be exposed in their open-source driver. AMD had code working internally and then the past few months were waiting on approval from the HDMI Forum… Sadly, the HDMI Forum has turned down AMD’s request for open-source driver support.
AMD Linux engineer Alex Deucher commented on the ticket:
"The HDMI Forum has rejected our proposal unfortunately. At this time an open source HDMI 2.1 implementation is not possible without running afoul of the HDMI Forum requirements."I’m honestly surprised TV OEMs haven’t bothered to at least try throwing in DisplayPort, especially during the period of time it far exceeded the highest possible quality on HDMI.
HDMI is just the last hardware standard created from the ashes of the format wars that has no practical place anymore. It only exists to collect hostage licensing fees.
TV OEMs are the ones that set up the HDMI club. They want the content encrypted with DRM in transit, from your PC, through your cable, to your screen. Look up the analog hole. This battle has been going on for 20 years. Share this with interested people.
I don’t know why they’d think I’d capture 600MB/s of uncompressed video though.
Since the torrent sites are crammed with full-quality 4K Blu-ray remuxes and WebDLs direct from Amazon, there are clearly easier and better ways of doing this than putting encryption in a cable.
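Rough numbers, if anyone’s curious why capturing the raw signal makes no sense (a sketch assuming 8-bit color and ignoring blanking; the Blu-ray figure is the approximate ceiling for UHD discs):

```python
# Ballpark: raw 4K60 video vs. what actually gets shipped around.
# Assumptions: 8-bit color, blanking intervals ignored, so treat as approximate.
def raw_mb_per_s(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 8 / 1e6

print(raw_mb_per_s(3840, 2160, 60, 24))  # ~1500 MB/s uncompressed 4:4:4
print(raw_mb_per_s(3840, 2160, 60, 12))  # ~750 MB/s with 4:2:0 subsampling

# A UHD Blu-ray video stream tops out around 128 Mbit/s, i.e. roughly 16 MB/s --
# orders of magnitude smaller, so nobody sane captures the raw HDMI feed.
```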
Oh, they are working on that too. Win 11 depends on TPM modules, and it’s not by mistake. Once they have the full software and hardware pipeline, they can control the media we see (for profit, but also for authoritarianism, because the jump is so small).
I love that you’re talking about these issues, but the TPM has nothing to do with any of this. It’s also not a hard requirement for Windows 11 (even though that’s basically all the media was talking about).
DRM was one of the TPM’s stated objectives 20 years ago. They backtracked at the time because of serious backlash. Now it’s present and a dependency. Then it will be needed for banking. Afterwards, for social media and social credit like in China.
If you think they are going to stop where we are…
I think you’re confusing TPM (Trusted Platform Module) with TEE (Trusted Execution Environment).
If not, I’d love some links to read about that TPM with DRM from 20 years ago. However… I don’t know, why would there be backlash about something that has then been implemented anyway, just in TEE instead?
Because TV OEMs are the ones in the HDMI consortium.
I hated HDMI when it came out, and I continue to hate it.
And what do you use?
DVI, of course
Displayport
Component RCA like god intended
DisplayPort?
I see.
Fuck HDCP.
I tried to stick with DP only, but TVs with it are getting rarer and much more expensive.
Buy monitors instead. Also saves you the headache of dealing with “smart TV” bullshit.
While I agree, I think you might not find a monitor at that size. TVs and monitors have different use cases:
Monitor:
- High DPI
- Low distance to viewer
- Must support high resolution, high frame rate for games
- Must be compact enough for a desk
TV:
- Low DPI
- Greater distance to viewer
- 120Hz and 1080p are enough for most movies and shows
- Price and size matter more than technical aspects
Hard to find monitors larger than a certain size that aren’t exorbitantly expensive, and I do like a large screen when it comes to Couch Gaming and watching TV (well, streaming video, I ain’t gonna pay for a TV license just to watch the one terrestrial TV show I actually care about)
Do you know of any big dumb TVs or monitors that I could buy in Europe? I only know of Sceptre TVs which are mostly meant for businesses and storefronts but they are extremely hard to get in Europe.
Biggest I found is the Acer Nitro XV275KP3 Gaming Monitor. You kinda pay gaming hardware prices, but given the support for up to 120Hz and HDR10, I was OK with that. It’s mostly used for gaming, anyway.
It used to be that projectors were a way to sidestep the “smart” bullshit, but they started adding that, too. Even the business ones.

That one is 27". I think that’s way too small for the living room. I also consider 120Hz to be overkill for some couch gaming and movies.
Do let me know if you find something more suited.
A 27" monitor will not be anywhere near a replacement for 60" TV, I’m afraid.
fuck HDMI
all my homies hate HDMI
Why?
You can look up most of the issues with the standard, but TL;DR: DRM, expensive licensing fees, suboptimal performance compared to DisplayPort, and not able to be implemented in FOSS or even OSS systems (because of the shitty DRM which can be circumvented by an AliExpress splitter lmao)
CEC implementations aren’t consistent, even though CEC really should be a standard part of HDMI.
With TVs starting to get USB-C inputs, which are DisplayPort under the hood, hopefully HDMI fucks off.
Ballparking but it will likely take closer to a decade than not for that to actually happen… and I am still not optimistic. And there are actually plenty of reasons to NOT want any kind of bi-directional data transfer between your device and the TV that gets updated to push more and more ads to you every single week.
The reason HDMI is so successful is that the plug itself has not (meaningfully?) changed in closer to 20 years than not. You want to dig out that PS3 and play some Armored Core 4 on the brand new 8k TV you just bought? You can. With no need for extra converters (and that TV will gladly upscale and motion smooth everything…).
Which has added benefits because “enthusiasts” tend to have an AV receiver in between.
The only way USB-C becomes a primary for televisions (since DisplayPort and USB-C are arguably already the joint primary for computer monitors) is if EVERY other device migrates. Otherwise? Your new TV doesn’t work with the PS5 that Jimmy is still using to watch NFL every week.
there are actually plenty of reasons to NOT want any kind of bi-directional data transfer between your device and the TV
I’ve got bad news for you about HDMI then…
USB-C adapters for absolutely everything are thankfully quite common now thanks to the laptop/dock industry.
In the sense that we have dongles/docks, sure. In the sense of monitors with native USB-C input? These are still fairly rare, as the accepted pattern is that your dock has an HDMI/DP port and you connect via that (which actually is a very good pattern for laptops).
As for TVs? I am not seeing ANYTHING with USB-C in for display. In large part because the vast majority of devices are going to rely on HDMI. As I said above.
I’ll also add that many (most?) of those docks don’t solve this problem. The good ones are configured such that they can pass the handshake information through. I… genuinely don’t know if you can do HDCP over USBC->HDMI as I have never had reason to test it. Regardless, it would require both devices at the end of that chain to be able to resolve the handshakes to enable the right HDMI protocol which gets us back to the exact same problem we started with.
And the less good docks can’t even pass those along. Hence why there is a semi-ongoing search for a good Switch dock among users and so forth.
Regarding the Nintendo Switch, it’s because of Nintendo’s malicious, nonstandard USB-C protocol design, which makes the console “not behave like a good USB citizen should”. It’s less of an issue with the peripherals as a whole.
USB-C probably cannot replace either, because the unmating force is too light. A typical HDMI or DisplayPort cable is much thicker, longer, and hence heavier than a typical USB-C cable (even those specced to carry high bandwidth, like a Thunderbolt cable), because they need better shielding to carry high-bandwidth signals over long distances. It’s not unusual to need to route HDMI several metres, but USB-C cables that long are unusual because of their different purposes.
For TVs and such it’s useful to have the inputs connect vertically, so that they don’t stick out the back of the device and cause problems pushing it against a wall. Then the weight of the end of the cable is going to be trying to pull the connector out of the TV. DisplayPort connectors can have a latch to deal with this.
Of course, there are ways around this: a new connector, for example. But it does mean that you can’t just leverage the existing pool of USB-C connectors and cables to make this ubiquitous.
For what it’s worth, this is also a problem with HDMI (but not DP).
But just have the USB-C insert top-down instead of bottom-up, and include room for a small loop and cable retention to ensure slack doesn’t put pressure on the port. This easily allows for fixed connections with USB-C.
There are also side-screw locking connectors for USB-C. With HDMI, a top-screw option was made for more fixed-install scenarios. That design is ugly af and uses massively more room than the USB-C screw-lock approach.
Could it be done with a tiny magnet?
Anything important enough to be secured, you probably don’t want to involve a magnet.
Oh, screw lock as in like some PC tower cables? Yes, that would be really nice, I wouldn’t mind that for a phone.
Exactly that.
It’s still mostly found on commercial devices, but it is defined.
https://www.usb.org/document-library/usb-type-cr-locking-connector-specification
As an example:
https://www.startech.com/en-us/cables/usb31ccslkv1m
Though again, a lot of this could be mitigated with a loop, strain relief, and inserting from above into a port rather than letting gravity pull it down. But it would be nice to see side-screw locks become more common with USB-C.
That USB-C connector with side-screw locks looks like a future I want to live in.
It’s futuristic, but it’s open and not corporate. It’s miniaturized and sleek, yet still mechanically rugged.
A good USB-C cable and port can hold quite a bit of weight; I’ve easily picked my phone up by it, as long as you don’t make any jerking movements. That’s a lot more weight than a few feet of even a very heavily shielded cable.
Then the weight of the end of the cable is going to be trying to pull the connector out of the TV.
Just duct tape the USB cable to the back of the TV
Solvable by moving the locking mechanism out of the port and making one that you can retrofit to any cable
The connectors on the back of the TV can be oriented horizontally (like parallel to the screen, not perpendicular), which at least changes the pull force to a torque force, which isn’t ideal but easier to hold on to.
DisplayPort rocks
Yeah fuck the video codec mafia and all these proprietary shits like HDMI
Fun fact: all of the audio codecs are proprietary too. You won’t find an HDMI surround-sound splitter on AliExpress. Say no to HDMI, say no to eARC.
You know it reminds me of the academic publishing mafia of Elsevier and the like
Both cartels are leeching off often-publicly funded research.
I used to find DisplayPort took forever to start showing a picture compared to HDMI on my PC. I’m getting a new GPU, so maybe that will improve things.
Also, what display are you using? That sounds quite unusual.
If you can find anything to connect it to.
HDMI needs to die.
I mean, the many incarnations of USB-C are slowly making headway. For better and worse.
governments should start cracking down on codecs. tf are dipshits allowed to hold standards hostage?
They really just need to demand that open formats are implemented in parallel with any proprietary ones, with no artificial feature/performance disparity allowed.
That kills any incentive to keep the proprietary ones locked down, because eventually the open formats will be available throughout the ecosystem and users will have devices with support in the entire pipeline. Then users will simply stop putting up with the locked-down formats, and nobody will want to sell them.
Proprietary formats should be illegal. Consumers are idiots, marketing will convince them to support proprietary, and regulatory capture will compromise any attempt to stop disparity
It pisses me off that you gotta pay so much money to look at the official ISO 8601.
The good news is RFC 3339 doesn’t have this problem and is an unambiguous subset of ISO 8601.
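For example (a quick sketch in Python; with a timezone attached, datetime.isoformat() output is valid under both RFC 3339 and ISO 8601):

```python
from datetime import datetime, timezone

# One timestamp, valid as both RFC 3339 and ISO 8601:
print(datetime.now(timezone.utc).isoformat())
# e.g. 2025-06-01T12:34:56.789012+00:00
```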
Dated joke
Is this why DisplayPort looks better for me on Linux???
Yes. DP is the right choice for civilized people.
Yep it’s pretty much better in all regards.

The only downside is no ARC support, but I suppose support for that is pretty hit or miss anyway.
Honestly, ARC is a great idea that never seems to work for me. It’ll always be RIGHT there, but then my Blu-ray player turns on randomly when I’m doing something else, or something like that. So I end up turning it off.
At this point, just make an “adapter” that captures the DisplayPort signal and outputs it from a “supported” device
At this point stop using HDMI
And players that want to avoid the issue can use the Steam Machine’s DisplayPort 1.4 output, which has plenty of bandwidth for high resolutions and refresh rates (and which can be converted to an HDMI signal with a simple dongle).
So, ship with a dongle.
Doesn’t the system driver need to support the standard?
That’s what I’m saying, it would be more complicated than a dongle. The PS5 has some sort of system that handles this; it would essentially be a device that supports it, that just decodes and re-encodes the video feed. As dumb as this sounds, it’d be the only solution that works on most TVs.