If I was AMD I'd tell them to suck my ass and reverse engineer that shit anyway. Unfortunately I'm not AMD, lol.
They don't need to RE it; they have access to the full spec and everything for their Windows drivers anyways. They'd open themselves up for litigation if they implemented this behind the forum's back though and that's something AMD (understandably) simply won't do.
They could hire dedicated teams that have no access to the full spec to RE it, and it should be above board, as long as it's done right ofc.
No, they can't. AMD is a member of the HDMI forum, which means they're contractually obligated to follow the forum's rules. In exchange, they get voting rights on decisions like this one, the right to propose changes to the HDMI standards, technical details that are protected by NDAs, etc. They wouldn't throw that all away and open themselves up to a lawsuit just for their OSS drivers.
Is there a specific contractual obligation stating that they can't hire teams who have no access to NDA-protected specs to RE HDMI products through the usual legal means? If not, then they should be well within their legal rights, tho it'd be worth consulting a lawyer first. Now, would it damage their relationship with the HDMI people? Maybe, likely.
Their contracts are most likely protected by NDAs, but they're also written by lawyers who know how to close loopholes. There's no way a SIG like the HDMI forum would allow members to release compatible products without following the rules.
Even if it isn't covered by the contract, the other members could hold a vote to remove AMD from the forum.
Hopefully AMD starts doing what Intel does and includes a DP -> HDMI 2.1 converter on the card itself. There are already third-party adapters that work reasonably well with existing AMD GPUs, especially on Linux. If they had their own implementation they could iron out the quirks and driver issues and get something that should be equivalent to real HDMI 2.1.
To hell with proprietary, binary blobs.
Why did HDMI succeed over display port? Always the same problems with closed source.
In my experience, it's because monitors are already overpriced, and adding a DisplayPort input seems to add at least another 100 on top of that.
Which is why I prefer HDMI. Less cable headache too, since I only have to keep one type of cable in stock and I can easily switch for testing/diagnostics/layout change purposes.
I... don't think DisplayPort adds 100 on top of the price. Do you have a source showing it's that much more expensive?
HDMI didn't succeed over display port; they're two different formats meant for two different audiences.
- HDMI is meant for consumer electronics like TVs / set-top boxes because it focuses on delivering a little bit of everything (audio, video, network, etc.) in a single cable for the best, easiest singular TV / device experience.
- DisplayPort is meant for computers because it focuses on delivering the best responsive multi-monitor experience.
In other words, if you are working or gaming on a computer, you should be using Display port; however, if you are using anything else, you should be using HDMI.
I just hate how my monitor came with an HDMI cable
Regardless of the cable it shipped with, check the monitor to see if it supports DisplayPort -- it most likely does. Your monitor likely came with an HDMI cable for the sake of cost reduction (I guess better than no cable at all). If the monitor doesn't support DisplayPort, it means the manufacturer didn't build that monitor to be a true computer-monitor-class product.
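Side note, since this is a Linux community: if you want to double-check which connector types your GPU exposes and which one your display is actually attached through, the kernel's DRM sysfs entries will tell you without crawling behind the desk. A minimal sketch, assuming a Linux machine with a DRM/KMS graphics driver and Python 3:

```python
# Minimal sketch (assumes Linux with a DRM/KMS driver): list the display
# connectors the GPU exposes (DP, HDMI, etc.) and whether a display is
# currently attached to each of them.
from pathlib import Path

for connector in sorted(Path("/sys/class/drm").glob("card*-*")):
    status_file = connector / "status"
    if status_file.is_file():
        # e.g. "card0-DP-1: connected" or "card0-HDMI-A-1: disconnected"
        print(f"{connector.name}: {status_file.read_text().strip()}")
```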
I know it's probably for cost cutting. But the monitor does indeed have a DP input option. Maybe the HDMI cable is included because it has built-in speakers, and as far as I know those aren't usable through DP, and I don't know if it has a separate audio input.
HDMI did have a head start, but nowadays, the answer is money. As usual.
That also includes money to upgrade, for example, display equipment in virtually every office conference room, classroom, home theater, etc. It took a long time to shake VGA in those settings and now that that's largely been dropped in favor of HDMI it'll be a tall order to chase after the next best thing with no benefit noticeable to 99.9% of people.
Fair, but the same was said about USB. We got there eventually.
It's an older interface than DP and has "better" support for audio (i.e. all of those proprietary passthrough audio formats that home theater setups support), so it became dominant in TVs. Monitors are still DP-first but likely have an HDMI port as well.
Most modern monitors have a single displayport, and then a small army of HDMIs.
That kind of makes sense though. I figure they assume you’ll have one computer hooked up and then a bunch of consumer devices that all use HDMI. And if you need a second computer hooked up you can also use HDMI if needed. Probably makes the most sense to the most people as having more DP in place of HDMI would just mean the average user couldn’t hook up as many devices since (almost?) no consumer devices use DP unfortunately.
You're forgetting that every desktop GPU has 3 DisplayPorts and only 1 HDMI, and that USB-C supports DisplayPort?
Because the movie studio execs like their HDCP DRM
Which is funny because of how easy it is to circumvent
Which in a lot of cases can be easily removed by adding an HDMI splitter in between. Fuck DRM!
Is display port still open source? I thought something happened.