[-] darth_helmet@sh.itjust.works 277 points 11 months ago

Sounds like the HDMI Forum are a bunch of twats. Time for a new format.

[-] dinckelman@lemmy.world 310 points 11 months ago

DisplayPort already exists

[-] Tja@programming.dev 123 points 11 months ago

We cannot have two standards, that's ridiculous! We need to develop one universal standard that covers everyone's use cases.

[-] SlopppyEngineer@lemmy.world 127 points 11 months ago* (last edited 11 months ago)

There are now three competing standards.

https://xkcd.com/927/

[-] baropithecus@lemmy.world 43 points 11 months ago

I know what you're referencing, but DisplayPort already covers everybody's use cases.

[-] sunbeam60@lemmy.one 20 points 11 months ago
[-] Holzkohlen@feddit.de 23 points 11 months ago* (last edited 11 months ago)

And what does that use? That's right, it's DisplayPort Alternate Mode! Oh, you've got Thunderbolt? Guess what, also DisplayPort!

[-] Tattorack@lemmy.world 19 points 11 months ago

Yes, I agree. And it needs to be open bloody source!!

[-] darth_helmet@sh.itjust.works 55 points 11 months ago

Hard to find on non-PC gear, but that's a fair point.

[-] altima_neo@lemmy.zip 27 points 11 months ago
[-] BetaDoggo_@lemmy.world 85 points 11 months ago

USB-C display output uses the DisplayPort protocol.

[-] ABCDE@lemmy.world 24 points 11 months ago

Can it use others, and is there a benefit? USB-C makes a lot of sense: lower material usage, small, carries data and power, and connects to almost everything now.

[-] BetaDoggo_@lemmy.world 52 points 11 months ago

I believe USB-C is the only connector supported for carrying DisplayPort signals other than DisplayPort itself.

The biggest issue with USB-C for display, in my opinion, is that cable specs vary so much. A cable with a Type-C end could carry anywhere from 60 to 10,000 MB/s and deliver anywhere from 5 to 240 W. What's worse is that most aren't labeled, so even if you know which spec you need, you're going to have a hell of a time finding it in a pile of identical black cables.

Not that I dislike USB-C. It's a great connector, but the branding of USB has always been a mess.
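
For a rough sense of where that 60 to 10,000 MB/s spread comes from, here's a small sketch (a simplification of my own, not taken from any spec text) converting the nominal USB signaling rates into the units used above:

```python
# Rough sketch of common USB-C cable data tiers, converted to MB/s.
# Nominal signaling rates only; encoding overhead and DisplayPort alt-mode
# lane assignments are ignored, so treat these as ballpark figures.
USB_TIERS_GBPS = {
    "USB 2.0": 0.48,         # 480 Mbit/s -> ~60 MB/s
    "USB 3.2 Gen 1": 5,
    "USB 3.2 Gen 2": 10,
    "USB 3.2 Gen 2x2": 20,
    "USB4": 40,
    "USB4 v2": 80,           # 80 Gbit/s -> ~10,000 MB/s
}

def gbit_to_mbyte_per_s(gbps: float) -> float:
    """Convert gigabits per second to megabytes per second (decimal units)."""
    return gbps * 1000 / 8

for name, gbps in USB_TIERS_GBPS.items():
    print(f"{name:16s} {gbit_to_mbyte_per_s(gbps):8.0f} MB/s")

# Power Delivery spans the 5 V baseline (a few watts) up to 240 W
# (48 V / 5 A, Extended Power Range), hence the 5-240 W spread above.
```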

[-] strawberry@kbin.run 17 points 11 months ago* (last edited 11 months ago)

It would be neat to somehow have a standard color coding, kind of how USB 3 is (usually) blue. Maybe there could be thin bands of color on the connector?

Better yet, maybe some raised bumps so visually impaired people could feel what type it is. For example, one dot is USB 2, two could be USB 3, etc.

[-] Flipper@feddit.de 20 points 11 months ago

Have you looked at the naming of the USB standards? No you haven't, otherwise you wouldn't make this sensible suggestion.

[-] Player2@lemm.ee 21 points 11 months ago

USB-C is just a connector; you might be referring to DisplayPort over USB-C, which is basically the same standard with a different connector on the end. That, or Thunderbolt, I guess.

[-] Zellith@kbin.social 40 points 11 months ago

More people should try DP.

[-] wjrii@kbin.social 36 points 11 months ago

I thought I had NSFW turned off... 🤣

[-] Zellith@kbin.social 20 points 11 months ago

( ͡° ͜ʖ ͡°)

[-] lolcatnip@reddthat.com 15 points 11 months ago

As already mentioned, DisplayPort exists. The problem is adoption. Even getting DisplayPort adopted as the de facto standard for PC monitors hasn't done anything to get it built into TVs.

[-] TheImpressiveX@lemmy.ml 196 points 11 months ago
[-] LifeInMultipleChoice@lemmy.world 22 points 11 months ago

Is there a reason DisplayPort has so many connection issues specifically on port replicators (docking stations), or a way to prevent them?

In corporate environments I often find that you plug them in over and over, unplug them over and over, and check the connections a million times, before finally turning everything off, holding the power button on everything (kind of like an SMC reset), and then booting it all up the way you originally did, and the displays come up. Is this a result of the devices trying to remember a previous setup, or is there an easy way to avoid it?

I've hooked up dozens of them and still ran into issues when a family member brought a setup home to work from while they were sick last week.

[-] jlh@lemmy.jlh.name 146 points 11 months ago

This is really frustrating. This is the only thing holding Linux gaming back for me, as someone who games with an AMD GPU and an OLED TV. On Windows, 4k120 works fine, but on Linux I can only get 4k60. I've been trying to use an adapter, but it crashes a lot.

AMD seemed to be really trying to bring this feature to Linux, too. Really tragic that they were trying to support us, and some anti-open source goons shot them down.

[-] Sudomeapizza@lemm.ee 22 points 11 months ago* (last edited 11 months ago)

I've found that the issue, in my experience, is that X11 only supports a max of 4k60, but Wayland supports 4k120 and beyond. I don't think the cable matters, as the same cable I'm using works on Windows at 4k160.

[-] generalpotato@lemmy.world 85 points 11 months ago

Should… should we sic the EU on them?

[-] Blackmist@feddit.uk 66 points 11 months ago* (last edited 11 months ago)

So why is it rejected?

Just because they're still trying to use HDMI to prevent piracy? Who in fuck's name is using HDMI capture for piracy? On a 24fps movie, that's 237MB of data to process every second just for the video. A 2-hour movie would be 1.6TB, and with the audio on top it would likely be over 2TB.

I've got a Jellyfin server packed with 4K Blu-ray rips that suggest there are easier ways to get at that data.
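
To put rough numbers behind that, here's a back-of-the-envelope sketch; the resolution, frame rate, and pixel formats are my own assumptions, and the exact figure depends on bit depth and chroma subsampling, but any reasonable choice lands in the hundreds of MB/s and multiple terabytes per film:

```python
# Back-of-the-envelope: uncompressed 4K video data rates.
# Assumed parameters: 3840x2160 at 24 fps, two common pixel formats.
WIDTH, HEIGHT, FPS = 3840, 2160, 24
RUNTIME_S = 2 * 60 * 60  # a 2-hour movie

BYTES_PER_PIXEL = {
    "8-bit RGB (3 bytes/pixel)": 3.0,
    "10-bit 4:2:0 (1.875 bytes/pixel)": 1.875,  # 1.5 samples/pixel * 10 bits / 8
}

for fmt, bpp in BYTES_PER_PIXEL.items():
    rate_bytes_per_s = WIDTH * HEIGHT * FPS * bpp   # bytes per second of video
    total_bytes = rate_bytes_per_s * RUNTIME_S      # bytes for the whole film
    print(f"{fmt}: {rate_bytes_per_s / 1e6:.0f} MB/s, "
          f"{total_bytes / 1e12:.1f} TB for a 2-hour film")
```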

[-] buddascrayon@lemmy.world 43 points 11 months ago

The CEOs of the media companies are all fucking dinosaurs who still think VCRs should have been made illegal. You will never convince them that built-in copy protection is a dumb idea and a waste of time.

[-] Godort@lemm.ee 50 points 11 months ago

What's the over/under that this was about preventing people getting around HDCP using a modified driver?

[-] Declamatie@mander.xyz 41 points 11 months ago

Alright, from now on I will never again buy any electronics with HDMI.

[-] csolisr@hub.azkware.net 37 points 11 months ago

If we had to rely exclusively on non-proprietary protocols, I doubt that GNU/Linux would have gone anywhere beyond the Commodore 64.

[-] riskable@programming.dev 95 points 11 months ago* (last edited 11 months ago)

Linux never ran on the Commodore 64 (1982). That was way before Linux was released by Linus Torvalds (1991).

I'd also like to point out that we do all rely on non-proprietary protocols. Examples you used today: TCP and HTTP.

If we didn't have free and open source protocols we'd all still be using Prodigy and AOL. "Smart" devices couldn't talk to each other, and the world of software would be 100-10,000x more expensive and we'd probably have about 1/1,000,000th of what we have available today.

Every little thing we rely on every day from computers to the Internet to cars to planes only works because they're not relying on exclusive, proprietary protocols. Weird shit like HDMI is the exception, not the rule.

History demonstrates that proprietary protocols and connectors like HDMI only stick around as long as they're convenient, easy, and cheap. As soon as they lose one of those properties a competitor will spring up and eventually it will replace the proprietary nonsense. It's only a matter of time. This news about HDMI being rejected is just another shove, moving the world away from that protocol.

There actually is a way for proprietary bullshit to persist even when it's the worst: When it's mandated by government.

[-] Fisch@lemmy.ml 40 points 11 months ago

AV1 is a good example of a non-proprietary codec replacing proprietary ones (H.264, H.265, ...).

[-] tabular@lemmy.world 32 points 11 months ago* (last edited 11 months ago)

Why? Most software wasn't proprietary before companies realized they could make more money at your expense (not all the profit is going into making a better product).

If given the choice between an uncomfortable dormitory and a comfortable jail, at least the residents can improve the living areas in the former.

[-] rtxn@lemmy.world 17 points 11 months ago

Parent is right though. Unix being proprietary is why the GNU project was started, and why the Linux kernel and BSDs rose above.

[-] CosmicCleric@lemmy.world 35 points 11 months ago

No disrespect meant towards GamingOnLinux, but this article from Tom's Hardware has a much better description of what's going on, including quotes.

[-] autotldr@lemmings.world 33 points 11 months ago

This is the best summary I could come up with:


If you were hoping at some point to see HDMI 2.1+ on Linux with AMD + Mesa, you're out of luck right now as it's simply not going to be happening.

There's been a bug report on the Mesa GitLab of "4k@120hz unavailable via HDMI 2.1" that's been open for a few years now, with lots of comments and chatter about the issue.

In an update on the bug report, AMD engineer Alex Deucher commented: "The HDMI Forum has rejected our proposal unfortunately."

So if you're on Linux, it's going to continue to be best to buy hardware that uses DisplayPort.

On the NVIDIA side though, it seems like it may not be an issue, as developer Karol Herbst wrote on Mastodon: "Even though AMD might not be able to add support for HDMI 2.1, nouveau certainly will as Nvidia's open source driver also supports HDMI 2.1 so there is no reason to believe that at least some drivers can't support HDMI 2.1

It's quite backwards, but apparently having all the logic inside firmware (like Nvidia does) will probably help us implementing support for HDMI 2.1"


The original article contains 244 words, the summary contains 183 words. Saved 25%. I'm a bot and I'm open source!

[-] FartsWithAnAccent@kbin.social 22 points 11 months ago

Boo! Get off the stage HDMI Forum!

[-] WeirdGoesPro@lemmy.dbzer0.com 21 points 11 months ago

ELI5: what are the security risks of my HDMI cable?

[-] thedirtyknapkin@lemmy.world 83 points 11 months ago

Piracy being easier is the only risk. Once again, they're ruining the experience of legitimate customers to try to stop something they've had no success at even slowing down.

[-] Acters@lemmy.world 26 points 11 months ago

Even further, it made products more expensive to buy, with all the dumb licensing fees that the middlemen try to shoehorn in.

[-] fidodo@lemmy.world 16 points 11 months ago

The security of megacorp IP.

[-] NoLifeGaming@lemmy.world 19 points 11 months ago

Always thought that DisplayPort is better anyway, lol. Is there anything HDMI does or has that DisplayPort doesn't?

[-] BetaDoggo_@lemmy.world 16 points 11 months ago

VESA or bust
