Loved automobiles with 4 wheels? Chinese cars have 13! In your face suckers!
Why not use the already-open DisplayPort and make it better?
noo we need yet another standard!
This was exactly what I wanted to post… 😅
Lock-in.
Thought of this too, with the addition “so we can control that market”.
Is it an open standard?
To quote the article:
a Type-B that seems to have a proprietary connector and a Type-C that is compatible with the USB-C standard.
So it's half proprietary. No thanks!
Most important question
Imagine putting out a new high-bandwidth cable standard in 2025 based on copper.
The sooner display and networking move to SFP, the better.
Today I learned DisplayPort 2.1 can carry 240W.
That’s a lot of power! Are there even any devices that use this?
PCs can use >1 kW.
I don’t know why you’d power a PC over DisplayPort though. New 8K monitors do go up to 190W, so we could exceed 240W if we try hard enough.
So if you have a beefy PSU, you should be able to power your monitor off the DP?
Or does carrying power limit data throughput?
The way it works for power over Ethernet — and I assume USB power delivery must work the same way — is that it does not reduce bandwidth because they run the power and the signal over the same wires at the same time.
There is a power injector at one end and a filter at the other end that separate out the high-frequency signal and the DC (no-frequency) power into different wires.
This is essentially the same thing as they’re already doing for multi-frequency stacking on those same wires (and on fiber) to get the crazy bandwidth in the first place. DC power is just one more low (very very low) frequency running on the same stack.
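For anyone curious, here's a toy numeric sketch of that superposition idea (the values are made up, and real PoE separates the two with center-tapped transformers rather than software), just to show that the DC and the signal simply add on the wire and can be filtered back apart:

```python
# Toy illustration (not real PoE circuitry): a DC supply and a
# high-frequency signal sharing the same wire, then being separated.
import math

DC_VOLTS = 48.0     # hypothetical supply voltage riding on the pair
SIGNAL_HZ = 1e6     # hypothetical 1 MHz data tone
SAMPLES = 1000
DT = 1e-9           # 1 ns sample spacing

# On the wire: the two simply add, so neither "steals" bandwidth.
wire = [DC_VOLTS + 0.5 * math.sin(2 * math.pi * SIGNAL_HZ * i * DT)
        for i in range(SAMPLES)]

# Receiver side: a low-pass filter (here: a plain average) recovers the DC,
# and subtracting it (like a DC-blocking capacitor) recovers the signal.
recovered_dc = sum(wire) / len(wire)
recovered_signal = [v - recovered_dc for v in wire]

print(f"recovered DC: {recovered_dc:.2f} V")           # ~48.00 V
print(f"signal peak:  {max(recovered_signal):.2f} V")  # ~0.50 V
```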
It might? I think USB uses data lanes for power delivery above some point, and I wouldn’t be surprised if DP does the same.
Hi! I actually work at a major electrical connector company, so maybe I can shed some light on this.
I have no idea.
I used to work with electrical engineers, and whenever I asked about details, they’d shrug and say, “black magic?” Checks out.
Based on this pin configuration, there’s only two dedicated power pins, which isn’t very good for large wattages. The rest are twinax signal pairs separated by ground to reduce crosstalk.
Usually when connectors are designed for power delivery, they’ll use bigger contacts to reduce the contact resistance (signal contacts tend to be small so you can fit more of them in the same space). I’m guessing the original DP connector form factor wasn’t made with such high power in mind, so it would make a lot of sense to use the spare signal pins for power delivery in this case. Running too much power through too few small pins can damage the contacts, either by instant-welding the contact surfaces or by overheating the connector (see NVIDIA GPUs). Also, high voltages can cause arcing, which even in the best case will seriously degrade any connector.
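For a rough sense of scale, here's some back-of-the-envelope I²R math; the 30 mΩ contact resistance, the 48V bus, and the pin counts are all numbers I picked purely for illustration:

```python
# Rough per-contact heating estimate. The contact resistance, bus
# voltage, and pin counts below are assumptions for illustration only.
CONTACT_OHMS = 0.030   # assumed small-pin contact resistance
TOTAL_WATTS = 480.0    # GPMI's claimed power delivery
VOLTS = 48.0           # assumed bus voltage

current = TOTAL_WATTS / VOLTS   # 10 A total

for pins in (2, 4, 8):
    i_per_pin = current / pins
    heat = i_per_pin ** 2 * CONTACT_OHMS   # watts dissipated per contact
    print(f"{pins} power pins: {i_per_pin:.2f} A each, "
          f"{heat * 1000:.0f} mW of heat per contact")
```

Heat per contact falls with the square of the per-pin current, which is why spreading the load over spare signal pins helps so much.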
Take all of this with a huge grain of salt 'cause I just learned this stuff like a month ago, and my department has nothing to do with any of it. Just thought someone might find it interesting.
They fixed it.
Running that much power next to a data line sounds like a terrible idea for signal integrity, especially if something shorts to said data lines. It just sounds sketchy, or filled with so many asterisks that it’s functionally impossible to reach their claimed throughput.
See, IDK anything about data and power and cables but I dislike the vibe when I dock my laptop with that itty bitty USB-C connector that does power and 2x monitors and networking and peripherals.
I did buy the bonkers expensive proper cable from lenovo, and it does generally just work, but maybe once every few weeks I have to unplug & re-plug.
More power and more data through the same cable just seems daft.
It’s likely DC current, which, without alternating magnetic fields, won’t degrade the signal as badly. But I wholeheartedly agree with you on power delivery. What could possibly need/use that much power‽
Yeah, considering the recent VGA power connector problems, what could possibly go wrong?
wHy Is mY tV sMoKiNg?!?42??
The option to run one cable to the monitor, or reverse-charge your laptop with one docking cable.
Maybe you could use this to daisy-chain monitors and power them all.
The option to run one cable to the monitor, or reverse-charge your laptop with one docking cable.
USB-C docks can already do this. Obviously with less power and it’s not perfect by any means, but we don’t need another technology for this. And sure, it’s two cables, one from wall outlet to integrated dock/monitor and usb-c from dock to laptop, but no matter the technology you still need something to plug in to wall outlet.
Bigass showroom screens I suppose? Maybe large sound systems?
DisplayPort and HDMI are either twisted pair or coaxial, I think. Low-frequency interference from 50 Hz AC shouldn’t bother them, but high-frequency changes in current on a power wire will.
USB standard is up to what, 40Gbps and 240W? That’s pushing the envelope already. We’ll see if this new standard can prove itself, anyways.
USB4v2 can do 80Gbps and 240W.
It can also do 120Gbps/40Gbps asymmetric.
This must be for commercial displays where it is beneficial for installation to have power and data over a single cable.
I can’t think why I would want power delivery to my PC monitor over the display cable. It would just put extra thermal load on the GPU.
I think it’s aimed at TVs in general, not computer monitors. Many people mount their TVs to the wall, and having a single cable to run hidden in the wall would be awesome.
I wonder what the use case is for 480W though. Gigantic 80" screens generally draw something like 120W. If you’re going bigger than that, I would think the mounting/installation would require enough hardware and labor that running a normal outlet/receptacle would be trivial.
Gigantic 80" screens generally draw something like 120W
In HDR mode they can draw a lot more than that for short peaks
My 50" 1080p LCD draws over 200w…
Headroom and safety factor. Current screens may draw 120W, but future screens may draw more, and it is much better to be drawing well under the max rated power.
Even in that scenario it will complicate the setup. Now your Roku will also have to power your TV? No, any sane setup will have a separate power cable for the TV.
I don’t think you’d ever have a peripheral power the tv. The use case I’m envisioning is power and data going to the panel via this single connector from a base box that handles AC conversion, as well as input (from Roku etc) and output (to soundbar etc.). Basically standardizing what some displays are already doing with proprietary connectors.
In-wall power cables need to be rated for it to prevent fire risks. This will need to have thick insulation or be made of a fire-resistant material.
It would just put extra thermal load on the GPU.
Passing power through doesn’t have to put noticeable load on the GPU. The main problem I see there is getting even more power to the GPU - Nvidia’s top cards are already at the melting point for their power connector.
Passing power through doesn’t have to put noticeable load on the GPU.
I specifically said thermal load. Power delivery always causes heat dissipation due to I²R losses.
That’s what I meant. Compared to the power the GPU is actually using, transmission losses for a pass-through should be negligible. If you have a good way to get it to the card in the first place.
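Back-of-the-envelope, with a made-up 0.02 Ω on-card pass-through resistance and an assumed 48V bus:

```python
# Rough check of "negligible": heat added to the card by passing 480 W
# through, versus the heat the GPU itself dissipates. The pass-through
# resistance (0.02 ohm) and bus voltage are assumed figures.
R_PASSTHROUGH = 0.02   # ohms, assumed trace/connector path on the card
BUS_VOLTS = 48.0       # assumed bus voltage
MONITOR_WATTS = 480.0
GPU_WATTS = 450.0      # e.g. a current high-end card's own draw

amps = MONITOR_WATTS / BUS_VOLTS            # 10 A flowing through the card
passthrough_heat = amps ** 2 * R_PASSTHROUGH
print(f"pass-through heat: {passthrough_heat:.1f} W")       # 2.0 W
print(f"vs GPU's own dissipation: {GPU_WATTS:.0f} W "
      f"({100 * passthrough_heat / GPU_WATTS:.1f}%)")       # ~0.4%
```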
Nah, it’s for powering the 1000w RTX 6090.
The popular use for power delivery through a display cable is charging a laptop from your monitor; it’s already very common with Thunderbolt or USB-4 monitors. But 480W seems a bit overkill for that.
~~Why is that better than usb-c? ~~
Wait… Power the other way. Whoops, I get it.
That already kinda allows this, and the actual load is pretty small.
Even a big 30-inch display is maybe 20 watts.
Well, power delivery goes several times that. Laptops are another very useful case for it. It’s nice to be able to have just a single connector for display and power.
You can do this to an extent, today
Even an 80” TV only uses around 150W, if my research is correct. Surely this must be aimed at massive displays.
If you’re gonna release a new standard, may as well have the headroom for future growth so it’s not outdated too soon.
Your research would be incorrect
Yeah it was a quick google search. Do you have better numbers available?
Most manufacturers only list average power draw, but in HDR mode you can get much higher peak power usage.
This website also lists peak power draw for many TVs; for example, the Bravia 9 85-inch has a peak of 380W:
https://www.displayspecifications.com/en/model-power-consumption/fca71198
Ah perfect, that makes a lot more sense to me
Now you can use one cable for two 80".
If it’s not usb-c it’s banned in EU. Because we stopped there and we won’t go forward.
I think you could have a second connector in addition to a main USB-C.
Honestly, we need higher-capacity screen cables for PCs. Both HDMI and DisplayPort are limiting performance because of their low bandwidth (40-80 Gbps). Their performance maxes out at 4K 120Hz with uncompressed HDR color. You can’t use 8K screens or multiple 4K screens without lowering quality.
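Quick sanity check on those numbers, counting raw pixel rate only (blanking intervals and line encoding push real link requirements higher, which is how 4K 120Hz ends up needing roughly 40 Gbps):

```python
# Raw uncompressed video bandwidth, assuming 10-bit-per-channel HDR RGB.
# No blanking or encoding overhead included, so real links need more.
def raw_gbps(width, height, hz, bits_per_channel=10, channels=3):
    return width * height * hz * bits_per_channel * channels / 1e9

print(f"4K 120Hz HDR: {raw_gbps(3840, 2160, 120):.1f} Gbps")   # ~29.9
print(f"8K  60Hz HDR: {raw_gbps(7680, 4320, 60):.1f} Gbps")    # ~59.7
print(f"8K 120Hz HDR: {raw_gbps(7680, 4320, 120):.1f} Gbps")   # ~119.4
```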
Where I work, everyone has two 4K screens. You can use two cables to connect them, you know…
And every one of them has either put their scaling up to 150% or simply set them to 2k, because you cannot read a damn thing on them.
More than 4k is a theoretical need for a veeeery small market
Graphics cards only come with one HDMI port, though. The LG OLED is popular as a 4K screen because it ticks all the boxes and is much cheaper than equivalent gaming monitors, but it doesn’t support DP.
And it means you have to upgrade the graphics card just for the cable, even if it is still relatively new. The point is that we shouldn’t be held back by just a cable.
the GPMI cable comes in two flavors — a Type-B that seems to have a proprietary connector and a Type-C that is compatible with the USB-C standard
I actually copied this from the article to come here to the comments and have a whinge about all the different USB-C standards, and here you are explaining the reason why.
Don’t get so excited. Read my comment again.
Actually? I don’t know much about that legislation. Does it really not have room built-in for tech improvements?
It does! If there’s a good alternative it can be proposed, or that’s what I read here on Lemmy
Won’t this heat up like a motherfucker?
It depends on the voltage used. If they run 48V, which seems to be supported by USB-C EPR, then the cable only has to carry the same 5A it’s capable of today, and the heat is the same.
For their own new connector/cable, they can use an even higher voltage or more/thicker conductors for power.
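A quick sketch of why: cable heat depends only on current (I²R), so at a fixed 5A every row below dissipates the same, while delivered power scales with voltage. The 0.2 Ω cable resistance is a made-up figure:

```python
# Cable heating depends on current, not voltage, so raising the bus
# voltage raises deliverable power without extra heat in the cable.
R_CABLE = 0.2   # ohms, assumed round-trip cable resistance
AMPS = 5.0      # USB-C cables are already rated for 5 A

for volts in (20.0, 48.0, 96.0):
    power = volts * AMPS          # watts delivered
    heat = AMPS ** 2 * R_CABLE    # I²R heat, identical in every row
    print(f"{volts:>4.0f} V x {AMPS:.0f} A = {power:>5.0f} W delivered, "
          f"{heat:.1f} W of cable heat")
```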
If it’s physically more stable and reliable than HDMI, then count me in
We already have an alternative; it’s called Thunderbolt.
No, we don’t. Apple proprietary nonsense isn’t worth the metal it’s made of.
I usually associate it more with Intel since they certify Thunderbolt devices on all the non-Apple hardware and that’s all I use. I forgot Apple had anything to do with it.