Why wouldn't a fully analog (read: dumb) KVM work?

Purely based on the fact that such a product doesn't exist, my assumption is that a setup as simple as a pair of MOSFETs for each line in a DisplayPort cable wouldn't function. I'm really interested to know WHY, though, so I thought I'd pose the question here before I do any meatspace tests.

  • Is it as simple as the impedance of each MOSFET degrading the signal to the point of uselessness?

  • Is there some kind of initial communication between the monitor and PC that isn’t triggered with such a setup?

  • Would the act of routing a DP signal over a PCB introduce too much noise?

Would love to hear your thoughts, especially if you’ve got experience with high frequency electrical design.

In the DVI and VGA days you could get KVMs which physically switched between inputs.

I imagine it's because HDMI and DP are much higher frequency, and the impedance through a single MOSFET is probably enough to distort the signal.

There are probably MOSFETs that would work, but then you have to create some fairly complicated traces to ensure your board doesn't become a giant antenna and that your signal stays within spec (see RAM boards for an example of this). This would also be quite expensive, since you would need at least a couple of FETs per signal line, and an HDMI cable has 19 wires. A single KVM chip simplifies a lot of those things.
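To put rough numbers on the FET idea: a pass transistor looks roughly like its on-resistance driving its own parasitic capacitance, i.e. a first-order RC low-pass. The values below are illustrative assumptions, not from any datasheet, but they show why a generic MOSFET chokes on multi-gigabit lanes:

```python
# Back-of-the-envelope: series on-resistance + parasitic capacitance
# form an RC low-pass. Corner frequency f = 1 / (2*pi*R*C).
import math

def rc_cutoff_hz(r_on_ohms: float, c_parasitic_farads: float) -> float:
    """-3 dB corner frequency of a first-order RC low-pass."""
    return 1.0 / (2.0 * math.pi * r_on_ohms * c_parasitic_farads)

# Hypothetical small-signal MOSFET: ~5 ohm on, ~20 pF total capacitance
f3db = rc_cutoff_hz(5.0, 20e-12)
print(f"{f3db / 1e9:.2f} GHz")  # ~1.59 GHz corner
```

A ~1.6 GHz corner is hopeless against lanes running at several Gbps, which is why the "works for VGA" trick stops scaling.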

EDIT:

  • HDMI is stupidly high frequency; 2.1a is 48 Gbps! At 4K60 we are talking an almost 600 MHz pixel clock, which would be a pain to lay out (just look at the input PCB of a 100 MHz scope, the level of shielding is insane)
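The "almost 600 MHz" figure checks out with a quick back-of-the-envelope (the 4400 × 2250 total timing is the standard CTA-861 4K60 mode; the function name is just mine):

```python
# Pixel clock = total horizontal pixels * total vertical lines * refresh,
# where "total" includes the blanking intervals, not just the 3840x2160
# active area.
def pixel_clock_hz(h_total: int, v_total: int, refresh_hz: float) -> float:
    return h_total * v_total * refresh_hz

clk = pixel_clock_hz(4400, 2250, 60)
print(f"{clk / 1e6:.0f} MHz")  # prints "594 MHz"
```

And in legacy TMDS mode that 594 MHz gets serialized 10:1 per lane, i.e. 5.94 Gbps on each data pair.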

I believe the reason is at least partially your second listed point. A lot of modern KVMs need to emulate an EDID to the graphics card so it doesn't think the monitor is unplugged and go wonky.
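For anyone curious what the GPU is actually checking: it reads a 128-byte EDID block over the DDC (I2C) lines and expects a fixed header plus a valid checksum. A minimal sketch (the helper name is mine; the 8-byte header and mod-256 checksum rule come from the EDID spec):

```python
# EDID base blocks start with a fixed 8-byte header, and all 128 bytes
# must sum to 0 mod 256. If a dumb switch leaves DDC floating, this
# read fails and the GPU treats the display as disconnected.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def looks_like_valid_edid(block: bytes) -> bool:
    return (
        len(block) == 128
        and block[:8] == EDID_HEADER
        and sum(block) % 256 == 0  # checksum byte makes the sum wrap to 0
    )

# Toy block: header + zero padding + checksum byte (header sums to 1530,
# so 6 more brings the total to 1536 = 6 * 256)
fake = EDID_HEADER + bytes(119) + bytes([6])
print(looks_like_valid_edid(fake))  # True
```

EDID-emulator KVMs answer these DDC reads themselves so the GPU never sees a hotplug event when you switch away.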


The short answer is “rightsholders”/Hollywood. It's horrifying.

For the 10-gigabit data pairs this is essentially what the L1 KVM does, and it's why the loss of the cables on both sides of the KVM is additive. If I don't decode the signal on those four pairs, I sidestep literal and figurative insanity.

This is a loophole the HDMI people are desperate to close so they can reach into my wallet for “licensing”, even though IMHO such licensing regimes are at best an antitrust problem and at worst a pox upon humankind.


The days of downtime caused by software license bullshit should have caused a universal uprising of the non-smooth-brained part of the population.


I didn’t even think about licensing issues, that’s gross.

What's been in my head lately is a compact device that plugs directly into the video ports on a monitor and just synchronizes the switching, either wirelessly or with a connecting cable. That cuts one of the cables out of the equation. Maybe it doesn't matter, since the total length would be the same, but it feels like it would.

Then just run two (or more) cables into each module. Maybe not the best for cable management, especially above two.

Oh, I thought this was gonna go more physical: taking each wire in, splitting its pins onto a loom, then having contacts for the loom to engage with?
Loom goes up, output 1 is connected; loom goes down, output 2.

But that would be a lot of wires/pins… and like the big W mentioned, it might break HDCP? The negotiated connection for DRM content.

Fun long-form read re: HDCP: Reposted: A cryptanalysis of HDCP v2.1 – A Few Thoughts on Cryptographic Engineering
