I now have a capture card I can trust, the DeckLink Mini Recorder 4K from Blackmagic, and I’m running into an issue with DSR (Dynamic Super Resolution) because the capture card’s highest native resolution is 4096x2160.
DSR lets you render at a higher resolution while the GPU downscales the output to a lower one. Handy for Twitch streaming, since Twitch doesn’t support 4K.
However, because the EDID the GPU sees reports the capture card’s max resolution (4096x2160) as native, it only offers DSR at 8192x4320.
So… I need an EDID emulator to restrict the resolutions down to RGB 1080p60, so that DSR would then render at 4K (3840x2160) while the output stays 1080p60.
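To make the mechanism concrete: all an EDID emulator really does is present the GPU with a fixed 128-byte EDID block whose first detailed timing descriptor (bytes 54–71) advertises the one mode you want the GPU to treat as native. The sketch below is purely illustrative (not tied to any actual emulator’s firmware); it builds a minimal block advertising 1920x1080 with standard CEA-861 blanking and a valid checksum, and reads the mode back out.

```python
# Illustrative sketch: the shape of the EDID data an emulator presents.
# Hypothetical example, not the firmware of any real device.

def build_1080p_edid() -> bytearray:
    edid = bytearray(128)
    # Fixed 8-byte EDID header
    edid[0:8] = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])
    # First detailed timing descriptor starts at byte 54.
    # Pixel clock in 10 kHz units, little-endian: 148.50 MHz -> 14850
    edid[54:56] = (14850).to_bytes(2, "little")
    edid[56] = 1920 & 0xFF                        # horizontal active, low 8 bits
    edid[57] = 280 & 0xFF                         # horizontal blanking (CEA 1080p60), low 8 bits
    edid[58] = ((1920 >> 8) << 4) | (280 >> 8)    # high nibbles of the two values above
    edid[59] = 1080 & 0xFF                        # vertical active, low 8 bits
    edid[60] = 45 & 0xFF                          # vertical blanking (CEA 1080p60), low 8 bits
    edid[61] = ((1080 >> 8) << 4) | (45 >> 8)     # high nibbles
    # Last byte makes the whole block sum to 0 mod 256 (the EDID checksum rule)
    edid[127] = (256 - sum(edid[:127]) % 256) % 256
    return edid

def preferred_mode(edid: bytes) -> tuple[int, int]:
    """Decode the active resolution from the first detailed timing descriptor."""
    d = edid[54:72]
    h = d[2] | ((d[4] & 0xF0) << 4)
    v = d[5] | ((d[7] & 0xF0) << 4)
    return h, v
```

With a block like this in the signal path, the GPU believes the “display” is a plain 1080p60 panel, which is exactly the condition that lets DSR offer 3840x2160.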
The reason I need this is that 4K downsampling does a great job of sharpening the image. Capturing the game at native 4K via DirectX hooks hurts game performance, so I’m dedicating one GPU to OBS and feeding it the rendering GPU’s 1080p60 output: the game renders in 4K, while the OBS canvas work happens on a separate GPU with no DirectX game-capture hooks.
What’s a good EDID emulator for this? I have no clue which ones are good; I just want a preset that limits it to RGB 1080p60.
Edit: If this thread gets no responses, I’m going to assume the Dr. HDMI from HDFury is the one to get. Monoprice apparently sells it.
Did some tinkering today and proved my concept by using an HDMI AV receiver as the EDID manager/emulator. I was able to run 1920x1080 RGB 60p. The rendering GPU showed no usage increase from DirectX game-capture hooks, and I sustained a solid 4K 60 fps in Crash Bandicoot, with the GPU’s output downscaled to 1080p via DSR for Twitch and captured in pristine 1080p RGB.
The next best hardware solution for this is the Ninja V or Ninja Inferno with the DeckLink Mini Recorder 4K, but those recorders cost an arm and a leg. An EDID emulator, an HDMI splitter, and DSR on the GPU comes to only ~$150; a Ninja Inferno is $895 and a Ninja V is $695. And while you might not be able to see native 4K with the DSR solution, it’s a heck of a lot cheaper as a 4K render-and-downscale solution, and it works with a lot more games without having to mess with in-game render-scale settings.
So, back to the thread question… What’s the best EDID emulator for this?
Starting to lean back towards a StarTech-based one, because after reading the manual, it seems to be more granular in its control, being able to switch between PC and AV modes.
RTFM before seeing too many controls and freaking out, I guess.
I thought you were using Looking Glass to capture your guest; you could just as easily fake it from there, I would think.
Yes, for native 1080p. For DSR 4x (rendering at 4K and downscaling to 1080p), DSR is a better scaling method on output, because if Looking Glass is running 4K 60, it needs the DXGI overhead and the memory bandwidth to capture and carry the buffers. With no DXGI or DirectX game-capture hooks, the game gets maximum performance and a downscaled image at the same time.
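The bandwidth argument can be put in rough numbers. Assuming uncompressed 32-bit BGRA frames and no overlap or compression (back-of-envelope figures, not measurements), capturing 4K60 moves about four times the data of a 1080p60 output:

```python
# Back-of-envelope memory-bandwidth cost of moving raw frames.
# Assumes 32-bit BGRA (4 bytes/pixel), uncompressed; illustrative only.

def raw_bandwidth_gbps(width: int, height: int, fps: int, bpp: int = 4) -> float:
    """Bytes per second for raw frames, expressed in GB/s (1e9 bytes)."""
    return width * height * bpp * fps / 1e9

capture_4k = raw_bandwidth_gbps(3840, 2160, 60)     # what a DXGI capture at 4K60 would move
capture_1080p = raw_bandwidth_gbps(1920, 1080, 60)  # what the downscaled 1080p60 output carries
print(f"4K60: {capture_4k:.2f} GB/s, 1080p60: {capture_1080p:.2f} GB/s")
```

Letting DSR do the downscale in hardware means that ~2 GB/s of buffer traffic never has to be captured and copied at all; only the ~0.5 GB/s 1080p signal leaves the GPU.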
Once Looking Glass can use NvFBC and dump to a V4L2 source that OBS can grab without worrying about client scaling, I might reconsider it.
For now though, one of my games, Crash Bandicoot, causes audio issues with host-side audio capture, so I’m running that game on bare-metal Windows 7 with the same splitter-then-capture-card setup.
Ah, I get what you’re saying now. That does make it trickier. Right now I can’t even do a 480p stream until I move and have usable upload bandwidth.
Could you not get the same result with a cheap EDID plug and just fake whatever value you want with a custom resolution in the NVIDIA settings?
Edit: oh, never mind, I see what that emulator does now. That’s really neat.
Unfortunately, I have other use cases that need oddly specific configs, so it’s better to get the multi-functional device with more flexibility.
I think it’s now a necessity if you have a dumb HDMI splitter, like the infamous grey-market one sold under many brand names.
BTW, the end goal of this project is an amazing streaming GPU-passthrough machine that can work with GPU passthrough AND bare metal, so long as you have two monitors.
I also have a side project with the Renesas uPD720202.
Startech EDID Emulator get.
The handiest feature is the DVI/HDMI switch, because if an HDMI signal with audio is fed to an old monitor like the Samsung 245B/245BW, the entire display firmware freaks out: it doesn’t know what to do with the audio channels. By strictly limiting the signal to DVI mode, both my monitor and my capture card are happy. *see note
The resolution limit made DSR work properly. FINALLY.
However, using this thing with a PS3 caused issues. The PS3 doesn’t change resolutions on the fly the way a GPU that reads the EDID does; it constantly feeds out a fixed resolution, so an EDID emulator isn’t really necessary for it. GPUs, though… GPUs want to read the EDID and then match the resolution immediately.
*Note: Not all capture cards support DVI signals over HDMI. It could even be as strict as the Linux Blackmagic drivers working with DVI signals while the Windows Blackmagic drivers don’t.
So… Since I’m forced to use HDMI mode on my EDID emulator because of the capture card’s snafu with DVI mode, and my lowest-input-latency monitor is also an ancient Samsung, previewing my primary GPU output gets me glitchy video (tear-free, though) plus a full monitor-side firmware lockup (none of the buttons work).
Guess I’m gonna have to use a new monitor with native HDMI. That’s in the plans for December.
Long shot, but what if you disabled the GPU audio device?
Nope, the EDID emulator defines whether audio channels are present. And the capture card can’t work on Windows with a DVI signal; it needs an HDMI signal.