I'm thinking about using my graphics card as a trigger for some external devices.
The idea came to me because I'm computing images and showing them at about 50 Hz on a monitor, and I've got a laser that I want to trigger in a specific pattern synchronized to the images.
So, for example, I show an image on the monitor and 5 ms later I want to send a trigger signal (e.g. the red channel of the VGA output gets the value 255).
Then, before showing the next image, I set the trigger low again and the cycle begins anew.
Does anyone know whether it's possible to put a high signal on the VGA output without a real monitor behind it, just some cables used for the trigger?
You might look up the VGA spec and how each pin carries its signal. My way of approaching this would be to first research the VGA pinout, then check whether your driver supports manually driving a signal on each pin. If not, you'll have to figure out a way to trick it into thinking a monitor is connected, and then set the color that corresponds to a high signal.
Using serial over USB would be much easier, since that's what it's there for. Just grab an Arduino and wire it up.
I'd have to agree with @dwd002; an Arduino would be much simpler. As for emulating a monitor, you can pick up dummy plugs (terminators), connectors with no cable that emulate a display, for a couple of bucks. You could stick a VGA extension in between and hack at the cable to intercept the signal. Maybe try using Processing for generating a video signal (similar to Arduino).
I would think you can, but there's probably some kind of faking you'd have to do to trick the video card into thinking a monitor is plugged in.
If you can do that, you can measure the output of the red pin at full red and tie that to an input.
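For reference, the VGA analog color lines nominally swing from 0 V (black) to about 0.7 V (full intensity) into a 75 Ω termination, so "full red" is only around 0.7 V and would typically need a comparator or amplifier before it can drive a logic input. A quick back-of-envelope sketch of that mapping:

```python
# Rough sketch: VGA analog color channels span 0 V (black) to a nominal
# ~0.7 V (full intensity) into 75 ohms, so an 8-bit channel value maps
# linearly onto that range.
FULL_SCALE_V = 0.7  # nominal VGA analog full-scale voltage

def channel_to_volts(value: int) -> float:
    """Map an 8-bit color channel value (0-255) to the expected pin voltage."""
    if not 0 <= value <= 255:
        raise ValueError("channel value must be 0-255")
    return value / 255 * FULL_SCALE_V

print(channel_to_volts(255))  # full red: ~0.7 V
print(channel_to_volts(0))    # black: 0 V
```

That 0.7 V level is well below typical 3.3 V/5 V logic thresholds, which is worth keeping in mind before tying it straight to an input pin.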
Should be doable, but it might be a lot easier to use a protocol like SDI (used in broadcast for camera control), or to put a monitor in between / in parallel with your solution.
My understanding is that all you'd need to trick your GPU into thinking it was connected to a monitor is:
- a fake I2C data signal on pin 12 and the I2C clock on pin 15 (the DDC bus)
- an MCU capable of faking said signal
You could then theoretically tell your GPU via I2C that it's connected to a 50 Hz monitor, use the vsync signal as your data clock for the trigger, and pull pins high or low on the other side by pushing color data via OpenGL.
This is an incredibly inefficient way of doing things, though.
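To make the I2C idea concrete: what the GPU reads over those DDC pins is a 128-byte EDID block whose last byte is a checksum chosen so the whole block sums to 0 mod 256. A minimal sketch of that framing (everything past the 8-byte header is zero-filled purely for illustration; a real block would also need manufacturer ID, timing descriptors, etc., which an MCU would have to serve):

```python
# Sketch of EDID framing. A real block needs valid manufacturer/timing
# fields; only the fixed header and checksum rule are shown here.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def make_skeleton_edid() -> bytes:
    """Build a 128-byte zero-filled EDID skeleton with a valid checksum."""
    block = bytearray(128)
    block[0:8] = EDID_HEADER
    # Last byte makes the sum of all 128 bytes a multiple of 256.
    block[127] = (256 - sum(block[:127])) % 256
    return bytes(block)

edid = make_skeleton_edid()
print(len(edid), sum(edid) % 256)  # 128 bytes, checksum valid (sum % 256 == 0)
```

The MCU would then answer reads at the standard DDC I2C address with these bytes; the hard part in practice is filling in a timing descriptor the driver accepts as a 50 Hz mode.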
I concur. You'll find this to be a lot harder, and the OpenGL libraries are a lot less fun to use than serial libraries. It would be much easier using USB; all you'd need is power, ground, RX, and TX. Then the code for that would be something like:
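A minimal sketch of what that code might look like, assuming pyserial on the PC side and a hypothetical single-byte "H"/"L" protocol that you'd define yourself in the Arduino sketch:

```python
import time

def pulse_trigger(port, hold_s: float = 0.005) -> None:
    """Send a ~5 ms high pulse through a serial-like `port`.

    `port` is anything with a .write() method, e.g. a pyserial
    serial.Serial("/dev/ttyUSB0", 115200) instance.
    """
    port.write(b"H")    # hypothetical command: Arduino pulls trigger pin high
    time.sleep(hold_s)  # hold the trigger for ~5 ms
    port.write(b"L")    # hypothetical command: pull it low again
```

On the Arduino side you'd read single bytes in loop() and digitalWrite() a pin accordingly; note that USB serial can add a millisecond or more of jitter, which matters for a 5 ms timing budget.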
Then you can build from there. I'm pretty sure the Python library is called pyserial, and you can build your app on top of that.
That was my thought as well. I just have to trick the GPU into thinking there's a monitor connected, without having an actual monitor attached.
Since this is timing-critical and depends on the exact time frame in which I'm showing the image on my monitor, I thought of using the VGA output as a trigger. An Arduino wouldn't do it, since it would just be guessing when I'm showing something and when not, and that wouldn't suffice. I can already do something like that with a controller I'm using, but it's still just theoretical guessing and doesn't work well for my purpose.
So I thought I'd send the signals from the system itself, where I know exactly what's going on, since everything is happening on the GPU.
Why not just use a script that watches for something that gets put in the logs when you start the image display? Or a video jockey program? Or an SDI controller?
What exactly do you mean by using a script to watch the log?
Also, how would a video jockey program help in this case? And I'm not quite sure what an SDI controller does; Google just gives me some controllers with very vague descriptions, without telling me exactly what these things are for (not sure if the Wikipedia page is the right one for what you meant).
Most things are logged by the operating system. You'd need to find which log emits something when the process your video runs in starts, write a script that watches for it, and have it trigger whatever you want to happen when it sees the output it's looking for.
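A rough sketch of that log-watching idea in Python, with the file path and marker string made up for illustration (point them at whatever your display process actually writes):

```python
import time

def follow(path, marker, poll_s=0.1, timeout_s=None):
    """Yield lines of `path` containing `marker`, polling for new lines
    until `timeout_s` elapses (or forever if timeout_s is None)."""
    deadline = None if timeout_s is None else time.monotonic() + timeout_s
    with open(path) as f:
        while deadline is None or time.monotonic() < deadline:
            line = f.readline()
            if not line:            # nothing new yet, wait and retry
                time.sleep(poll_s)
                continue
            if marker in line:
                yield line

# Usage sketch (path and marker are placeholders):
# for hit in follow("/var/log/syslog", "image-display started"):
#     fire_trigger()  # hypothetical: whatever starts your laser sequence
```

Note that log-based triggering only tells you when the process started, not when a particular frame hit the screen, so the millisecond-level timing concern raised above still applies.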
SDI lets you control broadcast equipment (usually cameras and displays) for live production (TV, multi-camera recording, etc.).
A video jockey program would let you start the video at the precise moment you deploy your trigger via a DMX controller or similar (a bit of a backwards approach, but it should work just the same, as long as the only requirement is synchronicity).