Monitors off secondary ATI video card are always mirrored/flickering

Hey guys,

I'm trying to get 5 monitors running off two ATI Radeon 5570 cards: three off the primary and two off the secondary. Both cards have 1x HDMI, 1x DP and 1x DVI. I am using the open source driver.

I can get three monitors running off the primary card and a 4th monitor (DVI) running off the secondary card. I am using the Displays applet in GNOME 3 to configure the displays. The OS is Linux Mint 15.

The problem starts when I enable the 5th screen (I've tried HDMI, DP->HDMI and DP->DVI). The 4th and 5th screens become mirrored, and every time the mouse moves, any windows on those screens flicker (I assume it's redrawing the shadows or something). Video performance on the primary card is also affected.

Are there any configuration settings that can be modified, or is this a limitation of the open source driver? Where does the Displays applet store its configuration for X?

I have tried fglrx and fglrx-updates from the Driver Manager that comes with Linux Mint, then used amdcccle to try to wrangle the monitors into place, but the monitor placement and multi-display options leave much to be desired. I could only get one usable screen and 5 blank screens (multiple "Single Desktop" monitors). Using "Multi Desktop" makes windows expand across all monitors on that card while separating the desktops between cards (so you can't drag windows between screens on different cards - WTF?).

Anyone have any ideas? Thanks.

I also tried an active DP-DVI converter on the 5th screen with no luck :(

The PSU is a 700W 80 Plus Bronze CoolerMaster, but the cards have no AUX power connectors on them. The motherboard, however, does have two AUX power plugs (both SATA and Molex types), of which the Molex is plugged in. There is only a single SSD and a DVD drive in the machine, so there shouldn't be too much load.

You probably just have a failed VESA mode autoconfiguration. The most likely cause of that is the monitor itself, but as long as you know the monitor's specs, you can set the mode manually.

The first thing I would do is check RandR. You're using Linux Mint with GNOME 3, so install "arandr", a GTK GUI front-end for X RandR; it works better than the applet. In ARandR you can select between the recognized modes for each monitor, arrange them visually, and switch display modes. That is probably everything you'll need, because as soon as you click Apply, it stores the modes with RandR.
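
If you want to see what ARandR actually does: when you save a layout (Layout > Save As), it writes a small shell script of xrandr calls under ~/.screenlayout/. A minimal sketch of such a script, with hypothetical output names (run xrandr --query to see your real ones):

    #!/bin/sh
    # Hypothetical output names; substitute the ones from `xrandr --query`.
    # Each output gets its own mode and a distinct position, so nothing mirrors.
    xrandr --output DVI-0  --mode 1920x1080 --pos 0x0 \
           --output HDMI-0 --mode 1920x1080 --pos 1920x0 \
           --output DP-0   --mode 1920x1080 --pos 3840x0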

If it still doesn't work, you'll have to check the files for your monitors. In Linux every device is a file; I can't tell you the exact names of those files because I don't know Ubuntu/Mint well enough to know which system they are using right now, but I suspect they will be named after the interface type plus a sequential number, like HDMI1 or VGA1. In those files, check for inconsistencies in refresh rate, native resolution, and so on. Sometimes a monitor will not run quite flicker-free at its rated refresh rate; for instance, the manufacturer says the refresh rate is 60 Hz, but in fact it's 59 Hz, and the thing to do then is to set that display's mode manually by calculating the VESA modeline. But start with ARandR first; I think it will solve the problem.
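
To check what the kernel actually detected, and to set a modeline by hand, something like this should work (the connector path and output name here are assumptions; card0 is the first GPU, so the secondary card's connectors would show up under card1):

    # The kernel exposes each connector under sysfs:
    cat /sys/class/drm/card1-HDMI-A-1/status   # connected / disconnected
    cat /sys/class/drm/card1-HDMI-A-1/modes    # modes parsed from the EDID

    # Calculate a VESA CVT modeline for, say, 1920x1080 at 59 Hz:
    cvt 1920 1080 59
    # cvt prints a line starting with "Modeline"; feed those values to xrandr
    # (HDMI-0 is a hypothetical output name):
    xrandr --newmode "1920x1080_59.00" <paste the numbers cvt printed>
    xrandr --addmode HDMI-0 1920x1080_59.00
    xrandr --output HDMI-0 --mode 1920x1080_59.00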

Another possible cause is that the cards don't get enough power with so many monitors attached, which would typically happen if the last monitors attached use an analog connection, like VGA or the analog pins of a DVI connector. In that case, check the power ratings of the PSU outputs to the cards. This is also quite probable, since the problem affects your other GPU card too (though that can happen under the other hypothesis as well; it's hard to tell without seeing the artefacts in person). A typical symptom of the cards not getting enough amps is that X crashes and restarts. That doesn't necessarily end your session, so it might only be visible as a short blank screen and a realigning desktop. The system itself wouldn't crash unless the entire power supply were fucked up; even ripping out a GPU card mid-session won't crash the system, although it might crash X and cause an instant restart of X.
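
One quick way to check whether X has been crashing and restarting behind your back, assuming the stock log location (it may differ on your setup):

    # X rotates its log on restart, so a fresh Xorg.0.log.old is a telltale sign:
    ls -l /var/log/Xorg.0.log /var/log/Xorg.0.log.old
    # Scan the current log for errors (EE) and warnings (WW):
    grep -E '\((EE|WW)\)' /var/log/Xorg.0.log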

Where does the Displays applet store its configuration for X?

~/.config/monitors.xml
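
It's plain XML, so you can inspect it or just delete it to force the applet to start fresh. A trimmed illustration of the version-1 format (values made up; the real file also stores vendor/product/serial per output):

    <monitors version="1">
      <configuration>
        <clone>no</clone>
        <output name="DVI-0">
          <width>1920</width>
          <height>1080</height>
          <rate>60</rate>
          <x>0</x>
          <y>0</y>
          <rotation>normal</rotation>
          <primary>yes</primary>
        </output>
        <!-- one <output> block per connected monitor -->
      </configuration>
    </monitors>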

It uses xrandr to apply the settings. Zoltan already gave you a few possible solutions, but you can also use xrandr from the command line. The manual is pretty straightforward.
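
For the mirroring specifically, the command-line fix is just to give the 5th output its own position instead of letting it clone the 4th; the output names here are hypothetical, so substitute your own:

    # Place the 5th screen to the right of the 4th instead of mirroring it:
    xrandr --output DP-1 --auto --right-of DVI-1
    # Or wipe the stored layout and log out/in to start clean:
    rm ~/.config/monitors.xml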

Awesome! Thanks guys! I'll give it a try :)