GLSL refraction and total internal reflection on nVidia/AMD cards

Hi everyone!

I'm having a problem with a shader I wrote and was hoping someone here might be able to help me. Here's the code that gives me trouble:

vec3 refractedDirection = normalize(refract(ray, normal, eta2 / eta1));
float TIRfactor = 1.0;  // -1 on total internal reflection
if(refractedDirection == 0)
    TIRfactor = -1.0;

Note: The refract function returns the refracted vector, or 0.0 in the case of total internal reflection.
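
For reference, the GLSL specification defines refract roughly like this (just a sketch for illustration; refractSketch is a made-up name, not the built-in):

vec3 refractSketch(vec3 I, vec3 N, float eta)
{
    float k = 1.0 - eta * eta * (1.0 - dot(N, I) * dot(N, I));
    if(k < 0.0)
        return vec3(0.0);  // total internal reflection: the "0.0" is a zero vector
    return eta * I - (eta * dot(N, I) + sqrt(k)) * N;
}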

This code compiles and runs just fine on machines with nVidia graphics cards. On a machine with an AMD graphics card, however, it fails to compile because the if condition compares a vec3 to a scalar.

Is there any way to find out the type of the variable? I tried to google it but couldn't find one.

The compile error says it all: you are comparing a vec3 to an integer in your if statement. The nVidia compiler might be smart enough to do an implicit type conversion for you, but generally you want to keep all of your types in order. normalize will always return a vec3 here; when the documentation says refract returns 0.0, it probably means a zero vector, vec3(0.0). Try something like:

if(refractedDirection == vec3(0.0))
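
As a side note, the original snippet normalizes the refract() result before testing it, and normalize() of a zero-length vector divides by zero, so it is probably safer to do the comparison before normalizing. A rough sketch of that, keeping the original variable names:

vec3 refractedDirection = refract(ray, normal, eta2 / eta1);  // zero vector on total internal reflection
float TIRfactor = 1.0;                                        // -1 on total internal reflection
if(refractedDirection == vec3(0.0))     // vector == vector compares all components, yields one bool
    TIRfactor = -1.0;
else
    refractedDirection = normalize(refractedDirection);       // only normalize a non-zero vector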