I've heard that at higher resolutions (e.g. 1440p, 4K) the in-game anti-aliasing settings can be toned down or even turned off. Is this true? And how much does changing these settings at higher resolutions actually impact image quality?
How much AA you need depends heavily on the monitor you have, mainly its size and pixel density. Some people won't notice the jagged edges on their screen, some people will, and AA is there for the people who do see them. On a sufficiently dense, high-quality monitor you may not need AA at all.
At 1440p you might want to turn it off if you're getting below 60 FPS.
As a general rule, yes: at higher resolutions you don't notice the "jaggies" as much, so you can often get by without it. Depending on the type of AA, the impact on framerate varies, as does the image quality. A lot of people are fine with FXAA, but there are those who despise it with a passion. SSAA looks pretty damn good, but its resource draw is astounding.
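To put rough numbers on that: FXAA is a single post-process pass over the finished frame, while 4x SSAA effectively shades a 2x2 grid of samples for every output pixel. A quick back-of-the-envelope sketch (just standard resolutions, nothing game-specific assumed):

```python
# Rough per-frame shading work: native rendering vs. 4x SSAA,
# which shades four samples for every output pixel.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

for name, (w, h) in resolutions.items():
    native = w * h        # pixels shaded without supersampling
    ssaa_4x = native * 4  # shaded samples with 4x SSAA
    print(f"{name}: {native / 1e6:.1f}M pixels native, "
          f"{ssaa_4x / 1e6:.1f}M samples with 4x SSAA")
```

Notice that 1080p with 4x SSAA shades roughly the same number of samples as native 4K (about 8.3M), which is why SSAA tanks framerates, and why native 4K already buys you a lot of the smoothing that supersampling provides at lower resolutions.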
Like others have said, it also depends on the monitor. Some 1440p panels just aren't as easy on the eyes as others, which I suppose is true of any monitor, really.
I switch AA on because on higher-res monitors it's not about the jaggies up close, it's about the ones further away, where geometry and textures sit at more extreme angles. Even on the Retina screen in my MacBook, which is about as dense as screens get, I notice a big difference in jaggie reduction.
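For a sense of how much denser that Retina panel is than a typical desktop monitor, here's a quick PPI comparison (the diagonal sizes are assumed common panel sizes, not anyone's exact setup; plug in your own):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from a panel's resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Diagonal sizes are assumptions based on common panels.
screens = {
    "24in 1080p desktop":              (1920, 1080, 24.0),
    "27in 1440p desktop":              (2560, 1440, 27.0),
    "27in 4K desktop":                 (3840, 2160, 27.0),
    "15in Retina MacBook (2880x1800)": (2880, 1800, 15.4),
}

for name, (w, h, d) in screens.items():
    print(f"{name}: {ppi(w, h, d):.0f} PPI")
```

That works out to roughly 92, 109, 163, and 220 PPI respectively. So even at around 220 PPI I still see the difference with AA on, and a 1440p desktop panel at roughly 109 PPI certainly won't hide it on its own.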
There is only so much the in-game engine can do to smooth out edges and textures at a distance. Yet when most people critically judge the crispness of a shape or object, they get as close to it as possible.
It also depends on the games you play. I play a lot of survival games and RPGs, so I'm constantly looking at the landscape far away or scanning the horizon for enemies, and so on.