I had been playing Battlefield 4, GTA V, The Witcher 3, Rise of the Tomb Raider, and The Division for a long time at 1080p/60fps on a 65-inch plasma TV from 8.5 ft away (the optimal distance for that size and resolution according to online calculators and charts). On the TV I could easily see the difference between Low, Medium, High, and Ultra.

But for some reason, when I play those same games on the Oculus Rift (CV1) through VorpX (or Virtual Desktop), I can't tell the highest and lowest settings apart. In terms of visual fidelity I see absolutely no difference, or the difference is so small that I can't detect it. Obviously there is a huge frame-rate impact going from lowest to highest, but I'm asking about visual fidelity. I'm running those games at 2160x1200, 90+ fps (somewhere between Medium and Ultra depending on the game) on a 980 Ti.

Any idea why that might be? Have you experienced the same thing (the same visual fidelity regardless of Low or Ultra settings)?
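For context, here's a rough pixels-per-degree comparison of the two setups (a back-of-the-envelope Python sketch; the CV1's ~1080 horizontal pixels per eye and ~90-degree FOV are approximate figures I'm assuming, not exact specs, and PPD actually varies across the lens):

```python
import math

def pixels_per_degree(h_pixels: float, h_fov_deg: float) -> float:
    """Average horizontal pixels per degree of visual angle."""
    return h_pixels / h_fov_deg

def tv_fov_deg(diagonal_in: float, distance_in: float, aspect=(16, 9)) -> float:
    """Horizontal field of view subtended by a flat 16:9 screen."""
    width = diagonal_in * aspect[0] / math.hypot(*aspect)
    return math.degrees(2 * math.atan(width / (2 * distance_in)))

# 65" 1080p TV viewed from 8.5 ft (102 inches)
tv_fov = tv_fov_deg(65, 8.5 * 12)
print(f"TV:  {tv_fov:.1f} deg FOV, {pixels_per_degree(1920, tv_fov):.1f} ppd")

# Rift CV1: ~1080 horizontal pixels per eye over a ~90 deg FOV (assumed)
print(f"CV1: ~{pixels_per_degree(1080, 90):.1f} ppd")
```

That works out to roughly 60 ppd for the TV setup versus ~12 ppd for the headset. If those numbers are even roughly right, the TV resolves about five times more detail per degree of vision, which might be part of why Ultra-level detail gets lost in VR, but I'd like to hear other ideas.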