Hoping someone can explain to me the sequence by which things are generated by the graphics card. I would like to know so I can figure out which setting matters more and optimize for my preference (it seems I need a higher refresh rate more than image quality).
More specifically, how are the rendering resolution setting (found in the Quest desktop app) and the game resolution setting different? And how does each tend to impact performance?
I have two possible thoughts:
(1) The game (virtual monitor) and the virtual environment are rendered independently and combined right before the final image is formed (composited). Under this thought, Rendering Resolution has no effect on the virtual monitor, only on the environment. I imagine being in a Minecraft world and looking through a portal/mirror into Cyberpunk 2077. Therefore, 0.7x vs. 1.3x would have little effect on the quality of the virtual monitor/game image.
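Here's a rough Python-pseudocode sketch of what I mean by thought (1). The function names and numbers are placeholders I made up, not anything I've confirmed about how the Quest compositor actually works:

```python
# Rough sketch of thought (1), all names assumed: each layer is rendered at its
# own resolution and only merged when the final display frame is built, so the
# Rendering Resolution slider would only touch the environment layer.

def render_environment(base_eye_width, render_scale):
    # virtual environment scales with the Rendering Resolution slider
    return {"layer": "environment", "width": int(base_eye_width * render_scale)}

def render_game(game_width):
    # flat game keeps its own in-game resolution, untouched by the slider
    return {"layer": "game", "width": game_width}

def compose_final_frame(environment_layer, game_layer, display_width):
    # both layers are resampled straight to the physical display, so the game
    # never passes through the environment's render target
    return {"display_width": display_width, "layers": [environment_layer, game_layer]}

frame = compose_final_frame(render_environment(1832, 0.7), render_game(1920), 1832)
```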
(2) The game (virtual monitor) and the virtual environment are rendered on top of one another (as if the game monitor is a texture). Assuming the game is rendered at a lower resolution, it would be scaled up to the Rendering Resolution, stretching the image and making it look fuzzy. I guess this would be like looking at a low-resolution texture/poster in a game running at 4K: the text would be all blurry. In this case, I'm not sure where to start to find the optimal balance between Rendering Resolution and Game Resolution, although Rendering Resolution would be the cap for the game, since it's effectively the monitor resolution.
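And here's a rough sketch of thought (2), again with made-up names and assumed numbers, just to show why I'd expect the virtual monitor's sharpness to cap out at whichever is smaller, the game's resolution or the number of rendered pixels the monitor covers at the chosen Rendering Resolution:

```python
# Rough sketch of thought (2), numbers assumed: the game is a texture pasted
# onto the virtual monitor, so its detail is limited by whichever is smaller -
# the game's own width, or how many eye-buffer pixels the monitor covers.

def effective_monitor_width(game_width, eye_buffer_width, monitor_screen_fraction):
    # eye-buffer pixels that actually land on the virtual monitor
    pixels_on_monitor = eye_buffer_width * monitor_screen_fraction
    # you can't get more detail than the smaller of the two
    return int(min(game_width, pixels_on_monitor))

# Assumed example: 1920-wide game, monitor filling ~60% of the view,
# 1832 used as a stand-in per-eye base width.
for scale in (0.7, 1.0, 1.3):
    eye_width = 1832 * scale
    print(scale, effective_monitor_width(1920, eye_width, 0.6))
```

With those assumed numbers, even 1.3x puts fewer pixels on the monitor than the game's 1920, which is why I'm not sure which knob to turn first if this theory is the right one.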