Forum Replies Created
StakFallT (Participant)
"But for games like Quantum Break and Metal Gear Solid: The Phantom Pain and some more, the engines of these games are not vorpX friendly, so we need to get the '3D' from somewhere else, like Nvidia 3D Vision or TriDef 3D."

See, to me that doesn't make A LOT of sense. It makes me wonder what the devs of those games are actually doing. Yes, DX/D3D and OGL have evolved over the years, but the process is still pretty much the same as it has always been (i.e., for D3D: create a vertex/index buffer, lock the buffer, fill the buffer, unlock the buffer, then draw the buffer). I imagine it has gotten more complicated with things like geometry shaders, etc., but I'm sure other games are using those as well and they're supported. So I can't really imagine what specifically about a game could make it so drastically different that supporting it would be infeasible.
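To make that create/lock/fill/unlock/draw sequence concrete, here's a minimal mock of the pattern in plain C++. This is a schematic stand-in, not the real Direct3D interface; the class and function names here are invented for illustration:

```cpp
#include <cassert>
#include <cstring>
#include <vector>

struct Vertex { float x, y, z; };

// Mock of a D3D9-style vertex buffer: Lock() hands back a raw pointer,
// the app copies vertex data in, Unlock() commits it.
class MockVertexBuffer {
public:
    explicit MockVertexBuffer(size_t count) : data_(count), locked_(false) {}
    void* Lock()   { locked_ = true;  return data_.data(); }
    void  Unlock() { locked_ = false; }
    bool  locked() const { return locked_; }
    const std::vector<Vertex>& contents() const { return data_; }
private:
    std::vector<Vertex> data_;
    bool locked_;
};

// The classic sequence: lock -> fill -> unlock -> draw.
size_t fill_and_draw(MockVertexBuffer& vb, const std::vector<Vertex>& verts) {
    void* p = vb.Lock();                                       // lock
    std::memcpy(p, verts.data(), verts.size() * sizeof(Vertex)); // fill
    vb.Unlock();                                               // unlock
    // "draw": just report how many triangles we'd submit
    return verts.size() / 3;                                   // 3 verts per tri
}
```

In real D3D9 the same roles are played by `IDirect3DVertexBuffer9::Lock`/`Unlock` and `IDirect3DDevice9::DrawPrimitive`, which is exactly the call pattern a wrapper like vorpX can intercept.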
StakFallT (Participant)
^^^ Second that! (Metal Gear Solid: The Phantom Pain, and heck, the Fox Engine in general, would be awesome to have support for.) I think it's one of the only games I can think of that I play, REALLY want support for, and for which there's no profile to base one off of. Yes, there are probably TONS of games (mainly indie 2D-style games would be my guess) that don't have "base" profiles, and I'm sure there are lots of other games with proprietary engines that have no profile either, but for me it's the only game I play (that I can think of, at least) for which there isn't a profile I can base one off of.

StakFallT (Participant)
Just tried it. Yeah… no, that definitely didn't help. It brought the square that the game is rendered onto "closer" to vorpX's camera (removing vorpX's environment from showing around the bottom), but with image zoom at 0% that square is already chopped off a little on the sides, so you can imagine what zooming it up until it reaches the bottom does. My next attempt is to switch the Vive headset out of direct mode into extended mode and maybe try creating a custom resolution.
StakFallT (Participant)
Ok, I'll give it a shot as soon as I get a chance (and report back if it still has issues). Thanks.
StakFallT (Participant)
I can't get it to reach the bottom without making it go off the sides of the screen. Hence why I started to play with aspect ratios and doing the same there.
StakFallT (Participant)
Yeah, but then it's too wide.
StakFallT (Participant)
I agree, I'm applying the word "fix" to something that isn't broken, but I'm not sure what else to call it. Maybe a workaround? A readjustment? A recompensation?

Image zoom only works on the "texture" of the game that is presented to the headset, though, right? I ask because in the empty space I'm referring to, if I put the view mode into immersive, I see the colors of immersive mode's background below the in-headset screen (unless I'm remembering incorrectly). Which tells me image zoom isn't the correct "fix". It's kind of like holding your hand in front of your face about six inches away (say the hand is pure white and non-textured, to keep the unnecessary complexity of lighting out of the example): if I adjust image zoom to make a hand texture fit (map correctly) onto my hand, I can zoom the image out so that it fits. However, I still see all the background stuff (out-of-band from the hand itself) below my hand, so image zoom doesn't make the correct adjustment. I can make the image smaller so I see all of the hand's texture on the hand itself, but not without also shrinking everything else that was already at the correct scale. So image zoom isn't really even a "fix", I would say.
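The hand analogy above boils down to this: image zoom is one uniform scale factor applied to the whole presented frame, so you can't shrink the part that's wrong without also shrinking the parts that were already right. A toy sketch with made-up numbers (nothing here reflects vorpX's actual internals):

```cpp
#include <cassert>

// On-screen sizes of two things in the presented frame (arbitrary units).
struct Sizes { float hand; float background; };

// Image zoom: a single scale factor applied to everything at once.
// There is no way to rescale one element independently of the rest.
Sizes apply_image_zoom(Sizes on_screen, float zoom) {
    return { on_screen.hand * zoom, on_screen.background * zoom };
}
```

Shrinking the frame until the "hand" fits necessarily shrinks the background by the same factor, which is exactly the symptom described above.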
StakFallT (Participant)
Then by the sound of it, if my understanding is correct, there's not really any way to "fix" this, since the issue comes from a mismatch between the two camera matrices (the one used in the game's matrix transformation and the one in vorpX's matrix transformation)? Which wouldn't really be a problem, except that because the headset (I guess on a hardware level?) forces the rendering to be the way it is (what you referred to as trying to squeeze out a few more pixels), "fixing" the issue would create distortion? Wait. So does that mean game developers manually factor in that headsets put their game somewhat into a blender, and adjust their matrices before pushing to the HMD, so that when the HMD displays it with its own rendering it looks correct?
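A quick sketch of why a camera-matrix mismatch shows up as a scale/placement error rather than something zoom can fix: under a perspective projection, the same view-space point lands at a different screen height depending on which field of view the projection assumes. The FOV values below are made up for illustration; they're not vorpX's or any game's actual numbers:

```cpp
#include <cassert>
#include <cmath>

// Project a view-space point's y to normalized device coordinates with a
// simple symmetric perspective projection: ndc_y = (y / z) / tan(fov_y / 2).
float ndc_y(float y, float z, float fov_y_radians) {
    return (y / z) / std::tan(fov_y_radians / 2.0f);
}
```

If the game renders with one FOV and the compositing side assumes another, every point is displaced by a z-dependent amount, so a single uniform zoom can't undo it everywhere at once.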