My projector is 1600x1200, but I run it at 2048x1200. I'm not sure about my LCD screen, but around 1400 was the highest resolution I could get; it's only an old VGA monitor :/ The smart TV we have might be better.
No, the OpenGL setting usually benefits less powerful computers through occlusion culling, which skips rendering terrain the player can't see. Enabling it should lower the load on the GPU. In your case, though, the GPU isn't under enough load to boost its clocks out of idle, so you'll want to disable it: the extra rendering work can push the GPU up from its idle clock speeds, which may in turn give you better frame rates.
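For what it's worth, here's a rough toy sketch of the idea behind occlusion culling in Java (this is not Minecraft's actual renderer; the Chunk record and the one-dimensional front-to-back corridor are made up purely for illustration):

    import java.util.List;

    // Toy illustration of occlusion culling: chunks lined up in front of
    // the camera, sorted front to back. Once a fully opaque chunk is hit,
    // nothing behind it can be seen, so those chunks are never rendered.
    public class OcclusionDemo {
        // Hypothetical chunk: just a name and whether it blocks the view.
        record Chunk(String name, boolean fullyOpaque) {}

        static void render(List<Chunk> frontToBack) {
            boolean occluded = false;
            for (Chunk c : frontToBack) {
                if (occluded) {
                    // Culled: never sent to the GPU, saving work.
                    System.out.println("culled:   " + c.name());
                    continue;
                }
                System.out.println("rendered: " + c.name());
                if (c.fullyOpaque()) {
                    occluded = true; // everything behind this is hidden
                }
            }
        }

        public static void main(String[] args) {
            render(List.of(
                new Chunk("cave entrance", false),
                new Chunk("solid stone wall", true),
                new Chunk("hidden dungeon", false))); // gets culled
        }
    }

The point of the check is that the culled chunks cost the GPU nothing at all, which is why the setting normally helps weaker hardware.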
I'm not sure if you read a couple of posts back, but I said I averaged 150 fps, and my max was around 200 with no mods on high settings.
Most monitors have a refresh rate of 60 Hz, meaning the screen refreshes itself 60 times per second. If your monitor runs at 60 Hz and a game runs at over 60 fps, the extra frames are rendered but never displayed. So if you get 150 fps in Minecraft on a 60 Hz monitor, 90 of those frames are simply thrown away, and you won't see any difference between 60 fps and 200 fps.
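To put numbers on it, here's a toy calculation in Java (not how the driver actually handles frames, just the arithmetic):

    // A 60 Hz monitor can only show 60 frames per second, so any frames
    // rendered beyond that are discarded before they ever reach the screen.
    public class RefreshMath {
        public static void main(String[] args) {
            int fps = 150;       // frames the GPU renders per second
            int refreshHz = 60;  // frames the monitor can display per second

            int shown  = Math.min(fps, refreshHz);
            int wasted = Math.max(0, fps - refreshHz);

            System.out.println("Displayed per second: " + shown);  // 60
            System.out.println("Discarded per second: " + wasted); // 90
        }
    }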
Your max is most likely 60 fps with your monitor.