TheKingaviMC (Join Date: 9/17/2013, Posts: 61)
Hi guys, I just got a new computer. It's very powerful, as it will be used for many intensive things.
I get 1200-1900 FPS on a superflat world while running around, using the same settings that gave me around 100 on my old computer (optimal performance without looking TOO ugly).
The thing is, whenever I join a server, my FPS drops to 100-200. I have player visibility disabled in the lobby I'm in, and there is still not much improvement.
When I look at the sky I sometimes shoot up to 500 and sometimes to 1000.
This is very weird, and I want to know how I can fix it.
I am using OptiFine and Fastcraft (ver 1.21; I had 1.9 on my old computer).
And please don't give me the "100 FPS is enough!" nonsense; I would like my problem to be resolved rather than contested.
Thanks!
Extra info: my monitor resolution is a weird 1650x1050; it's quite an old monitor. I'm playing Minecraft 1.7.10 at a window size of 1280x720.
Welcome to the world of computer graphics. What you're experiencing is fairly normal: when looking at the world, your computer has to do more work the more complex the world is. If you have a big building with crazy jut-outs, it has to calculate ambient occlusion / "smooth lighting" and regular lighting far more than when you look at the sky (which, with no blocks in your field of view, equates to almost no calculations). A superflat world has almost no complexity (aside from generated structures and/or flora), so there is drastically less to work with. In other words, there's nothing wrong; this is how games work.
I know you said you didn't want to hear this, but 100 FPS is enough if you have a 60Hz monitor, or any monitor below 100Hz. More FPS does not always help; FPS only matters up to your monitor's refresh rate. Running 1000 FPS on a 60Hz monitor, you're still only seeing 60 FPS while your computer pushes to render 1000; cap it to 60 and you save your computer a lot of work. If you dropped below your refresh rate, then that could be a problem you'd want solved. But, as I said, this isn't an actual problem. Games do this all the time: look at something more taxing on your computer (and yes, the sky is easier to compute and render than the world) and the framerate will drop.
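To put that another way, the framerate you actually see is the weaker of the renderer and the monitor. A minimal Python sketch of that idea (just for illustration, the function name is mine):

```python
def effective_fps(render_fps: float, refresh_hz: float) -> float:
    """The framerate a viewer actually sees is capped by the slower of
    the game's render rate and the monitor's refresh rate."""
    return min(render_fps, refresh_hz)

# Rendering 1000 FPS on a 60Hz monitor still shows only 60 frames/second:
print(effective_fps(1000, 60))  # 60
# Below the refresh rate, the renderer becomes the bottleneck instead:
print(effective_fps(45, 60))    # 45
```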
Author of the Clarity, Serenity, Sapphire & Halcyon shader packs for Minecraft: Java Edition.
Hello there, and thanks for your reply!
Your answer is logical. But I have smooth lighting off, ruling out the possibility of too many odd structures triggering ambient occlusion, and my graphics are set to Fast because the dynamic vignette can cause a lot of lag.
The lobby I am in is not too big... a lot smaller than my render distance. And while it is nicely built with leaderboards and houses and such, there's nothing too extravagant.
And I simply don't get why it cuts my framerate by such a big amount. My old computer used to get 100 FPS on a superflat world and 75-odd FPS in the same lobby.
The reason I would like higher framerate is because I intend to use Minecraft as a base for video editing, and thus would like to record my screen at 120FPS.
Also, funny thing is, I've never tested my new computer on a default world! Only superflat and this server lobby. I'll try a default singleplayer world and get back with the results.
Even then, AO / "smooth lighting" isn't the only thing that causes drops in frame rate. Are you running 1.7 or 1.8? 1.7 and below are inefficient with rendering. Modded or vanilla? Modded causes more lag due to the frequent use of TESRs for custom-rendered blocks and custom models. A TESR is a dynamic renderer, updated every tick, for tile entities; a tile entity is a special "entity" linked to a location in-world that stores extra data for that location (chests, furnaces, beacons, and signs are examples).
And the thing is, AO isn't just calculated at corners; it's calculated for every block, and it takes adjacent blocks into account during the calculation. Might I ask why you wish to record at 120 FPS? Playing at 120 FPS, sure, but recording video at 120 FPS? I assume you intend to upload to YouTube? In that case, YouTube currently only supports 60 FPS at most.
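For a sense of why AO depends on adjacent blocks: the classic voxel vertex-AO rule (a well-known technique; this is not Minecraft's actual code, just an illustrative sketch) derives each face vertex's shade from the three blocks touching that vertex:

```python
def vertex_ao(side1: bool, side2: bool, corner: bool) -> int:
    """Classic voxel ambient-occlusion level for one face vertex,
    from 0 (darkest) to 3 (fully lit). The three flags say whether the
    two side blocks and the diagonal corner block next to the vertex
    are solid. If both sides are solid, the corner is fully occluded
    no matter what the corner block is."""
    if side1 and side2:
        return 0
    return 3 - (int(side1) + int(side2) + int(corner))

# An open vertex is fully lit; each solid neighbour darkens it:
print(vertex_ao(False, False, False))  # 3
print(vertex_ao(True, False, True))    # 1
print(vertex_ao(True, True, False))    # 0
```

Every visible face runs this per vertex, which is why busy builds with lots of jut-outs cost more than flat terrain.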
As I mentioned in the original post: 1.7.10 with OptiFine and Fastcraft 1.21, mods intended to increase framerate. And I am well aware of what tile entities are; I am not using any mods that add extra ones to my world.
I wish to record at 120 FPS so that I will have a lot more flexibility when editing.
Sorry to jump in on this discussion, but 60 FPS is good for most editing situations. The human eye sees 15-24 FPS (depending on who you ask) as fluid movement, so if you must slow things down mid-video, 60 FPS is enough for half speed while still keeping fluid movement. You might even get away with halving that again, as that still gives you 15.
(I hope I am not screwing up my facts completely)
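The arithmetic there is simple enough to sketch: slowing footage multiplies the capture framerate by the speed factor, and the result should stay in the "fluid" range. A quick Python illustration (function name is mine):

```python
def playback_fps(capture_fps: float, speed: float) -> float:
    """Effective framerate after retiming footage: at half speed each
    captured frame stays on screen twice as long, so the playback
    framerate is the capture rate times the speed factor."""
    return capture_fps * speed

# 60 FPS footage at half speed plays back at an effective 30 FPS,
# and at quarter speed only 15 FPS, the edge of fluid motion:
print(playback_fps(60, 0.5))    # 30.0
print(playback_fps(60, 0.25))   # 15.0
# Capturing at 120 FPS keeps even half-speed playback at a full 60 FPS:
print(playback_fps(120, 0.5))   # 60.0
```

That's the trade-off being discussed: 120 FPS capture buys slow-motion headroom, at the cost of much heavier recording.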
You're somewhat correct. With pure footage, the human eye will see 24 FPS as jittery and slow; movies, however, add a special effect known as motion blur to "trick" the eye into thinking the video is smoother than it actually is. That's how movies get away with it. But the human eye does not see in terms of FPS; it sees motion. Your eye is designed to pick up motion in an environment rather than a static image. If the eye couldn't see past 30 FPS (the typical excuse for low framerates is that the human eye only sees 30 / 60 / n FPS, usually 30), we wouldn't have 120Hz and 144Hz monitors, because we wouldn't be able to tell the difference between them, or between 30 FPS and 60 FPS. There's a video on YouTube of Linus from LinusTechTips correctly distinguishing between a 30Hz, 60Hz, and 144Hz monitor within a couple of seconds of using each to play Battlefield 3 (I think? I'm not sure which game they played, but I know it was a Battlefield game). They changed the settings of the monitor and game without Linus looking each time, so it's a legitimate experiment.
This is a really touchy subject, because the argument changes depending on so many factors. You cannot compare which FPS is best between games and movies, primarily because games involve interaction, and the FPS alters your perceived smoothness of both the video and the input (i.e. your brain thinks that at a higher framerate input is received faster; note that I'm simplifying massively here). Even the technology differs: a cinema projector works better at lower framerates, while gaming works better at higher ones. And you cannot rely on the FPS your game or Fraps reports, because the actual framerate your eyes receive is determined by the weakest link. On an uber computer that easily gets 300 FPS in GTA5 on Ultra, chances are that weak link is your monitor. If you're playing a game at 300 FPS on a 144Hz monitor, your eyes only see 144Hz, effectively 144 FPS, as that's the fastest the monitor can update its display. And to complicate matters further, eyes register motion with different degrees of effectiveness; my eye may be better at registering motion than yours.
Ah, thanks for clearing that up. Always nice to get some extra information on the topic.