I really want to get 60 FPS all the time, but no matter what I try it always seems to drop to about 30 in places. I know what you're going to say: upgrade your computer. But I know my computer is not the issue; it's perfectly fine. Here are my specs to prove it:
AMD A10-5800K, turbo core disabled
ASUS F2A85-V Pro motherboard
ASUS AMD Radeon HD 7770 GHz Edition 2GB GDDR5, core/mem clocks 1020/1150MHz
Corsair Vengeance 8GB DDR3 1600MHz 9-9-10-24 (2x4GB) @ 1866MHz 10-10-11-28
Windows 8 Pro
Catalyst version 13.4
I have OptiFine installed, in case you're wondering, and I have everything enabled at Far render distance. I also have 8x AA, 16x AF, and max mipmapping on trilinear. I usually play at 720p or 1080p, and I have the latest version of 64-bit Java installed with at least 4GB of RAM allocated.
The weird part of this is that even when I'm only getting 30 FPS, my CPU and GPU are at no more than 60% usage. So I was wondering: is there any way to make Minecraft use more CPU/GPU power and get more FPS?
It's the small things you need to change, though. If you still want more FPS and still nice-looking Minecraft, try this:
Render Distance: Tiny or small
Performance: Max FPS
Advanced OpenGL: OFF
Particles: Decreased
Use VSync: OFF
If these don't work, I tried. These are the settings I use to get better FPS.
-Card
Too much RAM allocated is one problem; cut that down to 2GB, and don't cry saying more RAM equals more performance, because it does not. Vanilla Minecraft barely even uses ~500MB.
Get rid of AA and AF; they serve no benefit for this game and really only cause issues. Minecraft is a tiling game, in which AA and AF can cause issues per tile. Modern games use meshes with a giant texture stretched over them, and they benefit from AA and AF because of that.
But moving on.
Your computer is fine, but your expectations are flawed. You are enabling settings the game was never designed for, and that is killing it.
You're using 8x AA and 16x AF and wondering why you're getting lag sometimes. Is this thread for real? Turn them down to 4x each, max, especially on stock textures.
I use the AA and AF that are part of OptiFine, and they cause no issues.
It also looks really nice. When I said I'm trying to find ways to increase FPS, I was thinking along the lines of Java runtime parameters and things like that. I already made one discovery on my own: if I add -XX:MaxGCPauseMillis=0 to the list of runtime parameters, it seems to improve FPS by quite a bit, but it uses more memory. That's no problem for me, since I have more than 6GB of free RAM.
I would use 16x AA, but I think that might not be worth the performance hit. I'm trying to improve my performance without changing any graphical settings. It seems like only a fraction of my available resources are being used, like 25-50%; I was wondering if I could bring that up to 80-90% and get a lot more FPS by doing so.
I think that your problem is that Minecraft is using the APU's integrated graphics.
I'm not sure how to disable it but there should be a way in the AMD control panel. Once you manage to do that, you should be able to get ~150FPS consistently with the card.
The 7660D in the 5800K is powerful enough to handle vanilla MC with ease.
It's simply that the OP is cranking AA and AF to max, which is killing a game that was never designed for them and doesn't benefit from them (it's a fallacy if anyone says otherwise).
Allocating 4GB of RAM also does not help, especially in a 64-bit environment.
Also, the discrete GPU will be used, since this is a desktop environment; at most you'd be looking at hybrid CrossFire. Either way, the HD 7770 will be the active display.
Do you have v-sync enabled? If so, try disabling it; it should help. From what I understand, the way v-sync works is that if your PC can't maintain a frame rate at your monitor's refresh rate, it tends to drop to half of it or something like that. For example, say your refresh rate is 60Hz, and in a given scene your PC can only manage, say, 53 FPS. Instead of running at 53 FPS, it may drop to 30 FPS.
If you don't have v-sync enabled, then I wouldn't know, but I'm guessing you do, since this is a common symptom of it.
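As a rough illustration of that halving behavior, here's a sketch of classic double-buffered v-sync math (this is an assumption about how strict double buffering quantizes frame times, not Minecraft's actual renderer; the class and method names are made up):

```java
public class VsyncDemo {
    // Under strict double-buffered v-sync the GPU can only present on a
    // refresh boundary. A frame that takes longer than one refresh
    // interval waits for the next one, so the effective rate becomes
    // refreshHz divided by the number of intervals each frame occupies.
    static double effectiveFps(double refreshHz, double renderFps) {
        double frameMs = 1000.0 / renderFps;     // time to render one frame
        double intervalMs = 1000.0 / refreshHz;  // one refresh period
        double intervalsUsed = Math.ceil(frameMs / intervalMs);
        return refreshHz / intervalsUsed;
    }

    public static void main(String[] args) {
        System.out.println(effectiveFps(60, 120)); // 60.0 - capped at refresh rate
        System.out.println(effectiveFps(60, 53));  // 30.0 - just misses one interval
        System.out.println(effectiveFps(60, 25));  // 20.0 - next step down is 60/3
    }
}
```

So a machine that could render 53 FPS gets snapped down to 30, exactly the symptom described above (triple buffering avoids this, but the game may not use it).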
While I'm not saying you're wrong about how it works, I have to admit my experience leads me to the opposite of the conclusion that AA and AF serve no benefits. Maybe only one of them is doing something and not the other (I never tried each one independently, so maybe one of them is useless), but there is a striking difference in quality with AA and AF both off compared to both on, and I don't notice the odd white lines that others seem to get, either.
I use 4x AA and 16x AF at 1920x1200 (FOV of 85) on a modest-ish GeForce GTX 560 Ti, and my frame rates are probably never below 60 FPS (generally between about 80 and 120 FPS). This is with maximum in-game settings as well.
I've seen the same quality improvements on my secondary PC with a GeForce GT 430 using AA and AF, though not to the same extent (2x AA and a lower AF value I can't exactly recall offhand [maybe 4x or 8x] at 1360x768, also with maximum in-game settings and fine frame rates too).
Quote from "zsx47" timestamp="1375071518" post="23621834" »
I already made one discovery on my own if i add -XX:MaxGCPauseMillis=0 to the list of run-time parameters it seems to improve fps by quite a bit but it uses more memory no problem for me since i have more than 6Gb of free ram.
It's not really considered a command; it's a runtime parameter, like the parameters used to increase the memory allocated to Java. Essentially, what it does is decrease the amount of time that is spent on a GC run, so that fewer computer resources are used on minor GCs.
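If you want to check whether GC is actually eating your frame time before tuning flags like that, the standard java.lang.management API can report it. A minimal sketch (the class name and the garbage-generating loop are mine, just for illustration):

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

public class GcStats {
    public static void main(String[] args) {
        // Churn through some short-lived allocations to provoke a few minor GCs.
        for (int i = 0; i < 500_000; i++) {
            byte[] junk = new byte[1024];
        }
        // Each registered collector reports how many times it has run and
        // the total wall-clock time it has spent collecting.
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            System.out.println(gc.getName()
                    + ": runs=" + gc.getCollectionCount()
                    + ", total ms=" + gc.getCollectionTime());
        }
    }
}
```

If the reported GC time is a tiny fraction of runtime, pause-time flags won't buy you much FPS.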
Out of curiosity, what does this command do?
It is one of those your-mileage-may-vary things; it is not for everyone.
Minecraft is one of those games you design however you want. But a setup geared towards a large view distance and a stable tick rate is all that matters, at most.
Though my main point is, if Minecraft could use AA and AF properly, then its developers would have implemented them natively.
In the meantime:
The maximum pause time goal is specified with the command line option -XX:MaxGCPauseMillis=<N>. This is interpreted as a hint that pause times of <N> milliseconds or less are desired; by default there is no maximum pause time goal. If a pause time goal is specified, the heap size and other garbage collection related parameters are adjusted in an attempt to keep garbage collection pauses shorter than the specified value. Note that these adjustments may cause the garbage collector to reduce the overall throughput of the application and in some cases the desired pause time goal cannot be met.
The throughput goal is measured in terms of the time spent doing garbage collection vs. the time spent outside of garbage collection (referred to as application time). The goal is specified by the command line option -XX:GCTimeRatio=<N>, which sets the ratio of garbage collection time to application time to 1 / (1 + <N>).
For example, -XX:GCTimeRatio=19 sets a goal of 1/20 or 5% of the total time in garbage collection. The default value is 99, resulting in a goal of 1% of the time in garbage collection.
Maximum heap footprint is specified using the existing option -Xmx<N>. In addition, the collector has an implicit goal of minimizing the size of the heap as long as the other goals are being met.
~ Oracle Technetwork Java Docs
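The 1 / (1 + N) ratio quoted above works out like this (a trivial sketch; the class and method names are mine):

```java
public class GcTimeRatio {
    // Fraction of total time the collector aims to spend in GC for a
    // given -XX:GCTimeRatio=<N>, per the 1 / (1 + N) formula above.
    static double gcGoal(int n) {
        return 1.0 / (1 + n);
    }

    public static void main(String[] args) {
        System.out.println(gcGoal(19)); // 0.05 -> the 5% example from the docs
        System.out.println(gcGoal(99)); // 0.01 -> the default 1% goal
    }
}
```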
Quote » Though my main point is, if Minecraft could use AA and AF properly, then its developers would have implemented them natively.
Maybe. Remember that we're talking about a game with a relatively limited number of options; consider that simple things like fullscreen and v-sync weren't always there. Again, I'm sure you're right about "how things work" beneath it all, so to speak, but I've seen a quality improvement on more than one PC with no side effects, so I figured I'd comment with my experiences. I will add that I've used the third-party program "nVidia Inspector" to set profiles for these games, while leaving the nVidia control panel itself alone (I think this shouldn't matter in theory, but I wouldn't know for sure).
But it's a pointless argument otherwise, unless you have garbage collection issues or need to keep tight RAM allocation management for that application.
Ha, a lot of that went over my head! It sounds like I should basically leave it alone then, especially since it isn't broken (even though I seem to like "fixing" things until they are broken...).
That's how new things are created, or solutions to otherwise unfixable problems are found.