Read this; I'm almost sure I know what your problem is. If you have a Core i7 and you usually run at 16 FPS, something is extremely wrong. You have a very powerful processor and should be running Minecraft at over 300 FPS with Performance set to Max FPS. You must be running it on the wrong graphics card. Core i7 processors usually come with a built-in graphics processor (integrated graphics), and it is weak. Do you have an Nvidia card or any other graphics card besides the one built into the processor? If so, you should find out how to change the settings so that your other graphics card is the default graphics processor. If you have Nvidia, just right-click Minecraft, select "Run with graphics processor", then select "Change default graphics processor" and change it from integrated graphics to your other GPU. If you have another kind of graphics card, look up how to make it the default graphics processor. If you do not have another graphics processor at all, then it is a COMPLETE waste of money to have a Core i7 and no good graphics card. That is like buying the most expensive car ever made but leaving out one key engine part, making it perform like a bicycle. Only genuinely weak computers should get below 20 FPS in Minecraft.
Edit: I have a weaker version of the i7 than you do, and I only have 6 GB of RAM. Yet I can run Minecraft at over 250 FPS with Performance set to Max FPS.
THANK GOD for this post! This was my problem all along! It was running on the Integrated graphics card instead of my Nvidia... Hallelujah I can finally play UHC!!!
I've installed 1.6.1 on three different computers and all of them max out at 4 FPS. One of them (the least powerful) starts at about 20 FPS but then slowly drops to 2 or 3 FPS after about 30 seconds, even if I do absolutely nothing in the game. The best computer is a top-of-the-line Dell Optiplex; the GPU on its motherboard rivals ones on high-end graphics cards.
Here are my video settings:
Graphics: Fast
Render Distance: Normal
Smooth Lighting: Off
Performance: Max FPS
3D Anaglyph: OFF
View Bobbing: OFF
GUI Scale: Auto
Advanced OpenGL: OFF
Brightness: Bright
Clouds: OFF
Particles: Minimal
Server Textures: OFF
Fullscreen: OFF
Use VSync: OFF
Server: Moocraft (mc.moocraft.org)
Using these settings, Windows Task Manager shows Minecraft using between 46% and 54% of my 2-core CPU: up to 80% of core #1 and up to 50% of core #2. Current FPS is 2 to 3 with occasional drops to 0 or 1. This is with no movement whatsoever, and no visible animals or players in the area.
I also play a 3D MMO game called Fiesta (www.outspark.com). This game has far more intense graphics than Minecraft, and I play it in full screen mode (1650x1080). I get between 40 and 60 FPS even on the slowest computer. When I play it at work, it doesn't matter which video card I play it on, I still get great performance.
This tells me that my computers aren't the problem; it's the program. I found out that Java is an interpreted language, which is no better than BASIC. Being written in Java puts Minecraft in the minor leagues compared to native-code games like Fiesta or WoW. Minecraft has an '80s look to it, and now it has the performance to match.
I'm sorry I bought this game. $30 is way too much to spend for a game that can't even get out of its own way. I have friends that want to buy the program, but now I'm forced to tell them to not buy it because it won't run on their computers. I just tested the program on a friend's computer and it fails as well.
-Ricky
PS. If you claim you're running at 150 FPS, I don't believe you.
I run at about 200 FPS. The problem is that the new launcher runs terribly on integrated graphics. If you configure your dedicated graphics card settings so that the new Minecraft launcher runs on your dedicated card instead of integrated graphics, that will fix the problem. It's purely an issue with the integrated graphics processors built into most Intel CPUs.
There isn't any way to configure which card I use over the other. The computer decides which card to use, usually based on which screen the window is on.
And what about my other computers? They have only the integrated graphics card. Are you saying that no computer can run Minecraft if it has the integrated card? That would rule out about 99% of all computers. Not exactly the best way to sell software.
I'm typing this in a Firefox browser whose base priority was raised to Above Normal. This was necessary to get the browser to run, since Minecraft stole so much processing time from the system that nothing else would run. The rest of the system essentially freezes when Minecraft is running. And that's while minimized, which takes the graphics card out of the equation.
Tonight, on this particular computer, Minecraft started up at 0 FPS. I raised its priority to Above Normal and got enough FPS to log into my server. I then reduced it to Normal, minimized it, and raised Firefox's in order to be able to type this. There are no other applications running on the system.
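For what it's worth, the same kind of priority juggling can be done from inside the JVM for its own threads. This is only a thread-priority sketch (the class and thread names are made up for illustration), not the Windows process priority described above, and it's just a scheduling hint, not a guarantee:

```java
public class PriorityDemo {
    public static void main(String[] args) {
        // A stand-in for a busy render loop: spin for ~100 ms
        Thread render = new Thread(() -> {
            long end = System.nanoTime() + 100_000_000L;
            while (System.nanoTime() < end) { /* spin */ }
        });
        // Roughly analogous to bumping a process to "Above Normal":
        // hint to the scheduler that this thread matters more
        render.setPriority(Thread.MAX_PRIORITY);
        render.start();
        System.out.println("render thread priority: " + render.getPriority()); // prints 10
    }
}
```

Whether the OS actually honors Java thread priorities varies by platform, which is why people fall back to Task Manager tricks like the one above.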
2.8 GHz, 1 core 2 threads, 4 GB RAM. 100% of one core used by Minecraft. Altitude 230 feet so the ground is invisible. Even with nothing visible it's running at 0 or 1 FPS constantly. There is something WRONG with this game!!!
Perhaps one of you people who run 100+ FPS would post your settings so I can get my computers running at that speed?
What I would suggest is:
1. Make sure you have at least Java 7 (64-bit) Update 25.
2. Make sure Minecraft has at least one full GB of RAM allocated to it.
3. Make sure Minecraft is running on an Nvidia, ATI, or AMD GPU with at least 1 GB of VRAM (preferably GDDR5).
4. Make sure it is NOT using built-in graphics of any sort (Intel HD, etc.).
5. Make sure the in-game settings are appropriate for your PC/Mac (don't use the highest settings on a badly outdated computer).
Notes:
1. Nvidia card and program settings can be controlled from the Nvidia Control Panel.
2. For Nvidia, you should probably download the GeForce Experience program to update your drivers.
Sites:
1. http://nvidia.com
2. http://java.com
3. http://amd.com
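Steps 1 and 2 above can be sanity-checked from Java itself. A minimal sketch (the class name is just for illustration; the 1 GB threshold matches step 2):

```java
public class McEnvCheck {
    public static void main(String[] args) {
        // Step 1: report the Java version and architecture (you want 64-bit)
        System.out.println("Java version: " + System.getProperty("java.version"));
        System.out.println("Arch: " + System.getProperty("os.arch"));

        // Step 2: report the maximum heap the JVM may use (set via -Xmx);
        // Minecraft should be allowed at least 1 GB
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap: " + maxMb + " MB");
        if (maxMb < 1024) {
            System.out.println("Heap below 1 GB; consider launching with -Xmx1G");
        }
    }
}
```

Run it with the same JVM and flags you give Minecraft; if the reported max heap is well under 1024 MB, bump the allocation in the launcher's JVM arguments.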
With the new launcher I actually got an increase from 20 to 35 FPS. Here are my computer's specs:
MS Windows Vista Home Premium 32-bit SP2
Intel Core 2 Duo E4500 @ 2.20GHz Conroe 65nm Technology
2.00 GB Dual-Channel DDR2 @ 332MHz
DELL E228WFP (1680x1050@59Hz)
Intel(R) G33/G31 Express Chipset Family
"Just remember that when you are falling, turn it into a dive." ~Wookiefoot
"Why isn’t my life like a situation comedy? Why don’t I have a bunch of friends with nothing better to do but drop by and instigate wacky adventures? Why aren’t my conversations peppered with spontaneous witticisms? Why don’t my friends demonstrate heartfelt concern for my well-being when I have problems?… I gotta get my life some writers." ~Calvin, Calvin and Hobbes
1.5.2 was built with bug fixes and a lot less lag. The 1.6 update requires rendering a new mob, new items, and new blocks. Hopefully 1.6.2 will be better.
I'm still wondering this, so I have to ask - has anybody managed to get Minecraft to run off of an ATI Radeon card? I set Minecraft to High Performance in the Catalyst Control Center, and I'm still getting 5-29 FPS...
Running on a Crapalist as we speak... AMD HD 6770, 1 GB of RAM... Getting anywhere from 150 FPS down to 25 FPS in my lab, where there are a lot of things to draw. It was running a little worse, but I turned off Advanced OpenGL and that added to my frames. *shrug* Your results may vary.
Downside is that my little gadget that measures the temperature and GPU load on the card never gets above 20% load when Minecraft is running. The game uses, by default, very little of the video card. I think MS Word uses more GPU than Minecraft...
As soon as I put Optifine on, the GPU jumps to 80% or higher and my frames double.
They need to implement Optifine (or something like it) directly into the game. If Mojang can't program it, they need to hire someone who can. It's not like they are short on money at the moment, and it can't be that hard if one person is programming it in his spare time.
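Incidentally, if you want to verify anyone's FPS claims without trusting the in-game counter, the measurement itself is trivial. A hypothetical sketch, with a sleep standing in for the actual render work:

```java
public class FpsCounter {
    public static void main(String[] args) throws InterruptedException {
        long frames = 0;
        long start = System.nanoTime();
        // Count "frames" for one second; in a real game the loop body
        // would be the render call instead of a sleep
        while (System.nanoTime() - start < 1_000_000_000L) {
            Thread.sleep(10); // stand-in for drawing one frame
            frames++;
        }
        double seconds = (System.nanoTime() - start) / 1e9;
        System.out.printf("%.0f FPS%n", frames / seconds);
    }
}
```

The same counting idea (frames divided by elapsed seconds) is what the F3 debug overlay reports, just averaged over a shorter window.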
I also have a laptop running an i5 with an Intel HD 4000 GPU. The FPS sucks, but the game is still playable. Big drop in FPS from 1.5, however. Waiting for Optifine to come out and fix those issues.
A friend's computer has one of those dual GPUs. I locked out the Intel video card through Windows 7's power management and Crapalist's menu a long time ago, which locked it onto the (still pretty crappy) Radeon Mobile GPU. If I remembered how to do it, I would have made a video by now. Sadly I don't have a clue, as the process included a few beers.
You say cake? You just made me want to eat 50 and a half cakes.
It also happens in GMod 13, Team Fortress 2, and Oblivion (2006), but not in Deus Ex GOTY (1999).
No hate, I do have a desktop.