I seem to be having a problem with Minecraft running extremely poorly on a pretty decent rig.
A bit of background: I just recently built a new computer. My old computer could run Minecraft just fine at max lighting, 60 fps, mipmap level 1, 2x anisotropic filtering, and so on. Even Amplified worlds were not really a problem. The specs for the old machine are as follows:
2.3 GHz AMD 6400+
4GB DDR2 RAM
GeForce GTX 460
Meanwhile the new system, which runs Minecraft at a good framerate but suffers from a lot of in-game lag and delays, has the following specs:
AMD FX 8120 8-core (3.1 GHz)
8GB DDR3 RAM
GeForce GTX 460 (moved the old card over, hoping to get a new one soon)
Here's the troubleshooting and comparison between the two rigs. Keep in mind, this is all SINGLE PLAYER; no multiplayer is involved at all:
Old rig: Minecraft could load worlds very quickly, load chunks just fine, and had zero block lag. It usually ran at 30-60 fps. Interactions were immediate, and the game did not have to "freeze" to load new chunks or areas.
New rig: Even with settings on the lowest possible, blocks will not break immediately, or must be broken multiple times, even on regular survival or creative maps. Movement and framerate are not problematic, though at times the client will freeze up and do what I assume is loading more chunks. Worlds take 30-60 seconds to load, and making a new world takes quite some time. After adjusting multiple settings, the results remain the same (even on more graphically demanding settings).
This performance difference persists through all versions up to and including 1.7.2. I have not checked earlier versions (but why bother?), and all drivers are up to date, as is Java itself.
What is causing such a dramatic performance difference between the two machines, and, more importantly, why is Minecraft running WORSE on the machine with better specs?
EDIT: It seems this issue has been resolved by recent patching. No longer a problem.