First off, this isn't one of those "my FPS went from 200 to 35 and I'm TICKED" threads. Seems to me 35 FPS is just fine for this game.
So I happened upon a download of one of the Enterprise-D maps out there (Minetrek 2.0, to be exact).
I loaded 'er up as a single-player map on my laptop to check it out. Game ran fine, no issues at all, 30-60 FPS depending on where I was. GREAT!
Buuuuut, it would sure look better on the desktop I have elsewhere in the house. So I loaded the exact same map on the desktop, and yeah, a 30" IPS LED trumps the laptop's 17" panel. Trouble is, within 30 seconds to a minute of getting in, the game drops to, say, 1 frame per 30 seconds. Looking around can bring it back to normal FPS for a second or two, then it's back to virtually frozen. After a few minutes I get the "Java has run out of memory" crash. Question is, what on EARTH is going on here?! Is this bizarro world? In my travels it's laptops that have issues, not their desktop counterparts.
So, any ideas?
Laptop Specs:
Vanilla Minecraft 1.4.7, NO MODS, no texture packs, no nuthin.
Win 7 64bit
All drivers up to date
Java 32 bit
i7 840QM
16GB DDR3
Dual HD 5870s
Desktop Specs:
Vanilla Minecraft 1.4.7, NO MODS, no texture packs, no nuthin.
Win 7 64bit
All drivers up to date
Java 32 bit
i7 970
32GB DDR3
Dual GTX 570s
-There are no mods on either system.
-No OCing on either system.
-Both systems are ROCK SOLID doing anything else (from compiling to MAX-settings Skyrim/Far Cry 3/etc.).
-All drivers and Java are up to date.
-Don't see why I would need to increase or change any Java settings, since the defaults run fine on the laptop, which has the inferior hardware (though see the launch sketch after this list for what bumping the heap would look like, if it comes to that).
-Have tried running SLI, and running a single card.
-Have tried MANY variants of in-game settings, from tiny to max view distance, OpenGL on and off, particle differences... nothing changes the frame-rate drop and crash.
-Error occurs whether the game is maximized, windowed, or fullscreen.
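For what it's worth, if it does come down to Java settings, my understanding is you'd just launch the launcher jar from a .bat file with explicit heap flags, something like the sketch below (the jar path is only a placeholder for wherever it actually lives on a given machine, and with 32-bit Java about 1G is as high as -Xmx reliably goes on Windows):

@echo off
rem Hypothetical test launch with an explicit heap; adjust the path to the real launcher jar.
java -Xms512M -Xmx1G -jar "C:\Games\Minecraft\minecraft.jar"
pause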
I guess the issue HAS to be related to the savegame, but I dunno WHY it would work great on one system and not the other (an AMD vs. Nvidia issue, I guess, since that's the main hardware difference between the two systems?).
That assumption is because if I start a random new world and fly around... no problems, 60 FPS all the time (vsync).
Edit - Seems that if I just sit somewhere after entering the game, it runs fine; it's once I get moving and looking around that the issues arise.
Used memory in the console never reads beyond the mid-20% range.