Alright, so I just bought my MSI GT70 laptop about three months ago, and there is no possible way I should be lagging with my specs.
CPU: Intel Core i7-3610QM
VGA: NVIDIA GeForce GTX 675M / 4GB GDDR5
RAM: 16GB DDR3 (4GB × 4)
HDD: 750GB 7200RPM + 128GB SSD (Super RAID)
I will have like 30 seconds of excellent performance, as expected, and then out of nowhere I will drop to like 1 FPS and lag uncontrollably. It doesn't make any sense to me: even with graphics as low as possible, render distance on short, smooth lighting off, everything off or low, the same thing happens. Sometimes I even lag out and it says I ran out of game memory or something. Does anyone know what could help?
If that doesn't help, then try reinstalling your graphics drivers, though I really can't see that helping much either.
It could be your computer's thermal sensor overreacting and automatically downclocking your CPU/GPU so it stays cool (I had an old laptop that did this). Core Temp is a program that displays temperature information.
My Java is
Java SE 7 Update 9
Windows 7 (6.1)
but my Java architecture is 32-bit.
They make a 64-bit version, I believe? But I don't understand why it won't let me download that version when I go to their site. I use Google Chrome, and it says they are all up to date.
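If you're not sure which Java you actually have, the version banner tells you. A quick check (assuming `java` is on your PATH; a 64-bit JVM prints "64-Bit" in its banner, a 32-bit one doesn't):

```shell
# Ask the JVM to identify itself; a 64-bit build reports
# something like "Java HotSpot(TM) 64-Bit Server VM".
if java -version 2>&1 | grep -q "64-Bit"; then
    echo "64-bit Java"
else
    echo "32-bit Java"
fi
```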
I use 32-bit Java on a 64-bit computer. What do the CPU/RAM clocks look like when MC is lagging?
I got 64-bit Java and it didn't help. I went into the command prompt and typed "wmic cpu get CurrentClockSpeed" and it came up with 2301 while I was lagging. I've tried everything everyone has suggested and it still lags unbelievably.
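For what it's worth, 2301 MHz is roughly the i7-3610QM's base clock, so that reading alone doesn't prove throttling — you'd want to compare the current clock against the rated maximum. A rough sketch of that comparison (the Windows command is shown in the comment; 3300 MHz is this chip's turbo clock, used here as an assumed maximum):

```shell
# On Windows, read both values at once with:
#   wmic cpu get CurrentClockSpeed,MaxClockSpeed
# Here we plug in the number from this thread (2301 MHz) and the
# i7-3610QM's turbo clock (3300 MHz) as an assumed maximum.
current=2301
max=3300
if [ "$current" -lt $((max / 2)) ]; then
    echo "likely thermal throttling"
else
    echo "clock looks normal"   # prints this: 2301 is well above half of 3300
fi
```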
I figured out the problem! Even though I had set my NVIDIA GTX 675M as my default graphics card, Minecraft was running on my integrated Intel HD 4000 graphics. I have to right-click and tell it to run with my NVIDIA card. Thanks for all the help, guys! I appreciate you taking the time to try and help me; we need more nice people in the world like you fellas.
No problem.
Sort of funny that the card wasn't enabled by default...
Minecraft appears not to be a 3D-demanding game, yet it is. Optimus, the graphics switcher in NVIDIA's mobile chipset driver, tends not to switch from the low-power Intel integrated GPU to the discrete GPU for some 3D-demanding applications, Minecraft being an example.
However, the OP seems to have found the easy (and only) fix for that.
That's because it's not true: Minecraft is a 3D-demanding game.
The root cause is actually that the software in charge of switching (Optimus) doesn't think Java requires discrete graphics and/or doesn't detect when an application is using OpenGL.
Something isn't right if you're getting an out-of-memory error with 16GB.
You are running a 64-bit OS, correct?
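If you're not sure, it's quick to check (the Windows command is shown as a comment; the last line is the Linux/macOS equivalent):

```shell
# On Windows, from a command prompt:
#   wmic os get OSArchitecture        (prints "64-bit" or "32-bit")
# On Linux/macOS:
uname -m    # x86_64 means a 64-bit OS
```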
Also, you might not be allocating enough RAM to Minecraft. You have 16GB of memory (which is a lot), so it wouldn't hurt to give Minecraft a bigger chunk of that to work with. Here's how to run Minecraft with a specific amount of RAM:
http://www.minecraftforum.net/topic/1395030-allocate-more-memory-for-minecraft/
Also, nice computer. It should be capable of running Minecraft about five times over without much lag.
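For reference, that link boils down to passing heap-size flags to the JVM when launching the jar. A minimal sketch (the jar name and sizes are just examples; `-Xms` sets the starting heap, `-Xmx` the maximum):

```shell
# Build the launch command; with 16GB of system RAM,
# a 2GB maximum heap is plenty for vanilla Minecraft.
XMS=1G
XMX=2G
CMD="java -Xms$XMS -Xmx$XMX -jar Minecraft.jar"
echo "$CMD"    # prints: java -Xms1G -Xmx2G -jar Minecraft.jar
```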
Urggh. Darn it. You keep beating me to these posts.
http://www.java.com/en/download/manual.jsp
I did not know that, thanks!