Hello there. I'm currently running two Radeon HD 5770s in CrossFire with an AMD FX-8120 at 3.1 GHz, and after checking temps and loads while playing Minecraft, I discovered that my CPU socket and core temps were skyrocketing.
I know that Google is my friend, so I looked around but was unable to come up with a solution. Some people had load issues on their GPUs/CPUs, but my problem is that my graphics cards are completely idle while Minecraft is being played (no video RAM is in use, and their GPU usage remains unchanged), while my CPU's socket temp soars to 65°C if I let it (the point at which I've set my computer to shut down).
To put this in perspective: when playing Skyrim on max settings I hit maybe 50-52°C socket and never over 36°C core, while my graphics cards reach about 60°C.
I didn't have this issue, for whatever reason, before I did a clean Win7 install. Java is 64-bit, as is Windows. When I capped the framerate using OptiFine, the CPU socket temp still stayed around 58-60°C. This leads me to believe Minecraft isn't touching my graphics cards at all and is rendering in software on the CPU (the FX-8120 has no integrated GPU, so there's nothing else for it to render on).
If anyone has any ideas as to how I could coax Minecraft into using my GPUs like it used to, that would be much appreciated. I'll be glad to supply any other info.
Extra: I also realized that whenever a Java-heavy webpage is open, the same thing happens to a lesser degree.
Have you tried reinstalling your graphics driver? I know you've heard this countless times, but as someone who knows a little bit about OpenGL (which Minecraft uses to render), I can only guess that the rendering is being done on your CPU instead of your GPU, probably because your operating system could not find the graphics driver. Can you try downloading GPU Caps Viewer (link, ignore the fake download button on the left) and see what it reports for GL_VERSION and GL_RENDERER under the OpenGL tab? If GL_VERSION says either 1.0 or 1.1, the rendering is probably being done on your CPU. If you're wondering why Skyrim uses your GPU while Minecraft doesn't: Skyrim uses Direct3D, which ships with the Windows operating system and updates automatically through Windows Update.
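If you'd rather check from Java itself, here's a minimal sketch that prints the same strings GPU Caps Viewer shows. It assumes LWJGL 2 (the library Minecraft uses) is on the classpath; the class name GLInfo is just something I made up for this example:

import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.GL11;

public class GLInfo {
    public static void main(String[] args) throws Exception {
        // Create a minimal OpenGL context, the same way Minecraft does via LWJGL.
        Display.setDisplayMode(new DisplayMode(320, 240));
        Display.create();

        // These strings identify whichever driver/device got the context.
        // "GDI Generic" / "Microsoft Corporation" means Windows' built-in
        // software fallback (OpenGL 1.1) -- i.e. rendering on the CPU.
        System.out.println("GL_VENDOR:   " + GL11.glGetString(GL11.GL_VENDOR));
        System.out.println("GL_RENDERER: " + GL11.glGetString(GL11.GL_RENDERER));
        System.out.println("GL_VERSION:  " + GL11.glGetString(GL11.GL_VERSION));

        Display.destroy();
    }
}

If GL_RENDERER comes back as "GDI Generic" rather than something mentioning ATI/AMD or Radeon, the driver isn't being picked up and everything is being drawn on the CPU, which would match your temps.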
If anyone thinks I'm wrong, feel free to correct me.
Minecraft doesn't choose which GPU it uses. If you have switchable graphics, then it is the switchable graphics driver which either:
* statically uses a specific GPU all the time
* dynamically "detects" when to switch based upon some heuristics
* statically uses a specific GPU for specific apps
We know that certain switchable graphics drivers don't switch for OpenGL apps (e.g. Minecraft).
You can try associating Minecraft with your dedicated GPU (right-click the Minecraft program for a menu option; if it's not there, try the desktop right-click menu), or hard-switching to always use the dedicated GPU.
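Either way, re-running the GL_RENDERER check from the sketch above after changing the setting is a quick way to confirm that the dedicated card is now the one handling OpenGL.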