After almost three hours of searching for various solutions and none of them working, I still cannot get Minecraft to utilize the GPUs over the CPU.
I'm getting solid 120+ FPS in Battlefield, Starcraft II, Overwatch, etc. on max settings, but Minecraft refuses to cross 24. (strange number to get stuck at, considering VSync is enabled)
Current system specs:
OS: Windows 10 Ultimate
CPU: AMD FX 8350 (stock clock)
GPU: Dual EVGA GeForce GTX 760 4GB (in SLI, stock clock)
RAM: 24GB DDR3 1866MHz max (8GB x2 & 4GB x2)
MB: ASUS Sabertooth 990FX R2
PSU: 950W Thermaltake Silver, plenty of power for the system.
JRE: 1.8u101 (latest, as of time of posting)
This is a list of everything I've tried:
GENERAL STUFF:
Update Graphics Drivers: Running the latest version, according to GeForce Experience.
Update Java: Running the latest, 64 bit.
Reboot: This problem has been around since I built this rig.
Use Fullscreen: No difference.
NVIDIA CONTROL PANEL:
Enable SLI: It's enabled.
Add Java to NCP: I've added both java and javaw for both MC's special runtime and the 8u101 I have installed. No difference.
Select the Preferred GPU for Java: This is a desktop. That option is not present.
Set Rendering mode for Java: Already set. Nothing.
I've noticed the CPU shoots straight to 100% utilization on all cores, even in vanilla, while the GPUs remain basically idle at around 30% of TDP and 10-30% usage (thanks, Rainmeter). VRAM stays around 1GB on the primary card, and Java's RAM usage sits around 400-600MB (it has 8GB allocated). I tried installing OptiFine, but that actually hurts FPS, dropping it to a stable 16. The GPU stays cold.
Any thoughts as to what's going on?
Myra ta Hayzel, Pal Kifitae te Entra en na Loka
Afraid there's not much you can do for this one. Minecraft just isn't coded to utilize the GPU.
My first mod =D
Where do fairy tales like this come from? If this were true, you wouldn't see people with great GPUs getting screaming FPS in Minecraft.
To the OP: post a screenshot of the F3 debug screen.
Minecraft largely uses the CPU instead of the GPU. This is because Java is not designed for making games; it is a general-purpose language designed for portability and utility. As such, Java depends more on the CPU, since a computer system is more likely to have a CPU than a GPU.
That being said, some systems are better at shunting graphics-related work to the GPU than others. Some systems greatly benefit from a dedicated graphics card, particularly gaming systems that are designed to utilize GPU functions.
The best approach is generally to upgrade the CPU first, then the GPU. A great dual GPU is useless if you have a slow CPU.
Also, why do you have 8 gigs allocated to Minecraft? Vanilla Minecraft usually only needs one gig, while modded generally runs best at 4. Allocating too much RAM can actually DECREASE performance in some cases.
And finally, if you use the shaders mod, then the GPU becomes WAY more important.
EDIT: After more closely researching your CPU specs, I no longer think that bottlenecking is the problem. As an experiment, try running OptiFine with shaders and report GPU usage.
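On the allocation question above, a quick way to sanity-check what the JVM actually received from the launcher's `-Xmx` argument is to print the heap limits from inside a running JVM. This is a minimal pure-JDK sketch (the class name `HeapCheck` is just an illustration, not anything Minecraft ships):

```java
// Prints the heap limits the running JVM actually got.
// Compile and run with, e.g.:  java -Xmx4G HeapCheck
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024L * 1024L;
        System.out.println("Max heap (-Xmx):   " + rt.maxMemory() / mb + " MB");
        System.out.println("Committed heap:    " + rt.totalMemory() / mb + " MB");
        System.out.println("Free in committed: " + rt.freeMemory() / mb + " MB");
    }
}
```

If the max heap reads far below what the launcher's JVM arguments request, the flag isn't reaching the runtime the game actually uses.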
Sorry for the late response.
I assume you're referring to the right portion of the screen, so here's that. The weird resolution is because it's in windowed mode. (Yes, I tried fullscreen)
I attempted this earlier (running shaders for the first time in a while is how I started to notice this problem) and tried again just now. It doesn't really matter which shader pack I select; the FPS is stuck at 12-ish and GPU utilization remains low. Changing render resolution or shadows has zero effect on the framerate: it stays at 12 at both 0.5x and 2x.
Edit: My friend with a similar GPU (a 760 with only 2GB) is reporting no issue and getting 60 FPS with shaders, along with a few of my friends with AMD cards. The owner of a single GTX 1070 and one with an R9 280x are, however, experiencing my issue. It really seems to be a coinflip.
Sounds about right. It really all depends on the system.
I've got an FX-6300 @ 4.3GHz and an R9 270X, and they run very well, so your specs are absolutely not an issue.
Check that your CPU isn't throttling (that CPU should not be run with the stock cooler; I know FX chips, and that one in particular, run hot).
Check in either Task Manager or HWMonitor/CPU-Z that you're running at full speed (4.0GHz stock).
If you're not, check temps. And if your temps are bad, find a better cooling solution.
If your temps are fine, check power settings. I had a friend whose i7 was running at 700MHz instead of 3.5GHz because, for some reason, Power Saver was on and it limited the CPU to like 5%.
Want to host a dedicated server yourself, easily, and for free? Click here!
Need to post a DXDiag log and don't know how? Here you go!
I make YouTube videos! Why not go check 'em out?
My specs:
R7 1700 (8c/16t) @ 3.8ghz
Cryorig H7 cooler
G1 Gaming GTX 1080 8gb @ ~2000mhz core
16gb DDR4 3200mhz ram
250gb 850 EVO SSD
240gb Sandisk SSD Plus
1tb WD Blue 7200rpm HDD
1tb Generic 2.5" 7200rpm HDD
500gb WD 7200rpm HDD
Win 10
3x 24" 1080p Monitors @75hz
Click me, and let all your dreams come true....
Ooh, good point!
You seem to have some gross misunderstandings about everything involved. Minecraft, while CPU-heavy, does actually utilize your GPU a good deal. Minecraft uses an OpenGL binding for Java called LWJGL, which stands for Lightweight Java OpenGL. The binding is lightweight in the sense that it is very minimal in what it does: it just provides access to OpenGL functions without much in the way of making them Java-like. It's still calling the same underlying libraries as any other program using the same version of OpenGL.
OpenGL, on the other hand, has some interesting things to it. It can emulate features your hardware doesn't actually support, either with CPU implementations or by using a combination of hardware functions that produce similar results. These sorts of things are handled by the graphics driver, and unfortunately some graphics drivers are really bad about this. However, the OpenGL versions that Minecraft uses have been around a while, and any remotely decent graphics card definitely supports them in their entirety without those weird emulation behaviors.
One thing I can think of that may affect it: if the installed graphics driver didn't include 64-bit OpenGL libraries, the 64-bit JVM wouldn't be able to use the hardware bindings and would fall back on the emulation layer (provided one is implemented). That would be incredibly slow, as described; I can't picture even the best CPU-side implementation of OpenGL reaching 20-30 FPS without unrealistic optimization or blind luck. A possible solution would be to install the 32-bit version of Java, which can exist side by side with the 64-bit version, and see if things improve. Java itself does have some interoperability between 32- and 64-bit versions; I have certainly seen cases in the past where programs were written in 64-bit Java but made some calls into 32-bit Java functions for legacy support. I also know that Java 8 is incredibly weird when it comes to LWJGL, to the point where certain configurations (like PAE, which is enabled by default on most new mainboards) can cause all sorts of crashing and odd behaviors.
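Before swapping runtimes, it's worth confirming which bitness the JVM is actually running at. A small pure-JDK sketch (note that `sun.arch.data.model` is a HotSpot-specific property; `os.arch` is the portable fallback):

```java
// Reports whether the running JVM is 32- or 64-bit.
// sun.arch.data.model is a HotSpot convention ("32" or "64");
// os.arch is the portable fallback (e.g. "x86" vs "amd64").
public class BitnessCheck {
    public static void main(String[] args) {
        String model = System.getProperty("sun.arch.data.model", "unknown");
        String arch  = System.getProperty("os.arch");
        System.out.println("JVM data model: " + model + "-bit");
        System.out.println("JVM os.arch:    " + arch);
    }
}
```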
After doing another series of tests, the CPU appears to be functioning normally. A closed-loop liquid cooler is dedicated to the CPU itself, and both the water temperature and the CPU's readout are at the expected values for a mid-range game.
Interestingly, when monitoring the CPU's individual cores, I can see a single core, which is different every time, immediately max out for a few minutes before the system offloads the thread to the next core in an attempt not to burn out the chip. (Woo, single-threaded applications... :c) The CPU is being dynamically clocked and will not raise itself above 30% of max speed, as it doesn't see the single core's usage as enough to do so. Booting up a virtual machine in the background, interestingly, increases performance (though not by much), as it forces the CPU to step up a bit.
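For reference, the process-wide load can also be sampled from inside the JVM. This is a sketch using the HotSpot-specific `com.sun.management` extension (the class name `LoadCheck` is illustrative); on an 8-core FX, a fully single-threaded workload would cap out near 1/8, about 12.5% process load:

```java
import java.lang.management.ManagementFactory;

// Samples process-wide CPU load via the HotSpot management bean.
// A steady value near 1.0/availableProcessors() suggests one saturated thread.
public class LoadCheck {
    public static void main(String[] args) throws InterruptedException {
        com.sun.management.OperatingSystemMXBean os =
            (com.sun.management.OperatingSystemMXBean)
                ManagementFactory.getOperatingSystemMXBean();
        int cores = os.getAvailableProcessors();
        os.getProcessCpuLoad();               // first call primes the counter
        Thread.sleep(1000);
        double load = os.getProcessCpuLoad(); // 0.0..1.0, or negative if unsupported
        System.out.printf("Cores: %d, process CPU load: %.1f%%%n", cores, load * 100);
    }
}
```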
This is feeling like a case of Java being Java and not wanting to do much, and the single-threaded nature of the game doesn't help. I'm curious to see the effects of some of the other versions of the game built in other languages and with more threads, particularly the Windows 10 Edition.
Still though, if it works for some people, there has to be some way to get it working for others including myself.
Edit: Looks like I posted at the same time as @default_ex. I will go ahead and attempt your solution, but Minecraft packing 8u25 and refusing to use anything else will be tricky to work around. Symbolic-link madness may prove effective or seriously break things; time to find out.
Edit 2: Nope. 32-bit Java refuses to take more than 512MB of RAM, which is not enough for the game to run. However, I used the same trick to force Minecraft to use 8u111 (which came out since the creation of this post) and had no luck there either.
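When forcing the launcher onto a different JRE via symlinks, it helps to confirm which runtime actually ends up executing. A tiny pure-JDK sketch (hypothetical class name `RuntimeCheck`) that prints the properties to check:

```java
// Confirms which Java runtime is actually executing, useful when
// redirecting a launcher onto a different JRE via symbolic links.
public class RuntimeCheck {
    public static void main(String[] args) {
        System.out.println("java.version: " + System.getProperty("java.version"));
        System.out.println("java.home:    " + System.getProperty("java.home"));
        System.out.println("data model:   "
            + System.getProperty("sun.arch.data.model", "unknown") + "-bit");
    }
}
```

If `java.home` still points at the bundled 8u25 directory, the symlink trick didn't take.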
I'd suggest turning off all the CPU enhancements (Cool'n'Quiet, Turbo, etc.) in your BIOS.
There is absolutely zero reason that MC should be maxing anything on your CPU.
Mine will spike up to ~90% on all cores when speeding around generating chunks in Spectator, but all 6 cores usually stay around 60%, so yours should be fine.
It wasn't so much to run with 32-bit Java, but rather to have it there in case something was attempting to load a 32-bit library; we seem to have ruled that out, though.
Something else to try is to go into your BIOS/UEFI and try to disable PAE (Physical Address Extension). This places things like your graphics card's memory at an address after system memory and LWJGL really seems to struggle with that on some systems for no particular reason.
The CPU limitations you're mentioning sound like C1E features wreaking havoc. Those are worth disabling in the BIOS/UEFI to avoid all the problems they create. They don't work very well at achieving their goal anyway, which was to reduce power consumption by utilizing multiple CPU multipliers.
That it happens in both a virtual machine and the host itself leads me to believe something is up with the OpenGL support on your particular machine. Any of the good VMs out there (I particularly like VirtualBox) have direct bindings in place that give near-native performance.
An idea to try, which I ran into with a pair of old GeForce 9600 GTs: go into the driver profile for Minecraft via the Nvidia control app and disable SLI. I ran into a problem in the past where running Minecraft in SLI would produce atrociously bad performance, but running without it was phenomenally good. Along these lines, editing the profile with Nvidia Inspector to force MSAA or no AA may help as well. The Nvidia control app kind of sucks at overriding AA settings and sometimes decides to use absurdly high settings regardless of what you set in the profile or the game. The Inspector profile editor allows you to set very specific modes that you normally don't see exposed in the control panel app.
Alright. Reading up on the manual for your mainboard, here are some things to look into to make sure you don't have configuration issues.
Since you're running mixed memory modules, they need to be placed in specific sockets. From left to right the sockets are A1, A2, B1 and B2; the A and B denote memory channels. To utilize dual channel for maximum performance, you want your biggest sticks in A1 and A2 and the smaller sticks in B1 and B2. This way your larger sticks will be engaged in dual-channel mode and the smaller sticks in single-channel. That board unfortunately doesn't support dual channel with mixed memory; it will simply use the larger of the two as the dual-channel memory and allocate the rest as single-channel. Note that their wording isn't so specific, so you might have to try the big sticks in A1 and B1 with the small sticks in A2 and B2 to get dual channel across the larger sticks.
You want your graphics cards in the first and third PCI-E slots. This is the recommended layout for maximum performance in a dual-SLI setup on your board.
Make sure your hard drive is connected to the top-most SATA connectors. The bottom connectors are on an ASMedia controller, which I've never had any good experience with aside from DVD drives, which are not at all sensitive to bussing problems.
All the rest will be settings in the BIOS "Advanced Mode".
- Set AI Overclock to Auto.
- Set CPU Spread Spectrum to Disabled.
- Set PCIe Spread Spectrum to Disabled.
- Set EPU Power saving to Disabled.
- Set OC Tuner to Disabled.
- Leave DRAM Timings on Automatic unless you fancy spending a night finding some common denominators with those sticks.
- Load-line Calibration to Auto. There's more than one of these to set.
- CPU Voltage to Auto and Spread to Disabled. Spread appears when Voltage is Auto.
- Cool n Quiet disabled.
- C1E disabled.
- SVM Enabled.
- Initial Graphics Adapter to PCI.
- HPET to Disabled. I've never seen anything actually use it.
That's all the settings I can see that could potentially cause a problem, aside from the ones you're likely too scared to touch. This is pretty close to the default configuration, with changes to disable known problem features. The features disabled here are advertised to save power, but the savings they provide are so minimal that they're measured in tens of dollars per year. I don't know about you, but I find gaming without hiccups worth that much.
LWJGL stands for Lightweight Java Game Library.