Hey, I'm having a problem where Minecraft seems to be using my motherboard's integrated graphics, which are only meant as an emergency backup, rather than the GeForce GTX 750 Ti installed in the PCI Express slot. The FPS struggles to reach 60 even on the lowest settings, even though the game ran better on an older GPU.
I know MC has to be using the integrated GPU instead of the Nvidia one (and that it isn't a CPU bottleneck), because I can run nearly any other game at max settings and a minimum of 60 FPS, no problem. If games with far more sophisticated graphics and effects run crystal-clear and silky-smooth on my PC, the only real explanation I can think of for Minecraft chugging is Java ignoring the powerful card and running on integrated graphics instead.
Does anyone know how to force Minecraft to use the more powerful Nvidia card instead of the built-in one without disabling/removing anything? I'm on a desktop computer that doesn't have Nvidia Optimus due to not being a laptop, so setting a card preference in Nvidia Control Panel or right-clicking the game .exe and choosing "Run with..." aren't available options for me. Device Manager can't help me either, since it only lists the 750 Ti's adapter and not the integrated one.
I do have one potential solution already, but I'd like to check for any others first. I could technically go into the BIOS on startup and turn off the integrated GPU entirely as a last resort, but I'd prefer to leave it functional for low-stress applications and as an emergency fallback in case the dedicated GPU suddenly kicks the bucket. Also, monkeying with the BIOS just doesn't sound safe to me, so I'd like to avoid fiddling with things that could thoroughly brick my computer unless I have to.
Edit, adding forgotten info: I used to play Minecraft with an Nvidia GeForce GT 520 graphics card a while back, and I regularly got 120 FPS on Far view distance with no mods or texture/resource packs installed. If that bargain-bin GPU could handle Minecraft so well in tandem with my CPU, I fail to see how a far more powerful card like my current GTX 750 Ti could actually do worse unless it wasn't being used at all.
Edit 2: Tried the latest snapshot (14w31a) to see if it'd somehow help; performance is slightly better, but Minecraft still isn't using my Nvidia GPU at all. I've also made sure it has half of my 8 GB of RAM allocated, with no luck either.
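For reference, I'm setting that allocation through the launcher profile's JVM arguments, along these lines (the exact heap sizes are just what I picked):

    -Xmx4G -Xms2G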
PC Specifications, for anyone who can use them to help:
Motherboard: Asus M4N68T-M V2
CPU: AMD Phenom II X4 965 (4 cores) @ 3.4 GHz
GPU: Nvidia GeForce GTX 750 Ti Superclocked
RAM: 8 GB (Maximum amount my PC can have installed)
PSU: Dynex DX400WPS 400 W Power Supply
Sound Card: VIA HD Audio (No model or product specification available)
OS: Windows 7
Minecraft Versions I'm Having Issues With: All versions, technically, but I've tested on 1.6.4, 1.7.2 and 1.7.10 with the same results.
It's not up to Minecraft which GPU its graphics requests are routed to.
It sounds like you have one of those systems with non-configurable 'dynamic' switching. If your driver doesn't want to run Java on the dedicated GPU then it won't, and the only way to change that is to force everything to run on the dedicated GPU.
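If you want to see for yourself which adapter Java's OpenGL context actually lands on, a quick LWJGL test would settle it. Here's a rough sketch, assuming LWJGL 2 (the same library Minecraft uses) is on the classpath:

    import org.lwjgl.opengl.Display;
    import org.lwjgl.opengl.DisplayMode;
    import org.lwjgl.opengl.GL11;

    public class RendererCheck {
        public static void main(String[] args) throws Exception {
            // Create a GL context the same way Minecraft does;
            // the driver decides which GPU ends up backing it.
            Display.setDisplayMode(new DisplayMode(640, 480));
            Display.create();
            // These strings name the driver vendor and the GPU actually rendering.
            System.out.println("GL_VENDOR:   " + GL11.glGetString(GL11.GL_VENDOR));
            System.out.println("GL_RENDERER: " + GL11.glGetString(GL11.GL_RENDERER));
            Display.destroy();
        }
    }

If GL_RENDERER names the GTX 750 Ti, the game is on the dedicated card and the problem lies elsewhere. (If I remember right, the in-game F3 debug overlay reports much the same renderer string.)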
Alright, and how would I force everything onto the dedicated GPU? I've been reading through my motherboard's manual (Asus M4N68T-M V2), and the only relevant BIOS settings seem to be priority-based (PCI-E > PCI > IGD or IGD > PCI > PCI-E) rather than any on/off toggles.
Also, I'm completely certain that my PC has some form of integrated GPU, since the PSU died a while back and took a crappy old dedicated GPU with it. I had to run on the integrated graphics for a bit until my 750 Ti arrived in the mail, so they're definitely there. The problem is that the integrated adapter doesn't show up in Device Manager, dxdiag, or any other list, meaning I have no control over it outside of possibly the BIOS.
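The closest thing I've found to a direct check is dumping the adapter list from a command prompt, something like this (I'm assuming WMI only reports enabled devices, so it may simply hide a disabled chip):

    wmic path win32_VideoController get Name

On my machine that only ever lists the 750 Ti, same as Device Manager.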
I'm going to assume my only options are to either poke around in the BIOS to see if the priority order is wrong (keeping an eye out for any toggles), or look/wait for a Java update or tweak that makes it use the dedicated GPU.
Given that the integrated GPU is missing from every diagnostic and report, though, I'm starting to think the game might be running on the Nvidia card after all and just suffering from poor optimization. I've read a lot of forum posts saying there's often a system mechanism that disables the integrated GPU when a working card is installed in the PCI-E slot, but I don't know for sure whether that's the case here. If it really is auto-disabled, then Minecraft just plain doesn't like my otherwise-powerful GPU/CPU, which means going deeper down the tweaking rabbit hole.
Edit: Then again, quite a few people have made videos testing the 750 Ti with shader mods and getting excellent results. I'm gonna try driver cleaning, switching to Java 8, and a whole bunch of other options in the vain hope that one of them will magically fix the issue.
Alright, I feel extremely dumb right about now. Apparently, vanilla Minecraft had VSync set to On without me knowing, which caps the framerate at the monitor's refresh rate (60 Hz here). I turned it off while trying 14w31a again on a whim, and suddenly I'm getting anywhere from 150 to 300 FPS. There's no screen-tearing with VSync off, so either it was conflicting with Nvidia Control Panel or it was just plain unnecessary.
Testing other game versions, both vanilla and modded, gives the same results: with the frame limiter on, any performance hit dips the FPS below 60; with it off, the FPS caps out around 300 and never drops below 100, let alone 60. No screen-tearing appears in any situation, so VSync was doing nothing but harm here.
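For anyone else who runs into this, here's my rough sketch of what I think is going on under the hood (LWJGL 2 style, not Mojang's actual code): VSync and the frame limiter are two separate mechanisms, and VSync was the hidden 60 FPS cap.

    import org.lwjgl.opengl.Display;
    import org.lwjgl.opengl.DisplayMode;
    import org.lwjgl.opengl.GL11;

    public class VsyncDemo {
        public static void main(String[] args) throws Exception {
            Display.setDisplayMode(new DisplayMode(640, 480));
            Display.create();
            // Buffer swaps now wait for the monitor refresh:
            // a hard 60 FPS ceiling on a 60 Hz screen.
            Display.setVSyncEnabled(true);
            while (!Display.isCloseRequested()) {
                GL11.glClear(GL11.GL_COLOR_BUFFER_BIT);
                Display.update();     // swap buffers; blocks here while VSync is on
                // With VSync off, a software limiter would go here instead:
                // Display.sync(120); // sleeps just enough to cap at ~120 FPS
            }
            Display.destroy();
        }
    }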
Clearly, the problem is not what I thought.
Thanks for the help anyways, gerbil, and sorry to have wasted your time. I'm going to keep testing other versions and mods to make sure it's not a fluke, but this certainly seems to be the solution at the moment.
Edit: Jesus H. Christ, I tried 1.7.10 again and started getting framerates of around 400-500 with OptiFine. The dedicated GPU is most definitely in use.