Yesterday I replaced my Gigabyte nVidia GeForce GTX 560 Ti (1GB OC edition) with a Sapphire AMD R9 280X (3GB Vapor-X). For some reason Minecraft's frame rate is much lower than it was before. I haven't changed any other hardware.
Current configuration:
Intel Core i7-920 (overclocked from 2.67 GHz to 3.8 GHz)
12 GB of DDR3 memory (1.6 GHz, 8-8-8-24)
Gigabyte X58A-UD5 (rev 2.0)
Sapphire AMD R9 280X (3GB Vapor-X) -- using Catalyst 13.11 (22nd November 2013)
1600x1200 display
Windows 7 Professional x64
Java SE 1.7.0_15-b03 (64-bit)
Java HotSpot 23.7-b01 (64-bit)
2GB of memory initially allocated, 4GB limit
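For reference, the heap settings above correspond to launcher JVM arguments along the lines of `-Xms2G -Xmx4G`. One quick way to confirm the game is really getting the 64-bit JVM and the expected heap ceiling is a small standalone check like this (a sketch using only the standard library; run it with the same java.exe the launcher points at — `sun.arch.data.model` is a Sun/Oracle JVM property and may be absent on other JVMs):

```java
public class JvmCheck {
    public static void main(String[] args) {
        // JVM pointer width: "64" on a 64-bit Sun/Oracle JVM, "32" on a 32-bit one
        System.out.println("Data model: " + System.getProperty("sun.arch.data.model") + "-bit");
        System.out.println("Java version: " + System.getProperty("java.version"));
        // Maximum heap the JVM will grow to; reflects the -Xmx setting
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap: " + maxMb + " MB");
    }
}
```

If this reports a 32-bit data model or a much smaller max heap than expected, the launcher is picking up a different Java installation than the one listed above.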
Prior to the card swap I'd easily get 100 FPS about 90% of the time. I now tend to hover around 70, and it drops to 40 when the area gets busy. Frame rates are even lower when using a 64x resource pack.
Other games I play frequently have seen a performance increase. Both are OpenGL: Natural Selection 2 improved a LOT, and Sven Co-op a bit (it was near the limit anyway). So something tells me there's something wrong with Minecraft here. Does anyone else here have an AMD R9 series card to compare with?
I usually use Optifine (Ultra C7 for 1.6.4), but I ran 1.7.2 vanilla to be sure it wasn't the mod -- 1.7.2 vanilla runs much worse than 1.6.4 with Optifine.
The drivers are already the latest; they were only released 5 days ago.
Minecraft's launcher also updates LWJGL now, doesn't it?
Windows Update has nothing new to update.
Moderator: Did you update the graphics drivers this way, or by going to AMD?
I went to AMD to get the latest driver and Catalyst Control Centre. (I always do this, even for things like USB and SATA drivers.)
Microsoft will never have the latest drivers.
Spoke with AMD about this via their support channel. They told me (in poor English) that the 13.11 beta 4 drivers are only optimized for Direct3D, and suggested trying version 13.9 instead. Did that (the card shows up as a Radeon HD 7900 series), but Minecraft runs no better.
Looks like we'd be relying on AMD to get their drivers straight.