Hello!
Recently I started upgrading an old PC to replace my MacBook Pro. The Mac did a nice job but wasn't an ideal gaming machine; however, my newly upgraded PC performs much worse than the MacBook did. Both the Mac and the PC drive a 1080p monitor. The Mac averages 40-60FPS on medium settings, but the PC jumps anywhere from 2 to 70FPS, and it jumps around that much even with V-sync enabled.
Specs:
MacBook Pro:
RAM: 8GB
Video card: Intel HD Graphics 4000
Processor: Intel Core i5, 2.5GHz
Upgraded PC:
RAM: 2GB
Video card: Nvidia GeForce GTX 750 Ti w/ 2GB VRAM
Processor: AMD Athlon 2.6GHz dual-core
OS: Windows 8.1
Any help would be majorly appreciated, thank you!
From 8GB to 2GB? And swapping an Intel i5 for an AMD Athlon? I call that a downgrade.
For Minecraft it was definitely a major downgrade, not an upgrade. For other games it may actually be a little better thanks to the graphics card, but Minecraft depends mostly on the CPU.
1. Don't use VSync. ;3
2. Such jumpiness is usually caused by Windows having a whole bunch of programs open in the background. While the FPS is technically set by the graphics card, the CPU has a huge effect on it: if Minecraft falls behind on its chunk updates, it'll start dropping frames until everything is fine and dandy again. So try to close as much stuff as possible, run a decent anti-virus (but disable it while MC is running :P), and check the little arrow in the taskbar (I think it's called the notification area, but that may be my iOS roots coming in...) to see if there are any hidden surprises there - just general speed-up stuff.
3. Make sure you have the right Windows drivers for your card. It's not like a Mac, where everything takes care of itself - you have to install them the old-fashioned way.
4. Don't use Windows. Seriously: my late-2006 Mac, with its staggering ATI Radeon X1600 (woah! 128MB of graphics POWER!), 2GHz Core 2 Duo and 4GB of RAM (yes, that was me ;3), can run Minecraft on OS X 10.7.5 at around 30FPS, sometimes getting more than 50 on a good day, while my Windows PC with its 2.8GHz Athlon dual-core, GeForce 8400 GS and (yet again) 4GB of RAM (but this RAM is faster - DDR1 vs DDR2) can only get 20FPS tops. It's weird.
@Noah, we have answered your question: your "upgrade" was actually a serious downgrade (at least for Minecraft), and your computer is no longer Minecraft-tier.
Minecraft can easily use up that 2GB of RAM, even at an 8 or 12 chunk render distance. To be honest, I really would consider upgrading to at least 4GB of RAM if possible, and 8GB would be ideal. The reason I recommend more RAM is that if Windows runs out of it - which can happen easily with only 2GB - it starts using what's called a pagefile, which spills memory that won't fit in RAM over onto your hard drive, and that makes everything run really slowly and take forever.
You have a great graphics card, but it could easily be bottlenecked by CPU and RAM usage, and the cheaper of those two to upgrade is definitely the RAM. RAM has been getting a lot cheaper recently: 4GB of DDR2 goes for anywhere between $16 and $40 USD, and 4GB of DDR3 from $25 to $60 USD (I don't know which kind your processor supports, so please be sure to triple-check and pick the right one; you can even message me if you need help).
If you haven't gotten the hint: please upgrade your RAM if possible. It's most likely the biggest reason your performance is bad and everything runs so slowly.
Sidenote: V-sync only caps your frame rate at your monitor's refresh rate, so if your monitor refreshes at 60Hz, your maximum frame rate with V-sync on will be 60FPS; beyond that, it doesn't affect performance very much.
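One related knob, while we're on memory: how much of your RAM the game itself is allowed to grab is set by JVM arguments in the launcher's profile settings, not by how much RAM is in the machine. A minimal sketch (the jar name here is just illustrative):

```shell
# -Xms = starting heap size, -Xmx = maximum heap size.
# On a 2GB machine, keep -Xmx modest so Windows itself has room left
# over and doesn't start hitting the pagefile.
java -Xms512M -Xmx1G -jar minecraft.jar
```

Giving the game a huge -Xmx on a 2GB machine makes things worse, not better, because the Java heap just pushes everything else out into the pagefile.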
In my experience, I've never had any RAM problems with vanilla Minecraft. I leave the default 512MB alone, keep the render distance below 8 (my GPU can't handle more than that anyway... 128MB VRAM is always fun) and I regularly push 40FPS on my Mac. The only time I've ever had RAM issues is when using Forge mods (especially the huge ones). Optifine, LiteLoader, 512MB and I'm fine.
I have 32GB of RAM (overkill for MC), so I let Minecraft run at 8GB minimum and 16GB maximum, and it has no trouble using as much as 6GB when it wants to, even at an 8 chunk render distance in vanilla. I'm not actually sure why it climbs that high at all... maybe it's just Java's garbage collection?
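For anyone curious how much of that heap is actually in use at any moment, the JVM will tell you directly; this little sketch reads roughly the same numbers the F3 debug screen reports:

```java
// Minimal sketch: query the JVM's own view of its heap via java.lang.Runtime.
public class HeapWatch {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        long max   = rt.maxMemory()   / mb;                 // the -Xmx ceiling
        long total = rt.totalMemory() / mb;                 // heap reserved so far
        long used  = total - (rt.freeMemory() / mb);        // occupied by live objects
        System.out.println("Heap: " + used + "MB used of "
                + total + "MB allocated (max " + max + "MB)");
    }
}
```

Note that "allocated" can sit well above "used" - the JVM grabs heap from the OS in big steps and gives it back reluctantly, which is part of why the game looks greedier in Task Manager than it really is.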
Screenshot (top right shows RAM usage):
As for VRAM, 128MB is probably decent enough. I set the render distance to 32 chunks in an amplified world and used GPU-Z to monitor VRAM usage, and it never went over 1GB used. At 8 and 16 chunks, it never went over 256MB of VRAM used.
Here is a screenshot of a 32 chunk render distance in an Amplified world, for anyone curious:
And yet, for me:
I'll admit that I'm not even close to the 1080p resolution you're running at, but that's a graphics card thing and not a RAM thing. Anyway, I stared at the RAM meter for a bit and found that as soon as it hit 60-62% (around 300MB) some kind of garbage collection would kick in and it would jump back down to 35% before filling back up to 60% again, round and round and round...
Oh, and here's that same(ish) thing in fullscreen (closer to your conditions).
It's interesting to see just how different Minecraft runs for different people. In my opinion, what you give it, it'll take. Don't give it much, and it won't take much.
(Oh, by the way, all these screenshots were generated at a render distance of 8 in a freshly-generated world.)
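That fill-to-60%, snap-back-to-35% cycle described above is normal Java garbage collection: the heap fills with short-lived objects, the collector frees them, and usage drops back down. A rough sketch of the same sawtooth outside Minecraft (the allocation sizes are arbitrary, just stand-ins for short-lived game objects):

```java
// Sketch of the GC sawtooth: churn out short-lived garbage, then watch
// used-heap fall after a collection.
public class GcSawtooth {
    static long usedMB(Runtime rt) {
        return (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
    }

    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // Short-lived allocations, a bit like chunk meshes being rebuilt.
        for (int i = 0; i < 500_000; i++) {
            byte[] junk = new byte[512];
        }
        System.out.println("Used before collection: " + usedMB(rt) + "MB");
        System.gc(); // only a hint to the JVM, but most desktop JVMs honor it
        System.out.println("Used after collection:  " + usedMB(rt) + "MB");
    }
}
```

So a heap that keeps climbing and then dropping isn't a leak; it's just the collector doing its rounds.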
I would get Optifine first off, that may help with the stuttering. And your RAM is a serious issue... why would you only put in 2GB? I mean, your graphics card alone has 2GB lol - normally you want more system RAM than your GPU has, ay? lol