So I'm getting a new computer in a few days, and I just wanted to know how many fps I should be getting in Minecraft (OptiFine installed) with and without shaders.
SPECS:
HP Pavilion p6-2310 desktop, 2.8 GHz AMD A4-3420 processor, 6 GB DDR3 RAM, 500 GB HDD
The hard drive is a little slow, but the RAM is okay. You didn't specify the GPU (graphics card), therefore we can't determine the framerate you'll be getting.
*cringes* integrated *recoils in horror* graphics
Joking, but seriously, that'll be a problem for you.
I'm gonna say you'll probably be able to pull 50-60 fps on lower settings.
It's a newer series of chip, and as far as integrated graphics go, it's decent, I think. Just don't expect to be playing on "fancy" with a high res monitor....
How much would it boost the fps if I purchased 4 extra GB of RAM?
Not much. It definitely won't hurt, and it'll help your system run a bit better in the long term (especially with however much RAM Windows 10 hogs), but it probably won't have much impact on fps. It might, however, help a little with chunk-loading lag, let you use larger texture packs, and things like that.
If you're focused on fps, though, your single biggest speed-up will be getting a GPU in there. Any fairly new GPU (as in, one that came out in the last twelve months) will probably give you a pretty substantial fps boost.
Check out Tom's Hardware's GPU hierarchy chart for a decent cross-brand comparison. It's rough, but it works. For reference, I've got an HD 7770 (about 11 bars down), and I get decent (but not spectacular) fps with all settings maxed out, including the view distance a couple of notches past "far." If you get anything at that level or higher, you'll do great.
Here's a link to the official system requirements. So long as you can get somewhere near these, things should work.
Mojang's Official System Reqs
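One note on the RAM question from earlier: extra memory only helps Minecraft if the game is actually allowed to use it. The Java heap size is set with the standard JVM flags in the launcher's JVM-arguments box; the values and jar name below are placeholders, not a recommendation:

```shell
# -Xms = initial Java heap size, -Xmx = maximum Java heap size.
# Example values only; the jar name here is hypothetical.
java -Xms1G -Xmx2G -jar minecraft_client.jar
```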
I can't really do a whole new computer, because the computer is already here, and I'm not sure how much a GPU would cost. Around $50-100...?
You'd almost definitely need a new PSU too. An alright one will be $30-50.
Depends on what you get. What's your budget? And will you be using that budget on software too?
My budget for a GPU would probably be around $50, since I'm getting a keyboard and mouse too. And no, I probably won't be using it on software.
I'll probably just wait and see how well it runs Minecraft, and if I'm not happy, then I'll get a new GPU. I'd be happy with about 60-100 fps, because I currently use a laptop that sticks around 50 fps.
Well, let us know what you get in terms of fps. Although there's really no benefit to going higher than 60 (unless you've got an awesome monitor with a 120 Hz refresh rate). If the quality-to-fps ratio is good enough, then great. If not, I'd suggest holding onto the $50 for a while. Most decent GPUs are in the $100-$200 range.
Not sure if this is what you guys mean by graphics card, but there's a little sticker on the computer that says "Intel Pentium inside." Sorry, I don't know terribly much about the parts of computers. lol
The graphics card is the part of your computer that takes data from your programs and displays it on your screen. The better it is, the higher your FPS (the number of "frames," or pictures, displayed per second), usually.
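To make "frames per second" concrete, here's a rough sketch (plain Python, nothing Minecraft-specific; the numbers are just examples) of how fps relates to how long each frame takes to draw:

```python
def average_fps(frame_times_ms):
    """Average frames per second, given how long each frame took (in ms)."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

# A frame every ~16.7 ms works out to about 60 fps;
# a frame every 20 ms works out to 50 fps.
print(round(average_fps([16.7] * 60)))  # 60
print(round(average_fps([20.0] * 50)))  # 50
```

So "more fps" just means each frame is drawn faster, and a better GPU is what shortens that per-frame time.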
Please can you people stop spreading the rumour that anything higher than 24/30/60 fps is useless. I can tell the difference between 125 and 100 fps on my 60Hz monitor. It depends on the person who's playing. If OP can't tell the difference, then great, it doesn't matter, but some people can.
I doubt you can tell the difference between 125 and 100 fps on your 60Hz monitor; that isn't how it works. 60Hz refers to the refresh rate, i.e. how many times the picture on the monitor can refresh in a second. A 60Hz panel can only refresh 60 times a second, so it simply cannot display more than 60 distinct frames per second; that's a physical limitation. In fact, rendering at a higher framerate than the refresh rate can cause screen tearing, which can be just as visually disruptive as a lower framerate, depending on how sensitive you are to it.
What you are seeing can probably be put down to placebo.
All that said though, there is certainly a very clear difference between 60/120/144fps/Hz, so long as you have a monitor that supports it.
(My Catleap can OC to 120Hz, and it's a damn sight smoother than 60Hz. Unfortunately my Catleap is somewhat of a dud: anything over 75Hz displays artifacts and makes games somewhat unplayable, despite being smoother.)
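The refresh-rate point above boils down to a toy model (a sketch that deliberately ignores tearing, frame pacing, and input latency, which do vary with framerate):

```python
def frames_shown(rendered_fps, refresh_hz):
    # A panel can't show more distinct frames per second than its
    # refresh rate, no matter how many frames the GPU renders.
    return min(rendered_fps, refresh_hz)

print(frames_shown(125, 60))   # 60
print(frames_shown(100, 60))   # 60 -- same as 125 fps on a 60 Hz panel
print(frames_shown(100, 144))  # 100 -- a faster panel actually shows them
```

Which is why 125 vs 100 fps looks identical on a 60 Hz panel, but a 120/144 Hz monitor genuinely does look smoother.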
Weird, 'cause it says 32 fps but it seems pretty smooth.
Well, performance has improved a little.