Hello everyone, as some of you may know, I've been looking for parts lately. Well, I've finally got them all!
i5 2500k
Asus 7870
8GB DDR3 RAM
MSI Z77 G45 MOBO
Seagate Barracuda 2TB 7200RPM HDD
LG DVD Reader
ThermalTake ATX V3 Case
Windows 7 64Bit
Well, I'm typing from this PC right now. I've installed a few things, but my main concern is FPS. My cousin has an i3 with a 6580 and gets around 100 FPS while playing Minecraft. I, on the other hand, am only getting around 50, when I suppose I should be getting a lot more. Max settings and all. Help, anyone? I've already installed all the drivers but the HDD one. Thanks in advance!
Just turned VSync off, lol. Got a 20 FPS boost; it's at 70 now. Apparently this card should still get around 60 FPS on BF3 and 100+ on LoL, so it should get at least that on MC.
I get around 120 with a GTX 670.
Java is weird like that.
Also are you in the same world as your cousin when you measure your fps?
Is that necessary? I installed the default driver.
Yes, it is necessary. Go to the websites of the manufacturers your parts are from (for example, Asus for your GPU) and install the newest drivers for the correct card.
Your card may be idling. That's when the computer uses the integrated graphics instead of the dedicated card because it thinks MC is just a regular application. To fix this, you have to manually enable the dedicated card in a control panel (forgot which one).
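Not 100% sure which control panel it is either (Catalyst Control Center on the AMD side, I think), but one way to check which GPU is actually doing the rendering is to ask OpenGL for its renderer string. A rough sketch, assuming LWJGL 2 (the library Minecraft uses) is on the classpath; the class name is made up for this example:

import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.GL11;

public class RendererCheck {
    public static void main(String[] args) throws Exception {
        // Open a tiny window just to get an OpenGL context
        Display.setDisplayMode(new DisplayMode(320, 240));
        Display.create();
        // If this prints the Intel integrated chip instead of the 7870,
        // the game is being rendered on the wrong GPU
        System.out.println("Renderer: " + GL11.glGetString(GL11.GL_RENDERER));
        System.out.println("Vendor:   " + GL11.glGetString(GL11.GL_VENDOR));
        Display.destroy();
    }
}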
So I installed the newest drivers; I get up to 90 FPS sometimes but mainly around 70. Tips?
How much RAM are you giving Java? It generally just takes 1GB but you can assign it more. Assigning 2 or even 4GB would help a bit, although don't expect a dramatic improvement.
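For what it's worth, a minimal sketch of the two pieces involved, assuming you launch the jar directly (the file name below is just a placeholder): the -Xmx flag sets the heap ceiling, and Runtime.maxMemory() shows what the JVM actually got.

// Launching with a bigger heap (paths/names are placeholders):
//   java -Xms1G -Xmx2G -jar minecraft.jar

public class HeapCheck {
    public static void main(String[] args) {
        // maxMemory() reports the heap ceiling the JVM was started with (-Xmx)
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap: " + maxMb + " MB");
    }
}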
Haven't altered that, and I'm unsure how. I do have 8 GB, but I don't want to give it all of that, obviously. Gonna test with TF2 now to see if it's Minecraft or the video card.
Don't know? Don't post.
If Minecraft ever uses more than 700MB you are using horrible mods and doing it wrong.
Wow, played some Tekkit and I'm getting ~150 FPS. Guess OptiFine is magical, then. Is this issue only with Minecraft, though? I'd like to play some BF3.
Tekkit uses 1.2.5; Minecraft has lost most of its optimization since then. Imagine a Ferrari, then replace the wheels with square blocks of steel. That is what Minecraft past 1.3 is like...
It does actually.
I for example have a 144 Hz monitor and I want to run at that framerate whenever possible. (Don't say "the human eye can only see 30 FPS, blablabla", since the human eye doesn't see in FPS.)
I'd really like to improve my Minecraft performance as well, since on Normal with an HD 6850 I only get about 50 FPS in singleplayer and only about 35 FPS in multiplayer. (OptiFine installed; without it, it's even worse.)
Yeah, and I have a 300 hz monitor.
Oh, wait. They don't make those.
Seeing a difference above 60 FPS would be pretty hard no matter how sensitive you think the human eye is to framerate; 60 FPS is incredibly fast.
What are you on about? Why do you think people who play competitive first-person shooters invest in 120 Hz monitors? Why do you think so many gamers still use CRTs? Why do you think 60 Hz looks like total **** to the untrained eye?
Because they're stupid; they're also the ones who buy $3k computers that don't do anything. Were you trying to make a point? Next you're gonna say I should trust all the YouTube fan trains that do Minecraft let's plays and have some $5k MacBook for video editing.
I do not know a single gamer who still uses a CRT, unless it's a really old one they're playing a console on because they have no money. I don't know anyone who uses a CRT on purpose. Please provide a citation that people do and that there is an actual reason to; otherwise I just think you're talking crap.
Yeah, but compared to 120/144 FPS, 60 FPS looks pretty choppy, or blurry if motion blur is on, like in this screenshot: http://prntscr.com/idf45 The top is 120 FPS, the lower one is 60 FPS, both with the same motion blur and velocity settings. Here is the website if you want to test it out: http://frames-per-second.appspot.com
Also, while gaming I see a huge difference, especially in Bad Company 2, because you can see a lot better while moving around quickly.
But even just moving windows around on your desktop looks so much more fluid with a 120/144 Hz monitor. Don't question it; you have obviously not tested one yourself.
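For anyone keeping score, the arithmetic behind this argument is just frame time: higher refresh means less time between frames. A throwaway Java snippet (nothing Minecraft-specific, class name made up) that prints the numbers:

public class FrameTimes {
    public static void main(String[] args) {
        int[] rates = {30, 60, 120, 144};
        for (int hz : rates) {
            // time between frames in milliseconds at each refresh rate
            double msPerFrame = 1000.0 / hz;
            System.out.printf("%d Hz -> %.1f ms per frame%n", hz, msPerFrame);
        }
    }
}

So 60 to 120 Hz cuts the gap from roughly 16.7 ms to 8.3 ms per frame; whether your eyes care about that difference is exactly what's being argued here.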
Yes, because someone disagrees with you, that obviously means they haven't tried it and you aren't just wrong.
Why does everyone on these forums think they're god's gift to mankind or something? Improving motion blur is hardly a demonstration of how having 120 FPS over 60 improves visuals. I would prefer to see an actual in-game comparison rather than an app anyway.
I would be lying if I said I wasn't used to everyone on these forums pretending they have a doctorate in everything from computer science to neurosurgery.
Also are you in the same world as your cousin when you measure your fps?
No, no multiplayer testing yet.
NECKBEERD FORUM
Yes, you should always update drivers.
If it was integrated it would run at 20. :y
http://pcpartpicker.com/user/SteevyT/saved/21PI
Not that it really matters; if you're getting 70 FPS it shouldn't even require fixing.
Yeah, but I want to record, so I'd like more. And from what I can see on YouTube the average is around 150; I'd like to have it as high as possible.
Don't listen to this fool with his square-wheeled Ferrari analogy.
YouTube reduces any video you upload to 30 fps for playback.
I don't think 60 Hz looks like total **** at all. I think it looks very smooth and you're, again, just talking crap.