I've been wondering if anyone has specifically tested DDR3 vs DDR4 with Minecraft Java Edition, not for running servers, just in-game performance: chunk rendering times, FPS, world load times, etc.
I'm still running on an old platform, an Ivy Bridge 3770K, but everything is very heavily overclocked, which is why I haven't upgraded yet: €700-800 just for a platform change and maybe 15-20% more CPU horsepower, I can't justify it. OK, yes, I know there are a number of extra benefits, but the main one, CPU performance, wouldn't be a huge increase.
Then it occurred to me: Minecraft has always, at least when I've played and tested, shown a noticeable increase or decrease in performance with RAM frequency. It's not huge and life-changing, but it's still noticeable and measurable. So I'm wondering if anyone has tested the performance increases (if any) with DDR4 vs DDR3; surely the large frequency increases have made a difference, but by how much?
I did try searching the forum, which turned up nothing, zero posts or threads, even though I'd just read 2 separate threads with posts containing those exact keywords 5 minutes earlier. It's good to see the site's search feature still hasn't been fixed in the 2 years I've been away.
How many MHz for each RAM type, and how many GB per module? Say it's just 4 GB for both DDR3 and DDR4, with the DDR3 at a base clock of 1866 MHz and the DDR4 at 2400 MHz: you should get a 20% increase in FPS and around 10% in chunk updates. With 8 GB modules you will get 30%, and with 16 GB modules you will get a 30% increase with DDR4. Also, are you running internal graphics or external? For that CPU the internal graphics is the Intel HD 4000, but if you have an external card like a GTX 1050 Ti, the FPS changes, same with the amount of RAM used by Minecraft. Internal graphics shares some of the system RAM with the CPU, which slows the RAM down and lowers performance for both types. An external GPU uses its own VRAM, but DDR4 will still have better performance than DDR3. These figures are with Minecraft graphics set to max, including render distance, on Minecraft version 1.10.2.
All in all, DDR4 wins in every way: base clock, response time, and therefore FPS as well. It loses in only one way, and that is the cost.
G.Skill DDR3 RAM for an 8 GB module is $75.50, while a G.Skill DDR4 8 GB module is $107.99.
How many MHz for each RAM type, and how many GB per module? Say it's just 4 GB for both DDR3 and DDR4, with the DDR3 at a base clock of 1866 MHz and the DDR4 at 2400 MHz: you should get a 20% increase in FPS and around 10% in chunk updates. With 8 GB modules you will get 30%, and with 16 GB modules you will get a 30% increase with DDR4. Also, are you running internal graphics or external? For that CPU the internal graphics is the Intel HD 4000, but if you have an external card like a GTX 1050 Ti, the FPS changes, same with the amount of RAM used by Minecraft. Internal graphics shares some of the system RAM with the CPU, which slows the RAM down and lowers performance for both types. An external GPU uses its own VRAM, but DDR4 will still have better performance than DDR3. These figures are with Minecraft graphics set to max, including render distance, on Minecraft version 1.10.2.
All in all, DDR4 wins in every way: base clock, response time, and therefore FPS as well. It loses in only one way, and that is the cost.
G.Skill DDR3 RAM for an 8 GB module is $75.50, while a G.Skill DDR4 8 GB module is $107.99.
Hang on, are you saying you get a 20% increase in FPS just going from 1866MHz to 2400MHz? Because that doesn't sound right, not from my testing anyway. Sorry, not being rude but have you got anything to show this increase, as it seems much higher than I've ever tested or seen?
As for iGPUs or APUs, yes I'm aware allocating more and higher frequency RAM will dramatically improve the experience. I'm running a dedicated card, GTX 1080 (not Ti).
I get that the faster DDR4 should win out and improve things, but that's what I'm looking for: hard data from someone who has tested DDR3 at X speed, benchmarked to get a baseline, then tested DDR4 at X speed and increased the frequency to gauge roughly what the improvement is, and maybe whether there's a cut-off point where, above a certain speed, it doesn't improve any more.
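For what it's worth, the raw bandwidth gap between those two speeds is easy to put a number on. This is just a back-of-envelope sketch of mine (it assumes standard 64-bit channels and theoretical peak transfer rates; it says nothing about real in-game FPS, which is exactly the data I'm after):

```python
# Rough theoretical peak bandwidth comparison (sketch only).
# Assumes one 64-bit channel (8 bytes per transfer); dual channel doubles
# both figures, so the ratio between them is unchanged.

def peak_bandwidth_gb_s(transfer_rate_mt_s: float, bus_bytes: int = 8) -> float:
    """Theoretical peak bandwidth in GB/s for a single memory channel."""
    return transfer_rate_mt_s * bus_bytes / 1000.0

ddr3 = peak_bandwidth_gb_s(1866)  # DDR3-1866
ddr4 = peak_bandwidth_gb_s(2400)  # DDR4-2400

print(f"DDR3-1866: {ddr3:.1f} GB/s peak")
print(f"DDR4-2400: {ddr4:.1f} GB/s peak")
print(f"Raw bandwidth difference: {(ddr4 / ddr3 - 1) * 100:.0f}%")
# ~29% more bandwidth on paper, but paper bandwidth almost never translates
# 1:1 into FPS, which is why I'm after measured numbers.
```

So even the paper difference is only around 29%, and real FPS scaling is normally well below that.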
Well, in my testing I got 20%. It might not sound right, but that is what I got.
Wow, OK, that's a lot higher than my tests back on 1.7.2/5.
I'm just gauging what performance increase I could expect, in order to work out whether to upgrade my platform towards the end of the year. It's easy enough to work out the CPU and GPU increases, but Minecraft is pretty unique, so I'm hoping someone has benched and tested DDR3 vs DDR4.
I'm not as up to date on things anymore, but were there any platforms in that crossover time that could accept DDR3 or DDR4? If so, that is the only way you can answer this question. If you test a platform with DDR3 and DDR4 but they also have different CPUs, it skews your results.
That being said, part of the reason the new CPUs and platforms perform better compared to our old Sandy and Ivy Bridge ones is precisely because of the faster RAM. So I wouldn't worry much about DDR3 vs DDR4, but just try to find results from new platforms compared to yours with Minecraft (benchmarking Minecraft, due to its nature, is really, really hard, so you might find it hard to research). That said, looking at theoretical bandwidth numbers is NOT going to be reflective of the REAL performance gain you get. DDR4 over DDR3 on the same platform won't give you 20% more average FPS.
Minecraft is also a CPU/platform dependent game first and foremost (GPU is second unless you're using anti-aliasing, high resolution texture packs, shaders, and that sort of thing). Coming from an Ivy Bridge, let alone a heavily overclocked one as you say, you're going to want to be looking at basically the newest Intel family and to overclock it as much as you can to get a return worth the cost compared to what you're sitting on now. From what I understand, AMD has finally caught up to the IPC of Intel's Sandy/Ivy/Haswell era platforms, but is still a bit slower than Intel's newest offerings in IPC. They do cost less and typically have more cores and/or threads, while still being close enough now (because the newest stuff isn't THAT much faster in IPC), so they are still fantastic CPUs for most other reasons, and even the best buy in a lot of cases. But going to one from your current CPU probably wouldn't be as big a jump in the case of Minecraft, especially since I don't think they overclock as well (though even the newest Intel chips don't overclock as well as the ones from five years ago either).
Sadly, RAM prices are way up now too. Having a CPU even older than yours, I don't have to worry much because 1) my disposable income right now is low, and 2) I don't really need more performance (I might want some if I had the spending money, but I don't "need" it); otherwise I'd be in the same tough spot you are, but ultimately I'd probably hold off because of RAM prices. I got 16 GB of RAM for nearly $100 like... five or six years ago? The price of RAM now makes it hard to justify getting even the same amount, let alone more (and upgrading the CPU at the cost of a new motherboard, CPU, and RAM is hardly worth it to me). Sad to say, CPU improvements have been hitting limits and walls lately, which makes the cost of upgrading them more questionable. This is still a fantastic CPU for my needs, for better and worse.
Minecraft is also a CPU/platform dependent game first and foremost (GPU is second unless you're using anti-aliasing, high resolution texture packs, shaders, and that sort of thing).
Minecraft isn't only CPU/platform dependent, it is still very dependent on RAM; just a 100 MB difference in usable RAM results in a noticeable FPS decrease. Same with the GPU.
Also, if your CPU supports both types of RAM, you only need a different mobo, and the base clock wasn't changed either. I got a 20% increase with maxed settings except render distance; with that maxed too, DDR4 only made a 5-8% increase. I tested on a superflat world; mobs didn't affect it much, maybe a 0.9-1% decrease in FPS with 100 pigs.
I still recommend DDR4. If you are budget-minded, then DDR3.
P.S. You were right about the CPU affecting FPS; it may be that mine is just more compatible with DDR4, I don't know the answer to that one. And I checked, my CPU is not cross-platform, but it worked somehow? Don't know why, but it did.
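In case anyone wants to sanity-check numbers like mine, this is roughly how I'd compare two FPS logs. Just a sketch; the sample values below are made up, not my actual readings:

```python
# Compare average FPS between two RAM configs (sketch; sample values are
# placeholders). Each list would hold per-second FPS readings captured in
# the same world, same spot, same settings.
from statistics import mean, stdev

fps_ddr3 = [118, 121, 119, 122, 120]  # placeholder samples
fps_ddr4 = [142, 145, 140, 144, 143]  # placeholder samples

avg3, avg4 = mean(fps_ddr3), mean(fps_ddr4)
change = (avg4 / avg3 - 1) * 100

print(f"DDR3 avg: {avg3:.1f} FPS (stdev {stdev(fps_ddr3):.1f})")
print(f"DDR4 avg: {avg4:.1f} FPS (stdev {stdev(fps_ddr4):.1f})")
print(f"Change:   {change:+.1f}%")
# If the run-to-run spread is close to the gap between the two averages,
# the percentage claim isn't really supported by the data.
```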
I'm not as up to date on things anymore, but were there any platforms in that crossover time that could accept DDR3 or DDR4? If so, that is the only way you can answer this question. If you test a platform with DDR3 and DDR4 but they also have different CPUs, it skews your results.
Yes, you're correct, there were some mobos and chipsets that would accept both.
That being said, part of the reason the new CPUs and platforms perform better compared to our old Sandy and Ivy Bridge ones is precisely because of the faster RAM. So I wouldn't worry much about DDR3 vs DDR4, but just try to find results from new platforms compared to yours with Minecraft (benchmarking Minecraft, due to its nature, is really, really hard, so you might find it hard to research). That said, looking at theoretical bandwidth numbers is NOT going to be reflective of the REAL performance gain you get. DDR4 over DDR3 on the same platform won't give you 20% more average FPS.
I get what you mean, I just figured that there are a million+ nerds out there like me, so if I had an idea for testing something there's a good chance someone else had that idea and tried it.
Rather than going for a platform that accepts both DDR3 and 4, my methodology was to run a DDR3 based system like my Ivy or your Sandy with a fixed CPU clock of let's say 4.0GHz, and a RAM clock of 1600MHz (or whatever this kit can run at). Then benchmark it with some synthetics and get a non-graphics baseline score.
Then take the newer DDR4 system, underclock the RAM to match the DDR3 frequency, and underclock the CPU to 3.x GHz or so until the bench scores are pretty much equal to the DDR3 system's scores.
Now, with the 2 systems being fairly equal in CPU/RAM performance, start testing Minecraft and gradually increase the RAM frequency of both systems to bench the improvement of going from 1333MHz to 1600, 1866... all the way up to whatever the DDR4 kit can run at. This would give pretty decent results and show how much of a performance increase there is with faster RAM, and whether there's a point of diminishing returns where, say, the difference between 3000MHz and 4000MHz is barely noticeable or measurable, so the sweet spot would be 3000MHz.
That was my thinking anyway. I guess it's a lot of work for something not so super important; it's just something I'd be interested to know. When I do upgrade my platform I may have to do this testing to satisfy my nerdy itch.
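If I do end up running that sweep, something like this is how I'd crunch the results to spot the cut-off point. A sketch only; the (frequency, FPS) pairs below are placeholders, not measurements:

```python
# Find the point of diminishing returns in a RAM frequency sweep (sketch).
# Each tuple is (RAM frequency in MHz, average FPS measured at that speed);
# the values here are placeholders, not real benchmark results.

sweep = [
    (1333, 100.0),
    (1600, 108.0),
    (1866, 114.0),
    (2133, 118.0),
    (2400, 121.0),
    (2666, 122.5),
    (3000, 123.0),
]

for (freq_a, fps_a), (freq_b, fps_b) in zip(sweep, sweep[1:]):
    gain = (fps_b / fps_a - 1) * 100
    print(f"{freq_a} -> {freq_b} MHz: {gain:+.1f}% average FPS")
# Wherever the per-step gain flattens out towards zero, that's the sweet
# spot; paying for higher frequency past it buys almost nothing.
```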
Minecraft is also a CPU/platform dependent game first and foremost (GPU is second unless you're using anti-aliasing, high resolution texture packs, shaders, and that sort of thing). Coming from an Ivy Bridge, let alone a heavily overclocked one as you say, you're going to want to be looking at basically the newest Intel family and to overclock it as much as you can to get a return worth the cost compared to what you're sitting on now. From what I understand, AMD has finally caught up to the IPC of Intel's Sandy/Ivy/Haswell era platforms, but is still a bit slower than Intel's newest offerings in IPC. They do cost less and typically have more cores and/or threads, while still being close enough now (because the newest stuff isn't THAT much faster in IPC), so they are still fantastic CPUs for most other reasons, and even the best buy in a lot of cases. But going to one from your current CPU probably wouldn't be as big a jump in the case of Minecraft, especially since I don't think they overclock as well (though even the newest Intel chips don't overclock as well as the ones from five years ago either).
Exactly, it's fantastic to see AMD finally catch up and have some competitive CPUs, and the Zen SKUs are brilliant in many ways, but as you said the IPC is still a few generations off. That wouldn't be too bad if the clocks were equal to Intel's new chips, but with the likes of the 8700K pushing 5.2GHz, and quite commonly even 5.3 and 5.4GHz on an open-loop system, that's a gigantic difference compared to AMD's 4.0 or 4.2GHz with the newer second-gen SKUs.
I mean it's not as huge if you're a 30-60fps sort of gamer, but most of us try to push 60+, with many wanting 100+fps for our high refresh panels, and that 1GHz or more difference in clock speed makes a huge difference in being able to push high framerates.
Some of Intel's shady, anti-consumer behavior over recent years has really made me want to vote with my wallet and not touch an Intel CPU for a few years until they cut the poop out, but for a gaming rig you want the best you can get for your money, and sadly, AMD's chips are awesome but they really lag behind in mid to high-end gaming.
Fingers crossed for Zen++ and Zen+++ in the coming year or two.
Sadly, RAM prices are way up now too. Having a CPU even older than yours, I don't have to worry much because 1) my disposable income right now is low, and 2) I don't really need more performance (I might want some if I had the spending money, but I don't "need" it); otherwise I'd be in the same tough spot you are, but ultimately I'd probably hold off because of RAM prices. I got 16 GB of RAM for nearly $100 like... five or six years ago? The price of RAM now makes it hard to justify getting even the same amount, let alone more (and upgrading the CPU at the cost of a new motherboard, CPU, and RAM is hardly worth it to me). Sad to say, CPU improvements have been hitting limits and walls lately, which makes the cost of upgrading them more questionable. This is still a fantastic CPU for my needs, for better and worse.
And it doesn't look like the prices will be falling any time soon either. I saw many predicting the memory shortage was going to continue into 2019, until some new fabs are finally opened and some old fabs are expanded.
One hope is AMD: they're pushing hard on DDR5 development and stacked memory (HBM), and they really want to be ahead of Intel for the upcoming DDR5 release, so hopefully we might start to see the memory drought begin to dry up (pun intended).
Yeah, it could likely be another year or two minimum before I upgrade this old, old Core i5 2500K, which is crazy, but it keeps performing, which is what matters. I don't really care what model number or year of origin it is, I guess. I'm only using a 60 Hz 1920 x 1200 screen so it's okay for that, and if I ever need a bit more performance, I guess I could learn how to overclock it more, because 4 GHz on this is probably modest, and it runs pretty cool.
Ideally I'd like to look for a nice 2560 x 1440 120Hz/144Hz screen as the next thing to change (thing is, I'm dead set on wanting IPS like I have now, so that won't be a cheap thing either). My hardware is okay by me still, having a recent and good graphics card, a lot of RAM still, and an SSD and a lot of secondary storage. It's just that the CPU seems next in line, but... hard to justify that right now, especially with low spending money to throw around.
Yeah, it could likely be another year or two minimum before I upgrade this old, old Core i5 2500K, which is crazy, but it keeps performing, which is what matters. I don't really care what model number or year of origin it is, I guess. I'm only using a 60 Hz 1920 x 1200 screen so it's okay for that, and if I ever need a bit more performance, I guess I could learn how to overclock it more, because 4 GHz on this is probably modest, and it runs pretty cool.
Yes, 4.0GHz is very much modest for Sandy Bridge. You have one of the last properly made chips from Intel before they decided to screw the customers and started to cheap out on materials. Everything from Ivy Bridge onwards (sucks for me) uses some useless cheap rubbish TIM (aka toothpaste/peanut butter) under the IHS. You're lucky, you have the last of the soldered dies; believe me, many, including me, would be jealous.
I had to delid my Ivy chip, replace the toothpaste with liquid metal (a gallium alloy), lap and polish the IHS, then put it on water to get it past 4.4GHz. Even the Noctua D14 I used back when I bought it, a monster of a heatsink, wasn't enough to keep it cool.
If you spend some time fine tuning and, of course, if you were lucky in the silicon lottery, you *should* be able to push 4.5GHz easily even with a fairly modest cooler. Drop a CLC or Noctua D15 on it and, well... many people have hit 5GHz on Sandy without having to go to a custom loop.
Ideally I'd like to look for a nice 2560 x 1440 120Hz/144Hz screen or something next thing to change (thing is, I'm dead set on wanting IPS like I have now so that won't be a cheap thing either). My hardware is okay by me still, having a recent and good graphics card, a lot of RAM still, and an SSD and a lot of secondary storage. It's just CPU seems next in line, but... hard to justify that right now, especially with low spending money to throw around.
Absolutely. I got to test a VA panel when I was shopping around for my upgrade last year. I know every panel tech has its pros and cons, but I'd heard that the pros with VA panels outweigh the cons. Thankfully I got to try one before spending my hard-earned cash, as, in my opinion at least, it doesn't hold a candle to IPS. The blacks are great for sure and the cheaper price is awesome, but god damn the ghosting and motion blur; I felt nauseous after a while playing a fast-paced game. Some people have told me I obviously had a bad panel, but many more have concurred that it's just one of the cons with VA.
IPS FTW
Yes, 4.0GHz is very much modest for Sandy Bridge. You have one of the last properly made chips from Intel before they decided to screw the customers and started to cheap out on materials. Everything from Ivy Bridge onwards (sucks for me) uses some useless cheap rubbish TIM (aka toothpaste/peanut butter) under the IHS. You're lucky, you have the last of the soldered dies; believe me, many, including me, would be jealous.
Hm, interesting, I never heard of that! I know my laptop CPU (Haswell?) seems to run sliiiightly warm (not overly so) for its low clock speed, but I guess a lot happened when I stopped paying attention around the time of Ivy Bridge.
Absolutely. I got to test a VA panel when I was shopping around for my upgrade last year. I know every panel tech has its pros and cons, but I'd heard that the pros with VA panels outweigh the cons. Thankfully I got to try one before spending my hard-earned cash, as, in my opinion at least, it doesn't hold a candle to IPS. The blacks are great for sure and the cheaper price is awesome, but god damn the ghosting and motion blur; I felt nauseous after a while playing a fast-paced game. Some people have told me I obviously had a bad panel, but many more have concurred that it's just one of the cons with VA.
Hm, well, IPS and probably VA both aren't as fast as TN, but each does indeed have cons (for IPS it's response time and cost, generally). I'd probably take a good VA over a TN as well, but good viewing angles are a MUST for me. The biggest bother for me when using my laptop isn't its lowly CPU, or its onboard Intel graphics, or even its tiny 1366 x 768 resolution. It's the quality of the screen, and namely the horrible viewing angles. No matter where you position it or your head, it's just an inconsistent glossy mess and looks like... I am so forgetting the word for it, but it's like looking through that type of filter where the edges are circularly darker (Minecraft itself actually does this, albeit to a very minor extent). Sadly, I believe your typical low-cost modern LED-backlit (LCD) laptop screen to be about the worst display modern PCs have used. It's the most important part of the PC; it's what you look at.
Oh my word! I received an official warning for this thread, for profanity!
I specifically make sure I don't use any profanity on this site, so I've searched the thread for every swear word in my vocabulary to see which one slipped out, and nothing, not one, just as I thought. *smh*
Oh my word! I received an official warning for this thread, for profanity!
I specifically make sure I don't use any profanity on this site, so I've searched the thread for every swear word in my vocabulary to see which one slipped out, and nothing, not one, just as I thought. *smh*
If you have an issue with a warning, you can contact the moderator who issued the warning or follow the appeal instructions in the warning message.
- sunperp