After watching a video, I noticed that the AMD card had much better color. I did a bit of searching around, and sure enough, many people say that their Nvidia cards look lifeless and dull, while their AMD cards look much more vibrant.
Is this true? Can anyone confirm?
Is this a notable difference? Can it be changed/optimized to match AMD's color?
It's possible that they used different monitors on each.
That's probably correct. I'm not sure why he didn't just use a capture card/software instead of a camera.
Besides, it's pretty much standard these days to calibrate your monitor as soon as you set it up. In my opinion, that test was horribly biased, as he could've easily altered the monitor's settings between tests.
All drivers and driver versions will report different temps, make the GPU behave differently, and will usually apply slightly different fan profiles.
All TIM is different.
Each and every card's bin, TIM, and cooler will be different, individually.
Two cards next to each other on the same production line could be upwards of 15C apart from one another in stock temps.
In addition, temperature monitoring software is all different.
Hardware sensors are calibrated differently, and report differently.
Software that interprets these messages is also calibrated and can report differently.
Really, it doesn't matter. All coolers are about the same: if they cool effectively, they are doing their job. No one gives a damn if yours is 5C lower or 3C nearer to idle temps; it just does not matter.
But no, there is really no way to tell if one is "better", but then, who cares? Does it impact you directly in a significant way? Does it impact the lifespan of your card? The answer to both is no, so why does it matter?
Because I did not think the thread would be stupid enough to be talking about color, so I assumed it was coolers and a typo.
OP, this is stupid, in the broadest sense of the word. The color will look no different unless you set it to be different.
Seriously, you guys will find anything minor to try and start a brand war about.
There actually might be a difference in color, but either way both Nvidia and AMD would be producing technically inaccurate colors unless the engine in question specifically targets one or the other, since color reproduction varies not only from card to card but from driver to driver.
That's why whenever you see anyone caring about accurate color reproduction it's done through software, not hardware.
BUT EITHER WAY, I think aliasing and reliable performance are more pressing issues than whether or not your reds are exactly correct.
I presumed "color" was a typo for "cooler", and reasonably so; who cares about accurate color reproduction unless you're doing digital art or movie production on an IPS monitor?
I can even spot a color difference between my AMD card and my Intel HD3000 on the same monitor. The AMD output looks more washed out, the Intel more vibrant and saturated. Which is correct and more accurate? Neither. That's why I use software to get my values correct instead of relying on the hardware and its drivers.
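For what it's worth, the "fix it in software" approach usually boils down to applying a correction curve (a lookup table) to pixel values before they reach the screen. A minimal sketch with NumPy; the per-channel gamma values here are made up for illustration, not measured from any real card:

```python
import numpy as np

# Illustrative per-channel gamma tweaks (not real calibration data).
gamma = {"r": 1.00, "g": 0.95, "b": 1.05}

def make_lut(g: float) -> np.ndarray:
    """Build a 256-entry lookup table applying a 1/g power curve."""
    x = np.arange(256) / 255.0
    return np.round((x ** (1.0 / g)) * 255).astype(np.uint8)

luts = {ch: make_lut(g) for ch, g in gamma.items()}

def correct(image: np.ndarray) -> np.ndarray:
    """Apply the per-channel LUTs to an HxWx3 uint8 RGB image."""
    out = image.copy()
    for i, ch in enumerate(("r", "g", "b")):
        out[..., i] = luts[ch][image[..., i]]
    return out

# A 1x1 mid-grey "image": red is untouched, green and blue shift slightly.
pixel = np.array([[[128, 128, 128]]], dtype=np.uint8)
print(correct(pixel))  # [[[128 123 132]]]
```

Real calibration tools do the same thing with measured curves (or full ICC profiles) instead of guessed numbers, but the mechanism is this simple.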
People that have working eyes care. I don't like it when my red is magenta.
And either way your second paragraph doesn't make sense.
I plan on getting an IPS monitor, as well as doing graphical work, so I am slightly worried about color.
It honestly does not matter. You guys tend to make things out to be bigger than they actually are; the heart of the matter is that if you calibrate a monitor correctly, there is no difference and no reason to worry about color at all.
Don't bother with an IPS until you are doing it professionally, otherwise it's a complete waste. You could have the colors calibrated wrong to begin with and you would not even know it.
If your red is not red, then you have a bad monitor or something is configured wrong to begin with. I've never seen a calibrated monitor where red was not red.
How do you even know if the software is inaccurate if the monitor is also inaccurate? What if the game engine is also inaccurate? Then what?
Worrying about colors to begin with is completely stupid. As long as it looks good, what does it matter?
Just because Orange #948732 and Orange #948733 look the same does not mean the colors are "wrong".
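To put numbers on that point: #948732 and #948733 differ by a single step in the blue channel, out of 256 possible levels, which is far below what anyone can see by eye. A quick check in Python:

```python
def hex_to_rgb(h: str) -> tuple:
    """Convert '#RRGGBB' to an (r, g, b) tuple of 0-255 ints."""
    h = h.lstrip("#")
    return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

a = hex_to_rgb("#948732")
b = hex_to_rgb("#948733")

# Per-channel difference: only blue moves, and only by 1 step.
diff = [abs(x - y) for x, y in zip(a, b)]
print(a, b, diff)  # (148, 135, 50) (148, 135, 51) [0, 0, 1]
```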
I have yet to see a monitor, driver, or video card/GPU that produces "inaccurate" colors to the point where red is magenta.
Because there isn't a giant mystery about whether or not a color produced at a given moment is accurate. Most of the color inaccuracies in the software in question, video games, exist not because the developers don't know any better, but simply because the colors come from real-time algorithms that aim for "good enough" rather than "accurate". In other contexts it is easy to tell whether a color is what was intended, and it matters, in any field where image representation or manipulation is significant.
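A concrete example of "good enough" versus "accurate": many real-time renderers approximate the piecewise sRGB transfer function with a plain gamma-2.2 power curve because it is cheaper. The two curves disagree by a few percent in the shadows, exactly the kind of error nobody notices in a game. A sketch comparing them:

```python
def srgb_encode(c: float) -> float:
    """Exact sRGB transfer function (linear -> encoded), per IEC 61966-2-1."""
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * (c ** (1 / 2.4)) - 0.055

def gamma22_encode(c: float) -> float:
    """The common real-time approximation: a plain 1/2.2 power curve."""
    return c ** (1 / 2.2)

# Compare the exact curve with the approximation across the 0..1 range.
worst = max(abs(srgb_encode(x / 1000) - gamma22_encode(x / 1000))
            for x in range(1001))
print(f"worst-case difference: {worst:.4f}")  # peaks at a few percent, in the darks
```

Whether that gap matters depends entirely on whether you're shipping a game or grading a film.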
Because there are people who care more about things looking 'good enough'. That's the root of it all. It's a personal choice.
There is a difference between things looking "good enough" and complaining that "this orange is slightly too orangey compared to this orange".
The latter honestly sounds like petty complaining.
If a properly calibrated monitor is still showing yellow as yellow and red as red, what does it matter? If the colors are off so much to where red is nearing magenta then something is calibrated incorrectly to begin with.
As for cards, I doubt there is any difference at all outside of possibly saturation/brightness/contrast default levels.
This is a driver thing iirc.
fm87: Very true, but when multiple people on many forums are saying that their AMD card has warmer color, there's something more to it than that.
It doesn't matter.
fm87: Why exactly are you on about coolers and temps?
The driver, the way the driver and engine represent colors, the monitor and how it represents colors, how the driver is set up, etc. etc. etc.
You can't begin to worry about the monitor when the software itself is producing inaccurate stuff.
I fear this tangent.
I plan on getting an IPS monitor, as well as doing graphical work, so I am slightly worried about color.
Looks brown to me. http://jsbin.com/AqUhiXoy/1
Then you will be calibrating anyways.
fm87: Then you need to calibrate your monitor; that's a puke yellow/green.
Well yes, it's hyperbole.
I have tried. It's a cheap $60 TN panel, can't really get it much better. It is sorta puke looking.