Whenever I launch any version from 1.0 to 1.7.10 and load up a world, everything looks normal at first, but when I pick up something the screen turns into solid colors. GUIs are broken, the player model is broken. How can I fix this?
Here is a picture:
It's a known behavior: those versions had an OpenGL compliance issue that older video drivers were lenient enough to excuse, but some newer drivers aren't.
This has been reported on Intel, NVIDIA, and AMD video hardware/drivers, but I get the impression the latter two may have gone back to excusing it, because it seems to happen more commonly on Intel now. I had NVIDIA before and have AMD now and have personally yet to see it, but I don't play older versions much (and when I did, I didn't see it on either, on "recent enough" drivers).
In any case, there's no easy public fix. You can try using OptiFine (or, if you're already using it, try removing it), but if that doesn't help, your next best option is to try older driver versions, which is not always practical or even possible.
For 1.6.4 you can use my own fix, which has to be installed as a "jar" mod (see the instructions for TMCW for how to install it) and is not compatible with OptiFine since it modifies the same class (I've been told it does work with Forge, at least by itself, since this sort of mod inherently causes compatibility issues with other mods):
https://www.minecraftforum.net/forums/mapping-and-modding-java-edition/minecraft-mods/1294926-themastercavers-world?comment=294
For 1.7.10 you can use CoreTweaks, which uses my fix as a Forge mod (compatible with other mods):
https://github.com/makamys/CoreTweaks/releases/tag/0.3.3.2
For other versions, nobody seems willing or able to make the simple fix necessary: literally a single line of code which Mojang removed, for some reason, in Beta 1.9, presumably because it seemed redundant (there is another line which is nearly identical and would seem to do the same thing, "glClientActiveTexture" vs "glActiveTexture") and removing it had no effect back then. If anything, I'd blame all the drivers which simply ignored the omission, since they themselves technically weren't adhering to the official OpenGL specs; who knows how many old games suffer from bugs as a result, and it's part of why NVIDIA's drivers, which include thousands upon thousands of patches for game bugs, are so large (other vendors seem unwilling to do this, and I don't blame them). Intel and AMD have also long been notorious for their bad OpenGL performance/implementations, which continues to this day (e.g. "black screen after AMD driver update", in 1.21, not some decade-old version; of course they will prioritize fixing that, but only because it affects the latest version). OpenGL in general is also a deprecated standard which ceased development in 2017, and Apple has even threatened to drop support for it.
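To show why that "redundant-looking" line actually matters, here is a minimal sketch in LWJGL 2-style Java (illustrative, not Mojang's actual code, and assuming, as the symptoms suggest, that the line in question resets the active texture unit back to the default): the two calls look nearly identical but select different pieces of state, so dropping either reset leaves half of the texture-unit selection stuck on whatever unit was last used.

import org.lwjgl.opengl.GL13;

public final class TextureUnitReset {
    // glActiveTexture selects the *server-side* texture unit, which is what
    // glBindTexture, glTexEnv and enabling/disabling GL_TEXTURE_2D act on.
    // glClientActiveTexture selects the *client-side* texture unit, which is what
    // glTexCoordPointer and the texture-coordinate array state act on.
    // After rendering with the secondary (lightmap) unit, both selectors need to be
    // pointed back at unit 0, or everything drawn afterwards (GUI, player model)
    // operates on the wrong unit.
    public static void resetToDefaultUnit() {
        GL13.glClientActiveTexture(GL13.GL_TEXTURE0);
        GL13.glActiveTexture(GL13.GL_TEXTURE0);
    }
}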
Aren't you contradicting yourself by saying AMD's OpenGL implementation is bad, but then acknowledging the drivers are having to work around what are really issues with the games?
Saying AMD's OpenGL performance has "always been notoriously bad" is quite a claim. It was also more Windows-specific, as it was better on Linux with the Mesa drivers. Either way, it really does not continue like that to this day: the OpenGL implementation was improved two years ago, and support for more extensions was added later. As is often the case, things change, yet mindsets can be slow to follow.
Intel is a different story, however. Traditionally they never had anything more than integrated graphics, which were never meant to be much more than low-cost, basic display solutions. Their venture into dedicated graphics, and the improved support and performance that came with it, is pretty recent, so they would be focusing on recent norms. They don't even have DirectX 9 support in hardware for Arc or modern integrated graphics like Iris Xe; it's done through emulation via Windows and Microsoft. Their efforts now are very much aimed at modern APIs like DirectX 12 (and other low-level APIs like Vulkan), and it shows, as even in DirectX 11 there is a notable performance disadvantage. While they've been improving overall, I wouldn't expect serious OpenGL efforts at this stage, especially with all the really major issues they have going on right now...
OpenGL seems to become less and less relevant as time goes on. Apple hasn't been developing its OpenGL support in a long time, and apparently many game engines aren't even offering/supporting it anymore; far fewer games use it now than in the past. Minecraft still uses it, of course, because it's old, and at this point, at least for now, sticking with a pile of technical debt may be preferable to rewriting things. That's perhaps the real question to ask: instead of "why don't modern drivers focus on supporting effectively-legacy APIs to the extent of excusing non-compliance issues", maybe it should be "perhaps we should consider moving to better rendering engines if they can bring better performance, even if that requires added effort and brings additional complexity". Of course, as long as current performance is "good enough" and the added effort and complexity outweigh the benefits, I don't see it happening.
While the fault is with the game, it was specific to those versions, and later versions did address it. Mojang understandably won't maintain that many support branches. This is just a case where the community is likely the best answer, and thankfully Minecraft has a large one. At least 1.6 and 1.7 (not sure about sub-versions) sound like they have an accessible solution.
Regarding "later versions did address it": 1.8 has the exact same problematic code as 1.6.4, and as far as I can tell it is only sheer luck that whatever other code changes Mojang made "fixed" the issue (in other words, they didn't intentionally fix it; the same has happened with other bugs in the game, such as obsidian generators being patched as a side effect of changes to how blocks are handled internally in 1.8):
https://github.com/interactivenyc/Minecraft_SRC_MOD/blob/master/mcp811 1.6.4/src/minecraft/net/minecraft/src/RendererLivingEntity.java#L257
https://github.com/Marcelektro/MCP-919/blob/main/src/minecraft/net/minecraft/client/renderer/entity/RendererLivingEntity.java#L189
Not only that, 1.7-1.12 had their own issue with some NVIDIA drivers which caused literally all textures to be broken (not just some of them, and/or seemingly random artifacting). The fact that it only went back to 1.7 suggests yet another subtle code change that only caused it to manifest from that version on, and this is why it can be so difficult to pinpoint the cause of a bug. I only found out what the cause was because somebody else told me they had fixed it in a mod that backported some newer code, by reverting the code in the examples shown above to what was used before Beta 1.9, which did not have "OpenGlHelper"; that class now acts as a wrapper for the "setActiveTexture" calls to OpenGL (adding support for OpenGL 1.2 with the proper extensions, otherwise using OpenGL 1.3). Apparently, when Mojang added that class and moved the OpenGL "setActiveTexture" calls into it, they omitted one because it seemed unnecessary (I mean, drivers didn't care, so it has to be valid, right?). Which again ultimately points the blame at the drivers not adhering to the spec; ideally they would issue an error, which the game has been coded to intercept and display (often not very well, e.g. it just says "invalid operation" without specifying where it occurred, though it would also be accompanied by rendering errors):
https://www.minecraftforum.net/forums/support/java-edition-support/3122813-minecraft-1-8-9-broken
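For reference, the wrapper mentioned above boils down to roughly the following; this is a sketch from memory in LWJGL 2-style Java with illustrative names, not the decompiled class, and the capability check is my assumption of how the ARB-vs-core choice is made.

import org.lwjgl.opengl.ARBMultitexture;
import org.lwjgl.opengl.GL13;
import org.lwjgl.opengl.GLContext;

public final class ActiveTextureHelper {
    private static boolean useArbMultitexture;

    public static void init() {
        // Use the ARB_multitexture extension on OpenGL 1.2-class drivers,
        // otherwise the core OpenGL 1.3 entry points.
        useArbMultitexture = GLContext.getCapabilities().GL_ARB_multitexture
                && !GLContext.getCapabilities().OpenGL13;
    }

    // Selects the server-side texture unit (bindings, texture environment, enables).
    public static void setActiveTexture(int unit) {
        if (useArbMultitexture) {
            ARBMultitexture.glActiveTextureARB(unit);
        } else {
            GL13.glActiveTexture(unit);
        }
    }

    // Selects the client-side texture unit (texture-coordinate arrays); this is the
    // counterpart that is easy to mistake for a redundant duplicate of the one above.
    public static void setClientActiveTexture(int unit) {
        if (useArbMultitexture) {
            ARBMultitexture.glClientActiveTextureARB(unit);
        } else {
            GL13.glClientActiveTexture(unit);
        }
    }
}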
Of course, NVIDIA eventually implemented a workaround for the broken-textures issue on their end, and even before then you could create a custom profile with NVIDIA Inspector by importing a file that changes driver behavior (something Intel and AMD don't seem to support? The only fix I've seen given for the memory leak on AMD involves directly modifying the driver's code, which is only possible with the open-source driver).
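As for the "invalid operation" messages mentioned above, the check the game performs amounts to something like this (again a hedged sketch with illustrative names, not the actual decompiled method); glGetError only returns a code, not where the error happened, which is why the in-game message is so unspecific:

import org.lwjgl.opengl.GL11;
import org.lwjgl.util.glu.GLU;

public final class GlErrorCheck {
    // Call this after a block of OpenGL work; "location" is just a label the caller
    // passes in (hypothetical here), since OpenGL itself cannot say where the error arose.
    public static void checkGlError(String location) {
        int error = GL11.glGetError();
        if (error != GL11.GL_NO_ERROR) {
            String description = GLU.gluErrorString(error); // e.g. "Invalid operation"
            System.err.println("########## GL ERROR ##########");
            System.err.println("@ " + location);
            System.err.println(error + ": " + description);
        }
    }
}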
A lot of issues are caused by a combination of variables and not just one. Therefore, it wouldn't surprise me if one particular thing changed with 1.0, which ultimately introduced the issue, and then remained the same way in 1.8 even though the issue doesn't affect 1.8; it could very well come down to some other change.
As an example, imagine a room with a cat in it; the room is calm. Now imagine a second cat is introduced which doesn't get along with the first cat and starts a fight with it. The fight represents the appearance of an issue. Now suppose you remove the first cat (not the second cat, which was what seemingly "introduced" the fight/issue). The fight still stops, right? Same concept: you can change something other than what introduced an issue and it may still "resolve" it.
Issues aren't always what they seem. The recent issues with Intel's CPUs degrading are another good example of this. One of the common errors thrown is about being out of video memory (and it often occurs while compiling shaders for games), yet it's not an issue with the video card, the drivers, or a lack of memory or VRAM at all. It's the CPUs being unstable.
Hardware and software have so many variables that honestly, maybe it's a surprise there aren't even more issues.