I would like to discuss the inner workings of Minecraft shader packs for Java Edition, because I want to understand how these shader packs actually work with the game. What has fueled my wish to learn about shaders is not a desire to make my own, but rather to get an answer to a graphical phenomenon that seems to happen in all of them: dithered, noisy, grainy shadows.
Backstory: I have grown dissatisfied with the use of anti-aliasing in modern games, particularly Temporal Anti-Aliasing (TAA). On my computer, those games end up noticeably blurry even at native resolution - I game at 2560x1440 - and image quality degrades further when there is a lot of motion between frames. Depending on the environment, annoying graphical phenomena begin to appear: normal maps seem to vanish from textures, and moving objects leave a 'ghost' behind them on screen; this particular phenomenon is called "ghosting". Depending on the game and implementation, the ghosting is amplified five-fold in very bright environments. The reason TAA causes these phenomena is that it combines data from previous frames with the current frame to reduce jagged edges; when objects are in motion, the data from previous frames no longer matches the frame currently being rasterized.
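To illustrate what I mean by "data from previous frames", here is a rough, generic sketch of a TAA resolve pass - not code from any particular game or shader pack, and the texture/variable names are made up for illustration:

```glsl
#version 330 core
// Hypothetical TAA resolve pass (illustrative sketch only).
uniform sampler2D currentTex;   // this frame's (jittered) render
uniform sampler2D historyTex;   // accumulated result of previous frames
uniform sampler2D velocityTex;  // per-pixel motion vectors
in vec2 texCoord;
out vec4 fragColor;

void main() {
    vec3 current = texture(currentTex, texCoord).rgb;

    // Reproject: look up where this pixel was in the previous frame.
    vec2 velocity = texture(velocityTex, texCoord).xy;
    vec3 history  = texture(historyTex, texCoord - velocity).rgb;

    // Blend mostly toward the history so the per-frame jitter averages out.
    // When the reprojected history is wrong (fast motion, disocclusion),
    // this blend is exactly what shows up as ghosting.
    float currentWeight = 0.1;
    fragColor = vec4(mix(history, current, currentWeight), 1.0);
}
```

Real implementations also clamp the history against the current frame's neighborhood to limit ghosting, which is a large part of why quality varies so much between games and packs.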
Because of these problems, I now turn off TAA in all games and resort to FXAA, SMAA, and even MSAA. Each has its own pros and cons when it comes to smoothing jagged edges, but in my experience none of them produce ghosting.
This YouTube video explains it in far more detail than I can - props to the video author:
Now: I have opened an issue on GitHub for Solas Shaders, in an attempt to gain insight into the inner workings of shaders in Minecraft Java. I run the pack on the Ultra graphics preset, with TAA off and FXAA on. With TAA on, ghosting is very apparent when playing Minecraft in very bright environments, such as snowy biomes.
However, a new graphical phenomenon appears when TAA is disabled: shadows become noisy, dithered, grainy. It isn't just shadows cast by sunlight, but also the darkening generated near block edges as part of ambient occlusion. I have tried to alleviate the issue by turning off every known form of image sharpening in the shader settings, yet the grainy shadows are still present. See the screenshots for what I encounter when playing Minecraft with shaders and TAA off.
The developer of said shader pack has been unhelpful, mocking me for my graphical choices and telling me not to compare other AA options for Minecraft because, per their claims, they are very difficult to implement in the Java environment, and because "there's a reason why all Minecraft shaders rely on both FXAA and TAA". The developer did acknowledge that their TAA implementation is blurry and promised to work on it. However, that does not answer my question: why do shadows in Java shader packs seem universally grainy, to the point that these packs rely on TAA for good shadow quality?
Is it a Java thing? Could it be my GPU and its drivers? I want to know. This phenomenon happens across all the other shader packs I have tried; only its severity varies between packs.
Relevant specs:
Sapphire Radeon RX 6800 XT
AMD Adrenalin driver version 24.8.1, clean install with Radeon Software. Driver-side sharpening is OFF
People love to blame Java for various issues in the game, but I can confidently say it has nothing to do with this issue. The game doesn't even communicate with the graphics driver directly; it goes through a library called the "Lightweight Java Game Library" (LWJGL), which acts as a bridge between Java and the system, translating Java calls into native C libraries. The main issue this causes is slower performance when making native API calls (hence one way I optimized the game was to minimize them). The rendering API it uses is OpenGL - a deprecated API, I should add, which ceased development in 2017 and has been threatened to be dropped entirely by various vendors, with the vast majority of modern games using DirectX, Vulkan, Metal, or other APIs.
Most notable may be OpenGL's inability to support modern rendering methods like ray tracing (so, back before ray tracing became mainstream, buying a fancy "RTX" GPU was a waste of money as far as Minecraft: Java Edition was concerned; only Bedrock Edition supports it - speaking of which, do shaders on Bedrock also have this issue?):
OpenGL is no longer in active development: between 2001 and 2014, the OpenGL specification was updated mostly on a yearly basis, with two releases (3.1 and 3.2) taking place in 2009 and three (3.3, 4.0 and 4.1) in 2010. The latest OpenGL specification, 4.6, was released in 2017 after a three-year break, and was limited to the inclusion of eleven existing ARB and EXT extensions into the core profile.
Active development of OpenGL was dropped in favor of the Vulkan API, released in 2016 and codenamed glNext during initial development. In 2017, Khronos Group announced that OpenGL ES would not have new versions[9] and has since concentrated on development of Vulkan and other technologies.[10][11] As a result, certain capabilities offered by modern GPUs, e.g. ray tracing, are not supported by the OpenGL standard. However, support for newer features might be provided through vendor-specific OpenGL extensions.
(While they point out that it is possible for vendors to add support themselves, that would mean every GPU vendor would need to implement it. As it was, Minecraft was already hindered by the fact that older versions used extensions only available on, or only properly supported by, NVIDIA hardware - likely a consequence of being coded by a single person at one time. While not crippling, this did degrade the experience on AMD and Intel, in both graphical quality and performance.)
To answer your question, I did some quick experimenting in Minecraft: Bedrock Edition Preview, using the Prizma Deferred resource pack, with everything set to High (setting everything to Ultra tanked my FPS down to 30 - not an enjoyable experience lol).
Anti-aliasing set to the minimum of 1 (the setting does not explain what kind of AA it's using, just "Anti-aliasing"):
Anti-aliasing cranked all the way up to 16:
With AA set to '1', the shadows have jaggies, but they still do not have noise/grain leading up to their edges. Cranked up to 16, the jagginess of the shadows is mostly gone. Again, I saw no noise in the shadows leading up to the edges, which is in stark contrast to the shaders I was using on Java Edition.
You're basically saying that the reason certain AA types cannot be added to the game through shaders is simply that OpenGL is an old, deprecated graphics library?
I am aware that a Vulkan mod is in the works right now for Java Edition; however, it breaks all mods that make OpenGL calls (which a surprising number of mods seem to do).
This might be better off in the hardware/technology/whatever forum as opposed to the Minecraft one? While your example is with Minecraft Java shaders, the underlying subject isn't exclusive to that and goes into broader hardware and technology trends.
The developer of said shader pack has been unhelpful, mocking me for my graphical choices and telling me not to compare other AA options for Minecraft because, per their claims, they are very difficult to implement in the Java environment, and because "there's a reason why all Minecraft shaders rely on both FXAA and TAA". The developer did acknowledge that their TAA implementation is blurry and promised to work on it. However, that does not answer my question: why do shadows in Java shader packs seem universally grainy, to the point that these packs rely on TAA for good shadow quality?
Is it a Java thing? Could it be my GPU and its drivers? I want to know. This phenomenon happens across all the other shader packs I have tried; only its severity varies between packs.
Well, that's... a disappointing response you got...
That being said, while the rudeness was uncalled for, the reasoning they gave you wasn't entirely wrong, and I'll try and explain it the best I understand it.
No, it's not a Java thing. No, it's not your graphics drivers.
More and more, we've been moving back towards dithering for some shadows/effects. This used to be common years and years ago, and personally, I grew up with it, so it doesn't bother me all too much when I see it (well, usually). The reason it's done is that it's much cheaper in performance while not being much worse visually... that is, when it's masked with upscaling or anti-aliasing methods. As you found out, many games are being made with these methods in mind, and when you take those methods away, it exposes some of these things. To understand why games are being developed with these methods in mind, here are a few reasons (and see the sketch after them for what this kind of dithering looks like in a shader).
Improvements in graphics hardware are slowing down (not unlike what happened to CPUs long before them). This is made worse by the fact that the gains aren't simply slowing; they're also rising in price. Consider that nVidia's mid-range graphics cards now cost $600 to $800 (!) in the RTX 40 series, instead of $250 like they did in the GTX 10 series. AMD isn't much cheaper, with its closest equivalent to those aforementioned nVidia reference points costing ~$500 (but since AMD isn't matching nVidia's performance at the top end, that closest equivalent isn't AMD's own mid-range either, so it's a bit of a mismatched comparison).
On top of this, there's this nasty reality called diminishing returns. You might make a processor twice as fast, but it might not bring twice the overall improvement. Nowhere near.
TVs have more and more been moving to 4K resolutions, and the majority of gaming is cross-platform and developed with the consoles (which predominantly are connected to those 4K TVs) as the target hardware. Even high-end PC hardware can struggle with 4K, let alone the current consoles (which basically have a slightly slower Ryzen 7 3700X and a Radeon RX 6700 non-XT as their CPU/GPU in a combined APU). Consoles have increasingly been relying on upscaling, checkerboarding, and so on. Dithered shadows and effects are part of this.
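To make the dithering point concrete, here is roughly what that kind of noisy shadow sampling looks like in a shader. This is only a sketch of the general technique, not code from Solas or any other pack, and the names (shadowMap, frameCounter, and so on) are made up:

```glsl
// Sketch: soft shadows from a handful of randomly rotated shadow-map taps.
uniform sampler2D shadowMap;
uniform float frameCounter;   // frame index, often also used to jitter over time

// Cheap per-pixel pseudo-random value (a common hash seen in many shaders).
float hash(vec2 p) {
    return fract(sin(dot(p, vec2(12.9898, 78.233))) * 43758.5453);
}

float softShadow(vec3 shadowPos, vec2 fragCoord) {
    // Rotate a tiny sample kernel by a random angle per pixel (and per frame).
    float angle = 6.2831853 * hash(fragCoord + frameCounter);
    mat2 rot = mat2(cos(angle), -sin(angle), sin(angle), cos(angle));

    const vec2 offsets[4] = vec2[](
        vec2( 0.4,  0.1), vec2(-0.3,  0.4),
        vec2( 0.2, -0.4), vec2(-0.4, -0.2));

    float radius = 0.002;     // filter radius in shadow-map UV space
    float lit = 0.0;
    for (int i = 0; i < 4; i++) {
        vec2 offset = rot * offsets[i] * radius;
        float depth = texture(shadowMap, shadowPos.xy + offset).r;
        lit += (shadowPos.z - 0.0005 <= depth) ? 1.0 : 0.0;
    }
    // Only four taps: every pixel gets a slightly different answer, which reads
    // as grain unless TAA (or a heavy blur pass) averages it out over frames.
    return lit / 4.0;
}
```

Doing this "properly" would mean dozens of shadow-map taps per pixel instead of a handful, which is exactly the performance cost the dithering avoids.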
So why is TAA the one forced? There could be a few reasons, but I can only speculate.
MSAA and the like have drawbacks. For one, it fails to cover transparent objects and needs transparency AA paired with it to handle them. And MSAA simply does not work with most modern deferred rendering techniques. As far back as the original Halo game on the PC, anti-aliasing was infamous for not working. That was over two decades ago, so MSAA falling out of favor as the de facto standard isn't really a new thing. MSAA is basically "old fashioned". A similar rendering change in Minecraft between 1.6 and 1.7 is what prevented forced MSAA from working right.
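For context on why MSAA and deferred rendering don't mix, here's a stripped-down sketch of a deferred lighting pass (made-up G-buffer names, not any pack's actual code). The lighting is computed from G-buffer textures at one sample per pixel, so the sub-pixel coverage information MSAA relies on is already gone by the time the expensive work happens:

```glsl
#version 330 core
// Sketch of a deferred lighting pass reading a G-buffer (illustrative only).
uniform sampler2D gbufferAlbedo;
uniform sampler2D gbufferNormal;
in vec2 texCoord;
out vec4 fragColor;

void main() {
    vec3 albedo = texture(gbufferAlbedo, texCoord).rgb;
    vec3 normal = normalize(texture(gbufferNormal, texCoord).xyz * 2.0 - 1.0);

    // One lighting evaluation per pixel. MSAA would require lighting every
    // sub-sample (and storing a multisampled G-buffer), which is why it is
    // rarely supported in deferred renderers.
    vec3 sunDir = normalize(vec3(0.3, 0.8, 0.5)); // hypothetical light direction
    float diffuse = max(dot(normal, sunDir), 0.0);
    fragColor = vec4(albedo * diffuse, 1.0);
}
```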
SSAA is very expensive in performance, but it's the best when you have the performance headroom. Unfortunately, I don't think the Fabric ecosystem gracefully offers this (OptiFine does). You can do it via VSR/DSR, but then your menus and everything are downscaled too, as opposed to just the 3D rendering, so you'll need to increase the GUI scale and then double your mouse DPI to compensate, and this quickly gets old. In other words, dear Sodium/Iris/etc., please add this feature already like OptiFine has? It's literally one of two reasons I'm still tied to OptiFine.
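For what it's worth, the idea behind SSAA is simple: render the 3D scene to a texture some multiple of the display resolution, then average it back down. A rough sketch of the downscale step (the names are made up; this isn't OptiFine's actual code):

```glsl
#version 330 core
// Sketch: 2x supersampling resolve by averaging a 2x2 block of high-res texels.
uniform sampler2D highResScene;  // scene rendered at 2x width and 2x height
uniform vec2 highResTexelSize;   // 1.0 / high-res texture dimensions
in vec2 texCoord;
out vec4 fragColor;

void main() {
    vec3 sum = vec3(0.0);
    for (int x = 0; x < 2; x++)
        for (int y = 0; y < 2; y++)
            sum += texture(highResScene,
                           texCoord + (vec2(x, y) - 0.5) * highResTexelSize).rgb;
    fragColor = vec4(sum * 0.25, 1.0);   // each output pixel is a true average
}
```

Because every output pixel is an average of real shaded samples, there's no ghosting and no blur beyond the averaging itself; you just pay for rendering four (or more) times the pixels.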
FXAA has its own issues. Personally, I'm surprised by people who label TAA as blurry and then turn around and praise FXAA. While TAA definitely has ghosting issues, FXAA is the inherently blurry one. I'm not going to pretend TAA is perfect, but I don't agree with the people who blame it and then champion FXAA. That video above (note that I didn't watch it and am going off the thumbnail, so I might be missing its point) seems pretty sensational, because the Xbox 360/PlayStation 3 era, and even a portion of the following console era, was infamous for looking blurry and muddy due to poor texture quality and a lack of hardware power to drive the then-new HD resolutions. Blurry-looking games aren't a new trend. I personally reject FXAA outright as acceptable at all, but... it's subjective, I admit. Go try the BSL shaders and switch between TAA and FXAA and tell me that it is TAA that is the blurry one of the two. It's not; it's FXAA that is blurry. TAA simply has issues with ghosting (and typically only in certain conditions), and it reintroduces aliasing in motion, which is... the opposite of blurry.
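To show why FXAA is inherently a blur: at its core it detects local contrast and blends a pixel with its neighbors where contrast is high. The real FXAA algorithm is more sophisticated (it searches along the edge), but this heavily reduced sketch captures the principle; it is not the actual FXAA 3.11 code and the names are made up:

```glsl
#version 330 core
// Heavily simplified FXAA-style pass (illustrative only).
uniform sampler2D sceneTex;
uniform vec2 texelSize;   // 1.0 / screen resolution
in vec2 texCoord;
out vec4 fragColor;

float luma(vec3 c) { return dot(c, vec3(0.299, 0.587, 0.114)); }

void main() {
    vec3 center = texture(sceneTex, texCoord).rgb;
    vec3 north  = texture(sceneTex, texCoord + vec2(0.0,  texelSize.y)).rgb;
    vec3 south  = texture(sceneTex, texCoord - vec2(0.0,  texelSize.y)).rgb;
    vec3 east   = texture(sceneTex, texCoord + vec2(texelSize.x, 0.0)).rgb;
    vec3 west   = texture(sceneTex, texCoord - vec2(texelSize.x, 0.0)).rgb;

    // Local luma contrast: high near geometric or texture edges.
    float lMin = min(luma(center), min(min(luma(north), luma(south)),
                                       min(luma(east),  luma(west))));
    float lMax = max(luma(center), max(max(luma(north), luma(south)),
                                       max(luma(east),  luma(west))));
    float contrast = lMax - lMin;

    // Where contrast is high, blend toward the neighborhood average.
    // The "anti-aliasing" is literally a contrast-guided local blur,
    // which is why it softens texture detail as well as jagged edges.
    float amount = smoothstep(0.05, 0.3, contrast);
    vec3 blurred = (center + north + south + east + west) * 0.2;
    fragColor = vec4(mix(center, blurred, amount), 1.0);
}
```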
I do agree that choice is good and TAA shouldn't be forced, and it sometimes is in some games... but I sort of get why it is. This thread is an example of why. Certain things are literally designed with it (or at least scaling methods) in mind, and then when it's disabled, people are wondering why certain other things are as they are.
Now maybe shadows/effects should be done at full resolution... but that becomes increasingly performance-demanding, fast. Not saying the user shouldn't have a choice; just that I sort of get why it's gone this way.
The alternative is a lack of anti-aliasing at all, and that's worse than anything in my mind. I'd probably accept FXAA before that, and that's saying something.
I've been using SEUS shaders for a while now, but I've noticed that it tends to drain performance on larger builds. Has anyone else experienced this? I've heard BSL Shaders might offer a better balance between quality and FPS. What are your thoughts? Any other lightweight shader recommendations for high-performance gameplay?
For the most part, the performance you get with shaders shouldn't be impacted too drastically by a build.
However, I've experienced an exception.
Shaders might be mostly GPU limited, but they also add CPU demands, and there's one area in particular where this seems to matter: entities. I first started using shaders a lot in 1.16 with BSL, and when the 1.18 update happened, I noticed something: I was getting noticeably reduced performance in my village.
A later update to BSL added an "entity shadows" option, and disabling that did the trick. Notice the difference in frame rate here, and the only difference is that entity shadows are set to on in the first one, and set to off in the other one (you can notice this because my character and other entities, like the golems, no longer have a shadow being cast from them).
Why did this performance hit only occur after 1.18 (or maybe it was 1.17, which I skipped over)? I have no idea. It didn't happen with vanilla; only with shaders. But I noticed many shaders started adding it as an option, and many even default to it being off. So I would check if the shaders have such an option and if it's enabled, and if it is, disable it.
Do not confuse this with Minecraft's own option of the same name, as that one has no effect on this when shaders are being used.
If it's not that, then I don't know, but some of the heavier shaders might also just be heavier, and they might similarly have drops in areas where a lot is going on (like a village, base, farm, etc.). I mostly stick to BSL and Complementary and they are pretty good on performance. The one time I tried Continuum, even at a render distance of 16, I was getting less performance than I do with the other two at 32 or even 48 chunks.
What shader you use is just the start; some of the settings they have in particular have a major impact on quality and performance.
I use my shader, Recastional.