This is somewhat true; he had okay performance with his old Radeon card. It was an HD 7850. The main issue wasn't his card, it was his processor, which was a first gen i3. Here's one of the old videos:
But yeah... when he started doing his cinematics, he used POM and all that other fancy stuff. That's what cut his FPS down to the point that he had to use those filmmaker tricks.
@UnstoppablE_GaMeR, you'd be better off just building/buying yourself a new PC. If you plan on gaming with it, avoid laptops. Purchase a proper PC, if you can. I'd go for at least an i5, 8GB of RAM and a GTX 750 Ti. Your current computer just won't cut it for Shaders, I'm afraid. =/
The main issue was his card.
The whole thing about Minecraft being processor-based is true; Minecraft does suffer from CPU bottlenecks rather easily. However, this isn't quite the case when GPU-intensive shaders are involved.
Basically, what you need for maximum FPS in Minecraft is a balanced system. Your CPU and GPU should be around the same level.
To explain a bit further: vanilla Minecraft has simple graphics, which means dedicated GPUs can easily push out hundreds of FPS; however, the processor is usually what holds them back from reaching that level of performance. When you add shaders into the mix, the GPU is stressed far more than the processor is, and in that situation the processor doesn't matter as much as it would otherwise.
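The reasoning above boils down to a "weakest link" model: your frame rate is capped by whichever component is slower for the workload at hand. A toy sketch (all numbers purely illustrative, not benchmarks):

```python
def effective_fps(cpu_limit_fps, gpu_limit_fps):
    """The observed frame rate is capped by whichever component is slower."""
    return min(cpu_limit_fps, gpu_limit_fps)

# Vanilla Minecraft: the GPU barely works, so the CPU sets the cap.
vanilla = effective_fps(cpu_limit_fps=120, gpu_limit_fps=600)  # -> 120 (CPU-bound)

# Same machine with heavy shaders: GPU load explodes and dominates.
shaders = effective_fps(cpu_limit_fps=120, gpu_limit_fps=35)   # -> 35 (GPU-bound)
```

In the second case, swapping in a faster CPU wouldn't move the number at all; only a faster GPU would.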
Ehh.... The HD 7850 was, and still is, a powerful card. It beats the GTX 750 Ti by a long shot. This is coming from an Nvidia fangirl, by the way - I adamantly avoid AMD and attempt to dissuade people from ever purchasing anything made by AMD. Here, have a look at this benchmark record against the 750 Ti: http://gpu.userbenchmark.com/Compare/Nvidia-GTX-750-Ti-vs-AMD-HD-7850/2187vs2182
Anyways, the i3 that he owned at the time was an incredibly poor CPU. Even my old Core 2 Quad Q6600 was more powerful than it. So, I'd argue that in this case, the problem was certainly the CPU. But yes... shaders, specifically SEUS and its variants like Continuum, are extremely demanding on GPUs. I will never deny that.
My current PC:
• Ryzen 9 3900x @4.2GHz
• 64GB DDR4 @3200MHz
• MSI RTX 2080 Ti Gaming X Trio
• 2TB NVMe SSD
• 6TB in HDD space
• Four monitors at 4k
ehhh... his i3 540 was better back in the day than your Q6600... time to research a bit more there. Also, an i3 won't bottleneck with shaders. His 7850 was the bottleneck; the i3 was fine.
Sushi Shaders - a fresh and (objectively/somewhat) realistic shader, approved by Past Life Pro, made by yours truly. Works on Mac OSX
You know, purposely mocking someone isn't okay. Regardless of who's right or wrong, you ought to be kind to others. And no, the Q6600 is more powerful than the i3-540: http://www.cpubenchmark.net/compare.php?cmp[]=738&cmp[]=1038
Regardless, let's not derail this thread any further. If you honestly feel the need to continue the debate, you can always private message me.
Hmm.. guess it depends on where you look. CPU-World and cpu.userbenchmark both show that the i3 is better than the Q6600, while CPUBoss shows the results alternating. Idk what to take from that, then. Sorry if my initial reply seemed to be mocking; that wasn't my intention.
Either way, neither the Q6600 nor the i3-540 will be a bottleneck for shaders, especially not with a 7850. The GPU is most certainly the bottleneck when it comes to shaders.
and attempt to dissuade people from ever purchasing anything made by AMD
Aw, don't do that. It's one thing to have a preference, but we need more equal competition in the GPU space in order for future products to actually improve. When one company has significantly more market share than the other, it gets to a point where progress comes to a standstill. I know what you mean, though; I understand the preference for Nvidia.
It's hard to use general GPU benchmarks to measure Minecraft performance. Newer generation "Polaris" cards might improve or even fix this problem, but AMD's current GCN architecture seems to really hate Minecraft's OpenGL implementation for some reason. That's why in the case of Minecraft, a GTX 970 outperforms an R9 390 by a significant margin.
Bro! All of you guys just kept saying I have the oldest, ooolllldeeest graphics card.
But when I updated my graphics drivers to the latest...
and...
magic!!!
All of them are working!! (Low FPS, but they are working!!!)
I seriously can't believe I told you, @Strum355, that if I updated my drivers...
Honestly, that GPU is more than enough to play with Continuum. I have a GTX 970, and I get 50-60 FPS at maximum settings. That said, the recent release (Continuum 1.2.1, both Classic and the normal one) yields extremely low FPS for me as well; I get 20-30 FPS with it. Have you tried the previous release? It's the 1.8.9 version, the one I get 50-60 FPS in.
... and now that I try to look for it, I'm not sure which specific link it is. It used to be found under the 1.2 button, when it was crossed out. Hmm. Anyways, no, you don't need a better PC, nor a better GPU. Everything you have is perfectly fine. I have an i7-6700, 24GB of DDR4 RAM and my aforementioned GTX 970. Anyways, be sure to keep ''Render Quality'' and ''Shadow Quality'' at 1x, as seen here:
Increasing them does naught but increase GPU load, with little to no visual quality difference.
Those settings do actually do something, if you know what to look out for, of course. Render Quality can be compared to the resolution the game is being run at: the higher the render quality, the less aliasing, etc.
Shadow Quality is the shadow resolution: increasing it raises the shadow resolution, while lowering it reduces it.
That being said, still run both of them at 1x. I personally find 4x FXAA to be a much better option. Also... slightly disappointed in your not using normal or specular maps ;3 xD
It's good to hear that you found a solution. However, your GPU is horrifically underpowered for any SEUS variant. I'm honestly shocked that you managed to get an Nvidia GT 620 to work with this pack and have it be playable. Even the creator of SEUS, Cody Darr, struggled to get over 30 FPS on his old GTX 550; that's what he claimed in an old Facebook post of his, anyways.
I'd still suggest that you purchase a better GPU in the future, regardless of getting this pack running. Looking back at my reply to you, I'm not exactly sure why I said that you need to build a new computer lol. You certainly do need more RAM, but your CPU is fine.
I played with SEUS on an Intel Pentium E5300 for quite a long period of time before I was able to upgrade. That said, I also had 4GB of RAM and a GTX 550 Ti at the time. My general FPS was around 20-30 with that old build. Anyways, you could easily upgrade to 8GB of RAM and put a new GPU into your PC for under $200. Look for at least a GTX 560; you can find those used for under $100 sometimes.
Also...slightly disappointed in your not using normal or specular maps ;3 xD
Well, the texture pack I use doesn't support either of them; I use my own personalised version of Misa's. Her pack is the best one in existence, in my opinion. Prior to the project being taken up by Les, after she couldn't work on it any further, I was floating around a bit and mostly just using Faithful, or Chroma Hills when I used SEUS. Now that Les has taken it upon themselves to update it to 1.9 and 1.10, I've started using it again.
As for the other stuff you mentioned... the difference in quality is so minimal (yes, it's noticeable) that it doesn't seem worth it to turn them above 1x. So, I always make certain to make it clear to those who ask that it's pointless to turn those two quality levels up.
Had a look at Misa. It's nice, but a bit plain from what the screenshots showed. However, I'll give it a go; it might be better with shaders. For the others: yeah, hence I said I keep it at 1x as well myself. There's always someone out there who will turn it up, but for most, there's minimal gain to be had from it.
I'm not exactly sure why I said that you need to build a new computer lol. You certainly do need more RAM, but your CPU is fine.
Anyways, you could easily upgrade to 8GB of RAM and put a new GPU into your PC for under $200. Look for at least a GTX 560; you can find those used for under $100 sometimes.
Lol... and I can only upgrade the RAM to 4GB, and a GTX 560!?! Nah, it's out of my budget, man. India is different.
I have a couple of things to ask/say here. First, I don't understand why people are saying there's not much difference between 1x and 2x Render Quality, as there is a HUGE difference to me. Even running at 1.5x, the amount of aliasing and general blur is unbearable to me, with any shader. Shadow Quality I usually keep at 1x, with a higher res, or 2x with 1024.
On to my questions:
1. I have a Giga R9 380 and an Athlon XII 860K, and I'm getting lower framerates than I feel I should be with this card on this pack, even using the medium profile. Are there a set of settings that should be set a certain way to get better performance? What should I be seeing with my card at 1080 with the extreme profile? Are there still bugs or performance issues with 1.9.4?
2. I am in love with the water from this pack. The refraction (which I think could use a little boost in visual strength, btw) makes it absolutely stunning to look at. Is there any way you could create a profile optimized for just the water effects and decent FPS? Maybe add some simple lighting effects like sun glare, god rays, etc., but DEFINITELY give an option to turn everything on/off individually, including the dynamic lighting/cast shadows (which seems to be one of the biggest FPS hits, at least for me), and if you could add options for effect strength for everything too... I know it's a lot to ask, but I know there are people out there (like me) who want JUST water shaders, and these are BY FAR the best looking; and there are others who don't have a strong enough rig to use all the amazing effects offered here. Offering Continuum in separately grouped packs (water, shadows and lighting, weather effects, etc.) would increase the user base (hell, put them behind adf.ly, we don't mind). Maybe even make this modular somehow by letting users choose which effects they want, then have a script compile just those features. Continuum is way too good to have such a small group of people using it, so in my opinion everything that can be done to increase how many people use these shaders should be done. Please at least consider this...
3. I ran Cinematic just for the hell of it. I think there's a rendering issue with one of the cloud settings. I haven't narrowed it down yet, but it looks like some kind of volumetric cloud setting that is rendering as flat vertical/horizontal sheets instead of a volume. I can try to load the profile back up and get a screenshot if needed.
I can answer your first question, but the next two aren't something I'm capable of helping with. Firstly... I say that there's no difference, meaning (in a slightly cryptic/convoluted way) that the performance hit is so insanely high for the visual quality it offers that it's simply not worth it. The performance-to-quality ratio is completely out of whack with those two settings. Even increasing it a fraction higher than the default 1x will result in staggering FPS drops.
And now, regarding your R9 380... that is a nice card, yes. It's better than a GTX 960, but it's slower than a GTX 970. You should be capable of near-similar performance to mine; just keep what I said above in mind. The CPU that you named, ''Athlon XII 860K'', I cannot find any record of its existence; the closest model name appears to be the ''Athlon II X4 860K''. Anywho, that specific processor is definitely not on the higher end. It may cost you a good 5-10 frames compared to my i7-6700.
With your hardware, at the default Ultra preset (meaning 1x render and shadow quality), I would expect to see an average of... 30-40 FPS. That should answer your performance problem, lol. Hopefully someone else who knows how can respond to your next two questions. =)
The CPU that you named, ''Athlon XII 860K'', I cannot find any record of its existence. The closest model name appears to be an ''Athlon II X4 860K"
You're 100% correct, it's an X4 860K. I haven't slept in almost 36 hours, that was definitely a typo I passed up lol.
It may be a slightly older architecture, but it's a quad core overclocked to 4.4GHz, and from what I could tell, even at stock clocks it shouldn't bottleneck an R9 380 by more than a couple of FPS. Everywhere I searched said they're almost a perfect match as far as system balance goes. I'll see what I get with the default Ultra settings, as I haven't actually tested with 1x renderRes.
Firstly... I say that there's no difference, meaning -- in a slightly cryptic/convoluted way -- that the performance hit is so insanely high for the visual quality it offers, that it's simply not worth it. The performance to quality ratio is completely out of whack, with those two settings. Even increasing it a fraction higher than the default 1x, will result in staggering fps drops.
As far as the efficiency of the renderRes and shadowRes goes, maybe it's something on OptiFine's/the shaders mod's end; possibly some optimization is needed as to when exactly these render changes are actually processed. Just speculation, though. I'd love to see it working at 2x+, though, as someone used to the 8x SSAA or 16x AA built into OptiFine; it looks a little off to me lol.
Could you run Cinebench R15 on your CPU and post the single-core score you get? I'd like to compare it to my Xeon.
Running an HD 7950 on an X5670 @ 4.15GHz with HT off (so 4 cores effectively). Since my favourite server is 1.7, I didn't bother using anything bar OptiFine/the shaders mod for 1.7.10.
Regarding Continuum 1.2.1 (non-PBR), I'm getting 18-20 FPS at 1080p render resolution (fullscreen), with 70-80% GPU utilisation and just one CPU core getting the job done. Most likely my CPU is the narrow point here.
Minecraft, being a sloppy Java-based game, is really inefficient with the CPU; it's a great example of a heavily CPU-bound, single-core-performance-dependent game.
Physically Based Rendering: open the spoiler. (Explanation taken from Reddit.)
It's a way to make everything look more photorealistic/natural-looking by changing how light bounces off everything. Since lighting controls how you see everything in the game (because you couldn't see in the pitch-black dark), it affects how every single part of the game looks.
Basically: instead of current lighting techniques like using multiple diffuse textures/specular maps for each part of every object in the game to represent different conditions, they can just create 1 texture for each part then artificially define properties like a refractive index to help parameterize a physics model that controls how light and shadow work when rendering frames of an in-game scene that contains that object.
In most implementations, the physics model basically uses a predictive set of converging functions to determine how light from a specific source will reflect/refract off a given surface with different reflectivity and absorption/diffusion characteristics, which then refracts and reflects off other surfaces at (a) different angles, (b) with reduced intensity and (c) a different wavelength, etc.
In productivity terms: people creating textures for in-game assets now have to spend less time creating multiple maps for each surface because they can just say "this panel is steel" or "this seat cushion is leather" with specific reflectivity/diffusion/texture/etc. rather than having to create multiple different copies of them that behave differently under different lighting conditions (such as in space, in atmosphere, indoors, etc.)
In visual terms: different types of surfaces (such as metal, leather, plastic, glass, etc.) should look more photorealistic and more "natural" because the way that light reflects off them and the way shadows are created will be more accurate.
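To make the "physics model" part a bit more concrete, here's a minimal sketch of one building block most PBR pipelines share: the Fresnel-Schlick approximation, which estimates how much light a surface reflects depending on viewing angle and a per-material base reflectivity (F0). This is a generic illustration, not Continuum's actual shader code:

```python
def fresnel_schlick(cos_theta, f0):
    """Schlick's approximation of Fresnel reflectance.

    cos_theta: cosine of the angle between the view direction and the
               surface normal (1.0 = looking straight at the surface).
    f0:        base reflectivity at normal incidence; a material property
               (roughly 0.04 for dielectrics like plastic, far higher for metals).
    """
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

# Looking straight at a plastic-like surface: reflectance stays near f0.
head_on = fresnel_schlick(cos_theta=1.0, f0=0.04)   # 0.04

# Grazing angle: almost everything reflects, which is why even dull
# surfaces (water, wet roads) look mirror-like near the horizon.
grazing = fresnel_schlick(cos_theta=0.05, f0=0.04)  # ~0.78
```

The "this panel is steel" workflow described above amounts to choosing F0 (plus roughness and similar parameters) per material, instead of hand-painting separate specular maps for every lighting condition.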
An update from me, I finally got the framerates I was looking for. Running with nice sharp shadows, 1.5x render res, and most effects enabled at a very solid 30 fps. This shader is so gorgeous.
I have a question for some shader-savvy people now. I'm making a resource pack specifically for Continuum, and I need to know which maps this pack supports, and in what file formats. Currently I have normal and specular maps for my ores, and they look great, but I need to know how to make certain surfaces ONLY use the specular map while it's raining (like how the grass looks when "specular on every texture pack" is on). I can make speculars, but they are always on, making the grass sparkle when it's dry...
So:
1. What maps does Continuum support (normal, spec, height, dudv, AO, etc.)?
2. With those maps, what file types can be loaded by Continuum/OptiFine? Is it PNG only? Can I use TGA?
3. What are the name extensions for each supported map (specular _s, normal _n, etc.), or does it not matter?
4. How do I do state-based mapping (not sure what it's called; only using certain maps under certain conditions)?
5. Is embedded mapping supported? If so, what maps can be embedded into what maps?
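For what it's worth, the shaders-mod convention I'm aware of (worth double-checking against the OptiFine documentation) is that normal and specular maps are PNGs sitting next to the base texture with `_n` and `_s` suffixes, e.g. `diamond_ore.png`, `diamond_ore_n.png`, `diamond_ore_s.png`. Here's a small script sketch to audit a textures folder for missing companion maps; the folder path in the example is just a placeholder:

```python
import os

# Convention assumed here: base.png + base_n.png (normal) + base_s.png (specular).
SUFFIXES = {"_n": "normal", "_s": "specular"}

def audit_textures(folder):
    """Report which base textures lack a normal or specular companion map."""
    files = set(os.listdir(folder))
    missing = {}
    for name in sorted(files):
        stem, ext = os.path.splitext(name)
        if ext != ".png" or stem.endswith(tuple(SUFFIXES)):
            continue  # skip non-PNGs and the map files themselves
        absent = [kind for suf, kind in SUFFIXES.items()
                  if f"{stem}{suf}.png" not in files]
        if absent:
            missing[name] = absent
    return missing

# Example usage (hypothetical path):
# print(audit_textures("resourcepack/assets/minecraft/textures/blocks"))
```

Running it over a pack's blocks folder lists each base texture that still lacks a normal or specular companion, which is handy before testing in-game.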
The main issue was his card.
The whole thing about Minecraft being processor based is true, Minecraft does suffer from CPU bottlenecks rather easily. However, this isn't quite the case when GPU intensive shaders are involved.
Basically, what you need for the maximum FPS in Minecraft is a balanced system. Your CPU and GPU should be around the same level.
To explain a bit further, vanilla Minecraft has simple graphics, which means dedicated GPUs can easily push out hundreds of FPS, however the processor is usually what holds them back from getting that great performance. When you add shaders into the mix of things, the GPU is being stressed way more than the processor is, and in that situation the processor doesn't matter as much as it would otherwise.
Ehh.... The HD 7850 was, and still is, a powerful card. It has the GTX 750 Ti, beaten by a long shot. This is coming from a Nvidia fangirl, by the way - I adamantly avoid AMD, and attempt to persuade people from ever purchasing anything made by AMD. Here, have a look at this benchmark record, against the 750 Ti: http://gpu.userbenchmark.com/Compare/Nvidia-GTX-750-Ti-vs-AMD-HD-7850/2187vs2182
Anyways, the i3 that he owned at the time, was an incredibly poor CPU. Even my old Core 2 Quad Q6600 was more powerful than it. So, I'd argue that in this case, the problem was certainly the CPU. But yes... Shaders, specifically SEUS and its variants like Continuum, are extremely demanding on GPUs. I will never deny that.
• Ryzen 9 3900x @4.2GHz
• 64GB DDR4 @3200MHz
• MSI RTX 2080 Ti Gaming X Trio
• 2TB NVMe SSD
• 6TB in HDD space
• Four monitors at 4k
ehhh...his i3 540 back in the day is better than your q6600...time to research a bit more there. Also, an i3 wont bottleneck with shaders. His 7850 was the bottleneck, the i3 was fine.
Sushi Shaders - a fresh and (objectively/somewhat) realistic shader, approved by Past Life Pro, made by yours truly. Works on Mac OSX
You know, purposely mocking someone isn't okay. Regardless of being right or wrong, you ought to be kind to others. And no, the Q6600 is more powerful than the i3-540.
http://www.cpubenchmark.net/compare.php?cmp[]=738&cmp[]=1038
Regardless, let's not derail this thread any further. If you honestly feel the need to continue a debate, you can always private message me.
• Ryzen 9 3900x @4.2GHz
• 64GB DDR4 @3200MHz
• MSI RTX 2080 Ti Gaming X Trio
• 2TB NVMe SSD
• 6TB in HDD space
• Four monitors at 4k
Hmm..guess it depends on where you look. cpu-world and cpu.userbenchmarks both show that the i3 is better than the q6600, while cpuboss shows that the results alternate. Idk what to take from that then sorry if my initial reply seemed to be mocking. That wasnt my intention.
Either way, neither the q6600 or i3 540 will be a bottleneck for shaders, especially not with a 7850. The gpu is most certainly the bottleneck when it comes to shaders.
Sushi Shaders - a fresh and (objectively/somewhat) realistic shader, approved by Past Life Pro, made by yours truly. Works on Mac OSX
Aw, don't do that. It's one thing to have preference, but we need more equal competition in the GPU space in order for future products to actually improve. When one company has significantly more market share than the other, it gets to a point where progression comes to a stand still. I know what you mean though, I understand the preference for Nvidia.
It's hard to use general GPU benchmarks to measure Minecraft performance. Newer generation "Polaris" cards might improve or even fix this problem, but AMD's current GCN architecture seems to really hate Minecraft's OpenGL implementation for some reason. That's why in the case of Minecraft, a GTX 970 outperforms an R9 390 by a significant margin.
Bro! all of you guys you just keep saying I have oldest oooolllldeeest graphics card
But when I updated my graphics drivers to latest
andd.....
magicc!!!!!!!!!!!!
All of them are working!! (low fps but they are working!!!)
I seriously can't believe I said you @Strum355 that if I update my drivers
Those settings do actually do something, if you know what to look out for of course. Render Quality can be compared to the resolution its being run at. The higher the render quality, the less aliasing etc.
Shadow quality is the Shadow Resolution. Increasing it will increase the shadow resolution while lowering it lowers the shadow resolution.
That being said, still run with both of them at x1. I personally find x4 fxaa to be a much better option. Also...slightly disappointed in your not using normal or specular maps ;3 xD
Sushi Shaders - a fresh and (objectively/somewhat) realistic shader, approved by Past Life Pro, made by yours truly. Works on Mac OSX
It's good to hear that you found a solution. However, your GPU is horrifically underpowered for any SEUS variant. I'm honestly shocked that you managed to get a Nvidia GT 620 to work with this pack and have it be playable. Even the creator of SEUS; Cody Darr, struggled to have over 30fps on his old GTX 550. That's what he claimed, in an old Facebook post of his anyways.
I'd still suggest that you purchase a better GPU in the future, regardless of getting this pack running. Looking back at my reply to you, I'm not exactly sure why I said that you need to build a new computer lol. You certainly do need more RAM, but your CPU is fine.
I played with SEUS, on an Intel Pentium E5300 for quite a long period of time before I was able to update. That said, I also had 4GB of RAM and a GTX 550 Ti at the time. My general fps was around 20-30 with that old build. Anyways, you could easily upgrade to 8GB of RAM and put a new GPU into your PC for under $200. Look for at least a GTX 560; you can find those used, for under $100 sometimes.
Well, the texture pack I use doesn't support either of them; I use my own personalised version of Misa. Her pack is the best one in existence - in my opinion. Prior to the project being taken up by Les, after she couldn't work on it any further, I was floating around a bit and mostly just using Faithful, or Chroma Hills when I used SEUS. Now that Les has taken it upon themselves to update it to 1.9 and 1.10, I've started using it again.
As for the other stuff you mentioned... the difference in quality is so minimal (yes, it's noticeable) that it doesn't seem worth it, to turn above 1x. So, I always make certain to make it clear to those who ask, that it's pointless to turn those two quality levels up.
• Ryzen 9 3900x @4.2GHz
• 64GB DDR4 @3200MHz
• MSI RTX 2080 Ti Gaming X Trio
• 2TB NVMe SSD
• 6TB in HDD space
• Four monitors at 4k
Had a look at Misa. Its nice but a bit plain from what the screenshots showed. However ill give it a go, might be better with shaders for the others, yea, hence i said i keep it at x1 as well myself. Theres always someone out there who will turn it up, but for most, theres minimal gains to be gotten from it
Sushi Shaders - a fresh and (objectively/somewhat) realistic shader, approved by Past Life Pro, made by yours truly. Works on Mac OSX
Lol .. and i can upgrade ram to 4gb only and gtx 560!?! nah its out of my budget man india is different
I have a couple of things to ask/say here. First, I don't understand why people are saying there's not much difference between 1x and 2x Render Quality, as there is a HUGE difference to me. Even running at 1.5x, the amount of aliasing and general blur is unbearable to me, with any shader. Shadow Quality I usually keep at 1x, with a higher res, or 2x with 1024.
On to my questions:
1. I have a Giga R9 380 and an Athlon XII 860K, and I'm getting lower framerates than I feel I should be with this card on this pack, even using the medium profile. Are there a set of settings that should be set a certain way to get better performance? What should I be seeing with my card at 1080 with the extreme profile? Are there still bugs or performance issues with 1.9.4?
2. I am in love with the water from this pack. The refraction (which I think could use a little boost in visual strength btw) makes it absolutely stunning to look at. Is there any way you could create a profile optimized for just the water effects and decent FPS? Maybe add some simple lighting effects like sunglare, godrays, etc, but DEFINITELY give an option to turn everything on/off individually, including the dynamic lighting/cast shadows (which seems to be one of the biggest fps hit, at least for me), and if you could add options for effect strength for everything too... I know it's a lot to ask, but I know there are people out there (like me) who want JUST water shaders, and these are BY FAR the best looking, or others who don't have a strong enough rig to use all the amazing effects offered here. Offering the continuum in separately grouped packs (water, shadows and lighting, weather effects, etc) will increase the user base for these (hell, put them behind adf.ly, we don't mind). Maybe even make this modular some how by letting users choose which effects they want, then have a script compile just those feature. Continuum is wayyyy to good to have such a small group of people using them, so in my opinion everything that can be done to increase how many people use these, should be done. Please at least consider this...
3. I ran the cinematic profile just for the hell of it, and I think there's a rendering issue with one of the cloud settings. I haven't narrowed it down yet, but it looks like some kind of volumetric cloud setting that renders as flat vertical/horizontal sheets instead of a volume. I can load the profile back up and get a screenshot if needed.
I can answer your first question, but the next two aren't something I'm capable of helping with. Firstly... when I say there's no difference, I mean -- in a slightly cryptic/convoluted way -- that the performance hit is so insanely high for the visual quality it offers that it's simply not worth it. The performance-to-quality ratio is completely out of whack with those two settings. Even increasing it a fraction higher than the default 1x will result in staggering FPS drops.
And now, regarding your R9 380... that is a nice card, yes. It's better than a GTX 960, but slower than a GTX 970. You should be capable of near-similar performance to mine; just keep what I said above in mind. The CPU that you named, ''Athlon XII 860K'', I cannot find any record of its existence. The closest model name appears to be an ''Athlon II X4 860K''. Anywho, that specific processor is definitely not on the higher end. It may be the cause of a good 5-10 frame loss when compared to my i7-6700.
With your hardware, at the default Ultra preset (this means 1x render and shadow quality), I would expect to see an average of... 30-40 FPS. That should answer your performance question, lol. Hopefully someone else who knows how can respond to your next two questions. =)
• Ryzen 9 3900x @4.2GHz
• 64GB DDR4 @3200MHz
• MSI RTX 2080 Ti Gaming X Trio
• 2TB NVMe SSD
• 6TB in HDD space
• Four monitors at 4k
You're 100% correct, it's an X4 860K. I haven't slept in almost 36 hours; that was definitely a typo I passed up, lol.
It may be a slightly older architecture, but it's a quad core overclocked to 4.4GHz, and from what I can tell, even at stock clocks it shouldn't bottleneck an R9 380 by more than a couple FPS. Everywhere I searched said they're almost a perfect match as far as system balance goes. I'll see what I get with the default Ultra settings, as I haven't actually tested with 1x render resolution.
As for the efficiency of the render and shadow resolution settings, maybe it's something on Optifine's/the shader mod's end -- possibly some optimization is needed in when exactly these resolution changes are actually processed. Just speculation, though. I'd love to see it working at 2x+, as being used to the 8xSSAA or 16xAA built into Optifine, it looks a little off to me, lol.
Could you run Cinebench R15 on your CPU and post the single-core score you get? I'd like to compare it to my Xeon.
I'm running an HD 7950 on an X5670 @4.15GHz with HT off (so 4 cores effectively). Since my favourite server is 1.7, I didn't bother using anything besides Optifine/the shader mod for 1.7.10.
With Continuum 1.2.1 (non-PBR) I'm getting 18-20 FPS at 1080p render resolution (fullscreen), with 70-80% GPU utilisation and just one CPU core doing all the work. Most likely my CPU is the bottleneck here.
Minecraft, being a sloppy Java-based game, is really inefficient with the CPU; it's a great example of a heavily CPU-bound, single-core-performance-dependent game.
What is PBR?
Physically Based Rendering; open the spoiler (taken from Reddit):
It's a way to make everything look more photorealistic/natural-looking by changing how light bounces off everything. Since lighting controls how you see everything in the game (because you couldn't see in the pitch-black dark), it affects how every single part of the game looks.
Basically: instead of current lighting techniques like using multiple diffuse textures/specular maps for each part of every object in the game to represent different conditions, artists can create one texture for each part, then artificially define properties like a refractive index to help parameterize a physics model that controls how light and shadow work when rendering frames of an in-game scene containing that object.
In most implementations the physics model basically uses a predictive set of converging functions to determine how light from a specific source will reflect/refract off a given surface with different reflectivity and absorption/diffusion characteristics, which then refracts and reflects off other surfaces at a) different angles, b) with reduced intensity, and c) a different wavelength, etc.
In productivity terms: people creating textures for in-game assets now have to spend less time creating multiple maps for each surface because they can just say "this panel is steel" or "this seat cushion is leather" with specific reflectivity/diffusion/texture/etc. rather than having to create multiple different copies of them that behave differently under different lighting conditions (such as in space, in atmosphere, indoors, etc.)
In visual terms: different types of surfaces (such as metal, leather, plastic, glass, etc.) should look more photorealistic and more "natural" because the way that light reflects off them and the way shadows are created will be more accurate.
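To make the "define physical properties instead of authoring per-condition maps" idea concrete, here's a minimal sketch (mine, not anything from Continuum or SEUS) using Schlick's Fresnel approximation, one of the common converging functions mentioned above. All names and the `f0` values are illustrative assumptions:

```python
# Minimal PBR-style shading sketch: one physical parameter (f0, the base
# reflectance) stands in for "this is steel" vs "this is leather", instead
# of separate hand-made specular maps for every lighting condition.

def schlick_fresnel(f0: float, cos_theta: float) -> float:
    """Schlick's approximation: reflectance rises toward 1.0 at grazing angles."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def shade(base_color: float, f0: float, cos_theta: float, light: float) -> float:
    """Blend a simple diffuse term with a Fresnel-weighted specular term."""
    fresnel = schlick_fresnel(f0, cos_theta)
    diffuse = base_color * (1.0 - fresnel)  # energy not reflected is diffused
    return light * (diffuse + fresnel)

# "This panel is steel" vs "this cushion is leather" reduces to different f0:
steel   = shade(base_color=0.5, f0=0.95, cos_theta=0.7, light=1.0)
leather = shade(base_color=0.5, f0=0.04, cos_theta=0.7, light=1.0)
# Metal reflects far more light at the same viewing angle.
```

Real shader packs do this per color channel in GLSL with full microfacet models, but the productivity win is the same: swap the material parameter, not the texture set.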
Sushi Shaders - a fresh and (objectively/somewhat) realistic shader, approved by Past Life Pro, made by yours truly. Works on Mac OSX
Thanks Strum355
I searched the internet and couldn't find a definition (how did you search for it?).
Last question: what is DOF?
Depth of Field?
Nevermind, I found the explanation on reddit.
I'll recheck which shader version I chose in Minecraft; that could explain the visual circle I saw last night...
That would be correct. =)
Here's a rather informative video that both explains what it is, and explains how to use it on default SEUS:
An update from me: I finally got the framerates I was looking for. Running with nice sharp shadows, 1.5x render res, and most effects enabled at a very solid 30 FPS. This shader is so gorgeous.
I have a question for the shader-savvy people now. I'm making a resource pack specifically for Continuum, and I need to know what maps this pack supports, and in what file formats. Currently I have normal and specular maps for my ores, and they look great, but I need to know how to make certain surfaces ONLY use the specular map while it's raining (like how the grass looks when "specular on every texture pack" is on). I can make speculars, but they are always on, making the grass sparkle when it's dry...
So:
1. What maps does Continuum support (normal, spec, height, dudv, AO, etc.)?
2. With those maps, what file types can be loaded by Continuum/Optifine? Is it PNG only? Can I use TGA?
3. What are the name suffixes for each supported map (specular_s, normal_n, etc.), or does it not matter?
4. How do I make state-based mapping (not sure what it's called, but only using certain maps under certain conditions)?
5. Is embedded mapping supported? If so, what maps can be embedded into what maps?
TIA
A little preview at my almost completed ores: