Install the mod, test it with all the different shaders, and tell me if anything different happens. It may fix the problem, or it may not. My fingers are crossed.
Lol, I suppose I should release that shader just for you guys to play with. It's just a weird looking effect. I've been working on a shader that actually curves the world around on itself.
This other video is just a teaser of sorts of what else I'm cooking up:
Thanks so much for sharing this. So much fun to run around with, gives a different perspective to old things.
Two questions, if anybody is curious.
First: would it be possible to add another distance variable to that equation, so that the perceived changes from the player's POV are uniform (i.e., the greater the distance from the player, the larger the wave)? As it stands, mountains in the distance don't simply look "trippy," they look almost unrecognizable, because the wave appears so small at that distance. I unfortunately know next to nothing about code, but this sounded like it would be possible, and I think it would turn the distortion shader from the above video into one players could actually use for legitimate gameplay (it'd just be a matter of getting over the learning curve, like in Portal).
Second: Mojang is planning on adding worlds with limited sizes in the future (I'm almost positive I read that recently). If the world were limited and created in a shape that tessellates, I'm pretty sure some of these curvature shaders could fairly easily give the illusion of planets. Am I wrong?
The things possible simply by changing how the world is rendered are amazing. Thanks so much again to everybody who's put time and effort into this!
-Thayere
Just figure out a function you would like in the form of y = f(x) and it would be really easy to do.
If you're thinking something more along the lines of this:
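As a sketch of "a function in the form y = f(x)" applied per vertex, with the distance scaling Thayere asked about: scale the wave's amplitude by the eye-space distance, so the wave looks the same size everywhere on screen. This is hypothetical GLSL; the `time` uniform is an assumption about what the mod supplies, not something confirmed here.

```glsl
// base vertex shader -- sketch only: wave amplitude grows with
// distance from the player, so the distortion looks uniform on screen.
uniform float time; // assumed per-frame uniform supplied by the mod

void main() {
    vec4 eyePos = gl_ModelViewMatrix * gl_Vertex; // position relative to the camera
    float dist = length(eyePos.xyz);              // distance from the player
    // y = f(x) wave, with amplitude proportional to distance
    eyePos.y += 0.02 * dist * sin(0.1 * dist + time);
    gl_Position = gl_ProjectionMatrix * eyePos;
    gl_TexCoord[0] = gl_MultiTexCoord0;           // pass texcoords through unchanged
}
```
The constants (0.02, 0.1) are made up; tuning them trades off wave size against how far the geometry is pushed out of place.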
Sent you a log.
While we're on the subject, though: bump mapping sounds awesome and the custom lighting shader looks good. However, would it be possible to implement deferred lighting or a real shadow engine into the game? I don't know much about GLSL, but I assume it wouldn't be impossible.
Edit: another thing I fantasize about is real-time screen-space indirect illumination/global illumination; it would look fantastic in houses and such, though I actually haven't seen any examples of that working in GLSL.
I'd help out writing shaders if I could, but I don't know much about the shader language.
I don't know why, but it's just not working for me. I downloaded it, no errors occur, I press any key to close, and then open Minecraft, but I see no graphics difference.
I run Windows XP.
My graphics card is a
GeForce 9600 GSO 512-bit.
Edit: never mind, I'm just thick-headed ^^
Physical global illumination (that is, not just the per-block blending that's currently happening) is not going to happen anytime soon (or ever). Deferred shading might be possible, and I'll give it a shot once a new MCP comes out.
The technique itself is pretty simple. Instead of the current pipeline:
game engine --> (base.vertexshader + base.fragmentshader does lighting) --> RGBA buffer texture
buffer texture --> (final.vertexshader + final.fragmentshader does post FX like bloom or DoF) rendered to a full-screen quad --> screen
For deferred shading you'd use more buffer textures to transfer color, normal and material data to the final shader where you then do the lighting calculation.
Of course multiple passes would be fantastic, since you could split the rendering up: one pass for sunlight, one pass for each visible torch, then one for bloom and whatever, then blur the final result with a DoF pass.
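A sketch of what the geometry pass of that G-buffer setup might look like (hypothetical old-style GLSL; the buffer layout, sampler name, and varying name are my own assumptions, not the mod's, and it presumes the FBO has two color attachments bound):

```glsl
// base fragment shader, geometry pass: write scene data instead of doing lighting.
uniform sampler2D texture; // assumed name for the block texture atlas
varying vec3 normal;       // assumed to be passed in from the vertex shader

void main() {
    // attachment 0: plain albedo color
    gl_FragData[0] = texture2D(texture, gl_TexCoord[0].st);
    // attachment 1: world normal packed into [0,1] range
    gl_FragData[1] = vec4(normalize(normal) * 0.5 + 0.5, 1.0);
}
```
A final full-screen pass would then read both textures and do the sunlight/torch lighting per pixel, which is what makes the one-pass-per-light idea above feasible.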
****, i'm starting to fantasize again, time to stop.
I think Notch still does some horrible performance stuff, like rendering identical triangles multiple times when two cubes are attached, but that could be an outdated rumor.
I don't do any of the real programming when I work with dax on this, but considering the client might as well be open source to us, what's the likelihood of making an attempt at rewriting the rendering engine, even if just to optimize what we already have? I feel like (and of course, not really being a programmer, I could very well stand to be corrected) that would be the first step to making all of these other modifications less of a challenge.
Normals are just one of the small things that jumps out at me. According to dax, the way Notch handles alpha testing (am I getting that term right? The pig-saddle glitch before the fix) is also just weird.
What if we take the Inception effect backwards and make the world bend in the opposite direction?
It'd give an effect of a round world, but it'd have to be much more subtle.
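A minimal sketch of that "bend the other way" idea: push each vertex down by an amount that grows with the square of its horizontal distance, which is roughly how a sphere's surface falls away locally. Hypothetical GLSL with a made-up curvature constant; it also ignores camera pitch, so it's only a starting point:

```glsl
// curved-world vertex shader sketch: terrain drops off quadratically
// with distance, giving the illusion of standing on a small planet.
const float curvature = 0.0005; // assumed constant; smaller = subtler = bigger planet

void main() {
    vec4 eyePos = gl_ModelViewMatrix * gl_Vertex;
    float dist = length(eyePos.xz);       // horizontal distance from the player
    eyePos.y -= curvature * dist * dist;  // quadratic drop toward the "horizon"
    gl_Position = gl_ProjectionMatrix * eyePos;
    gl_TexCoord[0] = gl_MultiTexCoord0;
}
```
As noted, subtlety is everything here: too large a constant and nearby chunks visibly sag.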
Doesn't look like normals are in yet; gl_Normal is still zero for me. I thought there had been some progress?
Regarding optimization: Optimine has already sort of done that, although you'd probably rewrite a few functions from scratch to swap the renderer instead of the code injection this mod currently does.
Regarding width/height as uniforms for pixel sampling:
// pass the aspect ratio and screen dimensions to the shaders as float uniforms
int aspectRatioU = ARBShaderObjects.glGetUniformLocationARB(program, "aspectRatio");
ARBShaderObjects.glUniform1fARB(aspectRatioU, (float)mc.displayWidth / (float)mc.displayHeight);
int sdisplayWidth = ARBShaderObjects.glGetUniformLocationARB(program, "displayWidth");
ARBShaderObjects.glUniform1fARB(sdisplayWidth, (float)mc.displayWidth);
int sdisplayHeight = ARBShaderObjects.glGetUniformLocationARB(program, "displayHeight");
ARBShaderObjects.glUniform1fARB(sdisplayHeight, (float)mc.displayHeight);
in Shaders.java (from this mod) should do the trick, right? I tried changing it, but apparently the mod ships an already-compiled Shaders.class, so we'll have to wait for dax to take a peek at this.
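For reference, the fragment-shader side would then consume those uniforms something like this (same uniform names as the Java snippet above; the `sampler0` name for the scene buffer is an assumption):

```glsl
// post-processing fragment shader sketch: use screen dimensions
// to compute the UV size of one pixel for neighbor sampling.
uniform sampler2D sampler0; // scene buffer texture (name assumed)
uniform float displayWidth;
uniform float displayHeight;

void main() {
    vec2 texel = vec2(1.0 / displayWidth, 1.0 / displayHeight); // one pixel in UV space
    // e.g. sample the pixel one texel to the right, as an edge/blur filter would
    gl_FragColor = texture2D(sampler0, gl_TexCoord[0].st + vec2(texel.x, 0.0));
}
```
This is exactly the kind of thing that breaks when the window is resized unless the uniforms are re-uploaded each frame.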
Honestly, the stuff we're trying to do here is ridiculously basic: the stuff that comes up in the first few chapters of any OpenGL tutorial. Trying to tie it into the code mess of Minecraft is the ugly part. :sad.gif:
Edit: ah, what the hell, I'll jump on the "hey, let's do silly crap with vertex transformation" bandwagon as well.
and the lighting in motion without the distortion nonsense
if you think that makes you sick now, try it with 3D glasses on
This is amazing.
No, really, this is beyond what anyone could expect, I think, lol.
And one day I asked, "can glow be added?" rofl, silly me....
All I wonder now is: where is this all going?
Of course, when I had the idea of screen-space indirect illumination, I wasn't thinking of physical global illumination like you see in actual renders.
Screen space is faked; it's pretty slow, but I've seen it used pretty well before. It's similar to how screen-space ambient occlusion works, but a bit more complex from what I can gather. I assume it would also be somewhat similar to screen-space reflections (as opposed to cube maps or raytraced reflections) because of the way it determines which pixels to sample, though in the case of indirect illumination it wouldn't be reflection so much as colour bleeding and such. If we had deferred lighting, screen-space GI/II, and HDRI, we would see some really nice screenshots from interiors, at the least (assuming it'd have a huge impact on performance).
I would try to write it up myself if I weren't running an ATI card and knew anything about GLSL shaders.
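To make the "colour bleeding" part concrete, here is a deliberately toy sketch of the idea, nothing like production SSGI: gather colour from a ring of nearby screen pixels and add a fraction of it back as fake bounce light. All names (`sampler0`, the uniforms) and constants are assumptions:

```glsl
// toy screen-space colour-bleed sketch (NOT real indirect illumination:
// no depth/normal weighting, so light bleeds through walls).
uniform sampler2D sampler0; // scene colour buffer (name assumed)
uniform float displayWidth;
uniform float displayHeight;

void main() {
    vec2 uv = gl_TexCoord[0].st;
    vec2 texel = vec2(1.0 / displayWidth, 1.0 / displayHeight);
    vec3 bounce = vec3(0.0);
    // average colour over a sparse 5x5 neighborhood
    for (int i = -2; i <= 2; i++)
        for (int j = -2; j <= 2; j++)
            bounce += texture2D(sampler0, uv + vec2(float(i), float(j)) * texel * 8.0).rgb;
    bounce /= 25.0;
    vec3 base = texture2D(sampler0, uv).rgb;
    gl_FragColor = vec4(base + 0.15 * bounce, 1.0); // add a fraction as "indirect" light
}
```
Real SSGI would weight each sample by depth and normal from a G-buffer, which is why the deferred-shading groundwork discussed earlier in the thread matters.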
@RoadCrewWorker
I have no idea what's going on in that lighting shader apart from the celshading, but it's pretty damn nice.
Fixed some problems with the SSAO shader and tried dialing the parameters up to overly high levels. Produced interesting, if not entirely coherent, results.
Also, who the hell decided that dark grayscale values should be transparent? I lost more of my life than I'd like to admit thinking it was some obscure shader error.
I'm loving those two shaders. And yeah, doesn't it feel fun? Jumping on the vertex changing bandwagon that it.
So I've been poking around at GLSL code a bit, and I'm curious: could you somehow use the "water.png" overlay? I'm talking about the overlay that works much like "pumpkinblur.png". How would you/could you reference that specific texture and then apply a shader only when those overlays are on the screen? That way we could get underwater distortion. I see all these vertex shaders doing the trippy warping and can't help but think they could somehow be used for underwater distortion. I tried to write/modify shaders to get that result, with some interesting results but no luck. Couldn't you just grab the front layer of the Z buffer and apply a sin to it or something? Or something like:
if texture sampler0 is present, "warp the vertices"
else stop the loop or whatever, maybe reset the float value?
Again, I'm just starting to dabble in programming, and I'm just wondering if something like this has been or could be done.
I hope someone understands what I mean and I'm not talking out of my ass here.
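The underwater idea can be sketched as a final-pass fragment shader that wobbles the sample coordinates with sin, gated by a flag the mod would have to set when the player is underwater. To be clear, the `isUnderwater` uniform here is my invention, not something the mod provides; detecting the water overlay would have to happen on the Java side:

```glsl
// underwater distortion sketch: sin-wobble the screen UVs when submerged.
uniform sampler2D sampler0; // scene buffer (name assumed)
uniform float time;         // assumed per-frame uniform
uniform int isUnderwater;   // hypothetical flag: 1 when the player's head is in water

void main() {
    vec2 uv = gl_TexCoord[0].st;
    if (isUnderwater == 1) {
        // small crossed sine waves give a watery ripple
        uv.x += 0.005 * sin(uv.y * 40.0 + time * 2.0);
        uv.y += 0.005 * sin(uv.x * 40.0 + time * 2.0);
    }
    gl_FragColor = texture2D(sampler0, uv);
}
```
Doing it as a screen-space post effect like this avoids warping the actual geometry, which is closer to how the real game's underwater wobble ended up working anyway.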
Sorry, but please, for God's sake, don't use that texture pack when recording videos. Use the default, so as many people as possible can see the difference at its full potential without having their retinas burnt from the ugliness.
The shader looks good; looking forward to better revisions.
No results. :sad.gif: I've tried all the shaders. The only things I've noticed are a lower FPS and a weird block bug.
http://dl.dropbox.com/u/8664592/2011-01-22_13.41.13.png
go ahead and use this shader instead.
THIS! FOR THE LOVE OF GOD THIS!
Awesome, I think that is pretty much what I had in mind. I'll look at the differences between the two and probably be able to figure a little out.
-Thayere
I might have found a clue...
I downloaded lwjgl-2.6 and ran the included shader tests. They all worked except for the last one.
The test described as
failed with the error message "Function is not supported".
Full error report here: http://pastebin.com/u4RM3EvN
Edit: Same result for lwjgl-2.4.2 in Ubuntu 10.10 and in Windows 7 with Catalyst 10.12, all on an ATI 5770.
Edit2: The code for the Shader Uni test: http://pastebin.com/cJREDbJA
Could someone with an NVIDIA card please run the lwjgl Shader uni test (full instructions: http://lwjgl.org/wiki/index.php?title=Downloading_and_Setting_Up_LWJGL) to confirm that NVIDIA cards pass this test?
EDIT: so, on Windows:
download lwjgl: https://sourceforge.net/projects/java-g ... JGL%202.6/
unzip, start a cmd window, cd to the unzip location, and run
and if you see a colored sphere, then it worked.
edit: here's an example;
that looks sexy
I actually didn't realize the shader was in GLSL... seems like it could be simple enough to adapt, in that case.