Shader effects on Linux (and possibly other platforms)

Shader effects on Linux (and possibly other platforms)

#1 Post by mikekchar »

In a thread in General a question was asked about shader effects not showing up: http://forums.te4.org/viewtopic.php?f=38&t=44172

I've had the same problem on all of the Linux platforms that I run the game on. Now, I must admit that I am a complete newbie to OpenGL, so what I'm about to say may be completely wrong. :-) Anyway, in src/shaders.c:664, there is a check to see whether GLEW_EXT_gpu_shader4 is supported by the GPU and driver; if not, most of the shaders are disabled. The problem is that (as far as I understand) Mesa (the OpenGL implementation on Linux) does not support this extension, and I am led to believe it never will. Instead, there is equivalent core functionality which is similar, but not exactly the same. At the moment, Mesa supports OpenGL 3.3, which includes that core functionality.

So for fun, I modified shaders.c to allow the use of shaders and indeed almost everything works perfectly. I've played a paradox mage up to level 23 with barely a hitch and gone through quite a few of the effects. When I say "barely", there are a couple of problems: When fire is used (bolt_fire, fiery_hands, etc), the CPU goes to 100% for several seconds, after which the effect works perfectly. As long as a single fire effect is loaded in the shaders, everything works fine. The other minor problem is that spikes don't seem to work. There is a background graphic (beige) and nothing seems to happen -- having never seen it before, I'm not sure what to expect, but it doesn't look right.

So I think that this core implementation is very close to working. I've spent some time trying to find out exactly what the problem is with fire, but have not had much success. It looks like an interesting challenge so I'd like to continue playing with it. I have a few questions, though:

1) Am I completely wasting my time? (i.e., this is a known issue which is unfixable / I've got it completely wrong / etc)
2) Is there any place but here where development issues are discussed?
3) Are there any tips you can provide for debugging shader issues? Even simple things would be good. For example, to test things I keep starting the game, shutting it down, modifying the code, and starting it up again, over and over. It's fairly time-consuming. Is there a faster/easier way to test things?
4) Similar to 3, I noticed that the shader code is loaded dynamically. Is there any way to easily unload a shader in the game, modify the source, and then have it load again without shutting down the game?

Any other tips would be most welcome. I've got 20+ years of experience as a professional programmer -- mostly C++, and lately web development -- but no experience with OpenGL or Lua. I mostly understand what's going on (and the code is not hard to understand at all!), but I think it will still take me quite a while to work efficiently with this technology.

Re: Shader effects on Linux (and possibly other platforms)

#2 Post by mikekchar »

Semi-bump :) I've decided to actually get around to seeing what I can do about this.

Some more research seems to confirm that GL_EXT_gpu_shader4 is not supported at all in Mesa, which means that most Linux users will not be able to get the shaders working to any great extent. I figure I'll try to replace any of the unsupported calls with non-EXT equivalents (my naive understanding is that it shouldn't be too difficult).
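
To make that concrete (and hedged -- I haven't yet checked which EXT calls, if any, the game's shaders actually use), GL_EXT_gpu_shader4 spells an integer texel fetch as texelFetch2D(), while core GLSL 1.30 provides the same thing as plain texelFetch():

Code: Select all

#version 130
// Sketch of the core-profile spelling. Under GL_EXT_gpu_shader4 the
// equivalent call is texelFetch2D(); in core GLSL 1.30 the overload is
// just texelFetch() and the integer types are built in.
uniform sampler2D tex;
out vec4 fragColor;

void main()
{
    ivec2 coord = ivec2(gl_FragCoord.xy);   // integer pixel coordinate
    fragColor = texelFetch(tex, coord, 0);  // fetch the texel at mip level 0
}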

I'll post here with any progress.

Edit: Given the lack of responses to my last post, I assume not many people are interested, so I'll just edit this post as I go.

First, looking at the code again with fresh eyes, I realize that there is no actual use of the gpu_shader4 extension, which is why the code works even though Mesa doesn't support it :-)

The problem with the game hanging for a few seconds with certain shaders seems to be due to compilation. Looking at the IR (intermediate representation) for one of the offending shader programs reveals that it is something like 11,000 lines long :-) Shader programs that don't have a problem are more reasonably sized (around 50 lines or so).

I think the problem *may* be that Mesa inlines all subroutines and sometimes gets it horribly wrong. Looking at the resulting IR seems to support this theory (the same thing repeated over and over and over again). Armed with this knowledge, I'm going to try to rewrite some of the shaders to see if it helps.

Note: if anyone wants to get to the point I did, setting the environment variable MESA_GLSL=dump,nopt dumps the unoptimized shaders to stdout when they are compiled.

Edit 2: The mainline for shockwave2.frag results in the creation of 3000 temporary variables :-P I have yet to figure out exactly what is triggering it.

Edit 3: Found the problem. It's the repeated application of snoise. Not entirely sure what to do about it, but at least I know where to start. I'll have another look tomorrow.

Last edit tonight (I promise): I'm 99% sure that the issue is triggered when generating code like

Code: Select all

permute(permute(permute(blah) + blah) + blah)

In Fireball.frag it also has

Code: Select all

Uberblend(blah, Uberblend(blah, blah))

So basically, it seems like Mesa freaks out if you call the same function inside its own parameter list (which I can understand). That should be very easy to fix -- along the lines of the sketch below. Like I said, I will try tomorrow...
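
For reference, the rewrite I have in mind is nothing fancier than introducing temporaries so the compiler only ever sees one call per expression. A sketch (the variable names are placeholders, not the real snoise inputs):

Code: Select all

// Before: three nested calls in one expression (placeholder arguments)
// vec4 p = permute(permute(permute(a) + b) + c);

// After: the same computation with explicit temporaries
vec4 p0 = permute(a);
vec4 p1 = permute(p0 + b);
vec4 p  = permute(p1 + c);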

Re: Shader effects on Linux (and possibly other platforms)

#3 Post by mikekchar »

Well, for those of you following along, it is *not* what I thought it was. This is frustratingly difficult to debug because even if you tell Mesa not to optimize, it clearly does (i.e., returning 42.0 from snoise fixes the problem, but only by optimizing all the other code out of the function). Still, I'm relatively sure the problem is in snoise for shockwave2.frag. I really hope that once I figure out what it is, it's easy to fix, because this is a painstaking process that I wouldn't like to repeat on every shader!

Edit: Sigh... Have to go to work now. I'm guessing that the only problem is actually snoise; Mesa just does not like it. Doing some googling around, I realized that this is the Ashima noise generator, which is highly recommended by virtually everyone, so I'm surprised this issue hasn't been raised before. I'm starting to wonder if it might only occur on my Intel hardware. When I get a chance, I'll give it a spin on my Radeon box (also with Mesa drivers). There is nothing I can see here that looks particularly likely to be the problem. Possibly a Mesa bug? Possibly people usually precompile their shaders (I still have a lot of catching up to do WRT OpenGL, so I really don't know)?

Re: Shader effects on Linux (and possibly other platforms)

#4 Post by mikekchar »

If anyone is looking at trying to decipher GLSL and has no idea, I have discovered that the OpenGL website has a wiki that is *much* easier to understand than the spec :lol:

https://www.opengl.org/wiki/Data_Type_(GLSL)

Anyway, I'm going to try to put in another hour or so today to see if I can make any progress. I will post here. It's frustrating because it is so agonizingly close to working ;-)

Edit: Not really any closer, other than starting to actually understand how the code works. I noticed that the calls to GetFireRingColor appear to be made under non-uniform flow control. If I understand correctly, gl_TexCoord is an "in" variable as opposed to a "uniform" variable; since radius is calculated from it, the if statement that tests radius creates non-uniform flow control. So I moved the calls to GetFireRingColor out of the if statement (roughly as in the sketch below), but it didn't seem to help (or hurt) anything. I didn't really have high hopes that it would, but you never know.
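
For anyone following along who hasn't met the term, this is roughly the shape of the change (a simplified sketch, not the game's actual fireball.frag):

Code: Select all

// Values derived from gl_TexCoord vary per fragment, so branching on them
// is non-uniform flow control. Hoisting the call / texture read out of the
// branch keeps the sampling in uniform flow; only the *use* is conditional.
uniform sampler2D fireTex;

void main()
{
    vec2 uv = gl_TexCoord[0].xy;
    float radius = length(uv - vec2(0.5, 0.5));
    vec4 ring = texture2D(fireTex, uv);        // sampled unconditionally
    if (radius < 0.5)
        gl_FragColor = ring;                   // conditional use only
    else
        gl_FragColor = vec4(0.0, 0.0, 0.0, 0.0);
}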

I don't really understand how GetFireDelta works. I'm assuming the texture that is being used has hue on the x-axis and intensity on the y-axis, and that GetFireDelta is choosing the intensity of the colour based on position (with noise). What I don't understand is why there need to be so many calls to snoise. I guess it's to add noise to all of the bits from low order to high order, but intuitively I can't quite grasp why that's important (if my previous assumption is correct, then I don't quite see why the low-order bits matter that much). However, snoise still has problems compiling even if you only call it once, so it's a moot point.
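
If I had to guess, the repeated calls are the usual octave-summation trick: sample the noise at doubled frequencies and halved amplitudes and add the results. Something like the sketch below -- which is my guess, not the actual GetFireDelta code, and it assumes snoise is the Ashima simplex function already defined elsewhere in the shader:

Code: Select all

// Guess at the pattern: ordinary octave (fBm) noise built from snoise().
float fireNoise(vec3 p)
{
    float sum = 0.0;
    float amp = 0.5;
    for (int i = 0; i < 4; i++) {
        sum += amp * snoise(p);   // snoise() as defined earlier in the shader
        p   *= 2.0;               // next octave: double the frequency
        amp *= 0.5;               // halve the amplitude
    }
    return sum;
}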

Anyway, I'll poke at it again tomorrow. Armed with a vague understanding of what is going on, I hope I can make progress.
