
Need help with a Shader - coordinate weirdness [2nd post]

  • Hi, shaderheads!

    I have been learning my way around shaders, but all this premultiplied alpha business is still going over my head.

    Here's what I want to do: make an "inline" effect that bases the brightness of a given pixel on the inverse alpha value of nearby pixels. My code looks like this at the moment:

    /////////////////////////////////////////////////////////
    // Selection inline effect
    //
    varying mediump vec2 vTex;
    uniform lowp sampler2D samplerFront;

    precision lowp float;
    uniform lowp float pixelWidth;
    uniform lowp float pixelHeight;

    void main(void)
    {
        lowp vec4 front = texture2D(samplerFront, vTex);

        float A = 1.0 - texture2D(samplerFront, vTex + vec2(0.0, pixelHeight)).a;
        float B = 1.0 - texture2D(samplerFront, vTex + vec2(0.0, -pixelHeight)).a;
        float C = 1.0 - texture2D(samplerFront, vTex + vec2(pixelWidth, 0.0)).a;
        float D = 1.0 - texture2D(samplerFront, vTex + vec2(-pixelWidth, 0.0)).a;

        float M = clamp(A + B + C + D, 0.0, 1.0);

        front.rgb = vec3(M, M, M);

        gl_FragColor = front;
    }
    
    But the values don't work out the way one might expect. What needs to be done so that we keep the same alpha as the original image AND change the pixel colours?
  • All right, after doing some research (basically reading some other posts here) it looks like I managed to make a working version:

    /////////////////////////////////////////////////////////
    // Selection inline effect
    //
    varying lowp vec2 vTex;
    uniform lowp sampler2D samplerFront;

    precision lowp float;
    uniform lowp float pixelWidth;
    uniform lowp float pixelHeight;
    uniform lowp float width;

    void main(void)
    {
        lowp float Alpha = texture2D(samplerFront, vTex).a;

        lowp float A = 1.0 - texture2D(samplerFront, vTex + vec2(0.0, pixelHeight * width)).a;
        lowp float B = 1.0 - texture2D(samplerFront, vTex + vec2(0.0, -pixelHeight * width)).a;
        lowp float C = 1.0 - texture2D(samplerFront, vTex + vec2(pixelWidth * width, 0.0)).a;
        lowp float D = 1.0 - texture2D(samplerFront, vTex + vec2(-pixelWidth * width, 0.0)).a;

        lowp float M = clamp(A + B + C + D, 0.0, 1.0);

        gl_FragColor = vec4(vec3(M, M, M) * Alpha, Alpha);
    }
    
    Initially it works just as expected:
    [img="http://i.imgur.com/dd3M4bs.png"]
    
    But when the sprite is moved, strange artefacts appear:
    [img="http://i.imgur.com/bicSiUQ.png"]
    
    And if it's rotated forward and back, the artefacts get even worse:
    [img="http://i.imgur.com/Rs4NluD.png"]
    
    I tried changing all the precision settings, but it didn't help. Perhaps someone can explain why this is happening? Perhaps that someone could be Ashley (I realise you are very busy, but I really would like to understand the Shader behaviour). I'm attaching the Shader itself for easier testing if necessary.

  • It looks like you need to premultiply by your calculated alpha value M, not the original alpha value from the pixel.
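
    A minimal sketch of what that suggestion might look like, using the same names as the shader above (this is my reading of the advice, not tested code) - replace the final line of main() with something like:

    ```glsl
    // Premultiply the output colour by the computed value M and also
    // output M as the alpha, so colour and alpha stay consistent
    gl_FragColor = vec4(vec3(M, M, M) * M, M);
    ```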

  • But I need to keep the same Alpha as the original image - only change the colours. If I multiply by the calculated alpha then everything gets filled, which isn't desired:

    Besides, if the sprite is moved the same artefacts appear:

  • Here's an additional demonstration - could it be that either the pixelWidth and pixelHeight values are acting weird or something is equally weird with alpha when a sprite is moved?

    In theory the sprite gets moved by exactly a given number of pixels - 128 or 64 or so. As we can see, the effect acts differently when a sprite is moved somewhere and then comes back. Any idea why that is?

    I would do further research, but since there's no way to debug a Shader I have to ask here...

    Edit: More info:

    The strangeness happens if there is any other Shader present - it seems as if having a different Shader in the stack also "adds" some extra space around the sprite, even without the expand properties being used. Here's the "inline" with another Shader in the stack - it seems like it is able to "detect" the surrounding pixels and properly mark the inside of the sprite, since there's 0 alpha around.

    And without any other Shader it looks like this - nothing is drawn since, apparently, the Shader cannot read the 0 alpha from outside of the sprite bounds.

  • Looks like subpixel interpolation to me. No way around that.

    You might see what it looks like in other browsers however.
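
    If subpixel interpolation is indeed the culprit, one common workaround (a sketch only, assuming pixelWidth and pixelHeight hold the size of one texel as in the shader above) is to snap the sample coordinate to the nearest texel centre, so bilinear filtering doesn't blend in neighbouring alpha values:

    ```glsl
    // Snap a texture coordinate to the nearest texel centre.
    // Assumes pixelWidth/pixelHeight hold the size of one texel.
    mediump vec2 snapToTexel(mediump vec2 uv)
    {
        mediump vec2 texel = vec2(pixelWidth, pixelHeight);
        return (floor(uv / texel) + 0.5) * texel;
    }
    ```

    Sampling with texture2D(samplerFront, snapToTexel(vTex + offset)) would then always hit a single texel rather than an interpolated mix.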

  • Shameless bump in the hopes that Ashley might shed some light on why the results differ seemingly randomly...

  • I don't know, can you reproduce it on other systems or browsers? It looks like the type of glitch which sometimes comes down to a driver quirk or some aspect of ANGLE (Chrome/Firefox's WebGL wrapper - IE11 uses a different engine)

  • I don't know, can you reproduce it on other systems or browsers? It looks like the type of glitch which sometimes comes down to a driver quirk or some aspect of ANGLE (Chrome/Firefox's WebGL wrapper - IE11 uses a different engine)

    Thanks for the reply - so far I have tested on Chrome/Firefox on two different systems and the weird behaviour is present on both. Will try IE11.

    Since it uses the pixelWidth and pixelHeight values - is there a slim chance that those get iffy somewhere along the way?

  • You should test on as many WebGL-supporting browsers as possible as your very first port of call. This is all essential information and without that we can only speculate.

  • Well, here's what I could gather so far:

    Chrome: Glitches

    Firefox: Glitches

    Opera: Glitches

    nw.js: Glitches

    Safari: Shaders didn't work

    IE11: Shaders didn't work

  • The artifacts appear because your images "touch" the border of the texture, as you can see here:

    So when sampling outside of the texture bounds, it samples the same border pixel (opaque), as if the texture repeated its border to infinity. This causes the glitches.

    You can fix this by manually adding a 1px transparent border to your images or extending the effect bounding box in the shader xml.
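
    For the shader-side direction, here is a sketch of one way to do it - treating anything outside the source rectangle as fully transparent instead of letting the clamped border pixel repeat. The srcStart/srcEnd uniforms here are an assumption (names vary; they would need to hold the sprite's rectangle in texture coordinates):

    ```glsl
    // Hypothetical uniforms: the sprite's rectangle in texture space.
    uniform mediump vec2 srcStart;
    uniform mediump vec2 srcEnd;

    // Sample alpha, but return 0.0 (fully transparent) outside the
    // source rectangle instead of the clamped border pixel's alpha.
    lowp float sampleAlpha(mediump vec2 uv)
    {
        if (uv.x < srcStart.x || uv.x > srcEnd.x ||
            uv.y < srcStart.y || uv.y > srcEnd.y)
            return 0.0;
        return texture2D(samplerFront, uv).a;
    }
    ```

    The neighbour samples in the shader above would then go through sampleAlpha() instead of reading texture2D(...).a directly.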

  • You can fix this by manually adding a 1px transparent border to your images or extending the effect bounding box in the shader xml.

    Thanks for the input, Animmaniac, but my question is more about the inconsistent behaviour - as you can see in the animated gif, one time you get the effect; move an object away, then back the same amount, and the effect is different; move it away again, then back, and it's suddenly fragmented. And my images are 254x254, so on export they get the obligatory 1px transparent border (I also just made this image have the border in preview and it's the same).

    Extending the bounding box for something like this usually makes things worse, as even more artefacts then appear for overlapping objects (or you get unneeded elements on the other side). R0J0 observed it in his outline Shader (so he disabled bounding box extension) and I did in this one as well. Also, as said before, for some reason adding any other Shader into the mix already sort of extends the bounding box, even if it's not mentioned anywhere. If my inline Shader is the only one in the stack on a square that fills the texture, it looks as it should - empty (since there's nothing to sample from the outside):

    If I use it as the only Shader it works predictably, every time. So perhaps something is happening with Shader stacking - hopefully it can be addressed.

  • I did some tests but I can't reproduce this inconsistent behavior that you are describing.

    You can try for yourself here.

    *I did some modifications to the shader based on what I think you are trying to achieve, but it works the same regardless of the modifications.

    Here's what it looks like when previewing in Chrome:

    I left a transparent border in two sides of the images (right and bottom) so it's easy to spot the differences with and without a border. You can see in the image that only the sides with a transparent border are rendered correctly.

    Also added sine behaviors and a rotate to see what effects non-round positions could have. It's possible to see some oscillation on the non-transparent borders of the circle, but nothing on the square.

    Everything works consistently here, and adding other effects doesn't seem to extend the bounding box. Adding a transparent border or extending the texture through the xml solves the problem on my PC.

    Are you getting the same results?

  • I did some tests but I can't reproduce this inconsistent behavior that you are describing.

    You can try for yourself here.

    *I did some modifications to the shader based on what I think you are trying to achieve, but it works the same regardless of the modifications.

    I'll need to compare what those modifications do. The way it has an outline, but respects the alpha of the insides is great. Thanks!

    Are you getting the same results?

    I wasn't, because the test wasn't the same as mine - the moment we change the shader stack (a shader BEFORE the effect), everything goes plenty bad here. I have changed your demo to show this:

    [attachment=0:7we6oz6l][/attachment:7we6oz6l]

    [attachment=1:7we6oz6l][/attachment:7we6oz6l]

    In my case it needs to come after at least one other Shader (one that trims things), so, unfortunately, I cannot change the stack order.

    In general I think it's related to this - the way the sequencing of Shader-using objects works (or doesn't):
