
Convert ShaderToy to fragment shader

I came across several shaders on ShaderToy and have not had success converting them into a format that can be used on a mobile device, such as a .fsh file.

I have this Shader, and I want to be able to use it on a mobile device.

I know that I need to make modifications to the iXXXX variables and change mainImage to main().

Does anyone know how I can do this? I am unable to find any resources on this, and have never done it before.

float noise(vec2 p)
{
    float sample = texture2D(iChannel1,vec2(1.,2.*cos(iGlobalTime))*iGlobalTime*8. + p*1.).x;
    sample *= sample;
    return sample;
}

float onOff(float a, float b, float c)
{
    return step(c, sin(iGlobalTime + a*cos(iGlobalTime*b)));
}

float ramp(float y, float start, float end)
{
    float inside = step(start,y) - step(end,y);
    float fact = (y-start)/(end-start)*inside;
    return (1.-fact) * inside;

}

float stripes(vec2 uv)
{

    float noi = noise(uv*vec2(0.5,1.) + vec2(1.,3.));
    return ramp(mod(uv.y*4. + iGlobalTime/2.+sin(iGlobalTime + sin(iGlobalTime*0.63)),1.),0.5,0.6)*noi;
}

vec3 getVideo(vec2 uv)
{
    vec2 look = uv;
    float window = 1./(1.+20.*(look.y-mod(iGlobalTime/4.,1.))*(look.y-mod(iGlobalTime/4.,1.)));
    look.x = look.x + sin(look.y*10. + iGlobalTime)/50.*onOff(4.,4.,.3)*(1.+cos(iGlobalTime*80.))*window;
    float vShift = 0.4*onOff(2.,3.,.9)*(sin(iGlobalTime)*sin(iGlobalTime*20.) + 
                                         (0.5 + 0.1*sin(iGlobalTime*200.)*cos(iGlobalTime)));
    look.y = mod(look.y + vShift, 1.);
    vec3 video = vec3(texture2D(iChannel0,look));
    return video;
}

vec2 screenDistort(vec2 uv)
{
    uv -= vec2(.5,.5);
    uv = uv*1.2*(1./1.2+2.*uv.x*uv.x*uv.y*uv.y);
    uv += vec2(.5,.5);
    return uv;
}

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = fragCoord.xy / iResolution.xy;
    uv = screenDistort(uv);
    vec3 video = getVideo(uv);
    float vigAmt = 3.+.3*sin(iGlobalTime + 5.*cos(iGlobalTime*5.));
    float vignette = (1.-vigAmt*(uv.y-.5)*(uv.y-.5))*(1.-vigAmt*(uv.x-.5)*(uv.x-.5));

    video += stripes(uv);
    video += noise(uv*2.)/2.;
    video *= vignette;
    video *= (12.+mod(uv.y*30.+iGlobalTime,1.))/13.;

    fragColor = vec4(video,1.0);
}
Clip asked Jul 06 '16


1 Answer

I've written main() and listed the SpriteKit equivalents of the ShaderToy variables at the bottom of my answer.

Setup

To apply a shader to your node, you tell SpriteKit to attach the shader code in the .fsh file to the SKSpriteNode.

  1. Create an empty text file ending in .fsh for the shader code.

The swizzle 🌀

shader1.fsh

// Note: the helper functions from the question (noise, onOff, ramp,
// stripes, getVideo, screenDistort) must be defined above main(),
// with the ShaderToy variables swapped for their SpriteKit equivalents.

void main() {

    vec4 val = texture2D(_texture, v_tex_coord);
    vec4 grad = texture2D(u_gradient, v_tex_coord);

    if (val.a < 0.1 && grad.r < 1.0 && grad.a > 0.8) {
        vec2 uv = gl_FragCoord.xy / u_sprite_size.xy;
        uv = screenDistort(uv);
        vec3 video = getVideo(uv);
        float vigAmt = 3.+.3*sin(u_time + 5.*cos(u_time*5.));
        float vignette = (1.-vigAmt*(uv.y-.5)*(uv.y-.5))*(1.-vigAmt*(uv.x-.5)*(uv.x-.5));

        video += stripes(uv);
        video += noise(uv*2.)/2.;
        video *= vignette;
        video *= (12.+mod(uv.y*30.+u_time,1.))/13.;

        gl_FragColor = vec4(video,1.0);

    } else {
        gl_FragColor = val;
    }

} // end of main()
  2. Next, attach the shader in SpriteKit.

shader1.swift

let sprite = self.childNodeWithName("targetSprite") as! SKSpriteNode
let shader = SKShader(fileNamed: "shader1.fsh")
sprite.shader = shader
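
If the converted shader samples ShaderToy channel textures, those textures are supplied from Swift as SKUniforms attached to the SKShader. A minimal sketch (the image names "video" and "noise" are assumptions for illustration; the uniform names must match the sampler names declared in the .fsh file):

let shader = SKShader(fileNamed: "shader1.fsh")
// Bind one SKTexture per ShaderToy channel the shader reads.
shader.uniforms = [
    SKUniform(name: "iChannel0", texture: SKTexture(imageNamed: "video")),
    SKUniform(name: "iChannel1", texture: SKTexture(imageNamed: "noise"))
]
sprite.shader = shader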

Explanation

  • main() is the entry point; this code is executed once for every pixel of the sprite's texture.
  • gl_FragColor is the return value: the color written for the current pixel.
  • The vec4() call holds r, g, b, a values.
  • As the code executes, it assigns each pixel the color of the effect (the distorted video produced by screenDistort(uv) and getVideo(uv)).

ShaderToy variable names -> SpriteKit variable names

iGlobalTime -> u_time

iResolution -> u_sprite_size

fragCoord.xy -> gl_FragCoord.xy

iChannelX -> SKUniform with name of “iChannelX” containing SKTexture

fragColor -> gl_FragColor

Now that you have the SpriteKit equivalents of the ShaderToy variables, you can convert the remaining functions that sit above main():

float noise {}

float onOff {}

float ramp {}

float stripes {}

vec3 getVideo {}

vec2 screenDistort {}
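
As a sketch of that conversion (assuming the iChannel1 texture is supplied as an SKUniform named iChannel1, as in the mapping above), onOff and noise might become:

// iGlobalTime becomes u_time
float onOff(float a, float b, float c)
{
    return step(c, sin(u_time + a*cos(u_time*b)));
}

// iChannel1 becomes a custom texture uniform (assumed name "iChannel1").
uniform sampler2D iChannel1;

float noise(vec2 p)
{
    // "sample" is renamed to "s" here since it is reserved in some GLSL versions.
    float s = texture2D(iChannel1, vec2(1.,2.*cos(u_time))*u_time*8. + p*1.).x;
    s *= s;
    return s;
}

The remaining functions follow the same pattern: substitute each ShaderToy variable with its SpriteKit equivalent and leave the math untouched.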

Theory

Q. Why does main() contain texture2D and u_gradient, v_tex_coord?

A. SpriteKit uses textures and uv coordinates.

UV mapping

UV mapping is the 3D modeling process of projecting a 2D image to a 3D model's surface for texture mapping.

UV coordinates

When texturing a mesh, you need a way to tell OpenGL which part of the image should be used for each triangle. This is done with UV coordinates. Each vertex can carry, in addition to its position, a pair of floats, U and V. These coordinates are used to sample the texture.
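
In a SpriteKit .fsh shader there are two ways to obtain normalized 0-1 UV coordinates. They only coincide when the sprite covers the whole render target, so v_tex_coord is usually the safer choice:

vec2 uvFromTexture = v_tex_coord;                         // per-sprite UVs, already in 0..1
vec2 uvFromScreen  = gl_FragCoord.xy / u_sprite_size.xy;  // pixel position scaled to 0..1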

SKShader Class Reference

OpenGL ES for iOS

Best Practices for Shaders

WWDC Session 606 - What's New in SpriteKit - Shaders, Lighting, Shadows

Edison answered Oct 13 '22