How can I do these image processing tasks using OpenGL ES 2.0 shaders?
How can I perform the following image processing tasks using OpenGL ES 2.0 shaders?
- Colorspace transform (RGB/YUV/HSL/Lab)
- Swirling of the image
- Converting to a sketch
- Converting to an oil painting
I just added filters to my open source GPUImage framework that perform three of the four processing tasks you describe (swirling, sketch filtering, and converting to an oil painting). While I don't yet have colorspace transform filters, I do have the ability to apply a matrix transform to colors.
As examples of these filters in action, here is a sepia tone color conversion:

a swirl distortion:

a sketch filter:

and finally, an oil painting conversion:
Note that all of these filters were done on live video frames, and all but the last filter can be run in real time on video from iOS device cameras. The last filter is pretty computationally intensive, so even as a shader it takes ~1 second or so to render on an iPad 2.
The sepia tone filter is based on the following color matrix fragment shader:
varying highp vec2 textureCoordinate;

uniform sampler2D inputImageTexture;

uniform lowp mat4 colorMatrix;
uniform lowp float intensity;

void main()
{
    lowp vec4 textureColor = texture2D(inputImageTexture, textureCoordinate);
    lowp vec4 outputColor = textureColor * colorMatrix;

    gl_FragColor = (intensity * outputColor) + ((1.0 - intensity) * textureColor);
}
with a matrix of
self.colorMatrix = (GPUMatrix4x4){
    {0.3588, 0.7044, 0.1368, 0.0},
    {0.2990, 0.5870, 0.1140, 0.0},
    {0.2392, 0.4696, 0.0912, 0.0},
    {0.0, 0.0, 0.0, 1.0},
};
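The same matrix mechanism can cover the colorspace transforms you ask about, since an RGB-to-YUV conversion is just another color matrix. As a CPU-side sketch (plain Python, not part of GPUImage), using the standard BT.601 weights:

```python
# BT.601 RGB -> YUV weights; plug coefficients like these into the
# color matrix shader above to get a colorspace transform.
RGB_TO_YUV = [
    [ 0.299,    0.587,    0.114   ],  # Y (luma)
    [-0.14713, -0.28886,  0.436   ],  # U (blue-difference chroma)
    [ 0.615,   -0.51499, -0.10001 ],  # V (red-difference chroma)
]

def apply_color_matrix(rgb, matrix):
    """Multiply one normalized (0.0-1.0) RGB pixel by a 3x3 color matrix,
    mirroring what the fragment shader does per fragment on the GPU."""
    return tuple(sum(row[i] * rgb[i] for i in range(3)) for row in matrix)

# White should come out with full luma and (near) zero chroma.
y, u, v = apply_color_matrix((1.0, 1.0, 1.0), RGB_TO_YUV)
```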
The swirl fragment shader is based on this Geeks3D example and has the following code:
varying highp vec2 textureCoordinate;

uniform sampler2D inputImageTexture;

uniform highp vec2 center;
uniform highp float radius;
uniform highp float angle;

void main()
{
    highp vec2 textureCoordinateToUse = textureCoordinate;
    highp float dist = distance(center, textureCoordinate);
    textureCoordinateToUse -= center;
    if (dist < radius)
    {
        highp float percent = (radius - dist) / radius;
        highp float theta = percent * percent * angle * 8.0;
        highp float s = sin(theta);
        highp float c = cos(theta);
        textureCoordinateToUse = vec2(dot(textureCoordinateToUse, vec2(c, -s)), dot(textureCoordinateToUse, vec2(s, c)));
    }
    textureCoordinateToUse += center;

    gl_FragColor = texture2D(inputImageTexture, textureCoordinateToUse);
}
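The coordinate math here is just a rotation about `center` whose angle falls off toward the edge of the circle. A CPU-side sketch of that math (plain Python, not part of GPUImage):

```python
import math

def swirl_coordinate(coord, center, radius, angle):
    """Mirror the swirl shader's coordinate math: inside `radius`, rotate
    the offset from `center` by an angle that is strongest at the center
    and fades quadratically to zero at the circle's edge."""
    x, y = coord[0] - center[0], coord[1] - center[1]
    dist = math.hypot(x, y)
    if dist < radius:
        percent = (radius - dist) / radius
        theta = percent * percent * angle * 8.0
        s, c = math.sin(theta), math.cos(theta)
        # Standard 2D rotation, matching the two dot products in the shader.
        x, y = x * c - y * s, x * s + y * c
    return (x + center[0], y + center[1])
```

Points outside the radius sample their original location, so only the interior of the circle is distorted.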
The sketch filter is generated using Sobel edge detection, with the edges shown in varying grey shades. The shader for this is as follows:
varying highp vec2 textureCoordinate;

uniform sampler2D inputImageTexture;
uniform mediump float intensity;
uniform mediump float imageWidthFactor;
uniform mediump float imageHeightFactor;

const mediump vec3 W = vec3(0.2125, 0.7154, 0.0721);

void main()
{
    mediump vec3 textureColor = texture2D(inputImageTexture, textureCoordinate).rgb;

    mediump vec2 stp0 = vec2(1.0 / imageWidthFactor, 0.0);
    mediump vec2 st0p = vec2(0.0, 1.0 / imageHeightFactor);
    mediump vec2 stpp = vec2(1.0 / imageWidthFactor, 1.0 / imageHeightFactor);
    mediump vec2 stpm = vec2(1.0 / imageWidthFactor, -1.0 / imageHeightFactor);

    mediump float i00   = dot(textureColor, W);
    mediump float im1m1 = dot(texture2D(inputImageTexture, textureCoordinate - stpp).rgb, W);
    mediump float ip1p1 = dot(texture2D(inputImageTexture, textureCoordinate + stpp).rgb, W);
    mediump float im1p1 = dot(texture2D(inputImageTexture, textureCoordinate - stpm).rgb, W);
    mediump float ip1m1 = dot(texture2D(inputImageTexture, textureCoordinate + stpm).rgb, W);
    mediump float im10  = dot(texture2D(inputImageTexture, textureCoordinate - stp0).rgb, W);
    mediump float ip10  = dot(texture2D(inputImageTexture, textureCoordinate + stp0).rgb, W);
    mediump float i0m1  = dot(texture2D(inputImageTexture, textureCoordinate - st0p).rgb, W);
    mediump float i0p1  = dot(texture2D(inputImageTexture, textureCoordinate + st0p).rgb, W);

    mediump float h = -im1p1 - 2.0 * i0p1 - ip1p1 + im1m1 + 2.0 * i0m1 + ip1m1;
    mediump float v = -im1m1 - 2.0 * im10 - im1p1 + ip1m1 + 2.0 * ip10 + ip1p1;

    mediump float mag = 1.0 - length(vec2(h, v));
    mediump vec3 target = vec3(mag);

    gl_FragColor = vec4(mix(textureColor, target, intensity), 1.0);
}
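The core of this is the classic pair of 3x3 Sobel kernels applied to the luminance of the eight neighbors, with the magnitude inverted so strong edges come out dark on a white background. A CPU-side sketch of just that step (plain Python, not part of GPUImage):

```python
import math

def sketch_value(p):
    """Sobel edge response for the center of a 3x3 patch of luminance
    values p[row][col] (row 0 at top), inverted like the shader's
    `mag = 1.0 - length(vec2(h, v))` so flat areas come out white."""
    # Horizontal gradient: bottom row minus top row, center weighted 2x.
    h = (-p[0][0] - 2.0 * p[0][1] - p[0][2]
         + p[2][0] + 2.0 * p[2][1] + p[2][2])
    # Vertical gradient: right column minus left column, center weighted 2x.
    v = (-p[0][0] - 2.0 * p[1][0] - p[2][0]
         + p[0][2] + 2.0 * p[1][2] + p[2][2])
    return 1.0 - math.hypot(h, v)

flat = [[0.5] * 3 for _ in range(3)]
# A featureless patch has zero gradient, so the sketch output is white (1.0).
```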
Finally, the oil painting look is generated using a Kuwahara filter. This particular filter is from the outstanding work of Jan Eric Kyprianidis and his fellow researchers, as described in the article "Anisotropic Kuwahara Filtering on the GPU" within the GPU Pro book. The shader code from that is as follows:
varying highp vec2 textureCoordinate;
uniform sampler2D inputImageTexture;
uniform int radius;

precision highp float;

const vec2 src_size = vec2(768.0, 1024.0);

void main (void)
{
    vec2 uv = textureCoordinate;
    float n = float((radius + 1) * (radius + 1));

    vec3 m[4];
    vec3 s[4];
    for (int k = 0; k < 4; ++k) {
        m[k] = vec3(0.0);
        s[k] = vec3(0.0);
    }

    for (int j = -radius; j <= 0; ++j) {
        for (int i = -radius; i <= 0; ++i) {
            vec3 c = texture2D(inputImageTexture, uv + vec2(i, j) / src_size).rgb;
            m[0] += c;
            s[0] += c * c;
        }
    }

    for (int j = -radius; j <= 0; ++j) {
        for (int i = 0; i <= radius; ++i) {
            vec3 c = texture2D(inputImageTexture, uv + vec2(i, j) / src_size).rgb;
            m[1] += c;
            s[1] += c * c;
        }
    }

    for (int j = 0; j <= radius; ++j) {
        for (int i = 0; i <= radius; ++i) {
            vec3 c = texture2D(inputImageTexture, uv + vec2(i, j) / src_size).rgb;
            m[2] += c;
            s[2] += c * c;
        }
    }

    for (int j = 0; j <= radius; ++j) {
        for (int i = -radius; i <= 0; ++i) {
            vec3 c = texture2D(inputImageTexture, uv + vec2(i, j) / src_size).rgb;
            m[3] += c;
            s[3] += c * c;
        }
    }

    float min_sigma2 = 1e+2;
    for (int k = 0; k < 4; ++k) {
        m[k] /= n;
        s[k] = abs(s[k] / n - m[k] * m[k]);

        float sigma2 = s[k].r + s[k].g + s[k].b;
        if (sigma2 < min_sigma2) {
            min_sigma2 = sigma2;
            gl_FragColor = vec4(m[k], 1.0);
        }
    }
}
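To make the structure of the four loops clearer: the filter examines four overlapping square quadrants around each pixel and outputs the mean of whichever quadrant has the lowest variance, which flattens texture while preserving edges. A CPU-side sketch of that idea on a single grayscale pixel (plain Python, not part of GPUImage):

```python
def kuwahara_pixel(img, x, y, radius):
    """Basic Kuwahara filter for one pixel of a 2D grayscale list `img`;
    (x, y) must be at least `radius` pixels from every border. Returns
    the mean of the lowest-variance quadrant, like the shader's final loop."""
    quadrants = [
        (range(-radius, 1), range(-radius, 1)),      # top-left     (m[0])
        (range(-radius, 1), range(0, radius + 1)),   # top-right    (m[1])
        (range(0, radius + 1), range(0, radius + 1)),  # bottom-right (m[2])
        (range(0, radius + 1), range(-radius, 1)),   # bottom-left  (m[3])
    ]
    best_mean, best_var = None, float("inf")
    for rows, cols in quadrants:
        vals = [img[y + j][x + i] for j in rows for i in cols]
        n = float(len(vals))
        mean = sum(vals) / n
        # E[c^2] - E[c]^2, matching `s[k] / n - m[k] * m[k]` in the shader.
        var = sum(v * v for v in vals) / n - mean * mean
        if var < best_var:
            best_mean, best_var = mean, var
    return best_mean
```

On a pixel sitting at an edge, the quadrants that straddle the edge have high variance and lose, so the output snaps to the flat side instead of blurring across it.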
Again, these are built-in filters within GPUImage, so you can just drop that framework into your application and start using them on images, video, and movies without having to touch any OpenGL ES. All the code for the framework is available under a BSD license, if you'd like to see how it works or tweak it.