Tue Feb 24 02:09:35 EST 2015


I left OpenGL before the shader era.  A lot has happened in ten
years, so it may be time to rethink / relearn a couple of things.

Basically, what I want to do is visualize audio data as spectra,
waveforms or other kinds of control curves.  These days it seems more
appropriate to push the raw data arrays to the GPU and have it do the
rendering work, split across two kinds of shaders:


- Vertex shaders[2]

compute output vertex positions (and other per-vertex values) from
input vertices and, optionally, textures

- Pixel (Fragment) shaders[3]

compute pixel colors from interpolated vertex outputs and textures.
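
For the waveform case, a minimal GLSL ES 2.0 sketch of that split
could look like the following.  The attribute/uniform names (aIndex,
aSample, uCount, uColor) are my own placeholders, not from any
particular library: the idea is one vertex per audio sample, drawn as
a line strip, with the vertex shader mapping sample index to x and
sample value to y in clip space.

```glsl
// Vertex shader: one vertex per audio sample.
attribute float aIndex;   // sample index, 0 .. uCount-1
attribute float aSample;  // raw audio sample, assumed in [-1, 1]
uniform float uCount;     // total number of samples in the buffer

void main() {
    // Map index to clip-space x in [-1, 1]; sample value is y directly.
    float x = 2.0 * (aIndex / (uCount - 1.0)) - 1.0;
    gl_Position = vec4(x, aSample, 0.0, 1.0);
}
```

```glsl
// Fragment shader: flat color for the curve.
precision mediump float;
uniform vec4 uColor;

void main() {
    gl_FragColor = uColor;
}
```

The host side would upload aIndex/aSample as vertex attribute arrays
and draw with GL_LINE_STRIP; updating the curve then only means
re-uploading the sample buffer, not recomputing geometry on the CPU.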

In OpenGL ES 2.0, the fixed-function pipeline is removed entirely, so
nothing can be rendered without shaders[1].

[1] http://glslstudio.com/primer/#gl2prog
[2] https://www.opengl.org/wiki/Vertex_Shader
[3] https://www.opengl.org/wiki/Fragment_Shader