When investigating co-ordinate systems used by the app I noticed variables interpolated between the vertex and fragment shader have an unusual implicit conversion.
Let w and h denote the built-in Codea variables WIDTH and HEIGHT, respectively. gl_Position is (-w, -h) at the bottom-left corner, (w, h) at the top-right corner, and (0, 0) in the centre. I expected these to be (-1.0, -1.0), (1.0, 1.0), and (0.0, 0.0) respectively (a.k.a. normalised device co-ordinates), but that's not a concern in itself - it's quite sensible.
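The mapping described above can be sketched in Codea Lua. Note that toNDC is a hypothetical helper name for illustration, not part of the Codea API:

```lua
-- Hypothetical helper: divide the observed 'semi-window' co-ordinates
-- by (WIDTH, HEIGHT) to recover normalised device co-ordinates.
function toNDC(p)
    return vec2(p.x / WIDTH, p.y / HEIGHT)
end

-- toNDC(vec2(-WIDTH, -HEIGHT)) gives (-1, -1), the bottom-left corner
-- toNDC(vec2(WIDTH, HEIGHT)) gives (1, 1), the top-right corner
-- toNDC(vec2(0, 0)) gives (0, 0), the centre
```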
The thing I find troubling is that there appears to be an implicit conversion from these 'semi-window' co-ordinates to normalised device co-ordinates between the vertex and fragment stages on varyings other than gl_Position. The implication is that, for example, to interpolate a 100% red vertex in the fragment shader I have to set the colour variable to (w, 0, 0), and to pass a 100% green vertex I have to set it to (0, h, 0), so that the conversion maps the colour values into unit range. This is totally surprising and unintuitive!
The following program demonstrates:
Main
function setup()
    displayMode(STANDARD)
    parameter.number("x", 1.0, 1024.0)
    m = mesh()
    m.shader = shader("Project:Foo")
    m:addRect(0.5 * WIDTH, 0.5 * HEIGHT, WIDTH, HEIGHT)
    print(WIDTH, HEIGHT)
    local vertices = m:buffer("position")
    for i = 1, 4 do
        print(vertices[i])
    end
end
function draw()
    background(40, 40, 50)
    m.shader.uColorScale = x
    m.shader.uDimensions = vec2(WIDTH, HEIGHT)
    m:draw()
end
Vertex shader
attribute vec4 position;
uniform vec2 uDimensions;
uniform highp float uColorScale;
varying lowp vec4 vColor;

void main()
{
    vColor = uColorScale * vec4(position.x / uDimensions.x, position.y / uDimensions.y, 0.0, 1.0);
    gl_Position = position;
}
Fragment shader
precision highp float;
uniform vec2 uDimensions;
varying lowp vec4 vColor;

void main()
{
    if (vColor.x > 1.0 || vColor.y > 1.0) {
        gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
    } else {
        gl_FragColor = vColor;
    }
}
See attached screenshots for output.
I'd like to know whether there is a way around this that doesn't involve scaling by the window size.
Comments
Update: it turns out the output of gl_Position should in fact be normalised device co-ordinates and not screen co-ordinates. It seems during the investigation I managed to thoroughly confuse myself!

One source of confusion comes from the default setting for projectionMatrix(), which is equivalent to ortho(-WIDTH, WIDTH, -HEIGHT, HEIGHT) with a maximum z value of 10.0.

Another is that the square in the shader lab follows this default projection convention, but with different dimensions. The square itself is defined in terms of this co-ordinate system, which makes the live output a bit useless if you want to work in a different co-ordinate system. This can be worked around, although it would be better if one could configure the shader lab, i.e. set the input vertices and the size of the live preview area.
In the regular editor, one can switch to NDC with ortho(-1.0, 1.0, -1.0, 1.0), bearing in mind the maximum z value is still 10.0.
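As a minimal sketch of that workaround, assuming a mesh m whose vertices are already given in normalised device co-ordinates, the draw function might look like:

```lua
function draw()
    background(40, 40, 50)
    -- Switch the projection to normalised device co-ordinates,
    -- so vertex positions pass through unscaled.
    -- Bear in mind the maximum z value is still 10.0.
    ortho(-1.0, 1.0, -1.0, 1.0)
    m:draw()
end
```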