I'm having a very strange issue.
I'm working on an explosion shader. It's inspired by @Andrew_Stacey's 2D explosion shader here:
http://codea.io/talk/discussion/2257/exploding-mesh-with-shaders
but it's for 3D models. It disintegrates each of the faces along a trajectory derived from the face normal (unlike Andrew's, it assumes the model is already multi-faceted; I don't further subdivide the faces).
Now, the weird thing is, the shader was actually coming along nicely within a large project I'm working on, when I decided to export it to a minimal working example (MWE) to work on it further (and to share it). Ironically enough, I can't get the custom attribute buffers to work at all in the MWE. In the parent code they work about 70% of the time, but do sometimes produce this error.
It's probably something really simple, and I've just been staring at it too long, but I can't work out what I'm doing wrong at all.
The bit that triggers the error is when I try to index the custom buffers. When this code runs:
local origin = m:buffer("origin")
local trajectory = m:buffer("trajectory")
origin:resize(#verts) -- shouldn't be necessary. Triggers the error.
trajectory:resize(#verts)
for i = 1, #verts do
    origin[i] = ori[i] -- triggers the error
end
I get:
error: [string "explode = {}..."]:9: attempt to index a nil value (local 'origin')
The mesh has been set up: it has vertices, colours, and normals (so the custom buffers shouldn't need resizing), and it has a shader that compiled without error and uses the custom attributes.
What have I forgotten??
The full MWE is here (it borrows code from @Ignatz, @Andrew_Stacey, and whoever made the Icosphere function):
https://gist.github.com/Utsira/463f656b56fd38ad5b68
I'd be very grateful if someone could put me out of my misery.
Comments
I've figured it out. It was an issue in the plumbing between the vertex and fragment shaders. The latter was expecting a variable vNormal, but the former was supplying one called vDirectDiffuse instead. Weird, though, that it results in an error on a different attribute. I'll post the working version soon, as there are some other elements I'd like help with.

Here's a video. Code is on its way. Does the resolution of the video look really terrible to you? For some reason the stuff I upload to YouTube always ends up looking awful.
I was just about to post the solution ... but you found it yourself.
As the video quality is so crappy, here's an image:
The gist link above has been updated with functioning code. The next challenge... gravity (I'll have to refer back to Andrew's 2D shader, I think...)
@yojimbo2000 - looking good!
I've updated the code at the gist again so that it now has gravity. The gravity and friction code is again adapted from Andrew's shader. One tricky thing is working out which way is down while the model is rotating. To work out what the world gravity vector is in model space, I load the inverse of the model matrix into the shader. Here's the updated vertex shader. Suggestions and criticisms welcome.
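Roughly, it's along these lines (a sketch of the approach rather than the gist's exact code; the uniform and attribute names are illustrative):

explodeVertexShader = [[
uniform mat4 modelViewProjection;
uniform mat4 mInvModel; // inverse model matrix: turns world "down" into model space
uniform float time;     // seconds since the explosion started

attribute vec4 position;
attribute vec4 color;
attribute vec3 normal;
attribute vec4 trajectory; // custom buffer: direction this face flies along

varying lowp vec4 vColor;
varying highp vec3 vNormal;

void main()
{
    vColor = color;
    vNormal = normal;
    // world gravity (z-up, so "down" is -z) expressed in model space;
    // w = 0 because gravity is a direction, not a point
    vec4 gravity = mInvModel * vec4(0., 0., -1., 0.);
    // projectile motion: displacement = v*t + 0.5*g*t*t
    vec4 pos = position + trajectory * time + 0.5 * gravity * time * time;
    pos.w = 1.; // keep it a point
    gl_Position = modelViewProjection * pos;
}
]]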
Sorry, the code above is a bit of a mess; I'm transitioning from Gouraud (per-vertex) to Phong (per-fragment) lighting, which is why there are bits of unused lighting code still clogging it up. That's what got me into trouble in the first place.
I suppose, now that each face is double-sided, I should really check for non-front-facing fragments in the frag shader and invert the normal...
As far as I can see, back-face normal inverting doesn't make any difference. I've read that this is the "correct" way to do double-sided faces, though. Has anyone tried this?
Change these lines in the fragment shader:
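Something like this (a sketch; it assumes the normal arrives in a vNormal varying):

// flip the interpolated normal for back-facing fragments
vec3 n = normalize(vNormal);
if (!gl_FrontFacing) n = -n;
// ...then use n in the lighting calculation instead of vNormal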
Wrt gravity, it might be a little simpler to pass the shader a uniform of the vector (0,-1,0) multiplied by modelMatrix, to tell it which way is down.
Yes, that's a good idea, thank you. Rather than multiplying by the inverse modelMatrix on every vertex, I can do that calculation once and pass the result to the shader.
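i.e. roughly this, once per frame rather than per vertex (a sketch of the plan, not tested code):

-- world "down" as a direction (w = 0), converted to model space in Codea
m.shader.gravity = modelMatrix():inverse() * vec4(0, 0, -0.05, 0)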
I should also point out, if anyone's interested in using the code in their own projects, that it's for a Z-up orientation (I just like working in Z-up). This means that because I'm only bothering to do a 2D rotation (as the angle computation is much easier, and I was adapting a 2D source), each fragment rotates around the model's up axis (so the fragments appear to change size, and catch the light as they rotate). I think it looks good, and I'm not going to bother adding code to compute rotations around the other axes of each fragment.
OK, trying to implement @Ignatz's suggestion above, and I've got it working, but something rather unexpected has happened:
For some reason, when doing this calculation outside of the shader, the down vector is on w, not z. Why?? This is replacing these lines:
How come down shifts from z to w when the calculation is moved to outside the shader?
@yojimbo2000 - I guess one reason is that this doesn't work
I'm no matrix expert, so I have no immediate explanation.
However, I think a better approach is this:
1. pass modelMatrix to the vertex shader
2. apply it to the current position attribute to get the world position
3. compare that with your gravity vector (0,0,-0.05)
This is a little more work for the shader, but it should work, and also avoids inverting the model matrix, something our resident mathematician says is a Very.Bad.Idea (possibly because not all matrices will invert).
With that approach you'd then also need to create your own viewProjection matrix to use instead of the built-in modelViewProjection. Something like this, perhaps:
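(A sketch; it assumes Codea composes matrices left to right, so model * view * projection is the usual modelViewProjection:)

-- pass the pieces separately
m.shader.model = modelMatrix()
m.shader.viewProjection = viewMatrix() * projectionMatrix()

-- and in the vertex shader:
--   vec4 worldPos = model * position;
--   gl_Position = viewProjection * worldPos;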
Damn. So how do you translate a world coordinate into local space? What does your getLocalPoint function look like?

@yojimbo2000 - Based on previous Codea discussions, converting from world space to object space is done like this
Maybe OpenGL's matrix multiplication works differently than Codea's? In the test below I get much better results with modelMatrix:inverse() than you reported above. The result is within rounding error. But if I change it to modelMatrix:inverse():transpose() the result is totally wrong.
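(A sketch of that kind of round-trip test; the transforms themselves are arbitrary:)

pushMatrix()
translate(10, 20, 30)
rotate(40, 0, 0, 1)
local m = modelMatrix()
popMatrix()

local p = vec3(1, 2, 3)
local world = m * p                     -- object -> world
print(m:inverse() * world)              -- close to (1, 2, 3), off by rounding only
print(m:inverse():transpose() * world)  -- totally wrong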
Yes, when trying to reverse modelMatrix in Codea, I probably should have put the vector on the left, which Codea doesn't support.
I haven't done much conversion from world to object, so I'm not sure.
Interestingly, inverting modelMatrix transposes the rotation part and makes the translation part negative, which makes sense for a rotate-plus-translate matrix: the inverse of a rotation is its transpose, and the translation has to be undone.
Ahem, matrices, ahem.
(Might need a little updating - I need to look up if newer versions of Codea introduced new methods for combining vectors and matrices.)
@Ignatz OK, I didn't realise that you could multiply a vector on the left of a matrix in OpenGL. The code you posted above,

m.shader.invModel=modelMatrix():inverse():transpose() ... v = v * mInvModel;

works. I was just hoping that I could convert the world vector into a model vector outside of the shader and then pass it in, on the assumption that doing this once in Codea would be faster than doing it, say, several thousand times in the vertex shader (although the GPU does multiplication so blindingly fast, maybe it makes no difference?). Generally (as the code above shows), localCoord = modelMatrix():inverse() * worldCoord seems to work acceptably well in Codea (out by 0.00001). But for some reason I haven't been able to fathom yet, something peculiar happens when I try to pass that converted world coord into the shader, and I've found that the value for z needs to be placed in w, which just seems wrong.

@LoopSpace thanks for that link, wonderfully informative as ever. One thing that I think has changed is that you say that Codea doesn't support multiplying a matrix by a vector, which it does do now (though only with the vector on the right). Apologies if I've misunderstood.
Codea has always multiplied on the right, but it has never done so on the left.
Codea does now do matrix times vector, but it's weird. You can do m * v where m is a 4x4 matrix and v is a vec3, and you get the result of the transformation described at the bottom of this section of my matrices description, specifically the transformation [x y z] -> [x'', y'', z'']. That isn't actually all that useful, as it is useless for, say, doing anything with normals. It also means that while you write it as multiplication on the right, it is actually doing multiplication on the left. I haven't investigated what this means for associativity, but it might be weird.

You can also multiply a matrix by a vec4, but that's just odd. Try it; if you can discern any logic to it then you're better than me!

OK, I think I've worked out how to pass the world vector for gravity into the shader. In the shader, if you set w to zero, i.e. vec4(gravity, 0.), then it works. I'll post the working code tomorrow.

@LoopSpace - can you explain this strange result?
@yojimbo2000 - I think this is the strange thing you were talking about, how z and w swapped around - although (even worse) z is zero above!
@Ignatz that is weird. I get the same result. You get the correct results if you use a custom mat * vec function, eg:
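(A sketch of the idea: it treats the vec4 as a column vector multiplied on the right, and assumes Codea matrices are indexed m[1]..m[16] down the columns, as in OpenGL; swap the indexing if yours are stored rows first.)

function matVecMul(m, v)
    return vec4(
        m[1]*v.x + m[5]*v.y + m[9]*v.z  + m[13]*v.w,
        m[2]*v.x + m[6]*v.y + m[10]*v.z + m[14]*v.w,
        m[3]*v.x + m[7]*v.y + m[11]*v.z + m[15]*v.w,
        m[4]*v.x + m[8]*v.y + m[12]*v.z + m[16]*v.w)
end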
I actually assumed that the built-in Codea mat * vec function resembled the above, but it seems it's doing something a bit different. Is this a bug? Or just some alternative way of doing matrix multiplication that I don't know about?

I often use a 3x3 version of the above function (or a 2x2 in 2D) if I just want the rotation part of the matrix, not the translation or scale. I can't remember where I got this from. Probably from one of you two.
The gist linked to above now has the updated version. World gravity is transformed to local gravity in Codea and passed to the shader:
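(Roughly like this; a sketch rather than the gist's exact lines, using a custom multiply like the one above to avoid the built-in oddity:)

local downWorld = vec4(0, 0, -0.05, 0) -- z-up world, w = 0 for a direction
local downLocal = matVecMul(modelMatrix():inverse(), downWorld)
m.shader.gravity = vec3(downLocal.x, downLocal.y, downLocal.z)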
In the shader, the w of gravity is set to zero:
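(Again a sketch; in the vertex shader:)

uniform vec3 gravity;          // local-space gravity passed in from Codea
// ...
vec4 grav = vec4(gravity, 0.); // w = 0: a direction, unaffected by translation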
I guess this is necessary because, like a normal, gravity is a relational vector, rather than one describing a fixed point.
That's a bug, I think
And yes, directions have a w=0, while points have w=1
That's the behaviour I was referring to. There should be a mat4 x vec4 method and it should be right. I don't think I ever got round to filing a bug report, though. Any volunteers?
It's also a bit weird that Codea writes matrix multiplication on the left (m * v) but the effect is multiplication on the right (v * m). It would break associativity, except that I don't know if I would expect associativity with mat4 x vec3, as it's not a natural operation.

I think that in my own code I overwrite the matrix x vector multiplication with the correct one.
I get very confused by this "on the left" / "on the right" business. So is this multiplication on the right?
@yojimbo2000 The crucial difference is whether you regard vectors as rows or columns. Then when you write a matrix, depending on your convention, either the rows or the columns are where the standard vectors end up. If rows, then you need to do v * m, and if columns you need to do m * v. It also has an effect on composition. If you want to do A and then B, is the result A * B or B * A? If rows, then it is A * B, and if columns then it is B * A.

Codea / OpenGL is always column matrices, isn't it?
So, how should we word the bug report? Is there something more specific we can say than "mat4 * vec4 is sometimes weird"?
No, OpenGL is row matrices, contrary to just about everyone else in the world.
That seems a reasonable bug report! Could say that mat4 * vec4 should be the correct matrix multiplication. For bonus marks, it should be possible to do v * m and m * v and have them come out correctly (i.e. differently).

I already reported the bug to Simeon. I think my code example above is clear enough.
Thank you both.
I've added it to the issue tracker. Feel free to vote for it!
https://bitbucket.org/TwoLivesLeft/core/issue/362/mat4-vec4-produces-odd-results
Simeon said he'll look at it
@yojimbo2000 I tried to add a texture to the disintegration shader with m.texture = image, but did not succeed. Any suggestions?
@piinthesky yeah, the shader in the code is for objects without textures. I'm away from my iPad so can't test this, but the below should work for textured objects. Let me know if it throws errors:
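(Something like these additions; the names follow Codea's built-in texture shaders:)

// vertex shader: add
attribute vec2 texCoord;
varying highp vec2 vTexCoord;
// ...and inside main():
vTexCoord = texCoord;

// fragment shader: add
uniform lowp sampler2D texture;
// ...and where the final colour is set:
gl_FragColor = texture2D(texture, vTexCoord) * vColor;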
@yojimbo2000 I did not manage to make it work. No error message, but it does not do what I hoped. I set m.texCoords = verts. Is that right?
No, m.vertices is for the vertex positions; m.texCoords is for the texture coordinates.
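i.e. something like this (uvs and img standing in for your own texture-coordinate table and image):

m.vertices = verts -- vec3 positions
m.texCoords = uvs  -- vec2 texture coordinates, one per vertex, in the 0-1 range
m.texture = img    -- the image the shader samples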
@yojimbo2000 If I use verts = m:buffer("position") to get the vertex positions, I don't seem to be able to use #verts to count the number of vertices. Any ideas why that doesn't work?
No, don't use a buffer for position. Buffers are only for custom attributes, not the built-in ones that the Codea API sets up for you (vertices --> position, texCoords, normals, colors). Use myMesh.vertices or the vertex method to access the position attribute.

To get the size, you can use myMesh.size or #myMesh.vertices.
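For example:

print(myMesh.size)      -- number of vertices
print(#myMesh.vertices) -- the same count, via the vertices table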
Ahhh, thanks
@yojimbo2000 - that's not right - buffers are available for all the built in attributes, see the examples in the reference
https://codea.io/reference/Shaders.html#buffer
Oh, OK. I'm not sure why you'd want to do that for the built-in attributes.
That's so you can change them dynamically, eg if you have a set of separate objects in a single mesh and need to change their positions each frame
Sure, but you can do that by referring directly to the built-in myMesh.vertices table, or by using the vertex command. I don't see any need to set up another buffer variable.

You might want to check comparative speed. My understanding is that buffers allow you to change values "in place", which should be faster.
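(A sketch of the in-place pattern I mean:)

local pos = m:buffer("position") -- built-in position buffer
for i = 1, m.size do
    pos[i] = pos[i] + vec3(0, 0.1, 0) -- nudge every vertex upward, no table rebuild
end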