Reproducing Blender animations in Codea (Now with code link, new video)

edited May 2015 in Code Sharing Posts: 2,020

Check this out:

I've animated a simple walk cycle from Blender in Codea! This dude has 32,000 vertices, so not exactly low-poly (I think you could get a fully articulated figure with 1 to 2,000 verts and have it still look really good). It's a model I found online; I wanted to check that I could get this working in Codea before spending time creating a low-poly model.

I animated the walk cycle myself in Blender. I only spent a few minutes on it, and this is my first shot at animating a figure, so it could probably use some work, in terms of getting a fluid and believable effect.

I'm pretty happy with the performance in Codea though. I can animate 3 copies of him at once and get 60 fps on the iPad Air (so that's 100K vertices being animated). If I add a fourth copy, the fps drops to 45 or so. This bodes well, I think. If your characters were around 1500 verts each, you could have around 60 dudes running round at 60 fps I reckon.

I'll post code and explanations at some point, if people are interested. Most of the work was the importer, which is a modified version of @Ignatz 's obj importer.

Whilst I'm cleaning up the code, anyone want to guess how I did it? :-\"

Comments

  • Well, after researching and furiously scrawling cryptic notes all over hundreds of papers and making one of those pinboards with the strings connecting important points, I've come to the conclusion that you did this with Wizardry. It is the only reasonable explanation.

  • Ignatz Mod
    edited May 2015 Posts: 5,396

    What I did to animate my Blender model was to create several separate frames and use Codea to interpolate between them

    Here: http://codea.io/talk/discussion/5946/rigging-3d-models-in-codea-first-animation

  • edited May 2015 Posts: 2,020

    @Ignatz haha, same here! I didn't realise you had done that; did you post it? I saw the post where you essentially built an entire posing and rigging system, with inverse kinematics etc (which is immensely impressive of course), this one:

    http://codea.io/talk/discussion/5946/rigging-3d-models-in-codea-first-animation

    There's also @spacemonkey 's skeletal shader here:

    http://codea.io/talk/discussion/2901/skeletal-animation-shader

    And @matkatmusic , in your 3d rigging thread, discusses the md5 format as another possibility.

    I figured I would try frame interpolation first, as that seemed easiest. I didn't realise you had done frame interpolation as well. Would you be interested in comparing code?

    I've not looked into md5 format yet. If it was just something relatively "simple", like a list of cascading bone transforms, then it could be possible to write an importer and a runtime for it, something similar to @spacemonkey 's shader, where you have an array of transforms in the vertex shader. But if it was something more complex, involving bone weights, real-time kinematics in the runtime etc, that would be immensely complex to code I think, and I doubt it would be practical to use in a game from a performance point of view.

  • Ignatz Mod
    Posts: 5,396

    @yojimbo2000 - I doubt my code would make much sense, there is a lot of it

    As I recall, I used joints as the key elements, like Blender, and simply interpolated between their positions in successive frames, nothing fancy

  • Posts: 2,020

    Yeah, my code is also a complete mess just now, it'll take a day or two to write something up.

  • Ignatz Mod
    Posts: 5,396

    I wrote mine up here.

  • Posts: 2,020

    One thing I need to investigate: at the moment each point is interpolated between 2 keyframes in a linear mix (just using the GLSL mix function, actually). Because the above animation only uses 4 keyframes, it does look a little, well, robotic (not so much from the front, but from the side it's more obvious). In this case, as the model is in fact a robot, that's quite appropriate, but I was thinking I could interpolate using some kind of curve (cubic maybe?), either between 2 or even 3 keyframes. There are hundreds of curve algorithms out there; can anyone recommend one that's good for keyframing and is cheap to implement (as it would be calculated on the vertex shader 100,000 times per frame)?
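
    To make that concrete, here's roughly what the linear version looks like (a minimal sketch rather than my actual shader; the attribute and uniform names are illustrative, and the normals are handled the same way):

    // Linear keyframe blend on the vertex shader: each vertex carries its position
    // in two keyframes, and frameMix (set from Lua each frame) blends between them.
    attribute vec3 position;                // this vertex in keyframe A
    attribute vec3 position2;               // this vertex in keyframe B
    uniform float frameMix;                 // 0.0 = keyframe A, 1.0 = keyframe B
    uniform mat4 modelViewProjection;

    void main()
    {
        // mix(a, b, t) = a * (1.0 - t) + b * t
        vec3 p = mix(position, position2, frameMix);
        gl_Position = modelViewProjection * vec4(p, 1.0);
    }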

  • Posts: 2,020

    I've just read that GLSL's smoothstep does Hermite cubic interpolation, so I'll try that first....
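
    If that works, it should just be a matter of easing the blend factor before the mix (again only a sketch, slotting into the example above):

    // inside main(): smoothstep(0.0, 1.0, t) gives the Hermite curve 3t^2 - 2t^3,
    // so the motion eases in and out of each keyframe instead of changing linearly
    float t = smoothstep(0.0, 1.0, frameMix);
    vec3 p = mix(position, position2, t);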

  • Posts: 2,020

    OK, I've added a Catmull-Rom spline that interpolates between all 4 keyframes in the vertex shader, and now the walking motion is silky smooth! B-) I'll try to do a side-by-side video, and maybe throw in how the animation looks in Blender for comparison.
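
    For the curious, the interpolation function looks roughly like this (a sketch of a standard Catmull-Rom basis rather than my exact shader; position..position4 are the same per-vertex keyframe attributes as in the earlier sketch):

    // Catmull-Rom through four keyframes: t in [0,1] blends between p1 and p2,
    // with p0 and p3 shaping the curve so that successive segments join smoothly.
    vec3 catmullRom(vec3 p0, vec3 p1, vec3 p2, vec3 p3, float t)
    {
        float t2 = t * t;
        float t3 = t2 * t;
        return 0.5 * ( 2.0 * p1
                     + (p2 - p0) * t
                     + (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3) * t2
                     + (3.0 * p1 - p0 - 3.0 * p2 + p3) * t3 );
    }

    // e.g. blending from keyframe 2 towards keyframe 3 in a looping cycle:
    // vec3 p = catmullRom(position, position2, position3, position4, frameMix);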

  • Ignatz Mod
    Posts: 5,396

    Now you're just showing off ;)

  • Posts: 2,020

    @Ignatz who, me? :-\"

    Here's a video comparing linear interpolation to Catmull-Rom. In the side-view Catmull-Rom is noticeably smoother:

  • Ignatz Mod
    Posts: 5,396

    Awesome!

  • Posts: 2,020

    OK, the explanation for this is quite involved, so.... I've decided I'm going to blog it. The introductory post is here; comments and criticism are welcome:

    https://puffinturtle.wordpress.com/2015/05/14/animating-a-3d-blender-model-in-codea-part-1/

    Part 2 will follow shortly, with the code.

  • Ignatz Mod
    Posts: 5,396

    @yojimbo2000 - very interesting, keep going!

    My approach is much closer to the way Blender does it.

    1. I broke the Blender model into pieces (separate upper arms, lower arms, upper legs, lower legs, feet, head, body) in Blender, which was quite messy.

    2. In Codea, I "fitted" a joint to each mesh, essentially a start and end position (and the joints linked together would form a stick figure if you drew them).

    3. I could then move and rotate the joints (within constraints), cascading down the limb, so a rotation of the hip affected the knee and ankle.

    Then I set the joint positions for the animation keyframes, and interpolated between them.

    This runs pretty fast, and is flexible, in that you can create dynamic new positions, hold objects, etc, but there is a fair bit of setup work in breaking up the model and fitting the joints, and creating positions is not easy either.

    I see you have large bulging joints like I did, so that you don't notice the mesh getting distorted or tearing as the joints rotate. Animating a superhero with a skin tight costume must be pretty difficult!

  • Posts: 2,020

    Actually, I think keyframe interpolation would be able to handle a skin-tight costume very well. The reason the joints appear to tear is that you're not taking weighting into account. Each vertex is affected by its bone to varying degrees, and can also be influenced by neighbouring bones, ie a vertex isn't necessarily skinned to just one bone. Have a look at this image of the weighting. You can see that the neighbouring bones also influence the vertices around the joints, which is what keeps the joints round throughout the animation:

    bone weighting

    So the advantage of keyframe interpolation is that you don't have to try to recreate the weighting/ skinning system in a Codea importer and runtime. I can see that you're sceptical (scepticism is good!), I'll try to see if I can work up an example with a much more organic model. Maybe a T-Rex or something.

  • Posts: 455

    The model in my bones example was originally intended to do this, but the mesh was pretty rubbish, with few vertices. Conceptually, though, you have a bones-to-vertex buffer where you store each vertex's weights for a couple of bones. Then it's a single mesh for the whole model, and you pass the set of "bones" matrices to the shader at each call.

    My code is very messy, mainly because at the time I couldn't pass a single array of bones to the shader; I had to pass a uniform for each individual bone. I'm not sure whether Codea has moved forward on that element or not...
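
    In outline, the concept looks something like this (a minimal sketch, not the shader from the thread linked above; the attribute and uniform names are just illustrative):

    // Sketch of linear blend skinning: each vertex stores which bones affect it
    // and by how much, and the bone matrices for the current pose come in as uniforms.
    attribute vec3 position;
    attribute vec2 boneIndex;               // indices of the two influencing bones
    attribute vec2 boneWeight;              // their weights (should sum to 1.0)

    uniform mat4 bones[8];                  // one transform per bone for this pose
    uniform mat4 modelViewProjection;

    void main()
    {
        vec4 p = vec4(position, 1.0);
        // weighted sum of the two bone transforms applied to this vertex
        vec4 skinned = boneWeight.x * (bones[int(boneIndex.x)] * p)
                     + boneWeight.y * (bones[int(boneIndex.y)] * p);
        gl_Position = modelViewProjection * skinned;
    }

    (As noted above, in Codea at the time the bone matrices had to be passed as individual uniforms rather than as an array; the array here is just to keep the sketch short.)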

  • Ignatz Mod
    Posts: 5,396

    I separated the meshes so I could rotate them individually, with no need to manipulate individual vertices. This is fast and doesn't require any weight calcs, but it creates gaps between meshes as they rotate; as long as you have bulky joints, though, it's not obvious.

  • Posts: 2,020

    @spacemonkey no, shader language 1.0 doesn't allow vertex attributes to be arrays. So rather than attribute vec3 position[4] you have to write

    attribute vec3 position;
    attribute vec3 position2;
    attribute vec3 position3;
    attribute vec3 position4;
    

    I'll post my code soon.

  • edited May 2015 Posts: 2,020

    As promised, something a little more organic. I give you:

    CAPTAIN CODEA!

    This guy is 5900 vertices, so around a sixth of the vert count of the robot in the first video.

    One thing I've discovered: animating a run cycle is way harder than a walk cycle. I think you'd need more than 4 keyframes, because the extremes of the motion, head at highest point, arms and legs at full extent etc, don't all coincide like they do with a steady walk.

  • Ignatz Mod
    Posts: 5,396

    B-)

  • Posts: 2,020

    Part 3 is up, with discussion of how spline interpolation methods can be used on the vertex shader:

    https://puffinturtle.wordpress.com/2015/05/16/animating-a-3d-blender-model-in-codea-part-3/

    comments, criticism, questions welcome.

  • Posts: 257

    Wonderful, awesome :)>-
    It makes me think of the steps at Cannes.
    It's very, very fast:
    60 FPS with 1 character
    18 FPS with 60 characters, we are in the Matrix
    ^:)^
    i like panz but why there is no pany
    I find the hands a little big, and I don't see eyes.
    I can imagine him climbing stairs.
    Very, very good work for FPS games.
    Thanks for sharing

  • Posts: 257

    especially fighting games

  • Posts: 257

    With an iPad Air and iOS 8.1, latest Codea:
    limit of 16 characters at 60 FPS (which is more than enough)
    limit of 30 characters at 30 FPS
    and 44 characters at 24 FPS

  • Ignatz Mod
    edited May 2015 Posts: 5,396

    @yojimbo2000 - I thought you might be able to do something like this in the vertex shader

    //all the uniform, varying declarations as before
    
    //before main, do this
    //it will be executed before OpenGL loops through the vertices, ie just once
    //create a function for each position buffer
    vec4 P1() {return position;}
    vec4 P2() {return position2;}
    //etc
    
    //assign the function to a pointer
    vec4 Point;
    if (no==1) Point = P1;
    else if (no==2) Point = P2;
    //etc
    
    //then in the main function, for each vertex
    //just use Point() to get the position value
    

    The idea is that if you can set a pointer to one of the 4 position attributes on entering the vertex shader, before looping through the vertices, you only do the if tests once.

    The problem is that it doesn't work (NB not just because my example above may have typos). It doesn't seem possible to assign a function pointer inside a shader.

    EDIT - pointers don't work in OpenGL ES, unfortunately.

    I got excited when I read about OpenGL "subroutine" functions that specifically do the above, but they are for OpenGL 4.0+.

  • Ignatz Mod
    Posts: 5,396

    @yojimbo2000 - there's another kludge you can try that will work: have a different version of your vertex shader for each animation position, and assign the correct one at each frame. That should be very fast.

  • Ignatz Mod
    Posts: 5,396

    With respect to multiple textures, the problem with having to combine them into a single spritesheet is, of course, that all the texture coordinates are messed up.

    There is a way round this, which I wrote about here. It allows you to define your texture coordinates as though they were separate images, and all you need to do is pass through a vec4 to the shader to give it the offsets on the spritesheet.
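
    The gist is something like this (a rough sketch rather than the code from that post; the uniform name and layout are just illustrative):

    // Vertex shader sketch: texture coordinates are authored as if each part had
    // its own image; texRect maps them into that part's rectangle on the spritesheet.
    attribute vec4 position;
    attribute vec2 texCoord;
    uniform mat4 modelViewProjection;
    uniform vec4 texRect;                   // xy = offset of the sub-image, zw = its scale
    varying highp vec2 vTexCoord;

    void main()
    {
        vTexCoord = texRect.xy + texCoord * texRect.zw;
        gl_Position = modelViewProjection * position;
    }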

  • Posts: 2,020

    @Ignatz thanks for those suggestions. Yeah it's a shame about pointers not being possible in SL1.0.

    I don't think the shader switching suggestion would work though, because the data that is in the custom attribute buffers for the extra frames seems to get erased when you switch shader (even if the shader you're switching to has the same custom buffers defined). I tried switching between the linear and spline shader on the fly, and it produced very strange results.

    Also, I think it would be useful to be able to interpolate between two arbitrary frames. Say the player presses the kick button mid-way through the walk cycle: in the interest of a quick response, you don't want to wait until the end of the walk cycle, but launch the kick straightaway, so you blend straight from walk-frame 2 into kick-frame 1 or whatever. So I think it's good not to have the actual transitions hard-coded into the shaders, because you would end up with too many permutations.

    btw, is there any point using else if when you have a series of if (x==1) return x1; statements? My guess is no, because if a condition is true, you return without evaluating the rest of the conditions, right?
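
    (For concreteness, the kind of lookup I mean is below; the names are illustrative.)

    // one plain if per keyframe; each branch returns immediately, so adding
    // else would not change the result
    vec3 keyframe(float n)
    {
        if (n == 1.0) return position;
        if (n == 2.0) return position2;
        if (n == 3.0) return position3;
        return position4;
    }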

  • Posts: 2,020

    @hpsoft thanks, I'll share the Blender model soon. It's not mine, but it's public domain so it should be OK. If I were modelling it myself, for most games I wouldn't bother with individual fingers on the hands (I'd just go with "mittens"), and would put more vertices on the head. It's still 5904 verts (mid-poly?), but the head looks quite lo-poly; I don't think the verts are distributed very well.

  • Ignatz Mod
    Posts: 5,396

    @yojimbo2000 - I think the way you've done it is fine. However, if you're going to include multiple animations in there, not only will you have bucketloads of vertex positions, but also more if statements to decide which animation to use. I don't think it's going to scale well.

  • Posts: 2,020

    @Ignatz yes the lookup functions are going to get long. And I need to check if there's a limit on the number of attribute buffers you're allowed. I was thinking about writing a kind of shader printer, similar to the one you made for different types of lighting. It would create shader strings with enough attribute variables for the number of key frames that the model requires.

  • Posts: 2,020

    Ok, regarding scaling this technique: apparently GLES 2.0 on iOS only allows 16 vertex attributes. I've tested this, and as soon as you declare more than 16 there's no shader compile error, but setting the attribute stops working. I tried setting gl_MaxVertexAttribs to a higher number, but you get this error: ERROR: 0:2: Regular non-array variable 'gl_MaxVertexAttribs' may not be redeclared

    So, that means a maximum of 7 key frames with this method (7 positions + 7 normals + 1 colour + 1 texCoord = 16 attributes).

    You could try packing two vectors into one variable (placing them either side of the decimal point, unpacking them with modf). First, though, I'm going to experiment with setting the entire vertex table every few frames.
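
    Roughly, the packing idea would look like this (an untested sketch; GLSL ES 1.0 has no modf built-in, but floor and fract do the same job, and float precision limits how finely each half can be quantised):

    // each component of a packed attribute carries two values: the integer part
    // is the first value quantised to 0..1023, the fractional part is the second
    // value scaled into [0, 1)
    vec3 unpackHi(vec3 packedPos) { return floor(packedPos) / 1023.0; }
    vec3 unpackLo(vec3 packedPos) { return fract(packedPos); }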

    It's one more factor in favour of GLES 3.0 as a Codea feature request.

  • Ignatz Mod
    Posts: 5,396

    @yojimbo2000 - or you could try the superior (#ahem#) method of using bones 8-X :))

  • Posts: 2,020

    I don't fancy trying to write an importer for one of the formats that export the bone data and animation, though. I did find an importer for the .fbx ASCII format written for LÖVE, so that could be a candidate. By the way, now that we have UTF-8 support in Lua 5.3, does that mean we could write parsers for files in XML format?

  • edited May 2015 Posts: 502

    You could parse XML before.

  • Posts: 2,020

    It turns out that uploading new frames to the mesh buffers isn't as expensive as I thought it would be, and can be done on the fly. I reckon the spline interpolation is still best done in the shader, on the GPU. Here's a rig with a couple of different actions: the walk cycle and a high kick. The action's frames are loaded into the shader when the action starts.

    I'm still cutting my teeth when it comes to animation, and my research into martial arts didn't stretch beyond googling an image of a karate kick.

    I'll blog this at some point.

    Captain Codea discovers his violent side:

  • Posts: 2,020

    OK, the code for the above video is now up, along with an explanation in the blog:

    https://puffinturtle.wordpress.com/2015/05/26/animating-a-3d-blender-model-in-codea-part-4/

    Up next: a toon shader!

    toon

  • Ignatz Mod
    Posts: 5,396

    It's hard work, isn't it? I can't imagine creating a full sequence of actions.

    And that is just one character!

    It struck me that when creating enemy characters, most of the time they are running straight at you, so you don't really need 3D for that. A spritesheet would do.

  • Posts: 688

    @Ignatz - a lot depends on the game you're trying to create, but the billboard approach (like you suggested) worked brilliantly in Doom (and all the other games of the same era). I bet Codea could handle a Doom clone without too much trouble.

  • Posts: 2,020

    It also depends on the level of realism of whatever art style/game genre you're going for. ie the original Prince of Persia had all sorts of ways that you could interact with the environment, and went for full rotoscoped realism. That would be an insane amount of work, no matter whether you were implementing it in 2D or 3D. But more arcade-y games have way less animation. Something like Oceanhorn, for instance: the characters just have walk/attack/stand animations (ie the same three actions as above), and that's it. And it's one of the most gorgeous, graphically acclaimed games on iOS.

    Billboarding would work fine for a retro FPS, but it's not any substitute for this kind of fluid animation.

    I agree that getting assets out of one package into another is always a somewhat laborious round trip (this is equally true with 2D assets though, like a spritesheet).

    On the plus side though, the animation editor is actually one of the nicest and more intuitive parts of the Blender interface (not that that's saying much). A walk cycle takes only a few minutes to create. Blender, when you take into account that it's free, multi-platform, and has enormous support in terms of tutorials, models online, import/export scripts etc, is the best tool for the job as far as I can see (even if the UI is irritating at times). I know of no comparable system for 2D animation (the $60 version of Spine is probably closest).

    The alternative to the Blender/Spine skeleton-rig approach is old-school, hand-drawn animation with onion-skinning etc. That's hard work, and requires actual animation skills (which I don't have). If I were creating a strictly 2D game with fluidly animating characters, with the tools and skills available to me, the easiest way would be to project the kinds of 3D animations shown above orthogonally, rather than trying to create a spritesheet.

  • Is there a definitive link to the most current project code? There are lots of links in the post and I'm not sure which of them is the latest, if any.

  • @yojimbo2000 Ok I'm dumb, it's totally easy to figure out how to get the latest code.

    Dayummmmmm this is sweet. Has anyone used it in a game or project?
