Thursday, May 07, 2020

Rise of the TMNT clips

I was a Senior Rough Animator on "Rise of the Teenage Mutant Ninja Turtles". Here are a few clips from some episodes that I worked on:



Monday, March 02, 2020

Some Thoughts on Hand Drawn Animation and A.I. Generated In-Betweens.


I've been giving a lot of thought to the AI-driven in-betweens (IBs) that have been making the rounds lately on the web. A few of them show animated loops taken from the Street Fighter video game series. The AI has generated IBs for these cycles, creating a far smoother effect than the designers and animators originally intended. Of course, the source material is well known and beloved, so it captures people's attention and consequently draws a negative reaction. Personally, I feel these AI-enhanced loops are floaty and lack timing. The effect calls too much attention to itself and undermines the careful planning the animator has put into the work.


That said, I feel that this process can aid the future of hand-drawn animation and help keep it relevant in the face of ever-changing technology. This is a topic I've been mulling over for years now, and these new AI-generated GIFs are giving me more food for thought. To understand my reasoning, let's first take a look at our current standard frame rates.


Currently in the USA, the standard for film projection is 24 frames per second. This is the baseline for how we perceive hand-drawn animation. Of course, most of the time we work on 2s or 3s (holding each drawing for two or three frames), but 1s are still not out of the question when an action calls for more fluid movement.
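The arithmetic behind working on 1s, 2s, and 3s is simple enough to sketch. Here's a quick Python illustration, just my own back-of-the-envelope math, not anything from a production pipeline:

```python
# How many unique drawings appear per second at 24 fps when
# animating "on Ns" (each drawing held for N frames)?
FPS = 24

def drawings_per_second(fps, hold):
    """Unique drawings shown each second when holding each drawing for `hold` frames."""
    return fps / hold

for n in (1, 2, 3):
    print(f"on {n}s: {drawings_per_second(FPS, n):g} drawings per second")
# on 1s: 24 drawings per second
# on 2s: 12 drawings per second
# on 3s: 8 drawings per second
```

So at a 60 fps target, even working on 2s would mean 30 drawings per second, which is more drawings than full 1s animation at 24 fps.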


If we take a look at broadcast in the USA, the frame rate is 30 frames per second (actually 29.97, but let's not get too wrapped up in the details here). We can still animate at 24 fps and get an acceptable result when converting the footage from 24 fps to 30 fps. However, there was once a time when video interlacing was an issue. Modern display technology has solved that with progressive scan displays. I can remember a time when progressive scan TVs were not common; in fact, progressive scan was an expensive feature. Now, progressive scan is commonplace. Display tech marched on, and we all have better images for our eyeballs to feast on because of it.
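The 24-to-30 conversion I'm describing is classically handled with a 3:2 pulldown. Here's a simplified, frame-level Python sketch of the idea; real pulldown distributes interlaced fields rather than whole frames, so treat this as an approximation of the pattern, not the broadcast process:

```python
# Frame-level sketch of 3:2 pulldown: stretch every 4 film frames
# across 5 video frames, so one second of 24 fps film fills 30 video frames.
def pulldown_24_to_30(frames):
    out = []
    for i, frame in enumerate(frames):
        out.append(frame)
        if i % 4 == 0:
            # Every 4th source frame gets shown one extra time: 4 in -> 5 out.
            out.append(frame)
    return out

film_second = list(range(24))        # one second of film frames
video_second = pulldown_24_to_30(film_second)
print(len(video_second))             # 30
```

No new images are invented here; existing drawings are simply held a little longer, which is why pulldown doesn't produce the floaty look that synthesized frames do.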


I've seen comparisons of AI-generated IBs to the motion interpolation on modern TV sets, which synthesizes extra frames to match a higher refresh rate. This creates a "floaty" or "odd" look in not just animation but live action as well. I feel this is because the AI is inserting frames into footage where none existed before, so we lose a certain timing that we are accustomed to. However, what if the footage is shot at 60 frames per second and then played back at its appropriate speed? Do we feel it is "floaty" or "odd" then? I feel the answer is no. The footage is being played back at the speed it was intended, and we perceive a more solid and defined frame rate.
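To see why inserted frames can feel floaty, here's a deliberately naive Python sketch of interpolation that just crossfades between neighboring frames. Real AI interpolators are far more sophisticated than this, but the underlying issue is the same: the new frames fall evenly between the originals instead of being drawn with intent.

```python
# Naive frame interpolation: synthesize each in-between as a plain
# linear blend (crossfade) of its two neighbors.
def blend(a, b, t=0.5):
    """Linearly interpolate two frames (lists of pixel values)."""
    return [pa * (1 - t) + pb * t for pa, pb in zip(a, b)]

def double_frame_rate(frames):
    """Insert one blended frame between every pair of original frames."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(blend(a, b))   # synthesized in-between
    out.append(frames[-1])
    return out

clip = [[0.0], [1.0], [0.0]]      # a one-pixel "bounce": up then down
print(double_frame_rate(clip))    # [[0.0], [0.5], [1.0], [0.5], [0.0]]
```

Notice the result eases evenly in and out of the peak. An animator might have favored the drawing near the top of the arc or snapped hard into the hit; the interpolator has no idea, so it smooths everything uniformly.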


It is not inconceivable that display and projection technology will continue to advance to 60 fps and beyond. Heck, I just saw in my Google feed this morning that id Software boasts up to 1000 fps (yes, one thousand) for DOOM Eternal, arguing that future rigs will be able to get more bang for your gaming buck out of it. Like progressive scan, the move from SD to HD, and now super affordable 4K and 8K television sets, the technological march toward better and better display and projection tech will inevitably move forward.


Where does this leave hand-drawn animation? How can we expect to fill that many frames? Of course, one answer is simply to hold the frames for as long as need be, and that can certainly be a useful means of filling the gap, so to speak. However, in more fully animated productions, that has its limits. What if a 2D character needs to match a moving 3D element that runs along at 60 fps?

I feel this is where AI-driven IBs can help. To illustrate this point, I'd like to bring up Netflix's "Klaus" and the revolutionary volumetric lighting on its characters. But first, let's look back to the late '80s and a similar production that required soft lighting on its characters.


In order to create the lighting needed to make the 2D animated characters match the live-action footage in "Who Framed Roger Rabbit", animators had to painstakingly paint a series of mattes on cels by hand to create the desired effect. Needless to say, it was an amazingly time-consuming and expensive process.


Today, an intelligent matte toolset that can snap to key frames based on nearby color values and then interpolate the in-between mattes allowed the Klaus production to not only achieve the effect in a fraction of the time, but also to tweak the lighting color as needed. By no means am I downplaying the amount of skill, time, and effort that went into the Klaus production. Quite the reverse! The tech allowed the artists to create an amazing and complex effect at a scale that was not possible before in my lifetime. I think that's incredible!


Moving forward, how can AI-generated IBs allow hand-drawn animation to continue to thrive in a world where display and projection technology is rapidly changing? How can these controls be put directly in the animator's hands in a meaningful way so that we get a desirable result? I don't know the answer to any of these questions. I'm not that smart! Additionally, I don't know if I'll ever see any of this in my lifetime, but I love thinking about the future of hand-drawn animation. I think AI-driven IBs have a real place in the industry as a tool to assist the hand-drawn animator. We're just not there yet. The tech is in its infancy, but it is fascinating to think about!