# Shoebox - my virtual hand-drawn, hand-coded live band

By Michael Forrest

I wrote this song and then made this realtime animation engine for "virtual live performances" so my song could be played by some funny stylized characters. I hand-coded and hand-drew just about every element during this 12-month project.

![]()

The video above is the first render of the results, but I have designed the system so that different songs and animations can be used without too much trouble.

## Research + laying out the stage

First I looked at a few live performances on YouTube to see how they were lit and shot. I listened to all the songs on my (not yet released) album to work out what instrumentation I'd need if a band had played them, and laid out a stage with four areas: a singer (me), a drummer, a bass player and synth/keys.

This stage started feeling a bit too big, so I decided to make a smaller one for my first videos.

*Half-size stage*

## Modelling the instruments

I defined most of my models as a single folded sheet with two transparent "cheeks", one on each side.

![]()

This meant I could describe an instrument with a small amount of code.

I wanted these things to be printable, so I created detailed vector artwork for every instrument. Here's a close-up of the modular synth artwork.

*Photograph vs my drawing on a pixel grid*

As you can see, everything is hand-drawn and hand-coded - no dae files or binary file formats (well - apart from the images…). All these textures went into a big hand-made texture atlas.

*Texture atlas for all the instruments I created - laid out by hand*

I described the coordinate offset for each element in the atlas and used the same data to map the textures onto the 3D models.

## Modelling the band

My sister made me a little cardboard cutout a few years ago, so I started with her artwork.

*Original Michael Forrest cubee by Ann Forrest*

I decided to make up my band with three other characters: a frog on synths, a mandrill called Barry on bass and a Minecraft cow on drums. I think Barry's feet gave me the most trouble.

*Barry's feet*

I constructed these characters with a minimum of code specifying their layout - they're just box primitives really. I attached the face layers afterwards - note the transparent areas above and below to allow protruding beards and suchlike.

The four band member textures are in one atlas, and the only thing different between each band member is its vertical offset in the atlas.

*Texture atlas for the band members*

## Lights

To make a compelling stage show I first needed to learn how to light a stage. I did some research and picked a style and some colours.

*Lighting research and chosen colour scheme*

I modelled the lighting enclosures and rigging. It wasn't too much work once I based it on a lathe primitive.

I needed to map out a song structure to define all the camera movements, lighting and animations I wanted. I wrote scripts to convert the data from this spreadsheet into something I could work with in code. Here's the final data file for the arrangement - there's not much to it considering how much work it's doing. This is essentially all the data required to render a performance of one song.

I wanted minimal data but natural movement. I thought I'd need a multi-touch tool to capture animations, but once I started defining them I realized I could pose each part individually to bring the animations together step by step. Here's a video of me quickly demonstrating my animation tool.

I saved these animations to a database and generated a JSON file that could be referenced by the front-end code.

There are three main controllers responsible for a performance - Camera, Animation and Lighting. The camera controller interprets the different "shot" types into spatial coordinates and animates the camera based on the minimal info provided. For example, here's how a camera 'orbit' movement is described.

![]()
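The "single folded sheet with two cheeks" approach of describing an instrument with a small amount of code might look something like this minimal sketch (the type, field names and numbers are my assumptions for illustration, not the real Shoebox code):

```typescript
// Hypothetical data-driven instrument description: one folded sheet
// plus two transparent side "cheeks". All field names are illustrative.
interface FoldedSheetModel {
  name: string;
  width: number;    // sheet width in scene units
  height: number;   // sheet height
  depth: number;    // how far the sheet folds forward
  atlasKey: string; // which texture-atlas region to apply
}

// The two transparent cheeks are side planes sized from the same data
function cheekSize(m: FoldedSheetModel): { width: number; height: number } {
  return { width: m.depth, height: m.height };
}

// An entire instrument described in a few lines of data (made-up numbers)
const modularSynth: FoldedSheetModel = {
  name: "modular synth",
  width: 1.2,
  height: 0.8,
  depth: 0.4,
  atlasKey: "modularSynth",
};

const cheek = cheekSize(modularSynth);
```

The appeal of this kind of scheme is that the same handful of numbers can drive both the 3D geometry and a printable, unfolded template.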
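The coordinate-offset idea - one hand-made table of atlas positions driving the texture mapping on the 3D models - could be sketched as follows (the atlas size and entry coordinates are invented for illustration):

```typescript
// Hypothetical texture-atlas lookup: each element's pixel offset and size
// in the hand-made atlas, normalised into 0..1 UV coordinates for a model.
interface AtlasEntry { x: number; y: number; width: number; height: number }

const ATLAS_SIZE = { width: 1024, height: 1024 }; // assumed atlas dimensions

// Example entries; the real coordinates were laid out by hand.
const atlas: Record<string, AtlasEntry> = {
  modularSynth: { x: 0, y: 0, width: 256, height: 128 },
  bassAmp: { x: 256, y: 0, width: 128, height: 128 },
};

// Convert a pixel-space entry into UV bounds for texture mapping
function toUV(e: AtlasEntry) {
  return {
    u0: e.x / ATLAS_SIZE.width,
    v0: e.y / ATLAS_SIZE.height,
    u1: (e.x + e.width) / ATLAS_SIZE.width,
    v1: (e.y + e.height) / ATLAS_SIZE.height,
  };
}

const synthUV = toUV(atlas.modularSynth);
```

The same trick would cover the band members: four identical entries that differ only in their vertical (`y`) offset in the atlas.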
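The post doesn't reproduce the JSON schema, but "essentially all the data required to render a performance" of one song might be shaped roughly like this - every field name and value below is a guess for illustration, not the actual file format:

```typescript
// Hypothetical per-song performance data: sections with a camera shot,
// a lighting preset, and a named animation clip per band member.
interface Section {
  name: string;
  bars: number;                       // section length in bars
  shot: string;                       // camera shot type, e.g. "orbit"
  lighting: string;                   // lighting preset name
  animations: Record<string, string>; // band member -> animation clip
}

interface Performance {
  bpm: number;
  sections: Section[];
}

const song: Performance = {
  bpm: 120,
  sections: [
    { name: "intro", bars: 8, shot: "wide", lighting: "blue-wash",
      animations: { barry: "groove", frog: "sway" } },
    { name: "verse 1", bars: 16, shot: "orbit", lighting: "spot-singer",
      animations: { barry: "groove", frog: "play-keys" } },
  ],
};

// Derive the running time, assuming 4 beats to the bar
const totalBars = song.sections.reduce((n, s) => n + s.bars, 0);
const seconds = (totalBars * 4 * 60) / song.bpm;
```

A structure like this stays tiny because each controller expands its own field - the shot name, the lighting preset, the clip names - into the actual frame-by-frame behaviour.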
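One way the camera controller might expand a minimal "orbit" description into spatial coordinates each frame - a sketch under assumed parameter names, not the real implementation:

```typescript
// Hypothetical "orbit" shot: a few numbers expanded into a circular
// camera path around a target point. All parameter names are assumptions.
interface OrbitShot {
  target: [number, number, number]; // point the camera looks at
  radius: number;                   // horizontal distance from target
  height: number;                   // camera height above target
  degreesPerSecond: number;         // orbit speed
}

// Camera position at time t (seconds) along the orbit
function orbitPosition(shot: OrbitShot, t: number): [number, number, number] {
  const angle = (shot.degreesPerSecond * t * Math.PI) / 180;
  const [tx, ty, tz] = shot.target;
  return [
    tx + shot.radius * Math.cos(angle),
    ty + shot.height,
    tz + shot.radius * Math.sin(angle),
  ];
}

const shot: OrbitShot = {
  target: [0, 0, 0],
  radius: 5,
  height: 2,
  degreesPerSecond: 90,
};
const start = orbitPosition(shot, 0);
```

The camera would then be pointed at `shot.target` every frame, so four numbers are enough to describe the whole movement.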