Skywatcher Music [screencast/video]

Some of you may remember that some time ago, I made a vow to produce a music video for one of the tracks by Sheffield composer Skywatcher Music.
Here’s a couple of short progress videos I’ve made, centred around the orange Broadstadium3D models from 3D Warehouse.

Today, I finally completed the first, and actually had enough screencast footage for a second video, so why not?

Skywatcher Music – Chrysippus Allegedly Died Of Laughter

The second video is all screencast, and starts with me completely breaking the Adsetts Building (though I didn’t realise that’s what I was doing at the time): in the preview image above, you can see that it’s too big and floating off the ground.

Skywatcher – Platysma

Update: Walkway [freestyle]

This turned out to be a happy accident.
By the time I realised I had left Freestyle strokes enabled for the Terrain, it had already been sent to the SheepIt render farm. When I first saw the result, I thought it was ruined, but I let the render complete to see how it looked when animated, and was pleasantly surprised:

What Freestyle has done is trace where the terrain ends at a 90-degree angle (this will be where a building sits) and intersects with the ground: these are the white lines.

Freestyle rendering is often used for toon/comic-style artwork, but it’s also useful for seeing perspective and the general flow of the mesh. Here I’d experimented with setting each object to a different colour: see if you can tell which colour is which object by looking at the shape of the mesh and the colour of the Freestyle strokes.

As you pass through the walkway, the perspective shifts: this could be a useful method of displaying an in-game map, or at least stylised imagery that looks like a game.
As every frame is different, it really does give the illusion that the lines are shifting, and with an 80s colour scheme it would almost look Outrun retro.

To the right you can see the chequered floor I built in yesterday’s screencast, and the ribs of the glass Winter Gardens. We’re now in the middle of the Freestyled terrain, and can even see the elevation below the top arm of the walkway.

Only another 3,297 frames to go before the Skywatcher Music composition is complete!
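Those remaining frames don’t amount to much screen time. A quick back-of-the-envelope check in Python (assuming the 24 frames per second playback rate I’m using):

```python
# Rough screen-time estimate for the remaining frames,
# assuming playback at 24 frames per second.
FPS = 24
frames_remaining = 3297

seconds = frames_remaining / FPS
print(f"{frames_remaining} frames ~ {seconds:.0f} s ~ {seconds / 60:.1f} min of video")
```

So over three thousand frames still only buys a couple of minutes of finished footage.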

Something new!

I’m well aware that I’ve been spending the past few days on the same area, so wanted to start on something a bit newer.
I’ve added a grass texture to all the vegetation areas:

It doesn’t look perfect, but it’s an improvement on plain green.

Update: SheepIt render farm results

I’ve been quiet lately, but that’s because I’ve been working on a video in the background and not taken much time for modelling any new elements.
I’m quite excited to have been given permission to use a track by a very talented local musician, Skywatcher Music, who composes some beautiful music. I won’t say which track I’m using (I’ll leave some element of surprise), but the one I have in mind captures the vibe and emotion perfectly.

I wanted to share some imagery of the frames I’ve had rendered with the open-source render farm, SheepIt.

These may look familiar if you’ve been here before, and it may look as if I’m reusing images: I promise you I’m not. For the sake of the website, these are a quarter of the size of the actual frames. When we watch these frames joined together, they’ll play at 24 frames per second, to give the illusion of movement to our eyes.
I’ve also displayed them out of order so there are no spoilers for the actual video when it’s finished.

The unusual shape of the images is because I’m experimenting with a resolution of 2580×1080, which is a 21:9 ultra-widescreen format, whereas most of the time when we see widescreen in HD it’s 1920×1080, which is 16:9.
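For the curious, the ratio arithmetic is simple. This quick sketch (plain arithmetic, not pulled from the render settings) shows that the 2580-pixel width actually works out at roughly 21.5:9, so 21:9 is the nearest common label:

```python
# Aspect-ratio arithmetic for the two resolutions mentioned above.
for w, h in ((2580, 1080), (1920, 1080)):
    ratio = w / h
    print(f"{w}x{h} -> {ratio:.3f}, i.e. about {ratio * 9:.1f}:9")
```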

Rendering for the solitary [sheepit / cloud computing]

A half-completed render

Rendering, it’s a waiting game isn’t it?
I’d wanted to produce a video of what the city looks like in its actual state – show you it how it’s meant to look, instead of dots and lines.
I’m still not sure that this will even be finished by tonight; it’s a long process, especially on a single machine. The caveat is that I also cannot work on the machine while its system resources are being drained by the render.

Cloud computing

So I got thinking: are there any free cloud computing options, so I can just remotely connect to and use a big company’s computing power instead of my own? It turns out there are, from the tech giants you might expect.

First, though, I want to explain what I wanted to do with one once I got my hands on it. My first thought was to run SheepIt, ‘a free distributed renderfarm for Blender.

What is a render farm?
A render farm is a group of computers connected together to complete a large task. In the case of 3D rendering, most of the time a render farm will distribute frames of an animation to multiple computers. Instead of having a single computer work for 100 days, you can have 100 computers work for 1 day.’
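The trade-off quoted there is straightforward division: the total machine-time stays the same, but the wall-clock time shrinks with the number of machines. A minimal sketch (it ignores upload/download overhead and scheduling gaps, which matter in practice):

```python
# Toy model of render-farm speedup: the total work is fixed,
# and wall-clock time divides by the number of machines.
# Ignores transfer overhead and scheduling gaps.
def wall_clock_days(total_machine_days: float, machines: int) -> float:
    return total_machine_days / machines

print(wall_clock_days(100, 1))    # one computer: 100.0 days
print(wall_clock_days(100, 100))  # a hundred computers: 1.0 day
```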

SheepIt running in a shell (front) and my user profile behind it, showing my node’s
participations in the queue (you can see why this particular user’s scene will take hours)

With SheepIt, new users must render at least 10 frames before they can submit their own project to the queue, and other people rendering your scene uses up your points; of course, you can set your own node(s) to render your project before everybody else’s in the queue.

Having this process running on your machine uses as much system resource as actually rendering a 3D scene, so it was a choice of one or the other: if I run them together, they crash. That makes rendering those 10 random users’ frames so I can upload my own quite time-consuming.
In the screenshot above, the process is actually running on an Amazon computer rather than my own, so it doesn’t drain my machine in any way.

Meanwhile, on my own computer:

18 more frames to go in this sequence; the last took 6:32 (minutes:seconds), which is about average across this scene, so to work out when it will complete:
6.5 × 18 = 117 minutes, almost 2 hours.
18 frames is only about three-quarters of a second of animation at 24 frames per second, which is why this update has a lot of writing!
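Doing that estimate precisely, rather than rounding 6:32 down to 6.5 minutes:

```python
# Estimated render time remaining: 18 frames left,
# the last frame took 6 min 32 s.
last_frame_s = 6 * 60 + 32   # 392 seconds
frames_left = 18

eta_min = last_frame_s * frames_left / 60
print(f"~{eta_min:.1f} minutes remaining ({eta_min / 60:.2f} hours)")
```

Which still comes out at just under two hours, so the rounding didn’t hurt.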

Without giving too much away, here’s a single raw frame.

I’ve not done any colour adjustment to this; that’ll all be done later, when the individual frames have been composited together into a video.

Cloud Services

Cloud computers are, in a nutshell, online computers you can just switch on and borrow for a bit (or for a cost). These can be standard setups, like your average gaming PC, or entire systems with more or less unimaginable amounts of resources for incredibly complex computations.
There’s so much you can do with them, I’ll not even begin to go into anything beyond the scope of this post.
For now, I think I’ll just keep it simple.

I’m using Google Cloud and Amazon Web Services free accounts for this. I’d expand more upon the two services, but I think they have quite enough reach on the internet without me giving them more.

Using Google Cloud, a frame rendered in 2 seconds.

Amazon Web Services took 34 seconds, but let’s bear in mind these are free offerings, and we shouldn’t draw comparisons between differing setups and the availability of paid services.

Scrap that, they’re blank

It half worked, they came out blank.
No wonder they were so fast!

I believe this to be because I used the Filmic colour profile for its realism; SheepIt tried to render with Blender 2.79b, which is vastly different from version 2.80 onwards, so it wouldn’t have been compatible.

I’ve re-added it to render with 2.81a; let’s see what happens:

This looks more positive, like it actually has something to do now.

My Google Cloud instance seems pretty insistent on rendering this project even though it’s supposed to work on my own projects first – but eh, this looks fun – I’ll let it.


This became a problem.
It would only work on this user’s project, and not my own.

I later found that the reason for this is that the free cloud instances weren’t GPU-enabled, so they couldn’t take on my task and were only picking up projects they could handle. I will look further into this another time.

But for now, I’m going to let it run overnight and hope some kind, anonymous internet stranger renders my scene for me, and render the next segment the traditional way:

… and as soon as I updated: