A quick video where I take a portrait of myself that my girlfriend drew and painted, and animate it with EbSynth.
Hello, this is the first video for a while, and I wanted to share a portrait of me that my girlfriend had painted, which I really like – it’s very Van Gogh. I also wanted to show you how to quickly animate some video footage in the same style using a package called EbSynth.
The first thing you’ll need is some video footage. I’ve tried to replicate the angle and basic shapes of the picture for the best effect when I come to animate it. I import the footage into Blender, crop out any jumpy movement at the beginning and move my head around a little.
Once I have a short clip, I’ll render it out as a PNG sequence. For the sake of consistency, I’m going to render it at the same resolution as the original image, 1600×1474. That might sound like a weird resolution, and it is – but I’ll scale it accordingly to fit in the video you’re watching now.
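As a quick aside, the scaling step is just an aspect-ratio fit: find the largest scale at which the frame fits inside the target resolution. A minimal sketch (the 1920×1080 target is my assumption for illustration, not something fixed by the project):

```python
def fit_within(width, height, max_width, max_height):
    """Scale (width, height) to fit inside (max_width, max_height),
    preserving the aspect ratio."""
    scale = min(max_width / width, max_height / height)
    return round(width * scale), round(height * scale)

# The 1600x1474 frames scaled to fit inside a 1080p video:
print(fit_within(1600, 1474, 1920, 1080))  # → (1172, 1080)
```

Here the height is the limiting dimension, so the odd 1600×1474 frames end up pillarboxed at 1172×1080.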
Once I have my frames all rendered in PNG, I’ll import them all into EbSynth in the Video section, and the single image into the Keyframes section. Choose a directory, and Synth. It might take a while, so I’ll pause the video here.
It has been way too long since I’ve worked on this, and a trip into the city centre has inspired me to pick up where I’d left off with the project: which so far has amounted to work on a few buildings, and no actual gameplay implementation… yet.
Naturally, I will have to carefully consider the plot-line – masked protesters amidst a war within the city doesn’t feel dystopian anymore, it’s practically a reality.
This is a short video demonstrating the process used to import the base character from the Armory first-person template project into the 3D representation of the map. Really, it’s a compilation of the past week or so’s screen recordings that I hadn’t done anything with.
In between these two videos, I’d also done a calibration live-stream, in preparation for future development streams, in which I had one mystery viewer observing the whole show, and I have no idea who it was!
I haven’t embedded it here because it isn’t so integral to the story that it needs embedding, and if I were to episodify them all, this would be the pilot episode, or a dress rehearsal: feel free to have a watch, though.
Colour analysis of some terrible low-resolution video footage of last night’s Uber-Storm over Sheffield. The media had predicted two months’ worth of rainfall within a few hours. It was a very intense thunderstorm.
I’ve done some colour analysis so you can see a breakdown of the chaos within the clouds. Music: an actual thunderstorm (and some weird, unexplainable blips).
There’s a few seconds of some footage of an Assetto Corsa replay where I’d tried to replicate my friend Jme’s Volkswagen Polo – and _then_ the lightning.
I’d tried to grab some georeferenced data of The Barge in Grimsby, but it kept crashing. I sped it up and made it fit a dance track.
If you want to do that yourself, add your video footage and then your audio: divide the length (in frames) of your video strip by the length (in frames) of your audio strip, add a Speed Control effect to your video, and set Multiply Speed to the result.
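That calculation is simple enough to sketch out; a hypothetical helper (the frame counts below are made-up numbers for illustration):

```python
def speed_multiplier(video_frames, audio_frames):
    """Multiply Speed value that makes a video strip's duration match an
    audio strip's duration: values greater than 1 speed the video up."""
    return video_frames / audio_frames

# e.g. a 9000-frame screen recording squeezed onto a 3000-frame track:
print(speed_multiplier(9000, 3000))  # → 3.0
```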
More on the Whatsapp e-collaboration project with Sayanti, in this video – she explains the premise behind the project in both English and Bengali, we try our hand at stitching a story together, and I finish with a walkthrough around the 3D set I’ve been working on.
Why are there no Bengali subtitles?
Sometimes I need to take screen breaks, so I sit with a pen and paper and plot my next moves when I get to my desk. With the same ethos of a swordsman who does not unsheathe his sword unless he is prepared to use it, I must take the same approach with screen recording software: so I don’t record unless I’m showing something. I digress… this was an idea I had.
KeenTools is a small tool I came across a little while ago, but haven’t had much of a chance to play with – so I thought I’d do a post about it. The KeenTools Blender add-on is currently in beta and free – but once out of beta it could become a paid product – so I would grab a copy while you can.
How it works is that you first create a Head – a blank, generic head object – then insert reference images into the tool, select points on the face and head, and line them up with the actual photograph.
I can see this tool being my go-to for character development: at least for the head and face anyway.
I’ve learnt through experimentation that too many points like this make things very confusing, especially with multiple images from different angles – so a word of warning: try to keep things as simple as possible.
The trade-off is that the more information you give the plugin through these pins, the more accurately it can guess the position of the camera – and it will place a 3D camera in its representative position.
As an example, I’m going to leave it at this for now – this is a poor-quality, low-lit photo, so I’m not expecting amazing results. These will vastly improve once I have a studio lighting and chroma-key background setup.
Once you’re happy with your model, and are confident that you have captured as much source imagery as possible, you can create a UV map of your subject’s face from the images you’ve provided – and wrap it around the head model.
For the curious, here’s how the UV map looks once it has finished processing – if you look at the ears and the left side of my face, it’s completely blank, because the images I took and used as reference for the pins did not include that portion of my face. It has not been mapped, and these blank areas will show on the model as plain black mesh.
In a production environment I will use this process, but I will spend a lot more time on it – this has been a quick demonstration of an experiment for the purposes of this post. The game assets will also be captured under studio settings with a 1000-watt halogen lamp giving pure white light, rather than the 60-watt incandescent bulb (with its more yellowy hue) commonly found in homes.
This image is the right shape, but it isn’t the right thickness – currently, it’s only one pixel thick. If you think of this in real world terms, it’s akin to the thickness of the outer skin of an onion. It needs to be thicker.
For games, this will be acceptable because we want to keep the polygon count down – but what about cutscenes, video where we want the characters to look believable enough to tell the story?
I’ve applied a Solidify modifier to the mesh, which has thickened every part of it – and this does look more realistic, especially if you look around the nostril and the top of the ear: light is shining through it as it would in the real world, and it’s not paper thin like the images above. Only, I’ve solidified it so much that it doesn’t even look like the same person any more – I will have to find a thickness value that works for each individual model.
There are a lot of trees in Sheffield (we have had some controversy around this of late) – and thankfully we have groups like STAG (Sheffield Tree Action Group) who are doing wonderful things to help keep those numbers up, and prevent them from being unnecessarily felled by our council.
There’s a new addition to the top menu, Wiki. This’ll be an explorable Wikipedia-alike of the story-line of the game, and any future projects connected to this universe. It’s based on a series of stories I’ve written before, so the information available will fill up quickly, and will not detract from progress on the 3D modelling front.
I digress. The most obvious place to house the aforementioned trees would be a building where a range of interesting plant life can be found: the Winter Gardens (you’ll be glad to know that in 2030, it still serves the same purpose). But first, let’s have a look at this tree:
Winter Gardens foliage area
I’ve naturally tried to find a Blender plugin that would help with creating trees, and there are some available, but they are paid plugins. If possible, I want to avoid this – keeping with the ethos of this site: a shoestring budget studio.
If there’s a resource out there for free that does some of the work for me, saving time and money – I’ll take that, thanks! My search for a free Blender plugin turned out to be fruitless (pun intended!), so I looked towards free, specialised software instead – and came across Arbaro; I was suitably impressed.
Almost every aspect of the tree is customisable, so you can let the Charles Darwin within you loose and create any kind of tree you can imagine, like some mad-scientist genetic botanist. Once you have created your tree, click File -> Export and save it as an .obj file on your computer: you can then import it into Blender either as a new object, or straight into your scene.
When first importing the tree, it did put a strain on my (limited) graphics card, and I would get areas like this across the Blender application, which as you can imagine made it quite awkward to use:
I used the Decimate modifier on it to reduce its poly count from around 29,000 to 1,500. Not only does this lessen system resources, it also makes it more appropriate for a game asset.
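If I remember the UI correctly, Blender’s Decimate modifier (in Collapse mode) takes a ratio – the fraction of faces to keep – rather than a target count, so the value to enter is the target divided by the current count. A quick sketch with the numbers above:

```python
def decimate_ratio(current_faces, target_faces):
    """Ratio for Blender's Decimate (Collapse) modifier:
    the fraction of faces to keep after decimation."""
    return target_faces / current_faces

# Reducing roughly 29,000 faces to roughly 1,500:
print(round(decimate_ratio(29_000, 1_500), 3))  # → 0.052
```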
I’ve only included this image because the crash makes it look Vaporwave.
Below, I’ve modified the trunk and shape of a Black Tupelo so that it is shorter and more bush-like, and experimented with the leaves.
It turned out massive and soon after loading, Blender gave up and shut itself down: this’ll have to be something I branch into once I’ve upgraded to a higher spec PC.
Some slight colour modification in GIMP, no changes to the actual content of the model.
These have been scaled down by your web browser to fit the screen on your device, but you can right-click the image and open it in a new tab to see it full size: mobile users can usually press and hold a finger on the picture to open it in a new tab, which will allow you to pinch and zoom around it.
To call this an Episode would be a tad unfair – there’s no dialogue per se; it’s a compilation of screencast segments where I’m making an attempt to texture one of the most iconic buildings in the city centre – the tallest building there is.
I wanted to use the music from the end of my last video, in which we walked around the city a little – so in this video, I’ve used the whole song over some screencast footage of building it. The clips aren’t really in any chronological order; it’s more a stylised arrangement to the music.
Spoiler alert: The video dramatically ends with an Armory error message fade to black.
Uniform receiveShadow not found. Uniform lightProj not found. Uniform envmapNumMipmaps not found. Uniform receiveShadow not found. Uniform lightProj not found. Uniform lightProj not found.
As a chaos magickian who loosely holds a belief system which involves being your own God (or at least convincing myself I am during gnostic states): switching on the lights should be some day one shit.
In the last video (Episode 2), I was able to export the city to the Armory engine, but pardon the pun – it was very foggy.
Now that we have a Skylight, we need to tell the game engine which objects give out light, which receive light, and which objects cast and receive shadows – so it knows how to dynamically interact with them in our game environment.
Thanks for reading!
Music: David Guetta Ft Kelly Rowland – When Love Takes Over (Buzzby & Dane Robson Remix) https://soundcloud.com/danerobson Big thanks to Dane for letting me use this track.
In this video, if you can hear it – we have a look at the Peace Enforcer model I teased about in the last video’s title, and at optimising your renders for sale on Redbubble.
My microphone volume is incredibly low (and I can’t be bothered to rerecord it), so I’d turn your volume up _after_ the first musical segment to actually hear what I’m saying. To save you damaging your eardrums, the first sentence is: ‘Hello, I’m David – and this is the second instalment: aaand I’ve rendered this way too fast, so straight on with the mu–‘
I will be looking at using the blender-osm plugin by prochitecture, which can import OpenStreetMap data from within Blender, and I’ll be attempting to compile it with the Armory game engine so I have the basis of a first-person-shooter level layout.
We explore Laycock House and Premier House on the map, and compare the building shapes to how they are in the real world, and I don’t want to spoil it for you – but we have some success with exporting to the Armory engine!