It’s Good To Be Back

I found I really missed my blog, particularly when it came to revisiting stuff I did ages ago and finding out how I did things. So, to save me sorting through a stack of Pukka Pads full of scribble, I plan to start updating it again.

Lionel reading at 5:45
Let’s see if we can really get it on

I was going to start from scratch and bin the old stuff, but my friend John Hoare wouldn’t forgive me if I did that, so I’ve added all the old posts here as well as new posts. I wonder how long I’ll be able to keep it updated this time!

Making Music using Rosegarden on Fedora 17

I’ve always loved music – as do both of my parents. They have excellent, but divergent tastes in music. With my Mum I share a love of Sandy Denny, Jeff Lynne and George Harrison, with my father there was a shared affection for Eric Coates, Henry Hall and G. F. Handel. And when you mix the two together you get my love of Maestoso, Mike Oldfield, Kevin Ayers and Barclay James Harvest.

As well as listening to music, I also enjoy making it. But I always thought making music on a computer seemed so difficult that I never really bothered trying.

However, recently I got a bit of inspiration from my friend TA Walker (Tim). Earlier this year Tim signed up for something called the 5090 Challenge – writing 50 songs in 90 days. Given that Tim has a full-time job, a wife and a young daughter, that was insanely ambitious, but astoundingly he managed 36 excellent songs, which I have been known to raid for my YouTube videos. In order to reach his goal Tim was making music anywhere using anything – he was even overdubbing vocals and recording guitalele in his car during his lunch breaks using an iPod Touch. Here is Tim playing one of his 5090 songs:

So, if Tim could make music in a car (or on a very nice looking white leather sofa) I had no excuse sitting in front of a computer that had access to a repository of free software for making noises.

I’m using Fedora 17 and I wanted to try and record music entirely using free software. This is because a) I’m on a budget of £0 and b) I think it’s the right thing to do.

Rosegarden running on Fedora 17

The first program I tried to install was something called Rosegarden. It seemed a pretty welcoming program for beginners as music programs go and therefore a good place to start. It used staves and notes – things that a dinosaur like me can (almost!) understand. However before I could get Rosegarden to make any noise I needed a synthesiser. I don’t have a real synthesiser, so instead I needed a soft synthesiser – a program that runs on the computer and pretends it’s a real synthesiser sitting on your table.

The synthesiser that everyone seemed to recommend was something called FluidSynth, so I thought I’d install that. FluidSynth is a free software synthesiser that can take MIDI data from a program like Rosegarden and turn it into audio.

It normally comes with a “SoundFont” bank containing a nice range of sounds for a beginner, so it seemed a good start. However to use FluidSynth it’s best to have a nice graphical interface so you can fiddle with it using knobs and buttons on your desktop. The most common one is called QSynth. It looks very impressive!

A very impressive addition to any desktop!
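Incidentally, FluidSynth can also be run on its own from a terminal, which is a handy way of checking it makes a noise before you add the graphical layers on top. This is only a sketch – the SoundFont path and the MIDI file name are examples, not necessarily what your distribution installs:

```shell
# Render a MIDI file through a SoundFont, sending audio to ALSA.
# Both file paths here are examples only - substitute your own.
fluidsynth -a alsa /usr/share/soundfonts/default.sf2 tune.mid
```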

Only, before I could use the virtual synthesiser I needed something to plug it into the computer’s sound hardware. In other words, FluidSynth needs somewhere to send all this audio it’s creating. That somewhere is a piece of software called the JACK Audio Connection Kit (JACK). But before I could use JACK I thought I’d find it easier if I had something graphical I could control JACK with. So I needed something called QJackCtl – a graphical JACK control panel.

QJackCtl with JACK not started

So I downloaded all the bits I needed. I had Rosegarden (a music studio), FluidSynth (a synthesiser), JACK (a sound server), QJackCtl (a graphical interface for JACK) and QSynth (a graphical interface for FluidSynth). It was, quite literally, the house that JACK built.

Now I tried to make a noise. I worked out after a couple of minutes that it’s not enough simply to load QJackCtl – JACK has to be started and stopped by pressing the Start and Stop buttons. So I tried to start JACK, but it did nothing except spit error messages at me, and I certainly couldn’t get anything to make any sound.

Now, this is where the cutting-edgeness of Fedora had just bitten me on the bum. Normally you should be able to start JACK and it will work without error. And indeed, since this morning’s software repository updates that’s exactly what it does do. However at that time there was a permissions problem within Fedora so I needed to type:

su -c "modprobe snd-seq-midi"

It took me an hour or so to find that out, and before I did so I couldn’t start JACK or make any noise at all. Normally I would have given up long before this point, but with M4 and Mr Cable The Sysadmin ringing in my ears I was determined and pressed on.

There were a couple of other things I had to do in JACK to get it to work. After pressing the Setup… button I had to uncheck Realtime, check Force 16bit and change the interface to hw:0.

QJackCtl with JACK started
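For the record, the same settings can be passed to JACK on the command line if you ever need to start it without QJackCtl. This is just a sketch mirroring the setup described above; check the jackd man page for the exact options your version supports:

```shell
# Start JACK without realtime scheduling (-r), using the ALSA backend
# on card hw:0 and forcing 16-bit samples (-S) - the same settings
# chosen in the QJackCtl Setup dialog. If your card isn't hw:0,
# "aplay -l" lists the playback devices ALSA can see.
jackd -r -d alsa -d hw:0 -S
```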

With JACK running happily, I started QSynth to get FluidSynth running. Everything seemed OK, so the next step was to run Rosegarden. No problems. I opened one of the examples in the examples folder, pressed the play button and success! Music!

However, music on my headphones only – there was nothing coming out of my speakers. I went to QJackCtl and pressed the Connect button to see what was going on.

QSynth in headphone-only mode

As you can see, the left output of QSynth (l_00) was going to my system’s playback_1 and the right output of QSynth (r_00) was going to my system’s playback_2. This was giving me music in my headphones. However, what were the other playbacks?

QSynth will now use my speakers too

I tried to connect the left output of QSynth (l_00) to playback_3 and the right output (r_00) to playback_4, and it worked. Music through my speakers!

So every time I want to make music I…

  1. load QJackCtl, 
  2. start JACK by pressing the Start button, 
  3. load QSynth 
  4. then load Rosegarden

…always in that order.
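If you get tired of clicking through that sequence every time, it can be scripted. This is only a sketch – it assumes you have ticked “Start JACK audio server on application startup” in QJackCtl’s Setup dialog, and the sleep times are guesses to give each program time to settle:

```shell
#!/bin/sh
# Launch the whole stack in the required order.
qjackctl &     # also starts JACK, if QJackCtl is set to do so
sleep 5        # give JACK a few seconds to come up
qsynth &       # FluidSynth with its graphical front end
sleep 2
rosegarden &
```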

Provided I just wanted to enter musical notation into Rosegarden I was now fine, but that’s not much fun. The frustrated Woolly Wolstenholme in me wanted to have a keyboard to play!

The trouble was as well as not having a synthesiser, I don’t have a keyboard either. Fortunately there are “virtual keyboards” available that allow you to play music using your computer’s keyboard. The one I chose out of a field of three was called Virtual MIDI Piano Keyboard (VMPK). I chose this one because it was the only one that seemed able to play nicely with the Hungarian keyboard on my computer.

Be Woolly in the comfort of your own home…

However, in order to record the MIDI data created with a virtual keyboard, I had to plug it into something that records MIDI data – Rosegarden. It was back to the QJackCtl Connect dialog:

VMPK running, but QJackCtl shows nothing to plug it into

VMPK had appeared in the MIDI tab of the QJackCtl Connect dialog. The trouble was, nothing else did – the only thing I could plug my virtual keyboard into was itself.

This proved to be a very tricky problem to sort out. It took me a long time to find an answer but the answer was a program called a2jmidid. Apparently there are two kinds of MIDI on a GNU/Linux machine – ALSA MIDI and JACK MIDI. They can’t talk to each other without a “bridge” program. The bridge is called a2jmidid and it’s available in the Fedora repository. To use it I had to start a terminal window and type:

a2jmidid
Then, provided I kept the terminal window open, when I went back to my QJackCtl Connect dialog I got some extra things in the MIDI tab:

VMPK connected to Rosegarden in QJackCtl

As you can see, I can now connect the VMPK Output to the Rosegarden record in and, hey bingo, I’ve got a MIDI keyboard connected to Rosegarden.

VMPK configured for a Magyar keyboard

The only thing left to do with VMPK is create a Hungarian key mapping – this was very easy to do using a dialog provided by the program.

The first thing I wanted to try and record was a tune I remembered from my childhood. It was an early music or baroque piece for recorder and a small ensemble, used by the Open University before their morning broadcasts. I hadn’t heard it since those early mornings in the 1980s when I used to get up early to watch a maths foundation course on calculus or the foundation course on modern art.

A lost childhood memory

I did a rather twee arrangement using a harpsichord, viola, cello and a recorder. I think the real thing was less Bach and more Henry VIII.

However when I came to play it the recorder just didn’t sound right. It sounded very good, but it didn’t sound like the recorder I had in my head. So I looked on-line to see if there were any other SoundFont banks I could use with QSynth.

I was in luck, because the pianist and composer S. Christian Collins has put together an excellent SoundFont bank for QSynth and put it on his website here. It’s called the GeneralUser GS SoundFont bank.

GeneralUser GS Soundfont bank loaded into QSynth

To load it I had to get QSynth running and press Setup…. Next, I had to go to the Soundfonts tab and replace the default SoundFont bank (an .sf2 file) with the GeneralUser GS SoundFont bank I had downloaded.

To my delight the recorder sounded much more how I wanted it to sound.

Now that I had finished and was happy with my sounds, I realised I needed some way of recording what I’d just done as an audio file instead of a Rosegarden file.

When I ran QJackCtl with nothing else running the Connect dialog looked like this:

By default I can only get sound from the microphone

If you look at the Readable Clients box on the left you’ll see the only place I could get any audio is from capture_1 and capture_2. They represent the microphone socket on my computer. capture_1 is the left channel and capture_2 is the right channel of the stereo microphone input.

If I ran Rosegarden I found they were connected automatically to Rosegarden’s record in 1 L and record in 1 R Input Ports:

Rosegarden connects to the microphone automatically

I looked in the QJackCtl Setup dialog and saw a Monitor check box which was unchecked. It sounded like what I needed so I checked it.

However you can enable monitoring

When I restarted JACK I saw this:

And route the monitor output where you want it

So now I have monitor output in addition to microphone input as a potential source of audio. What monitor output means is that I can record whatever I can hear through the speakers. This is just what I needed.

Such as here, where the monitor output is routed to Rosegarden

I started Rosegarden up again and connected up monitor_1 and monitor_2 to record in 2 L and record in 2 R.

This meant that now Rosegarden had the system’s sound output available as a source of audio. Now I could use Rosegarden to record whatever Rosegarden was playing as an audio file!

You can easily turn the metronome off in Rosegarden

Setting this up in Rosegarden is quite easy and pretty logical once you work it out (which took me a long time!). The first thing you need to do is go to Studio -> Manage Metronome to turn off the click track. You usually don’t want that on your master recordings!

The next thing you need is an audio track that can accept the monitor output as its audio input:

Rosegarden all set up to record Rosegarden

You can see in the picture above I’ve set track 17 as my current track. It’s an audio track and I’ve called it RECORD.

On the left hand side you’ll notice that I’ve set the In: to In2. This is very important – In2 is the Rosegarden input we connected up to the monitor output in QJackCtl earlier. Never use In1 – it’s quiet and full of interference noise!

Finally, you’ll notice I’ve armed track 17 to record – shown by the red light to the left of the track name. Now when I press the record button, my Rosegarden file will play and be recorded as an audio file on track 17 at the same time.

My recorded Rosegarden output in Rosegarden

When the track has finished you will see the waveform displayed in your recording track as it is above.

Double-clicking on an audio track segment in Rosegarden opens Audacity

Now you can double click on the recorded segment and it will open in Audacity. Don’t forget to set Audacity to JACK Audio Output as I have in the picture above, or it will freeze and not play.  From Audacity you can edit or export the audio in the usual way.

For OGG Vorbis files or MP3 files I normalize to -2.0dB

I always save a lossless FLAC file from Audacity first. If I want to save to a lossy format such as Ogg Vorbis or MP3, I always Normalize to -2.0 dB before I export.
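Normalizing to -2.0 dB just means scaling the audio so its loudest peak sits 2 dB below full scale, which leaves a little headroom for the lossy encoders. Audacity does the work for you, but the arithmetic behind it is roughly this (a sketch, with a made-up peak value):

```python
def normalize_gain(peak, target_db=-2.0):
    """Linear gain that brings a peak amplitude (0..1) to target_db dBFS."""
    target_amplitude = 10 ** (target_db / 20.0)  # -2 dB is about 0.794
    return target_amplitude / peak

# A track peaking at half full scale needs roughly a 1.59x boost.
print(round(normalize_gain(0.5), 2))
```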

Being able to set Audacity to use JACK audio output is very handy – particularly if you find you want to listen to audio files while you are working.

So now I had a FLAC file, an Ogg Vorbis file and an MP3 file. The FLAC file was fine, but what I really wanted to do was get a picture into my MP3 and Ogg files so they would be just like the ones I downloaded from TA Walker’s Bandcamp page.

To do this I found an excellent program called EasyTAG which does exactly what its name suggests. This program allows you to add a picture to your audio files and is very easy to use. Although I tend to use Ex Falso for most of my tagging (it’s better for classical music) I’ll use EasyTAG for tagging my own files in future.

The next thing I decided to do was re-record the OU Tune in Mike Oldfield style. When I was a child I remember watching Simon Groom visit Mike Oldfield to see him re-record the Blue Peter signature tune. That video had an enormous effect on me as a child, and recording something like that was something I had always wanted to try.

I had a lot of fun in Rosegarden pretending to be Mike – particularly tapping away on my computer’s keyboard pretending to play the bodhrán.

When I finished, Tim very kindly recorded a real electric guitar solo for me to add to my track. He supplied it to me as a FLAC file, but the funny thing was I could not find any way of importing a FLAC file into Rosegarden – only .WAV files.

TA Walker’s solo shown on the red track

However, by accident, I discovered you could import FLAC files directly into Rosegarden if you dragged and dropped them onto the time track.

I’d enjoyed myself so much with the Open University tune I decided to record another tune Mike Oldfield-stylee, so I dusted off my recording of Border Television’s Keltic Prelude March by L. E. DeFrancesco and did that as well!

The reason I did the Keltic Prelude March was so that I could upload my video of a Border Television start-up that I had pulled down earlier this year. The reason I had pulled it down was because of a copyright claim over the track Holiday People by James Clarke that I had used over the menu rundown. Therefore I decided I would also create a pastiche of Holiday People to use in my Border start-up. I came up with a tune I called Lionel’s Talking!

Lionel’s Talking in Hydrogen

I needed a “real” drum kit for Lionel’s Talking so I decided to use a special drum synthesiser called Hydrogen which does the job flawlessly. Hydrogen also works beautifully in tandem with Rosegarden. The Rosegarden website has a wonderful tutorial to explain how to do this here.

So put it all together and what do you have? Well something like this…

Producing music on GNU/Linux can be a bewildering and frustrating experience at first. There are so many things you need – each one has to be set up correctly in order to work properly. There is a huge amount to learn and sometimes you feel the easiest thing is to just give up. I spent a lot of time trying, and failing, to do things which I thought should have been easy.

In addition, differences in hardware mean what you have to do to get everything working is slightly different for everyone.

But with a little perseverance you find that things rapidly begin to make sense: there is a common logic underlying all the things you have to do, and you begin working out answers to your problems yourself.

I hope you try making some music too.

Synfig Studio Gate Weave

If I wasn’t so stupid, I’d have realised I could add gate weave to my animations in Synfig Studio without the need for any coding whatsoever.

Here’s an example:

You simply need to add a translate layer to your animation. The translate layer is used to move things around the canvas. Here’s one in my layers panel:

Translate layer in layers panel

The translate layer should be at the top, so everything in your animation will weave (Z Depth 0.000000).

Next, you need to convert the Translate layer’s Origin into a Composite. That means the X-axis and Y-axis values are separated instead of being a vector.

You then convert the X-axis and Y-axis values to Random. Put in some suitable values, such as these:

Example X and Y axis values

Export the resulting animation as video and that’s all there is to it.

Weave All Wobbles

Back in July, I wrote about some of the techniques I used to simulate old 16mm film entirely using free software.

One technique I mentioned was using Kdenlive to simulate gate weave – that strangely pleasing effect whereby the picture moves almost imperceptibly around the screen as you watch. If you’re not familiar with what gate weave looks like, here’s an example:

I mentioned in my previous article that I discovered I could simulate gate weave manually using the Kdenlive “Pan and Zoom” filter.

I did this by zooming in on my video slightly…

Video resized by 108% to zoom in on it slightly

…and then moving the picture around randomly at 5 frame intervals.

Keyframes added at five-frame intervals; X and Y changed randomly

Once this was done, I could save this portion of gate weave as a custom effect so I could re-use it:

Save effect button in Kdenlive

When you do this, your zoom, keyframes and random movements are stored in the ~/.kde/share/apps/kdenlive/effects folder as an XML file. The XML file created by Kdenlive for some manually created gate weave looks like this:

<effect tag="affine" type="custom" id="test">
  <name>test</name>
  <description>Adjust size and position of clip</description>
  <author>Charles Yates</author>
  <parameter opacity="false" default="0%,0%:100%x100%"
             type="geometry" value="…" name="transition.geometry">
    <name>Rectangle</name>
  </parameter>
</effect>

Obviously, creating gate weave by hand for a long piece of video using the Kdenlive interface would take hours. Luckily, because the resulting gate weave custom effect is stored as a simple XML file, you can write a quick script to create the gate weave instead.

So I thought I’d use this post to show you the script I use to create “automatic” gate weave.

When I do scripting jobs I prefer to use Python if possible. For this task I needed to get it to write out an XML file. Python comes with a selection of complex but hugely flexible ways to do this. However, to make this as quick and easy as possible, I used a lovely Python module called pyfo which was developed by Luke Arno and did everything I needed.

Although you can install pyfo if you want to, there’s no need; you can just extract the file and put it in the same folder as your Python scripts that use it.

As you can see below, my Python script is pretty self-explanatory. The script below is suitable for adding gate weave to PAL 4:3 video.

The variable value_string determines how much the video is zoomed in by initially. For wide-screen PAL 16:9 video I adjust this value to “-29,-23:1034x622:100;”. The value of the step_value variable determines how often the video frame moves. I think every 5 frames often works best.

You can run the script multiple times to create different XML files, but if you do, remember that you will need to change the value of the custom_effect_name variable each time to something different so you’ll be able to tell your gate weave custom effects apart.
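For reference, each keyframe entry in the geometry value the script builds has the shape frame=x,y:widthxheight:opacity;. Here is that format on its own, with arbitrary example numbers:

```python
# Build one Kdenlive/MLT geometry keyframe entry, of the form
# frame=x,y:WIDTHxHEIGHT:opacity; - the same shape the script below
# appends to value_string. The values here are examples only.
def geometry_entry(frame, x, y, width=778, height=622, opacity=100):
    return "%d=%d,%d:%dx%d:%d;" % (frame, x, y, width, height, opacity)

print(geometry_entry(5, -30, -22))  # prints 5=-30,-22:778x622:100;
```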

#!/usr/bin/env python2.4

""" This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>."""

import random
from pyfo import pyfo

custom_effect_name = "weave"

#Number of frames of gate weave required
frames_required = 750

#Initial geometry value (frame 0)
value_string = "-29,-23:778x622:100;"

origin = {'x': -29, 'y': -23}

#Step value (in frames)
step_value = 5

#Maximum weave distances
max_distance = {'x': 1, 'y': 1}

for i in range(0, frames_required, step_value):
    x = random.randrange(origin['x'] - max_distance['x'],
                         origin['x'] + max_distance['x'] + 1)
    y = random.randrange(origin['y'] - max_distance['y'],
                         origin['y'] + max_distance['y'] + 1)
    value_string += str(i) + "=" + str(x) + "," + str(y) + ":778x622:100;"

xml_output = ('effect', [
    ('name', custom_effect_name),
    ('description', 'Adjust size and position of clip'),
    ('author', 'Charles Yates'),
    ('parameter', ('name', 'Rectangle'),
        {'opacity': 'false', 'default': '0%,0%:100%x100%',
         'type': 'geometry', 'value': value_string,
         'name': 'transition.geometry'}),
], {'id': custom_effect_name, 'type': 'custom', 'tag': 'affine'})

result = pyfo(xml_output, pretty=True, prolog=False, encoding='ascii')
print result.encode('ascii', 'xmlcharrefreplace')

As it is, my Python script writes the XML it produces to the console window. You can copy and paste the resulting XML into a blank text file and save it to the ~/.kde/share/apps/kdenlive/effects folder.

However, you can pipe the output straight into an XML file instead. For instance, if the script is saved as gateweave.py:

$ python gateweave.py > gateweave.xml

Obviously, my Gate Weave solution isn’t very elegant, but who cares – it works, and it’s all free software!

ATV Yesterday and Today

If you’ve read my blog before, you may have come across some posts about my friend Roddy Buxton. Roddy is an incredibly inventive chap – he’s rather like Wallace and Gromit rolled into one! He has his own blog these days and I find everything on it fascinating.

One of Roddy’s cracking contraptions

One of the subjects recently covered on Roddy’s blog is the home-made telecine machine he built. The telecine was a device invented by John Logie Baird at the very dawn of broadcasting (he began work on telecine back in the 1920s) for transferring pictures from film to television.

Roddy also shares my love of everything ATV, so naturally one of the first films Roddy used to demonstrate his telecine was a 16mm film copy of the ATV Today title sequence from 1976.

This title sequence was used from 1976-1979 and proved so iconic (no doubt helped immeasurably by the rather forgetful young lady who forgot to put her dress on) it is often used to herald items about ATV on ITV Central News. Sadly, as you can see below, the sequence was not created in widescreen so it usually looks pretty odd when it’s shown these days.

How the sequence looks when broadcast these days.

The quality of Roddy’s transfer was so good I thought it really lent itself to creating a genuine widescreen version. In addition, this would provide me with a perfect opportunity to learn some more about animating using the free software animation tool Synfig Studio.

The first thing to do when attempting an animation like this is to watch the source video frame by frame and jot down a list of key-frames – the frames where something starts or stops happening. I use a piece of free software called Avidemux to play video frame by frame. Avidemux is like a Swiss Army knife for video and I find it handy for all sorts of things.

Video in Avidemux

I write key-frame lists in a text file that I keep with all the other files for a project. I used to jot the key frames down on a pad, but I’ve found using a text file has two important advantages: it’s neater and I can always find it! Here is my key frame list in Gedit, which is my favourite text editor:

Key-frame list in Gedit

After I have my key-frame list I then do any experimenting I need to do if there are any parts of the sequence I’m not sure how to achieve. It’s always good to do this before you start a lot of work on graphics or animation so that you don’t waste a lot of time creating things you can’t eventually use.

The ATV Today title sequence is mostly straightforward, as it uses techniques I’ve already used in the Spotlight South-West titles I created last year. However one thing I was not too sure about was how to key video onto the finished sequence.

Usually, when I have to create video keyed onto animation I cheat. Instead of keying, I make “cut-outs” (transparent areas) in my animation. I then export my animation as a PNG32 image sequence and play any video I need underneath it. This gives a perfect, fringeless key and was the technique I used for my News At One title sequence.

However, with this title sequence things were a bit trickier – I needed two key colours, as the titles often had two completely different video sequences keyed onto them at the same time.

Two sequences keyed at once

Therefore I had to use chromakeying in Kdenlive using the “Blue Screen” filter, something I had never had a lot of success with before.

The first part was simple enough – I couldn’t key two different video sequences onto two different coloured keys at once in Kdenlive, so I had to key the first colour, export the video losslessly (so I would get no compression artefacts), then key the second colour.

The harder part was making the key look smooth. Digital keying is an all or nothing affair, so what you key tends to have horrible pixellated edges.

Very nasty pixel stepping on the keyed video

The solution to this problem was obvious, so naturally it took me quite a while to hit upon it! The ATV Today title sequence is standard-definition PAL widescreen. However, if I exported my animation at 1080p HD and did my keys at HD, they would have much nicer rounded edges as the pixels are “smaller”. I could then downscale my video to standard definition once the keying was done and get the rounded effect I was after.

Smooth keying, without pixel stepping

The other thing I found is that keying in Kdenlive is very, very sensitive. I had to do lots of test renders on short sections as there was only one “Variance” setting (on a scale between 1 and 100) that was exactly right for each colour.

So now I was convinced I could actually produce the sequence, it was time to start drawing. I created all of my images for the sequence in Inkscape, which is a free software vector graphic tool based around the SVG standard.

However, in order to produce images in Inkscape I needed to take source images from the original video to trace over. I used Avidemux to do this. The slit masks that the film sequences are keyed on to are about four screens wide, so once I had exported all the images I was interested in I needed to stitch them together in the free software image editor The GIMP. Here is an example, picked totally at random:

She’ll catch her death of cold…

Back in Inkscape I realised that the sequence was based around twenty stripes, so the first thing I did, before I created all the slit mask images, was to create guides for each stripe:

These guides saved me a lot of time

The stripes were simply rounded rectangles that I drew in Inkscape. It didn’t take long to trace all of the slit masks for the title sequence. Two of the masks were repeated, which meant that I didn’t have as many graphics to create as I was fearing.

Once the slit masks were out of the way I could create the smaller items such as the logo:

ATV Today logo created in Inkscape

And, with that, all the Inkscape drawing was done. It was time to animate my drawings now, so I needed to export my Inkscape drawings into Synfig Studio. To do this I was able to use nikitakit’s fantastic new Synfig Studio SIF file Exporter plug-in for Inkscape. This does a fabulous job of enabling Inkscape artwork to be used in Synfig Studio, and it will soon be included as standard in Inkscape releases.

When I did my Spotlight title sequence I exported (saved) all of my encapsulated canvases (akin to Symbols in Flash) that I needed to reuse within my main Synfig file. This was probably because I came to Synfig from Macromedia Flash and was used to the idea of having a large file containing all the library symbols it used internally.

I have been playing with Synfig Studio a lot more since then, and I realised a far more sensible way to work was to have each of what would have been my library symbols in Flash saved as separate Synfig files. Therefore I created eight separate Synfig Studio files for each part of the sequence and created a master file that imports them all and is used to render out the finished sequence.

The project structure

This meant that my finished sequence was made up of nine very simple Synfig animation files instead of one large and complicated one.

The animation itself mainly consisted of simply animating my Inkscape slit masks across the stage using linear interpolation (i.e. a regular speed of movement).

I could type my key-frames from my key-frame text file directly into the Synfig Studio key-frame list:

Key-frames for one part of the animation

The glow was added to the ATV Today logo using a “Fast Gaussian Blur”, and the colour was changed using the “Colour Correct” layer effect – exactly the same techniques I used in the Spotlight South-West titles.

ATV Today logo in Synfig

In order to improve the rendering speed I made sure I changed the “Amount” (visibility) of anything that was not on the stage at the present time to 0, so the renderer wouldn’t bother trying to render it. You do this using Constant interpolation so that the value is either 0 or 1.

I had a couple of very minor problems with Synfig when I was working on this animation. One thing that confused me sometimes was the misalignment of the key-frame symbols between the Properties panel and the Timeline.

This misalignment can be very confusing

As you can see above, the misalignment gets greater the further down the “Properties Panel” something appears. This makes it quite hard at times to work out what is being animated.

Some very odd Length values indeed!

Another problem I had was that the key-frame panel shows strange values in the time and length columns – particularly if you forget to set your project to 25 frames per second at the outset.

However, overall I think Synfig Studio did brilliantly, and I would choose it over Flash if I had to create this sequence again and could pick any program to create it in.

The most important technical benefit of Synfig Studio for this job was the fact that it uses floating point precision for colour, so the glows on the ATV Today logo look far better than they would have done in Flash as the colour values would not be prematurely rounded before the final render.

I rendered out my Synfig Studio animation as video via ffmpeg using the HuffYUV lossless codec, and then I was ready to move on to Kdenlive and do the keying.

Obviously I needed some “film sequences” to key into the titles, but I only have a small selection of videos as I don’t have a video camera. To capture video I use my Canon Ixus 65, which records MJPEG video at 640 x 480 resolution at 30fps.

My 16mm film camera

Bizarrely, when the progressive nature of its output is coupled with the fact it produces quite noisy pictures, I’ve found this makes it a perfect digital substitute for a 16mm film camera!

I “filmised” all the keyed inserts, so that when they appear in the sequence they will have been filmised twice. Hopefully, this means I’ll get something like the degradation in quality you get when a film is transferred to another film using an optical printer.

Once the keying was done the finished sequence was filmised entirely in Kdenlive, using techniques I’ve already discussed here.

And so, here’s the finished sequence:

Although I’m not happy about the selection of clips I’ve used, I’m delighted with the actual animation itself. I’m also very pleased that I’ve completed another project entirely using free software. However, I think the final word should go to Roddy:

Thanks for the link. I had a bit of a lump in my throat, seeing those titles scrolling across, hearing the music, while munching on my Chicken and Chips Tea… blimey, I was expecting Crossroads to come on just after!

If you are interested in ATV, then why not buy yourself a copy of the documentary From ATV Land in Colour? Three years in the making, over four hours in duration, it contains extensive footage (some not seen for nearly fifty years) and over eleven hours of specially shot interviews edited into two DVDs.

Sunday’s Newcomers

Click to enlarge

Going through my old Flash files, I stumbled across an early version of this image, which I first produced in 2005. I didn’t know how to make it look realistic then, but I’ve since been given lots of good advice from Rory Clark. This new version was produced in Inkscape and aged in The GIMP.

In case you’re wondering, these were all real IBA Transmitters.

Doing my pennants…

I often spend idle half hours looking around flickr for anything of interest. The other day I found a very nice Anglia logo from 1959. Obviously, I couldn’t resist recreating it in Inkscape while I was listening to a podcast:

Click to enlarge

This stylised Anglia pennant logo formed the basis of Anglia Television’s original end-caps, including the one seen on their opening program.

Cheap Dirty Film

Three years ago I talked about the programs I used to simulate old 16mm film. Back in 2008 I was using Windows XP, Adobe Premiere Elements 4.0 and a VirtualDub filter called MSU Old Cinema. I found I could use them to create some half-decent 16mm film:

These days I’m using Fedora 15 as my operating system and Kdenlive as my off-line video editor. That means I’ve had to change the way I simulate old film quite a bit. I had been continuing to use VirtualDub and the MSU Old Cinema plug-in via WINE, but although VirtualDub is free software, the MSU Old Cinema plug-in is not, and this bothered me. I wondered what I could achieve in Kdenlive alone, and I started experimenting.

In the course of this blog post I’m going to use the same image – an ITV Schools light-spots caption from the 70s that I recreated in Inkscape. Here’s the original image exported directly from Inkscape as PNG:

Created in Inkscape

The most obvious sign that you are watching something on a bit of old film is the little flecks of dirt that momentarily appear. If the dirt is on the film itself it will appear black; if it was on the negative when the film was printed it will appear white.

Kdenlive comes with a Dust filter that tries to simulate this effect. However, it has a very small database of relatively large pieces of dirt. In total there were just six pieces of dirt, drawn as SVG files, and that limited number led to an unconvincing effect. If I used the filter on a long piece of video I found I began to recognise each piece! There were also no small bits of dirt.

I drew 44 extra pieces of dirt in Inkscape and added them to the Dust filter. I also redrew dust2.svg from the default set. I call this particular piece of dirt “the space invader” as I found it was too large and too distracting!

The video below compares the Dust filter (with identical settings) before and after I added my extra files:

You may find you prefer the Kdenlive dust filter with just the default six SVG files. However, if you prefer what I have done you can download my extra SVG files from here.

With the modifications I’ve made, I actually prefer the dirt created by the Dust filter in Kdenlive to the dirt you get in the MSU Old Cinema plug-in. The dirt from Kdenlive’s filter is less regular in shape, and simply by changing the SVG files in the /usr/share/mlt/oldfilm folder I can tailor the dust to any specific application I have in mind.
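If you fancy making your own dirt rather than drawing every speck by hand, the SVG files are simple enough to generate. Here’s a Python sketch that writes small irregular black polygons in the same spirit as the filter’s dust files – the file names, sizes and speck counts are my own invention, not anything the filter requires:

```python
# Sketch: generate random irregular "dirt" specks as tiny SVG files,
# suitable for dropping into the MLT oldfilm dust folder.
import math
import random

def dirt_svg(size=8, points=7, seed=None):
    """Return SVG markup for one jagged black speck of dirt."""
    rnd = random.Random(seed)
    cx = cy = size / 2
    corners = []
    for i in range(points):
        angle = 2 * math.pi * i / points
        radius = rnd.uniform(size * 0.15, size * 0.45)  # uneven radius = jagged outline
        corners.append((cx + radius * math.cos(angle),
                        cy + radius * math.sin(angle)))
    pts = " ".join(f"{x:.2f},{y:.2f}" for x, y in corners)
    return (f'<svg xmlns="http://www.w3.org/2000/svg" '
            f'width="{size}" height="{size}">'
            f'<polygon points="{pts}" fill="black"/></svg>')

# Write a handful of specks with repeatable shapes
for n in range(10):
    with open(f"dust_extra{n}.svg", "w") as f:
        f.write(dirt_svg(seed=n))
```

Varying the `size` parameter is an easy way to get the small bits of dirt the default set lacks.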

After flecks of dirt, the second most obvious sign that you are watching old film is a non-uniform shutter causing the picture to appear to flicker very slightly. The MSU Old Cinema plug-in can simulate this effect, but wildly overdoes it – it’s not suitable for anything other than simulating silent movies, so I never used it.

Luckily the Kdenlive Old Film plug-in does a much more convincing job. The settings that I found worked for me are shown below:

KdenLive Old Film settings for uneven shutter

And they create the results shown below:

It looks a bit odd on its own, but when added to all the other effects I’m describing here it will look fine.

I’ve noticed that when I am creating these effects it’s best if I move away from the monitor to a normal TV viewing distance to see how they look – otherwise I tend to make the effects too subtle to be noticed when I come to watch the results on my television!

The next thing that will help to sell the output as film is having some film grain. Film grain is irregular in shape and coloured. In fact, I used the Colour Spots setting of the MSU Noise filter to create film grain in VirtualDub.

Kdenlive has a Grain filter, which simply creates random noise of 1 pixel by 1 pixel in size. Although technically this is not at all accurate, it can look pretty good if you are careful.  The settings for film grain will vary from job to job, so some trial and error is involved.

As a starting point, these settings are good:

Kdenlive Grain settings

And will look like this:

Again, it looks odd by itself (and you can’t really see it at all on lossy YouTube videos!) but it will look fine when added to the other effects. You’ll start to notice the rendering begin to slow down a bit when you have added Grain! Incidentally, Grain is still worth adding even if YouTube is your target medium because it helps break up any vignette effect you add later.
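Under the hood, this kind of grain boils down to adding bounded random noise to every pixel. Here’s a minimal single-channel sketch in Python – the function name and values are just for illustration, and Kdenlive’s actual filter naturally works on full frames rather than rows:

```python
# Sketch: 1x1-pixel "grain" is just bounded random noise added per pixel,
# clamped back into the 0-255 range.
import random

def add_grain(pixels, noise=10, seed=0):
    """Add bounded random noise to a row of 0-255 greyscale values."""
    rnd = random.Random(seed)  # fixed seed so the result is repeatable
    return [min(255, max(0, p + rnd.randint(-noise, noise))) for p in pixels]

# A flat mid-grey row comes back subtly speckled
print(add_grain([120] * 8))
```

The `noise` parameter plays the same role as the filter’s strength setting: small values read as grain, large values just read as a broken aerial.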

The next thing you need to do is to add some blur – edges on 16mm film in particular tend to be quite soft. Kdenlive has a Box Blur filter which works just fine for blurring. How much blur you add depends on your source material, but a 1 pixel blur is fine as a starting point.

Colour film is printed with coloured dyes, so it has a different colour gamut to the RGB images you create with The GIMP, Inkscape or a digital video camera. In addition, it also fades over time. Therefore to make computer-originated images look like film-originated images some colour adjustment is normally required.

Luckily, Kdenlive has a Technicolor filter that allows you to adjust the colours to better resemble film.

Kdenlive Technicolor settings

The way colour film fades depends on whether it has been kept in a dark or light place. If I’m recreating a colour 16mm film that has been stored safely in a dark tin for many years, I make it look yellowish. If I’m recreating a colour 16mm film that’s been left out in the light a bit too much I make it look blueish. Both these looks rely on adjusting the Red/Green axis slider – not the Blue/Yellow axis slider as you might think!

Source image faded with Technicolor

You soon begin to notice that the telecine machines used by broadcasters could adjust the colours they output, producing colours that were impossible to resolve from the film alone. For instance, some of the blue backgrounds on ATV colour zooms were too rich to have been achieved without some help from the settings on the telecine machine. So the precise colour effect you want to achieve varies from project to project, and sometimes you will actually be increasing colour saturation rather than decreasing it.

The Technicolor filter is, ironically, the filter you use to make colour source material monochrome too!

The biggest problem when trying to recreate old film is recreating gate weave – that strangely pleasing effect whereby the picture moves almost imperceptibly around the screen as you watch.

MSU Old Cinema created an accurate but very strong gate weave which was too severe for recreating 16mm film. The Kdenlive Old Film filter has what it calls a Y-Delta setting, that makes the picture jump up and down by a set number of pixels on a set number of frames. It’s easy and quick (a Y-Delta of 1 pixel on 40% of frames is good) but introduces black lines at the top of the frame and is so obviously fake it won’t really fool anyone!

So there is, sadly, no quick way to create gate weave in Kdenlive. However, the good news is there is a way, provided you’re prepared to do a bit of work. You need to use the Pan and Zoom filter. The Pan and Zoom filter is intended to do Ken Morse rostrum camera type effects – it’s particularly good if you have a large image and want to create a video to pan around it.

However, what we can do is use the Pan and Zoom filter to move the frame around once per second. First you zoom the image to 108%, which means you won’t see any black areas around the edge of the frame as the picture moves around.

First of all, zoom the image very slightly

Next, you create key frames on each second:

Then add one key frame per second

Then you move the image around slightly on each keyframe – plus or minus two or three pixels from the starting position is often plenty.

Obviously, for a 30 second caption that’s 30 keyframes and 30 movements – a lot of work if done “by hand”. However it won’t go to waste, as you can save your Pan and Zoom settings as a Custom effect and reuse it again and again on different clips.

And, luckily, doing all this by hand isn’t even necessary. Custom effects are stored as simple XML files in the ~/.kde/share/apps/kdenlive/effects folder, so it is possible to write a small Python script to automatically create as much gate weave as you want – something I’ll come back to.
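As a taster, here’s the heart of what such a script would do: generate one small random nudge per second. The exact XML wrapping varies between Kdenlive versions, so this sketch only builds the keyframe data (frame number plus x/y offset in pixels) that you would drop into a saved Pan and Zoom custom effect:

```python
# Sketch: generate per-second gate-weave keyframes, each nudged a few
# pixels from the starting position.
import random

def gate_weave(seconds, fps=25, max_shift=3, seed=42):
    """Return (frame, x, y) tuples: one keyframe per second of footage."""
    rnd = random.Random(seed)  # fixed seed: the same weave on every render
    keyframes = []
    for s in range(seconds + 1):
        x = rnd.randint(-max_shift, max_shift)
        y = rnd.randint(-max_shift, max_shift)
        keyframes.append((s * fps, x, y))
    return keyframes

for frame, x, y in gate_weave(5):
    print(f"frame {frame}: offset {x},{y}")
```

Keeping the offsets to two or three pixels is what makes the weave read as film rather than as a loose tripod head.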

As well as gate weave, you can also use the Pan and Zoom filter to stretch the frame, which is perfect for simulating stretched film. Again, that’s hopefully something I’ll return to another time.

Here’s an example of video moving with the Pan and Zoom filter:

The Pan and Zoom filter also adds hugely to your rendering time, so it’s best to switch it off until you do your final render.

Glow is a very important effect to add when simulating film, particularly monochrome film. Kdenlive does not have a glow filter, so if I need to add glow to a video file I have to improvise. I export the video as a PNG sequence, add glow to the PNG files using a GIMP batch script (written in Scheme), and then reimport the video file. It’s worth the effort, as it’s amazing how much glow helps to sell something as being originated on film.

Glow added using The GIMP

The GIMP glow filter tends to be rather harsh, and can wash out images if you use too much glow, so you have to experiment a lot.
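Incidentally, the glow effect itself is conceptually simple: blur a copy of the frame, then composite it back over the original with a “screen” blend so that bright areas reinforce each other. The blur comes from GIMP (or any image library), but the blend is just arithmetic – here’s a single-channel sketch in Python, with made-up sample values:

```python
# Sketch: the "screen" blend at the heart of a glow effect. Bright
# values push towards white; dark values are left almost untouched.
def screen_blend(base, glow):
    """Screen-blend two 0-255 channel values."""
    return 255 - (255 - base) * (255 - glow) // 255

print(screen_blend(200, 128))  # a highlight gets pushed towards white
print(screen_blend(10, 5))     # a shadow stays almost untouched
```

This asymmetry is exactly why glow washes out an image if over-used: every highlight climbs towards 255 while the blacks stay put, crushing the mid-range contrast.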

Finally, there is often uneven brightness or contrast visible across a film frame. In VirtualDub I used the Hotspot filter, which is actually designed to remove this effect from old film but turned out to be just as good at putting the effect in!

However, with Kdenlive, this effect is best achieved in the GIMP when required as Kdenlive’s Vignette effect is too unsubtle to be of any real use.

So, put it all together, and you get something like this:

All in all, Kdenlive does a pretty good job of making digitally originated images look like 16mm film, although there is room for improvement. The film scratches filter needs work, there is no glow filter and the film grain is really just noise rather than grain. However, you can still get some excellent results and I’m really pleased with it.

Walk Cycle Train

A lot of the Flash animation on the internet consists of characters blinking whilst the camera pans or zooms Ken Morse style. I can sympathise – the mere thought of, for instance, producing a walk cycle for an animated character can be really terrifying.

Indeed, I have a couple of projects on the back burner that I’ve put off for just that reason – they would require me producing an animated walk cycle and I really didn’t know where to start.

Sammy The Chamois

I remember producing a walk cycle for a Flash game called Sammy The Chamois from Alan Scragg‘s drawings. The walk cycle I produced was ridiculous and broke every known rule of anatomy and physics. Scraggie said he loved the way I’d broken all the rules in Sammy’s walk – I was too ashamed to admit that was because I didn’t know what the rules were!

However playing with Synfig Studio gave me a new impetus to think about animation again, and I started searching for walk cycles on the internet.

Dermot O’Connor’s website

I was lucky enough to come across a brilliant web tutorial in Flash by the Irish animator Dermot O’Connor. Dermot explains, over four videos, how to produce the classic Preston Blair animated walk cycle in Flash in the most clear and concise way imaginable. If you have ever been interested in animation I recommend that you look at them.

Having looked through these tutorials, I thought it would be a good exercise to produce Dermot’s rig in Synfig Studio and try animating the character in Synfig Studio instead of Flash. The term “rig” is a rather pretentious term for what is basically the digital version of a paper puppet jointed with brass paper fasteners.

Dermot’s Rig In Synfig Studio

Producing the rig in Synfig Studio was very straightforward. I simply traced Dermot’s drawing using Bline layers (a Bline layer is a layer made up of Bézier curves). The only tricky thing was getting the centres of rotation in the correct position. In Synfig Studio each Bline’s origin (and hence its centre of rotation) is the centre of the screen. That means you need to trace the shape, move it to somewhere near the centre of the screen to get the centre of rotation correct, and then move it back into position.

This became even more fiddly when I created Paste Canvas layers (what would be nested symbols in Flash) as you had to do a lot of mucking about to get the origin points correct. However, the great thing about Paste Canvas layers is that it is completely explicit whether a layer has other layers nested inside it. That meant I didn’t need to use Dermot’s asterisk convention to denote nesting.

Rotation Layers

The other main difference between a Synfig Studio rig and a Flash rig is that the rotation is provided by rotation layers, so these had to be added to the rig amongst the other layers.

The bright orange points are “linked”

For the arms and legs, I linked the two common nodes together so I could change the shape of the arms and legs in exactly the same way as Dermot could on his Flash rig.

Once the rig was set up, I could start animating. This was much easier in Synfig Studio than it would have been in Flash. The great thing about Synfig Studio was that I didn’t have to worry about shape hints for shape tweening. I didn’t have to decide whether I wanted a shape tween or a motion tween. I didn’t have to worry about creating new time-lines for nested layers and I could name my key-frames with meaningful labels rather than abbreviations such as “c” for “contact”, and then jump to them by clicking on the JMP in the layers panel.

My key-frames panel

The main disadvantage of Synfig Studio over Flash for animating is the lack of an outline mode. This means you have to do more layer hiding to animate the left-hand leg and arm than you would in Flash.

There were a couple of other niggles in Synfig Studio. Firstly, when moving multiple layers you have to make sure the canvas window has the focus before using the arrow keys – this was very annoying until I learnt to do it instinctively. Secondly, it would be great if there were a visual indication of whether a node has merged or split tangents, as there is in Inkscape.

This is what I did in ten minutes in Synfig Studio – it would have taken me a lot longer to achieve in Flash:

The walk cycle so far…

It’s not finished, as the arms are still very mechanical and I haven’t put the bend in on the feet. However, thanks to Dermot I now have both the confidence and the knowledge to try working on my own projects, either in Flash or Synfig Studio.

You can download the Synfig Studio rig I made from the Synfig Studio forum here.