And with that, I’m pretty much coming to the end of my recent buying spree of Transformers. Trypticon was one of the toys that never made it to UK shores, so I was always going to be a sucker for Hasbro’s current strategy of ‘let’s do all of G1, but this time with knees’. So I now have a two-foot dinosaur to go with my two-foot robot…but with no idea where I’m going to put them (although I guess I can delay a permanent decision there for a while).
Next week, more adult pursuits: getting my first US passport. Exciting times!
The talk suffered a bit from my usual deer-in-headlights presenting style, and from losing a week of preparation to wisdom teeth shenanigans, but hey, I implemented a 3x speedup of Ruby’s Pathname.absolute? in less than ten minutes, which isn’t too bad…
Going to be a busy six weeks coming up - as of yesterday, I have trips to DC, Cincinnati, and of course Bicester and London before the end of August. And then…and then, things get even more fraught as I begin my 2018 masterplan. Which is much less impressive than it sounds, but will involve quite a bit of upheaval. But all for the better in the end!
Hopefully, I will sleep a little more this upcoming week, too. And fewer exam stress dreams set back in Manchester. It’s been 17 years…
I can drink hot and carbonated drinks again. Which is good, because I seem to have bought all of the Diet Coke in Durham during the past two weeks of Target sales. I may have miscalculated how much even I can drink, but you can’t beat 4 cases of 12 cans for $8.88, can you?
Other than that…it has been a quiet week. I mean, really quiet. The most exciting thing of the week has likely been the discovery of 1980s Fox. Come back next week for more exciting updates!
tl;dr - a fully convolutional neural network trained to colourize surviving black-and-white recordings of colour Steptoe and Son episodes.
One of the most infamous parts of the BBC’s history was its practice of wiping old episodes of TV programmes throughout the years to re-use videotape. The most famous of these is, of course, Doctor Who, but it was far from the only programme to suffer in this way. Many episodes of The Likely Lads, Dad’s Army, and Steptoe And Son have been lost.
Every so often, an off-air recording is found. Recorded at the time of broadcast, these can be a wonderful way of plugging the archive gaps (who said that piracy is always a crime, eh?). If the BBC is lucky, then a full colour episode is recovered (if it was broadcast in colour). More often for the older shows, however, it’s likely that the off-air recording is in black and white. Even here, though, the BBC and the Restoration Team have been able to work magic. If a BW recording has colour artifacts and noise in the signal (known as chromadots), then the colour can be restored to the image (this was performed on an episode of Dad’s Army, ‘Room At The Bottom’).
If we don’t have the chromadots, we’re out of luck. Or are we? A couple of months ago, I saw Jeremy Howard showing off a super-resolution neural net. To his surprise, one of his test pictures didn’t just come out larger; the network had corrected the colour balance in the image as well. A week later, I was reading a comedy forum which offhandedly joked about the irony of ‘The Colour Problem’ being an episode of Steptoe and Son that only existed in a b/w recording…and I had an idea.
Bring out your data!
Most image-related neural networks are trained on large datasets, e.g. ImageNet or COCO. I could have chosen to take a pre-trained network like VGG or Inception and adapt it to my own needs. But the show was, after all, a classic 60s/70s BBC sitcom production - repeated use of sets, 16mm film outside, video inside, etc. So I wondered: would it make sense to train a neural network on existing colour episodes and then get it to colourize based on what it had learnt from them?1
All I needed was access to the colour episodes and ‘The Colour Problem’ itself. In an ideal world, at this point I would have pulled out DVDs or Blu-Rays and taken high-quality images from those. As I’m not exactly a fan of the show…I don’t have any of those. But what I did have was a bunch of YouTube links, a downloader app, and ffmpeg. It wasn’t going to be perfect, but it’d do for now.
To train the network, I produced a series of stills from each colour episode in two formats - colour and b/w. The networks I created would train by altering the b/w images and using the colour stills as ‘labels’ to compare against, updating the network as required.
For those of you interested, here are the two ffmpeg commands that did this:
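(The original commands didn’t survive the move to this layout, but an equivalent pair - filenames and frame rate are my assumptions - would dump colour stills in one pass and greyscale stills in another, via ffmpeg’s `format=gray` filter, giving matched colour/b&w frame pairs:)

```shell
# Dump colour stills from an episode at 25 frames per second
ffmpeg -i episode.mp4 -vf "fps=25" colour/frame_%06d.jpg

# Same frames, but converted to greyscale before being written out
ffmpeg -i episode.mp4 -vf "fps=25,format=gray" bw/frame_%06d.jpg
```

Because both passes use the same `fps` filter on the same input, frame `N` in `bw/` lines up with frame `N` in `colour/`.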
I was now armed with 650,000 still images of the show. I never thought my work would end up with me having over half-a-million JPGs of Steptoe and Son on my computer, but here we are.
But First on BBC 1 Tonight
Having got hold of all that data, I then took a random sample of 5,000 images from the full set. Why? Because it can be useful to work on a sample of the dataset before launching into the whole lot.
As training on a sample takes a lot less time than training on the entire dataset, not only does this allow you to spot mistakes much quicker than if you were training on everything, but it can be great for getting a ‘feel’ for the data and what architectures might or might not be useful.
FSRCNN
Normally, if I were playing with a dataset for the first time, I’d likely start with a couple of fully-connected layers - but in this case, I knew I wanted to start with something like the super-resolution architecture I had seen a few weeks ago. I had a quick Google and found FSRCNN (‘Accelerating the Super-Resolution Convolutional Neural Network’), a fully-convolutional network architecture designed for scaling up images.
What happens in FSRCNN is that the image passes through a series of convolutional layers that reduce it down to a much smaller representation, whereupon another set of convolutional layers operates on that smaller data. Finally, everything goes through de-convolutional layers to scale the image back up to the required (larger!) size.
Here’s a look at the architecture, visualized with Keras’s SVG model rendering:
(The idea of shrinking your data, operating on that instead and then scaling it back up is a common one in neural networks)
I made some modifications to the FSRCNN architecture. Firstly, I wanted the output image to have the same scale as the input rather than making it bigger. Plus, I altered things to take the input with only one channel (greyscale), but to produce an RGB 3-channel picture.
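This isn’t my actual training code, but a minimal Keras sketch of a modified FSRCNN along those lines - the d=56/s=12/m=4 filter counts come from the FSRCNN paper, while the same-size output and the one-channel-in/three-channels-out change are the modifications described above (everything else is illustrative):

```python
from tensorflow.keras import layers, models

def build_colourizer(height=64, width=64):
    inp = layers.Input(shape=(height, width, 1))                        # greyscale in
    x = layers.Conv2D(56, 5, padding="same", activation="relu")(inp)    # feature extraction (d=56)
    x = layers.Conv2D(12, 1, padding="same", activation="relu")(x)      # shrink (s=12)
    for _ in range(4):                                                  # mapping on the small representation (m=4)
        x = layers.Conv2D(12, 3, padding="same", activation="relu")(x)
    x = layers.Conv2D(56, 1, padding="same", activation="relu")(x)      # expand back to d
    # Plain conv instead of FSRCNN's deconvolution: output stays the same size
    out = layers.Conv2D(3, 9, padding="same")(x)                        # RGB out
    return models.Model(inp, out)

model = build_colourizer(64, 64)
```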
Armed with this model, I ran a training set…and…got a complete mess.
Well, that worked great, didn’t it? sigh
Colour Spaces - From RGB to CIELAB
As I returned to the drawing board, I wondered about my decision to convert from greyscale to RGB. It felt wrong. I had the greyscale data, but I was essentially throwing that away and making the network generate 3 channels of data from scratch. Was there a way I could instead recreate the effect of the chromadots and add it to the original greyscale information? That way, I’d only be generating two channels of new synthetic data and combining it with reality. It seemed worth exploring.
The answer seemed to be found in the CIELAB colour space. In this space, the L co-ordinate represents lightness, _a*_ is a point between red/magenta and green, and _b*_ is a point between yellow and blue. I had the _L_ co-ordinates in my greyscale image - I just had to generate the _a*_ and _b*_ co-ordinates for each image and then combine them with the original _L_. Simple!
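As a sketch of that recombination (assuming scikit-image for the colour-space conversions; the `recombine` helper is my own name, not from the post), the idea is to keep the real _L_ channel and only splice in the two synthetic channels:

```python
import numpy as np
from skimage.color import rgb2lab, lab2rgb

def recombine(grey_l, pred_ab):
    """grey_l: (H, W) lightness in [0, 100]; pred_ab: (H, W, 2) predicted a*/b*."""
    lab = np.concatenate([grey_l[..., None], pred_ab], axis=-1)
    return lab2rgb(lab)  # back to RGB floats in [0, 1]

# Round-trip demo on a random 'colour frame': rgb2lab splits out L, a*, b*,
# and recombining the true L with the true a*/b* should restore the image.
rgb = np.random.rand(8, 8, 3)
lab = rgb2lab(rgb)
restored = recombine(lab[..., 0], lab[..., 1:])
```

In training, `pred_ab` would come from the network instead of `rgb2lab`, so only two channels are synthetic.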
Other Colorization Models Are Available
While I was doing that research, though, I also stumbled on a paper called Colorful Image Colorization. This paper seemed to confirm my choice of moving to the CIELAB colour space and also provided a similar architecture to FSRCNN, but with more filters running on the scaled-down image. I blended the two architectures together with Keras, as I wasn’t entirely convinced by the paper’s method of choosing colours via quantized bins and a generated probability distribution.
Here’s what my architecture looked like at this point:
Sepia. Not wonderful. But better than the previous hideous mess!
Let’s Make It U-Net
Okay, so maybe Zhang et al. had a point, and I needed to include that probability distribution and the bins. But…looking at my architecture again, I had another idea: U-Net.
U-Net is an architecture that was designed for segmenting medical images, but has proved to be incredibly strong in Kaggle competitions on all sorts of other problems.
The innovation of the U-Net architecture is that it passes information from the higher-level parts of the network at the start across to the scaling-up side on the right, so structure and other information found in the initial higher levels can be used alongside information that’s passed up through the scaling-up blocks. Yay more information!
My existing architecture was basically a U-Net without the left-to-right arrows…so I thought ‘why not add them in and see what breaks?’.
I added a simple line from the first block of scaling-down filters to the last block of the scaling-up filters just to see if I’d get any benefit. And…finally, I was getting somewhere. Here’s the current architecture - the final Lambda layer is just a multiplication to bring the values of the two new channels into the CIELAB colour space for _a*_ and _b*_:
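(In Keras terms, that single skip connection looks something like the sketch below - layer sizes are illustrative, not my real ones. The first down-block’s activations get concatenated onto the last up-block, and the Lambda at the end scales the output into the a*/b* range:)

```python
from tensorflow.keras import layers, models

inp = layers.Input(shape=(64, 64, 1))
d1 = layers.Conv2D(32, 3, padding="same", activation="relu")(inp)           # first down block
x = layers.Conv2D(64, 3, strides=2, padding="same", activation="relu")(d1)  # scale down
x = layers.Conv2D(64, 3, padding="same", activation="relu")(x)              # bottleneck
x = layers.Conv2DTranspose(32, 3, strides=2, padding="same",
                           activation="relu")(x)                            # last up block
x = layers.concatenate([x, d1])                                             # the U-Net-style skip
ab = layers.Conv2D(2, 3, padding="same", activation="tanh")(x)              # two new channels
ab = layers.Lambda(lambda t: t * 128.0)(ab)                                 # scale into CIELAB a*/b*
skip_model = models.Model(inp, ab)
```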
(it turns out that Zhang and his team released a new paper in May that also includes U-Net-like additions, so we’re thinking on the same lines at least!)
You Have Been Watching
Here’s a clip of the original b/w of ‘The Colour Problem’ side-by-side with my colourized version:
Jun 30, 2017 · 2 minute read
I no longer have a green card! Or anything that identifies me as a valid citizen other than a large, watermarked piece of paper. Which is likely worse…but better, once I get the new passport sorted out. Anyhow, the citizenship ceremony was fairly straightforward, except that our current President still hasn’t got around to recording a welcome message for new citizens, and he was completely absent from all the printed material. Not that most of us minded when we realized.
After a weekend of boardgames, Doctor Who, buying 8 cases of Diet Coke, and a longer-than-expected-dinner-because-reasons, it was Monday and time to celebrate being an American in style: getting my wisdom teeth removed. I have always been hesitant to have them out, due to one of them being apparently close to one of my jaw nerves. But the dentist here seemed very keen on yanking them out, and well, I was getting tired of the infections they kept bringing. I was informed that I have the smallest mouth she’s ever seen…but I was also one of the nicest patients she’s had. So swings and roundabouts there (everybody seems to agree on the small mouth thing…BUT NOBODY EVER SAID THAT TO ME BEFORE! But fine. Sure.).
Thankfully, it was fairly quick and easy to yank them out, but as everybody thought it would be a bad idea for me to be alone afterwards, Tammy drove me all the way back to Kentucky for the week. I spent the 8-hour car ride apologizing every five minutes, so I’m grateful that she didn’t throw me out in West Virginia.
Aside from running a fever on Wednesday and making the well-intentioned mistake of eating pizza yesterday, it hasn’t been too bad. Some pain, but not unbearable, not too much swelling, and the nerve ended up not being a problem. Hurrah! Though I am looking forward to being able to drink Diet Coke and tea again next week.
In the meantime, I am hopped up on ibuprofen, antibiotics, and percocet, whilst doing work and dying over and over on Zelda1. Tomorrow, mostly recovered, I fly home to Durham. Home for now, anyhow…
Buried lede - I have a Switch! Not with the colour of joycons that I wanted, but after looking for…two months for the thing, I decided to get what was available. I am so bad at Zelda, but slightly better at Mario Kart 8. ↩︎
Jun 19, 2017 · 2 minute read
As a point of comparison to American readers, I’d say that Cant was something akin to Mr. Rogers in terms of importance to younger viewers. But whereas, to this foreigner, Rogers comes across as the classic American staid gentleman next door, Cant and the Playaway/Play School set gave the impression that they spent their evenings debating Marxist tracts in their omnisexual polyamorous commune.1 They were firmly pitched in the ‘now’…even if that ‘now’ seems a lost time and place for us, particularly in the light of recent events.
The other thing about Cant is that he was always there. We grew up with him on Playaway and Play School, but even after going to school, you’d see him somewhere when you were off sick and watching the schools programmes. Or when he was doing a rotation on Jackanory. Or during the Summer when repeats of Camberwick Green or Trumpton would fill in time on But First This… And then even later at university in the late 90s, where he did The Organ Gang shorts for This Morning With Richard Not Judy. A reassuring twinkling smile when you’re drinking a Lemsip…with a hangover on a Sunday morning at the age of 28.2
You can’t claim that he was one of the Pythons, or up there with Spike Milligan, but he had that sparkle of safe anarchy that us British and children in general love:
You could never describe Cant as cool. And yet, a generation of us watched him whilst we were small and knew that’s what we wanted to be when we grew up. Even if we didn’t realize it until much later.
Obviously, no harm to Fred Rogers, who was a legend on his own terms. ↩︎
Jun 18, 2017 · 1 minute read
My family got an extra night in the US after their plane failed to reboot properly. As of right now, they’re delayed another two hours on their second attempt. At this rate, they’ll be here for my citizenship ceremony on Friday. But hopefully they’ll get underway this evening. After all, the cats back home are getting hungry without their supply of treats…
Jun 12, 2017 · 2 minute read
22:00 BST. When the Exit Poll fell.
Although I didn’t make it explicit last week for fear of jinxing it, the YouGov/Survation polls of last week didn’t just make me hope. Given the completely inept way the Tories ran their campaign, the other polls just seemed wrong - surely we wouldn’t give somebody a 100+ seat majority when they spent six weeks seemingly hiding from the press?
We did not.
And we laughed and laughed and laughed. The Tories achieved an amazing Pyrrhic victory, managing to lose a 25-point lead to a man who just two months ago looked like he was taking Labour to the point of destruction.
But they were wrong.
Along with Macron’s En Marche giving people a trouncing in the French elections, things might be looking up1? Just maybe?
In less globally-important news, my citizenship interview went well, and I will become a US citizen on June 23rd. I will celebrate by having my wisdom teeth taken out on the following Monday. I know how to have a good time, y’know.
And, a good time was had this weekend - a full house with many friends, Tammy and I spending Sunday making cakes, ice creams, other pastry items, and then me abandoning her to cook all the chicken. But: so many people that even the extended table wasn’t enough for everybody. Pools and slip’n’slides as well!
I can, at request, go into lengthy detail why the “Bernie-would-have-won” brigade shouldn’t take this as vindication, but I’ll just leave you here with my précis: Jeremy Corbyn won the Labour leadership twice, the second time with more PLP shenanigans than the DNC committed even in your wildest fever dreams. Come back when you don’t lose by 3 million votes. ↩︎
Jun 5, 2017 · 1 minute read
I have started to get invested in the polls (at least YouGov/Survation). This will inevitably lead to a crushing sensation around 17:00 EDT on Thursday when the party that has run the worst GE campaign in living memory gets elected in a landslide.
It’s not the despair, Laura. I can stand the despair. It’s the hope!
To cheer us all up, here’s Gyles Brandreth on his short career as an anti-porn crusader in the 1970s:
May 29, 2017 · 3 minute read
I don’t have too many vivid memories of childhood1, but I do remember one day in primary school. We had been given a project to make a small wheeled vehicle, and we were out testing them on the netball court. It was a sunny, hot day.
My car was a mash of Construx and ice-cream cartons2, while Scott’s was a constructed kit affair with a proper motor and gearing. Mine had a huge power block with D batteries and not a single gear in sight. It had worked fine in testing. But testing was my table back home, not the tarmac of the court.
It started, it spluttered, it threw off the belt I had liberated from one of the many video recorders my Dad had let me disassemble. It sat on the court while Scott’s fancy car zipped away in the distance.
The lesson here, surprisingly, is not to learn about gearing, or that the rich kid will almost always win, but that if you’re wearing a jumper on a hot summer’s day and you’re still freezing, you might be ill. And so, defeated, I lay back on the grass and shivered until my Mum came to pick me up.
I did build a computer this weekend. It works. My first build since…2006 or so (and even then, that was just replacing a blown motherboard. This is likely my first totally new tower since before UNC…and let’s not count up those years!). I can’t really do anything fancy with it until I buy the graphics card in a month or two, but it’s coming along.
Success, then…but I am still under a blanket, shivering, having difficulty standing up or sitting down without pain, and oh, yes, almost managing to give myself third-degree burns whilst attempting to carry a Lemsip. Perhaps I shouldn’t be left alone. Maybe I’ve become allergic to Durham! Maybe I’m just sick.
Still, a week tomorrow, my family arrives and I have my citizenship interview. So probably need to get better.
Ask me about Covent Garden, and I’ll do my party piece about how I was abandoned in the middle of London at three years old and left to fend for myself amongst a carny of street performers, armed only with an inflatable hammer but also afflicted with an early adherence to pacifism. My parents may chime in with ‘we were getting ice-cream! You said you didn’t want any! What child doesn’t want ice-cream?', but I don’t think that alleviates them of guilt, do you, dear reader? ↩︎
Look, eventually I started liking ice-cream, okay? NOT THAT I DON’T RELIVE THE TRAUMA WITH EVERY SPOON. ↩︎