Escape Room Weekend 2

Second week of the Escape Room is over. One more week! I still can’t tell you much about it because of spoilers, but hopefully there’ll be a bit more discussion of it next weekend.

Some reshuffling of projects underway — Camouflage is being vastly scaled back, as I’m not going to be able to meet the end-of-April deadline to get it started, and I don’t think I can really offer the support needed to run a year-long competition. Instead, it’ll just be a very silly paper that comes out at the end of the year. My comic anthology project is about to run into the brick wall of ‘Nicolás runs out of story to draw’, which is obviously nobody’s fault but my own…I have two more stories for the project, but have I written anything this month? Of course not.

But I have taken on another project, which we’ll call Cat Stevens for now. It’s an AI collaboration project! And should hopefully be wrapped up (mostly) by next weekend, provided I get HTML columns working and some little details ironed out. More on that soon!

I think April might be a month of cleaning up and rearranging things! I’m hoping to sort out a few things in the bar (more shelves, setting up the MiSTer station, etc.), and I do need to clear at least one wall for hanging up a poster. And maybe we can make the main bedroom look like somebody lives there by putting things on the walls. It’s only been four years, after all…

Escape Room Weekend 1

[Instagram post shared by Ian Pointer (@carsondial)]

(more next week)

CLIPPed Faces

When I was building the Adam Curtis search engine, I noticed that the combination of CLIP and FAISS was pretty good at coming up with good suggestions for “Tony Blair” and any other figure that you care to name that pops up throughout the documentaries. And I wondered - I’m sure most of these people turn up in the CLIP training set, being historical figures, but how well would it do on somebody that almost certainly wasn’t (e.g. me)?

import torch
import faiss
import clip
import numpy as np
import faiss.contrib.torch_utils
import glob
from PIL import Image

model, preprocess = clip.load("ViT-B/32", device='cpu', jit=False)
def encode_image(filename, id):
    # Encode one image into a normalized CLIP embedding, plus a 64-bit id tensor for FAISS
    with torch.no_grad():
        image_tensor = model.encode_image(
        image_tensor /= image_tensor.norm(dim=-1, keepdim=True)
    return image_tensor, torch.tensor([id], device="cpu", dtype=torch.long)
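(As an aside, that divide-by-the-norm line is what makes everything downstream work: on unit-length vectors, the plain inner product that a flat FAISS index computes is exactly the cosine similarity — assuming, as the normalization suggests, that the Curtis index is an inner-product one. A quick NumPy sanity check:)

```python
import numpy as np

# On unit-length vectors, the inner product equals the cosine similarity --
# which is why encode_image divides each embedding by its norm before indexing.
rng = np.random.default_rng(1)
a = rng.normal(size=512)
b = rng.normal(size=512)

cosine = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
a_hat = a / np.linalg.norm(a)
b_hat = b / np.linalg.norm(b)

print(np.isclose(a_hat @ b_hat, cosine))  # True
```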

Photos of me!


Okay, so we’ll grab two images of me and encode them into vectors with CLIP - but obviously, if those are the only two vectors to choose from, it’s not much of a test. But! I have the Adam Curtis FAISS index just lying around, and it has a great selection of talking heads and other things that makes for a relatively decent test (as Peter Snow would say, this is “just a bit of fun”). I know the IDs in the Curtis dataset are 64-bit, so I’m going to fudge it for the new entries by using low-digit IDs that won’t collide (and yes, I checked beforehand, just to be on the safe side).

faiss_index = faiss.read_index("curtis.idx")
ref_tensor, ids = encode_image("IMG_0331.jpg", 111)
faiss_index.add_with_ids(ref_tensor, ids)
check_tensor, _ = encode_image("IMG_4069.JPG", 0) 
distances, indices =, 5)
tensor([[                111, 1914273147895908598, 1911355963158792438,
         1914859604205340918, 1910780652289493238]])
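(For the curious, what `search` is doing on a flat index is nothing magical - just an exact inner-product scan and a top-k sort. Here’s a toy NumPy version with random stand-in vectors, not the real Curtis embeddings:)

```python
import numpy as np

rng = np.random.default_rng(0)

# 150 random unit vectors standing in for the Curtis CLIP embeddings...
db = rng.normal(size=(150, 512)).astype(np.float32)
db /= np.linalg.norm(db, axis=1, keepdims=True)

# ...and a "second photo of me": entry 42 plus a little noise, re-normalized
query = db[42] + 0.01 * rng.normal(size=512).astype(np.float32)
query /= np.linalg.norm(query)

# A flat inner-product search is just a matrix multiply and a top-k sort
scores = db @ query
top5 = np.argsort(-scores)[:5]
print(top5[0])  # entry 42 comes back first, just as the real photo did
```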

The big surprise is that it does seem to work, with no additional training necessary! With just one photo added, it’s already matching me amongst a set of 150k different images. Not too bad! Let’s just make sure it’s working okay by doing a test against a picture of Mr. Tony Blair.

blair_tensor, _ = encode_image("blair.jpg", 0)

distances, indices =, 5)
tensor([[7189953302034286838, 6916916776419100918, 7336140665801869558,
         7180745063950354678, 8920012398545733878]])

And here’s the first result, which is definitely Blair."/home/ian/notebooks/curtis/web/app/static/images/the_trap03_10415556398636929516_003333_7189953302034286838.jpg")


Anchoring With Text

The other thing I noticed when working on the search engine is that CLIP is pretty good at reading text. So another approach we could try is adding my name to one of my photos, encoding that and a bunch of other photos of me, then doing a search for “Ian Pointer” and seeing what comes back - whether it can anchor on the text on that image and whether the vector representation of that image is close enough to pull in the other versions of me in the index.

faiss_index = faiss.read_index("curtis.idx")

ref_tensor, ref_ids = encode_image("ian_text.jpg", 111)
check_tensor, check_ids = encode_image("IMG_4069.JPG", 222)

faiss_index.add_with_ids(ref_tensor, ref_ids)
faiss_index.add_with_ids(check_tensor, check_ids)
text_features = model.encode_text(clip.tokenize("Ian Pointer").to("cpu"))

text_features /= text_features.norm(dim=-1, keepdim=True).float()
r ='cpu').float()
distances, indices =, 5)


tensor([[ 368881852814592245, 2327282344735017205, 9001564851015125238,
         6284035229181315318, 6287482820904650998]])

Okay, that didn’t work so well. But! What if we do something really stupid and just add more instances of my name on the photo so CLIP takes the hint?

faiss_index = faiss.read_index("curtis.idx")

ref_tensor, ref_ids = encode_image("ian_text_lots.jpg", 111)
check_tensor, check_ids = encode_image("IMG_4069.JPG", 222)

faiss_index.add_with_ids(ref_tensor, ref_ids)
faiss_index.add_with_ids(check_tensor, check_ids)

text_features = model.encode_text(clip.tokenize("Ian Pointer").to("cpu"))
text_features /= text_features.norm(dim=-1, keepdim=True).float()
r ='cpu').float()
distances, indices =, 5)

tensor([[                111,  368881852814592245, 2327282344735017205,
         9001564851015125238, 6284035229181315318]])

So if you add a bunch of text to the image, CLIP will “read” it, but it feels like that information lives in a separate cluster of the vector space from the rest of the image details, as we’re not seeing the other picture of me come back in the remaining results. This surprises me a little, as I was seeing the opposite in test queries against the Curtis database, but it’s likely those were matching much more strongly in the image clusters of the vector space regardless of the text it was finding.
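(One way to see why this can happen: cosine similarity isn’t transitive. The text query can sit close to the captioned photo without the captioned photo sitting anywhere near the plain one. A deliberately silly three-dimensional toy - nothing to do with CLIP’s actual 512-dimensional geometry - makes the point:)

```python
import numpy as np

def cos(a, b):
    # cosine similarity of two unit vectors is just their inner product
    return float(a @ b)

q = np.array([1.0, 0.0, 0.0])               # stand-in for the "Ian Pointer" text vector
t = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)  # the photo with my name stamped on it
p = np.array([0.0, 0.0, 1.0])               # a plain photo of me, no text

print(cos(q, t))  # ~0.707: the query finds the captioned photo...
print(cos(q, p))  # 0.0: ...but not the plain photo...
print(cos(t, p))  # 0.0: ...and the captioned photo doesn't pull it in either
```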


But what if I want to fool the system? Here, I’m taking the picture of Tony Blair, adding my name over the top (which makes me feel a little dirty, but whatever), and then searching for “Ian Pointer” again. Will CLIP be confused?

faiss_index = faiss.read_index("curtis.idx")

ref_tensor, ref_ids = encode_image("blair_text.jpg", 111)
check_tensor, check_ids = encode_image("IMG_4069.JPG", 222)

faiss_index.add_with_ids(ref_tensor, ref_ids)
faiss_index.add_with_ids(check_tensor, check_ids)

text_features = model.encode_text(clip.tokenize("Ian Pointer").to("cpu"))
text_features /= text_features.norm(dim=-1, keepdim=True).float()
r ='cpu').float()
distances, indices =, 5)

tensor([[                111,  368881852814592245, 2327282344735017205,
         9001564851015125238, 6284035229181315318]])"/home/ian/notebooks/curtis/web/app/static/images/cant4_15115732058354422251_003609_368881852814592245.jpg")


Again, CLIP zeroes in on the text, but the rest of the returned search items are exactly the same as before, so I don’t think it has been fooled all that much.

Wrapping Up

In this strenuous and stringent bit of testing, it seems like CLIP actually does have some value as a zero-shot facial identification system. Which is vaguely terrifying. It might be interesting to expand this idea further, maybe with larger and more appropriate datasets like CelebA, or if you just happen to have a few million photos hanging around.

And remember, don’t have nightmares.

Bryan Adams Just Won't Leave

It’s been a while, but The Story of… is finally back as an accompaniment to the Friday night repeats of Top of The Pops. And isn’t it just a fun experience when they get to an act that these days looks like somebody you’d bump into on the high street as they come out of New Look, and they’re just full of joy about their time in the chart and their appearances on the show. It’s something I’ve missed since it seemed to be put on hiatus due to the pandemic and the semi-closedown of BBC4.

Also, watching it with an American is recommended. It’s more fun when you have to pause to explain who Betty Boo, Chesney Hawkes, and The Wonder Stuff are, or expound on your declaration of “of course Bob is there! It’s Vic and Bob!”, when they mainly know Bob Mortimer from Would I Lie To You?. And “yes, the ice creams are playing guitar. It’s the KLF, you just have to go with the flow”.

Next week? PEAK MR. C!

Actually, it’s been a banner week for iPlayer, as somebody deep in the bowels of the BBC has decided to put up The Young Ones and all of 2point4children. Sadly, the former are the cut versions, but I have the DVDs for them. 2point4children, though, is completely uncut, including those pesky music cues, and it’s probably the great lost big BBC sitcom of the 90s…and now, thanks to get_iplayer, I have all of it.

In local news, our escape room is really about to happen! We’re going to be putting the finishing touches on it this week, and our first run-throughs take place next weekend! For those of you reading that are in the area and would like to take part, why not head over to the website and see what dates are still available? Come see what we’ve done to the basement!!

Finally, keep your eyes peeled for, yes, some technical content this week! I know all of you reading this blog have been itching for some more neural network action, so that’s what you’re going to get. I remain somewhat obsessed with CLIP (though, in fairness, this upcoming entry is an idea I had about a year ago and just never actually sat down for a couple of hours to try it out until this week).

Neunundneunzig Luftballons

I will say that the weekend got off to a rather mixed start. On the bright side, I ended up not having to go to NC for 24 hours after all. Which was nice! On the other, I got about three hours sleep on Friday night and had to deal with a cat going to the toilet on the carpet at 1am. Less fun.




I may have found a source for a copy of Emergency Ward 9, the last Dennis Potter play I don’t have (okay, the last one I don’t have that wasn’t wiped. I’m not magic). Plus a few other bits and pieces…basically, the Spring will be filled with the joys of BBC plays from the 60s and 70s if I get my way!

Meanwhile, in other BBC Archive news…it was finally the week that we got to see this majestic bit of Top of The Pops footage:

And as if the heavens have opened, the BBC are finally going to be showing The Story of 1991/2 in the next fortnight. It’s like Jimmy broke the seal.

Somehow, it’s March?

(Annihilation Mix)

All in all, I’m a touch more concerned that my flights next week go through DC’s National Airport than I was, ooooh, a week or so ago. And I have also been told that my childhood rationale of “we live near so many military bases that we’d all be wiped out in a first strike” is…not exactly comforting.

Anyway, aside from impending world-wide doom, the art for my first comic is done! Now all I need is somebody to letter it…and write the rest of the stories. Easy! 🥺

And on to March. Which, fingers crossed, is going to be Escape Room Month. We’re going to get it done!

IPC Subeditors Dictate Our Youth

First up - this is an amazing Humble Bundle package of British comics ranging from the 60s right up to the present day, including some of the most celebrated Dredd epics of the last few years. You might want to check it out!

I have a four day weekend! Which means I will likely feel guilty by the start of Monday for not doing half of my original (wildly over-optimistic) plans. I’ll be beating myself up all day and going into the start of work feeling like I’ve already failed. I’m not saying it’s a good system…

But! We have done some things! Tammy has made great progress on the Escape Room downstairs - we now have a new fireplace and an almost complete fake wall (with a window!). It looks like we’ll finally be running it throughout March. I also made carnitas and we played our first game of Mind MGMT, which is a great hidden movement game that doesn’t take four hours to play (looks over at Fury of Dracula). So maybe it wasn’t too bad…

Announcing: Lewknor Turn

Hello. Comics!

Lewknor Turn

The Next Stop Is Lewknor Turn is a story in an upcoming collection1 I’m putting together. Pencils and inks are by the great Nicolás Nieto, letterer Still To Be Decided. There will be a website forthcoming when there’s a bit more to show off, but it’s looking good so far.

(This is the Project Formerly Known as Morning After Pill, named after a Meanwhile Back In Communist Russia song that references Oxford’s train station. Obviously. Just try to guess what Project Camouflage is…)

In other news, I have splurged a little on the basis of my imminent tax return (I always overpay taxes, so I get a significant refund this time every year. Is it incredibly inefficient? Sure, but I’d rather have that than owe something), and I now have a proper chocolate-grade airbrush and a fancy compressor coming my way. There will be coloured chocolates to come later this year…perhaps starting with some quite swish Easter eggs. It’ll have to wait until after our Escape Room Shenanigans are finished, though. And that’s coming together too, with furniture placed and the fake walls all up. At the end of the month, we will be taking the basement to the 1920s!

And that’s about it for this week, although I did also give beeswax-lined canelés a try out this weekend. Sadly, not quite as good as my butter-lined molds, but I think that was mainly down to having a little too much beeswax/ghee mixture in the bottom of the mold, resulting in less browning on top than I’d normally expect. Very crunchy though!

  1. It’s likely to be digital-only, but we’ll see what paper prices are like by the end of the year, shall we? ↩︎

Paul Robinson Turns Erinsborough Into A Chemical Site

I really have to start keeping notes during the week again; I find myself with little to say this weekend, except that the news that Neighbours may be ending after almost 40 years makes me a little sad, and I hope it ends with Karl Kennedy doing a ten-minute speech to camera in the same manner as Jimmy Corkhill in the last episode of Brookside. It was never what you’d call quality television, but it was a comforting presence whenever I go home. And it’s one of those weird shows where a foreign programme is kept alive by another country — the only other example I can really think of is how the BBC stepped in to help fund the final series of Due South.

Otherwise, I guess I’ll see you all back here in another seven days; hopefully I’ll have something more interesting to talk about…

Briefly, Raleigh

I was hoping to have a long entry on my Raleigh visit…but it was somewhat restricted to about three blocks of downtown and the airport, so not a lot of interesting things happened. I will say that the Kouign-Amann at Lucettegrace is really good, and unfortunately my memories of Beasley’s are always better than the reality. I even came home a day early - not just because of the disappointments, but not not because of them either.

Sometimes I like to watch 90s documentaries on the Internet just to remember optimism. You may need to click through to fully experience the 90s Channel 4 goodness.

By the end, though, I’m always left with this feeling.


Anyway, I guess a slightly melancholy end to the month, but that possibly has a lot to do with the -17ºC temperature at night…with -21ºC coming this following weekend! 😮🥶

And I just broke a Kitchen-Aid mixing paddle I’ve had for ten years, so it’s going great…