A message from back home

The Secret Service And Other Stories

Well, I was going to write something about the Californian Ideology and how it got us into our current nightmare, but Dan Hon has said it better than I would. I’d only add that we’re likely downplaying just how much tech people bought into the things that brought us here, just like how we downplay how the IEA went hand-in-hand with pirate radio, swinging London, and Keith Joseph. There’s a direct line from the hippies to Thatcher/Reagan, and there’s another between the cypherpunks and today.

Now, in happier news, I need to tell you about a book. I haven’t finished it yet, but you need to run, not walk, and get hold of The Secret Service. You should probably get the eBook rather than hunting down an expensive physical copy (h/t Jennifer Hodgson).

It’s a book about spies in the early 19th century who can turn themselves into objects. It opens in medias res with one of the main characters as a goblet. It is absolutely glorious. Run, don’t walk.

Finally this week, I recommend that you all, like me, watch Taggart from the beginning. The Glasgow high-rises! The early episodes switching from 16mm to videotape at the drop of a hat! The Biscuit! A very naked 21-year-old Alan Cumming! Taggart taking the piss out of Jardine every five minutes!

What's A Computer?

I am struggling to think of anything interesting that happened this week here in Durham. I sent off tax documents! I hoovered up! I discovered that I had missed the last two re-releases of Blue Monday, so I ordered them! I…went to Kroger! And…and…no, I’m afraid I have nothing.


I did make some chocolates with a fancy new mold…but really that’s it. I have let you all down and I can only apologize.

Panic In Granadaland

Unlike many (at least 50,000, right?), I wouldn’t class myself as a huge fan of The Fall. I bought Hex Enduction Hour shortly after graduating from university, liked The Classical, but bounced off everything else. And I never really went back to the well, aside from rediscovering Hit The North a while back, through somebody else who isn’t with us anymore.

But it feels like another big chunk of the North was lost to us this week (it’s no mistake that I picked a clip with Tony Wilson, obviously).

Two things, though. Firstly, a lot of the eulogies skipped over or made light of just how bad Mark E. Smith could be. Secondly, I had no knowledge of him actually assaulting people…except that when a friend on Twitter called out a few people for erasing this part of his life, I found my way to the NME story. Which is datelined March 16th 1998. During the time when I bought and read the NME every week. There’s no way I didn’t come across this news story when it actually happened. And yet, it made no impact on me.

This is not to Milkshake Duck the man. But it’s probably better not to sweep the uncomfortable parts under the table. He’d probably not want it any other way, either.

Back in Durham, and I’ve stopped looking at this house as a home and started seeing it more as a collection of ‘well, I’m going to have to pack those up, and do I really want to take that?’ Which is perhaps not the healthiest attitude to have, as I’m still here for quite a while yet, but I can’t really help it. A few months left…

The Ant-Let And Other Stories


Robert was sad that I shot down all the antler-themed suggestions for the bar downstairs, so I gave in to the ‘ant-let’. That’s it, though! Anything larger and it starts to resemble something out of Hannibal.

I have met neighbours. They seem nice! And happy that the house is going to be lived in (even if that’s going to be sporadic until May). I have learnt gossip about the former owners and I have shovelled my driveway clear of snow. Am I a Midwesterner now?

One more week here in Cincinnati, and then back to Durham again! I feel like I have made a start on moving in (even played the first board game in the house and put up a bookcase!), but still early days yet…

View From The New Office

Picture the scene: on the left, a terrified British person clawing at the seat, trying not to be obvious about it, but as usual hating every moment of the flight to Chicago. On the right, a slightly older man, holding a double vodka (on the rocks, of course), who we’ll call Blake.

Blake: I’m a VP of a packaging company, and I’m a great negotiator. You should do this, this, and this at your job.

Terrified British Person: nods politely, trying not to wince out loud when the inevitable conversation about Churchill comes up.

Blake: And the thing about Churchill is that he was a hero, standing alone!

Terrified British Person: nnnnnnng

Perhaps unsurprisingly, I enjoyed the flight back to Durham a lot more, as my neighbour was asleep. Still, at the very least, I did make sure to point out who won the 1945 General Election.

Anyhow, after a short trip to Chicago for something I can’t fully reveal yet, I’m in Cincinnati for two weeks. Yes, after buying a house and running all the way back to Durham, I’m finally here again.

And of course, my trip coincides with a winter storm. Inches of snow on the ground. I haven’t really had to deal with snow before! After an embarrassing event where Tammy and I had to improvise to dig her car out of my driveway, I have ordered a shovel, and I must forever hide from the neighbour across the road who was clearly judging us as he used his fancy snow plough to clear his driveway.

Still, the view is pretty, right?


(my next door neighbour, who I met today after deciding that dumping the post and running wasn’t polite, assures me that I won’t see this much snow every Winter)

Thanks to Tammy, I have a house full of furniture, the beginnings of a full-on library (insert evil cackle as I imagine ALL THE BOOKCASES), and an actual bed to sleep in. Hurrah! Oh, and enough Diet Coke to last 14 days. Maybe. If I ration it. It’s becoming more of a home!

Painting All The Things


A new year, and the beginning of a new era where I spend half my time in Durham, and the other half in Cincinnati. Hopefully, this should only last as long as it takes to get Driver ready for sale and all my things moved up north, but at least this week it means I only faced temperatures of -12°C instead of -20°C.

As part of fixing the house up, Tammy came down for the New Year and painted the bathroom and the utility room. While I did roll some paint across walls, I just took orders from the person who knew what they were doing. And you can’t argue with the results - the main bathroom actually looks like a finished room…which is something it has lacked since 2013. So, hurrah!

Next week Chicago for a day(!), and the first real trip to my new home. Expect fun photos of a new washing machine in your exciting future!

Class Activation Mapping In PyTorch

Have you ever wondered how a neural network model like ResNet decides that an image is a cat rather than, say, a flower in a field? Class Activation Mapping (CAM) can provide some insight into this process by overlaying a heatmap on the original image, showing us where our model looked most strongly when deciding that this cat was indeed a cat.

Firstly, we’re going to need a picture of a cat. And thankfully, here’s one I took earlier of a rather suspicious cat that is wondering why the strange man is back in his house again.

%matplotlib inline

from PIL import Image
from matplotlib.pyplot import imshow
from torchvision import models, transforms
from torch.autograd import Variable
from torch.nn import functional as F
from torch import topk
import numpy as np
import skimage.transform

# Load the photo of Casper and display it inline
image = Image.open("casper2.jpg")
imshow(image)

[image: Casper the cat]

Doesn’t he look worried? Next, we’re going to set up some torchvision transforms to scale the image to the 224x224 required for ResNet and also to normalize it with the ImageNet mean/std.

# Imagenet mean/std

normalize = transforms.Normalize(
   mean=[0.485, 0.456, 0.406],
   std=[0.229, 0.224, 0.225]
)

# Preprocessing - scale to 224x224 for the model, convert to a tensor,
# and normalize with the ImageNet mean/std

preprocess = transforms.Compose([
   transforms.Resize((224,224)),
   transforms.ToTensor(),
   normalize
])

display_transform = transforms.Compose([
   transforms.Resize((224,224))])
tensor = preprocess(image)
prediction_var = Variable((tensor.unsqueeze(0)).cuda(), requires_grad=True)

Having converted our image into a PyTorch variable, we need a model to generate a prediction. Let’s use ResNet18, put it in evaluation mode, and stick it on the GPU using the CUDA libraries.

model = models.resnet18(pretrained=True)
model.cuda()
model.eval()

This next bit of code is swiped from Jeremy Howard’s fast.ai course. It basically allows you to easily attach a hook to any model (or any part of a model - here we’re going to grab the final convnet layer in ResNet18) which will save the activation features as an instance variable.

# Hook helper: attaches a forward hook to a module and stores its output
# (the activation features) as a numpy array on each forward pass
class SaveFeatures():
    features=None
    def __init__(self, m): self.hook = m.register_forward_hook(self.hook_fn)
    def hook_fn(self, module, input, output): self.features = ((output.cpu()).data).numpy()
    def remove(self): self.hook.remove()

# Attach the hook to the final convolutional block of ResNet18
final_layer = model._modules.get('layer4')

activated_features = SaveFeatures(final_layer)

Having set that up, we run the image through our model to get the prediction. We then pass the raw scores through a softmax to turn them into probabilities for each of the 1000 classes in ImageNet.

prediction = model(prediction_var)
pred_probabilities = F.softmax(prediction).data.squeeze()
activated_features.remove()

Using topk(), we can see that our model is 78% confident that this picture is class 283. Looking that up in the ImageNet classes, that gives us…‘persian cat’. I would say that’s not a bad guess!

topk(pred_probabilities,1)
(
  0.7832
 [torch.cuda.FloatTensor of size 1 (GPU 0)], 
  283
 [torch.cuda.LongTensor of size 1 (GPU 0)])
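
If you’d rather not look the index up by hand, here’s a minimal sketch for mapping it to a label. It assumes you’ve saved one of the widely-circulated ImageNet class lists locally as imagenet_classes.txt (a name I’ve made up), with one class name per line in index order:

# Sketch only: map an ImageNet class index to a human-readable label.
# "imagenet_classes.txt" is a hypothetical local file with one class name
# per line, where line 0 corresponds to class index 0.
with open("imagenet_classes.txt") as f:
    imagenet_labels = [line.strip() for line in f]

probs, idxs = topk(pred_probabilities, 1)
print(imagenet_labels[int(idxs[0])], float(probs[0]))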

Having made the guess, let’s see where the neural network was focussing its attention. The getCAM() method here takes the activated features of the convnet, the weights of the fully-connected layer (the one that sits after the average pooling), and the class index we want to investigate (283/‘persian cat’ in our case). We index into the fully-connected weights to get the weights for that class and calculate the dot product with our features from the image.
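
Written out (my notation), the map for a class c is just a weighted sum of the final convolutional feature maps f_k, using that class’s fully-connected weights w_k^c:

CAM_c(x, y) = Σ_k w_k^c · f_k(x, y)

which is exactly the dot product the code below computes, followed by a rescale to 0..1 so it can be drawn as a heatmap.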

(this code is based on the paper that introduced CAM)

def getCAM(feature_conv, weight_fc, class_idx):
    _, nc, h, w = feature_conv.shape
    cam = weight_fc[class_idx].dot(feature_conv.reshape((nc, h*w)))
    cam = cam.reshape(h, w)
    cam = cam - np.min(cam)
    cam_img = cam / np.max(cam)
    return [cam_img]

weight_softmax_params = list(model._modules.get('fc').parameters())
weight_softmax = np.squeeze(weight_softmax_params[0].cpu().data.numpy())
weight_softmax_params
class_idx = topk(pred_probabilities,1)[1].int()
overlay = getCAM(activated_features.features, weight_softmax, class_idx )

Now we can see our heatmap and overlay it onto Casper. It doesn’t make him look any happier, but we can see exactly where the model made its mind up about him.

imshow(overlay[0], alpha=0.5, cmap='jet')

[image: CAM heatmap for class 283]

imshow(display_transform(image))
imshow(skimage.transform.resize(overlay[0], tensor.shape[1:3]), alpha=0.5, cmap='jet');

[image: CAM heatmap overlaid on Casper]

But wait, there’s a bit more - we can also look at the model’s second choice for Casper.

class_idx = topk(pred_probabilities,2)[1].int()
class_idx

 283
 332
[torch.cuda.IntTensor of size 2 (GPU 0)]

overlay = getCAM(activated_features.features, weight_softmax, 332)

imshow(display_transform(image))
imshow(skimage.transform.resize(overlay[0], tensor.shape[1:3]), alpha=0.5, cmap='jet');

[image: second-choice CAM heatmap overlaid on Casper]

Although the heatmap is similar, the network is focussing a touch more on his fluffy coat to suggest he might be class 332 - an Angora rabbit. And well, he is a Turkish Angora cat after all…
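
And if you wanted to line up the CAMs for the model’s top few guesses in one figure, here’s a rough sketch (it reuses getCAM, activated_features, weight_softmax, display_transform and tensor from above; the choice of top-3 and the figure size are just my own assumptions):

import matplotlib.pyplot as plt

# Sketch: draw one CAM overlay per top-3 class, side by side
probs, idxs = topk(pred_probabilities, 3)
fig, axes = plt.subplots(1, 3, figsize=(12, 4))
for ax, p, idx in zip(axes, probs, idxs):
    cam = getCAM(activated_features.features, weight_softmax, int(idx))[0]
    ax.imshow(display_transform(image))
    ax.imshow(skimage.transform.resize(cam, tensor.shape[1:3]), alpha=0.5, cmap='jet')
    ax.set_title("class %d (p=%.2f)" % (int(idx), float(p)))
    ax.axis('off')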

England Made Me — Christmas Eve Edition

The Snowman

The View From The Kitchen Floor

There’s always a brief moment, when the shots ring out, where you wonder ‘is that really gunfire, or is it just firecrackers?’ This time, that moment of hesitation was shattered by the second round of fire. So I spent a good five minutes on the kitchen floor, on the basis that it’s the one room in the house that’s equally distant from both roads, meaning that any bullets would have to travel through several walls before they got to me. The joys of living in a country with insane gun laws.

Anyway, back home in the UK for ten days! Tomorrow there will be an exciting adventure in Tesco (where I will, at 38 years old, race a proper, free-axis trolley around the shop like a crazy person), Star Wars later in the week, all building up to FESTIVE FESTIVE FESTIVE at the weekend! Providing my sister doesn’t kill me beforehand…