Giving Up On Google

I don’t hate Google - that would be a silly thing to do in public. After all, in the last twelve months they purchased an organisation that developed some of the world’s most impressive (military) robots, and they have always developed impressive AI systems of their very own - what could possibly go wrong?

But that hasn’t stopped me from giving up on them.

The main reason I’ve given up on Google’s services is the ever-increasing feeling of becoming locked in. More and more of Google’s services seem to be Chrome-only, or at least to work far better in Chrome than anywhere else, and I think this speaks to the future of Google.

There’s no denying that Apple products are a kind of lock-in, but I feel that Apple don’t rely on my data as a source of revenue so I’ll always be able to get it out if I need to - I don’t feel this way about Google.

Whilst this feeling has been building up over time due to factors like requiring a Google+ profile to make full use of YouTube, or Gmail never quite playing nicely with other mail clients, it came into focus more recently with the introduction of Inbox. There’s no denying that Inbox is a great service - it’s innovative and genuinely useful - but I can only access it through Google applications.

This particular lock-in isn’t fundamentally a problem in itself - I can still get at my data - but it leaves me fearful for the future of my email. In the future I imagine Google turning off all POP & IMAP support: access to my email will be via Google (or maybe an approved API) only, and my email, my data, will be more ‘stuck’ where it is than I would like. I imagine the same will be true of all of Google’s other services.

None of Google’s recent behaviour strikes me as ‘open’. Apple is hardly an open company either, but I at least feel like Apple is open with my data on a closed platform rather than, like Google, closed with my data on an open platform.

I’d quite happily pay Google to have better access to my data, but they don’t seem to have much interest in that - so I’m moving all my data elsewhere.

Push Email With FastMail (Sieve) & IFTTT

As I was writing this post, FastMail released a new app with push notification support. It isn’t really for me as I prefer a unified inbox, but if you think it might suit you - check it out.

I recently went through the process of prising my email from the jaws of Gmail and getting it into FastMail. The switch went pretty smoothly, but I missed Mailbox’s push notifications (even if Mailbox gets IMAP support I probably won’t use it any more - having my email flowing through another cloud server never felt quite right).

The simplest thing to do would be to set up the email channel in IFTTT, forward all incoming mail to it, then use the iOS notifications channel to push the alerts. The main problem with this is privacy: the email body will also, at some point, end up on IFTTT’s servers.

To solve this problem I created a custom sieve rule inside FastMail’s advanced rules section that strips everything other than the subject. It’s pretty simple and looks like this:

if true {
  notify :method "mailto" :options ["[email protected]"] :message "/ $subject$ /";
}

This rule should probably go after your junk mail filters (which probably look similar to the below), unless you want to get notified about your junk mail too of course.

if not header :contains ["X-Spam-known-sender"] "yes" {
  if allof(
    header :contains ["X-Backscatter"] "yes",
    not header :matches ["X-LinkName"] "*"
  ) {
    setflag "\\Seen";
    fileinto "INBOX.Junk Mail";
  }
  if header :value "ge" :comparator "i;ascii-numeric" ["X-Spam-score"] ["5"] {
    setflag "\\Seen";
    fileinto "INBOX.Junk Mail";
  }
}

I also have emails coming through to a work Google account that I like to get push notifications for (without using the Gmail app). IFTTT is a little limited in that it only supports one email address in its email channel, so I set up Gmail to forward all my emails to FastMail. I didn’t really want all this unrelated email cluttering up my inbox, so I added another rule, just after my notification rule, to discard them:

if header :matches ["x-forwarded-for"] "[email protected]" {
  discard;
}

And that’s all there is to it really - I get a (nearly instant) push notification that I have a new email and I can open the mail app of my choice when it’s convenient. One added benefit seems to be a little extra battery life after turning off fetch & push in the native iOS Mail app.
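As an aside, for readers less familiar with Sieve: the effect of the subject-only notification rule can be illustrated in Python. The sketch below is purely illustrative (the message, the helper function and the notification address handling are hypothetical, not part of the FastMail setup) - it shows why the body never leaves the mailbox:

```python
# Illustration only: what the Sieve notification rule effectively does -
# build an outgoing notification that carries the subject and nothing else.
from email.message import EmailMessage

def notification_for(msg: EmailMessage) -> EmailMessage:
    """Build a body-free notification carrying only the original subject."""
    note = EmailMessage()
    note["Subject"] = "New mail"
    # Mirrors the "/ $subject$ /" template from the Sieve rule above.
    note.set_content(f"/ {msg['Subject']} /")
    return note

original = EmailMessage()
original["Subject"] = "Meeting moved to 3pm"
original.set_content("Sensitive body text that should never reach IFTTT.")

print(notification_for(original).get_content())  # prints "/ Meeting moved to 3pm /"
```

Only the stripped-down notification is forwarded on; the sensitive body stays put.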

Cheltenham Literature Festival 2014

Radio Times Tent At Cheltenham Festival

The Cheltenham Literature Festival is, as the name suggests, a festival in the lovely spa town of Cheltenham (seriously, go there - it’s a really nice place) that celebrates literature. It was formed in 1949 and I’ve attended for the past couple of years - thanks entirely to my wonderful girlfriend.

The festival takes place over ten days (starting on 3rd October this year), but it’s only really practical to stay in Cheltenham for a week due to travelling and work commitments. Unfortunately, this meant that we had to miss one of the best days of the festival this year: the last day.

We still got to attend some pretty interesting talks though.

Welcome To Just A Minute!

One of my highlights of the week was a great performance of Just A Minute presented by Nicholas Parsons himself, with a panel consisting of Pam Ayres, Jenny Eclair and Shappi Khorsandi - the show’s first all-female panel since it first aired in 1967.

They were there, of course, to talk about Nicholas’ latest book, but the discussion quickly became a series of anecdotes from all present about the show and its past participants - this was great and gave a fascinating insight into the past 900 episodes of this excellent radio panel show.

Whilst it wasn’t entirely without hesitation, repetition or deviation it was very entertaining - including the brief intermission to deal with a slightly lost wasp.

What’s Next For Google

This talk was probably my biggest disappointment of the week. It was presented by Peter Fitzgerald, the UK Sales Director for Google. I was hoping for a discussion of the near-term future for Google, preferably including key issues such as privacy and government spying. It ended up, however, as a big Google X advertisement for our “possible future”. Granted, the talk was well rehearsed and Peter, as I’d expect, was a competent presenter, but he didn’t seem to want to be there at all.

Interestingly, during the Q&A session, a question was asked about wearables and Google’s future focus. Peter seemed pretty adamant that their focus was purely on the software. This seemed odd to me given Google’s ownership of Motorola and the recent launch of the Moto 360 - Glass didn’t even get a passing mention.

Keep Britain Tidy

Hester Vaizey presented some of her favourite posters from her new book along with some interesting facts. One of the most interesting was that the “Keep Calm And Carry On” poster we’ve all come to love (and probably hate) was never publicly displayed; it was rediscovered in 2000 and rose to ubiquity from there.

Golden Days Of The Railway

This was a panel discussion between three authors and a poet to determine if there was ever any such thing as the “Golden Days” and if golden days in the future are a possibility. The panel consisted of:

Whilst the panel didn’t really come to a conclusion about the “Golden Days”, there were some interesting discussions, including their opinion that the on-going restorations of the Flying Scotsman are a waste of money - even if it is good to have a living connection to the past - and their opinion that state-controlled railways (such as those on the continent) function far better and more efficiently than the privately controlled railways here in the UK.

One interesting fact, too: it apparently costs around £80 to change a fluorescent light bulb in a station here in the UK - ridiculous.

Agatha Christie And The Monogram Murders

Another panel discussion, this time about the new Agatha Christie continuation book, The Monogram Murders, with the author Sophie Hannah; Christie’s grandson, Mathew Prichard; and expert John Curran.

I’m not the biggest of Christie fans, but it was interesting to hear how Sophie created a continuation story by maintaining the well-established Poirot character but changing the story’s narrator to suit her style of writing, rather than trying to write in the style of Agatha Christie herself. This is in contrast to many continuation books that have appeared recently, which choose to alter the main character in some significant way as well as copying the style of the original author.

Victoria: A Life

This talk by A.N. Wilson was more interesting than I expected it to be. He discussed his new book in which he explores, with new research, Queen Victoria as a successful diplomat, writer and anything other than a recluse after Prince Albert’s death.

He also talked about his disappointment at the number of letters in various archives that had been redacted / destroyed after her death leaving large holes in her personal history.

Overall the festival was pretty good, but I still think last year’s was better. One thing that was consistent was the overpriced and not brilliant food & drink - my advice would be to buy food from somewhere else and not from one of the festival tents.

Even if you don’t get the chance to attend a future festival, literature or otherwise, you should definitely spend some time in Cheltenham. The festivals are always interesting but Cheltenham itself improves them greatly.

My Field Notes

Refining My Podcast Recommendations

After much listening over the past couple of months I’ve added to and removed from my recommended podcasts list to create a playlist that doesn’t feel like a chore to listen to. My refined recommendations are:

Somehow I’ve ended up with one more than my previous selections, but I find this selection far less of a burden to listen to on a weekly basis.

Recommended Podcasts

Since I started working remotely I’ve been listening to a lot of podcasts, pretty much all of them tech related (and many of them Apple related). Here are the few that have kept me interested enough to stay in rotation:

Some of these have been discontinued but they’re still worth subscribing to; for the past episodes and because they’ll soon be resurrected on Relay FM.

My favourite podcast client (podcatcher?) at the moment is Overcast, both Smart Speed and Voice Boost are definitely worth the IAP. There’s no Mac app for now but the web player works in a pinch (albeit without the best features of the app).

Digitising My Negatives

As I mentioned before, I recently began shooting on film again (a little bit, anyway). Of course, this meant I wanted some way of getting my images from the negatives and into my digital library for a bit of light editing and sharing.

I found myself on Amazon looking at the plethora of cheap negative scanners. Most of these consist of a 5mp CCD and a backlight; photos are scanned quickly, to JPEG, and mostly without the need to involve a computer. From what I could find though, this type of scanner has three problems: highly compressed JPEG output, relatively low resolution ‘scans’ and extremely mixed quality output.

Maybe I was worrying too much about image quality - but they weren’t good enough for me. I wanted RAW images and higher-resolution output.

The most obvious choice would’ve been something like the Plustek OpticFilm 8100 or a negative-compatible flatbed scanner. I could’ve scanned as TIFF at high resolution and been done. The main problem with this solution was the price; I couldn’t justify the high cost for something I probably wouldn’t be doing often.

To this end, I decided to make use of what I already owned (or could make pretty easily).

The Setup

The camera setup itself wasn’t too complicated: I used my Nikon D700 with an old 105mm Micro-Nikkor. The equipment isn’t massively important though; as long as you have a decent DSLR, mirrorless or high-end compact / bridge camera with a lens that can get close enough to fill the frame with a negative, you’re probably going to get higher quality shots than a cheap dedicated negative scanner. RAW shooting is a massive plus though; the results will need some white-balance correction.

All of this needs to be mounted on a fairly sturdy tripod that can take the weight of your setup pointing straight down.

There are a couple of things you will need to be able to do though: focus manually, or at least fix the focus, and use a self-timer / cable release. Shooting so close, things can get blurry quickly.

One useful little accessory is a macro focusing rail; it allows you to finely tune the focus without having to mess around with the camera’s settings too much. It can be especially helpful with older, heavier lenses that tend to fall out of focus when the camera is pointing towards the ground and gets nudged slightly.

Probably the most difficult bit of the whole setup was coming up with some way to backlight the negative a suitable amount and evenly. Fortunately an Amazon shipping box, printer paper, a torch and a lot of packing tape came to the rescue.

As I didn’t have anything suitable to diffuse the light directly under the negative, I made a relatively long tube (approx. 30cm) and lined it with printer paper that curved up towards the negative-sized aperture I cut at one end of the box. This produced light diffuse enough to illuminate the negative evenly.

A special shout out should probably go to the torch I used, it was the extremely bright LED Lenser P7. This is probably the best torch I’ve ever bought, super-bright for normal torching with a neutral enough light temperature for small photography-related projects like this.

Now for the stuff that really matters…

The Settings

For my negatives I shot in manual mode: 1/50s, f/7.1, ISO 200. I left automatic white-balance enabled as I was shooting in RAW and the white balance would definitely have to be corrected in post-processing anyway.

I chose not to quite fill the frame with the negative to ensure I made the most of the lens’ sharpness in the centre. After cropping, most of my shots worked out at around 8mp, which was pretty good going and definitely better than the cheap negative scanners.
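As a rough sanity check on that figure: the D700’s sensor is 4256 × 2832 pixels (about 12.1MP), so cropping to somewhere around 82% of the frame in each dimension (an assumed fraction, chosen purely for illustration) lands at about 8MP:

```python
# Back-of-the-envelope check of post-crop resolution.
# The crop fraction is an assumption for illustration, not a measured value.

SENSOR_W, SENSOR_H = 4256, 2832  # Nikon D700 pixel dimensions
CROP_FRACTION = 0.82             # assumed fraction of the frame the negative fills

cropped_w = int(SENSOR_W * CROP_FRACTION)
cropped_h = int(SENSOR_H * CROP_FRACTION)
megapixels = cropped_w * cropped_h / 1_000_000

print(f"{cropped_w} x {cropped_h} = {megapixels:.1f} MP")  # prints "3489 x 2322 = 8.1 MP"
```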

The Results

Straight out of the camera this is how the negatives looked:

Inverting the photo quickly got me to something that looked more sensible. The blue cast to the image is the nature of the colour film and this is what needs to be white-balanced away. This can take a lot of playing with to get right but once you’ve done it for a single image, it should be the same for the whole roll.
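The invert-then-white-balance process can be sketched in code. The following is a minimal, illustrative sketch (the per-channel gain values and the sample pixel are made up; in practice the gains come from eyeballing a neutral area of one frame and then reusing them for the roll), showing both steps applied to a single 8-bit RGB pixel:

```python
# Illustrative sketch of negative-to-positive conversion for one pixel:
# step 1 inverts the channels, step 2 scales them to remove the blue cast.
# Gain values below are assumptions, not a recipe.

def invert(pixel):
    """Invert an 8-bit RGB pixel - turning the negative into a positive."""
    r, g, b = pixel
    return (255 - r, 255 - g, 255 - b)

def white_balance(pixel, gains=(1.00, 0.85, 0.60)):
    """Scale each channel to counteract the film base's cast (gains made up)."""
    r, g, b = pixel
    gr, gg, gb = gains
    return (min(255, round(r * gr)),
            min(255, round(g * gg)),
            min(255, round(b * gb)))

# A made-up sample pixel from a 'scanned' colour negative:
negative_pixel = (40, 90, 150)
positive_pixel = white_balance(invert(negative_pixel))
print(positive_pixel)  # prints "(215, 140, 63)"
```

Because the gains are constant per roll, the same pair of functions (or the equivalent adjustments in your editor) applies unchanged to every frame once you’ve dialled them in.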

After a little pushing & prodding with your image editor of choice (mine is Aperture, but I guess that won’t be true for much longer), you can get something that looks perfectly acceptable.

To be honest, this photo probably isn’t the best example, but you can find some of the better ones (B&W and colour) in my Flickr album.

Something that did surprise me during this process was the amount of dynamic range I got from the negatives by digitising them this way; I could see details in the negatives that the original prints didn’t give even the smallest clue to. The large RAW photos also gave me a lot of latitude when editing; it was nice to maintain the atmosphere of film with the advantages of a digital editing workflow.

Did it take a while to do all this? Yes. Would I have been better off getting a scanner? Possibly. Would it have been anywhere near as satisfying or fun? Definitely not!

Shooting Film Again

I’ve recently started shooting a bit of film again so I ‘scanned’ and uploaded some of the results to Flickr. They’re all in my film album (along with some old ones I uploaded a good while ago). My scanning process isn’t exactly typical - but that’ll all be explained in a post that’s coming soon…

Using GitLab Omnibus With Passenger

GitLab is a great self-hosted alternative to GitHub. I’d set it up for other people before, but it always seemed to be more hassle than it should be to update and maintain (especially with its monthly update cycle), so I’d never set it up for myself.

Thankfully GitLab now has omnibus packages available to make installation and maintenance much easier. Unfortunately, these packages contain all of GitLab’s dependencies, including PostgreSQL, Nginx, Unicorn, etc. This is great for running on a server dedicated to GitLab but not terribly useful for my setup.

I already had a Postgres database I wanted to make use of, along with an Nginx + Passenger setup for running Ruby applications. The following describes the configuration changes I needed to make to fit GitLab omnibus into my setup.

The first steps are to create a PostgreSQL user and database for your GitLab instance and to install your chosen omnibus package following its installation instructions - but hold off on reconfiguring, as we need to add some config first.

The first bit of config goes in /etc/gitlab/gitlab.rb, this sets up the external URL for your GitLab instance, configures the database and disables the built-in Postgres, Nginx and Unicorn servers:

external_url ""

# Disable the built-in Postgres
postgresql['enable'] = false

# Fill in the values for database.yml
gitlab_rails['db_adapter'] = 'postgresql'
gitlab_rails['db_encoding'] = 'utf8'
gitlab_rails['db_host'] = ''
gitlab_rails['db_port'] = '5432'
gitlab_rails['db_username'] = 'username'
gitlab_rails['db_password'] = 'password'

# Configure other GitLab settings
gitlab_rails['internal_api_url'] = ''

# Disable the built-in nginx
nginx['enable'] = false

# Disable the built-in unicorn
unicorn['enable'] = false

Now you can run sudo gitlab-ctl reconfigure; this should set up all of GitLab’s configuration files correctly, with your settings, and migrate the database. You’ll also need to run sudo gitlab-rake gitlab:setup to seed the database (this is a destructive task - do not run it on an existing database).

The final bit of configuration goes in /etc/nginx/sites-enabled/gitlab.conf (this assumes you have Nginx + Passenger installed from their instructions):

server {
  listen *:80;
  server_tokens off;
  root /opt/gitlab/embedded/service/gitlab-rails/public;

  client_max_body_size 250m;

  access_log  /var/log/gitlab/nginx/gitlab_access.log;
  error_log   /var/log/gitlab/nginx/gitlab_error.log;

  passenger_ruby /opt/gitlab/embedded/bin/ruby;
  passenger_set_cgi_param PATH "/opt/gitlab/bin:/opt/gitlab/embedded/bin:/usr/local/bin:/usr/bin:/bin";
  passenger_user git;
  passenger_group git;

  passenger_enabled on;
  passenger_min_instances 1;

  error_page 502 /502.html;
}

Most of the above configuration comes directly from GitLab’s omnibus configuration with a few customisations for Passenger. The main configuration options are correctly setting the user and group so there are no permission issues and ensuring that the correct directories exist in the $PATH variable to prevent errors in GitLab’s admin interface.

Currently, files uploaded into GitLab may not appear correctly using these instructions due to a permission issue; this should be corrected in a future omnibus release, and more discussion can be found in this merge request.

And we’re done.

That’s about everything - hope it works for you too.