Every Christmas since a rather young age, I recall watching the Royal Institution's Christmas Lectures on the BBC. Having grown up with a curiosity for anything involving the sciences (especially when presented in exciting and fun ways), I found the annual lectures usually delivered wonderfully interesting content in a very entertaining manner.

Still curious, I fired up iPlayer (rather than YouTube, the usual go-to for interesting clips) and gave the first of this year's lectures a watch, knowing the focus this year was on hacking. In my childhood I remember the common subjects being predominantly within physics and chemistry (although I may simply not have paid attention to any focussing on biology), so this shift into the realms of computer science piqued my interest, even if the content was pitched at secondary-school level.

The one lecture I saw, The Lightbulb Moment, used Tetris, an unassuming London high-rise office block, Makey Makey boards, and Philips Hue bulbs to introduce demonstrations of increasingly common technologies (alongside less established but very engaging ones), as whistle-stop introductions to electronics and hardware, networking, and software. All the while it showcased affordable, accessible things that could be tried at home (albeit some of the demonstrations deployed them in quite excessive volume), in the spirit of the traditional kitchen-cupboard non-Newtonian fluid demonstration.

Watching it made me think: here was a space that typically showcases the more traditional scientific fields, targeting a young audience with accessible, fun computer science. The lecture theatre was filled with a brilliantly diverse assortment of schoolchildren, a promising sign for a subject long marred by an imbalance in those studying it at university, and the obvious explanation for its exhibition in this way.

Or at least it would be the obvious answer if it weren't for everything else that has gone on in the past twelve months and continues to. I think it was one of my A-Level physics teachers who reflected on how it's always the scariest, fastest-evolving science that gets the most prominence in the media and attention from the public. His examples were primarily which weaponry was considered most deadly: biological warfare, chemical warfare, and nuclear warfare (a product of physics, and undoubtedly still the victor today).

When I was in the later years of my primary school education, it could be argued that biology was the science of the time, with public fear centring on the likes of SARS and H5N1 bird flu. I can't vividly remember if chemistry ever truly had the limelight (in terms of uninformed fear, at least), but physics certainly did, not least with the construction of CERN's Large Hadron Collider and the panic that arose when someone conceded it was theoretically possible for switching the very same thing on to produce a black hole. To this day Prof. Brian Cox has a very large and established plinth from which to yell about space and such things.

This year, however, things seem to have shifted quite considerably, and perhaps it was to this that the Royal Institution's Lectures adapted. The fifth most googled news story this year was the notorious "celebrity photo hacking", outranked only by sport, fatal tragedy, politics, and (other) sex crimes [Source - Google/The Telegraph]. Bristol's inhabitants queried the search engine on "how to hack" in between squats [Source - Google/The Telegraph]. Desperate reports of what "hackers" were doing trended on social media as drones (quadrotor helicopters) became the must-have Christmas gift. North Korea even launched an unprecedented cyberattack on Sony, in what is at the very least the most publicly visible exhibition of successful cyber warfare in recent memory.

All this in a year that was eventful across other fields too. Ebola broke out. The European Space Agency successfully landed a probe on a comet, three times (with a single probe). Someone invented the hoverboard (just in time). Soylent got a new formula. So while computer science truly emerged for perhaps the first time, it was far from a free ride.

void hack() { foo("Hollywood"); }

The emergence stretches beyond the real world too, with Hollywood plots surrounding computer science seeming more ubiquitous than ever. In the first half of 2014, Wally Pfister's Transcendence presented a science-fiction film in which a dying scientist has his consciousness transferred into a computer (although I didn't get around to seeing it, I've heard it's a bit far-fetched, even for luddites, leaving it ill-received overall (6.3 on IMDb)). Later in the year, The Imitation Game provided a glimpse into the extraordinary life of Alan Turing in a brilliant showcase, tipped for a plethora of award nominations (already receiving nine BAFTA nominations at the time of writing), although I personally thought that for a film named after section 1 of Turing's pivotal Computing Machinery and Intelligence paper, a laughable amount of the content focused on his life away from Bletchley Park. In 2015 the trend looks set to continue with Blackhat, although I'm sure watching it will grant me greater empathy with the astrophysicists and quantum physicists who groaned at the superb Interstellar. Clearly the studios see the relevance of the field, and viewers have at least a vague interest in engaging stories based around it.

Ultimately, though, I'm undecided as to how much the discipline will benefit from all this attention. It will, hopefully, inspire and engage more teenagers to pursue computer science, but beyond that it is hard to say. It's unlikely that those who don't engage with it at an earlier point in time will start to gain a better appreciation of the subject (see the xkcd comic below).

xkcd summarises the difficulty of getting Joe Bloggs to better appreciate the intricacies of the subject

If the interest can stretch beyond fiction (although in my opinion the fiction barely stays ahead of the reality nowadays), then perhaps we'll see more people eager to see what they can do with a Raspberry Pi, for example. However, no single hobby unites everyone, so why would computing suddenly emerge as the thing everyone can do and enjoy? Education is already promoting programming more broadly: it's rare for a scientific or mathematical degree not to expose students to something such as MATLAB or Python, and even fields related to English are teaching data mining, singing its praises as a tool for journalism, so what more remains to be done? Computer science may never be universal, but it's certainly emerging front and centre of the mainstream.
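For a flavour of the kind of first-steps data mining a journalism course might begin with, here's a minimal sketch in Python (the helper name and the sample text are my own invention for illustration): counting which words dominate a passage, the seed of any "what is this article really about?" analysis.

```python
from collections import Counter
import re

def word_frequencies(text, top=3):
    """Return the `top` most common words in a passage of text."""
    # Lowercase, then pull out runs of letters (apostrophes allowed).
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words).most_common(top)

# A made-up line standing in for a scraped article.
article = "the hack of the year was the hack everyone googled"
print(word_frequencies(article))
```

Nothing here goes beyond the standard library, which is rather the point: one import, one regular expression, and a counter gets a humanities student from raw text to a ranked list.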