While down in London a couple of weeks ago, I quite suddenly convinced myself that the Apple Watch was a brilliant idea and something I should own. The weeks since have left me mostly smitten with my new purchase (barring a defective band and HSBC preventing me from trying Apple Pay), as well as surprised by what my favourite feature of the watch has turned out to be.

As a little preface, I own an iPhone 6 Plus. I've never owned an Android phone, and the more I upgrade from one iPhone to the next, the less I think I'll ever make the big leap across platform and ecosystem (even if 90% of my personal email goes through Google rather than iCloud). Apple's walled garden is lush and warm, no matter how high they stack the walls (although they seem to be chipping away at them in places of late, e.g. Swift and Apple Music on Android).

As such, aside from the Pebble (which has been lying around somewhere, unused and unloved, for a long while) and less smart fitness trackers (I also have a Fitbit Charge HR, which is as compelling as it is flawed), the smartwatch options are confined solely to the Apple Watch (barring the release of Android Wear's iOS support, though even then I wouldn't expect iOS to support Android Wear extensively). So, between seeing some of the watches in the wild and finding watchOS 2 enticing enough as a platform to develop bits for (hello, complications!), I took the plunge, carefully investing in the Space Grey Sport model (the stainless steel ones are very nice, but too risky an investment given the tendency of smart things towards obsolescence without a guaranteed upgrade programme).

So, having had two wrist-wearables before, what drew me to Apple's offering? Simply put, the Fitbit didn't do very much, and what it did, it did rather poorly (inaccurate heart rate readings, and impossible to wear over a thermal top). The Pebble before it, while neat in terms of showing notifications, had got to the point where I didn't feel compelled to wear it. It was a cheap-looking thing, with few compelling reasons to grin and bear the fashion side of things.

So then, after realising the cheapest (non-steel) Apple Watch was actually quite nice to look at, I made the leap. Three weeks in, and I realise the best feature isn't the fitness tracking, nor the apps (native or third-party), but the Siri integration. Now, there's very little (if anything) the Watch can do that the iPhone can't; beyond fitness, there's little in the way of unique watch functionality, and tethering it to a phone means its primary purpose is as another outlet for existing information. However, and particularly now with Apple Music integration, having access to a plethora of tasks, media, and information just by lifting your wrist and greeting Siri is a fantastic experience. No longer am I fumbling around in my pocket to change music; instead I just tell Siri "Play Run The Jewels 2" to listen to RTJ2 (although I concede my first attempt at just saying the letters left a weird 'are tejay' transcription).

This evolution of user input is not all that surprising, though. As I write this on a 102-key keyboard, occasionally reaching for my 9-button mouse, the majority of electronic devices sold today are unlikely to have even 5 physical buttons, relying instead on some sort of touch-sensitive display. That first shift was about making the manipulation of digital items less abstract: rather than rolling a mechanical wheel on a mouse to scroll a page, you simply drag it directly with your finger (although, as direct and intuitive as this is, I don't think people generally read gigantic scrolls, and even then they probably use more than just a finger to move them). So it seems logical that the next step in the quest to make these actions more direct and intuitive is to use other means of interaction, and voice is probably the single most expressive tool at the average person's disposal (although I for one would welcome dance recognition). Conversation has existed for millennia, so why would it not be synthesised for the purposes of instructing a machine (and, in the near future, instructing robots, literally 'workers')?

Image, of course, courtesy of xkcd.com

And yet, no matter how much you converse in daily life, and no matter how common computers are, it's a bit strange talking to your wrist. I find it incredibly exciting whenever I use Hey Siri on my Watch, but it's hard not to be massively conscious of what you're doing. Yet it's not quite apparent where that self-consciousness stems from.

People have held apparently one-sided conversations over the phone in public for a couple of decades now; it's commonly accepted as normal behaviour. But when the conversation with your phone stops at your phone, something doesn't quite seem right about it. And yet, when you ignore that feeling of something not being quite correct, it's brilliantly convenient, effective, and impressive.

Using Siri, I can't help but think of my mum shouting into her phone during calls, as if trying to compensate for the fact that the person on the other end is some distance away. And I'm an early adopter, part of a generation that has perhaps adapted to the greatest rate of behavioural and technological change of all time (from playing out in person, to MSN, MySpace, Facebook, and so on, to focus simply on socialising). If I feel weird, how must someone later in life feel? (I haven't seen it all yet, but Channel 4's Humans does a good job of exploring this adjustment in an uncanny suburban setting.)

And then what happens next? There are already interfaces that require no input other than from the brain, and that can output sensations directly to the brain. What happens when computers augment our consciousness? Reality will lie somewhere between the sadistic realisations of Black Mirror (think S1E3) and the class divides of Deus Ex, and it will be far from plain sailing.

Obligatory nonsensical image. And you can't knock Deus Ex.