If you’ve not seen Google Glass yet (seriously, where have you been hiding), it’s basically a wearable computer that uses an augmented reality (AR) overlay to show various tasks and options and things.
Why is this exciting? You may recall that a while ago I wrote a couple of posts about future software, designed to flow into the setting I was then playing around with using Greg Christopher’s Synapse rules. Why was this exciting? Spex.
Spex were the worn computers, offering the same overlay. Instead of a tiny screen just above one eye, though, they had a flicker filter: the lenses would either go opaque while on, or flicker rapidly (similar to how some 3D glasses work these days). They worked with voice controls and eye gestures, could be hooked up to old-school systems with real keyboards and extra screens, could be used to immerse in virtual environments, and as I envisioned them also used an AR keyboard system (remember Johnny Mnemonic?), similar to the gesture controls.
But perhaps most importantly, the programs were less like Microsoft Word and Excel and more like Siri – the user was a manager of their suite of programs, and the programs did all the heavy lifting.
I’ve still not covered the setting in more detail (expect that later in the week; I no longer really mind if players skip ahead), but it was basically near-future Earth, with some fun tools that made it slightly different: biodegradable ‘wet-tech’ – phones with throat mikes that slapped onto the neck like plasters, e-ink newspapers that streamed updates for a week before being used up and recycled.
I’ll tease that it’s the sort of stuff you’d expect from Ken MacLeod’s writing…