When I first heard the term “ubiquitous computing” almost 25 years ago, it sounded magical. Computers then, I admit, didn’t make it easy to imagine such a world. I frequently found myself daydreaming of the films E.T. and Escape to Witch Mountain (I did say it sounded magical, didn’t I?).
It evoked a vision of being able to do anything, absolutely anything, anywhere, using a device connected to all humanity. Well, of a sort.
We are close to that utopian vision.
Except one thing.
That vision didn’t warn us of the whining, attention-seeking brat that ubiquitous computing would become.
Just to balance the argument out, I spent a few days noting how much time and attention I divert to other ubiquitous technologies in my life.
Such as the trusty old wall clock in the kitchen, which is how I know what time it is when I arrive downstairs bleary-eyed early in the morning; the radio I listen to while I work; the land-line telephone that rings; or the 10-to-12-year-old blazer I might pick out for the day.
You are wondering, aren’t you, why I am referring to these things in the context of ubiquitous technologies.
Think about it.
My wall clock needs one battery change a year. The numbers are large and can be easily read by a just-awake person to assess whether a leisurely cup of coffee is possible or if one must rush on with the day.
I need a land-line telephone because where I live the line-of-sight technology called mobile or cellular access does not work.
As for clothes, let’s try to count the ubiquitous but invisible technologies therein: cutting, stitching, buttons, to name a few. And that is without even going into the material, the weaving, the suitability to the day’s weather and so on.
I contrast this now with the devices that do mean ubiquitous computing to everyone.
My laptop, my tablet and my mobile phone.
They need charging at least twice a day. I have turned off most notifications, but it takes a while to work out why Viber notifications need to buzz even when the phone is silent, or indeed why the phone, left face down, needs to vibrate each time a new email pops in.
Yes, I know everything can be personalised and fixed just as I like it.
I am asking a different question.
Why does one have to spend all this effort on ubiquitous technologies and computing: tailoring them, charging them, tweaking and twisting them, blah blah?
Was this Weiser and Seely Brown’s vision when they coined the term ubiquitous computing?
Or did we get here all on our own — in our rush to ship beta versions, MVPs and pick-your-term-of-choice — without adequately thinking just how much energy and time we would expend just to make these things work seamlessly, easily?
When did ubiquitous computing become ubiquitously painful, annoying and draining?
But more importantly, why did it become so?
And what does it say about our attention to design?