Computer heal thyself

I’ve just read a long, passionate post on Marc Scott’s Coding2Learn blog lamenting the fact that kids can’t use computers. The hypothesis, simply put, is that, far from being “digital natives”, most kids don’t actually know their way round a computer or, indeed, a smartphone. This is not to say they don’t use them – but that they don’t know what to do when anything – even something quite basic – goes wrong.

Instinctively I had quite a bit of sympathy with the argument; knowing how to troubleshoot a wireless connection or an external monitor, for example, seems to me pretty useful basic stuff.

However, as I thought about it some more, I became convinced that this kind of lament is really a symptom of a technology in transition. I can imagine a similar post being written (if the medium had existed) in the 60s or 70s bemoaning the fact that car drivers no longer understood the mechanics of what they were driving. Motoring was a do-it-yourself activity for a long time – I remember as late as the 80s doing quite a bit of tinkering with spark plugs and the like to keep my cheap, old and unreliable cars on the road. It is now decades since I’ve known what to do when looking under the bonnet of a modern car.

I suspect computer technology is going through just such a transition. Marc Scott’s suggestion for fixing this dearth of computer knowledge is, among other things, to get kids to use Linux machines that need a lot of configuration – and so force them to learn a fair bit about the operating system. But I think he hints at the change that’s coming when he talks about mobile:

This one’s tricky. iOS is a lost cause, unless you jail-break, and Android isn’t much better. I use Ubuntu-Touch, and it has possibilities. At least you feel like the mobile phone is yours. Okay, so I can’t use 3G, it crashes when I try to make phone calls and the device runs so hot that when in my jacket pocket it seconds as an excellent nipple-warmer, but I can see the potential.

That, surely, is the point. Computers should fade into the background and “just work”. As he says:

Technology affects our lives more than ever before. Our computers give us access to the food we eat and the clothes we wear. Our computers enable us to work, socialise and entertain ourselves. Our computers give us access to our utilities, our banks and our politics. Our computers allow criminals to interact with us, stealing our data, our money, our identities. Our computers are now used by our governments, monitoring our communications, our behaviours, our secrets.

That being so, we need technology that works when you switch it on, that monitors its own health and fixes itself when anything is awry, and that protects us from crime and from being spied upon. We shouldn’t be expected to be able to dismantle computers or smartphones in order to make sure they are working properly.

It is faintly ridiculous that computers can develop glitches and then expect us to search the manufacturer’s knowledge bases for solutions which we must then implement manually. Why aren’t they self-diagnosing and self-healing, using all that superfluous computing power? Partly, I guess, because there is still a lot of tinkerer’s pride and self-satisfaction in finally solving these techno-riddles, and hence not much consumer outrage at the situation. But this won’t wash for very much longer.

In the end, though it is fun (for some) to tinker with their technology, much like old-car enthusiasts tinkering in their garages, those days are drawing to a close. Ubiquitous computing that “just works”, monitors itself and corrects problems as they occur will become the standard, for better or worse.

Will Apple miss the next big thing?

Will Apple be smart enough to capitalise on the next big opportunity in personal computing – turning the smartphone into the CPU for computing anywhere?

I remember back to the time when there was a huge debate about “convergence” – the big question of whether consumers would accept one multi-functional mobile device (the Swiss Army knife approach) or would want a series of specialised devices such as a phone, camera, GPS, MP3 player and so on. The iPhone settled that debate completely, with hardware and software (apps) which cater for just about every need. It now seems incredible that anyone even argued the point.

Well, we are fast approaching a re-run of that debate. Why have a computer and a smartphone when you could use a phone as your CPU, operating system and file store, and simply link via Bluetooth to a screen, keyboard and mouse? And why not make that screen your TV?

Apple is actually very well placed to make this move. It is already converging its operating systems – OS X looks increasingly like iOS, especially after Mountain Lion. And it produces a superb range of Bluetooth-enabled peripherals and brilliant screens.

But this is a big leap for a company which makes so much money from computer hardware – $6.3bn in the last quarter of 2011. Risking that is a huge bet for any company, let alone one riding the wave with its iconic, highly designed and desirable computers.

If not Apple, then maybe Android? There have already been Android phones launched with full versions of Ubuntu Linux loaded on them. And Android’s maker, Google, doesn’t have a hardware business to cannibalise. In fact, it would make massive sense for Google to back a move like this – it is trying to push an alternative to Microsoft’s Office suite (Google Apps), and what better Trojan horse than consumers determined to carry their computing device with them wherever they go?

Even Microsoft may be better placed than Apple to capitalise on this trend. Microsoft doesn’t actually make computers (although its OEM partners clearly do), so while there would be much painful disruption if Windows 8 became the operating system of choice on the portable computing device of the future, the company could only profit in the long run.

I may be wrong, but I bet we will see this trend play out; it remains to be seen who will ride the wave.