It seems ironic that AR would be the one to propel wearable computing into stardom, but that’s exactly what’s happening. Wearable computing (in case you’re wondering) is, as its name implies, computers that we wear. Or, put another way, clothing that has some computing capabilities. We have heard about shoes and t-shirts that communicate, at first with our computers, and now with our phones.
Well, since the advent of Google Project Glass, pioneers have been trying to figure out how to make this a reality. Once you know a few things, it seems straightforward: the phones, the glasses, and the software are already available; what’s needed is to marry them. Well … that’s exactly what this dude (@Hugobiwan) has done:
In case you don’t quite understand what’s happening, let me try to explain. Hidden in his hat is an iPhone; he took the phone’s video output and fed it (input, pun intended) into the video glasses he’s wearing. So the glasses display the iPhone’s video out. Then he launched the Layar app, with the end result that his glasses show the outside world enhanced with Layar’s information, much like Google Project Glass will do. So, for example, had he been in Paris, he could walk around seeing something like the image below, but on his glasses. Now that’s wearable computing!
I’ll try to replicate that experiment. As soon as I do, I’ll share the experience.