Having been playing around with ‘connecting’ things a fair bit these days, I find myself asking ‘what could we do if this thing could talk?’ about a lot of the objects I interact with. Here is one simple example.
Even though our little man is 3 in November, I still find myself verging on paranoid when it comes to administering medicine – the amounts and the frequency.
So, ‘what could we do if the cap could talk?’ One idea I have is the cap broadcasting that it has been opened, with a receiving app logging the time – so you know exactly when you can give your little one their next dose. Of course you could have the time shown on an LCD display, but keeping it as simple as broadcasting via GATT (Bluetooth LE) means it could hit an acceptable price point, and you wouldn’t be swapping batteries out every other day.
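As a rough sketch of what the receiving app’s side might look like once a cap-open event arrives over BLE – note the 4-hour dosing interval and the class/method names here are hypothetical illustrations, not from any real product:

```python
from datetime import datetime, timedelta

# Hypothetical minimum gap between doses; a real app would take this
# from the medicine's label or from user input.
MIN_DOSE_INTERVAL = timedelta(hours=4)

class DoseLog:
    """Logs cap-open events and says when the next dose is safe."""

    def __init__(self, min_interval=MIN_DOSE_INTERVAL):
        self.min_interval = min_interval
        self.events = []  # timestamp of each cap-open broadcast received

    def cap_opened(self, when=None):
        """Record a cap-open event (e.g. received via a BLE notification)."""
        self.events.append(when or datetime.now())

    def next_dose_due(self):
        """Earliest time another dose should be given, or None if no doses yet."""
        if not self.events:
            return None
        return max(self.events) + self.min_interval

    def can_dose_now(self, now=None):
        """True if enough time has passed since the last cap-open event."""
        due = self.next_dose_due()
        return due is None or (now or datetime.now()) >= due
```

The nice part is that the cap itself stays dumb: it only has to fire a single broadcast when opened, and all of the logic above lives in the phone app.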
I’ve committed myself to writing and talking about the concepts of Micro-Interactions and Anticipatory computing, using a SmartWatch app as a vehicle to deliver the content so that it has particular relevance. The challenge is actually coming up with a compelling use-case (apart from the obvious). Initially it was a Water Consumption Monitor, then a Pace Setter (i.e. allow the user to set a pace and nudge them when they slow down), but I settled on a way for the user to navigate to a destination with less reliance on their SmartPhone (more on this in a later post).
This idea sparked a thought: Smart devices (Phone/Watch/Earpiece/…) could help the visually impaired navigate urban areas, using techniques similar to those used by semi-autonomous cars.
Despite the average Smartphone life-cycle getting longer, it’s still approx. 22 months (in line with operator monthly plans), and with over 52% of the world’s cell phones now being Smartphones, there are probably a lot of useful devices gathering dust in people’s drawers.