I’ve committed myself to writing and talking about the concepts of Micro-Interactions and Anticipatory Computing, using a SmartWatch app as a vehicle to deliver the content so that it has particular relevance. The challenge has been coming up with a compelling use-case (apart from the obvious): initially it was a Water Consumption Monitor, then a Pace Setter (i.e. allowing the user to set a pace and nudging them when they slow down), but I eventually settled on a way for the user to navigate to a destination with less reliance on their SmartPhone (more on this in a later post).
This idea sparked a thought about how smart devices (Phone/Watch/Earpiece/…) could help navigate the visually impaired around urban areas using techniques similar to those used by semi-autonomous cars.
The concept would be based on SARTRE (Safe Road Trains for the Environment), an initiative that started in 2009 and was trialled in 2010. The idea is to electronically link cars together to form a ‘road train’, with only the ‘lead’ vehicle being in active control (i.e. driven by a person).
So replace the cars with people and you have an autonomous, crowd-sourced system that could assist the visually impaired (obviously needing a lot more thought). The general idea would be that ‘leads’ run the ‘lead’ service on their phone, while someone who requires assistance runs the ‘client’ service, which is responsible for finding suitable ‘leads’; suitable would mean:
– In proximity
– Travelling at a desired pace
– Generally travelling in the same direction (inferred from either their calendar entries or their current walking direction)
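The three criteria above could be sketched as a simple filter the ‘client’ service runs over nearby ‘leads’. This is only an illustration under assumed data: the `Lead` structure, field names, and threshold values are all hypothetical, not part of any real implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class Lead:
    lat: float
    lon: float
    pace_m_s: float      # walking pace in metres/second
    heading_deg: float   # current walking direction, 0-360

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def is_suitable(lead, client, max_dist_m=50, pace_tol=0.3, heading_tol_deg=30):
    """Apply the three criteria: proximity, matching pace, same direction.

    The thresholds here are placeholder guesses, not tuned values.
    """
    close = haversine_m(client.lat, client.lon, lead.lat, lead.lon) <= max_dist_m
    pace_ok = abs(lead.pace_m_s - client.pace_m_s) <= pace_tol
    diff = abs(lead.heading_deg - client.heading_deg) % 360
    heading_ok = min(diff, 360 - diff) <= heading_tol_deg  # smallest angle between headings
    return close and pace_ok and heading_ok
```

In practice the calendar-based direction check would need a route prediction rather than a raw compass heading, but the shape of the filter would be much the same.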
It’s unlikely that a single ‘lead’ would be going all the way to the client’s destination, so the ‘client’ service would need to continuously scan for more appropriate ‘leads’. It would obviously also have to detect changes in gradient etc.
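One subtlety of continuously rescanning is that the client shouldn’t bounce between two equally good ‘leads’. A minimal sketch of one scan cycle, with a hysteresis margin to avoid that flapping — everything here (the dict shape, the distance-only scoring, the margin value) is an assumption for illustration:

```python
import math

def distance_m(a, b):
    """Crude equirectangular distance in metres; fine at street scale."""
    r = 6371000
    x = math.radians(b[1] - a[1]) * math.cos(math.radians((a[0] + b[0]) / 2))
    y = math.radians(b[0] - a[0])
    return r * math.hypot(x, y)

def rescan(client_pos, current_lead, leads, switch_margin_m=10):
    """One scan cycle: keep the current 'lead' unless a candidate is
    meaningfully closer (the margin stops rapid switching)."""
    if not leads:
        return current_lead
    best = min(leads, key=lambda l: distance_m(client_pos, l["pos"]))
    if current_lead is None:
        return best
    if (distance_m(client_pos, best["pos"]) + switch_margin_m
            < distance_m(client_pos, current_lead["pos"])):
        return best
    return current_lead
```

A real service would score candidates on direction and pace as well as distance, and run this on every location update rather than a fixed timer.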