Some ramblings about Conversational UIs, Bots, and ChatBots.
At present there is a lot of attention on Conversational User Interfaces, Bots, and ChatBots – especially interesting and exciting for those of us who design and build the ways people interact with computers.
To reduce ambiguity it’s worth distinguishing between ‘Chat’ and ‘Bot’. Here I consider Chat to be an interaction model where the interface is predominantly natural conversation; the medium is the Conversational User Interface (Conversational UI, or CUI for short).
A Bot is an agent (a software application or service) that can carry out a task (semi-)autonomously on behalf of the user. A ChatBot, therefore, is a Bot that interfaces with the user via conversation but achieves its task autonomously.
This problem became apparent when trying to design a “Gift Recommendation service”. The general idea was to use a retailer’s API and your contacts’ interests (interest graphs built from Likes, Follows, and Favourites, using data available via the user’s social networks – FB, Twitter, LinkedIn) to recommend relevant gifts based on their interests, the occasion, and the strength of the relationship. That was the easy part – the difficulty came when trying to associate a contact’s ‘interests’ with gifts (products). One way would be to base it on keywords, e.g. filter products by matching features from their descriptions against the contact’s interests (‘sport’, ‘music’, …) – I imagine this approach would have failed miserably.
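To make concrete why the keyword approach is so fragile, here is a minimal sketch of it – the product data, field names, and tokenisation are all hypothetical, not the actual service:

```python
def recommend_gifts(products, interests):
    """Naively match products to a contact's interests by keyword overlap.

    products: list of dicts with 'name' and 'description' fields (hypothetical schema)
    interests: set of lowercase interest keywords, e.g. {'sport', 'music'}
    """
    matches = []
    for product in products:
        # Tokenise the description into lowercase words
        words = set(product["description"].lower().split())
        # Keep the product if any interest keyword appears verbatim in its description
        if words & interests:
            matches.append(product["name"])
    return matches

products = [
    {"name": "Bluetooth headphones", "description": "Wireless headphones for music lovers"},
    {"name": "Running shoes", "description": "Lightweight shoes for sport and training"},
    {"name": "Scented candle", "description": "A relaxing lavender candle"},
]

print(recommend_gifts(products, {"sport", "music"}))  # ['Bluetooth headphones', 'Running shoes']
```

The failure mode is obvious from the code: a contact interested in ‘football’ matches nothing described as ‘soccer’, and any product whose copy happens to contain the word ‘music’ gets recommended regardless of relevance.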
One of the first projects I embarked on out of Uni was building a proximity marketing service using some discounted Motorola phones with JSR-82 (the Java Bluetooth API). Imagine being able to push vouchers and special offers to customers in proximity – the reality was spam: unsolicited messages interrupting the customer, usually at an inconvenient moment. A lot of the ‘services’ currently being implemented using iBeacon remind me of those days; the good news is that this time round the user has to ‘opt in’, i.e. install an app.
Since then my interest has shifted from the intrusive to the invisible, and you can see this in our current iBeacon prototype at Razorfish UK.
While flicking through the news this morning I came across an article on VentureBeat about online advertising trends for 2014. One of those trends was the implications of the disappearance of the cookie for targeted advertising (achieved by tracking the user’s past browsing); the article also highlights the limitations of this approach in the new digital landscape (i.e. it cannot track users across devices). It wraps up by describing how internet titans are jumping in to fill this void (e.g. Clearinghouse from Mozilla, Google AdId, and others) and how this could lead to a central repository for the user to control their privacy.
Back in 2011 Scott Jenson, at the time a Creative Director at Frog, wrote an article titled Mobile Apps Must Die. In it he argues, essentially, that the current model of searching for, finding, and obtaining mobile applications is archaic – one thoughtlessly carried over from the desktop world. The main frustrations are around discoverability, distribution, and fragmentation. Scott proposes a model where applications are made available to the user based on their current context – just-in-time – and delivered using ubiquitous web technologies.
Whilst working with Masters Of Pie on a suite of digital products for the well-known Carte Blanche Group (whose brands include Tatty Teddy and My Blue Nose Friends), we were asked how we might maximise the stand they had at this year’s London Toy Fair.
Our challenge was to build something specifically for the fair that would attract bystanders, provide a memorable and enjoyable experience with the brand, as well as introduce one of the characters from the Blue Nose Friends collection (an easy choice – Coco the excitable monkey!).
Our next question was what would attract and engage bystanders – it needed to be something casual, fun, and non-intrusive in order to draw people in willingly and give busy passers-by something they wanted to play with.
We looked at how Coco might have some fun interacting with visitors and decided upon a rhythm-based game where the user takes control of Coco using the Kinect, hitting his bongos in time to the music.
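The core of any rhythm game like this is deciding whether a hit landed close enough to a beat. A minimal sketch of that judging logic – the timing window and beat pattern are assumptions for illustration, not the values we shipped:

```python
def judge_hit(hit_time, beat_times, window=0.15):
    """Return True if a drum hit lands within `window` seconds of any beat.

    hit_time: timestamp of the player's hit, in seconds
    beat_times: beat timestamps extracted from the track, in seconds
    window: tolerance either side of a beat (hypothetical value)
    """
    return any(abs(hit_time - beat) <= window for beat in beat_times)

beats = [0.0, 0.5, 1.0, 1.5]  # a simple 120 BPM pattern
print(judge_hit(0.52, beats))  # just after the 0.5s beat: a hit
print(judge_hit(0.30, beats))  # between beats: a miss
```

In practice the hit timestamps would come from the Kinect skeleton stream (detecting when a hand crosses the bongo zone), with the same window test applied.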
In late 2010 we spotted a potential trend for Augmented Reality as a marketing tool – not a new idea, as it had been done before, but it wasn’t yet a mainstream marketing technique. At that stage I spent a fair amount of time researching the topic, but being a bootstrapped service business, it wasn’t long before the project was parked to gather dust. Around mid-2011 it was rediscovered and the initiative relaunched. Augmented Reality solutions existed by then, but given our unique relationship as a production partner for a few agencies, we saw value in being able to offer a solution without the additional cost. So as a company we decided I would be given a couple of months dedicated to building a prototype that could be used to promote the service to agencies. Thankfully OpenCV made it possible to build a fully functional prototype (with rendering engine) within this time. The result was a lot of long nights and a fairly responsive marker (template) detection engine.
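To give a flavour of what template detection involves, here is a toy sketch of the underlying idea – exhaustive matching by sum of squared differences over a grayscale image. This mirrors the core concept behind OpenCV’s template matching (e.g. cv2.matchTemplate with a squared-difference method), not the actual engine I built:

```python
def find_marker(image, template):
    """Exhaustive template matching by sum of squared differences (SSD).

    image, template: 2D lists of grayscale pixel values.
    Returns the (row, col) of the best-matching top-left position.
    """
    th, tw = len(template), len(template[0])
    ih, iw = len(image), len(image[0])
    best_pos, best_score = None, float("inf")
    # Slide the template over every valid position in the image
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            # Accumulate squared pixel differences over the template window
            score = sum(
                (image[r + i][c + j] - template[i][j]) ** 2
                for i in range(th)
                for j in range(tw)
            )
            if score < best_score:
                best_pos, best_score = (r, c), score
    return best_pos

image = [
    [0, 0, 0, 0],
    [0, 9, 8, 0],
    [0, 7, 9, 0],
    [0, 0, 0, 0],
]
template = [
    [9, 8],
    [7, 9],
]
print(find_marker(image, template))  # exact match at (1, 1)
```

A real engine has to cope with rotation, scale, perspective, and lighting changes, and do so at frame rate – which is where most of those long nights went.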