Recommendation engines have become the ‘hello world’ of Data Products (or, more generally, the data era). They were popularised by Amazon, which not only found them to drive more user engagement than curated reviews but also figured out how to make them work at scale and in (near) real-time, using a technique known as item-based collaborative filtering.
Since then, every digital commerce site has leveraged the idea, and every Machine Learning/Data Mining book reviews the idea and its implementation.
At a high, and very simplistic, level, it works by finding the distance between two entities (either people or items, e.g. restaurants, songs, movies) based on a set of features (e.g. cuisine, genre, artist, director), then using your (or a similar person’s) history of entities previously engaged with (bought, visited, etc.) to predict what other entities you would like. For example, if 90% of your iTunes library is Jazz, then other Jazz songs will carry a higher weight than Rock, and you will be recommended Jazz.
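To make the item-based idea concrete, here is a minimal sketch in Python. The listening history is invented for illustration: each item is represented by the column of a tiny user-item matrix, and items are ranked by cosine similarity to something you already engaged with. Real systems work from far larger, sparser matrices.

```python
from math import sqrt

# Toy listening history: users x songs (1 = engaged with). Hypothetical data.
history = {
    "alice": {"jazz_song_a": 1, "jazz_song_b": 1, "rock_song_a": 0},
    "bob":   {"jazz_song_a": 1, "jazz_song_b": 1, "rock_song_a": 1},
    "carol": {"jazz_song_a": 0, "jazz_song_b": 1, "rock_song_a": 0},
}

def item_vector(item):
    """The column of the user-item matrix for one item."""
    return [history[u].get(item, 0) for u in sorted(history)]

def cosine(a, b):
    """Cosine similarity between two vectors (0 if either is all zeros)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def similar_items(item, candidates):
    """Rank candidate items by similarity to an item already engaged with."""
    v = item_vector(item)
    scores = {c: cosine(v, item_vector(c)) for c in candidates if c != item}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(similar_items("jazz_song_a", ["jazz_song_b", "rock_song_a"]))
```

With this toy data, the other Jazz song ranks above the Rock song because the same users engaged with both Jazz tracks.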
There are times when recommendations need something more than a history of engagement, something more timely. I’m sure we have all experienced this: it’s your wife’s birthday, you shop on Amazon, and your recommendations become embarrassingly polluted with items you would rather your workmates not see. One suggested improvement for recommendation engines is to use context (where possible). Google does this well (an advantage of having established a strong presence in lifestyle and productivity products), e.g. if you’re looking for flights, Google Now will use this derived intent to keep you up to date with the latest flight deals.
But this can be achieved by other means, and in the example I have in mind, we can leverage the mobile’s attributes of being connected, aware, and present to determine whether a recommendation is for an individual or a group of friends. Say you’re out with friends, using your phone to look for somewhere to eat; the phone has your contacts, your location, and awareness of who you’re with (neglecting privacy in this instance). Instead of using just your preferences, it should extend the recommendation to those in close proximity. It’s not hard to see how this extends to going to the movies, finding something to do, or choosing music to play.
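A simple way to extend a recommendation to the people nearby is to aggregate each person’s preference scores. The sketch below averages scores across the group; the names and scores are made up, and averaging is just one strategy (a “least misery” variant takes the minimum instead, so no one is handed a cuisine they hate).

```python
# Hypothetical per-person restaurant preference scores (0..1), e.g. derived
# from each person's own engagement history. All values are illustrative.
preferences = {
    "you":     {"sushi": 0.9, "pizza": 0.4, "bbq": 0.2},
    "friend1": {"sushi": 0.3, "pizza": 0.8, "bbq": 0.6},
    "friend2": {"sushi": 0.5, "pizza": 0.7, "bbq": 0.1},
}

def group_recommendation(members):
    """Average each option's score across the group and pick the best."""
    options = set().union(*(preferences[m] for m in members))
    scores = {
        o: sum(preferences[m].get(o, 0.0) for m in members) / len(members)
        for o in options
    }
    return max(scores, key=scores.get)

print(group_recommendation(["you", "friend1", "friend2"]))
```

Here sushi would win for you alone, but pizza wins once your friends’ preferences are blended in.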
It’s a little ironic that I wrote this post while procrastinating on the assignment for my data analytics course.
… we’re at the brink of the most exciting era of computing; the convergence of pervasive computing, our increased ‘dependency’ on it, and data (capture and handling) will diminish the concept of a computer as a box (keyboard, mouse, and monitor) and transform it into more of a living ‘thing’ (or service).
This problem became apparent when trying to design a “Gift Recommendation service”. The general idea is to use a retailer’s API and your contacts’ interests (interest graphs built from Likes, Follows, and Favourites, using data available via the user’s social networks – FB, Twitter, LinkedIn) to recommend relevant gifts (based on their interests, the occasion, and the strength of the relationship). This was the easy part – the difficulty came when trying to associate a contact’s ‘interests’ with ‘gifts’ (products). One way would be to base it on keywords, e.g. filter products by matching features from their descriptions against the contact’s interests (‘sport’, ‘music’, …) – I imagine this approach would have failed miserably.
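For the record, here is roughly what that naive keyword approach looks like. The product catalogue is invented (a real retailer API would supply it), and the sketch makes the weakness obvious: it only catches literal word overlap, so “guitar” never matches an interest in “music” unless the description happens to say so.

```python
# Naive keyword matching between a contact's interests and product
# descriptions. Product data is hypothetical, for illustration only.
products = [
    {"name": "Acoustic Guitar", "description": "entry level music instrument"},
    {"name": "Running Shoes", "description": "lightweight sport trainers"},
    {"name": "Cookbook", "description": "recipes for the home cook"},
]

def match_gifts(interests):
    """Return products whose description shares a word with any interest."""
    terms = {w.lower() for w in interests}
    return [
        p["name"]
        for p in products
        if terms & set(p["description"].lower().split())
    ]

print(match_gifts(["sport", "music"]))
```

Anything phrased differently from the interest keywords simply falls through, which is why a purely lexical mapping from interests to products fails.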
One of the first projects I undertook out of Uni was building a proximity marketing service using some discounted Motorola phones with JSR-82. Imagine being able to push messages with vouchers and special offers to customers in proximity – the reality was spam: unsolicited messages interrupting the customer, usually inconveniently. A lot of the ‘services’ currently being implemented using iBeacon remind me of those days; the good news is that this time round the user has to ‘opt in’, i.e. install an app.
Since then my interest has shifted from being intrusive to being invisible, as you can see with our current iBeacon prototype at Razorfish UK.
The app ecosystem has exploded over the last few years, creating inefficiencies in app discovery. This is great for the platform vendors but not for developers (or users). There have been numerous attempts to improve it; some of the obvious ones are listed below (ignoring marketing tactics such as free-app-a-day):
- Vertical and specialised app stores (Amazon, Samsung, Sony, …)
- Cross promotional networks
- Review sites
- Increased categories
- Improved recommendations
- Third party apps (e.g. AppFlow, Appreciate, …)
This inefficiency is one major reason why Just-in-Time Interactions make sense, especially as the app model extends to other platforms (desktop, TV, smartwatches, …).
Having been faced with this question (how to market a mobile app) many times, I thought I would have an attempt at making app discovery more relevant.
In this document we briefly explore the what, why, and how of sentiment analysis – let’s jump straight into it.
So what is Sentiment Analysis?
I would be surprised if you haven’t already come across the term, and you may well know or use it already. It has become increasingly popular with the advent of big data in the context of social networks, and it is a tool frequently used by brands to monitor and measure ‘chatter’ about themselves and/or their market.
Essentially it offers a way to extract and measure the opinion expressed in this chatter (normally filtered by a specific topic or channel). For example, if you wanted to know the general opinion of your brand, you could analyse all the tweets that mention it in some way, tally the opinion of each, and report the result.
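The tallying step can be sketched in a few lines. This is the crudest possible approach – a hand-rolled word list rather than a proper lexicon or trained classifier – and the word lists and tweets below are purely illustrative:

```python
# Minimal lexicon-based sentiment tally over a batch of tweets.
# Word lists and sample tweets are hypothetical; real systems use far
# richer lexicons or trained classifiers.
POSITIVE = {"love", "great", "amazing", "good"}
NEGATIVE = {"hate", "awful", "bad", "terrible"}

def score(text):
    """Positive-minus-negative word count for one piece of text."""
    words = set(text.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

def overall_opinion(tweets):
    """Classify each tweet and tally the results."""
    tally = {"positive": 0, "negative": 0, "neutral": 0}
    for t in tweets:
        s = score(t)
        key = "positive" if s > 0 else "negative" if s < 0 else "neutral"
        tally[key] += 1
    return tally

tweets = [
    "I love this brand",
    "their support is awful",
    "just bought another one",
]
print(overall_opinion(tweets))
```

The aggregate tally is what ends up on a brand-monitoring dashboard; the per-tweet scoring is where all the real difficulty (sarcasm, negation, context) hides.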
For a more detailed explanation, check out Wikipedia.
No doubt you have noticed a common theme this year in how people can interact with technology: it has finally reached a stage whereby our interaction with the digital world can be less artificial. Examples include gesture-controlled interfaces (such as the Kinect, LeapMotion, and Intel’s Perceptual Computing kit), touch, eye tracking, and voice. All are encompassed under a category of Human-Computer Interaction (HCI) called Natural User Interfaces (NUI), with the goal of making interfacing with devices, as the name suggests, natural.