Context-aware applications in 2010
Interestingly, location-based/context-aware services are more and more present in the press. After the frenzy of 2004-2005 (and the waning interest that followed), I see more and more articles about the potential role of location and context as the starting point for complex scenarios. See for example the ideas described in this article:
- "My context device "knows" it's noon. It also knows (via accelerometer data) that I haven't moved from my desk for the last couple of hours. Because it "knows" I have a TBD lunch scheduled for 12:30 (it reads my tagged calendar entries), it will remind me I should leave. As soon as I move the device, it displays the list of places where I had lunch the last couple of weeks. Since most were Italian restaurants, it suggests Chinese or falafel and generates the latest consumer rating of the restaurants offered. At the same time, it also highlights restaurants located within walking distance that will allow me to be back in time for my scheduled 2 p.m. meeting.
- I am on a business trip to Madrid, have just finished my meetings and have three hours until my flight back to New York. My device "senses" I started moving and "knows" my schedule, therefore it asks me if I prefer to get a taxi to the airport, or if I prefer to stay in the city since the drive to the airport takes about 15 minutes. I choose the second option, slide the "ambient media streams" all the way from "privacy please" to "hit me with everything you've got," and the device offers me all the tourist attractions around me, even a nearby coffee shop that has received exceptionally high ratings (I love coffee). I choose the coffee shop, and as I am drinking my second cup, the device alerts me that my flight has been delayed by an hour and will board through gate E32. I drink another cup of coffee and read from my device the history of Madrid until the next alert updates me that I should call a taxi -- immediately providing me with an application that directly books one.
- I leave my office to interview someone at a nearby bar. My device "knows" it is a job interview (tagged in my calendar), therefore it automatically Googles the applicant, uploads his resume and image, and then provides me with a summary of the available information found about him from HR, the web and other social sources. As I approach the bar, my device turns itself into "meeting" mode, in which I can view a map that displays two dots approaching each other. As we meet, the device asks me if I would like to record the conversation and send it to HR."
Why do I blog this? I am not sure these scenarios convince me, but it's interesting to contrast them with the ones we saw in 2004-2005. The move from location to context is interesting because it shows that the former is only a component of the latter. It also acknowledges that contextual information is complex and cannot be reduced to mere locational data.
Unlike the three stereotypical scenarios we had five years ago (friend-finding, location-based ads and geotagged post-its), the ones described here are a bit more complex and rely on the connection between "personalized social/behavioral data" and contextual information (location, time, etc.). Using algorithms, services would then be able to infer things that could supposedly interest people, especially in urban environments.
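To make this inference idea concrete, here is a toy sketch (my own illustration, not anything described in the article) of how such a service might combine contextual signals -- clock time, accelerometer-derived stillness, a tagged calendar entry, recent behavior -- into a suggestion, in the spirit of the lunch scenario. All names and rules here are hypothetical.

```python
from dataclasses import dataclass
from datetime import time

@dataclass
class Context:
    now: time                   # current clock time
    minutes_stationary: int     # derived from accelerometer data
    next_event_tag: str         # tag of the next calendar entry
    recent_cuisines: list       # cuisines of lunches in recent weeks

def suggest(ctx: Context) -> str:
    """Toy rule: if it's lunchtime, the user hasn't moved for a while,
    and a tagged lunch is coming up, propose cuisines the user has
    not been eating repeatedly."""
    lunch_window = time(11, 30) <= ctx.now <= time(13, 30)
    if (lunch_window and ctx.next_event_tag == "lunch"
            and ctx.minutes_stationary > 90):
        # Filter out cuisines eaten three or more times recently.
        avoided = {c for c in ctx.recent_cuisines
                   if ctx.recent_cuisines.count(c) >= 3}
        options = [c for c in ("italian", "chinese", "falafel")
                   if c not in avoided]
        return "suggest lunch: " + ", ".join(options)
    return "no suggestion"

# Mostly Italian lunches lately, so Italian gets filtered out.
ctx = Context(time(12, 0), 120, "lunch", ["italian"] * 4 + ["chinese"])
print(suggest(ctx))  # -> suggest lunch: chinese, falafel
```

Of course, the scenarios imply far richer inference than a few hand-written rules, but even this sketch shows where the hard part lies: the rules encode assumptions about what the user wants, and getting those wrong is exactly what makes such "smart" suggestions annoying rather than helpful.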