Welcome, Nabaztag

I recently bought a Nabaztag and find it quite nice, with its glowing lights and very simple design. What I appreciated:

  • extra easy set-up (no problem with the wifi)
  • a very calm ambient display
  • the package is quite empty, but the website is full of information, with informative PDF files (color meanings, usage situations...)
  • the API is available so that people can create their own services
  • there is already a lively community of users and tinkerers
  • at first I was disappointed that the Nabaztag was mostly a recipient of messages (shown through light, sounds and ear movements), but it seems that it can also perceive certain inputs (for instance, if you move its ear, it can send a message to the server)

What I found less good:

  • even though it's their business model, I am reluctant to pay for messaging services and subscriptions
  • to me, there should be more emphasis on the openness of the device (more than the API) and I miss a social software dimension on the Nabaztag website. Chris has already used Ning to create a Nabaztag social platform.
  • the pictures on the box and on the website are quite weird: a large majority of people do not have such a cold home with empty tables... (of course the target group may, but...). For me, the Nabaztag lives in a messier environment: my office at home.

Why do I blog this? The object is interesting to me because it's not smart: it's a wireless-linked device that allows basic communication and interaction through light, sounds and ear movements. Currently, this guy can only interact with my computer (through the company's server) and cell phones. That's a cool feature: you can send SMS to your and your friends' rabbits. What is good is that it's a first step into the world of communicating artifacts. I find myself more interested in this sort of device than in the locomotion of an AIBO (even though I am very curious about AIBO communication and interaction practices, especially the blog thing).

Ok, now let's take some time to understand the API.
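To get a feel for it, here is a minimal sketch of what talking to the rabbit might look like. This assumes the service accepts plain HTTP GET requests carrying the rabbit's serial number and a user token; the endpoint URL and parameter names below are my assumptions for illustration, not guaranteed to match the official API.

```python
# Hedged sketch: build a command URL for the Nabaztag server.
# The endpoint and parameter names ("sn", "token", "posleft") are
# illustrative assumptions, not confirmed API details.
from urllib.parse import urlencode

API_BASE = "http://api.nabaztag.com/vl/FR/api.jsp"  # assumed endpoint

def build_command_url(serial: str, token: str, **params) -> str:
    """Build the GET request URL for a rabbit command (e.g. an ear position)."""
    query = {"sn": serial, "token": token, **params}
    return API_BASE + "?" + urlencode(query)

# Example: ask the rabbit to move its left ear (hypothetical parameter)
url = build_command_url("0013D3XXXXXX", "1234567890", posleft=5)
print(url)
```

The nice part of a URL-based API like this is that anything that can issue an HTTP request, from a cron job to a web page, can poke the rabbit.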

As a user experience researcher, I am very intrigued by possible user interactions with the Nabaztag; currently there are more outputs than inputs, but using the ears could be a good way to interact with it (and consequently with other rabbits). Of course I would have been happy to have proximity detection of objects and people in the vicinity, but I guess it's a matter of time (next version of the rabbit).

More about it later.