Filtering by Category: blogject

Social Objects

Ulla-Maaria Mutanen's new project is called "Social Objects" and aims at building and testing "simple service concepts for labeling, bookmarking and communicating around design, art and craft objects":

The purpose is to bring together four kinds of groups:
1) technology developers, who are interested in testing their products and applications in concrete settings like museums and design exhibitions
2) designers, manufacturers, artists, and crafters who want to generate online conversations around their work
3) museums and exhibition organizers who are interested in finding new ways to engage with their audience
4) university researchers who are interested in the social practices that connect the online and the physical

Why do I blog this? given her current work with ThingLink, this new project seems quite compelling. I don't know much about it, but the idea of extending the social layer around artifacts is of particular interest IMO. Something that would help track the history of interactions an object has (with its owner, other people or the environment) is valuable, and the narrative that could be generated out of it could be intriguing.

Interest-based life logging

Blum, M., Pentland, A. & Tröster, G. (2006), InSense: Interest-Based Life Logging, IEEE Multimedia, 13(4), pp. 40-48. The paper describes a wearable data collection device called InSense, based on Vannevar Bush's Memex principles, that allows users to continually collect their interactions and store them as a multimedia diary. It basically takes into account the sensor readings from a camera, a microphone, and accelerometers. The point is to classify the user's activities and "automatically collect multimedia clips when the user is in an 'interesting' situation".

What is interesting is the types of categories they picked to develop their context-aware framework: they chose location, speech, posture, and activities to represent many diverse aspects of a user's context. They also have subcategories (for instance, for location: office, home, outdoors, indoors, restaurant, car, street, shop).

The experience sampling approach works as follows:

Subjects wear the system for several hours without interacting with it. Audio and acceleration signals are recorded continuously. The camera takes pictures once a minute and WiFi access points are logged to establish location. After the recording session, the user employs an offline annotation tool, which presents one image at a time, the corresponding sound clip, and a list of labels from which to choose.

What is also curious is their description of the algorithm that calculates the current level of interest of an event based on the context classification. Why do I blog this? I am less interested in the purpose of the system itself (sharing material) than in the data extracted from context readings and how this could be used to tell a story (or to build up a narrative). Of course, given my interest in games, I see this device as intriguing and potentially relevant for mapping first life experience onto virtual world counterparts; it could go beyond current pedometers that control dogs.
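To make the idea concrete, here is a minimal sketch of how such an interest-driven capture trigger might work; the category weights and the threshold below are my own invented placeholders, not the values or the actual algorithm from the paper:

```python
# Sketch of an InSense-style "interest" trigger: given a classified
# context (location, speech, posture), compute an interest score and
# decide whether to capture a multimedia clip.
# All weights and the threshold are invented placeholders.

CATEGORY_WEIGHTS = {
    # (category, classified value) -> contribution to the interest score
    ("location", "restaurant"): 0.8,
    ("location", "office"): 0.2,
    ("speech", "conversation"): 0.9,
    ("speech", "silence"): 0.1,
    ("posture", "walking"): 0.5,
    ("posture", "sitting"): 0.2,
}

CAPTURE_THRESHOLD = 1.5

def interest_level(context):
    """Sum the weights of the classified context categories."""
    return sum(CATEGORY_WEIGHTS.get((cat, val), 0.0)
               for cat, val in context.items())

def should_capture(context):
    return interest_level(context) >= CAPTURE_THRESHOLD

# A lunch conversation scores higher than quiet desk work:
lunch = {"location": "restaurant", "speech": "conversation", "posture": "sitting"}
desk = {"location": "office", "speech": "silence", "posture": "sitting"}
```

The interesting design question is exactly the weighting: what makes a situation "interesting" enough to be worth a clip.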

Onlife, Nintendo Wii and traces of interaction

For several weeks now I have been hooked on Onlife, a very simple application that tracks and helps you visualize traces of your interaction with Mac applications.

Onlife is an application for Mac OS X that observes your every interaction with apps such as Safari, Mail and iChat and then creates a personal shoebox of all the web pages you visit, emails you read, documents you write and much more. Onlife then indexes the contents of your shoebox, makes it searchable and displays all the interactions between you and your favorite apps over time.

For instance, yesterday's patterns are quite clear:

Why do I blog this? the notion of "traces of interaction" is very trendy lately; I see it popping up everywhere: about blogjects, in educational technologies (how to use past interactions to feed information back to users and help them learn? why not use AI techniques such as case-based reasoning to meet this end?)... This is also an approach favored by Nintendo with the "Wii play history": the Wii indeed automatically records details of what game was played. Users are then able to see a record of how long they played which games.

Now, some might be wondering: what would be the potential usage of such applications? To me Onlife is interesting to see my work patterns (my web browser is a very important tool that I use in conjunction with my text editor) and eventually adjust my behavior (time to shut down my IM client?). But what else? A problem here might be that those applications are too limited to make sense; a lot of the stuff we do is not logged... and eventually a tremendous problem here is... privacy...
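Under the hood, the per-application accounting that Onlife or the Wii play history performs is conceptually simple; a toy sketch (app names, times and the log format are made up for illustration, not either product's actual data model):

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Toy "traces of interaction" log: (app, start, end) events summarized
# into total usage per application, Wii-play-history style.
# All events below are invented for illustration.

def summarize(events):
    totals = defaultdict(timedelta)
    for app, start, end in events:
        totals[app] += end - start
    return dict(totals)

t0 = datetime(2006, 11, 20, 9, 0)
events = [
    ("Safari", t0, t0 + timedelta(minutes=45)),
    ("TextEdit", t0 + timedelta(minutes=45), t0 + timedelta(hours=2)),
    ("Safari", t0 + timedelta(hours=2), t0 + timedelta(hours=2, minutes=30)),
]
usage = summarize(events)
```

The hard part is obviously not the accounting but the capture: deciding which interactions get logged at all, which is exactly where the privacy question bites.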

"Bruno Latour forecasts the future"

Quite a compelling title for a blog post, right? (Via Daniel Kaplan.) Last month I hadn't parsed the whole of New Scientist's special issue about 50 years of foresight. There is this intriguing short piece by Bruno Latour that makes sense:

In 50 years, social scientists will be able to visualise the connections between human organisations and technological objects. Today we know how to visualise technological systems using scientific images and technical drawings, but we have no idea of how to hook those designs up with the arrays of emails, spreadsheets, blogs and pieces of paper that organise the people who operate those systems. Why should that matter? Think of the Columbia disaster: there were thousands of drawings of the space shuttle and its parts, but none to represent the organisation of NASA. Once Columbia had exploded, everyone realised that faulty procedures were just as much to blame as faulty parts.

Devising connecting tools is a major but feasible undertaking. Fifty years is a safe bet: it is about the time it took in the Renaissance to invent perspective in the first place. Why do I blog this? well, I fully concur with Latour's description and I sorta like that this forecast is not about crappy superintelligent computers or quantum leaps into parallel universes. What is interesting here is the description the author makes of potential connections between human and non-human actors. It might seem a bit cryptic but there are good implications to draw here concerning blogjects and the participation of objects in social webs.

Canary in a coal mine

Digging some stuff out of the web about the role of animals in some specific situations, I came across this interesting "usage" as described on the BBC website:

(Picture from the BBC website)
The canary is particularly sensitive to toxic gases such as carbon monoxide which is colourless, odourless and tasteless. This gas could easily form underground during a mine fire or after an explosion. Following a mine fire or explosion, mine rescuers would descend into the mine, carrying a canary in a small wooden or metal cage. Any sign of distress from the canary was a clear signal the conditions underground were unsafe and miners should be evacuated from the pit and the mineshafts made safer. (...) Coal miners now rely on carbon monoxide detectors and monitors.

Why do I blog this? this is an example of how miners found a trick for "measuring" some level of gases that might be dangerous. This concept is not so different from Beatriz Da Costa's blogging pigeons, which measure urban pollution.

A wireless smart ball that senses position, direction, speed and acceleration

Via, this intriguing new device described in the press release as a wireless smart ball that senses position, direction, speed and acceleration:

STMicroelectronics (NYSE: STM), one of the world’s leading semiconductor manufacturers, and Ball-IT Oy, a leading provider of advanced real-time wireless sensor solutions, today announced a novel MEMS-based wireless motion-control device. Making its debut at ST’s stand at Electronica 2006, the smart golfball-sized object can operate as a free-hand personal computer mouse, compass, measuring tape, pedometer, or a 3D-object controller.

Why do I blog this? As Gene Becker says "Put that in your blogject and smoke it ;-)" (a private joke related to some other quotes from a ubiquitous computing discussion). Anyway, what is interesting here is that it can be seen as a standardized wireless and sensing controller. Though the company says "I believe we’ve sensed the market’s direction and are on the ball", the observer is still left with no precise idea about what they want people to do with it (no worries, some people have some ideas). Here is the only mention:

ST’s acceleration sensors are used to provide a motion-activated user interface in Nintendo’s new home console, Wii.(...) ST’s unique portfolio of two- and three-axis MEMS accelerometers targets a wide range of low-g applications from motion-based user interfaces to hard-disk drive and automobile-passenger protection. Market analysts predict that by 2010 there will be one accelerometer in each mobile phone and every portable hard-disk-based device (laptops, audio/video players), representing a total market of more than 1.2 billion units.

Chumby

The Chumby seems to be an intriguing artifact expected to be released in 2007:

a compact device that can act like a clock radio, but is way more flexible and fun. It uses the wireless internet connection you already have to fetch cool stuff from the web: music, the latest news, box scores, animations, celebrity gossip...whatever you choose. And a chumby can exchange photos and messages with your friends.

Looking at the product history is quite interesting, I highlighted the aspects I found relevant:

Chumby is different. The chumby was not created in the design department of some big consumer electronics company. (...) We made it all up. Chumby Industries was formed by hackers who wanted to create something interesting, useful and different. (...) What we decided to build was a really low-cost, wireless (WiFi), Internet-connected device that will sit on your bedside table (or in your bathroom, or kitchen, or living room, or maybe even plug into your car somehow...) that could do a lot more than this old clock radio. (...) We also decided that the chumby would be different because it will be “open and hackable.” If you happen to be another card-carrying hacker, you can blow off the warranty, pull out its electronic guts and reprogram it. If you're more of a "crafter," we're providing patterns so you can give your chumby a new skin. You can sew on patches, attach enameled pins, bury it in glue and glitter; whatever you want to do to personalize it. If you're a Flash artist, we hope you'll use chumby as a sort of always-open art gallery for your coolest stuff. (...) The chumby is designed to let you stay connected to your Internet life in locations where it might be fun and convenient. (...) We've now built a few hundred and are in the process of getting feedback from early users, mostly hackers and artists. We want to learn what people think of chumby, and how they'd like to use it or collaborate to make the world a chumbier place.

Why do I blog this? I like the way this device goes beyond the "communicating object" paradigm (exemplified by the Nabaztag) by expanding the use of information flows (pictures for instance) and taking into account crafting/hacking issues.

Digital patina: Lucent's Live Web Stationery

Lucent's Live Web Stationery is an old project (SIGGRAPH '97) that shows the concept of "virtual aging": a web page ages as if it were a physical piece of paper. It's a project by Dorée Duncan Seligmann and Stephan Vladimir Bugaj. As described in the press release:

Live Web Stationery is a demonstration of Web pages that "age" based on the amount of traffic that they endure. (...) "The Web is a public virtual space that requires signs of life and interaction in order to become more engaging," said Seligmann. "Web pages are touched by thousands of people each day, and there must be a way to convey the age of the 'page' itself, how its texture changes, how its shape is altered. Live Web Stationery conveys a sense of community and interaction that doesn't exist on Web sites today."

Why do I blog this? because I like this idea of digital patina: it's a way to enhance objects (virtual or not) with a history of their interactions (a positive history?) by a user or a group of users. The next step is to find or create affordances based on this. Besides, as Laurie Anderson expressed it, at some point it's good to put more dirt into virtual reality.

A blogging purse

Cyril pointed me to this quite unusual blogject (calling it "unusual" is wrong, actually, since there is no prototypical blogject representation), a 'blogging purse': "It looks like it just uploads images. The details are a bit on the weak side, but some of the stuff looks neat. The purse contains a camera, basic stamp, pedometer and Nokia phone".

Here is the blog it creates, a contextual uploader actually.

Remote-control gardening

Via, look at this Aiterrarium: Remote-control gardening:

On October 11, Matsushita Electric Works, Ltd. announced plans to begin selling an indoor gardening system whose lighting, temperature and water supply can be remotely monitored and controlled via the Internet. The system, called Aiterrarium, is slated for release on December 20 and will initially target research facilities for universities and businesses.

Why do I blog this? I am wondering why it could not be the other way around: sensors on a cell phone (or whatever object that can be mobile, "visiting" diverse environments) that would remotely control elements of the plant's environment (for instance water distribution with different levels of sodium, different light exposure, noises... or even radio waves and touch sensors) so that the plant's development is a by-product of your own movements in space... Matching your own experience (the light you have access to, the radio waves you encountered, the food you eat) or not... I mean it's a matter of turning your cell phone into a blogject input and your plant into a blogject output.

CARPE: Capture, Archival, and Retrieval of Personal Experience

The last issue of IEEE Multimedia is about "Capture, Archival, and Retrieval of Personal Experience", which the authors refer to as "CARPE" (sounds fishy in French). What struck me as the most interesting part is the introduction:

The human preoccupation with capturing and archiving memorable experiences witnessed astonishing technological advancement in the 20th century, progressing from diaries and paintings to the dawn of the digital camera and camcorder era—and ushering in our multimedia community. Today, we must expand our notion of media, because audio and video recording can also be supplemented in many ways, including with temperature, heart rate, location, acceleration, humidity, Web pages visited, and logging how we use many devices.

Why do I blog this? The special issue describes some solutions to this problem (through automatic labelling or passive capture). But they only describe information captured by humans (or by devices that belong to humans); what about objects that would act on their own? If we think about what I blogged the other day, there might be surrogates to be put in the loop.

Environment XML

Environment XML, a project by Usman Haque:

Many projects have been constructed in which objects (or webpages) respond to environmental conditions. Here we provide, instead, the environment that gets responded to: realtime environmental data from our office is released in XML format, with the hope that anyone elsewhere in the world can create objects (or webpages) that respond to the environment of the office.

Built with Processing and using the Arduino physical computing platform, Environment XML will be released as an open source project so that others may easily release the environmental conditions of their own spaces as realtime XML data. This is part of a continuing exploration into ways that open source strategies might be applied to the design and construction of space and architecture.
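To give an idea of what "the environment as realtime XML" could look like, here is a minimal sketch; note that the element names and structure below are invented for illustration, not Haque's actual schema:

```python
import xml.etree.ElementTree as ET

# Sketch of serving a room's sensor readings as realtime XML, in the
# spirit of Environment XML. Element names, attributes and units are
# invented placeholders, not the project's actual format.

def to_environment_xml(readings):
    """readings: {sensor_name: (value, unit)} -> XML string."""
    root = ET.Element("environment")
    for name, (value, unit) in readings.items():
        sensor = ET.SubElement(root, "sensor", name=name, unit=unit)
        sensor.text = str(value)
    return ET.tostring(root, encoding="unicode")

xml = to_environment_xml({
    "temperature": (21.5, "C"),
    "light": (430, "lux"),
    "sound": (52, "dB"),
})
```

Anyone polling such a feed could then build an object (or a web page) that responds to someone else's office.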

Why do I blog this? Close to Wil's Psychogeographical markup language, this is interesting as a variation on the blogject theme. It's interesting that artists are more and more focusing on how to structure environmental data that would eventually reflect the state of places.

Improving the reliability of virtual organisations

According to this news item, researchers at the University of Southampton are working on models that will supposedly improve the reliability and trustworthiness of virtual organizations.

A virtual organisation is one whose members are geographically apart (usually working via networked computer applications) while appearing to others to be a single, unified organisation with a real physical location. According to Professor Michael Luck of the School of Electronics and Computer Science (ECS), as the market for virtual organisations grows and an increasing number of companies are represented by computerised agents acting on their behalf, there is a greater need to ensure that these agents behave responsibly. (...) In seeking to address some of these challenges, the researchers have developed a system for the dynamic formation and operation of virtual organizations, drawing on scenarios such as that of an individual visiting London for the 2012 Olympic Games who requires a PDA to access various multimedia services.

They are currently in the process of implementing a prototype system which looks at issues such as trust and reputation, standardising communication between agents, and policing within a virtual organization, so that the impact of behaviour such as non-delivery of services by an agent is minimised.

Why do I blog this? what is the next step? having blogjects as company agents?

Palimpsest blogject

Wil has been playing with the blogject idea for a while now, and I hadn't found time to explain what he is doing. He recently worked out this interesting prototype:

This blog/blogject is a Janus head, 2 faced web0.0 monster sharing a memory-system styled on a palimpsest. When the limited memory it has is filled to maximum capacity it needs to reorganise it to make space otherwise it can't store any more new blog entries. In doing this it has to try not to forget the old ones, but this is not always done with much success as memories confabulated over time become increasingly unrecognisable. This making space is done by the blogject and its functioning is modelled on how our brains interleave our memories: by dreaming. The resulting dreams are what the blogject publishes online. (...) The purpose or meaning of this writing is not in the writing itself but in the interpretation of it by the ones submitting writing to its memory. This property too it shares with dreams.

Why do I blog this? pushing further the notion of blogjects, focusing on input/outputs, this project is quite compelling and I like the adjacency to automatic writing and the cut-up.

More about it here: BLOG/BLOGJECT; A Blog that Dreams

SAP Labs' Ike Nassi about wireless networking

Computerworld features an interview with SAP Labs' Ike Nassi about how he foresees the future of wireless networking. Some excerpts I found interesting:

The integration of the real world and the IT world is going to happen, and it's going to accelerate. It's going to be driven by the increase in RFID in sensor networks and the rise of embedded microprocessors. We are doing things here that couldn't have been done three to five years ago.

He then gives some examples to support this:

For example, we are working with the city of Palo Alto to outfit fire trucks with a variety of wireless communications gear so we can track fire engines back to SAP's back-end systems. One thing the fire department was interested in, for example, was ... understanding why a fire truck would take what appeared to be a nonoptimal route to a fire. (...) The automobile has a tremendous number of microprocessors but has been slow to adopt networking. We are exploring back-end Web services [for] network-enabled cars. For example, my car told me I needed an oil change. But in the mail, I got a notice saying my car needed a software change. If the whole thing were network-enabled, I could have gotten an e-mail saying, "Your car needs to be serviced. (...)" [There is] a potentially very large number of back-end services that can be delivered to the car or driver.

Projects discussed at the 2nd blogject workshop

My (raw) notes about the 3 projects discussed during the second blogject workshop held in Lausanne: 1) Ubicamera (Julian Bleecker, Sascha Pohflepp, Mark Meagher, Frédéric Kaplan). Starting point: how blogjects could be used to circulate culture. If my camera knows that I am in Amsterdam, and also that I was there 6 months ago, it could link up to Flickr and establish a network of different sources. Or, if I go to an event and take pictures, the camera would check other Flickr pictures automatically.

"The Flickr camera" is driven by a fascination with media sharing as a cultural practice. This group thought about how to take this to the next level of interaction: the social practice of Flickr turned into a blogject.

There are 3 primary Flickr characteristics embedded in a camera:
- the interface (fluidity)
- association: sharing pictures amongst friends and strangers
- browsing practices: go there and look at the 10 latest photos of your contacts, then check the pictures of a group your friends belong to ("big brown things").

Scenario: Sascha walks down the road, sees a totem, takes a picture; a public/private indicator shows up, as well as the opportunity to select certain tags: some are preloaded and other tags are there just because the camera has found them.

Social camera situation: meeting other blogject cameras. As you maneuver through your day, you come across similar cameras and they communicate: an embodiment of the Flickr association to find similar pictures and people. You can also specify the tags you're interested in, and the camera would download the pictures of others you meet serendipitously (these people won't even know).

Also, the camera is GPS-enabled: location can be another key to find information/people/pictures; it's then an exchange of tags based on location. The camera can also tell the others which tag is proper.

As for the browsing practice, the question is "how do we use the camera as a display device?" Not just for browsing but also for navigating in a way that is as compelling as Flickr. There could be different interfaces: photos could be shifted according to time/space/personality/trajectory. The photostream is then the stream of life of the owner; when 2 cameras take the same picture, there would be an intersection of 2 lives: this should also be displayed.

The camera also has a life of its own: no buttons, photos can't be deleted. What happens if you buy such a camera at a flea market? You buy a camera but also the pictures taken by the previous owner.

Timo was interested in this sort of interface as well as the tagging practices it would generate.

2) Nabaztags and blogjects (Fabien Girardin, Alain Bellet, Regine Debatty, Cyril Rebetez). Starting point: using Violet's Nabaztag (the wifi rabbit) as an output device or a blogject aggregator, as a way to make sense of the history of interactions. It could also be a nice channel, a part of the "blogjectsphere".

"Not a rabbit that helps losing weight" but a spokespet. The main features would be:
- reminder/teaser/awareness
- trigger for actions, or to drop an action you are doing, based on the data collected by the blogject
- a spokesperson for voiceless objects in the form of a rabbit: it would recycle and analyze data from the environment (other objects), managing an ecosystem of data.

The ecosystem is made of objects with simple sensors:
- letters with the bills you have to pay (RFID tags): a reminder that you have 5 bills to pay
- trash: the rabbit knows the status of your trash and can act as a reminder or warner ("don't put that can in the bin, it's only for paper")
- watering the plant: the rabbit receives the answer of the plant sensors and reminds you that you should water it
- playful or unpleasant reminders
- the action generated by one sensor can trigger another blogject
- the rabbit can talk to your friends' rabbits, and the pedometer in your shoe can activate your friends' rabbit ("hey, your friend went running"): awareness of others so that you might eventually choose to join
- the collar of your dog can communicate with the rabbit: a translator of the barking dog, or a warner that you have to take him out.
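The "spokespet" described in these notes is essentially an event dispatcher; a minimal sketch (all object names, events and phrasings are invented for illustration, not anything from the actual Nabaztag):

```python
# Sketch of the "spokespet" aggregator: voiceless household objects push
# simple sensor events, and the rabbit turns them into spoken reminders.
# Object names, events and message templates are invented placeholders.

class Spokespet:
    def __init__(self):
        self.rules = {}   # (object, event) -> message template
        self.said = []    # what the rabbit has announced so far

    def on(self, obj, event, message):
        """Register a reminder rule for a given object/event pair."""
        self.rules[(obj, event)] = message

    def notify(self, obj, event, **data):
        """An object reports an event; the rabbit speaks if a rule matches."""
        template = self.rules.get((obj, event))
        if template:
            self.said.append(template.format(**data))

rabbit = Spokespet()
rabbit.on("mailbox", "rfid_bill", "You have {count} bills to pay")
rabbit.on("plant", "soil_dry", "The plant needs water")
rabbit.on("friend_rabbit", "pedometer", "Hey, {who} went running")

rabbit.notify("mailbox", "rfid_bill", count=5)
rabbit.notify("plant", "soil_dry")
rabbit.notify("friend_rabbit", "pedometer", who="Fabien")
```

One rabbit, many mute objects: the rabbit is the single voice for the whole ecosystem of sensors.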

Fabio Sergio was struck by the desire of human beings to have objects that would be like us, and he pointed out that this is antithetical to the fact that animals can act badly. The question is then: "do we really want things to have a personality, given that we would need to manage them?" Do we want to have this sort of relationship with objects?

Sascha raised the question of the underlying cultural aspects: from "mute servants" to agents. But Cyril reminded us that psychology has shown that we project meaning and intents anyway.

3) Mobile phones and blogjects (Timo Arnall, Fabio Cesa, Fabio Sergio, Marc Hottinger, Nicolas Nova). Starting point: the mobile phone is a generic device (phoning, taking pictures, posting to Flickr...): can it be turned into a blogject? Or a blogject controller? If we become surrounded by blogjects, how do we manage that situation? The mobile phone as a tool/wand/interface. What are the potential social issues?

meanwhile... 2nd blogject workshop

These last 2 days I have been busy running the 2nd blogject workshop with Julian Bleecker at EPFL. There's a small group of very relevant people there (Julian Bleecker, Fabien Girardin, Mauro Cherubini, Mark Meagher, Frédéric Kaplan, Laurent Sciboz, Timo Arnall, Sascha Pohflepp, Regine Debatty, Fabio Sergio, Fabio Cesa, Marc Hottinger, Cyril Rebetez, Alain Bellet, Manu Bansal). This one is a bit different from the first one we had during LIFT06: different people, from different backgrounds, and more anchored in concrete projects and scenario development.


Still have to write report from the tons of notes, drawings and audio files we have!

A 3rd workshop is on its way... (source), stay tuned.

A place like a Muscle

I am really enjoying this Muscle NSA project carried out at the Hyperbody Research Group at Delft University. This is a programmable building that can reconfigure itself.

For the exhibition Non-Standard Architecture ONL and HRG realized a working prototype of the Trans-ports project, called the MUSCLE. (...) Programmable buildings change shape by contracting and relaxing industrial muscles. The MUSCLE programmable building is a pressurized soft volume wrapped in a mesh of tensile muscles, which change length, height and width by varying the pressure pumped into the muscle.

What is interesting is the interaction they designed engaging people in a playful activity:

Visitors of the Architectures Non Standard exhibition play a collective game to explore the different states of the MUSCLE.

The public interacts with the MUSCLE by entering the interactivated sensorial space surrounding the prototype. This invisible component of the installation is implemented as a sensor field created by a collection of sensors. The sensors create a set of distinct shapes in space that, although invisible to the human eye, can be monitored and can yield information to the building body. The body senses the activities of the people and interacts with the players in a multimodal way. The public discovers within minutes how the MUSCLE behaves on their actions, and soon after they start finding a goal in the play. The outcome of this interaction however is unpredictable, since the MUSCLE is programmed to have a will of its own. It is pro-active rather than responsive and obedient. The programmable body is played by its users.

There is also a slight connection with the blogject concept:

For the behavioral system this means that the produced sensorial data is analyzed in real-time and acts as the parameters for pre-programmed algorithms and user-driven interferences in the defined scripts. These author-defined behavioral operations are instantly computed, resulting in a diversity of e-motive behaviors that are experienced as changes in the physical shape of the active structure and the generation of an active immersive soundscape. The MUSCLE really is an interactive input-output device, a playstation augmenting itself through time.

Why do I blog this? what I like in this project is that it mixes different aspects of the HCI world: games, game software, architecture, the usage of sensors. In the end, the outcome is pretty original and the visitors' experience seems to be intriguing. I also like how it modifies the relationship of the visitors to a dynamic place.

Meeting at the IFTF

I had lunch today with my friend Alex Pang at the Institute For the Future in Palo Alto. The discussion was around the Internet of Things, spimes and blogjects. Starting by discussing Bruce Sterling's Shaping Things, we were thinking about the fact that, as Sterling says, there is no smartness in the objects; the smartness rather resides in the way those objects and networks help us make better choices, especially with regard to specific actions or meeting people. Wired and connected objects may indeed help us choose which tools to use to consume less energy; sharing certain types of trackable objects with others is also of interest (and is actually a topic discussed in one of the stories Bruce Sterling wrote in "Visionary in Residence"): a kind of community hammer or drill, for instance.

Alex and I also discussed some potential ideas about the blogject series of workshops I am organizing along with Julian. Additionally, Jason Tester updated me on their pervasive gaming project, a very relevant synthesis about context-aware games. This project interestingly started by looking at the history of video games from the POV of users and then continued as an overview of pergames directions.

Alex finally encouraged me to go deeper into the Science, Technology and Society world, which is quite a good idea.

Blogject front-end using the Xbox360 XML data feeds

Trapper Markelz designed a nice exemplification of the blogject concept: he built a blogject front-end using the Xbox360 XML data feeds. Here is how it looks: "I am a XBOX 360 and I can talk".

So what is all this talk about Blogjects? While at eTech I had an idea to build a Blogject front-end using the Xbox360 XML data feeds. Steve and I have been working on it a few weeks on and off and here is what we have so far. The next step is putting it into a linear blog format so that you can have an RSS feed for your Xbox and it will tell you each day what happened to it.

Why do I blog this? it seems to expand on the concept of datablogging by letting the game console upload information to the web in the form of a blog.
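The general recipe is easy to sketch: poll an XML activity feed, rewrite it as first-person blog entries. The feed structure below is invented for illustration, not the actual Xbox 360 data feed schema:

```python
import xml.etree.ElementTree as ET

# Sketch of a blogject front-end: turn a console's XML activity feed
# into first-person blog entries. The feed below is an invented
# placeholder, not the real Xbox 360 feed format.

FEED = """
<recentgames gamertag="Trapper">
  <game title="Project Gotham Racing 3" lastplayed="2006-11-18" gamerscore="120"/>
  <game title="Geometry Wars" lastplayed="2006-11-19" gamerscore="45"/>
</recentgames>
"""

def feed_to_posts(feed_xml):
    """Render each game entry as a first-person blog post."""
    root = ET.fromstring(feed_xml)
    posts = []
    for game in root.findall("game"):
        posts.append(
            "I am an Xbox 360. On {lastplayed} my owner played "
            "{title} and earned {gamerscore} points.".format(**game.attrib)
        )
    return posts

posts = feed_to_posts(FEED)
```

From there, wrapping the posts in an RSS envelope (the "next step" Markelz mentions) is mostly templating.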