➣ Giulio Jacucci
Behavioral technology should ideally always be with users, should not burden them with additional devices or force them to learn new interfaces, and should be flexibly applicable to different tasks and activities. In short, the technology should be easy to weave into the fabric of our everyday life. Recent research in Human-Computer Interaction shows that this integration is made easier if the technology supports situated use, exploiting user actions and the environment as a resource rather than requiring explicit commands or data entry by users. There is a candidate platform that can accommodate these requirements, and that is “what used to be called the mobile phone.”
Mobile phones have evolved to the extent that manufacturers no longer call them phones. Features include cameras with ever increasing resolution for picture and video capture, internet connectivity (based on operator networks or WLAN), near-field communication such as RFID (Radio Frequency Identification), a variety of sensors such as accelerometers, and GPS (Global Positioning System). Through the Bluetooth interface virtually any sensor can be used in applications, and Bluetooth itself can act as a sensor, able to scan for and sense the presence of other devices. More importantly, the mobile phone contains rich and diverse information about the user and his or her actions that can be used by applications: the calendar, communication logs, whether the phone’s profile is set to silent, and usage data of the phone and its applications. All these features are even more powerful when used in concert rather than separately, and initial naturalistic studies show that such novel mobile applications can have a great impact on the activities and experiences of users, in particular by leveraging social interaction. Let’s review some examples, starting with studies of collective story creation using mobiles, then studies of mobile awareness cues and their integration, and finally sensing and augmented reality and their potential for novel applications.
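To make the idea of using these data sources in concert concrete, here is a minimal sketch in plain Python (not any real phone API; the field names and the availability heuristic are illustrative assumptions) of how an application might bundle such signals into a single context record:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ContextSnapshot:
    """Bundles the phone's own data sources into one context record."""
    location: str                      # e.g. from GPS or cell ID
    silent_profile: bool               # phone profile set to silent?
    minutes_since_use: int             # idle time from usage data
    nearby_bluetooth_devices: List[str] = field(default_factory=list)
    calendar_entry: Optional[str] = None

    def is_available(self) -> bool:
        # A naive availability heuristic (purely illustrative):
        # recently active, not silenced, and no ongoing calendar event.
        return (not self.silent_profile
                and self.minutes_since_use < 15
                and self.calendar_entry is None)

snapshot = ContextSnapshot(
    location="campus cafe",
    silent_profile=False,
    minutes_since_use=5,
    nearby_bluetooth_devices=["AA:BB:CC:01", "AA:BB:CC:02"],
)
print(snapshot.is_available())  # → True
```

The point of the sketch is the combination: no single field is very informative, but together they support the kind of inference the applications below rely on.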
Collective Creation and Sense Making of Mobile Media
With Comeks on the phone, users create comic strips by augmenting and annotating their own mobile pictures with speech bubbles and other comic-related accessories. Messages are put in a sequence, creating stories that can be sent to friends. In a longitudinal field study, messages featured performative acts in which members play-acted together or turned objects into characters. Collage was frequent, reusing, for example, pictures from members of the group or other printed material (Salovaara 2007). The study shows how a mobile phone that supports producing and sending comic strips out of mobile pictures (Figure 1, left) can impact group communication and experience by creating novel expression practices in a group of subjects.
In a series of studies on spectators at large-scale events we have shown how mobile pictures can affect group experience and action. In a first study, using simple camera phones with only the built-in applications, we witnessed rich forms of multimedia-mediated expression such as staging, competition, storytelling, joking, communicating presence, and portraying others. These expressions were always connected to group practices and were motivated by how they featured in the socially engaging, processual, and shared nature of experiences. The analysis pointed to how mobile media was not only important in mediating or documenting experiences, but also in how it participated in the very construction of experience by providing opportunities for action.
In a subsequent study, we equipped groups of spectators with an application that implemented mobile media messaging in collective stories (group messaging spaces) that became archived group media albums. We have used the term “collective” to refer to the experiential component in joint media creation and sense-making. Collective use appears to be rewarding because it not only provides new forms of interpersonal and inter-group communication, but also ways to re-enact and re-use a group’s conventions and shared memories in novel, inspiring ways. In these collective albums, awareness and social presence are actively constructed achievements. Our work points to how users can actively engage in constructing the cues by contributing to building up the medium for presence.
Awareness Cues and Their Integration
Another example is how mobile awareness cues about others’ context impact coordination and communication. ContextCues augments the contact list of the mobile phone by displaying, next to each name and number, context information about that person. These cues can include location, nearby members, time elapsed since the phone was last used, and other activity cues. Longitudinal intervention studies showed an impact on coordination, whereby cues were used to infer information, align, and optimize mobility and communication. More importantly, cues were used in the pursuit of connectedness, companionship, and a sense of belonging with peers and significant others.
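As a sketch of what juxtaposing such cues to a contact entry might look like, here is a toy rendering function; the cue names and the text format are hypothetical, not the actual ContextCues interface:

```python
def format_contact(name, number, cues):
    """Render one contact-list line augmented with awareness cues.

    `cues` is a dict with optional keys: 'location', 'idle_minutes',
    'nearby' (a list of names). All keys are illustrative placeholders.
    """
    parts = [f"{name} ({number})"]
    if "location" in cues:
        parts.append(f"@ {cues['location']}")
    if "idle_minutes" in cues:
        parts.append(f"idle {cues['idle_minutes']} min")
    if cues.get("nearby"):
        parts.append("with " + ", ".join(cues["nearby"]))
    return " | ".join(parts)

line = format_contact("Anna", "+358 40 1234567",
                      {"location": "library", "idle_minutes": 3,
                       "nearby": ["Ville"]})
print(line)  # → Anna (+358 40 1234567) | @ library | idle 3 min | with Ville
```

The design choice here mirrors the one described above: the cues are shown in the contact list itself, so the user sees them before deciding whether and how to communicate.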
In further experiments we have also combined awareness cues and collective media stories, as shown in Figure 2. Our system combined a group media space with event information and integrated awareness cues throughout. In two field trials we found that the system supported what we call active spectatorship: facilitating on-site reporting to off-site members, coordination of group action, keeping up to date with others, spectating remotely, and joking. In these activities, media, awareness cues, and event information were often used in concert, albeit assuming differing roles.
Sensing and Augmented Reality
Sensor information, for example from a heart rate sensor, can also be used in mobile applications, as in Kurvinen et al. (2007), which experimented with sharing real-time performance information of teenage football players with their parents and coaches during training and competition. This exemplifies how easily sensor information, even at a physiological level, can be integrated, and how the sensed data can be used not just for the individual user but in a social application.
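The sharing pattern involved can be sketched as a toy publish/subscribe design (an assumption for illustration, not the system from Kurvinen et al.): the player's sensor feed publishes readings, and parents or coaches subscribe to them.

```python
class HeartRateFeed:
    """Toy publish/subscribe feed for one player's heart rate readings.

    Player names, callback shapes, and the in-memory delivery are all
    illustrative; a real system would push over a network.
    """
    def __init__(self, player):
        self.player = player
        self.subscribers = []

    def subscribe(self, callback):
        # callback receives (player, bpm) for every new reading.
        self.subscribers.append(callback)

    def publish(self, bpm):
        for callback in self.subscribers:
            callback(self.player, bpm)

received = []
feed = HeartRateFeed("player 7")
feed.subscribe(lambda who, bpm: received.append((who, bpm)))  # e.g. a coach's phone
feed.publish(172)
print(received)  # → [('player 7', 172)]
```

The social character of the application lives entirely in the subscriber list: the same reading serves the individual player and every observer at once.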
Technologically, there are solutions on mobile phones that are starting to be interesting as ways to tag the environment, such as RFID tags and optical markers. The mobile phone can increasingly interface with a variety of sensors; at our institute we have developed a platform for this called ContextPhone. 3D graphics are also developing quickly, enabling complex visualizations that could be useful in rehabilitation scenarios.
Recent developments make it possible to run Augmented Reality visualizations on mobile phones, registering 3D virtual objects in a real scene by using computer vision to track optical markers (Figure 1, right). This creates an engaging and rich mediation of the physical environment and has been shown to activate users in navigating and using space in a different way, for example in educational games in museums.
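The geometry behind registering a virtual object on an optical marker can be sketched with pinhole-camera arithmetic. The numbers below are hypothetical, and a real tracker estimates a full six-degrees-of-freedom pose; this only shows the two simplest quantities, distance and in-plane rotation:

```python
import math

def marker_distance(focal_px, marker_size_m, marker_size_px):
    """Pinhole-camera distance to a square marker:
    distance = focal_length * real_size / apparent_size."""
    return focal_px * marker_size_m / marker_size_px

def marker_yaw(corner_left_px, corner_right_px):
    """In-plane rotation of the marker, from the slope of its top edge
    between two detected corner pixels (x, y)."""
    dx = corner_right_px[0] - corner_left_px[0]
    dy = corner_right_px[1] - corner_left_px[1]
    return math.degrees(math.atan2(dy, dx))

# Hypothetical camera: 800 px focal length; a 10 cm marker appears 80 px wide.
print(marker_distance(800, 0.10, 80))       # → 1.0 (meters)
print(marker_yaw((100, 200), (180, 200)))   # → 0.0 (degrees, edge is level)
```

From such estimates the renderer scales and rotates the virtual object so that it appears anchored to the physical marker as the phone moves.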
A Call for Balance and Artful Integration
The mobile phone is therefore a surprisingly powerful tool compared to desktop applications, thanks to its ability to support continuous and ubiquitous use, including interaction with the environment, which can be useful in physically activating subjects. More importantly, it can function as a powerful tool for expression and social interaction, deeply affecting action and experience in a variety of activities. Our examples show that to make the best use of the rich features provided by mobile phones, an artful integration is required to create applications that can truly be woven into users’ everyday practices.
There are several issues and open questions connected to the mobile phone as a behavioral technology. On a technical level, developing applications for phones is difficult given portability problems, as there are several programming languages and operating systems on phones. Moreover, although phone screens are getting bigger and their interfaces easier (see the iPhone), a phone application can still be perceived as difficult and complex to use. More importantly, in some of the studies mentioned above, use was so intensive that it raised concerns about negative effects in the longer term. Some users even reported feeling social uneasiness in public spaces when other people noticed them staring so long at their mobile phone. Therefore, as with any technology, the design should aim for a balanced impact.
Giulio Jacucci, Ph.D.
Helsinki Institute for Information Technology