For our first Interaction Design Studio project, we were asked to design a digital sleep solution for family health. Through our research, we came to understand that healthy sleep for many families depends on good communication about time, routines, information, and habits, so we reframed the design challenge as an opportunity to design a system for family communication called ARrange.
The power of ARrange lies in its ability to embed information directly at our fingertips, in the contexts where we want it. For families, this means parents can tag relevant objects with information, bringing their children’s data out of their phones and into the environment where they can keep an eye on it. It also means that parents can leave reminders and warnings for their children when and where they need them: don’t forget to brush your teeth for two minutes, lock the door behind you when you come in, don’t leave a spoon in your soup when you microwave it. In addition, families can tag personal and sentimental objects with nostalgic memories or little notes for each other, opening up an opportunity for deeper meaning. The ARrange system consists of two components: a mobile application to tag objects with new information and access previous tags, and the augmented reality (AR) tags on physical objects that reveal information upon physical interaction.
At a time when the sheer amount of information available to us has become overwhelming, ARrange seeks to provide only the information we want, when and where we want it, triggered by a simple touch on relevant objects.
The app is a mobile library that controls the AR tags. It enables users to photograph, tag, and organize objects with personalized AR information.
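As a rough illustration of how such a library might be organized (the names and structure here are our own sketch, not the app’s actual implementation), the core operations are attaching information to a photographed object and looking up previous tags:

```python
# Hypothetical sketch of the mobile app's tag library: users photograph an
# object, attach information to it, and browse their previous tags.

class TagLibrary:
    def __init__(self):
        self._tags = []  # each entry: photo filename, object name, attached info

    def tag_object(self, photo: str, name: str, info: str) -> dict:
        """Create a new AR tag for a photographed object."""
        entry = {"photo": photo, "object": name, "info": info}
        self._tags.append(entry)
        return entry

    def tags_for(self, name: str) -> list:
        """Look up all tags previously attached to an object."""
        return [t for t in self._tags if t["object"] == name]

library = TagLibrary()
library.tag_object("IMG_001.jpg", "nightstand memento", "Temp: 38.2 °C")
print(library.tags_for("nightstand memento"))
```

The key design point is that tags are indexed by physical object, so the app can act as a history of everything a family has attached to a given thing.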
ARrange allows objects that have been tagged with AR holograms to be discreet and meaningful additions to the family ecosystem. Objects with active tags emit an aura in AR, signaling to the user that information is present. The viewer can then use the intuitive affordances of touch and other natural interactions to bring the AR to life, when and where they want it.
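The aura-then-reveal behavior described above can be sketched as a small state model (a hypothetical illustration in plain Python, not production AR code): an active tag shows only a subtle aura until a touch reveals its content, and a second touch hides it again.

```python
from dataclasses import dataclass

# Illustrative sketch of an ARrange tag's display states (hypothetical names):
# an active tag signals its presence with an aura; touching the object
# toggles the full information on and off.

@dataclass
class ARTag:
    object_name: str
    content: str
    active: bool = True
    revealed: bool = False

    def render(self) -> str:
        """What the AR layer displays for this tag right now."""
        if not self.active:
            return ""                                 # inactive tags show nothing
        if self.revealed:
            return self.content                       # touch has revealed the info
        return f"aura around {self.object_name}"      # discreet signal only

    def touch(self) -> None:
        """A physical touch toggles the revealed content."""
        if self.active:
            self.revealed = not self.revealed

tag = ARTag("toothbrush", "Brush for two minutes!")
print(tag.render())   # aura only
tag.touch()
print(tag.render())   # full reminder
```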
For our working prototype, we relied on Unity and Vuforia, a platform for creating AR markers, to attach two unique AR markers (in the form of image targets) to two different blocks of wood. This let us get a sense of the field of view of the AR content while twisting and turning the blocks, as well as the kinds of questions one might expect when bringing the blocks together (should the AR content disappear when stacking one block on top of another?).
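The stacking question can be reduced to a simple rule. As a sketch (plain Python rather than our actual Unity/C# scripts, with an assumed distance threshold), the system checks how far apart the two tracked image targets are and swaps the individual holograms for a combined one when the blocks come together:

```python
import math

# Illustrative sketch, not the Unity/Vuforia prototype itself: decide what to
# display when two tracked image targets are brought close together.

STACK_THRESHOLD = 0.05  # metres; an assumed tuning value

def distance(a, b):
    """Euclidean distance between two 3D marker positions."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def holograms_to_show(pos_a, pos_b):
    """If the blocks are stacked or adjacent, show one combined hologram;
    otherwise show each block's own hologram."""
    if distance(pos_a, pos_b) < STACK_THRESHOLD:
        return ["combined"]
    return ["block_a", "block_b"]

print(holograms_to_show((0, 0, 0), (0.3, 0, 0)))   # blocks far apart
print(holograms_to_show((0, 0, 0), (0, 0.03, 0)))  # one stacked on the other
```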
To better understand our problem space and where we could situate our solution, we interviewed five parents whose children were at different life stages. We wanted to understand each family’s everyday, ideal, and disrupted sleep routines and how those affected, and were affected by, waking life. While each interview provided unique insights, every interviewee said that communication around time and health issues deeply affected their family’s sleep.
We came up with four design principles based on our research and early explorations:
- Holistic approach: Keep in mind the 24/7 nature of how sleep affects our waking life and vice versa.
- Design for failure: Technology should fail gracefully, especially when the stakes are high, as with health.
- Calm technology: The system should be intuitive and non-disruptive, remaining useful and usable even when users are sleepy.
- Hierarchy of information: Design for different levels of attention and hierarchies of information.
Making to Learn
For our team, a big emphasis throughout the course of this project was on designing intuitive and appropriate interactions with information. We began the design process by sketching and playing with all sorts of materials, including things as simple as wooden blocks and printer paper.
Because there are no best practices for interactions in AR yet, we explored various technologies and interaction types in addition to looking at what's already being done (Hiroshi Ishii's work at the MIT Media Lab was particularly inspiring).
However, we quickly found that we needed to bring our brainstorming to a more contextualized and embodied level, so we moved to one of our homes to bodystorm. To capture and test our best ideas from bodystorming, we created five video prototypes (edited in After Effects and Premiere) to explore how physical interactions might reveal different pieces of data in augmented reality.
We let our ideas about the technology and our scoped-down problem space shape each other. The resulting how-might-we questions, along with the ability to tag discrete and personal objects already in the environment with unique AR, helped us scope the design of our solution. How might we…
- Ease parental anxiety, especially when it comes to children’s sleep?
- Facilitate contextual communication between family members?
- Bring data and messages to the environments and objects we interact with already?
Designing for AR
The final hologram we created focused on a scene where a parent tags a sick child’s body temperature to a memento on her nightstand and goes to bed. During the night, the child’s status appears as a hologram when the object is flipped over. When we got together to film and edit the scene, we were inspired by wooden blocks and the innate interactions we have been using since childhood. It was at this point that we decided to use a simple touch to activate AR holograms, and to use the affordance of sliding two blocks together to create a unique, combined hologram that reveals a new layer of information.
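The slide-to-combine behavior can be thought of as a lookup from a pair of tags to a new, combined information layer. As a hypothetical sketch (the tag names and combined layer here are invented for illustration):

```python
from typing import Optional

# Hypothetical sketch: sliding two tagged blocks together reveals a combined
# layer of information keyed on the unordered pair of tags.

COMBINED_LAYERS = {
    frozenset({"temperature", "medication"}): "fever trend + next dose time",
}

def combine(tag_a: str, tag_b: str) -> Optional[str]:
    """Return the combined hologram for a pair of tags, if one is defined."""
    return COMBINED_LAYERS.get(frozenset({tag_a, tag_b}))

print(combine("medication", "temperature"))  # order of the blocks doesn't matter
```

Using an unordered pair as the key reflects the physical interaction: it should not matter which block slides toward which.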
Designing the App
We went through a few iterations of the UX flow in wireframes before nailing down exactly how we wanted the app to work. Then we developed the visual design for the UI before finally creating the InVision prototype.
Early Wireframe Iterations
Later Wireframe Iterations
One area of future exploration is experimenting with the visual presentation of a hologram. Would an AR oven or microwave be more appropriately displayed in 3D or 2D form? Should it rotate, or be placed at an angle to suggest its 3D shape? User testing would be crucial to successfully implementing this type of augmented reality. We would also like to explore interaction possibilities around gaze control, specifically the layer of data it could present as a user quickly glances over tagged objects.