This post was written by PhD student Andy Quitmeyer about his experience at the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2013), which took place in early September in Zurich, Switzerland. UbiComp is the leading international conference on ubiquitous computing, an interdisciplinary field of research and development that uses and integrates pervasive, wireless, embedded, wearable, and mobile technologies to bridge the gap between the digital and physical worlds.
Over the summer, while doing my research at the Smithsonian’s Tropical Research Institute in Panama, I applied to UbiComp’s Doctoral School. This was only the second time they held a doctoral student colloquium at the conference, and the first time they expanded it to include both junior- and senior-level PhD students. My research studying performative digital media and animal behavior in the wild (digitalnaturalism.org) does not fit easily into many conferences, and I was excited that they had accepted me as a Senior Student Participant in the doctoral school.
UbiComp was one of the most welcoming conferences I have been able to attend. While I learned that a couple of subfields tend to dominate ubiquitous computing, such as smartphones and health, the overall conference is very open to all kinds of research dealing with getting computers out into the world. Since the field is so broad, professors like Gregory Abowd and Aaron Quigley gave excellent introductory talks delving into UbiComp’s history and the major issues affecting it.
I received terrific, insightful feedback on my PhD research from people in many different disciplines. My presentation even sparked a small debate between two major researchers: we had a good discussion about whether gathering more positivistic statistics about the behavioral scientists I work with would help reinforce my work or detract from the deeper experiences these participants are going through with their animals and technology.
My presentation concerned the current state of my research in my homebrewed topic of “Digital Naturalism.” I investigate methods for using digital media in the study of animal behavior to support exploration and outreach. Over two field seasons thus far, I have tested several different ways of holding workshops, putting on interactive performances, and documenting scientists’ work as we collaboratively design animal-computer interaction devices. My presentation for UbiComp served as a starting point for synthesizing my multifarious research into a concrete set of arguments.
My project, the Stereo Olfacticon, is inspired by creatures like the male luna moth, which have spatially separated sense organs that give them the ability to smell directionally. It is a relatively simple device that pumps air from different spouts into a mask in an alternating fashion to give you spatialized olfaction. Soon I will run my first trial: a friend will draw an invisible maze in scented oil across a large area, like a parking lot, and I will see if I can navigate it! The goal is a device for open-ended exploration that lets scientists engage with the world through senses similar to those of their subjects, in order to inspire novel research questions.
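For the curious, here is a minimal sketch of what the alternating pump control might look like on an Arduino; the pin numbers, switching period, and overall firmware structure are my illustrative assumptions, not the device’s actual code:

```cpp
// Hypothetical sketch: alternate air flow between the left and right
// spouts of the mask. Pins and timing are illustrative assumptions,
// not the actual Stereo Olfacticon firmware.

const int LEFT_PUMP_PIN  = 8;   // transistor/relay driving the left pump
const int RIGHT_PUMP_PIN = 9;   // transistor/relay driving the right pump
const unsigned long SWITCH_MS = 500;  // how long each side gets air

void setup() {
  pinMode(LEFT_PUMP_PIN, OUTPUT);
  pinMode(RIGHT_PUMP_PIN, OUTPUT);
}

void loop() {
  // Left spout on, right off.
  digitalWrite(LEFT_PUMP_PIN, HIGH);
  digitalWrite(RIGHT_PUMP_PIN, LOW);
  delay(SWITCH_MS);

  // Swap: right spout on, left off.
  digitalWrite(LEFT_PUMP_PIN, LOW);
  digitalWrite(RIGHT_PUMP_PIN, HIGH);
  delay(SWITCH_MS);
}
```

Tuning something like SWITCH_MS would let you trade how quickly the sides alternate against how distinctly each side’s scent is perceived.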
I was also so impressed with some of the projects that I immediately wrote an elaborate breakdown of my favorite new concepts.
In particular, I was really excited by:
Conductive Rubber Electrodes for Earphone-Based Eye Gesture Input Interface
Did you know that the fronts of your eyeballs are positively charged and your retinas are negatively charged? When your eyes rotate in their sockets, you can measure these movements, much like a simple rotating potentiometer, with cheap, headphone-like devices.
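To make the idea concrete, here is a toy Arduino-style sketch of how such an electrooculography (EOG) signal might be read after amplification; the pin, sample counts, and thresholds are all made-up assumptions, not the paper’s implementation:

```cpp
// Toy EOG reader: the corneo-retinal potential shifts as the eye rotates,
// so an amplified electrode signal on an analog pin drifts above or below
// its resting baseline. Pin and thresholds are illustrative assumptions.

const int EOG_PIN = A0;
int baseline = 0;

void setup() {
  Serial.begin(9600);
  // Average a few samples while the eyes rest to find the baseline.
  long sum = 0;
  for (int i = 0; i < 100; i++) {
    sum += analogRead(EOG_PIN);
    delay(5);
  }
  baseline = sum / 100;
}

void loop() {
  int reading = analogRead(EOG_PIN);
  if (reading > baseline + 50) {
    Serial.println("eyes rotated one way");
  } else if (reading < baseline - 50) {
    Serial.println("eyes rotated the other way");
  }
  delay(20);
}
```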
Fragwrap
This was an awesome project with a fantastic documentation video for a thoroughly fleshed-out project using scents and bubbles as a novel interface. It features projection mapping onto smoke bubbles and airborne bubble control using polarization.
Detecting cooking state with gas sensors during dry cooking
I did not know this before, but arrays of gas sensors can exploit the cross-sensitivity of each individual sensor to react to chemical changes in cooking beyond what that sensor is explicitly listed for (e.g., “methane”). In this way, the authors were able to robustly detect changes in cooking states to help novice chefs.
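As a rough illustration of the idea (not the paper’s actual method), a sketch like the following could compare an array of raw sensor readings against stored per-state signatures and pick the closest one; the pins, signature values, and state names here are all invented:

```cpp
// Illustrative nearest-signature classifier over a gas-sensor array.
// All pins, signatures, and state names are invented for the example.

const int NUM_SENSORS = 3;
const int SENSOR_PINS[NUM_SENSORS] = {A0, A1, A2};

const int NUM_STATES = 3;
const char* STATE_NAMES[NUM_STATES] = {"raw", "cooking", "burning"};
// Hypothetical mean sensor readings for each cooking state.
const int SIGNATURES[NUM_STATES][NUM_SENSORS] = {
  {120, 200, 90},
  {340, 450, 210},
  {620, 700, 480},
};

void setup() {
  Serial.begin(9600);
}

void loop() {
  // Read the whole array once per cycle.
  int reading[NUM_SENSORS];
  for (int i = 0; i < NUM_SENSORS; i++) {
    reading[i] = analogRead(SENSOR_PINS[i]);
  }

  // Find the state signature with the smallest squared distance.
  long bestDist = -1;
  int bestState = 0;
  for (int s = 0; s < NUM_STATES; s++) {
    long dist = 0;
    for (int i = 0; i < NUM_SENSORS; i++) {
      long d = reading[i] - SIGNATURES[s][i];
      dist += d * d;
    }
    if (bestDist < 0 || dist < bestDist) {
      bestDist = dist;
      bestState = s;
    }
  }
  Serial.println(STATE_NAMES[bestState]);
  delay(1000);
}
```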
Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets
This group created an impressive interface using simple, non-invasive, calibration-free eye tracking, paired with the knowledge that our eyes match the speed and direction of moving objects they follow.
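My reading of the core trick, sketched below with invented sample data: correlate the stream of gaze positions with each on-screen target’s trajectory over a short window, and select the target with the highest correlation:

```cpp
// Minimal sketch of the Pursuits idea (my reading, not the authors' code):
// the moving target whose trajectory best correlates with the raw gaze
// positions is the one being followed. Sample data is invented.
#include <cmath>
#include <cstdio>
#include <vector>

// Pearson correlation between two equally long sample series.
double pearson(const std::vector<double>& a, const std::vector<double>& b) {
  const size_t n = a.size();
  double meanA = 0, meanB = 0;
  for (size_t i = 0; i < n; i++) { meanA += a[i]; meanB += b[i]; }
  meanA /= n; meanB /= n;
  double cov = 0, varA = 0, varB = 0;
  for (size_t i = 0; i < n; i++) {
    cov  += (a[i] - meanA) * (b[i] - meanB);
    varA += (a[i] - meanA) * (a[i] - meanA);
    varB += (b[i] - meanB) * (b[i] - meanB);
  }
  return cov / std::sqrt(varA * varB);
}

int main() {
  // Horizontal gaze positions over one window (noisy, drifting right).
  std::vector<double> gazeX = {10, 14, 19, 25, 29, 36, 41, 44};
  // Two targets: one moving right, one moving left.
  std::vector<double> rightTargetX = {0, 5, 10, 15, 20, 25, 30, 35};
  std::vector<double> leftTargetX  = {35, 30, 25, 20, 15, 10, 5, 0};

  double rRight = pearson(gazeX, rightTargetX);
  double rLeft  = pearson(gazeX, leftTargetX);
  std::printf("right: %.2f  left: %.2f -> following the %s target\n",
              rRight, rLeft, rRight > rLeft ? "right" : "left");
  return 0;
}
```

Because only relative movement matters, this works without calibrating the eye tracker to absolute screen coordinates.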
The Timestreams platform: Artist-mediated participatory sensing for environmental discourse
This was a WordPress plugin for creating and using environmental sensor data in many ways.
They even had a couple of projects that were directly up my alley in terms of research on computer-animal interaction. For instance, FIDO – Facilitating Interactions for Dogs with Occupations: Wearable Dog-Activated Interfaces (which is actually from Georgia Tech) investigates arming smart dogs with a backpack interface for digitally sending signals to their owners or emergency helpers.
One of my favorite parts of the entire conference was the gadget show. This was an hour-long session where anyone at the conference could show off any gizmo they had created for one minute. It did not matter whether the artifact had been used in any publications at all. Luckily I had packed my new device for directionalized smelling, the Stereo Olfacticon, and I got to show it off to a very enthusiastic audience. From there, and despite some parts broken during inspections, my project was even invited to be featured on Swiss TV.
On the last evening, they held a lovely dinner celebration on top of a mountain overlooking the city. It was an excellent way to cap a ubiquitous tech conference by bringing us all back in touch with the real world.
This was an incredible experience that has let me build many strong new connections, which will be invaluable to my future research. I am thankful to Georgia Tech’s DM program, the GVU, and UbiComp 2013 for supporting me through this.