The Interface Ecology Lab fosters integrative research projects spanning hardware, software, and theory, producing natural user interfaces, creativity support environments, games,
interaction techniques, visualization algorithms, semantics, programming languages, interactive installations, and evaluation methodologies.
Our research integrates embedded systems with HCI and the interface ecosystems approach to imagine and develop new embodied experiences through computing and signal processing.
We create new devices and experiences, employing low-power sensors, computers, networking, software, and interaction design.
Embodied IdeaMÂCHÉ brings pen + touch interaction to information composition, creating an embodied curation environment that supports information-based ideation.
Cross-surface IdeaMÂCHÉ integrates tablets as private workspaces with embodied IdeaMÂCHÉ for collaboration.
ZeroTouch is a high-resolution sensor for detecting fingers, body parts, and objects, which we use for free-air and multi-touch interaction.
IdeaMÂCHÉ is a holistic, visual, and semantic medium for curation of digital objects, supporting information-based ideation.
Composition is non-linear and freeform, breaking out of lists and grids to support the synthesis of diverse ideas and emergence of new ones.
Composition integrates rich media curation with semantic metadata and expressive annotation through text and sketching.
Information-based ideation (IBI) research investigates human activities in which generating new ideas is essential, while using information as support.
IBI spans imagining and planning an evening, vacation, makeover, summer internship, career, thesis, invention, service, company, crisis response, and approaches to interdisciplinary topics such as sustainability.
We show how curation media support IBI. Evaluation integrates qualitative methods, such as grounded theory, with the derivation and application of scalable quantitative ideation metrics of curation.
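As a concrete illustration of what such quantitative metrics can look like, here is a minimal sketch (not the lab's actual instrument; the category coding and metric names are assumptions) that computes fluency and flexibility over curated elements that have been qualitatively coded:

```python
from collections import Counter

def ideation_metrics(coded_elements):
    """Compute simple ideation metrics over curated elements, each a
    (element, category) pair from a hypothetical qualitative coding."""
    categories = Counter(code for _, code in coded_elements)
    fluency = len(coded_elements)   # total ideas curated
    flexibility = len(categories)   # distinct idea categories spanned
    return {"fluency": fluency, "flexibility": flexibility}

# Example: elements coded by category for a vacation-planning task
elements = [("beach photo", "destination"), ("hostel page", "lodging"),
            ("train map", "transport"), ("surf video", "destination")]
print(ideation_metrics(elements))  # {'fluency': 4, 'flexibility': 3}
```

Metrics like these scale to large study corpora because they only require counting coded elements, complementing the close reading that grounded theory provides.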
BigSemantics is an open-source software architecture for developing powerful applications that present interactive metadata semantics from diverse information sources. BigSemantics provides a meta-metadata language for defining metadata types, a type system for reusing and extending types, a repository of types covering many useful sources and use cases, runtime libraries for conveniently accessing and using metadata semantics in applications, and a web service that facilitates thin clients.
Based on BigSemantics, we developed Metadata In-Context Expander (MICE), an example web application that enables exploring linked semantics in the current context.
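To make the idea of in-context expansion concrete, here is a small sketch (the nested-dict metadata shape and field names are illustrative assumptions, not the actual BigSemantics wire format) of surfacing linked metadata up to a bounded depth:

```python
# Hypothetical shape of extracted metadata: nested dicts with "linked"
# entries pointing to further documents, as a client of a semantics
# service might receive them (field names here are illustrative).
def expand_in_context(metadata, depth=1):
    """Recursively surface linked metadata up to `depth` hops,
    in the spirit of MICE's in-context expansion."""
    if depth == 0:
        # Stop expanding: present only this document's own fields.
        return {k: v for k, v in metadata.items() if k != "linked"}
    result = dict(metadata)
    result["linked"] = [expand_in_context(m, depth - 1)
                        for m in metadata.get("linked", [])]
    return result

paper = {"title": "Example Paper",
         "linked": [{"title": "Cited Work",
                     "linked": [{"title": "Deeper Work"}]}]}
expanded = expand_in_context(paper, depth=1)
print(expanded["linked"][0]["title"])  # Cited Work
```

Bounding the depth keeps the presentation in the user's current context rather than flooding it with the whole link graph.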
This installation provides a movement-based, spatial, and visual interface for navigating a multi-scale information composition, which holistically represents an online art gallery.
TweetBubble is a Chrome extension that helps Twitter users follow associational chains of tweets through #hashtags and @users. People can experience and relate a variety of content in-context, helping them develop multiple perspectives on a topic. TweetBubble makes browsing a more fun and fluid experience.
TweetBubble uses BigSemantics and Metadata In-Context Expander.
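The associational chains can be pictured as a breadth-first walk over tweets that share #hashtags or @users. A minimal sketch, assuming a toy corpus of tweet texts (this is an illustration of the idea, not TweetBubble's implementation):

```python
import re
from collections import deque

def associational_chain(tweets, start_id, max_hops=2):
    """Follow #hashtag and @user links between tweets, breadth-first,
    up to max_hops. `tweets` maps tweet id -> text (toy corpus)."""
    def tokens(text):
        return set(re.findall(r"[#@]\w+", text))
    seen, frontier = {start_id}, deque([(start_id, 0)])
    while frontier:
        tid, hops = frontier.popleft()
        if hops == max_hops:
            continue
        shared = tokens(tweets[tid])
        for other, text in tweets.items():
            if other not in seen and shared & tokens(text):
                seen.add(other)
                frontier.append((other, hops + 1))
    return seen

tweets = {1: "Loving #hci research with @alice",
          2: "@alice demos #multitouch",
          3: "#multitouch sensing is wild",
          4: "unrelated post"}
print(sorted(associational_chain(tweets, 1)))  # [1, 2, 3]
```

Note how tweet 3 is reached only transitively, via the #multitouch tag shared with tweet 2; this chaining is what lets a browser accumulate multiple perspectives on a topic.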
This research addresses moving information objects across multi-display environments that support sensory interaction modalities such as touch, pen, and free-air.
It is characterized by user experiences of continuously manipulating information across interactive surfaces, providing a sense of connection.
ZeroTouch is a high-performance multi-point sensor that detects visual hulls within a plane. Applications include free-air interaction, person tracking in rooms, automobile windshields, hover sensing (e.g., over capacitive multi-touch), and multi-touch.
We developed a series of digital games for Teaching Team Coordination (TeC).
We began with ethnography of fire emergency response work practice, developing understanding of situated contexts,
and design principles for simulations of team coordination in crisis response.
We mapped real-world actions to game mechanics: players work together to achieve goals and avoid hazards.
We invented Zero-Fidelity Simulation, a method that focuses on reproducing the communication and information distribution components of target environments, to produce engaging, low-cost, and effective educational experiences.
S.IM.PL is our open-source initiative for augmenting popular programming languages to facilitate interoperation, with an emphasis on simplifying development of distributed applications that represent the world. The foundation layer, S.IM.PL Serialization, is based on a language-independent type system, and enables cross-language code generation, as well as de/serialization to XML, JSON, and TLV.
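The core idea, driving multiple serialization formats from one language-independent type description, can be sketched as follows (a minimal illustration in the spirit of S.IM.PL Serialization; the class and method names here are invented, not the real API):

```python
import json
from xml.etree.ElementTree import Element, tostring

class TypeDescriptor:
    """A toy, language-independent description of a type: a tag name
    plus a map of scalar field names to their types."""
    def __init__(self, tag, scalar_fields):
        self.tag = tag
        self.scalar_fields = scalar_fields

    def to_json(self, obj):
        # The same field map drives JSON serialization...
        return json.dumps({self.tag: {f: getattr(obj, f)
                                      for f in self.scalar_fields}})

    def to_xml(self, obj):
        # ...and XML serialization, with scalars as attributes.
        root = Element(self.tag)
        for f in self.scalar_fields:
            root.set(f, str(getattr(obj, f)))
        return tostring(root, encoding="unicode")

class Sensor:
    def __init__(self, name, hz):
        self.name, self.hz = name, hz

td = TypeDescriptor("sensor", {"name": str, "hz": int})
s = Sensor("zerotouch", 120)
print(td.to_json(s))  # {"sensor": {"name": "zerotouch", "hz": 120}}
print(td.to_xml(s))
```

Because the type description, not the host language, drives serialization, the same description could generate equivalent code and wire formats in other languages.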
Object-Oriented Distributed Semantic Services (OODSS) simplifies building distributed applications and services, which decode message passing using S.IM.PL Translation Scopes, connecting iPhone, Android, Java, and .Net clients to Java and .Net servers.
BigSemantics also uses S.IM.PL.
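Decoding message passing through a translation scope amounts to mapping wire-format tags to message types, then dispatching. A minimal sketch (the message classes and scope contents are illustrative assumptions, not the real S.IM.PL API):

```python
import json

# Sketch of scope-based message decoding: the scope maps message
# tags on the wire to the types that can service them.
class Echo:
    def __init__(self, text): self.text = text
    def perform(self): return self.text

class Add:
    def __init__(self, a, b): self.a, self.b = a, b
    def perform(self): return self.a + self.b

TRANSLATION_SCOPE = {"echo": Echo, "add": Add}  # tag -> message type

def decode_and_perform(wire_json):
    """Decode a serialized request via the scope, then service it."""
    payload = json.loads(wire_json)
    (tag, fields), = payload.items()  # expect exactly one message
    return TRANSLATION_SCOPE[tag](**fields).perform()

print(decode_and_perform('{"add": {"a": 2, "b": 3}}'))  # 5
```

Because both client and server share the scope, any client platform that can emit the wire format can invoke the same typed services.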
Hurricane Recovery: Collecting Locative Media to Rebuild Local Knowledge
The project engages in an iterative participatory process of reaching out to evacuee communities subsequent to Hurricane Katrina: gathering information about needs and desires, building situated semantics and a locative media collection sensemaking system, and using the system to collect, organize, and re-present images, interviews, and metadata. Digital photographs are connected with GPS sensor data, semantics, a zoomable map interface, and an image clustering algorithm.
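One way to cluster geotagged photographs is greedy single-link grouping by GPS proximity. A minimal sketch of that approach (this is an illustration under assumed data, not the project's actual clustering algorithm):

```python
from math import dist  # Euclidean distance, Python 3.8+

def cluster_by_location(photos, threshold=0.01):
    """Greedy single-link clustering of (name, lat, lon) photo tuples:
    a photo joins the first cluster with any member within `threshold`
    degrees; otherwise it starts a new cluster."""
    clusters = []
    for name, lat, lon in photos:
        for cluster in clusters:
            if any(dist((lat, lon), (la, lo)) < threshold
                   for _, la, lo in cluster):
                cluster.append((name, lat, lon))
                break
        else:
            clusters.append([(name, lat, lon)])
    return clusters

photos = [("house1.jpg", 29.951, -90.072),
          ("house2.jpg", 29.952, -90.071),
          ("school.jpg", 29.990, -90.120)]
print(len(cluster_by_location(photos)))  # 2
```

Clusters like these can then back a zoomable map interface, aggregating nearby images into a single marker at coarse zoom levels.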
A test collection consists of a set of documents, a clearly formed problem that an algorithm is supposed to provide solutions to, and the answers that the algorithm should produce when executed on the documents.
The present research develops an open-source Test Collection Digital Library System. The system enables collecting and labeling documents, and publishing the resulting test collections.
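The three-part definition above maps directly onto a simple data structure. A minimal sketch (the field names and scoring function are assumptions for illustration, not the system's schema):

```python
from dataclasses import dataclass

@dataclass
class TestCollection:
    """Documents, a clearly formed problem, and expected answers."""
    documents: dict   # doc id -> text
    problem: str      # e.g., a query the algorithm must answer
    answers: set      # doc ids the algorithm should return

    def evaluate(self, algorithm):
        """Score an algorithm by the fraction of expected answers found."""
        found = set(algorithm(self.documents, self.problem))
        return len(found & self.answers) / len(self.answers)

tc = TestCollection(
    documents={1: "touch sensing survey", 2: "curation and ideation",
               3: "pen and touch interaction"},
    problem="touch",
    answers={1, 3})

def keyword_search(docs, query):
    return [i for i, text in docs.items() if query in text]

print(tc.evaluate(keyword_search))  # 1.0
```

Publishing collections in a shared format like this lets different algorithms be compared on identical documents, problems, and expected answers.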