The time when punch cards, mice, and keyboards sufficed as input modalities for human-computer interaction has passed. We now require rich sensory modalities, enabling interactive computing systems to respond to people and the world with the complexity of living beings.
The focus of this research-oriented course is to build engaging human experiences based on sensing and recognizing embodied forms of expression. We will begin by studying how to build sensory computing systems using the Cypress Programmable System on a Chip (PSoC) to manage the signal acquisition chain for hardware sensors and capture a multiplicity of forms of real-world activity. The sensors will include IR LEDs and diodes, accelerometers, gyroscopes, compasses, Near Field Communication (NFC), and Galvanic Skin Response (for "excitation").
The PSoC is notable because it can be configured to provide multiple stages of signal chain support: pre-amplification and filtering in the analog domain, as well as digital signal processing (DSP) and other computation, before sending data to a host computer over a medium such as USB. It also supports sensor buses such as SPI (Serial Peripheral Interface).
The Microsoft Kinect's depth map, and the recently ubiquitous sensor kits of smartphones, which include cameras, accelerometers, gyroscopes, and touch screens, are likewise in play for sensory interfaces.
So is the Interface Ecology Lab's multi-finger ZeroTouch sensor.
Our work with sensing technology will be contextualized by research in sociology and HCI. In this skills- and project-oriented course, advanced students will develop innovative research projects.