My interest in Human-Computer Interaction was ignited by Dr. Michael Pounds at Ball State University and led me to teach the subject at Georgia Southern University and then to work at the Institute for Digital Intermedia Art as an HCI Electronics Designer.
Embedded Surfaces
Participants wear a SubPac (a subwoofer backpack) and a 5-inch speaker on each hand. Spatial trackers are attached to each of these speakers. As participants move throughout a predefined space, a portion of an invisible four-dimensional waveform is revealed. Each point in real 3D space has a unique timbre, and movement through this space creates repeatable sonic gestures.
The four-dimensional waveform is generated mathematically. Every audio sample is calculated in real time using the real 3D position and a low-frequency signal generator (controlling position in a fourth dimension) as input. The resulting signal is broadcast through an FM transmitter and picked up wirelessly by an FM receiver worn by the participant. Because the preamplifier and speakers on the participant's hands are lightweight and relatively small, they cannot reproduce low-frequency content. That content is therefore sent to the SubPac, allowing the missing low frequencies to be felt rather than heard.
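The actual sample-generation function isn't reproduced here; as a rough sketch of the idea, assuming the fourth coordinate is driven by a slow LFO and the 4D point shapes the partial weights of an oscillator bank (all constants and names below are hypothetical):

```cpp
#include <cmath>
#include <vector>

// Hypothetical sketch: each audio sample is computed from the hand's tracked
// 3D position plus a slowly moving fourth coordinate w (a low-frequency
// oscillator). Here the 4D point shapes the partial amplitudes of a small
// oscillator bank, so every point in space has a repeatable timbre; the
// actual mapping used in the piece differs.
struct Vec3 { float x, y, z; };

const float PI = 3.14159265f;

float sampleAt(const Vec3& p, float w, float t) {
    const float f0 = 110.0f;                      // assumed base frequency (Hz)
    float s = 0.0f;
    for (int k = 1; k <= 4; ++k) {
        // Partial weights vary smoothly with position and with w.
        float amp = 0.25f * (0.5f + 0.5f * std::sin(k * (p.x + 2.0f * p.y + 3.0f * p.z) + w));
        s += amp * std::sin(2.0f * PI * f0 * k * t);
    }
    return s;
}

void renderBlock(const Vec3& handPos, float& w, float& t,
                 float lfoHz, float sampleRate, std::vector<float>& out) {
    for (float& sample : out) {
        sample = sampleAt(handPos, w, t);
        t += 1.0f / sampleRate;                   // audio-rate time
        w += 2.0f * PI * lfoHz / sampleRate;      // slow fourth-dimension drift
    }
}
```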
Convergence
Convergence uses a Raspberry Pi and Processing to calculate pixel information and a series of FadeCandy LED drivers to deliver that information to the appropriate LEDs. Emergent algorithms by Wolfram, Langton, and Conway are implemented to demonstrate how a simple set of instructions can give rise to complex results.
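As a minimal illustration of that idea (not the project's Processing code), an elementary cellular automaton in the style of Wolfram's Rule 110 fits in a few lines of C++:

```cpp
#include <bitset>
#include <iostream>
#include <vector>

// Minimal illustration of "simple rules, complex results": a 1D elementary
// cellular automaton (Wolfram's Rule 110). Each new cell depends only on its
// three neighbours in the previous row, yet rich structure emerges.
int main() {
    const int width = 64, steps = 32;
    const std::bitset<8> rule(110);          // the rule number encodes the lookup table
    std::vector<int> row(width, 0);
    row[width / 2] = 1;                      // single live cell in the middle

    for (int s = 0; s < steps; ++s) {
        for (int c : row) std::cout << (c ? '#' : '.');
        std::cout << '\n';

        std::vector<int> next(width, 0);
        for (int i = 0; i < width; ++i) {
            int l = row[(i + width - 1) % width];
            int r = row[(i + 1) % width];
            next[i] = rule[(l << 2) | (row[i] << 1) | r];   // 3-bit neighbourhood index
        }
        row = next;
    }
    return 0;
}
```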
With the IDIA Lab
Nodal Media Cluster
A series of Raspberry Pis was networked together to create a distributed multimedia system. In this video, a virtual 3D world is shared among four Raspberry Pis (nodes), with one monitor per node. Each node is responsible for rendering only its own view of the virtual world; world data is passed over a network connection. This approach allowed the shared 3D context to be rendered at greater than 4K resolution using only four $30 computers.
Tools used: openFrameworks, Raspberry Pi, OSC. With the IDIA Lab.
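The message format isn't documented here; a sketch of the per-frame sync, assuming openFrameworks' ofxOsc addon and a hypothetical /world/camera address, might look like this, with each receiving node offsetting its view frustum to cover its own quadrant of the combined canvas:

```cpp
#include "ofxOsc.h"

// Hypothetical sketch of the node-to-node sync: one node owns the world
// state and broadcasts it each frame; every node renders only its own
// slice of the shared 3D scene. Address and fields are assumptions.
class WorldSync {
public:
    void setup(const std::string& host, int port) {
        sender.setup(host, port);
    }

    // Send the shared camera pose once per frame.
    void sendCamera(float x, float y, float z, float heading) {
        ofxOscMessage m;
        m.setAddress("/world/camera");
        m.addFloatArg(x);
        m.addFloatArg(y);
        m.addFloatArg(z);
        m.addFloatArg(heading);
        sender.sendMessage(m, false);
    }

private:
    ofxOscSender sender;
};
```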
VR - Haptic Floor
Gesture-Piloted Drone
A depth sensor is used to capture the user's gestures. Predetermined gestures trigger launch and landing, while hand position controls steering and speed.
Tools used: openFrameworks, C++, OpenNI. With the IDIA Lab.
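The steering mapping can be sketched as an offset of the tracked hand from a calibrated neutral point; the ranges and command structure below are assumptions, not the project's actual code:

```cpp
#include <algorithm>

// Hypothetical sketch of the hand-to-flight mapping: the depth sensor
// reports a hand position in millimetres, and its offset from a calibrated
// neutral point becomes roll, pitch, and throttle commands.
struct Hand { float x, y, z; };                   // from the depth sensor (mm)
struct DroneCmd { float roll, pitch, throttle; }; // roll/pitch -1..1, throttle 0..1

DroneCmd handToCommand(const Hand& hand, const Hand& neutral) {
    auto clamp = [](float v, float lo, float hi) {
        return std::max(lo, std::min(hi, v));
    };
    const float range = 300.0f;                   // assumed +/-300 mm of hand travel

    DroneCmd cmd;
    cmd.roll     = clamp((hand.x - neutral.x) / range, -1.0f, 1.0f); // lean left/right
    cmd.pitch    = clamp((neutral.z - hand.z) / range, -1.0f, 1.0f); // push forward/back
    cmd.throttle = clamp(0.5f + (hand.y - neutral.y) / range, 0.0f, 1.0f); // raise/lower
    return cmd;
}
```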
Studio Study No. 2
The interfaces comprise four capacitance-sensing planes placed in a 2x2 grid. Various resistance values were tested in search of a useful sensing distance; the resistance chosen offers approximately three feet of usable range.
An early issue with the interface was accidental touching of the exposed aluminum foil, which caused a sharp decrease in capacitance. To fix this, I glued the aluminum foil to rectangular pieces of cardboard, blocking users from direct contact. However, I suspected that direct contact with the foil might be a useful variable to poll, so I left a small surface area of foil exposed to allow intentional contact. The contact state allowed nearly instantaneous triggering and toggling of variables within the custom software synth. These interfaces were made for this work: https://www.youtube.com/watch?v=vGMbcBIw3jA
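In software, each plane's raw reading can serve double duty: a smoothed proximity value for continuous control and a discrete contact state for instant triggers. A sketch of that split, with the threshold and smoothing values chosen arbitrarily:

```cpp
// Hypothetical sketch of how one plane's raw reading could be split into a
// smoothed proximity value (for continuous synth parameters) and a debounced
// contact state (for instant triggers/toggles via the exposed foil patch).
struct PlaneState {
    float proximity = 0.0f;   // smoothed reading driving continuous parameters
    bool  touched   = false;  // true while the exposed foil is contacted
    bool  toggle    = false;  // flips once per new touch
};

void updatePlane(PlaneState& s, float raw) {
    const float contactLevel = 0.1f;   // assumed: direct contact drops the reading sharply
    const float smoothing    = 0.15f;  // assumed one-pole smoothing coefficient

    // Continuous proximity: low-pass the raw reading to tame jitter.
    s.proximity += smoothing * (raw - s.proximity);

    // Discrete contact: checked against the raw value so the trigger fires
    // immediately, without waiting for the smoothed signal to catch up.
    bool nowTouched = raw < contactLevel;
    if (nowTouched && !s.touched) s.toggle = !s.toggle;
    s.touched = nowTouched;
}
```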