This paper introduces a novel interactive human-computer interface and remote social communication system built on an augmented, high-fidelity audio headphone platform. The system, named Pokehead, currently uses the DUL embedded open-source accelerometer platform to gather three-axis position data and trigger real-time sonic events via specified head gestures. These gestures are mapped to sound models that convey particular messages, either to control software on the mobile device or to communicate with another, simultaneous Pokehead user. The project was motivated by the desire to take advantage of the ubiquity of headphone use in both social and private settings, together with networked mobile devices such as smartphones and portable media players. Our goal was to design an intuitive, autonomous, versatile, and practical interface using a simple, open-source implementation. Our rapid prototype proved robust enough to work in performance for demonstration purposes and serves as a working proof of concept. In this paper we provide a technical description of our prototype, illustrate the context and motivation behind the project, and offer insight into further uses and potential developments for the viability of such technology for users and developers alike.