Cluster 5: Sound for Virtual Reality: An Exploration of 3D Sound and Movement Using Mobile Devices and Wearable Sensors

Instructors: Jim Simmons, Professor of Computer-Assisted Recording and Director of the Electronic Music Ensemble, Fullerton College; Martin Jaroszewicz, Ph.D., UC Riverside

Prerequisites: Geometry, Algebra; basic knowledge of a programming language is recommended but not required

Course Description:
The goal of this cluster is to introduce students to computer science principles and applied psychoacoustics, with applications in virtual reality and performer-computer interaction in electroacoustic music. The course focuses on sound programming and introduces students to wearable inertial measurement units (IMUs). Students will study digital sound, field recording techniques, and audio analysis and editing. Students will learn Apple’s Swift programming language along with the AVFoundation and Core Motion frameworks. Working in groups in a unique lab setting, students will collaborate to design and build a sound-oriented iOS app while learning to create environmental field recordings of a variety of sound sources. Using open-source software tools to perform waveform and spectral analysis and “see” the components of sound, students will learn the basics of digital audio manipulation and sound processing.
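For a flavor of what those analysis tools compute, here is a minimal Swift sketch that turns a block of audio samples into a magnitude spectrum. It uses Apple’s Accelerate framework, which is not named in the syllabus; the function name and the power-of-two length assumption are illustrative only.

    import Foundation
    import Accelerate

    // Minimal sketch: the magnitude spectrum of a block of mono samples,
    // computed with a real-to-complex FFT. Assumes samples.count is a
    // power of two; values are unnormalized, which is fine for "seeing"
    // where the energy in a sound sits.
    func magnitudeSpectrum(of samples: [Float]) -> [Float] {
        let n = samples.count
        let log2n = vDSP_Length(log2(Double(n)))
        guard let setup = vDSP_create_fftsetup(log2n, FFTRadix(kFFTRadix2)) else { return [] }
        defer { vDSP_destroy_fftsetup(setup) }

        var real = [Float](repeating: 0, count: n / 2)
        var imag = [Float](repeating: 0, count: n / 2)
        var magnitudes = [Float](repeating: 0, count: n / 2)

        real.withUnsafeMutableBufferPointer { realBuf in
            imag.withUnsafeMutableBufferPointer { imagBuf in
                var split = DSPSplitComplex(realp: realBuf.baseAddress!,
                                            imagp: imagBuf.baseAddress!)
                // Pack the real-valued signal into the split-complex layout
                // the FFT routine expects.
                samples.withUnsafeBufferPointer { buf in
                    buf.baseAddress!.withMemoryRebound(to: DSPComplex.self,
                                                       capacity: n / 2) {
                        vDSP_ctoz($0, 2, &split, 1, vDSP_Length(n / 2))
                    }
                }
                vDSP_fft_zrip(setup, &split, 1, log2n, FFTDirection(kFFTDirection_Forward))
                // Squared magnitude of each frequency bin.
                vDSP_zvmags(&split, 1, &magnitudes, 1, vDSP_Length(n / 2))
            }
        }
        return magnitudes
    }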

  • Part 1: Field recording techniques and audio editing. Students will learn how to record indoor and outdoor sounds using field recorders, and how to edit and process the recordings to create source material for 3D sound spatialization (a Swift sketch of one such editing step appears after this list).
  • Part 2: iOS application development. Working with Swift 3.0, AVFoundation, and Core Motion, students will learn how to spatialize sound in 3D space and how to use IMU data to build applications ranging from real-time instrumental effects to sound for video games and virtual reality. Students will learn how to create simple interfaces and how to access gyroscope, compass, and accelerometer data on iPads, iPhones, the Apple Watch, and wearable IMUs (a head-tracking sketch follows the editing example below).
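As a taste of Part 1’s editing step, the sketch below loads a recorded clip into memory and applies a linear fade-out, a common cleanup step before a clip becomes a 3D source. It uses AVFoundation’s AVAudioFile and AVAudioPCMBuffer; the function name and the half-second default are assumptions, not part of the course materials.

    import AVFoundation

    // Sketch: load a recording and fade its tail out linearly before it is
    // handed to the spatializer. The fade length is an arbitrary default.
    func loadWithFadeOut(url: URL, fadeSeconds: Double = 0.5) throws -> AVAudioPCMBuffer? {
        let file = try AVAudioFile(forReading: url)
        let frames = AVAudioFrameCount(file.length)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                            frameCapacity: frames) else { return nil }
        try file.read(into: buffer)

        let fadeFrames = min(Int(file.processingFormat.sampleRate * fadeSeconds), Int(frames))
        let start = Int(frames) - fadeFrames
        for channel in 0..<Int(buffer.format.channelCount) {
            guard let data = buffer.floatChannelData?[channel] else { continue }
            for i in 0..<fadeFrames {
                // Ramp the gain from 1 down to 0 across the fade region.
                data[start + i] *= Float(fadeFrames - i) / Float(fadeFrames)
            }
        }
        return buffer
    }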
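And a minimal sketch of Part 2’s core idea: an AVAudioEnvironmentNode renders a mono source at a point in 3D space while Core Motion attitude updates steer the listener’s orientation. The class and file names here are hypothetical, and the direct yaw/pitch/roll mapping is a simplification that real head-tracking code would refine per device orientation.

    import AVFoundation
    import CoreMotion

    // Sketch: head-tracked 3D playback of a bundled mono recording.
    // "field-recording.caf" and the SpatialPlayer name are placeholders.
    final class SpatialPlayer {
        private let engine = AVAudioEngine()
        private let environment = AVAudioEnvironmentNode()
        private let player = AVAudioPlayerNode()
        private let motion = CMMotionManager()

        func start() throws {
            guard let url = Bundle.main.url(forResource: "field-recording",
                                            withExtension: "caf") else { return }
            let file = try AVAudioFile(forReading: url)

            engine.attach(environment)
            engine.attach(player)
            // A mono connection lets the environment node place the source in 3D.
            let mono = AVAudioFormat(standardFormatWithSampleRate: file.processingFormat.sampleRate,
                                     channels: 1)
            engine.connect(player, to: environment, format: mono)
            engine.connect(environment, to: engine.mainMixerNode, format: nil)

            // Put the source two meters in front of the listener and render
            // it binaurally for headphones.
            player.position = AVAudio3DPoint(x: 0, y: 0, z: -2)
            player.renderingAlgorithm = .HRTF

            try engine.start()
            player.scheduleFile(file, at: nil, completionHandler: nil)
            player.play()

            // Steer the listener's orientation from the device's IMU.
            motion.deviceMotionUpdateInterval = 1.0 / 30.0
            motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
                guard let attitude = data?.attitude else { return }
                // Core Motion reports radians; the environment node wants
                // degrees. Treating the two sets of angles as interchangeable
                // is an assumption that depends on how the device is held.
                self?.environment.listenerAngularOrientation = AVAudio3DAngularOrientation(
                    yaw: Float(attitude.yaw * 180 / .pi),
                    pitch: Float(attitude.pitch * 180 / .pi),
                    roll: Float(attitude.roll * 180 / .pi))
            }
        }
    }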