SOUND FOR MOBILE iOS DEVICES WITH AN EXPLORATION OF IMMERSIVE 3D SOUND SCENES
PREREQUISITES: Geometry, Algebra; basic knowledge of a programming language is recommended but not required
INSTRUCTORS: Jim Simmons, Professor, Computer-Assisted Recording, and Director, Electronic Music Ensemble, Fullerton College; Dr. Martin Jaroszewicz, Ph.D., UC Riverside
The goal of this cluster is to introduce students to computer science principles and applied psychoacoustics through sound design for mobile devices, with an emphasis on creating 3D sound environments for video games and other applications. Students will study the physics of sound, field recording techniques, and audio analysis and editing. Students will learn Apple's new programming language, Swift, along with the Xcode development environment and the AVFoundation framework, giving them an introduction to sound programming and the latest iOS technology. Working in groups in a unique lab setting, students will collaborate to design, build, and complete a sound-oriented iOS app while learning to create environmental field recordings of a variety of sound sources. Using open-source software tools for waveform and spectral analysis to "see" the components of sound, students will learn the basics of digital audio manipulation and sound processing with Apple's Audio Units (AU) and the AVFoundation framework.
- Part 1, Field recording techniques and audio editing: Students will learn how to record outdoor and indoor sounds using field recorders, and how to edit and process the recorded sounds to create sound sources for mobile devices.
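Once a field recording has been imported into a project, AVFoundation can open it and report its basic properties before any editing begins. A minimal sketch in current Swift syntax, assuming a hypothetical recording named "birds.wav" has been added to the app bundle:

```swift
import AVFoundation

// Open a field recording from the app bundle and inspect its format.
// "birds.wav" is a placeholder name for illustration only.
if let url = Bundle.main.url(forResource: "birds", withExtension: "wav") {
    do {
        let file = try AVAudioFile(forReading: url)
        let format = file.processingFormat
        let seconds = Double(file.length) / format.sampleRate
        print("Sample rate: \(format.sampleRate) Hz, " +
              "channels: \(format.channelCount), " +
              "duration: \(seconds) s")
    } catch {
        print("Could not open recording: \(error)")
    }
}
```

Inspecting the sample rate and channel count up front matters later in the course: sources intended for 3D placement need to be mono, so stereo field recordings are typically split or mixed down during editing.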
- Part 2, iOS application development: Working with Swift 2.0 and the AVFoundation framework, students will learn how to play back sound sources at specific points in a 3D space and how to apply signal processors, creating applications ranging from instrumental effects to sound for video games and virtual reality. Students will learn how to create simple interfaces and how to access gyroscope, compass, and accelerometer data on the iPad, iPhone, and Apple Watch.
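The two ideas in Part 2, placing a source at a point in 3D space and steering the listener from motion sensors, can be sketched with AVFoundation's AVAudioEnvironmentNode and CoreMotion. This uses current Swift syntax rather than Swift 2.0 (some method names have since changed), and the routing shown is one common setup, not the course's specific code:

```swift
import AVFoundation
import CoreMotion

let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()
let player = AVAudioPlayerNode()

engine.attach(environment)
engine.attach(player)

// Sources must be mono for the environment node to spatialize them.
let mono = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1)
engine.connect(player, to: environment, format: mono)
engine.connect(environment, to: engine.mainMixerNode, format: nil)

// Place the source 2 m in front of and 1 m above the listener.
player.position = AVAudio3DPoint(x: 0, y: 1, z: -2)
environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)

// Steer the listener's orientation from the device's attitude
// (derived from the gyroscope and accelerometer).
let motion = CMMotionManager()
if motion.isDeviceMotionAvailable {
    motion.deviceMotionUpdateInterval = 1.0 / 30.0
    motion.startDeviceMotionUpdates(to: .main) { data, _ in
        guard let attitude = data?.attitude else { return }
        // CoreMotion reports radians; AVAudio3DAngularOrientation
        // expects degrees.
        environment.listenerAngularOrientation = AVAudio3DAngularOrientation(
            yaw: Float(attitude.yaw * 180 / .pi),
            pitch: Float(attitude.pitch * 180 / .pi),
            roll: Float(attitude.roll * 180 / .pi))
    }
}
```

With this routing in place, scheduling a mono buffer or file on the player node and starting the engine yields a source that stays fixed in space as the device, and hence the listener, turns.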