John Woodruff has twenty years of experience developing innovative sound-processing technology. He has taken dozens of audio software features from concept to launch for leading companies including Apple, Samsung, Facebook, LG, Lenovo, Knowles, and others, and has deep experience in feasibility research, algorithm design, machine learning, real-time audio software, and project management.
Read on to learn more about the audio innovations John has helped to develop. If you think we could help with your project, contact us and let’s discuss it!
Apple’s Hearing Aid Feature
Apple’s AirPods Pro headphones have both a noise cancelling mode and a transparency mode, giving you control over how much of your surroundings you hear. Transparency mode lets outside sound in, so you can hear what's going on around you. Recently Apple received FDA approval for the Hearing Aid feature, which can amplify soft sounds and adjust the relative balance of low, mid, and high frequencies. This lets users customize their experience to make sounds more audible, with adjustments easily accessible in the Hearing Control Center module. When Hearing Aid is enabled, the Conversation Boost sub-feature focuses your AirPods Pro on the person talking in front of you and lets you control the level of background noise.
Apple’s Headphone Accommodations
Headphone Accommodations is an accessibility feature released in iOS 14, intended to improve sound clarity on Apple and Beats headphones for users with hearing difficulty. Offering nine preset settings and the option to import the results of a hearing test during setup, the feature amplifies soft sounds and adjusts specific frequencies to match a user’s unique hearing needs. Headphone Accommodations can be applied to media playback, phone calls, FaceTime calls, Live Listen mode, and Transparency mode on AirPods Pro.
Natural Sound in Augmented Hearing
Have you ever been in a noisy restaurant and struggled to hear the person right across the table? If so, you are not alone: it is a common problem even for people without clinical hearing loss. Today’s noise cancelling headphones, like Apple’s AirPods Pro, offer a unique opportunity to turn down the background noise and let through only the voice of your conversation partner.
In this system we use the latest “tiny ML” techniques to integrate information from multiple microphones, not only isolating the voice of interest (your conversation partner across the table) from the background noise, but also isolating your own voice so that it can be processed selectively, all while minimizing the filtering delay to create a natural listening experience.
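To make the multi-microphone idea concrete, here is a minimal sketch (not Apple's actual implementation, which uses learned models) of the classic delay-and-sum principle that underlies spatial filtering: a two-microphone array reinforces sound arriving from directly in front while cancelling sound whose inter-microphone delay puts it out of phase. All signal parameters below are illustrative assumptions.

```python
import math

def simulate_mics(freq_hz, delay_samples, n=4800, sr=48000):
    """Two-mic capture of a sine tone; delay_samples is the
    inter-microphone arrival delay set by the source's direction."""
    mic1 = [math.sin(2 * math.pi * freq_hz * t / sr) for t in range(n)]
    mic2 = [math.sin(2 * math.pi * freq_hz * (t - delay_samples) / sr) for t in range(n)]
    return mic1, mic2

def delay_and_sum(mic1, mic2):
    """Steer toward broadside: sum the channels with zero steering delay."""
    return [(a + b) / 2 for a, b in zip(mic1, mic2)]

def rms(x):
    return math.sqrt(sum(v * v for v in x) / len(x))

# On-axis talker: sound reaches both mics simultaneously -> preserved.
on1, on2 = simulate_mics(1000, 0)
# Off-axis noise: arrives 24 samples (half a 1 kHz period) later -> cancelled.
off1, off2 = simulate_mics(1000, 24)
print(round(rms(delay_and_sum(on1, on2)), 3))   # 0.707 (full level)
print(round(rms(delay_and_sum(off1, off2)), 3)) # 0.0 (strong attenuation)
```

Real systems replace the fixed sum with adaptive, ML-driven filters that track moving talkers and work across all frequencies, but the core benefit is the same: directionally selective listening from multiple microphones.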
Apple’s Reduce Loud Sounds in iOS and iPadOS
The World Health Organization estimates that over 1 billion young people are at risk of hearing loss due to their listening habits. To raise awareness and improve listening safety, Apple introduced the Reduce Loud Sounds feature in iOS 14. This feature lets users set an upper bound in dBA SPL (e.g. 85 dBA SPL) so that no audio content exceeds that limit. Users, and parents on behalf of their children, can thereby ensure that headphones never produce an unsafe level.
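As a rough illustration of how such a limit could be enforced (this is a hypothetical sketch, not Apple's implementation), one can estimate the acoustic output level from the content level and the headphone's full-scale output, then attenuate by however many decibels exceed the cap:

```python
import math

def limiter_gain(content_dbfs, headphone_full_scale_spl, limit_dba=85.0):
    """Hypothetical example: given the measured content level (dBFS) and the
    headphone's output at digital full scale (dB SPL), return the linear
    gain needed so playback never exceeds limit_dba."""
    estimated_spl = headphone_full_scale_spl + content_dbfs
    excess_db = max(0.0, estimated_spl - limit_dba)
    return 10 ** (-excess_db / 20)  # convert dB of attenuation to linear gain

# Content at -3 dBFS on headphones producing 100 dB SPL at full scale:
# estimated level is 97 dBA, i.e. 12 dB over an 85 dBA limit.
g = limiter_gain(-3.0, 100.0)
print(round(20 * math.log10(g), 1))  # -12.0 dB of attenuation
```

A production feature must additionally know the true acoustic sensitivity of each headphone model and track levels over time (exposure is a function of both loudness and duration), but the core arithmetic is this simple dB bookkeeping.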
Tracking and Separating Sound Sources
For a long time I have been fascinated by the fact that we can pay attention to an individual sound or sound source within a mix of sounds, and by how we can modulate our attention over time to change what we are listening to. I started to appreciate this while actively producing and recording music: I would focus my attention on each part in turn and often marvel at how much detail I could hear in spite of