The dark magic that crops up everywhere, is the root of all evil, and that no one seems to fully understand (including me…) - phase finds its way into everything I do.
What is phase? In a nutshell, it represents delay. If you delay a signal, technically that’s phase shift. Things get interesting when different frequencies have differing amounts of phase shift. Most people working with audio know it as the thing that makes your mics sound funny. If you sum mics with differing amounts of delay (such as when they’re at different distances from a source), certain frequencies get boosted and others attenuated. Unfortunately, people have the habit of referring to polarity as phase. Flipping polarity is different from a 180 degree phase shift (which implies a shift in time).
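To make that concrete, here’s a rough sketch (in Python, with made-up numbers: a 1 ms delay between two otherwise identical signals) of how summing a signal with a delayed copy of itself boosts some frequencies and nulls others:

```python
# Sketch: comb filtering from summing a signal with a delayed copy of itself.
# Assumes two identical sources at equal level (e.g. two mics on one source)
# where one path is 1 ms longer -- about 0.34 m of extra distance at ~344 m/s.
# These numbers are illustrative only.
import numpy as np

delay_s = 0.001                                   # 1 ms extra delay on one path
freqs = np.array([250.0, 500.0, 1000.0, 1500.0, 2000.0])   # Hz

# Summing x(t) + x(t - delay): the combined gain is |1 + e^{-j*2*pi*f*delay}|
phase_shift = 2 * np.pi * freqs * delay_s         # phase offset at each frequency
magnitude = np.abs(1 + np.exp(-1j * phase_shift)) # 2.0 = +6 dB boost, 0.0 = full null

for f, m in zip(freqs, magnitude):
    db = 20 * np.log10(max(m, 1e-9))              # clamp so exact nulls don't blow up the log
    print(f"{f:7.0f} Hz : gain {db:+7.1f} dB")
# With 1 ms of delay, 500 Hz and 1500 Hz land in deep nulls while
# 1000 Hz and 2000 Hz sum constructively -- the classic comb filter.
```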
Where else does phase manifest? It’s hard to find anywhere it doesn’t. Most roads in the projects I work on lead to phase. When you look at a typical speaker (ignoring coax designs), you have multiple sources of sound (multiple drivers) which overlap in their frequency ranges. What happens when you move around a speaker? The relative distances between you and each driver change. This results in the signal from one driver arriving at your ear at a slightly different time than the other (speed of sound is roughly 344 m/s, you can do the math…). What’s the result? Phase shift! This change in timing is a major factor determining the directivity (off-axis response) of a speaker, which ultimately determines how it’ll interact with a room (on axis might be fine, but the reflections you’re hearing off of that bare wall in your room could be riddled with problems caused by phase…). Speaking of reflections in your room - phase!
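If you’d rather not do the math by hand, here’s a quick sketch with hypothetical numbers (a 0.15 m driver spacing, a listener 20 degrees off axis, a 2 kHz crossover - none of these come from any particular speaker):

```python
# Sketch: how a small driver spacing turns into arrival-time offset and phase.
# All values below are hypothetical, purely for illustration.
import numpy as np

c = 344.0               # speed of sound, m/s
spacing = 0.15          # vertical distance between tweeter and woofer, m
angle = np.radians(20)  # how far above/below the axis you're listening

# Far-field approximation: the farther driver's path is longer by spacing*sin(angle).
path_diff = spacing * np.sin(angle)      # metres
time_diff = path_diff / c                # seconds

f_xo = 2000.0                            # assumed crossover frequency, Hz
phase_deg = 360.0 * f_xo * time_diff     # phase offset between the drivers at f_xo

print(f"path difference : {path_diff * 1000:.1f} mm")
print(f"arrival offset  : {time_diff * 1e6:.0f} us")
print(f"phase at {f_xo:.0f} Hz : {phase_deg:.0f} degrees")
# Roughly 100+ degrees of offset in the crossover region -- more than enough
# to change how the two drivers sum, which is why the off-axis response shifts.
```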
Another major factor relating speakers and phase is the crossover filters, which determine which frequencies go to which drivers. These filters introduce their own phase shift. Not only does it affect the off-axis response, but this time it also affects the on-axis response. It baffles me how so many people claim that phase shift is inaudible. Unfortunately, that conclusion was arrived at through flawed testing. In a steady-state signal (like a square wave), the relative timing of harmonics doesn’t matter much. What does matter is the timing of transients. If half of a snare’s sound is delayed relative to the other half, it’s not difficult to see how this would lead to very audible effects. Much of my work with speaker design has revolved around exploring these effects of phase shift. With nothing but a simple filter, the soundstage of a system can be changed quite drastically.
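As a rough illustration of what a crossover does to timing, here’s a sketch using a generic 4th-order Linkwitz-Riley at 2 kHz (a textbook example, not any particular design of mine). The summed output measures flat in level, yet the phase winds up toward a full turn across the band:

```python
# Sketch: phase rotation from a generic 4th-order Linkwitz-Riley crossover.
# Illustrative only -- 2 kHz crossover, 48 kHz sample rate.
import numpy as np
from scipy import signal

fs = 48000
f_xo = 2000.0

# An LR4 branch = two cascaded 2nd-order Butterworth sections.
b_lp, a_lp = signal.butter(2, f_xo, btype="low", fs=fs)
b_hp, a_hp = signal.butter(2, f_xo, btype="high", fs=fs)

def lr4(x, b, a):
    # Apply the 2nd-order section twice to get the 4th-order branch.
    return signal.lfilter(b, a, signal.lfilter(b, a, x))

# Feed an impulse (an idealised transient) through both branches and sum them,
# the way the two drivers would sum acoustically on axis.
impulse = np.zeros(4096)
impulse[0] = 1.0
summed = lr4(impulse, b_lp, a_lp) + lr4(impulse, b_hp, a_hp)

# Inspect the summed response: flat magnitude, lots of phase rotation.
w, h = signal.freqz(summed, worN=4096, fs=fs)
mag_db = 20 * np.log10(np.abs(h))
phase_deg = np.degrees(np.unwrap(np.angle(h)))

for f in (200.0, 2000.0, 20000.0):
    i = int(np.argmin(np.abs(w - f)))
    print(f"{w[i]:8.0f} Hz : {mag_db[i]:+6.2f} dB, phase {phase_deg[i]:+7.1f} deg")
# The level stays flat (this crossover sums to an allpass), yet the phase winds
# from ~0 toward -360 degrees across the band -- the frequency components of a
# transient no longer line up in time even though a sine sweep looks perfect.
```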
Phase once again pops up when it comes to my work exploring room acoustics and horns. In one way or another, it’s responsible for a significant part of what we hear. Room modes, reflections, and the flare rate of a horn are all largely driven by phase. You can’t escape it.
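On the room side, a quick example: the axial modes of a room sit where the round-trip reflection comes back in phase with the source. Here’s that arithmetic for a hypothetical 5 m × 4 m × 2.7 m room:

```python
# Sketch: axial room-mode frequencies. A mode occurs where the round-trip
# path (2 x dimension) gives a phase shift that's a multiple of 360 degrees,
# i.e. f = n * c / (2 * L). Room dimensions below are hypothetical.
c = 344.0                                                    # speed of sound, m/s
dimensions = {"length": 5.0, "width": 4.0, "height": 2.7}    # metres

for name, L in dimensions.items():
    modes = [n * c / (2 * L) for n in range(1, 4)]           # first three axial modes
    pretty = ", ".join(f"{f:.0f} Hz" for f in modes)
    print(f"{name:>6} ({L} m): {pretty}")
# Direct and reflected sound reinforce each other at these frequencies,
# which is why certain bass notes boom in one room and vanish in another.
```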
Lastly, and most importantly, it allows me to do what I do with mastering. Playing with phase is how I’m able to craft a song and bring out a certain element, or potentially help fight a problem. Tools like EQ, M/S processing, and width tools allow me to sculpt the soundstage and depth with nothing but phase. Remarkable, and an endless source of curiosity.
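For the curious, here’s a bare-bones sketch of the mid/side idea behind a width tool - the side channel is just the difference between left and right (the part that isn’t in phase between the channels), and scaling it changes the perceived width. This is only an illustration, not my actual processing chain:

```python
# Sketch: mid/side width adjustment. Illustrative values only.
import numpy as np

def adjust_width(left, right, width=1.0):
    """Widen (>1) or narrow (<1) a stereo signal via mid/side processing."""
    mid = 0.5 * (left + right)      # what the two channels have in common
    side = 0.5 * (left - right)     # what differs between the two channels
    side *= width                   # scaling the side channel sets the width
    return mid + side, mid - side   # decode back to left/right

# Tiny demo: a 440 Hz tone panned slightly left, widened by 50%.
t = np.linspace(0, 0.01, 480, endpoint=False)
left = 1.0 * np.sin(2 * np.pi * 440 * t)
right = 0.6 * np.sin(2 * np.pi * 440 * t)

wide_l, wide_r = adjust_width(left, right, width=1.5)
print(f"original L/R peak: {left.max():.2f} / {right.max():.2f}")
print(f"widened  L/R peak: {wide_l.max():.2f} / {wide_r.max():.2f}")
# The difference between the channels grows, so the image spreads wider,
# without touching the mid (common) content.
```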