I'm evaluating AudioKit for the first time after not being able to solve this problem with AVFoundation. Basically, I just need to know exactly when an output device actually plays a sound buffer. The problem becomes more important with BT headphones (like AirPods) where the latency is >100ms.
AVFoundation, and Audio Unit at a lower level, expose a variety of latency and I/O buffer properties to inspect, but sadly these just aren't reliably accurate in all cases. (E.g., on a Mac running through Catalyst, the AVAudioEngine input latency is always reported as 0!)
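For context, these are the kinds of properties I've been inspecting (a minimal sketch; `AVAudioSession` is iOS-only, and the reported values are exactly what I've found to be unreliable):

```swift
import AVFoundation

// Inspect the latency values AVFoundation reports.
// These are the properties in question: they can read as 0
// in some configurations (e.g. Catalyst on macOS).
let engine = AVAudioEngine()

// iOS / Catalyst only:
let session = AVAudioSession.sharedInstance()
print("session output latency (s):", session.outputLatency)
print("session IO buffer duration (s):", session.ioBufferDuration)

// Available on the engine's IO nodes directly:
print("output presentation latency (s):",
      engine.outputNode.presentationLatency)
print("input presentation latency (s):",
      engine.inputNode.presentationLatency)
```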
With AudioKit, if I use a class with a start(at:) function, does the "at" time account for output latency so that the sound actually plays at that time?
Or are there other facilities within AudioKit to clock sync with output events?
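To be concrete about what I'm trying to achieve, here is a sketch of the AVFoundation-level approach I'd use if the latency values were trustworthy: subtract the reported output latency from the target host time when scheduling. The function name `scheduleCompensated` and its parameters are hypothetical illustration, not an AudioKit API; whether this actually lines up with the hardware on AirPods is the open question.

```swift
import AVFoundation

// Hypothetical helper: schedule a buffer so it should emerge from the
// hardware at `targetHostTime`, by pulling the scheduled render time
// earlier by the reported output latency. Assumes `player` is attached
// and connected to a running AVAudioEngine. Only as accurate as
// presentationLatency itself -- which is the crux of the question.
func scheduleCompensated(buffer: AVAudioPCMBuffer,
                         on player: AVAudioPlayerNode,
                         engine: AVAudioEngine,
                         targetHostTime: UInt64) {
    let latencySeconds = engine.outputNode.presentationLatency
    let latencyTicks = AVAudioTime.hostTime(forSeconds: latencySeconds)
    // Render earlier so the audible output lands on the target time.
    let when = AVAudioTime(hostTime: targetHostTime &- latencyTicks)
    player.scheduleBuffer(buffer, at: when)
}
```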
question from:
https://stackoverflow.com/questions/65864015/syncing-with-high-latency-output-devices-like-bluetooth-headphones