I'm wondering what the relation is between operating an audio device at a lower latency and it using more CPU power.
Can you explain it to me?
To achieve lower latency, the CPU has to service interrupts from the device more often, which means moving the data in smaller chunks, and therefore moving more chunks per second. It turns out that the setup and teardown (the overhead) of moving a chunk of data to or from a device is significant, and that cost is largely fixed per chunk rather than per byte. So the more chunks per second you handle, the higher the load on the CPU; the fewer, the lower.
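For a rough feel of the numbers, here's a back-of-the-envelope sketch in Python. The 48 kHz sample rate and the ~10 µs of fixed per-chunk cost are assumptions for illustration, not measured values: halving the buffer halves the latency, but doubles the number of chunks the CPU has to service per second, and so doubles the fixed overhead.

```python
# Assumed figures for illustration only: a 48 kHz stream and ~10 microseconds
# of fixed cost per chunk (interrupt handling, driver setup/teardown).
SAMPLE_RATE_HZ = 48_000
PER_CHUNK_OVERHEAD_US = 10

def show_tradeoff(buffer_frames: int) -> None:
    """Print the latency vs. CPU-overhead trade-off for one buffer size."""
    latency_ms = buffer_frames / SAMPLE_RATE_HZ * 1000        # one buffer's worth of audio
    chunks_per_sec = SAMPLE_RATE_HZ / buffer_frames           # interrupts the CPU must service
    overhead_ms_per_sec = chunks_per_sec * PER_CHUNK_OVERHEAD_US / 1000
    print(f"{buffer_frames:5d} frames: {latency_ms:6.2f} ms latency, "
          f"{chunks_per_sec:7.1f} chunks/s, "
          f"{overhead_ms_per_sec:5.2f} ms of overhead per second")

for frames in (2048, 512, 128, 32):
    show_tradeoff(frames)
```

Running it shows that going from a 2048-frame buffer (~43 ms of latency) to a 32-frame buffer (~0.7 ms) multiplies the interrupt rate, and hence the fixed overhead, by 64.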
However, modern CPUs are so powerful that the difference shouldn't be noticeable.