
I'm wondering what the relation is between operating an audio device at a lower latency and it using more CPU power.

Can you explain it to me?

Hedge

1 Answer


To achieve lower latency, the CPU has to service interrupts from the device more often and move data in smaller chunks - so more chunks have to be moved per second. It turns out that the setup and teardown (the overhead) of moving a chunk of data to or from a device is significant. So the fewer chunks per second you're handling, the lower the load on the CPU.
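A quick back-of-envelope sketch of the trade-off described above. The sample rate and the per-chunk overhead figure here are illustrative assumptions, not measurements from any particular device or driver:

```python
# Illustrative only: how buffer (chunk) size trades latency against
# per-second interrupt-handling overhead. The numbers are assumptions.

SAMPLE_RATE = 48_000   # samples per second (a common audio rate)
OVERHEAD_US = 20       # assumed fixed CPU cost per chunk, in microseconds

for buffer_samples in (64, 256, 1024):
    latency_ms = buffer_samples / SAMPLE_RATE * 1000
    chunks_per_sec = SAMPLE_RATE / buffer_samples
    overhead_ms_per_sec = chunks_per_sec * OVERHEAD_US / 1000
    print(f"buffer={buffer_samples:5d} samples  "
          f"latency={latency_ms:5.2f} ms  "
          f"interrupts/s={chunks_per_sec:6.1f}  "
          f"fixed overhead={overhead_ms_per_sec:5.2f} ms of CPU per second")
```

Halving the buffer size halves the latency but doubles the interrupt rate, and with it the fixed per-chunk overhead - which is why the total overhead grows as buffers shrink even though the amount of audio data moved per second is unchanged.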

However, modern CPUs are so powerful that the difference shouldn't be noticeable.

Jamie Hanrahan
  • Spot on. A reference would be nice but, heck, +1 anyway because this is the correct answer. – misha256 Sep 03 '15 at 02:40
  • My analogy for this is working in an office where people keep interrupting you. Each interruption involves a lot of your time (who are you, why are you here, what do you need from me, let me see what I can do, etc.). If these interruptions happen frequently enough, you spend all your time context switching and getting no real work done. – misha256 Sep 03 '15 at 02:45