
The theoretical maximum transfer speed of Wi-Fi is about 7 Gbps or 900 MB/s.

Apparently NVMe, for example, can do up to 16 GB/s. That's more than an order of magnitude faster than Wi-Fi.
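As a rough sanity check on those two figures (a minimal sketch in Python; both peak numbers are simply the ones quoted above, not measurements):

```python
# Back-of-envelope comparison of the peak figures quoted in the question.
wifi_gbps = 7.0                  # quoted theoretical Wi-Fi maximum (gigabits/s)
wifi_gb_per_s = wifi_gbps / 8    # ≈ 0.875 GB/s (8 bits per byte)

nvme_gb_per_s = 16.0             # quoted NVMe figure (gigabytes/s)

print(f"Wi-Fi peak: {wifi_gb_per_s:.3f} GB/s")
print(f"NVMe peak : {nvme_gb_per_s:.1f} GB/s")
print(f"Ratio     : {nvme_gb_per_s / wifi_gb_per_s:.1f}x")  # ≈ 18x, just over one order of magnitude
```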

Wi-Fi literally uses light as its medium though, which is pretty fast...

Why is Wi-Fi so slow?

theonlygusti
  • Wi-Fi does not use light, it uses radio waves. Furthermore, the maximum distance is much greater. The maximum length for PCIe 3 is 8 inches (~20.3 cm), and it uses copper traces instead of air. If you limited Wi-Fi to short ranges without walls or people in between, you could use terahertz radio waves; using that technology, scientists have already transferred 100 Gbps over a few meters. – Robert Nov 01 '19 at 17:07
  • The practicalities of radio communications are far too wide a set of topics to explain in the context of this website. Mainly, communications are limited by things like [signalling rate](https://en.wikipedia.org/wiki/Data_signaling_rate), synchronisation and [bandwidth](https://en.wikipedia.org/wiki/Bandwidth_(signal_processing)); see the capacity sketch after these comments. Just saying "it's basically the same as light" is a gross oversimplification and assumes that "light" has infinitely fast signalling and bandwidth. – Mokubai Nov 01 '19 at 17:20
  • @Ramhound I think you did the maths there wrong. The question isn't about a speed I'm experiencing anyway, I just looked up the theoretical maximum Wi-Fi transfer speeds. – theonlygusti Nov 01 '19 at 18:22
  • @Robert why does having a larger maximum distance slow down the transfer speed so much? – theonlygusti Nov 01 '19 at 18:24
  • @Mokubai [this "Related" question](https://superuser.com/q/1217280/325613) is basically the same question. Why is it so well-received but mine is too broad? – theonlygusti Nov 01 '19 at 18:29
  • @theonlygusti Probably a range of things. For one thing the question shows at least some level of awareness of the problem at hand. They are also focused on solving an actual problem *within* the larger protocol rather than "Describe the entire protocol". Their problem requires a general knowledge of the protocol to answer, but does not require *describing* a large number of topics from basic principles and in great detail in order to "just" answer. – Mokubai Nov 01 '19 at 19:28
  • In that related question they talk about how they believe it works and go on to ask very specific questions `what does, at the concept/protocol level cause this?` while your question could, possibly harshly, be summed up as "why are two very different things so different?" – Mokubai Nov 01 '19 at 19:42
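To make the bandwidth point above concrete: any radio link is bounded by the Shannon–Hartley capacity, C = B·log₂(1 + SNR). A minimal sketch, using illustrative channel widths and an assumed signal-to-noise ratio rather than measured Wi-Fi values:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Illustrative numbers only: 20 MHz and 160 MHz are common Wi-Fi channel
# widths; 2 GHz is a hypothetical very wide channel. SNR of 1000 ≈ 30 dB.
for bw_mhz in (20, 160, 2000):
    cap = shannon_capacity_bps(bw_mhz * 1e6, 1000.0)
    print(f"{bw_mhz:>5} MHz channel: ~{cap / 1e9:.2f} Gbps upper bound")
```

Whatever the modulation scheme, no protocol can exceed this bound for a given channel width and signal-to-noise ratio, which is why headline Wi-Fi rates rely on wide channels and multiple spatial streams.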

1 Answer


Wi-Fi operates in a noisy, shared radio environment. Signals from other computers, from other devices that use the same frequencies, and general radio-frequency noise all interfere with its transmissions, so data often has to be retransmitted when signals collide.
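A toy illustration of that collision cost (a minimal sketch, not a model of the real 802.11 MAC; the collision rates below are made up for illustration):

```python
# Toy model: goodput on a shared medium where a fraction of frames collide
# and must be resent. Ignores backoff and other protocol overhead.

def effective_throughput_gbps(phy_rate_gbps: float, p_collision: float) -> float:
    """If each frame collides independently with probability p, it is sent
    on average 1 / (1 - p) times, so useful throughput scales by (1 - p)."""
    return phy_rate_gbps * (1.0 - p_collision)

for p in (0.0, 0.1, 0.3, 0.5):
    print(f"collision rate {p:.0%}: ~{effective_throughput_gbps(7.0, p):.2f} Gbps of useful data")
```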

Radio waves also bounce off objects in the environment, and when some reflections arrive alongside the direct signal, the receiver in the PC or router has to clean up the result. Sorting the signal out from the noise and from those echoes takes time, which further reduces throughput.

By comparison, NVMe is on a copper bus designed to reject interference, uses dedicated lanes so there's no need to slow down to detect collisions, much less repeat data when there's a collision, and it has much greater bandwidth on that signal bus, so it can move data much faster.
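For comparison, the NVMe figure in the question lines up with simple PCIe lane arithmetic (a sketch assuming a PCIe 5.0 x4 link; the lane count and generation are assumptions, not something stated in the question):

```python
# Rough PCIe bandwidth arithmetic for an NVMe drive on an assumed Gen5 x4 link.
GT_PER_S = 32.0            # PCIe 5.0 raw signalling rate per lane (gigatransfers/s)
ENCODING = 128.0 / 130.0   # 128b/130b line-code efficiency
LANES = 4                  # typical NVMe SSD link width

per_lane_gb_s = GT_PER_S * ENCODING / 8   # bits -> bytes
total_gb_s = per_lane_gb_s * LANES

print(f"per lane: ~{per_lane_gb_s:.2f} GB/s")
print(f"x4 link : ~{total_gb_s:.2f} GB/s")  # ≈ 15.75 GB/s before protocol overhead
```

Each lane is a short, point-to-point copper connection, so the link can run at that rate continuously without contending with other devices for the medium.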

K7AAY