Network Latencies
Description:
What is latency, and what are the forces driving and inhibiting the development of low-latency network infrastructure?
Latency is the amount of time needed for a packet to travel from place A to place B and for a response packet to return from B to A (the round-trip time).
[t=0, place=A] send packet to B ==> [t=25, place=B] send response to A ==> [t=50, place=A] response received (round-trip latency = 50)
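To make the definition concrete, here is a minimal sketch of measuring round-trip latency from a client. It is not part of the original article: the host and port are hypothetical and stand in for any reachable echo-style service.

<syntaxhighlight lang="python">
import socket
import time

# Hypothetical echo-style server; replace with any service that echoes a request.
HOST, PORT = "echo.example.com", 7

def measure_rtt(host: str, port: int) -> float:
    """Return the round-trip time of one small request/response, in milliseconds."""
    with socket.create_connection((host, port), timeout=5) as sock:
        start = time.perf_counter()      # [t=0, place=A] send packet to B
        sock.sendall(b"ping")
        sock.recv(1024)                  # response from B arrives back at A
        return (time.perf_counter() - start) * 1000.0

if __name__ == "__main__":
    print(f"round-trip latency: {measure_rtt(HOST, PORT):.2f} ms")
</syntaxhighlight>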
Network latency and bandwidth are the two major hardware inhibitors for parallel computing. This holds for clusters and grids, but also for system-on-a-chip solutions like the Cell architecture, which is in effect a cluster/grid on a chip.
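A rough way to see why both factors matter is the common "latency plus size divided by bandwidth" cost model. The sketch below uses assumed figures (50 microseconds of latency, 1 Gbit/s of bandwidth) purely to illustrate that latency dominates the cost of the small messages that are typical in parallel computing.

<syntaxhighlight lang="python">
# Assumed figures, for illustration only.
LATENCY_S = 50e-6          # 50 microseconds of latency per message
BANDWIDTH_B_PER_S = 125e6  # 1 Gbit/s expressed as bytes per second

def transfer_time(size_bytes: int) -> float:
    """Estimated seconds to move one message: latency + size / bandwidth."""
    return LATENCY_S + size_bytes / BANDWIDTH_B_PER_S

for size in (64, 1024, 1_048_576):  # 64 B, 1 KiB, 1 MiB
    t = transfer_time(size)
    print(f"{size:>9} B: {t * 1e6:9.1f} us, latency share {LATENCY_S / t:6.1%}")
</syntaxhighlight>

For a 64-byte message the latency term accounts for nearly all of the transfer time, while for a 1 MiB message the bandwidth term dominates.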
Forces Stimulating the Development of Low-Latency Infrastructure:
- Demand for latency-sensitive applications:
  - Online computer games
  - High-performance computing (universities, pharmaceutical companies, other R&D departments)
  - Video conferencing
  - Voice over IP
- Many new market opportunities (high-performance computing for everybody):
  - Complex stock market analysis on demand
  - Remote control of equipment
  - Other complex, compute-intensive services for consumers
- Optical fibre technology
Forces Inhibiting Low-Latency Infrastructure Development:
- Speed of light (a hard physical lower bound; see the sketch after this list)
- Network switching technology
- Cost of replacing existing switches and cables or installing new ones
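To see why the speed of light is a hard inhibitor, a back-of-the-envelope sketch gives the lowest possible round-trip time over optical fibre for a given distance. The refractive index of 1.5 and the route distances below are approximate assumptions.

<syntaxhighlight lang="python">
C_VACUUM_KM_PER_S = 299_792                  # speed of light in vacuum
C_FIBRE_KM_PER_S = C_VACUUM_KM_PER_S / 1.5   # assumed fibre refractive index of ~1.5

def min_rtt_ms(distance_km: float) -> float:
    """Lowest physically possible round-trip time over fibre, in milliseconds."""
    return 2 * distance_km / C_FIBRE_KM_PER_S * 1000

# Approximate great-circle distances, for illustration.
for route, km in (("Amsterdam-London", 360), ("Amsterdam-New York", 5900)):
    print(f"{route}: at least {min_rtt_ms(km):.1f} ms")
</syntaxhighlight>

No amount of better switching or cabling can bring the round-trip time below this floor; it can only bring real networks closer to it.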