A Loss-event Driven Scalable Fluid Simulation Method For High-speed Networks
Abstract
The growth in the size and bandwidth of computer networks poses a research challenge: evaluating proposed TCP/IP protocols and the corresponding queuing policies at such scales. Simulation provides an easier and cheaper way to evaluate TCP proposals and queuing disciplines than experiments with real hardware. In this paper, the scalability problems of current simulation methods for high-speed networks are discussed. We then present a scalable, time-adaptive numerical simulation method, driven by loss events, that captures the dynamics of high-speed networks using fluid-based models. The new method uses loss events to dynamically adjust the step size of a numerical solver that integrates a system of differential equations representing the dynamics of protocols and node behaviors. A numerical analysis of the proposed method is presented, together with a simple simulation of high-speed TCP variants carried out with the method. The simulation results and analysis show that the time-adaptive method reduces computational time while achieving the same accuracy as a fixed step-size method. © 2009 Elsevier B.V. All rights reserved.
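To illustrate the idea of loss-event-driven step-size adaptation described in the abstract, the following is a minimal Python sketch. It integrates a simple AIMD-style fluid model (average window and queue ODEs) with forward Euler, and refines the time step only when the queue approaches the buffer limit, i.e., around a loss event. The model, constants, and function names here are illustrative assumptions, not the authors' exact equations or high-speed TCP variants.

# Minimal sketch of a loss-event-driven adaptive-step fluid simulation.
# All parameters and the simple AIMD fluid model are assumptions for illustration.

C      = 12500.0   # link capacity in packets/s (~100 Mb/s with 1 kB packets) -- assumed
B      = 500.0     # buffer size in packets -- assumed
N      = 50        # number of fluid TCP flows -- assumed
PROP   = 0.05      # propagation delay in seconds -- assumed
H_MAX  = 0.05      # coarse step used far from loss events (s)
H_MIN  = 1e-4      # fine step used around loss events (s)

def derivatives(W, q):
    """Fluid ODEs for the average window W and queue length q (AIMD fluid model)."""
    R   = PROP + q / C                         # round-trip time
    lam = 1.0 if q >= B else 0.0               # loss indication on buffer overflow
    dW  = 1.0 / R - (W / 2.0) * lam * (W / R)  # additive increase, multiplicative decrease
    dq  = N * W / R - C                        # aggregate arrival rate minus service rate
    if q <= 0.0 and dq < 0.0:
        dq = 0.0                               # an empty queue cannot drain further
    return dW, dq

def step_size(q):
    """Loss-event-driven step control: refine the step when the queue nears
    the buffer limit, i.e., when a loss event is imminent or in progress."""
    return H_MIN if q > 0.9 * B else H_MAX

def simulate(T=30.0):
    t, W, q = 0.0, 1.0, 0.0
    trace = []
    while t < T:
        h = step_size(q)
        dW, dq = derivatives(W, q)
        W = max(W + h * dW, 1.0)               # forward-Euler update of the window
        q = min(max(q + h * dq, 0.0), B)       # forward-Euler update of the queue
        t += h
        trace.append((t, W, q))
    return trace

if __name__ == "__main__":
    trace = simulate()
    print("final window %.1f pkts, final queue %.1f pkts" % (trace[-1][1], trace[-1][2]))

The design point mirrored here is that large steps suffice between loss events, where the fluid trajectories are smooth, while small steps are needed only in the short intervals around losses; this is what lets an adaptive solver match the accuracy of a fixed fine step at a fraction of the computational cost.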
Recommended Citation
S. Kumar et al., "A Loss-event Driven Scalable Fluid Simulation Method For High-speed Networks," Computer Networks, vol. 54, no. 1, pp. 112-132, Elsevier, Jan 2010.
The definitive version is available at https://doi.org/10.1016/j.comnet.2009.08.018
Department(s)
Computer Science
Keywords and Phrases
Congestion control; High-speed networks; Network modeling; Network simulation; Transport protocol
International Standard Serial Number (ISSN)
1389-1286
Document Type
Article - Journal
Document Version
Citation
File Type
text
Language(s)
English
Rights
© 2024 Elsevier, All rights reserved.
Publication Date
15 Jan 2010