Bibliography Details
Measuring the Internet
Authors: k. claffy
Published: CAIDA, 2000
URL: https://catalog.caida.org/paper/2000_ieee0001/
Entry Date: 2004-02-02
Abstract:
Internet traffic behavior has been resistant to modeling. The reasons derive from the Internet's evolution as a composition of independently developed and deployed (and by no means synergistic) protocols, technologies, and core applications. Moreover, this evolution, though "punctuated" by new technologies, has experienced no equilibrium thus far. The state of the art, or lack thereof, in high-speed measurement is neither surprising nor profound. It is a natural consequence of the economic imperatives in the current industry, where empirically grounded research in wide-area Internet modeling has been an obvious casualty. Specifically, the engineering know-how required to develop advanced measurement technologies, whether in software or hardware, is essentially the same skill set required to develop advanced routing and switching capabilities. Since the latter draw far greater interest, and profit, from the marketplace, that is where the industry allocates engineering talent. A common complaint about traffic measurement studies is that they do not sustain relevance in an environment where traffic, technology, and topology change faster than we can measure them. Moreover, the proliferation of media and protocols makes the acquisition of traffic data almost prohibitively complicated and costly. And finally, the time required to analyze and validate data means that most research efforts are obsolete by the time findings are published. Thus, far from having an analytic handle on the Internet, we lack in most cases the ability even to measure traffic at a granularity that would enable infrastructure-level research. As a result, while the core of the Internet continues its rapid evolution, measurement and modeling of it progress at a leisurely pace.