My main interest is in measuring, analyzing (in the widest possible sense), and describing realistic network traffic, i.e., traffic from "live" networks, with the purpose of applying our (hopefully) improved understanding of actual traffic flows to the design, management, and engineering of modern high-speed networks.
For obvious reasons, the Internet is one of the most exciting working packet networks for (i) learning about the true nature of actual network traffic, (ii) investigating historical traffic trends, and (iii) inferring the traffic behavior of tomorrow's networks. However, for some combination of legal, political, and security reasons, high-resolution Internet traffic measurements are either non-existent, in very short supply, or kept under a tight seal.
Question: How to make as many high-quality and high-resolution (and suitably "cleaned") traffic measurements as possible available to as many (qualified) researchers as possible?
Making the most effective and efficient use of actual traffic measurements requires close collaboration between networking experts, data analysts, statisticians, mathematicians, etc. This is all the more true as the size of modern network traffic measurements reaches the Terabyte range and the data keep revealing features unheard of in the past, which demand novel and unconventional concepts and techniques. Yet, in the Internet community as well as in the academic community, a widespread "can-do-it-myself" mentality prevails when it comes to analyzing any sort of traffic data.
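To make the kind of analysis at stake here concrete, the sketch below shows a typical first step when working with a packet-level traffic trace: aggregating timestamped packet records into a byte-count time series at a fixed time scale, the raw material for studying burstiness and scaling behavior. The function name, the synthetic trace, and the bin width are all illustrative assumptions, not any particular measurement tool's API.

```python
# Hypothetical sketch: turning a timestamped packet trace into a
# per-interval byte-count time series. The trace data are synthetic.
from collections import defaultdict

def byte_counts_per_bin(packets, bin_width=1.0):
    """Sum packet sizes into fixed-width time bins.

    packets: iterable of (timestamp_seconds, size_bytes) tuples.
    Returns a dict mapping bin index -> total bytes in that bin.
    """
    bins = defaultdict(int)
    for ts, size in packets:
        bins[int(ts // bin_width)] += size
    return dict(bins)

# Synthetic example trace: (timestamp in seconds, packet size in bytes)
trace = [(0.1, 1500), (0.7, 400), (1.2, 1500), (2.9, 60)]
series = byte_counts_per_bin(trace, bin_width=1.0)
# -> {0: 1900, 1: 1500, 2: 60}
```

Repeating the aggregation at several bin widths is how one begins to examine traffic across time scales; real traces, of course, arrive as pcap files rather than tuples, which is exactly where the data-handling expertise discussed above comes in.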
Question: How to change the prevailing "can-do-it-myself" mentality into a "let's-do-it-together" attitude when it comes to rigorous analyses of future Internet traffic data sets?