QUINCE (Quality of User Internet Customer Experience): a Reactive Crowdsourcing-based QoE Monitoring Platform
We are developing a framework that integrates existing network measurement infrastructures and crowdsourcing platforms to measure Quality of Experience (QoE) on network paths.
Principal Investigators: Amogh Dhamdhere, kc claffy
Funding source: Joint Experiment
Period of performance: January 1, 2018 - November 30, 2018
Project Summary
Measuring the Quality of Experience (QoE) in a real-world environment is challenging. Although a number of platforms have been deployed to gauge network path performance from the edge of the Internet, QoE cannot easily be inferred from that data because of its subjective nature. On the other hand, crowdsourcing-based QoE assessment, known as QoE crowdtesting, is increasingly popular for conducting subjective assessments of various services, including video streaming, VoIP, and IPTV. Workers on crowdsourcing platforms can access and participate in assessment tasks remotely through the Internet, and the experimenter can select a pool of potential workers according to their geolocation or historical accuracy. However, existing QoE crowdtesting mainly evaluates emulated scenarios rather than the impact of real Internet events, because the launch of a crowdtesting campaign is usually not driven by network measurement results. Even if we measure network path quality from the workers, correlating the assessment results with Internet events is difficult because the assessments and the events differ in timing and in the network paths involved.
In this project, we propose a novel framework that launches QoE crowdtesting in a timely manner when adverse network events are detected. We will use existing network measurement infrastructures to detect network events, such as link congestion. Based on information about each event, the framework initiates QoE crowdtesting to recruit workers who are potentially affected, and those workers then provide feedback on their perceived QoE. The main advantage of this reactive approach is that it improves the effectiveness of launching QoE crowdtesting tasks, helping to evaluate the impact of network events as they occur.
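The following Python sketch illustrates the reactive workflow described above. It is a minimal sketch, not the actual QUINCE implementation: the class and function names (NetworkEvent, Worker, recruit_workers, launch_crowdtesting) are hypothetical placeholders, and prefix-based matching with an accuracy threshold is only one plausible worker-selection criterion.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class NetworkEvent:
    event_type: str                 # e.g. "link_congestion"
    affected_prefixes: List[str]    # prefixes served by the affected link
    detected_at: datetime

@dataclass
class Worker:
    worker_id: str
    prefix: str       # prefix the worker connects from
    accuracy: float   # historical assessment accuracy, 0..1

def recruit_workers(event: NetworkEvent, pool: List[Worker],
                    min_accuracy: float = 0.8) -> List[Worker]:
    """Select workers likely affected by the event, filtered by reliability."""
    return [w for w in pool
            if w.prefix in event.affected_prefixes and w.accuracy >= min_accuracy]

def launch_crowdtesting(event: NetworkEvent, workers: List[Worker]) -> None:
    """Placeholder for submitting an assessment task to a crowdsourcing platform."""
    print(f"Launching QoE task for {event.event_type} "
          f"detected at {event.detected_at:%Y-%m-%d %H:%M} "
          f"to {len(workers)} workers")

# Example: a detected congestion event triggers an immediate crowdtesting task.
event = NetworkEvent("link_congestion", ["203.0.113.0/24"], datetime.utcnow())
pool = [Worker("w1", "203.0.113.0/24", 0.9), Worker("w2", "198.51.100.0/24", 0.95)]
launch_crowdtesting(event, recruit_workers(event, pool))
```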
Research Plan
Task | Description | Projected Timeline | Status |
---|---|---|---|
1 | Find resources | Jan - Feb 2018 | done |
2 | Finalize the schedules | Jan - Feb 2018 | done |
3 | Set up the necessary software and hardware for the platform | Jan - Feb 2018 | done |
4 | Start the experiment | Mar 2018 | done |
5 | Check the preliminary results. Are we collecting 150 records/month? Is the QoE score within the expected range? | Apr 2018 | done |
6 | Finish the initial result analysis | May 2018 | done |
7 | Modify parameters as necessary, repeat the experiment | Jun - Aug 2018 | done |
8 | Collect the targeted number of records | Oct 2018 | done |
9 | Report the results | Nov 2018 | done |
Expected Outcomes
- A platform which can automatically launch QoE crowdtesting according to network events
- A mechanism for creating suitable QoE crowdtesting tasks and recruiting an appropriate set of workers from the crowd
- A set of data obtained from the platform and the models derived from it (a minimal aggregation sketch follows this list)
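As one illustration of how the collected feedback could be tied back to detected events, the sketch below computes a mean opinion score (MOS) over ratings submitted within a fixed window after an event. The rating layout, field names, and one-hour window are assumptions for illustration, not the project's actual data schema.

```python
from datetime import datetime, timedelta
from statistics import mean
from typing import List, Tuple

Rating = Tuple[str, datetime, int]   # (worker_id, submitted_at, opinion score 1..5)

def mos_during_event(ratings: List[Rating],
                     event_start: datetime,
                     window: timedelta = timedelta(hours=1)) -> float:
    """Mean opinion score from feedback submitted within `window` of the event."""
    in_window = [score for _, ts, score in ratings
                 if event_start <= ts <= event_start + window]
    return mean(in_window) if in_window else float("nan")

# Example: three ratings collected after a detected congestion event.
start = datetime(2018, 3, 1, 12, 0)
ratings = [("w1", start + timedelta(minutes=5), 2),
           ("w2", start + timedelta(minutes=20), 3),
           ("w3", start + timedelta(hours=2), 5)]     # outside the window
print(mos_during_event(ratings, start))               # -> 2.5
```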
Additional Content
The proposal “A Reactive Crowdsourcing-based QoE Monitoring Platform” is also available in PDF.