Assessment of Quality of Experience (QoE) of Web and Video Services over a Mobile Network Using a Network Emulator

Abstract

This paper investigates the subjective assessment of QoE of web and video services over a mobile network. To achieve this, the authors used the network emulator (NetEm) traffic control functionality to emulate the dynamic behaviour of a mobile network. Experiments were conducted in a laboratory setting and the test conditions were varied to ascertain the QoE, with a focus on two network-level QoE metrics, delay and packet loss ratio. From the experiments conducted, it was observed that there was a negative correlation between delay and average mean opinion score (MOS), and between packet loss ratio and average MOS. Network operators can adopt these results to provide better services, which would lead to an improved subscriber base and profitability for the operators and better QoE for end users.


1. Introduction

With the recent advances in mobile networks, there is a growing demand for mobile internet services among end users. Examples of such services are web, video and over-the-top (OTT) services, which differ from the traditional voice services that mobile network operators provide. Qualinet, a body responsible for multidisciplinary QoE research in Europe, defines QoE as “the degree of delight or annoyance of the user of an application or service. It results from the fulfillment of his or her expectations with respect to the utility or enjoyment of the application or service in the light of the user’s personality and current state” [1].

The Quality of Experience (QoE) of end users is a key issue for mobile network operators to consider in order to ensure subscriber growth [2]. The way QoE is perceived differs from one application or service to another. For example, QoE is positive in a voice application if end users can communicate effortlessly and the quality of the voice transmission is excellent. Similarly, a positive QoE for a web browsing service means that end users can quickly download videos, graphics and high-quality images [3].

Due to the growing number of operators in the telecommunications and network services industry, it is imperative that service providers strive to keep their customers happy while maintaining and improving their subscriber base. Telcos need to be proactive, as exit barriers are low for customers who wish to change network providers because of poor service quality. Thus, the onus lies on the service providers to regularly monitor and improve QoE as the need arises [4]. Research in [5] [6] showed that the majority of end users are willing to spend more to have a superior quality of experience; they are likewise willing to change service providers if their needs are not adequately fulfilled.

A network emulator was used because of the ease with which it allows an experimental environment to be constructed in which subjects can assess the QoE of the web and video services under investigation. In evaluating the QoE, emphasis was placed on two network-level QoE metrics, delay and packet loss ratio (PLR). The network emulator tool was used to create specific sequences of delay and artificial packet loss [7]. Delay, also called network latency, is the time it takes to send a packet from a source to a destination or vice versa, while packet loss is the failure of IP packets sent over a network to reach their destination. Transmission errors, network congestion and limited buffer memory are some of the reasons networks experience packet loss. The rest of this paper is organized as follows. Section 2 discusses the network emulator tool. Sections 3 and 4 present the experiments and results respectively, and Section 5 concludes the study.

2. Network Emulator Tool

NetEm is a Linux traffic control facility that can be used to emulate bandwidth limits, delay, packet loss, packet reordering and jitter, among other network conditions [8]. It functions as a traffic shaper, which controls the transfer of data over a network to achieve desired quality of service (QoS) requirements. The traffic shaper introduces specific conditions into the network based on the input parameters specified [9]. The emulator was placed between the end-user devices and the internet cloud. Other available traffic shapers are Dummynet and NIST Net.
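
For illustration, a minimal sketch of how such conditions could be imposed with NetEm is given below. It assumes a Linux host with the iproute2 tools, root privileges and an interface named eth1 (an assumption, not the interface used in the study); the tc commands are issued from Python purely for convenience and do not reproduce the authors' exact configuration.

    import subprocess

    IFACE = "eth1"  # assumed name of the emulator's egress interface

    def run_tc(cmd):
        # Run one tc command; requires root privileges and the iproute2 tools.
        subprocess.run(cmd.split(), check=True)

    def apply_netem(delay_ms=None, loss_pct=None):
        # Attach a netem qdisc that adds fixed delay and/or random packet loss.
        subprocess.run(f"tc qdisc del dev {IFACE} root".split(), check=False)  # clear any old qdisc
        cmd = f"tc qdisc add dev {IFACE} root netem"
        if delay_ms is not None:
            cmd += f" delay {delay_ms}ms"
        if loss_pct is not None:
            cmd += f" loss {loss_pct}%"
        run_tc(cmd)

    def clear_netem():
        # Restore the interface to its default behaviour.
        subprocess.run(f"tc qdisc del dev {IFACE} root".split(), check=False)

    if __name__ == "__main__":
        apply_netem(delay_ms=100, loss_pct=5)  # e.g. 100 ms delay and 5% random loss
        # ... run a browsing or streaming session here ...
        clear_netem()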

In this paper, the WAN emulator was set up as a network bridge. When two Ethernet adapters (for example, eth0 and eth1) are bridged, the attached networks become one single (larger) Ethernet network. Bridging the Ethernet connections in this way allows the traffic flowing across the link to be monitored transparently. An Ethernet bridge forwards packets between two or more networks so that, in most cases, they behave as if they were a single network. A bridge can be implemented in a physical hardware device or as a software application.
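
A minimal sketch of how such a software bridge could be created on a Linux host is shown below. It assumes the iproute2 tools, root privileges and two adapters named eth0 and eth1; the bridge name br0 and all interface names are illustrative rather than the study's actual configuration.

    import subprocess

    def ip_cmd(cmd):
        # Execute one iproute2 command; requires root privileges.
        subprocess.run(cmd.split(), check=True)

    def create_bridge(bridge="br0", ports=("eth0", "eth1")):
        # Create a transparent Ethernet bridge joining the given interfaces.
        ip_cmd(f"ip link add name {bridge} type bridge")
        for port in ports:
            ip_cmd(f"ip link set {port} master {bridge}")  # attach the NIC to the bridge
            ip_cmd(f"ip link set {port} up")
        ip_cmd(f"ip link set {bridge} up")

    if __name__ == "__main__":
        create_bridge()

Once the bridge is up, a netem qdisc such as the one in the previous sketch can be attached to one of the bridged interfaces so that all traffic crossing the emulator is shaped.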

3. Experiments

3.1. Testbed

A testbed was set up in the laboratory, as illustrated in Figure 1, to emulate the behaviour of a mobile network using an Ubuntu Linux (Bionic Beaver) server and the NetEm tool. The testbed included a Cisco Catalyst 3750 Series network switch, a wide area network (WAN) emulator (traffic shaper), an internet router and end-user devices such as desktop PCs and laptops. The configuration of the emulator is given in Table 1.

3.2. Procedure and Participants

To evaluate the QoE, experiments were carried out with users for various test scenarios, and all users were presented with the same scenarios. For the web browsing experiments, the contents presented to the users included online photo albums, e-commerce sites, news pages and map services. Popular services such as Footytube, Netflix and YouTube were used for the video streaming sessions. Free browsing tasks were also given to the subjects, i.e. tasks that did not explicitly require users to achieve specific goals during the web browsing session (WBS).

Figure 1. Network Testbed.

Table 1. Configuration for emulator system.

Here, subjects explored the defined test contents by calling up several web pages. The users accessed these web pages and rated how they perceived the service using the five-point Mean Opinion Score (MOS) scale, with each grade reflecting the user's judgment of the experiment under test. The MOS scale was chosen because complex reporting systems normally lead to low participation rates. Questionnaires were administered at the end of the experiments to obtain the MOS ratings of the subjects. The MOS quality ratings used by the subjects during the experiments are shown in Table 2.
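
As an illustration of how such questionnaire ratings can be turned into per-condition scores, the sketch below averages individual five-point ratings by test condition. The response values and condition labels are hypothetical and are not data from the study.

    from collections import defaultdict
    from statistics import mean

    # Hypothetical questionnaire responses: (test condition, rating on the 1-5 MOS scale).
    responses = [
        ("delay 100 ms", 5), ("delay 100 ms", 4), ("delay 100 ms", 5),
        ("delay 600 ms", 3), ("delay 600 ms", 2), ("delay 600 ms", 3),
    ]

    def average_mos(responses):
        # Group the individual ratings by test condition and average them.
        grouped = defaultdict(list)
        for condition, rating in responses:
            grouped[condition].append(rating)
        return {condition: round(mean(ratings), 2) for condition, ratings in grouped.items()}

    print(average_mos(responses))
    # {'delay 100 ms': 4.67, 'delay 600 ms': 2.67}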

Different parameters that could influence QoE (such as delay and packet loss ratio) were introduced without the knowledge of the users, and the test was then repeated successively. After the experiments were carried out, questionnaires were used to elicit the users' ratings of the conditions they had experienced.

In study I (web and video), the delay value was varied from 100 ms to 1000 ms at intervals of 100 ms. The participants were unaware of the delay value at any instant. A total of 48 participants, 29 males and 19 females, were involved in the study; the average age of the subjects was 27. There was a 10-minute pre-experiment briefing at the start of each experiment and a 10-minute debriefing after each experiment. Each experiment lasted 20 minutes.

In study II (web and video), the packet loss ratio was varied from 5% to 35% at intervals of 5%. Again, the participants were unaware of the packet loss ratio at each level. A total of 40 participants, 22 males and 18 females, were involved in the study; the average age of the subjects was 23. There was a 10-minute pre-experiment briefing at the start of each experiment and a 10-minute debriefing after each experiment. Each experiment lasted 20 minutes.
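
A minimal sketch of how the two parameter sweeps could be automated with NetEm is shown below. The interface name, the fixed 20-minute pause standing in for a rating session, and the overall structure are assumptions for illustration; they do not describe the actual test harness used in the experiments.

    import subprocess
    import time

    IFACE = "eth1"                 # assumed emulator interface
    SESSION_SECONDS = 20 * 60      # each experiment lasted 20 minutes

    def set_netem(args):
        # Replace the root qdisc on the interface with the given netem settings.
        subprocess.run(f"tc qdisc replace dev {IFACE} root netem {args}".split(), check=True)

    def run_session(label):
        # Placeholder for one rating session in which subjects browse or stream and then rate.
        print(f"Condition active: {label}")
        time.sleep(SESSION_SECONDS)

    # Study I: delay varied from 100 ms to 1000 ms in steps of 100 ms.
    for delay in range(100, 1001, 100):
        set_netem(f"delay {delay}ms")
        run_session(f"delay = {delay} ms")

    # Study II: packet loss ratio varied from 5% to 35% in steps of 5%.
    for loss in range(5, 36, 5):
        set_netem(f"loss {loss}%")
        run_session(f"loss = {loss}%")

    # Restore normal network behaviour afterwards.
    subprocess.run(f"tc qdisc del dev {IFACE} root".split(), check=False)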

4. Results and Discussions

Table 3 shows a summary of the results obtained from study I and Figure 2 shows the graph of web and video MOS plotted against overall delay.

Figure 2. Web and video MOS vs. delay.

Table 2. MOS quality rating.

Table 3. Delay comparison.

Discussion I

In study I, the delay values specified caused NetEm to delay all packets by the given number of milliseconds. The MOS of all subjects in the study for each delay level, for the web and video services, was averaged, and the results obtained are shown in Table 3. The study revealed that the MOS for web and video services was 4.85 and 5.00 respectively at a delay of 100 ms, and these values dropped significantly to 1.67 and 1.20 respectively at 1000 ms. As the delay value increased, the MOS of users decreased correspondingly. Users began to feel dissatisfied with the services when delay values rose to 600 ms. The threshold for acceptability of the services was found to be 900 ms, as users were not willing to reuse the service at delay values exceeding 900 ms. A negative correlation was observed between the delay and the average MOS.
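
The strength of this relationship can be quantified with a Pearson correlation coefficient, as sketched below in plain Python. Only the 100 ms and 1000 ms web MOS values are taken from Table 3; the intermediate values are illustrative placeholders, so the printed coefficient is indicative rather than the study's actual figure.

    from math import sqrt

    def pearson(x, y):
        # Pearson correlation coefficient between two equal-length sequences.
        n = len(x)
        mean_x, mean_y = sum(x) / n, sum(y) / n
        cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
        std_x = sqrt(sum((a - mean_x) ** 2 for a in x))
        std_y = sqrt(sum((b - mean_y) ** 2 for b in y))
        return cov / (std_x * std_y)

    delays = [100, 200, 300, 400, 500, 600, 700, 800, 900, 1000]  # ms
    # Average web MOS per delay level: the 100 ms and 1000 ms points come from Table 3,
    # the intermediate values are illustrative placeholders only.
    web_mos = [4.85, 4.60, 4.30, 4.00, 3.60, 3.10, 2.70, 2.30, 1.95, 1.67]

    print(f"Pearson r = {pearson(delays, web_mos):.2f}")  # strongly negative, close to -1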

Table 4 shows a summary of the results obtained from study II and Figure 3 shows the graph of web and video MOS plotted against packet loss ratio.

Discussion II

In study II, NetEm emulated packet loss by randomly dropping the specified percentage of packets before queuing. The results obtained are shown in Table 4. The study revealed that the MOS for web and video services was 3.75 and 3.98 respectively at a PLR of 5%, and these values dropped significantly to 1.68 and 1.61 respectively at a PLR of 35%. Users began to feel dissatisfied with the service when the PLR reached 25% and above. Again, a negative correlation existed between the PLR and the average MOS.

Figure 3. Web and video MOS vs. packet loss ratio.

Table 4. Packet loss ratio comparison.

5. Conclusion

In this study, we performed a QoE assessment by conducting several experiments on web and video streaming services, introducing artificial delays and randomly dropping packets using the network emulator tool NetEm. End users subjectively assessed their experiences and rated them using the MOS. From the results obtained, the threshold for acceptability of the web and video services was found to be at a delay of 1000 ms, at which point the web and video MOS were 1.67 and 1.20 respectively. For PLR, the threshold was 30%, with web and video MOS of 1.93 and 2.18 respectively. Mobile network operators can adopt these results to provide better services, which would lead to better QoE for end users. In future work, more metrics could be investigated, and field trials or crowdsourcing methods could be employed to improve subject diversity.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

References

[1] Le Callet, P., Möller, S. and Perkis, A. (2012) Qualinet White Paper on Definitions of Quality of Experience (QoE) and Related Concepts. Proceedings of the 5th Qualinet Meeting, Novi Sad, Serbia.
[2] Wang, Y., Zhou, W. and Zhang, P. (2017) QoE Management in Wireless Networks. Springer, Berlin.
https://doi.org/10.1007/978-3-319-42454-5
[3] Ominike, A., Joshua, J., Awodele, O. and Ogbonna, A. (2019) A Quality of Experience Hexagram Model for Mobile Network Operators Multimedia Services and Applications. International Journal of Computer (IJC), 4523, 95-105.
[4] Spetebroot, T., Afra, S., Aguilera, N., Saucez, D. and Barakat, C. (2015) From Network-Level Measurements to Expected Quality of Experience: The Skype Use Case. 2015 IEEE International Workshop on Measurements & Networking (M&N), Coimbra, 12-13 October 2015, 1-6.
https://doi.org/10.1109/IWMN.2015.7322989
[5] Venturini, F., Marshall, C. and Alberto Di, E. (2012) Hearts, Minds and Wallets Winning the Battle for Consumer Trust. 1-16.
[6] Accenture (2014) Digital Video and the Connected Consumer.
[7] Shibata, M., Ito, Y. and Koshimura, R. (2014) Evaluation of QoE of Web Services on a Mobile Host over a Wireless LAN by Simulator/Emulator System. 2014 International Conference on Information and Communication Technology Convergence (ICTC), Busan, 22-24 October 2014, 77-81.
https://doi.org/10.1109/ICTC.2014.6983087
[8] Hemminger, S. (2005) Network Emulation with NetEm. Proceedings of the 6th Australian National Linux Conference (LCA 2005), April 2005, 1-9.
[9] Shinwary, A. (2010) Mapping of User Quality-of-Experience to Application Perceived Performance for Web Application. PhD Thesis, School of Computing, Blekinge Institute of Technology, Sweden.
