A Review of Artificial Intelligence Applications in Contemporary Computer Network Technologies

Abstract

Rapid advances in science and technology have driven the constant upgrading of computer network technology, and computer technology in particular has been applied ever more extensively, bringing convenience to people’s lives. The number of internet users around the globe has also increased significantly, exerting a profound influence on artificial intelligence. In turn, the constant upgrading and development of artificial intelligence has driven continuous innovation and improvement in computer technology. Countries around the world have likewise increased their investment in, and attention to, artificial intelligence. Through an analysis of the current state of development and the existing applications of artificial intelligence, this paper explicates the role of artificial intelligence amid the unceasing expansion of computer network technology.

Share and Cite:

Lutepo, A. and Zhang, K. (2024) A Review of Artificial Intelligence Applications in Contemporary Computer Network Technologies. Communications and Network, 16, 90-107. doi: 10.4236/cn.2024.163005.

1. Introduction

Artificial intelligence (AI) is the field of study concerned with enabling machines to learn as humans do and to respond appropriately to their environment [1]. It is no secret that, at the forefront of science and technology, AI has achieved highly efficient and secure data processing under the control of modern programming. The growth of the economy and society has promoted the expansion of computer network technology and, in turn, created a need to support it. Artificial intelligence, as a new field within computer network technology, is being applied to solve complex problems [2]. AI technology has gradually come into wide use owing to its competitive edge and its ability to meet the needs of social development. AI network technology and big data technology are products of the rapid development of computer network technology, and against this background their application has advanced greatly [3]. At this point, it is inevitable that industries will integrate these technologies. This article therefore expounds on AI technology, introduces the application and optimization of AI in computer network technology, and clarifies the development trend of computer network technology [4].

Many researchers have discussed AI and its applications across areas of computer technology [5] [6]. Others have focused on the application of AI to a single aspect of communication technology, such as wireless communications in the Internet of Things (IoT) [7], network management [8], wireless security [9], emerging robotics communication [10], antenna design [11], and UAV networks [12]. References [13] and [14] briefly discussed some promising use cases of AI for communication technologies, whereas [15] discussed the use of AI for space-air-ground integrated networks. The use of deep learning (DL) in space applications has also been addressed [16].

Further, a significant number of researchers have discussed network communication systems and the use of AI in one or a few of them; however, a broader survey of AI applications across distinct aspects of network technologies has yet to be performed. Consequently, this paper aims to provide an introduction to AI, a description of the various challenges faced by network technologies in the absence of AI, and an extensive survey of potential AI-based applications to overcome these challenges, covering the algorithms, models, and potential AI-based solutions involved, as well as the benefits that the application of artificial intelligence has brought to the world of computer network technology.

2. Methodology

To improve the accuracy and relevance of the results, the authors refined the search terms used in the search engines. The research was conducted through a rigorous review of the peer-reviewed literature across multiple databases, including SCI, Scopus, Web of Science, the ACM Digital Library, IEEE Xplore, Google Scholar, and ScienceDirect, to identify articles relevant to the topic. The strategies followed included measurement, planning, analysis and research, product analysis, evaluation of conformity, and presentation and interpretation [17].

The results of the search were limited to relevant material, because the aim of this article is to present the application of artificial intelligence in network technology. In addition to selecting documents with more than five citations, the researchers also selected recent documents with fewer than five citations where they introduced new standards and/or methods [18].

The data analysis for this study was done using Python, a popular open-source programming language. Python was chosen due to its strong data manipulation and analysis capabilities, as well as the availability of a wide range of libraries and tools that facilitated the processing and visualization of the data.

The primary Python libraries utilized in this analysis were: Pandas, a data manipulation and analysis library used to load, clean, and transform the data into a structured format suitable for analysis; Matplotlib, a comprehensive library for creating high-quality static, animated, and interactive visualizations, used to generate the figures and plots presented in this study; and NumPy, a fundamental library for scientific computing in Python.
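As an illustrative sketch of this workflow (with hypothetical publication counts standing in for the collected data), the fragment below shows how Pandas and NumPy combine to load, clean, and summarize records; the cleaned frame would then be handed to Matplotlib for plotting.

```python
# Minimal data-cleaning sketch: pandas structures the records, NumPy
# supplies the numerics. The values below are hypothetical stand-ins.
import numpy as np
import pandas as pd

raw = pd.DataFrame({
    "year": [2019, 2020, 2021, 2021, 2022],
    "papers": [12, 18, np.nan, 25, 31],
})

clean = (raw.dropna()                        # drop incomplete records
            .drop_duplicates(subset="year")  # keep one row per year
            .sort_values("year"))

print(clean["papers"].mean())  # average papers per year in the cleaned data
```

The resulting `clean` frame is exactly the kind of structured input that `matplotlib` plotting calls consume.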

2.1. Benefits of Using Artificial Intelligence in Computer Network Technology

With the development of modern information and communication technology (ICT), network technology is used to store, transmit, and manage information within companies and countries. This information supports economic development and is an important resource. Internet technology provides a platform for knowledge sharing and development, which people can use to obtain the information they need. However, internet technology brings problems of its own: violators often gain access to valuable information, including patents, copyrights, and academic texts, harming the victims and causing financial losses. For this reason, many countries around the world have prioritized cybersecurity, and it has become an important part of strategic planning [19].

Traditional network management, however, does not meet current information security requirements: a conventional system designed merely to collect information cannot verify the provenance, security, and accuracy of its data, so these threats must be addressed by other means. Combining artificial intelligence with computer network technology allows software to identify risks quickly and easily, making it possible to detect, protect against, and prevent them. AI can also simulate human data processing and filter out inaccurate data, providing a data-handling efficiency that conventional network technology cannot. It is worth noting that a working AI model must be established at the outset. Artificial intelligence has been shown to save time and resources, and many universities now offer professional courses in network technology to meet the demand for future talent; it can safely be said that the application of computer network technology is now ubiquitous.

Therefore, strengthening the use of intelligent systems in computer networks is very important for protecting national security and finance as well as personal interests.

2.2. Application of Artificial Intelligence Technology in E-Commerce

Business activities involve the exchange of information, and the development of e-commerce is founded on computer network technology. Today, e-commerce activity is growing at a remarkable rate. For convenience, companies cooperate over the internet, which means that many business secrets are stored on computer networks. The openness and sharing inherent in computer networks expose this critical information to greater threats. Weak security for business data in e-commerce would bring losses to companies, disrupt business activities, and deter the development of e-commerce to a certain extent. In e-commerce activities, artificial intelligence technology is therefore employed for information encryption, to ensure the security of information storage and transmission.

Further, the application of artificial intelligence technology in business activities not only safeguards a company’s data security but also increases users’ sense of security, guaranteeing the interests of both parties [20].
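To make the encryption idea concrete, the sketch below uses a one-time-pad XOR scheme; this is illustrative only (production e-commerce systems use vetted ciphers such as AES over TLS), and the order record is hypothetical.

```python
# Illustrative one-time-pad sketch of symmetric encryption for a business
# record. NOT a production scheme; real systems use AES/TLS.
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    assert len(key) == len(plaintext)  # one-time pad: key matches length
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

order = b"order#1042: 2 units, card ending 9921"   # hypothetical record
key = secrets.token_bytes(len(order))              # fresh random key

ciphertext = encrypt(order, key)
assert decrypt(ciphertext, key) == order  # only the key holder can recover it
```

The stored or transmitted form (`ciphertext`) reveals nothing without the key, which is the property the paragraph above relies on.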

2.3. Application of Artificial Intelligence Technology in Information Signature Verification

Non-repudiation is achieved through cryptographic mechanisms such as digital signatures, together with supporting services for authentication, auditing, and logging. In online transactions, a digital signature ensures that a party cannot later deny having sent information or deny the authenticity of its signature, and AI can support this process. It is generally applied in financial cooperation, such as commercial cooperation, banking, and other finance-related activities. Public-key and private-key information encryption are the two known bases of information signatures, with the former underpinning the most widely used signature verification technology. The high security requirements in public-facing services have accompanied the rise of online payments: more and more people use Alipay, WeChat, and other online payment methods to handle various kinds of business from home, which the internet has made possible and convenient [21]. Information signature verification technology effectively protects the security of information and related resources when people conduct business online.
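The verify step can be sketched as follows. Note that this simplified example uses a shared-secret HMAC rather than the public-key signatures (e.g., RSA or ECDSA) that provide true non-repudiation in practice, and the secret and payment messages are hypothetical.

```python
# Simplified signature-verification flow using an HMAC as a stand-in for a
# public-key digital signature. Secret and messages are hypothetical.
import hashlib
import hmac

SECRET = b"demo-shared-secret"

def sign(message: bytes) -> str:
    return hmac.new(SECRET, message, hashlib.sha256).hexdigest()

def verify(message: bytes, signature: str) -> bool:
    # constant-time comparison resists timing attacks
    return hmac.compare_digest(sign(message), signature)

payment = b"pay 250.00 to merchant-77"
tag = sign(payment)
assert verify(payment, tag)                           # untampered: accepted
assert not verify(b"pay 950.00 to merchant-77", tag)  # altered: rejected
```

In a real public-key scheme the signer's private key replaces `SECRET`, so a valid signature also proves who sent the message.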

2.4. Application of Artificial Intelligence in Mobile Communication

AI is playing an essential role in mobile communication systems, offering promising ways to optimize their performance. In general, AI techniques contribute meaningfully to the dynamic adaptation of mobile communication to its environment. Complex network infrastructure needs to migrate from traditional operation and management methods to an intelligent approach, which will reduce inefficiency and enhance scalability [22].

The world should expect ever more complex generations of wireless networks, and a need for more resources, driven by rising service requirements across diverse devices, diverse applications, and, of course, complex networks [23]. Network operators must also understand their systems thoroughly in order to deliver the best service and use the available resources to raise service quality. Industry forecasts predicted that by 2021 annual IP traffic would reach 3.3 zettabytes and smartphone traffic would outpace personal computer (PC) traffic [24]. AI presents the opportunity to build adaptive systems that give both the system and its environment better performance. The big data era makes increasingly massive datasets available from mobile and wireless systems, which means that applying AI to mobile communication can make communication systems perform better, run more efficiently, and improve key performance indicators (KPIs) [25] [26].

Growing network infrastructure and the uptake of mobile devices and their applications will increase the need for base stations, explode mobile traffic volumes, and demand massive data processing. According to [27], one of the most important solutions is to apply advanced artificial intelligence techniques, such as machine learning and deep learning, in mobile communication, helping base stations manage huge volumes of data and sustain performance for the mobile stations they serve. [28] reviews deep learning approaches in mobile and wireless networking research, and [29] explores AI approaches for 5G mobile and wireless communications technologies. AI has already been widely applied in mobile communication. Notable classic AI approaches include fuzzy logic and neural networks. Fuzzy logic is a basic approach that processes degrees of truth rather than strictly true or false values. Neural networks (NN) enable a machine to learn on its own and solve problems; their design is inspired by the structure and behavior of the human brain, and they have been extended into the better-performing machine learning (ML) and deep learning (DL) approaches. Reinforcement learning is another AI technique, in which a computer or machine learns by itself instead of being explicitly programmed [30]. DL has recently become popular as an improvement on ML, and both are promising approaches for advanced network traffic handling and the management of future mobile communication. Supervised learning and unsupervised learning are the two types of AI learning that have been employed in mobile communication.
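As a toy sketch of the fuzzy-logic idea mentioned above: instead of a hard true/false decision, signal quality receives a degree of membership in [0, 1]. The RSSI thresholds here are hypothetical.

```python
# Fuzzy membership sketch: map a received signal strength (RSSI) to a
# degree of "good signal" between 0 and 1. Thresholds are hypothetical.
def membership_good_signal(rssi_dbm: float) -> float:
    """-60 dBm or better is fully 'good', -90 dBm or worse is fully
    'not good', with a linear ramp in between."""
    if rssi_dbm >= -60:
        return 1.0
    if rssi_dbm <= -90:
        return 0.0
    return (rssi_dbm + 90) / 30  # linear ramp between -90 and -60 dBm

print(membership_good_signal(-75))  # midway between the thresholds -> 0.5
```

A fuzzy controller would combine several such membership degrees with rules (e.g., "if signal is good and load is low, keep the current cell") before defuzzifying to a crisp action.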

2.5. Artificial Intelligence for Satellite Communication

Advances in wireless communications, increasing demand for new services, and the rapid development of smart devices performing important tasks have increased demand for satellite communication technology, which extends ground services to many cities, towns, and regions. The three types of satellites are geostationary orbit (GEO, also known as geostationary equatorial orbit), medium earth orbit (MEO), and low earth orbit (LEO) satellites [31]. Their classification depends on three main factors: altitude, beam footprint, and trajectory. GEO, MEO, and LEO satellites orbit the earth at altitudes of 35,786 km, 7000 - 25,000 km, and 300 - 1500 km, respectively. The beam footprint of a GEO satellite spans between 200 and 3500 miles. Although individual LEO and MEO satellites cover a given area for shorter durations, constellations of many LEO and MEO satellites can provide continuous global coverage; for example, Iridium Next operates 66 LEO satellites plus 6 backup satellites. Satellite networks can also back up terrestrial networks when the latter are temporarily paralyzed or damaged by a disaster, and satellite communication can serve sectors such as electricity, transportation, commerce, agriculture, and public security. Although satellite communication has improved the global economy and communication quality, it also faces problems. Satellites, especially those in low earth orbit, have limited onboard resources and move at high speed, putting pressure on network access. This mobility also complicates secure networking and the connections among satellite systems (GEO, MEO, and LEO), aerial systems (unmanned aerial vehicles (UAVs), balloons, and airships), and ground systems, as well as path control and spectrum management.
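The altitudes quoted above can be sanity-checked with Kepler's third law, $T = 2\pi\sqrt{a^3/\mu}$, where $a$ is the orbital semi-major axis. The sketch below computes orbital periods; the 550 km LEO altitude is an illustrative assumption.

```python
# Orbital period from altitude via Kepler's third law: T = 2*pi*sqrt(a^3/mu),
# with a = Earth radius + altitude.
import math

MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000   # mean Earth radius, m

def orbital_period_hours(altitude_km: float) -> float:
    a = R_EARTH + altitude_km * 1000          # semi-major axis, m
    return 2 * math.pi * math.sqrt(a**3 / MU) / 3600

print(round(orbital_period_hours(35_786), 2))  # GEO: ~23.93 h (one sidereal day)
print(round(orbital_period_hours(550), 2))     # an assumed LEO: ~1.6 h
```

The GEO period matching one sidereal day is exactly why a GEO satellite appears fixed over one point on the equator, whereas a LEO satellite completes an orbit in roughly 90 to 100 minutes, which is why continuous LEO coverage requires a constellation.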

In addition, it is more difficult to achieve high energy efficiency in satellite communications than in terrestrial networks. Many studies discuss various aspects of satellite communications, including transmission schemes, satellite MIMO, mobile satellite systems [32], inter-satellite communication [33], Quality of Service (QoS) [34], satellite-based remote IoT [35], CubeSat communications [36], optical space communications [37], hybrid air-ground networks [38], small satellite communications [39], physical-layer security [40], and non-terrestrial networks. Meanwhile, according to [41], interest in artificial intelligence (AI) has grown rapidly in recent years. AI, including machine learning (ML), deep learning (DL), and reinforcement learning (RL), has yielded excellent results across science and engineering, in fields such as software engineering, electrical engineering, financial engineering, and bioengineering. Researchers are therefore turning to AI to solve many challenges in their work, creating AI-based applications as new ways to overcome problems in wireless communications.

2.6. The Role of AI in Cyber Security

The first question to ask is whether artificial intelligence is the future of cybersecurity. Commercial and private companies have already adopted AI programs, and many government departments use such tools as well. Why is this so? Because AI can sift through data patterns, numbers, sounds, and sentences, learning to separate the meaningful from the irrelevant, it can protect public funds and state secrets while saving money. Hackers work day and night to find ways into systems through cracks that people do not even know exist, and time passes before a company discovers a data breach [42], by which point the hacker is long gone and all sensitive information is in the wrong hands. AI, by contrast, does not need to sit back, collect information, and wait for the hackers to get away: it detects the unusual behavior that betrays a hacker, such as anomalous password entry or login activity, and can emit decoy signals that disrupt a hacker’s probing. As suggested by [43], any node can be exploited. In the ongoing game of cybersecurity, human hackers will always probe the weaknesses of every system, including AI. AI is built by humans and can therefore still be defeated; although it is very good at connecting and processing information, it can only work as designed [44]. Programmers are called on to provide stronger protection as hackers work around the clock to adapt to AI-based defenses. The game of hide and seek continues, but AI is a powerful force in the fight for data protection. Google offers TensorFlow, a machine learning framework, and Neural Structured Learning (NSL), an open-source framework that uses neural graph learning techniques to train neural networks on structured data. NSL runs on TensorFlow and is designed for experienced (rather than novice) machine learning practitioners.
NSL can be used for machine vision modeling, NLP, and prediction over interconnected data such as medical records or graphs [45].
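The anomalous-login detection described above can be sketched with a simple statistical baseline; the login times and threshold below are hypothetical.

```python
# Minimal anomaly-detection sketch: flag logins whose timing deviates
# sharply from a user's learned baseline. Values are hypothetical.
import statistics

baseline_login_hours = [9, 9, 10, 8, 9, 10, 9, 8, 9, 10]  # usual login times

mean = statistics.mean(baseline_login_hours)
std = statistics.stdev(baseline_login_hours)

def is_anomalous(hour: float, threshold: float = 3.0) -> bool:
    return abs(hour - mean) / std > threshold  # simple z-score test

print(is_anomalous(9))   # a working-hours login: not flagged
print(is_anomalous(3))   # a 3 a.m. login: flagged for review
```

Production systems replace this single feature with many (location, device, typing cadence) and the z-score with learned models, but the principle of scoring deviation from a baseline is the same.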

Security management at the network-domain level is an important task requiring further development and research. Network security management provides a basic guarantee for the development of network technology applications; when network technology is secured, it can operate efficiently and provide reliable support. In this context, experts can use big data and artificial intelligence to establish the main elements of network security management and meet its various needs [46]. Big data and artificial intelligence are major players here: their protective capabilities underpin network security management. For example, in the era of big data, an AI-based intrusion-blocking system should be established to protect network security and meet the combined needs of computer network technology and AI technology. Intelligent intervention and protection technologies can be integrated to form the core of intelligent management [47]. Communications protection, in turn, is the protection of information transmitted via computer network technology.

Data Collection and Analysis

This is a stage where network technicians need to strengthen the application of AI technology. In the course of applying computer technology, staff generate a huge amount of data, and ever more of it must be mined as information gradually accumulates [48]. Given the diversity and sheer volume of this data, relying solely on traditional techniques is not only costly but also increases the pressure of data collection. AI technology can effectively solve the data collection problem, analyze larger volumes of data scientifically and systematically, and markedly improve the efficiency of data analysis.

2.7. Artificial Intelligence and Machine Learning in 5G and Beyond

The large-scale deployment of Long-Term Evolution (LTE) 4G mobile networks solved one of the biggest problems in wireless communications: the need for the large capacity required for a true broadband mobile internet. This was achieved through a very capable physical layer and a flexible network architecture based on orthogonal frequency-division multiplexing (OFDM) and multiple-input multiple-output (MIMO). Services that require high bandwidth, with rates up to 1 Gbps, are evolving like never before; these include virtual reality (VR) and augmented reality (AR). Moreover, new services such as vehicular communications and the Internet of Vehicles (IoV) require mobile networks to be highly reliable with almost zero latency [49]. Operators are therefore compelled to make the network smarter, with a deeper and more accurate understanding of its operating environment. The adoption of artificial intelligence (AI) and machine learning (ML) is essential to predict changes and create effective, efficient, and adaptable networks, at every layer of the system and of the network.

Main products and goals of 5G: just as the transition to 4G/LTE was driven by growing mobile data, [50] argues that 5G systems contend with more stringent and diverse requirements, improving the quality of user experience (QoE) under intense competition and delivering ultra-low-latency communications. To meet these requirements, 5G defines three usage scenarios to support new applications: industrial automation and intelligent manufacturing (Industry 4.0), autonomous driving, virtual reality, e-health, and more. The International Telecommunication Union (ITU), a United Nations agency, established the quality standards, with the detailed specifications developed by another international working group, the Third Generation Partnership Project (3GPP).

2.7.1. ITU Sets Standards and Regulations for Information and Communications Technology Worldwide. Below Are Three Key 5G Developments Rolled out by ITU

1) Enhanced Mobile Broadband (eMBB): Faster on the Go

eMBB is a strategy that focuses on speed, capacity, and mobility to support new mobile applications such as augmented reality (AR), immersive video streaming, and virtual reality (VR). All users expect 5G networks to be fast; however, capacity and mobility may or may not both be necessary. For example, many people may be connected to the 5G network at a particular location, in which case high capacity but only a modest mobility requirement is needed to maintain performance. Conversely, traveling in a moving vehicle, especially a high-speed train or plane, requires strong mobility support to maintain high speed. 5G eMBB allows users to stream high-definition video on their devices [51].

2) Ultra-Reliable and Low-Latency Communications (URLLC)

According to [52], URLLC is an innovation that enables connections with very low latency. It supports mission-critical communications for services such as connected healthcare, remote surgery, mission-critical applications, vehicular applications and autonomous vehicles, vehicle-to-vehicle (V2V) communications, high-speed rail connectivity, smart business, and many more. In these cases, data should be sent with as little delay or latency as possible and the connection should be as reliable as possible. As announced by 3GPP, this amounts to a latency reduction of up to 75% compared with 4G LTE. To achieve performance never reached before, two technologies come into play: massive multiple-input multiple-output (MIMO), which enables a new level of bit rate, and millimeter-wave spectrum, which opens new bands to overcome spectrum scarcity and network congestion. The introduction of MIMO in 4G/LTE was decisive in achieving true broadband speed for the mobile internet.

3) Massive Machine-Type Communications (mMTC): Connecting Everything

Think of 5G mMTC as connecting the many things of the Internet of Things. The mMTC concept aims to connect a very large number of devices (up to 1 million devices per square kilometer) in an area, each exchanging limited data with low energy consumption. By enabling vast sensor deployments, mMTC can be used in applications such as smart agriculture, where sensors monitor and respond to small changes across a large area, bringing yields closer to their optimum. It can likewise enable near-instantaneous data collection and response in urban areas, helping manage traffic flow, safety, and transportation, and in factories, supporting the monitoring of operations and maintenance.
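The 1-million-devices-per-square-kilometer target can be put in perspective with a quick calculation of how many devices a single cell must support; the cell radius below is an illustrative assumption.

```python
# Back-of-the-envelope check on the mMTC density target: devices per cell
# at 1 million devices/km^2. The cell radius is a hypothetical assumption.
import math

DENSITY_PER_KM2 = 1_000_000    # 5G mMTC target density
cell_radius_km = 0.5           # assumed small-cell radius

cell_area_km2 = math.pi * cell_radius_km**2
devices_per_cell = DENSITY_PER_KM2 * cell_area_km2

print(round(devices_per_cell))  # ~785,000 devices in one small cell
```

Even a modest 500 m cell must therefore manage connections for hundreds of thousands of low-power devices, which is why mMTC emphasizes lightweight signaling and low per-device data rates.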

It is this same idea of massive MIMO that will push next-generation mobile transmission to 2 Gbps. Large antenna arrays at the base station require multiple power amplifiers, and the main problem with power amplifiers is the trade-off between linearity and efficiency: an amplifier can be designed for good linearity, but only at the expense of efficiency [53]. Highly linear power amplifiers are expensive and consume more power, so to control overall network costs, both capital and operating expenses, inexpensive amplifier designs are favored across the many base-station antennas.

It is also worth noting that new energy- and spectrally-efficient broadband wireless communications are subject to nonlinear distortion from the radio-frequency front end. For example, high-power amplifiers can degrade receiver performance and therefore the entire network. In addition, since 5G systems operate at different power levels and frequencies, the varied requirements are expected to be even more challenging, so power amplifiers must meet stricter linearity specifications while maintaining the efficiency of the entire system within limited space; simply overprovisioning amplifiers is not a good solution for 5G base stations in terms of power efficiency. A functional and effective 5G network therefore cannot be achieved without intelligence. 5G enables the simultaneous connection of many IoT devices, producing massive amounts of data that must be processed using machine learning and artificial intelligence. By combining ML and AI, wireless service providers can seek to achieve the following [54].

  • Dynamic network slicing to address wide-ranging use cases with diverse QoS.

  • Remove coverage holes by measuring the interference and using the inter-site distance information.

  • Detect dynamic variation and forecast user distribution by analyzing historical data.

  • Forecast peak traffic, resource utilization, and application types, and optimize and fine-tune network parameters for capacity growth.

  • High level of automation from the distributed ML and AI architecture at the network edge.

  • Application-based traffic navigation and aggregation across heterogeneous access networks.

  • ML/AI as a service offering for end users.
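As a minimal illustration of the traffic-forecasting item above, the sketch below fits an ordinary least-squares trend to hypothetical hourly load figures and extrapolates the next interval; a deployed system would use far richer models and features.

```python
# Simple traffic-forecasting sketch: least-squares trend over hourly load,
# extrapolated one interval ahead. Load figures are hypothetical.
hourly_load_gbps = [1.0, 1.2, 1.5, 1.9, 2.4]   # observed history

n = len(hourly_load_gbps)
xs = list(range(n))
mean_x = sum(xs) / n
mean_y = sum(hourly_load_gbps) / n

# ordinary least-squares slope and intercept
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, hourly_load_gbps))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

forecast = intercept + slope * n   # predicted load for the next hour
print(round(forecast, 2))          # ~2.65 Gbps
```

An operator could use such a forecast to pre-provision capacity or trigger network-slice scaling before the peak actually arrives.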

2.7.2. The Role and Integration Method of AI and Different ML Approaches as Core Parts of AI in the Next Generation of Mobile Networks

Artificial neural networks (ANN), also known as neural networks (NN), are popular machine learning models inspired by biological processes in the brain. The first neural network algorithm was the perceptron, developed by Rosenblatt in 1958 [55], inspired by McCulloch and Pitts’ mathematical model of neurons in the human brain. In the following decades, many types and architectures of neural networks were proposed, along with algorithms for effective training; this is how the term deep learning entered the machine learning community. Deep learning is a subfield of machine learning that focuses on training multilayer (deep) neural networks that can learn data representations. Its success rests on its excellence in image classification, speech recognition, and language processing, where deep learning techniques have been documented to outperform previous results. However, deep learning, and most machine learning techniques, have some limitations [56]. The first is the amount of labeled knowledge needed during training to perform human-like tasks successfully; the computing power required to train deep learning models on big data is another.
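Rosenblatt's perceptron is small enough to sketch in full; the version below learns the AND function, with the learning rate and epoch count as illustrative choices.

```python
# Rosenblatt-style perceptron trained on the AND function.
# Learning rate and epoch count are illustrative choices.
def train_perceptron(samples, epochs=10, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out           # perceptron update rule
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

print([predict(x1, x2) for (x1, x2), _ in AND])  # learns [0, 0, 0, 1]
```

A single perceptron can only separate linearly separable classes (it famously cannot learn XOR); stacking many such units into multilayer networks removes that limit, which is the step that leads to deep learning.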

The categories of machine learning

ML is a complex landscape. It can be divided into three classical categories based on the training strategy, outlined below.

  • Supervised Learning: Learning with a labeled training set. It includes classification and regression tasks.

  • Unsupervised Learning: Learning from unlabeled data; the model discovers patterns on its own. Clustering is the most widely used unsupervised learning task.

  • Reinforcement Learning: Training a model through a sequence of actions toward a particular outcome, where the system is rewarded for good performance and penalized for poor performance directly by its environment. This kind of learning is used in robotics and games [57].
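Of the three categories, reinforcement learning is perhaps the least familiar; the sketch below shows a minimal epsilon-greedy agent that learns, purely from rewards, which of two hypothetical radio channels succeeds more often. The success rates, exploration rate, and step count are all illustrative.

```python
# Epsilon-greedy bandit sketch of reinforcement learning: the agent learns
# from rewards alone which channel is better. All parameters hypothetical.
import random

random.seed(0)
true_success = {"channel_a": 0.3, "channel_b": 0.8}  # unknown to the agent
value = {"channel_a": 1.0, "channel_b": 1.0}         # optimistic initial estimates
counts = {"channel_a": 0, "channel_b": 0}

for step in range(2000):
    if random.random() < 0.1:                         # explore occasionally
        arm = random.choice(list(value))
    else:                                             # otherwise exploit best estimate
        arm = max(value, key=value.get)
    reward = 1.0 if random.random() < true_success[arm] else 0.0
    counts[arm] += 1
    value[arm] += (reward - value[arm]) / counts[arm]  # incremental mean

best = max(value, key=value.get)
print(best)  # with these rates the agent settles on channel_b
```

No labels were ever provided: the agent's only feedback was the reward signal, which is exactly what distinguishes reinforcement learning from the supervised and unsupervised categories above.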

Globally, the countries that would be termed early adopters of innovation are leading in AI, devising different strategies for their investment in and deployment of AI.

The world has witnessed artificial intelligence (AI) emerge as a transformative force reshaping industries, economies, and societies. Countries around the world have allocated significant resources to AI research, development, and implementation, through diverse strategies, priorities, and visions for the future. Figure 1 shows how much venture capital countries have continued to inject into AI.

Results were calculated based on data collected from the Global Artificial Intelligence Index ranking, Mirae Asset’s Global Research, the OECD, and the World Bank, showing an increase in AI investment by country.

Figure 2 shows venture capital investment in AI for seven countries, taking their GDP into account, which underscores how much countries continue to commit to spending on AI.

Figure 1. AI venture capital investment by country.

Figure 2. Venture capital investments in AI against country’s GDP from 2012 to 2022.

AI research is a global effort. As presented in Figure 2, the United States and China lead contributions to artificial intelligence. The field has attracted great attention: countries around the world are experimenting with the technology, exploring new trends, and attracting private investors. Stanford University’s “2023 Artificial Intelligence Report” estimates that global private investment in artificial intelligence will reach US$91.9 billion in 2025, yet this is just the tip of the iceberg: Goldman Sachs predicts that global investment in artificial intelligence will reach US$110.2 billion in 2025 and US$158.4 billion in 2026 (48% of all investments). Companies like OpenAI, Anthropic, and Inflection AI are among the most sought-after vendors.

The dataset “AI Global Index” includes the Global AI Index and seven indicators affecting the index for 62 countries, as well as general information about the countries (region, cluster, income group, and political regime).

The index benchmarks nations on their level of investment, innovation and implementation of artificial intelligence.

Figure 3 demonstrates the scores for the various indicators taken into account to produce the graphical representation. That is, it shows that the total score is highly correlated with the Research (0.95), Development (0.87), Talent (0.86), and Commercial (0.86) variables. It displays medium correlations with the Government Strategy and Operating Environment variables (that is, the contribution of these variables to the total score is less than that of the highly correlated variables).

Figure 3. AI global indicators index.

The results indicate that research, development, talent, and commercial activity contribute the most to a nation's overall AI index score, while government strategy and operating environment have less impact. The two leading countries (the United States and China) outperform the rest across the indicators measured; both have developed many ways to create and produce AI talent, technology, and ecosystems, and to project (even compete over) AI in the world arena.
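The correlation figures quoted above can be reproduced in outline. The sketch below computes the Pearson correlation between a total score and each individual indicator in plain Python; the per-country scores are hypothetical placeholders for illustration, not the published index data.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-country indicator scores (five countries per indicator);
# these are illustrative numbers, not the data behind Figure 3.
indicators = {
    "Research":            [100, 72, 38, 25, 18],
    "Development":         [100, 80, 30, 22, 15],
    "Talent":              [100, 65, 45, 30, 20],
    "Commercial":          [100, 70, 25, 18, 10],
    "Government Strategy": [60, 95, 80, 70, 85],
}

# Total score as an unweighted mean across indicators for each country
# (the real index applies its own weighting, not reproduced here).
n_countries = 5
totals = [sum(col[i] for col in indicators.values()) / len(indicators)
          for i in range(n_countries)]

for name, scores in indicators.items():
    print(f"{name}: r = {pearson(scores, totals):.2f}")
```

With data of this shape, indicators that track the total score (Research, Development, Talent, Commercial) yield correlations near 1, while an indicator that moves independently (Government Strategy) yields a weak one, mirroring the pattern reported for Figure 3.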

Talent, Infrastructure, and Operating Environment are the factors in the Implementation group of indicators, which represents the application of artificial intelligence by practitioners across sectors such as business, government, and the community. As presented in Figure 3, the Talent indicator focuses on the availability of technical experts to deliver AI solutions, while Infrastructure focuses on the reliability and scale of access to enabling systems, from energy and the internet to supercomputing power; Operating Environment, in turn, concerns the conditions under which AI solutions are delivered. Research and Development are the factors in the Innovation group of indicators, which captures technological and process progress and signals the potential for further development and improvement of skills. The Research indicator focuses on the extent of specialist research and researchers, examining the number of publications and citations in credible academic journals, while the Development indicator focuses on the fundamental platforms and algorithms upon which innovative artificial intelligence projects rely. Government Strategy and Commercial are the factors in the Investment group of indicators, which mirrors financial and procedural commitments to artificial intelligence. The Government Strategy indicator concentrates on the depth of commitment from national governments, examining spending commitments and national strategies, while the Commercial indicator dwells on the level of startup activity, investment, and business initiatives based on artificial intelligence.
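To make the grouping concrete, the following sketch aggregates the seven indicators into the three groups described above and combines them into a single composite score. The group weights and country scores are illustrative assumptions only, not the index's published methodology.

```python
# The seven indicators grouped into the three pillars described in the text.
GROUPS = {
    "Implementation": ["Talent", "Infrastructure", "Operating Environment"],
    "Innovation":     ["Research", "Development"],
    "Investment":     ["Government Strategy", "Commercial"],
}

def composite_score(scores: dict, weights: dict) -> float:
    """Weighted mean of group averages; indicator scores assumed 0-100.

    The weights here are hypothetical; the real index defines its own.
    """
    total = 0.0
    for group, members in GROUPS.items():
        group_avg = sum(scores[m] for m in members) / len(members)
        total += weights[group] * group_avg
    return total / sum(weights.values())

# Illustrative usage with made-up scores for one country.
scores = {"Talent": 70, "Infrastructure": 80, "Operating Environment": 90,
          "Research": 60, "Development": 55,
          "Government Strategy": 85, "Commercial": 50}
weights = {"Implementation": 0.4, "Innovation": 0.4, "Investment": 0.2}
print(round(composite_score(scores, weights), 1))
```

Normalizing by the weight sum keeps the composite on the same 0-100 scale as the indicators regardless of how the three groups are weighted.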

3. Challenges of Artificial Intelligence

Although the adoption of artificial intelligence is increasing, it is not yet integrated into the business value chain at the scale it should be. In addition, because enterprise adoption of AI for decision-making is still in its infancy, valuing AI initiatives is difficult and cost-effectiveness assessments lose reliability. After years of speculation, legitimate concerns about human progress and the potential social disruption of artificial intelligence, and black-box problems, cautious investors remain wary of putting money into the technology [58].

Automation makes it difficult to identify the causes of errors and malfunctions. As automated systems become increasingly complex and machines take on more tasks, people who cannot learn and understand how these systems work are left with little or no control over them. Moreover, AI is built on science, technology, and algorithms that most people know little about, which makes it difficult for them to trust it.

Scope and limits present further challenges: AI is not invincible and, like other technologies, can replace only certain functions; it has its limitations and cannot replace them all. This displacement, however, also leads to the emergence of new jobs with different skill requirements. With the rise of artificial intelligence, there is a shortage of experts in the sector who can meet its technical requirements, so business owners need to train their professionals to benefit from the technology. Data scarcity is a related problem: the power and efficiency of AI and its applications depend directly on the accuracy and completeness of the data recorded and used for training, and good data is often unavailable. Although transfer learning, active learning, deep learning, and unsupervised learning efforts are underway to develop models that can learn in the absence of good labeled data, the problem persists [59].

Algorithmic bias: artificial intelligence consists of data and algorithms, and the accuracy of intelligent decision-making depends on whether the system is trained on real and unbiased information. If the materials used for training are biased with respect to race, gender, community, or ethnicity, important decisions may be made unfairly and unethically. This bias only worsens as more AI systems continue to learn from biased data.

4. Conclusion

Although AI is broad and its trajectory uncertain, it remains the subject of public debate, opposition, and political division. Today, many countries are stepping up efforts to publish national AI strategies and are revising practices implemented in previous years. Standards are high, and government and operating environments support skills development. In practice, however, the picture varies: in many countries AI talent is insufficient, and innovation and development lag behind; although these countries do not face serious competition from others, many still have great potential in the field of AI. Investment, research, and development in artificial intelligence have increased in recent years, and many countries treat it as a priority. With the advancement of AI technologies such as deep learning, reinforcement learning, and natural language processing, alongside computer network technology, the potential applications of artificial intelligence are vast and continually expanding. Artificial intelligence will therefore continue to develop and affect our future in many ways.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

References

[1] Alsedrah, M.K. (2017) The American University of the Middle East “Artificial Intelligence”.
https://doi.org/10.13140/RG.2.2.18789.65769
[2] Feng, W.H. (2017) Discussion on the Application of Artificial Intelligence in Computer Network Technology in the Era of Big Data. Electronic Production, No. 13, 98-99.
[3] Pan, Y. (2018) Application Analysis of Artificial Intelligence in Computer Network Technology. Computer Fan, 12, 22-24. (In Chinese)
[4] Simeone, O. (2018) A Very Brief Introduction to Machine Learning with Applications to Communication Systems. IEEE Transactions on Cognitive Communications and Networking, 4, 648-664.
https://doi.org/10.1109/tccn.2018.2881442
[5] Calvanese Strinati, E., Barbarossa, S., Gonzalez-Jimenez, J.L., Ktenas, D., Cassiau, N., Maret, L., et al. (2019) 6G: The Next Frontier: From Holographic Messaging to Artificial Intelligence Using Subterahertz and Visible Light Communication. IEEE Vehicular Technology Magazine, 14, 42-50.
https://doi.org/10.1109/mvt.2019.2921162
[6] Jagannath, J., Polosky, N., Jagannath, A., Restuccia, F. and Melodia, T. (2019) Machine Learning for Wireless Communications in the Internet of Things: A Comprehensive Survey. Ad Hoc Networks, 93, Article ID: 101913.
https://doi.org/10.1016/j.adhoc.2019.101913
[7] Kumar, G.P. and Venkataram, P. (1997) Artificial Intelligence Approaches to Network Management: Recent Advances and a Survey. Computer Communications, 20, 1313-1322.
https://doi.org/10.1016/s0140-3664(97)00094-7
[8] Zou, Y., Zhu, J., Wang, X. and Hanzo, L. (2016) A Survey on Wireless Security: Technical Challenges, Recent Advances, and Future Trends. Proceedings of the IEEE, 104, 1727-1765.
https://doi.org/10.1109/jproc.2016.2558521
[9] Alsamhi, S.H., Ma, O. and Ansari, M.S. (2019) Survey on Artificial Intelligence Based Techniques for Emerging Robotic Communication. Telecommunication Systems, 72, 483-503.
https://doi.org/10.1007/s11235-019-00561-z
[10] Misilmani, H.M.E. and Naous, T. (2019) Machine Learning in Antenna Design: An Overview on Machine Learning Concept and Algorithms. 2019 International Conference on High Performance Computing & Simulation (HPCS), Dublin, 15-19 July 2019, 600-607.
https://doi.org/10.1109/hpcs48598.2019.9188224
[11] Lahmeri, M.A., Kishk, M.A. and Alouini, M.S. (2020) Machine Learning for UAV-Based Networks. arXiv: 2009.11522.
[12] Vazquez-Nicolas, J.M., Zamora, E., Gonzalez-Hernandez, I., Lozano, R. and Sossa, H. (2018) Towards Automatic Inspection: Crack Recognition Based on Quadrotor UAV-Taken Images. 2018 International Conference on Unmanned Aircraft Systems (ICUAS), Dallas, 12-15 June 2018, 654-659.
https://doi.org/10.1109/icuas.2018.8453390
[13] Lin, M. and Zhao, Y. (2020) Artificial Intelligence-Empowered Resource Management for Future Wireless Communications: A Survey. China Communications, 17, 58-77.
https://doi.org/10.23919/jcc.2020.03.006
[14] Saeed, N., Elzanaty, A., Almorad, H., Dahrouj, H., Al-Naffouri, T.Y. and Alouini, M. (2020) Cubesat Communications: Recent Advances and Future Challenges. IEEE Communications Surveys & Tutorials, 22, 1839-1862.
https://doi.org/10.1109/comst.2020.2990499
[15] Kato, N., Fadlullah, Z.M., Tang, F., Mao, B., Tani, S., Okamura, A., et al. (2019) Optimizing Space-Air-Ground Integrated Networks by Artificial Intelligence. IEEE Wireless Communications, 26, 140-147.
https://doi.org/10.1109/mwc.2018.1800365
[16] Kothari, V., Liberis, E. and Lane, N.D. (2020) The Final Frontier: Deep Learning in Space. Proceedings of the 21st International Workshop on Mobile Computing Systems and Applications, Austin, 3 March 2020, 45-49.
https://doi.org/10.1145/3376897.3377864
[17] Carrillo, F.A.G. (2012) Can Technology Replace the Teacher in the Pedagogical Relationship with the Student? Procedia - Social and Behavioral Sciences, 46, 5646-5655.
https://doi.org/10.1016/j.sbspro.2012.06.490
[18] Lai, L., Chang, R. and Kouh, J. (2008) Detecting Network Intrusions Using Signal Processing with Query-Based Sampling Filter. EURASIP Journal on Advances in Signal Processing, 2009, Article No. 735283.
https://doi.org/10.1155/2009/735283
[19] Luo, D. and Zhou, C.a.X. (2020) A Brief Discussion about Application of Artificial Intelligence in Computer Network Technology in the Era of Big Data. Journal of Physics: Conference Series, 1684, 012001.
https://doi.org/10.1088/1742-6596/1684/1/012001
[20] Liu, S.X. (2016) Information Application Development of Artificial Intelligence in Computer Network Technology in the Big Data Era. China's Strategic Emerging Industries, 10, 96-98. (In Chinese)
[21] Peng, H. (2018) Application of Artificial Intelligence in Computer Network Technology in the Big Data Era. Electronic Technology and Software Engineering, 14, 169-172. (In Chinese)
[22] Li, Z.G. (2019) Research on the Big Data Era and Artificial Intelligence Application in Computer Network Technology. Communications World, 26, 110-111.
[23] Banupriya, D., Preetha, P.S. and Prathisha, R.R. (2018) A Study on Use of Artificial Intelligence in Wireless Communications. Asian Journal of Applied Science and Technology, 2, 354-360.
[24] Kibria, M.G., Nguyen, K., Villardi, G.P., Zhao, O., Ishizu, K. and Kojima, F. (2018) Big Data Analytics, Machine Learning, and Artificial Intelligence in Next-Generation Wireless Networks. IEEE Access, 6, 32328-32338.
https://doi.org/10.1109/access.2018.2837692
[25] Zhang, C., Patras, P. and Haddadi, H. (2019) Deep Learning in Mobile and Wireless Networking: A Survey. IEEE Communications Surveys & Tutorials, 21, 2224-2287.
https://doi.org/10.1109/comst.2019.2904897
[26] Ge, X., Thompson, J., Li, Y., Liu, X., Zhang, W. and Chen, T. (2019) Applications of Artificial Intelligence in Wireless Communications. IEEE Communications Magazine, 57, 12-13.
https://doi.org/10.1109/mcom.2019.8663984
[27] Mao, Q., Hu, F. and Hao, Q. (2018) Deep Learning for Intelligent Wireless Networks: A Comprehensive Survey. IEEE Communications Surveys & Tutorials, 20, 2595-2621.
https://doi.org/10.1109/comst.2018.2846401
[28] Morocho Cayamcela, M.E. and Lim, W. (2018) Artificial Intelligence in 5G Technology: A Survey. 2018 International Conference on Information and Communication Technology Convergence (ICTC), Jeju, 17-19 October 2018, 860-865.
https://doi.org/10.1109/ictc.2018.8539642
[29] Wang, Y., Narasimha, M. and Heath, R.W. (2018) Mmwave Beam Prediction with Situational Awareness: A Machine Learning Approach. 2018 IEEE 19th International Workshop on Signal Processing Advances in Wireless Communications (SPAWC), Kalamata, 25-28 June 2018, 1-5.
https://doi.org/10.1109/spawc.2018.8445969
[30] Maral, G., Bousquet, M. and Sun, Z. (2020) Introduction. In: Maral, G., Bousquet, M., Sun, Z.L., Eds., Satellite Communications Systems: Systems, Techniques and Technology, 6th Edition, Wiley, 3-11.
https://doi.org/10.1002/9781119673811
[31] Chowdhury, P., Atiquzzaman, M. and Ivancic, W. (2006) Handover Schemes in Satellite Networks: State-Of-The-Art and Future Research Directions. IEEE Communications Surveys & Tutorials, 8, 2-14.
https://doi.org/10.1109/comst.2006.283818
[32] Arapoglou, P., Liolis, K., Bertinelli, M., Panagopoulos, A., Cottis, P. and De Gaudenzi, R. (2011) MIMO over Satellite: A Review. IEEE Communications Surveys & Tutorials, 13, 27-51.
https://doi.org/10.1109/surv.2011.033110.00072
[33] Chini, P., Giambene, G. and Kota, S. (2009) A Survey on Mobile Satellite Systems. International Journal of Satellite Communications and Networking, 28, 29-57.
https://doi.org/10.1002/sat.941
[34] Radhakrishnan, R., Edmonson, W.W., Afghah, F., Rodriguez-Osorio, R.M., Pinto, F. and Burleigh, S.C. (2016) Survey of Inter-Satellite Communication for Small Satellite Systems: Physical Layer to Network Layer View. IEEE Communications Surveys & Tutorials, 18, 2442-2473.
https://doi.org/10.1109/comst.2016.2564990
[35] Niephaus, C., Kretschmer, M. and Ghinea, G. (2016) Qos Provisioning in Converged Satellite and Terrestrial Networks: A Survey of the State-of-the-Art. IEEE Communications Surveys & Tutorials, 18, 2415-2441.
https://doi.org/10.1109/comst.2016.2561078
[36] De Sanctis, M., Cianca, E., Araniti, G., Bisio, I. and Prasad, R. (2016) Satellite Communications Supporting Internet of Remote Things. IEEE Internet of Things Journal, 3, 113-123.
https://doi.org/10.1109/jiot.2015.2487046
[37] Lutz, E., Werner, M. and Jahn, A. (2012) Satellite Systems for Personal and Broadband Communications. Springer Science & Business Media.
[38] Kaushal, H. and Kaddoum, G. (2017) Optical Communication in Space: Challenges and Mitigation Techniques. IEEE Communications Surveys & Tutorials, 19, 57-96.
https://doi.org/10.1109/comst.2016.2603518
[39] Liu, J., Shi, Y., Fadlullah, Z.M. and Kato, N. (2018) Space-Air-Ground Integrated Network: A Survey. IEEE Communications Surveys & Tutorials, 20, 2714-2741.
https://doi.org/10.1109/comst.2018.2841996
[40] Burleigh, S.C., De Cola, T., Morosi, S., Jayousi, S., Cianca, E. and Fuchs, C. (2019) From Connectivity to Advanced Internet Services: A Comprehensive Review of Small Satellites Communications and Networks. Wireless Communications and Mobile Computing, 2019, Article ID: 6243505.
https://doi.org/10.1155/2019/6243505
[41] Li, B., Fei, Z., Zhou, C. and Zhang, Y. (2020) Physical-layer Security in Space Information Networks: A Survey. IEEE Internet of Things Journal, 7, 33-52.
https://doi.org/10.1109/jiot.2019.2943900
[42] Rinaldi, F., Maattanen, H., Torsner, J., Pizzi, S., Andreev, S., Iera, A., et al. (2020) Non-terrestrial Networks in 5G & Beyond: A Survey. IEEE Access, 8, 165178-165200.
https://doi.org/10.1109/access.2020.3022981
[43] Ghosh, A.K., Michael, C. and Schatz, M. (2000) A Real-Time Intrusion Detection System Based on Learning Program Behavior. In: Debar, H., Mé, L., Wu, S.F., Eds., Lecture Notes in Computer Science, Springer, 93-109.
https://doi.org/10.1007/3-540-39945-3_7
[44] Hosseini, R., Qanadli, S.D., Barman, S., Mazinani, M., Ellis, T. and Dehmeshki, J. (2012) An Automatic Approach for Learning and Tuning Gaussian Interval Type-2 Fuzzy Membership Functions Applied to Lung CAD Classification System. IEEE Transactions on Fuzzy Systems, 20, 224-234.
https://doi.org/10.1109/tfuzz.2011.2172616
[45] Alazab, M., Soman, K.P., Srinivasan, S., Venkatraman, S. and Pham, V.Q. (2023) Deep Learning for Cyber Security Applications: A Comprehensive Survey. Authorea Preprints.
[46] Wang, J., Zhao, L., Jiang, P. and Cui, J. (2020) A Survey of Machine Learning Techniques for Predictive Maintenance of Network Equipment. Journal of Ambient Intelligence and Humanized Computing, 11, 1845-1859.
[47] Lambert II, G.M. (2017) Security Analytics: Using Deep Learning to Detect Cyber-Attacks. Master’s Thesis, University of North Florida.
[48] Corbett, A.T. and Anderson, J.R. (1995) Knowledge Tracing: Modeling the Acquisition of Procedural Knowledge. User Modelling and User-Adapted Interaction, 4, 253-278.
https://doi.org/10.1007/bf01099821
[49] Li, R., Zhao, Z., Zhou, X., Ding, G., Chen, Y., Wang, Z., et al. (2017) Intelligent 5G: When Cellular Networks Meet Artificial Intelligence. IEEE Wireless Communications, 24, 175-183.
https://doi.org/10.1109/mwc.2017.1600304wc
[50] Indika (2011) Difference Between Strong AI and Weak AI.
http://www.differencebetween.com/difference-between-strong-ai-and-vs-weak-ai/
[51] Song, W., Zeng, F., Hu, J., Wang, Z. and Mao, X. (2017) An Unsupervised-Learning-Based Method for Multi-Hop Wireless Broadcast Relay Selection in Urban Vehicular Networks. 2017 IEEE 85th Vehicular Technology Conference (VTC Spring), Sydney, 4-7 June 2017, 1-5.
https://doi.org/10.1109/vtcspring.2017.8108458
[52] Nilsson, N.J. (2005) American Association for Artificial Intelligence. AI Magazine.
[53] Cowen, T. and Dawson, M. (2009) What Does the Turing Test Really Mean? And How Many Human Beings (Including Turing) Could Pass? George Mason University Department of Economics, and University of Montreal.
[54] Abraham, A. (2005) Artificial Neural Networks. Handbook of Measuring System Design.
[55] Zhou, Z. (2004) Rule Extraction: Using Neural Networks or for Neural Networks? Journal of Computer Science and Technology, 19, 249-253.
https://doi.org/10.1007/bf02944803
[56] Zohuri, B. (2020) Deep Learning Limitations and Flaws. Modern Approaches on Material Science, 2, 241-250.
https://doi.org/10.32474/mams.2020.02.000138
[57] Jdid, B., Hassan, K., Dayoub, I., Lim, W.H. and Mokayef, M. (2021) Machine Learning Based Automatic Modulation Recognition for Wireless Communications: A Comprehensive Survey. IEEE Access, 9, 57851-57873.
https://doi.org/10.1109/access.2021.3071801
[58] Rajendran, S., Meert, W., Giustiniano, D., Lenders, V. and Pollin, S. (2018) Deep Learning Models for Wireless Signal Classification with Distributed Low-Cost Spectrum Sensors. IEEE Transactions on Cognitive Communications and Networking, 4, 433-445.
https://doi.org/10.1109/tccn.2018.2835460
[59] Zhuang, Y., Wu, F., Chen, C. and Pan, Y. (2017) Challenges and Opportunities: From Big Data to Knowledge in AI 2.0. Frontiers of Information Technology & Electronic Engineering, 18, 3-14.
https://doi.org/10.1631/fitee.1601883
[60] Guo, Z.M. (2018) Application Analysis of Artificial Intelligence in Computer Network Technology. Computer Fan, 10, 23-26. (In Chinese)

Copyright © 2025 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.