An Overview of the Application of Big Data in Supply Chain Management and Adaptation in Nigeria
1. Introduction
Globalization and the tremendous growth of Information and Communication Technology (ICT) have transformed the world into what is now called a global village. The advent of modern high-powered computers with storage capacities in the sextillions of bytes has prompted discussion of the Big Data concept as a tool for managing the multiplier effects of complex human systems. Big Data refers to huge arrays of convoluted data sets acquired from myriad sources. The most advanced earlier data-processing methods now appear to be mere subsets of Big Data techniques such as data mining, machine learning, data fusion, social network analysis, and the trending artificial intelligence (AI) [1]. Many business enterprises have used Big Data techniques to understand customer behaviour more accurately, optimize product pricing, and improve operational efficiency, thereby reducing personnel costs [2] [3].
In the social network industry, Big Data analytics has assisted government authorities in managing insecurity by analysing the origins and destinations of calls and text messages of network users. Companies such as Facebook, Twitter, and LinkedIn use it to understand their users' current behaviour in relation to particular products. Doctors in the health industry use it to analyse pathogenic characteristics and their spatial distribution nationally at a given point in time. Major public service providers can also use it to optimize service delivery and implement tariff discrimination policies for Smart-Grid adaptation. Its application in Intelligent Transportation Systems (ITS) cannot be over-emphasized, as it helps in assessing road users' responses to policy implementation and in decision-making [4] [5]. In the supply chain industry, it has become easy to predict the probability of accident occurrence and to determine, in real time with satellite-based sensors, where and when a search and rescue (SAR) operation is needed. It is also very effective in infrastructural asset management: Big Data analysis of asset records can identify, for instance, pavement degradation or ballast ageing and decay. This enables real-time maintenance decision-making that favours proactive over reactive maintenance systems in organizations [6].
Big Data technical know-how can be viewed as a set of software-based tools for assembling, processing, analysing, and extracting information from extremely complicated, large data collections that exceed the capacity of earlier management tools. It can also be seen as a means of collecting huge volumes of data that grow exponentially with time. Big Data can generally be classified into three groups: a) structured data, b) unstructured data, and c) semi-structured data. Specific tools and software are used to store and process these complex data, including the following:
Integrate.io
Atlas.ti
Analytics
Microsoft HDInsight
Skytree
Talend
Splice Machine
Spark
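As an illustration of the first and third data classes above, the sketch below contrasts parsing structured records (a fixed CSV schema) with semi-structured records (self-describing JSON) using only the Python standard library. The order fields and values are hypothetical, not drawn from any of the listed products.

```python
import csv
import io
import json

# Structured data: a fixed schema, e.g. rows from a CSV extract.
structured = io.StringIO("order_id,region,amount\n1001,Lagos,250\n1002,Abuja,410\n")
rows = list(csv.DictReader(structured))

# Semi-structured data: self-describing but with a flexible schema (JSON).
semi = '{"order_id": 1003, "region": "Kano", "items": [{"sku": "A1", "qty": 2}]}'
record = json.loads(semi)

# The structured rows support direct aggregation; the JSON record can nest freely.
total = sum(int(r["amount"]) for r in rows)
print(total)             # combined value of the structured orders
print(record["region"])  # field pulled from the semi-structured record
```

Unstructured data (free text, images, video) has no such fixed or self-describing schema, which is why the heavier tools listed above are needed for it.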
Supply Chain Management (SCM) data sources are multivariate, depending on the scope of operation. Web-based real-time local traffic flow data across a metropolis on an hourly basis is a major data source in SCM for planning delivery routing, as in Figure 1. In analysing this type of data, the ontology method may be applied, in which the types, properties, and interrelationships of the feature variables in a particular domain are described. According to Li Zhu, ontology approaches effectively capture domain semantics and their related inferences through a top-down data-integration feature. Ontology models disseminate semantic views of data while mapping multisource assorted data and minimizing the ambiguities embedded in the shared data. For instance, Fernandez and Ito (2015) reported a driver behaviour model using an ontology that captured drivers' characteristics, perception, and intellectual ability to respond to diverse driving tasks for intelligent transport systems [7].
Figure 1. Architectural flow chart of big data analysis in SCM. Source: adapted after Li Zhu et al. (2018).
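The ontology idea described above can be sketched in plain Python: a toy class hierarchy with a simple subclass-inference rule, standing in for what a full ontology language such as OWL would provide. All class and instance names here are hypothetical.

```python
# A toy ontology: classes, a subclass hierarchy, and instance assertions.
subclass_of = {
    "Truck": "Vehicle",
    "DeliveryVan": "Vehicle",
    "Vehicle": "MobileAsset",
}

def is_a(cls, ancestor):
    """Infer class membership by walking up the subclass hierarchy."""
    while cls is not None:
        if cls == ancestor:
            return True
        cls = subclass_of.get(cls)
    return False

# Instances tagged with their most specific class.
instances = {"TRK-07": "Truck", "VAN-12": "DeliveryVan"}

# Top-down inference: every Truck is also a MobileAsset via the hierarchy,
# so heterogeneous fleet records can be integrated under one shared concept.
mobile = [i for i, c in instances.items() if is_a(c, "MobileAsset")]
print(sorted(mobile))
```

A real ontology model would add properties and axioms, but the same principle applies: shared semantics let multisource data be merged without ambiguity.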
From earlier researchers' work, it is obvious that the key challenges in Big Data usage are determining the proportions of structured, unstructured, and semi-structured data to be merged in a given project and building the human capacity to apply the appropriate tools; these challenges seem to have been largely overcome in recent times. The general principles guiding the selection of Big Data tools are the licence fee, where applicable, and the human capacity in a given organization. This paper therefore aims to elicit the potential of Big Data as a tool for optimizing the benefits embedded in governmental and organizational databases and other cloud data that have hitherto lain fallow in Nigeria. The operational flow chart of Big Data is as given in Figure 2.
Figure 2. Big-data workflow.
2. Methodological Approach
In addressing studies or reviews of Big Data application, particularly in supply chain management, there is a need to view the subject globally in relation to best practices. For the preparation of this paper, online secondary data were the major source of data collection. The Google search engine and dedicated supply chain sites were visited to collect relevant data. The Google search strategy used relied on web page titles, content, and keyword searches, indexed step by step, to surface the most recent and relevant ideas, which form the pivot of this paper. Opinions on the subject matter were also sourced from stakeholders by proxy to complement the secondary data. All these were appraised to elucidate the relevance of Big Data in SCM and the level of its adoption in Nigeria.
3. The Concepts of Big Data and Right Data
Gudivada et al. (2015) view the Big Data analysis concept as the agglomeration of huge, complex data that are beyond traditional database management tools [8], while Harford (2014) views it as 'found data' that goes beyond web searches, credit card payment systems, and pinging phone masts, as it generates multiple and diverse secondary data in the way transportation and energy studies require national census and survey data collections [9]. In its simplest form, Big Data can be seen as the agglomeration of social media platform metadata (the directional flow of media traffic, users' profile data, behavioural and value-system attachments, and biodata) that can be analysed for multi-purpose inferential deductions using high-powered digital tools. On Twitter or Instagram, for example, an individual may have thousands of followers displaying different characters in their responses to a posted subject. Also, many useful data embedded in census surveys, for instance, are not apparent at the onset, and the rapidly falling cost of data archiving has helped preserve them for later analysis.
Data scientists break Big Data into four dimensions, the 4 V's of Big Data (volume, velocity, variety, and veracity), as in Figure 3. In fact, according to Copernicus (the European system for monitoring the Earth), space-orbiting technology now generates unimaginably more data than in past decades: internet traffic in 2020 alone amounted to about 10 zettabytes (10,000 billion GB) per month, while the Large Hadron Collider (LHC) produces about 5 petabytes of scientific data per year, emerging as purchase histories, social graphs, tweets, blog posts, images, and video data. In Big Data software technology, the following are top technologies that are cost-effective and user-friendly for Big Data analysis:
1) Apache Hadoop, which has been handling about 75% of the world's data, is seen as the hub of every Big Data solution.
2) Apache Spark, a new-generation Big Data engine, has been adopted very quickly, as it can be up to 100 times faster than predecessors such as Hadoop MapReduce.
3) Apache Flink, also called the 4G of Big Data, is an open-source tool that handles both streaming and batch data.
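The MapReduce model that underlies Hadoop can be illustrated in miniature with plain Python: a map phase emitting key-value pairs, a shuffle that groups pairs by key, and a reduce phase that aggregates each group. This is a conceptual sketch, not Hadoop code, and the toy documents are hypothetical.

```python
from itertools import groupby
from operator import itemgetter

docs = ["big data in supply chain", "big data tools", "supply chain data"]

# Map: emit (word, 1) pairs from each document.
mapped = [(word, 1) for doc in docs for word in doc.split()]

# Shuffle: bring equal keys together, as the framework does between phases.
mapped.sort(key=itemgetter(0))

# Reduce: sum the counts for each word.
counts = {key: sum(v for _, v in grp)
          for key, grp in groupby(mapped, key=itemgetter(0))}
print(counts["data"], counts["chain"])
```

Hadoop distributes exactly these three phases across a cluster, which is what lets the same logic scale to petabytes.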
Figure 3. Big Data groupings. Source: (Michael W., 2012).
Right or quality data concerns the critical role data play in the inferences drawn from data processing for organizational decision-making. Data procurement and validation are key issues in data-intensive applications, as high-quality data enhance fast decision-making, organizational integrity, revenue growth, cost reduction, regulatory compliance, and customer satisfaction.
Right-data refers to the fitness of data for a particular purpose at a given point in time, which addresses consistency, completeness, non-duplication, accuracy, and timeliness. Data quality management therefore focuses on the principles of data quality improvement and control, which is demanding as it deals with varied data sources and usage. Moreover, the assessment is domain-specific, subjective, and involves considerable human handling. Data quality is studied in numerous other domains, including cyber-physical systems [10], citizen science [11], accounting information systems [12], smart cities [13], data integration [14] [15], scientific workflows [16], customer databases [17], big data management systems [18], and the Internet of Things (IoT) [19]. These areas are evoking new investigations into data quality research techniques for the enhancement of existing and evolving data sources [20]-[22].
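A minimal sketch of such right-data checks, assuming a toy list of records and an illustrative staleness threshold, might test non-duplication, completeness, and timeliness as follows:

```python
from datetime import date

# Hypothetical inventory records; record 3 duplicates record 1,
# and record 2 is both incomplete (empty region) and stale.
records = [
    {"id": 1, "region": "Lagos", "qty": 40, "updated": date(2023, 5, 2)},
    {"id": 2, "region": "", "qty": 15, "updated": date(2021, 1, 9)},
    {"id": 1, "region": "Lagos", "qty": 40, "updated": date(2023, 5, 2)},
]

def quality_report(rows, today=date(2023, 6, 1), max_age_days=365):
    """Count violations of three right-data dimensions."""
    seen, duplicates, incomplete, stale = set(), 0, 0, 0
    for r in rows:
        if r["id"] in seen:          # non-duplication check
            duplicates += 1
        seen.add(r["id"])
        if not all(str(v).strip() for v in r.values()):
            incomplete += 1          # completeness check
        if (today - r["updated"]).days > max_age_days:
            stale += 1               # timeliness check
    return {"duplicates": duplicates, "incomplete": incomplete, "stale": stale}

print(quality_report(records))
```

Accuracy and consistency checks would need reference data to compare against, which is exactly why the text calls quality assessment domain-specific.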
Generally, SCM-specific surveys are not without challenges in terms of detail and consistency with existing databases. For instance, a three-year survey of automobiles and their total petrol usage in Australia did not tally with national petrol sales data, creating reservations about the travel data reported. Also, in the US, there were survey data comparability problems due to issues such as sampling and non-sampling errors [23].
3.1. Principle of Supply Chain
A supply chain usually takes the form of a flow from raw materials to intermediate processors, to finished-product manufacturers, to warehousing, to wholesalers/retailers, and finally to consumers, made possible by the integration of information, planning, and transportation systems, as illustrated in Figure 4. SCM is the efficient total integration of suppliers, factories, warehouses, and transportation systems that ensures the delivery of products and services to the right location, at the right time, and in the right quantity, while optimizing total cost and customer satisfaction.
Figure 4. Supply chain network. Source: after Ladimer (2013).
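The routing side of such a network can be illustrated with a shortest-path computation. The sketch below applies Dijkstra's algorithm to a hypothetical five-node chain, with edge weights standing in for transport costs between stages.

```python
import heapq

# Hypothetical network; edge weights approximate transport cost.
network = {
    "supplier": {"factory": 4},
    "factory": {"warehouse_A": 3, "warehouse_B": 6},
    "warehouse_A": {"retailer": 5},
    "warehouse_B": {"retailer": 1},
    "retailer": {},
}

def cheapest_route(graph, start, goal):
    """Dijkstra's algorithm: pop the cheapest frontier node until goal."""
    queue = [(0, start, [start])]
    settled = {}
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if settled.get(node, float("inf")) <= cost:
            continue  # already reached this node more cheaply
        settled[node] = cost
        for nxt, weight in graph[node].items():
            heapq.heappush(queue, (cost + weight, nxt, path + [nxt]))
    return None

print(cheapest_route(network, "supplier", "retailer"))
```

Note that the cheaper leg to warehouse_A loses overall: total cost, not any single link, decides the route, which is the "total integration" point made above.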
3.2. Supply Chain Principles
Work on SCM principles can be traced to David, Frank, and Donavon (1997), titled "The Seven Principles of Supply Chain Management", written when SCM was relatively new. The work is still adjudged relevant to the current business environment, as discussed below:
1) Profitably adapt the supply chain to the needs of specifically segmented customers based on the peculiar service demands.
2) Consciously tailor the logistics network to favour the service demands and profitability of clustered customers.
3) Align the supply chain to the market indicators while still ensuring optimal resource allocation and consistent forecast.
4) Expose customers to differentiated products and respond to supply chain demand dynamics.
5) Strategically optimize sources of supply so as to reduce the total cost of vertical integration.
6) Employ supply chain technology best practices strategically to enhance multi-level decision-making on the information flow of goods/services, and
7) Employ channel-spanning performance measures to evaluate collective success in meeting customers' needs, similar to the work of Anderson et al., which projected the use of Activity-Based Costing (ABC) to determine customer profitability.
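The ABC idea in principle 7 can be sketched numerically: charge each customer for the activities it actually consumes, then subtract that cost-to-serve from revenue. The activity rates and customer figures below are entirely hypothetical.

```python
# Hypothetical cost per unit of each activity driver (ABC rates).
activity_rates = {"order_handling": 12.0, "delivery": 30.0, "returns": 18.0}

# Hypothetical annual revenue and activity consumption per customer.
customers = {
    "Customer A": {"revenue": 5000, "order_handling": 40, "delivery": 25, "returns": 2},
    "Customer B": {"revenue": 5200, "order_handling": 90, "delivery": 60, "returns": 20},
}

def customer_profit(c):
    """Revenue minus activity-based cost-to-serve."""
    served_cost = sum(activity_rates[a] * c[a] for a in activity_rates)
    return c["revenue"] - served_cost

for name, data in customers.items():
    print(name, customer_profit(data))
```

Although Customer B brings in more revenue, its heavier activity consumption makes it far less profitable, which is exactly the insight ABC gives a supply chain manager.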
3.3. Case Studies of Big Data Tools in SCM
In the adoption of Big Data in SCM, every successful branding of products and service delivery depends on customer data analysis in decision-making. In fact, about 91% of key marketing company leaders depend on capturing and sharing right-data for effective determination of ROI in their organizations. Power distribution companies depend on billions of annual meter readings, which IBM converts in a Big Data environment to predict power consumption. The following are some of the major companies that have been benefiting from Big Data analysis globally:
1) Walmart, one of the major world retailers with about 2 million employees and 20,000 stores, was operating on Big Data tools even before the current crusade. Its application of data mining has helped in product recommendation to customers through pattern prediction. In fact, its e-commerce operations, which optimize customers' shopping experiences, revolve around Big Data analysis using tools like Hadoop and NoSQL technology that provide customers with access to real-time multi-source data.
2) Uber is another company that specializes in moving people, goods, and services globally. Customers' biodata are analysed and used to predict spatial interaction dominance for service delivery provision and charging policy formulation. Big Data analysis has helped Uber in trend-mirroring surge pricing; for example, double the normal price is charged for urgent trips within crowded zones or during festive periods. Fares on New Year's Eve and Sallah days always quadruple, as machine learning is used to identify high service demand.
3) Netflix is a leading American online video streaming company that uses Big Data to determine precisely what customers will desire to watch at a given time. Its ability to satisfy customers online revolves around effective Big Data analysis; indeed, it is now seen as a content creator rather than a mere distributor. Unsurprisingly, its recommendation engines are fed by data on video titles watched, playback stoppages, ratings given, etc., running on Hadoop, Hive, and Pig along with other orthodox business intelligence tools.
4) eBay operates on data streaming, handling Big Data with tools like Apache Spark and Kafka that permit metadata searches by company analysts. It is known for making such output available to a wide range of people with the right levels of security and permission granting (data governance), which has made it a leading company in Big Data technology application and dissemination.
5) Procter & Gamble is a company whose products have been global household names for 150 years or more. Its strength also derives from the adoption of Big Data tools for making better, smarter, real-time business decisions. The sustained presence of P&G in the global market to date is largely attributed to its optimization of Big Data's potential in its operations.
6) Big Data deployment in public transport management is widely embraced, as among other things it augments the revenue generation of adopters. According to EUAR (2016), real-time incident monitoring and surveillance is the major benefit of Big Data tools in mass transit systems in countries like Switzerland and Finland [24]. For instance, data on wheel-rail contact forces provide information on rolling weight, load balance, and wheel geometry. This in-built real-time reporting system improves the response time of traffic controllers to specific mass transit problems before system collapse. In fact, the following information can be derived from load gauging:
1) Axle box temperature;
2) Wheel temperature;
3) Brake disc friction heat; and
4) Monitoring of the pantograph and catenary.
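A condition-monitoring rule over such sensor readings can be as simple as a threshold alert. The limits in the sketch below are illustrative only, not real railway standards.

```python
# Illustrative alarm thresholds (degrees Celsius); not real railway limits.
LIMITS = {"axle_box_temp": 90.0, "wheel_temp": 120.0, "brake_disc_temp": 300.0}

def check_reading(sensor, value):
    """Flag a sensor reading that exceeds its alarm threshold."""
    limit = LIMITS.get(sensor)
    if limit is not None and value > limit:
        return f"ALERT: {sensor} at {value} exceeds limit {limit}"
    return "ok"

print(check_reading("axle_box_temp", 95.5))  # over the limit
print(check_reading("wheel_temp", 80.0))     # within the limit
```

Real deployments replace fixed thresholds with models learned from historical readings, but the alert-before-failure principle is the same.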
Also, in Swedish train operations, a recent Big Data tool was used to alter train operation two hours ahead of a forecast delay, thereby enhancing traffic controllers' service delivery efficiency. The system operates on machine learning that simulates the effectiveness of possible solutions based on historical data analysis. Generally, there are several major problems in SCM, as in Table 1, which can be resolved using Big Data technological algorithms as follows:
1) Geo-location analytics of individual territories in terms of logistical accessibility and coverage;
2) Transport logistics optimization through the removal of idle mileage of freight transport and the search for alternative routes;
3) Prompting routine vehicular maintenance;
4) Combating financial maladministration; and
5) Enhancement of data storage and retrieval for allied firms like insurance companies.
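The Swedish delay-forecasting idea above can be caricatured with a historical-average forecast feeding a rescheduling rule; the route segments, delay figures, and threshold below are hypothetical stand-ins for a learned model.

```python
from statistics import mean

# Hypothetical historical delay minutes per route segment.
history = {"A-B": [3, 5, 4, 6], "B-C": [25, 30, 28, 27]}

def forecast_delay(segment):
    """Naive forecast: the historical average delay for the segment."""
    return mean(history[segment])

def plan(segment, threshold=15):
    """Alter the timetable in advance when the forecast delay is large."""
    return "reschedule" if forecast_delay(segment) > threshold else "run as planned"

print(plan("A-B"))  # low historical delay
print(plan("B-C"))  # chronic delay, so adjust ahead of time
```

The production system described above replaces the average with machine-learned simulations, but the proactive-over-reactive logic is identical.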
Table 1. ICT integration in SCM problem-solving.
| S/N | Big Data with other IT | Transport Logistic Challenges | Improved IT-Based Solutions |
| --- | --- | --- | --- |
| 1 | Big Data combined with IoT and Machine Learning | Identification of accident hot spots on road sections (Moscow) | Enhanced analysis of weather, traffic speed, and road conditions |
| 2 | Big Data combined with IoT and Machine Learning | Development of traffic situation reporting into a self-regulating system (Los Angeles, USA) | Optimization of the central computing centre's response time to gridlocks |
| 3 | Big Data combined with IoT and Machine Learning | Maximization of the transport route system (Moscow) | Optimization of the central computing centre through data from IoT sensors installed in personal and public transport |
| 4 | Big Data with IoT only | Archiving of traffic flow dynamics (Moscow) | Optimization of the central computing centre through data from IoT sensors installed in personal and public transport |
| 5 | Big Data combined with RFID and cloud computing | Traffic flow and compliance surveillance (Kazan) | Remote monitoring of road users' behaviour and verification of vehicle particulars (RFID 2020) |
4. Big Data Crusade and Adoption in Nigeria
The adoption of Big Data technology has spread like wildfire over the years due to its capability to handle huge and complex data that traditional algorithms could not hitherto fathom. In fact, the Nigerian film industry (otherwise known as Nollywood), the country's second-largest non-oil economic sector, which contributed about $5.55 billion in 2019 and is projected to grow to $10.8 billion by 2023, revolves around Big Data technology. Rated the second largest in the world, the industry has been digitalized to meet global audience standards, engaging about a million Nigerians while leveraging Big Data analysis. The media and entertainment industry is rated the leading user of Big Data, as it must satisfy a global audience that is behaviourally highly dynamic and that now leverages digital innovations to search for and access content anywhere, at any time, and on any device.
Many other organizations within the country have come to recognize the importance of Big Data as a means of achieving and entrenching competitive advantage, and this creates an opportunity to influence policymakers to take advantage of Big Data in moving Nigeria in the right direction. Presently, Nigeria has top Big Data analytics consultancies that provide the required services, as listed below:
1) N-iX—Trusted Software Development Partner Since 2002;
2) Innowise Group—IT Outsourcing & Staff Augmentation since 2007;
3) Netguru Building—Digital solutions for world changers;
4) Weevil Company—Analytics Technology & Digital Transformation;
5) PayInBits Convenience—... Delivering Excellence, Value and Quality;
6) Factual Analytics—Data Science and Analytics;
7) BlackSentry—Set World Class Data Analytics and Cyber Security.
Perceptibly, as complex terabytes of data are being archived on a continuous basis, new algorithms are evolving to cope with emerging challenges. New challenges of security, privacy, and significant initial expenditure seem to be redirecting the focus of Big Data applications, coupled with a dearth of research outputs on the required digital skills. This necessitated the first Nigerian Big Data Economy Summit, held on 12th October 2017 and hosted by Data Science Nigeria, a vision of the MTN executive Mr Olubayo Adekanmbi. The summit focused on the commercialization of data sharing, with reference to the avalanche of data rotting away in Nigerian universities that could have amounted to billions of dollars if standardized and made available to companies.
However, the concept of Big Data and its utilization in Nigeria is still controversial across governmental levels for many obvious reasons, not unconnected with shady dealings and a lack of transparency in both public and private transactions at all levels; hence the establishment of a series of data protection laws in the country. For example, the role of planning, developing, and promoting the use of IT in Nigeria is vested in the National Information Technology Development Agency (NITDA Act 2007), which published its personal data protection regulation in January 2019. This has gone a long way to reshape data proliferation in the country, as a replica of the European Union (EU) General Data Protection Regulation (GDPR). In fact, it is one of the factors that truncated the Federal Road Safety Corps' National Vehicle Identification Scheme (NVIS). The scheme was to use intersection red-light sensors to capture and analyse road users' regulatory compliance levels.
The mobile phone service-providing industries are both the generators and the biggest beneficiaries of Big Data, as they serve the most populous country in Africa. As modern life now depends on digital communication, effective Big Data analysis has catalysed the revenue generation of that economic sector and the national GDP through innovative app deployments [25].
In practical terms, Blumenstock et al. (2021) used geospatial Big Data and machine-learning algorithms to identify the spatial location of poor Nigerians in addressing the COVID-19 crisis, which was anticipated to push an additional 10 million people into poverty by 2022, on top of near-saturated population growth that would drag more than 100 million people below the national poverty line [26]. The poverty maps generated through the Big Data analysis revealed the actual communities of poor people that empowerment programmes can target for social capital protection, as illustrated in Figure 5.
Figure 5. Satellite-based Big Data poverty map. Source: Chi et al. (2021).
Also, the work of Omole et al. (2019) on sustaining vaccine integrity to the last location in Nigeria discovered that a significant number of the actors had good knowledge of vaccine blockchain and supply chain efficiency, though these are relatively new among the developing countries of the world [27]. There is, however, a need to apply Big Data analysis to the optimization of the Nigerian vaccine supply chain.
In analysing the present situation of Big Data adoption in Nigeria, as the world comes of age with the digital era, young people in the country are fully running with it in both positive and negative ways [28] [29]. On the positive side, the federal government has developed Big Data utilization in its e-economy policy, with National Identification Number (NIN) linkage to all digital transactions. In the security sector, where a case affects any of the who-is-who in society, Big Data analysis of telecommunications has been used to crack down on kidnappers in the country. In the business sector, especially the supply chain industry, the tool has been deployed in the delineation of regions of administration and service delivery, which has significantly grown the sector. On the negative side, one may be tempted to state that young Nigerians are among the world's leading hackers, the so-called 'Yahoo Boys', mining data from victims' subscriber identity module (SIM) cards.
In summary, Nigeria can be said to have keyed into the application of Big Data in supply chain management and other sectors of the economy, although much of its potential is yet to be exploited, especially in tackling the security challenges bedevilling the country and Africa as a whole. The political will to fully allow Big Data analytical techniques to thrive in the country is lacking, owing to corrupt practices and the analogue nature of governmental operations.
5. Discussions
Fundamentally, Big Data is the science of assembling multi-source records of facts and figures, analysing them, and developing inferences that are disseminated for decision-making. Data quality and integrity play a major role in societal decision-making, while the privacy of data originators and data miners must not be compromised in the process. Modern societal complexity and dynamism call for Big Data technology that cannot be truncated, just as the principles of privacy and data protection must be balanced with other societal values such as national security, environmental protection, public health, and economic growth. The socioeconomic benefits of the abundant and ubiquitous data sources should not, however, erode the need for data protection and safety in the value chain. Legally and constitutionally, Big Data operators and analysis centres should ensure the legitimacy of data under the NITDA Act while still ensuring data quality.
The principle of anonymity is the shielding of the information provider by removing personal identity that may implicate the person directly or indirectly. Data usability depends on the level of the data's anonymity in consonance with existing regulatory laws. Pseudonymization, according to the GDPR, is "the processing of personal data in such a manner that the data can no longer be attributed to a specific data subject without the use of additional information that is kept separately and subject to technical and organizational measures".
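One common pseudonymization technique (offered here as an illustrative assumption, not a GDPR-mandated method) is keyed hashing, where the key plays the role of the "additional information kept separately": without it, the token cannot be linked back to the person.

```python
import hashlib
import hmac

# The key must be stored apart from the data set, under separate controls.
SECRET_KEY = b"keep-this-separate"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed, non-reversible token."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

# e.g. a hypothetical subscriber phone number
token = pseudonymise("08031234567")
print(len(token))                               # fixed-length token
print(token == pseudonymise("08031234567"))     # deterministic, so joins still work
print(token == pseudonymise("08039876543"))     # different subjects get different tokens
```

Because the mapping is deterministic, analysts can still join records about the same subject across data sets, which is precisely what makes pseudonymized data more usable than fully anonymized data.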
Anonymity and pseudonymity generally go beyond reducing regulatory burden to alleviating the risk of unintentional disclosure of personal data. These two are the highway by which governments, Big Data analysts, researchers, organizations, and individuals run their research and develop products and services while freeing data owners from the fear of malicious use of their data.
Supply chain management generally makes use of the three data-analytical methods of descriptive, predictive, and prescriptive analytics in Big Data technology. While descriptive analytics is the dashboard, predictive analytics focuses on demand forecasting, and prescriptive analytics is more about providing recommendations for possible implementation toward achieving the organization's set goals and objectives.
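The three methods can be sketched on a hypothetical weekly demand series: descriptive analytics summarizes the history, predictive analytics applies a naive moving-average forecast, and prescriptive analytics turns the forecast into a reorder recommendation. The series, window, and safety stock are illustrative assumptions.

```python
from statistics import mean

# Hypothetical weekly demand for one product.
weekly_demand = [120, 135, 128, 140, 150, 146]

def descriptive(series):
    """Descriptive: summarize what has happened (the dashboard view)."""
    return {"latest": series[-1], "average": round(mean(series), 1)}

def predictive(series, window=3):
    """Predictive: naive moving-average forecast of next week's demand."""
    return round(mean(series[-window:]), 1)

def prescriptive(forecast, on_hand, safety_stock=20):
    """Prescriptive: recommend an order covering the forecast plus safety stock."""
    return round(max(0, forecast + safety_stock - on_hand), 1)

forecast = predictive(weekly_demand)
print(descriptive(weekly_demand))
print(forecast)
print(prescriptive(forecast, on_hand=100))
```

Real systems swap the moving average for learned models and the reorder rule for an optimizer, but the descriptive-to-predictive-to-prescriptive pipeline is the same.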
The multilevel usage of analysed data under a multi-strategy collaborative classification usually takes advantage of complementary heterogeneous, multi-temporal analysis methods. Due to the high rate of image acquisition and the new incremental methods adopted, background knowledge is required to strengthen the links between data generators and computer scientists. In this part of the world, however, the following challenges are still obvious: a) dearth of expertise; b) scalability of algorithms; c) quality-data evaluation and imprecision in data as well as in knowledge; d) inability to take into cognisance error and imprecision in multisource data.
6. Recommendations
The relevance of Big Data analysis as a tool in modern supply chain management, as in other sectors of the economy, cannot be over-emphasized. The following are therefore recommended for all actors in the industry and governmental circles:
1) There is an urgent need for all governmental agencies to key into e-governance in all their operations, whereby data storage will be in digital form for easy retrieval and query analysis for policy formulation.
2) All major players in SCM should be encouraged to embrace the digitalization of all their operations for easy data transformation and assemblage at the national level.
3) The educational sector should mainstream Big Data into its curricula at all levels for human resources development that will fit into the global digital world.
4) Internet of Things (IoT) along with the required infrastructure should be made a priority by the government in order to create an enabling environment for the stakeholders in SCM and other allied industries.