Evaluation of a Digital Library: An Experimental Study

The development of digital libraries has changed how information is handled and accessed. Using such a library involves a human-computer interface as well as commands and search strategies to retrieve information. The purpose of this study was to evaluate a digital library at an institution of higher education that serves approximately 75,000 students. Quantitative and qualitative data were collected from a total of 206 participants through structured interviews, popup questionnaires and transactional log analysis. Descriptive statistics were used for the quantitative data and thematic analysis for the qualitative data. Online journals were the most commonly used resources, while reference resources were the least used. The usability and information retrieval capacity of the library were good. However, there was a need to improve the user interface of the digital library, create more awareness and subscribe to more online journals to meet the information requirements of the users.


Introduction
The library and information sector has undergone tremendous changes in recent years. These changes involve the collection and arrangement of information. Consequently, libraries can offer their services without confinement to geographical borders. Advances in information and communication technologies have revolutionised the type, subject matter and design of library services. The process and timing of an evaluation should also be considered. Digital libraries are multifarious structures. Thus, evaluation approaches and metrics differ based on whether digital libraries are seen as information systems, institutions, collections, new technologies or new services (Lamb, 2017). The purpose of this paper is to evaluate a digital library in an institution of higher education.

Forms of Evaluation
Evaluation can be done at different stages of the development of a digital library.
Four forms of evaluation are explained herein. They include formative, summative, iterative and comparative evaluations. Formative evaluation is commonly conducted during the initial stages of a project (Stefl-Mabry, 2018). For instance, before starting a digital library project, it is necessary to find out the information needs of the target users to determine whether or not a digital library should be established. Thus, a formative evaluation is akin to conducting a market survey before introducing a commodity. The findings of the evaluation can guide developers to include certain aspects into the structure of the digital library or implement corrective measures in the early phases of a project. Additionally, a formative evaluation provides baseline data that can be used in subsequent stages of evaluation to determine whether the project has achieved some of its intended uses.
Summative evaluation is done at the end of a project to ascertain whether the original targets that led to the initiation of the project have been met (Pinfield, 2017). Therefore, the focus of this evaluation is the outcome of an initiative. Iterative evaluations encompass short-term appraisals that are performed in the course of a project. They act as "in-between" assessments that help to ensure that the project is on the right track (Tank, Maradiya, & Bhatt, 2017). These evaluations can be repeated as many times as needed in the course of the project.
Comparative evaluations are complete appraisals performed using formats that can be contrasted across similar systems. In other words, they can be used as benchmarking processes to determine the value of a digital library (Campbell, 2018). An example is comparing digital library platforms across several institutions of higher learning.

Digital Library Evaluation Framework
Digital library evaluation frameworks enable the evaluator to conduct a detailed and logical assessment. However, before designing such a framework, it is necessary to point out the key constituents that typify the scope of the digital library environment. Three major components in the domain of a digital library are users, content and technology (Agosti, Ferro, & Silvello, 2016). Service is a valuable parameter that can be investigated; however, it often falls under content. These three parameters should be examined in detail to develop an effective framework for the evaluation of a digital library. Users form the most critical entity in the information chain regardless of the library platform (whether digital or conventional). It is important to verify the target users and their information needs to appraise a digital library effectively. To understand the users fully, four points are important. The identity of the users should be ascertained, for example, students, researchers or professionals. Their information-seeking behaviour should also be determined. The type of information required needs to be clarified, which has to do with specific subject areas. Lastly, the purpose of the information should be identified. In summary, four questions to ask regarding the users of a digital library are "who", "when", "what" and "why".
Content refers to the kind of information that is available in a digital library.
The collection may differ based on the objectives of the library. For instance, the contents of an academic library may differ from the components of a professional library. The primary objects may be reports, books or journal articles, whereas secondary data could be metadata schemes or bibliographic descriptions. Various formats can be used to present the data, for instance, video, text or audio. When planning an evaluation, the type of content (audio, text or video), metadata schemes (indexing, citation, thesaurus and bibliographic arrangement) and quality of content (pertinence and subject coverage) should be considered.
Technology refers to the aggregate of skills, techniques and processes used in the development of commodities or services. Technological matters that are factored into digital libraries include the user interface, management of access, document technology and system structure (Lyman, 2017). The user interface takes care of the diverse options that a digital library offers to its users and the ease of content access. A system ought to have efficient navigation tools and retrieval techniques to aid access to information. System structure, in contrast, entails the structural design of the system, for example, the protocols, database and middleware used in developing the platform. Matters concerning the depiction of documents are considered under document technology, including format and model. Model denotes the conjectural features of a document such as semantic content, hyperlinked logical structure and external features, whereas format identifies the core document representation such as rich text format (RTF), PDF and DOC (Fenlon et al., 2016).

Research Focus
The digital library at the institution of higher learning was unveiled in December 2018. Its objective was to provide access to the various resources offered by the library through a single window, both on campus and remotely. The architecture of the digital library included a host of hardware and software. Four different servers supported the main one: a server for the web OPAC, another for the institutional digital repository, a third for hard-disk-based databases and a fourth CD mirror server for audio-visual materials stored on CD or DVD. The library was also linked to a virtual private network to facilitate access for external (remote) users. The key contents of the digital library included information about the library, e-books, online journals, an institutional repository, web, online and offline databases and CD/DVD-based training tools. Access to online resources was facilitated through Internet protocol (IP) addresses. Therefore, there was no need to log in to find individual resources.
The digital library has about 5000 users, including research scholars, faculty members, undergraduate and postgraduate students. Approximately 80% of users log on to the digital library through the institution's intranet. However, the remaining 20% access the services of the library remotely through a virtual private network (VPN) server. The focus of this study is to evaluate the digital library in terms of patterns of use, usability and information retrieval. These three forms of evaluation are explained further under this section.

Usability Evaluation
Usability inquiry testing was done. This type of evaluation entails appraising the usability of a digital library while target users perform normal day-to-day tasks instead of evaluator-assigned tasks. This mode of evaluation is useful when trying to collect information concerning the needs, likes and dislikes of the users (Sánchez-Gálvez & Fernández-Luna, 2015). Several approaches can be used in this regard, including focus groups, interviews, questionnaires and field observations.

Information Retrieval
People seek information for various purposes. When evaluating a digital library, information retrieval refers to finding the information that is being sought by the user. The retrieval of information in the context of a single digital library is a multifarious process entailing aspects such as cataloguing, metadata and indexing. The complexity of a digital library is proportional to the number of aspects (indexing and cataloguing) that are in effect at the same time when a user searches for data across various collections that apply diverse metadata systems (Gaona-García, Martin-Moncunill, & Montenegro-Marin, 2017). Nevertheless, details regarding these intricacies are of no use to the library users. Their main concern is being able to find information efficiently and effectively. Thus, information retrieval evaluation is dual: user-focused and systems-oriented. Information retrieval evaluation from a user's standpoint determines the effectiveness with which a user's search for information satisfies their interests or needs. In contrast, information retrieval evaluation from a systems perspective ascertains the usefulness and efficiency of the retrieval system, which is a core objective of all digital libraries.
In user-focused evaluation, the emphasis is on the user's experience with the information retrieval tools provided by the digital library (Cabrerizo et al., 2015). A digital library is only valuable if it satisfies the users' information needs effectively, notwithstanding the quality of its information or the sophistication of its technology. Therefore, when conducting a user-focused evaluation, it is necessary to determine the performance of the information retrieval system with respect to the users' interests, requirements and anticipations.
The main challenge faced by most digital libraries is the storage, configuration and retrieval of their contents (Places et al., 2016). Therefore, they aim to possess information retrieval systems that permit users to find specific items effectively in the shortest time possible. Assessing the information retrieval potential of a digital library provides valuable information that may guide future decisions concerning the theoretical and practical modules of the library to optimise the efficacy of user searches.
Other aspects to be considered in the evaluation of information retrieval are precision and recall. Precision can be described as the fraction of retrieved documents that satisfy the search requirement (are pertinent to the information being sought by the user). Conversely, recall is the fraction of relevant documents that are recovered from the collection of all relevant files. The estimation of recall is more convoluted than that of precision because the cataloguing of most digital libraries does not allow the identification of all the potentially relevant documents. This study focused on information retrieval from the user's standpoint with a focus on precision.
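The two measures can be illustrated with a short calculation. The following sketch is illustrative only; the function name and document identifiers are hypothetical and do not come from the system evaluated in this study:

```python
def precision_recall(retrieved, relevant):
    """Compute precision and recall for a single search.

    precision = |retrieved AND relevant| / |retrieved|
    recall    = |retrieved AND relevant| / |relevant|
    """
    retrieved, relevant = set(retrieved), set(relevant)
    hits = retrieved & relevant  # relevant documents that were actually retrieved
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    recall = len(hits) / len(relevant) if relevant else 0.0
    return precision, recall

# A search returns 4 documents, 3 of which are relevant;
# the whole collection contains 6 relevant documents.
p, r = precision_recall({"d1", "d2", "d3", "d4"},
                        {"d1", "d2", "d3", "d5", "d6", "d7"})
# p = 3/4 = 0.75, r = 3/6 = 0.5
```

The example also shows why recall is harder to estimate in practice: computing it requires knowing the full set of relevant documents in the collection, which most catalogues cannot enumerate.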

Research Method
A formal evaluation was done a year after the establishment of the digital library at an institution of higher learning. Qualitative and quantitative data were obtained from the users. Different methods were applied to the various categories of users. For example, questionnaires were used to collect information from students, whereas interviews were used to make inquiries of faculty members.
Transaction log analysis was used to collect quantitative data from all users.
Through this mixed-method approach, it was possible to determine the information about the usage patterns of different groups as well as the usability and information retrieval potential of the digital library.
Using different methods of data collection was necessary because each approach differs in effectiveness in given situations. Questionnaires are effective where the target response does not require detailed explanations. The approach is also cheaper and faster than in-person interviews, which elicit more detailed explanations. In this case, using both interviews and questionnaires allowed the researcher to gather more data and information for analysis from the participants.

Sample
A sample of 206 participants took part in the study. Out of this number, 200 were students at undergraduate and postgraduate levels, whereas the remaining 6 were members of faculty, identified through systematic sampling. Faculty members received letters inviting them to take part in the study. The letters contained a brief description of the study, its objectives and the expected duration of the study. The letter also contained an informed consent form that participants were expected to complete to verify their participation in the study. A copy of the invitation letter (Appendix A) and the informed consent form (Appendix B) are included in the appendices.
Sample determination was based on numerous factors, including cost, accessibility, willingness to participate in the study and convenience. The cost of typing and printing questionnaires, distributing them to the participants and conducting interviews with individual participants was considered before choosing the sample size above. The study determined that it would be too costly to interview and collect data from a sample larger than 206. Furthermore, a larger sample would have meant dealing with a huge volume of data during analysis, which would have increased the risk of errors and other related inconveniences.

Informal Interviews with Faculty
The chosen faculty members were interviewed for about 30 minutes each. Individual interviews were conducted using a predetermined set of questions, as indicated in Appendix D. The interviews entailed questioning the user, recording their responses and transcribing the interviews before performing data analysis.
Structured interviews were chosen for this study to minimise ambiguity and narrow down the responses to specific areas of study that were targeted by the researcher.

Questionnaires
Surveys are among the traditional ways that libraries use to collect data. Popup questionnaires are effective when using an online platform to collect data and information from distant participants. Researchers and institutions with websites use them. Furthermore, modern enterprises use them to collect data about customer experiences and satisfaction rates. According to Stoet (2017), popup questionnaires are more effective and efficient than embedded ones because the researcher controls how, when and where they appear on their websites.
Questionnaires are the most common instruments used in general evaluations. However, they have limited utility in usability evaluations. Currently, surveys can be conducted using web-based tools. Such data can easily be interpreted and analysed using various software tools (Stoet, 2017). Statistical analyses may also be needed to make inferences from the resultant data. Popup questionnaires, however, are useful in gauging the usability of a digital library. They are programmed to appear when a user does something unexpected in a digital library. In some cases, a short on-screen questionnaire regarding usability matters may be triggered after a specified duration of time or when the user is leaving the library. It is also possible to email the questionnaires to users if identification protocols are needed to access the library, because these protocols would provide the users' email addresses. Overall, questionnaires used in usability testing need to be as brief as possible, notwithstanding their mode of presentation (Sánchez-Gálvez & Fernández-Luna, 2015). Brevity encourages the users to complete them without feeling that they are wasting their time. They should also be clear and unambiguous to yield accurate responses. A researcher can consider including incentives for prospective users to take part, especially if many responses are required (Stoet, 2017). Some of the motivations that can be used include entries into draws to win a prize or coupons for online shopping in specified stores. In this study, a popup questionnaire containing a prompt to redirect the user to a longer questionnaire was used (Appendix C).
Applying questionnaire design principles is another vital aspect that reinforces their effectiveness when collecting data from participants. One needs to decide which questions to ask the participants according to the research variables, objectives and hypotheses (Stoet, 2017). Researchers need to apply design principles such as pretesting and revision to ensure that the questions included meet the primary objective of the studies they are conducting.

Transactional Log Analysis
Transaction log analysis is a common approach in the evaluation of digital libraries. It was originally designed to collect quantitative data for the appraisal of OPAC libraries (Arshad & Ameen, 2015). It has since been adopted for the evaluation of other types of digital libraries. Transactional log analysis is an effective tool when applied alongside other assessment tools. It provides valuable information such as who uses the digital library, the specific resources used and how long the library is used, among other parameters. Transactional logs can also generate information such as the frequency and sequence of feature use, response times of the system, hit rates, locations of users, error rates and the number of transactions per use (Siguenza-Guzman et al., 2015). In this study, transactional log analysis was used to determine information regarding patterns of use of the digital library. This information was also partly captured by the questionnaires.
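As a minimal sketch of how usage patterns can be tallied from a transaction log, consider the following; the log format, field layout and category names are assumed for illustration and are not taken from the system described in this study:

```python
from collections import Counter
from datetime import datetime

# Hypothetical log lines: timestamp, user id, resource category.
log_lines = [
    "2019-03-01T10:02:11 u101 e-journal",
    "2019-03-01T10:05:43 u102 e-book",
    "2019-03-01T10:07:02 u101 e-journal",
    "2019-03-01T10:09:30 u103 reference",
]

def usage_by_category(lines):
    """Tally how often each resource category appears in the log."""
    counts = Counter()
    for line in lines:
        timestamp, user, category = line.split()
        datetime.fromisoformat(timestamp)  # validate the timestamp field
        counts[category] += 1
    return counts

print(usage_by_category(log_lines))
# e-journal: 2, e-book: 1, reference: 1
```

Real transaction logs would carry more fields (session duration, search terms, error codes), but the same counting approach yields the per-resource usage figures reported in an evaluation.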

Data Collection
A popup questionnaire was presented to users who had interacted with the digital library for at least 30 minutes. The popup questionnaire was brief and had only two questions. However, it contained a link to redirect the user to a longer questionnaire. The system was set to produce the popup questionnaires for 14 days until a total of about 200 users had completed the longer survey. Appendix C shows a copy of the questionnaire (popup and longer questionnaire). The evaluation parameters covered in the questionnaires were usability and information retrieval. This form of questionnaire has previously produced acceptable data supporting the reliability and validity of web-based questionnaires (Tella, 2015).

Limitations
The first limitation of the study is the number of participants, which was relatively small. The 206 participants formed about 4.12% of the entire population of the digital library users. A larger sample would have been desirable though it was not possible due to financial, technical and time constraints. The second limitation was related to the various perspectives of the participants in this appraisal.
Some of the participants were postgraduate students while others were undergraduate students. The information needs of these two groups of students have been shown to differ significantly due to the scope of their studies. The questionnaires did not capture the input of faculty members. These different perspectives should be considered when analysing the outcomes of the evaluation.

Logistics
The researcher coordinated the execution of this evaluation plan, which consisted of planning, collection of data and data handling. The institution's assistant librarian, who was in charge of the digital library project, was the primary point of contact. All data were processed, analysed, interpreted and reported by the author.
All reports were conveyed to the digital library's project manager and members of the development team. The evaluation outcomes were further disseminated to other stakeholders such as the administrators of the learning institution.

Data Analysis
Quantitative data obtained through questionnaires and transactional log analysis were recorded in an Excel spreadsheet for further analysis. Deductions were made by further categorising the responses into three main groups. For example, findings on "strongly agree" and "agree" were combined to mean "agreeing with the statement", whereas "strongly disagree" and "disagree" were combined to mean "disagreeing with the statement". The ensuing data were summarised using descriptive statistics such as means and percentages. Thematic analysis was used to analyse qualitative data from the structured interviews.
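The collapsing of five-point responses into three summary groups can be sketched as follows; the function name, label mapping and sample responses are illustrative only, not the actual spreadsheet procedure used in the study:

```python
# Map five-point Likert labels onto three summary groups,
# as described in the data analysis: "strongly agree" and "agree"
# are combined, as are "strongly disagree" and "disagree".
GROUPS = {
    "strongly agree": "agree",
    "agree": "agree",
    "neutral": "neutral",
    "disagree": "disagree",
    "strongly disagree": "disagree",
}

def summarise(responses):
    """Collapse raw responses into three groups and report percentages."""
    totals = {"agree": 0, "neutral": 0, "disagree": 0}
    for answer in responses:
        totals[GROUPS[answer.lower()]] += 1
    n = len(responses)
    return {group: round(100 * count / n, 1) for group, count in totals.items()}

sample = ["Strongly agree", "Agree", "Neutral", "Disagree", "Agree"]
print(summarise(sample))
# {'agree': 60.0, 'neutral': 20.0, 'disagree': 20.0}
```

Reducing five categories to three simplifies the descriptive statistics at the cost of some granularity, which is why the raw five-point responses should be retained alongside the grouped figures.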

Findings and Discussions
About 47% of users who interacted with the system for more than 30 minutes completed the popup questionnaire. Out of this number, 32% completed the longer survey. Of the complete responses, 43% were from female students while 57% were from male students (Figure 1). The majority of the participants were in their third year of study (25%), whereas the lowest proportion of participants were in their first year of study (13%). Postgraduate students constituted 22% of the subjects (Figure 2). Figure 3 showed that about half of the students used the digital library to find information about specific research topics. Journals and e-journals were the most used resources at 42%, while the least used facility was the reference resources at less than 5% (Figure 4). This observation was also reiterated by the transactional log analysis in Table 1. The usability evaluation (Figure 5) showed that 75% of users could accomplish all their tasks with ease, 50% could access information fast and effectively, whereas 60% did not need guidelines to use the library. Furthermore, 60% of users would recommend the library to their fellow students. However, 60% of the participants agreed that the system needed some improvements.
The information retrieval assessment showed that most users agreed about the system's ability to yield results based on search criteria, retrieve relevant data within a short time and match search criteria to users' expectations (Figure 6). More than 50% of the users had access to materials that were pertinent to their study areas and were satisfied with the system's ability to retrieve information. However, about 50% of users agreed that the system was inconsistent and that they had difficulties remembering all the information retrieval steps. Furthermore, 60% of the users reported that the system was unreliable in terms of providing information based on their needs. Thematic analysis of the faculty interviews revealed three main themes: low awareness, underutilisation and satisfactory usability. The faculty members agreed that there had been substantial changes in the quality of work submitted by their students since the development of the digital library. The usability of the digital library was satisfactory as it was relatively easy to access information.
However, it was apparent that there were low awareness levels about the different resources that could be found in the digital library. Therefore, most users preferred to use the same resources frequently instead of other resources of a similar nature.

Discussion
Usability, in the context of digital libraries, can be described as the efficacy with which people can access and use the resources of a library successfully. A human-computer interface is commonly used in digital libraries. Thus, the interface and functionality are the most crucial aspects of digital libraries that should be evaluated and enhanced. According to Iqbal and Ullah (2016), the usability of computer platforms is determined by five attributes as perceived by the user: ease of learning and use, fast accomplishment of tasks, low error rates, user satisfaction and user retention.

Conclusions
Digital libraries have transformed the idea of information services by reaching out to many users without temporal and spatial limitations. The advent of open-source software for online library platforms has also improved digital library technologies. However, new inventions emerge with each passing day.
Therefore, it is necessary to conduct regular evaluations to determine whether appropriate developments are being incorporated. This study showed that the usability and information retrieval capability of the institution's digital library were good. Nonetheless, the usage patterns showed that more third- and fourth-year students, as well as those pursuing postgraduate studies, used the digital library than those in their initial years of study. Furthermore, users faced certain challenges that need to be addressed to enhance the usability and information retrieval of the library.

Recommendations
Based on the findings, four main recommendations were made. A collection development policy should be created to subscribe to as many online journals as possible given that this resource was most used. The user interface of the digital library should be redesigned to simplify it and ease the navigation process. User education should be done to enhance the usage of different available databases.
More awareness should be created about the existence of the digital library, available resources and benefits of the facility.
Creating awareness among users about the existence of digital libraries should be a primary goal to increase usage. Numerous avenues to create awareness are available for institutions and service providers to use. Using social media is one way to increase public awareness of digital libraries among potential users. Stakeholder engagement is another critical avenue that could be used to create awareness among users about the existence of online digital libraries. The process should not only target learners and researchers, but also scholars with the ability to develop research products and publish them on digital platforms. Teachers and school administrators should also be encouraged to adopt digital learning platforms to encourage students to use the available online resources.

Conflicts of Interest
The authors declare no conflicts of interest regarding the publication of this paper.

Appendix C: Usability Evaluation of the Digital Library
Please mark with X where appropriate. Please indicate the extent to which you agree with the following statements on a scale of 1 to 5 (1 = Strongly disagree, 2 = Disagree, 3 = Neutral, 4 = Agree, 5 = Strongly agree). Thank you for your participation.

Appendix D: Interview Schedule for Faculty
Questions
1) What has your experience been with the digital platform of the library?
2) Have you noted any changes in students' performance or use of academic resources since the inception of the digital library?
3) Do you think digital resources are utilised adequately by students and faculty? Why?
4) What problems/challenges have you encountered so far?
5) What do you think can be done to improve the user interface of the digital library?
6) What are your suggestions to enhance searching for e-books, e-journals and research papers?
Thank you for your participation.