The management of university libraries increasingly depends on digital transformation. This report applies data analysis techniques to examine how university libraries make decisions about digital transformation. It first examines how digital transformation affects university library management, including expanding the scope of services, optimising resource allocation, and improving service efficiency. Second, real-world examples of data collection, cleansing, analysis, and visualisation illustrate how data analysis is applied in digital transformation. An evaluation index system is then established, covering digital resource construction, service quality, user experience, and management efficiency. Finally, the impact of digital transformation is assessed and validated through empirical analysis, providing a scientific basis for decision-making in university libraries.
With the rapid development of information technology, university libraries face both the opportunities and the pressures of digital transformation [1]. Digital transformation is not only an inherent requirement of library administration but also an external necessity for adapting to the information era. By making full use of modern information technology, university libraries can improve service efficiency, optimise resource allocation, broaden the range of services offered, better meet user needs, and promote the development of the university library sector [2].
Digital transformation, however, is a long-term process that requires scientific and rational decision-making and assessment [3]. Data analysis is a vital tool in this process and is essential for evaluating the impact of the change. The goal of this study is to conduct decision-making evaluation research on the digital transformation of university libraries using data analysis techniques, providing a solid scientific basis and point of reference for that transformation [4].
University library management has undergone a variety of changes as a result of digital transformation. First, library services are now more effective and of higher quality because of it: digital technology makes it easier for users to access library materials, improving the efficiency and satisfaction of borrowing [5]. Second, the allocation and use of library resources are optimised; through the construction and administration of digital resources, libraries can better serve the information needs of their patrons and make better use of their holdings. Third, the range and variety of library services have increased: beyond conventional book lending, digital transformation has enabled new service modes such as online education, academic exchange, and digital exhibitions [6].
An evaluation index system was created to objectively assess the impact of university libraries’ digital transformation. It covers digital resource construction, service level, user experience, management efficiency, and other factors [7–9]. Digital resource construction includes the quantity, quality, and coverage of digitised resources; user experience includes user satisfaction and ease of use; and management efficiency includes resource utilisation rate and operating costs. By closely monitoring and assessing these indicators, the effectiveness and significance of the digital transformation can be fully understood [10, 11].
Finally, the impact of digital transformation in university libraries is evaluated and validated using empirical analysis. By collecting, examining, and comparing relevant data and the changes in indicators before and after digital transformation, the effect of the shift is assessed. The empirical results demonstrate that digital transformation greatly improves the service quality and management efficiency of university libraries, a finding widely acknowledged and appreciated by both managers and patrons.
Taking into account the aforementioned contents and principles of digital resource evaluation in university libraries, and drawing extensively on the NISO Z39.7-200X library metrics standard [11], the COUNTER code of practice for usage statistics of networked electronic resources [12], the SUSHI protocol, and the “Guide to Digital Resources Measurement in Higher Education Libraries” jointly proposed by the CALIS Management Centre [13, 14] and the Library and Information Steering Committee of Higher Education Institutions of the Ministry of Education [15], the author constructed the evaluation index system detailed in Table 1. The main goal of this system is to support purchasing decisions in the construction of digital resources in individual libraries.
The indicator system above contains 7 first-level indicators and 24 second-level indicators. First-level indicator A, the content of the electronic literature, has a weight of 0.3 and evaluates the academic value, authority, and applicability of the digital resources; its three second-level indicators, A1–A3, assess the rate of subject coverage, the coverage of users, and the quality of the resources, respectively. First-level indicator B, the retrieval system and its functions, has a weight of 0.1 and evaluates how well the digital resources are organised and revealed; its four second-level indicators, B1–B4, assess the completeness of the retrieval function, the ability to analyse retrieval results, and the quality of the user experience. First-level indicators C (access performance), D (vendor service), and G (archiving of electronic literature) assess the guarantee of user access and are each weighted 0.1; their ten second-level indicators evaluate, respectively, the reliability of off-site access to the digital resources, the vendor’s statistical reporting on library data and user information literacy training, permanent access to the resources, and long-term preservation [16–18]. First-level indicators E (the electronic literature trial) and F (the price factor of the electronic literature) represent the performance of resource utilisation and are weighted 0.1 and 0.2, respectively. Second-level indicator E1, user feedback, has a weight of 0.6 and quantifies users’ subjective opinions on recommending and purchasing the products.
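For concreteness, the hierarchy and the weights quoted above can be encoded as a simple data structure. The sketch below is illustrative only and is not part of the original evaluation system; the D1–D6 weights are taken from the value-assignment rules given later, the indicator names are abbreviated, and G’s second-level indicators are not enumerated in this excerpt.

```python
# Illustrative sketch: the two-level indicator hierarchy with the weights
# quoted in the text (D1-D6 weights come from the value-assignment rules
# below; G's secondary indicators are not enumerated here).
INDEX_SYSTEM = {
    "A Content of electronic literature": {
        "weight": 0.3, "secondary": {"A1": 0.5, "A2": 0.3, "A3": 0.2}},
    "B Retrieval system and functions": {
        "weight": 0.1, "secondary": {"B1": 0.4, "B2": 0.2, "B3": 0.2, "B4": 0.2}},
    "C Access performance": {
        "weight": 0.1, "secondary": {"C1": 0.25, "C2": 0.25, "C3": 0.25, "C4": 0.25}},
    "D Supplier services": {
        "weight": 0.1, "secondary": {"D1": 0.2, "D2": 0.2, "D3": 0.1,
                                     "D4": 0.3, "D5": 0.1, "D6": 0.1}},
    "E Usage of electronic literature": {
        "weight": 0.1, "secondary": {"E1": 0.6, "E2": 0.2, "E3": 0.2}},
    "F Price factor of electronic literature": {
        "weight": 0.2, "secondary": {"F1": 0.3, "F2": 0.3, "F3": 0.2, "F4": 0.2}},
    "G Archiving of electronic literature": {"weight": 0.1, "secondary": {}},
}

# Sanity checks: level-1 weights and each non-empty level-2 group sum to 1.
assert abs(sum(v["weight"] for v in INDEX_SYSTEM.values()) - 1.0) < 1e-9
for v in INDEX_SYSTEM.values():
    if v["secondary"]:
        assert abs(sum(v["secondary"].values()) - 1.0) < 1e-9
```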
This index system incorporates the Delphi method, the sampling survey method, the analytic hierarchy process (AHP), and a multi-indicator comprehensive evaluation approach [20]. To ensure that the evaluation of the university library’s resources is inclusive and captures the subjective opinions of the user group as fully as possible alongside objective statistical data, the author designed a survey questionnaire (see Table 2).
Different indicators are scored by different evaluation subjects. For instance, first-level indicator F, the price factor of electronic documents, concerns the price increase of digital resources over time and the subsidy programme, and is scored through interviews with university library acquisition librarians. By contrast, second-level indicators C4 (off-campus access restrictions) and E1 (user feedback) are scored by sampling university users with the simple questionnaire in Table 2.
| Primary indicator (weight) | Secondary indicators (weight) |
|---|---|
| A Content of electronic literature (0.3) | A1 Degree of correlation with the library’s selected key disciplines (0.5); A2 Applicable objects of the electronic literature (0.3); A3 Data source information (0.2) |
| B Retrieval system and functions (0.1) | B1 Retrieval function (0.4); B2 Search results (0.2); B3 Retrieval interface (0.2); B4 User service (0.2) |
| C Access performance (0.1) | C1 Access method (0.25); C2 Access speed (0.25); C3 Proportion of access failures (0.25); C4 Off-campus access restrictions (0.25) |
| D Supplier services (0.1) | D1 Usage statistics report (0.2); D2 Provision of management system access (0.2); D3 Data updating (0.1); D4 Training provision (0.3); D5 Handling of illegal use (0.1); D6 Function improvement (0.1) |
| E Usage of electronic literature (0.1) | E1 User feedback (0.6); E2 Free trial (0.2); E3 Usage statistics (0.2) |
| F Price factor of electronic literature (0.2) | F1 Discount range (0.3); F2 Annual increase rate (0.3); F3 Group subsidy or sharing (0.2); F4 School or department subsidy (0.2) |
| Electronic resource name | Corresponding score | | |
|---|---|---|---|
| A1: Does the content match the relevant subject theme? | 100 | 50 | 0 |
| A2: Is it applicable to all teachers and students in our school (department)? | Yes | General | No |
| A3: Is the data source authoritative and academic? | Yes | General | No |
| B1: Is the search function complete? (logical grouping/related retrieval/secondary retrieval/classified retrieval, etc.) | Yes | General | No |
| B2: Do the search results have analysis functions? (by time/relevance/author/journal/year/discipline, etc.) | Yes | General | No |
| B3: Is the search interface friendly? | Yes | General | No |
| B4: Are personalised services and user assistance provided? | Yes | General | No |
| C2: Is the access speed fast? | Yes | General | No |
| C3: Can the electronic resource be accessed successfully on campus every time? | Yes | General | No |
| C4: Is it possible to access this electronic resource off campus? | Yes | General | No |
| E1: Do you recommend that the library purchase this electronic resource? | Yes | General | No |
The evaluation results of university libraries’ digital resources under the two-level weighted indicator approach above are expressed as percentage scores, computed as
\[\text{Tier 1 indicator score} = \sum \left(\text{selected Tier 2 indicator score} \times \text{corresponding Tier 2 indicator weight coefficient}\right),\]
\[\text{Total score for the electronic documentation} = \sum \left(\text{selected Tier 1 indicator score} \times \text{corresponding Tier 1 indicator weight coefficient}\right).\]
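As an illustration only (not code from the paper), the following Python sketch implements this two-stage weighted aggregation; the example scores for the B indicators are hypothetical, while the weights are those of Table 1.

```python
def tier1_score(tier2_scores: dict, tier2_weights: dict) -> float:
    """Tier-1 indicator score: sum of selected tier-2 scores times their weights."""
    return sum(tier2_scores[k] * tier2_weights[k] for k in tier2_scores)


def total_score(tier1_scores: dict, tier1_weights: dict) -> float:
    """Total score: sum of selected tier-1 scores times the tier-1 weight coefficients."""
    return sum(tier1_scores[k] * tier1_weights[k] for k in tier1_scores)


# Hypothetical scores for the B indicators, using the weights from Table 1.
b_score = tier1_score({"B1": 80, "B2": 100, "B3": 50, "B4": 100},
                      {"B1": 0.4, "B2": 0.2, "B3": 0.2, "B4": 0.2})
print(b_score)  # 0.4*80 + 0.2*100 + 0.2*50 + 0.2*100 = 82.0
```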
The rules for assigning values to the individual indicators are described below; a small scoring sketch for the deduction-style rules follows the list.
A1 Degree of match with the library’s key disciplines (0.5): 100 points if at least 80% of the subjects covered by the database match the library’s selected key disciplines; 10 points are deducted for every further 10% decrease in coverage;
A2 Applicable objects of the electronic literature (0.3): 100 points if applicable to all readers; otherwise the score is reduced in proportion to the reach of the target audience;
A3 Data source (0.2): 100 points for information from authoritative institutions, academic publishers, or professional societies; the score decreases with declining authority, academic value, or professionalism;
B1 Retrieval function (0.4): 100 points for providing all of the following functions, with 20 points deducted for each missing one: a. full search box; b. logical (Boolean) grouping; c. related searches (near-synonym or expanded searches); d. secondary searches; e. classified retrieval;
B2 Search results (0.2): 100 points for providing all of the following functions, with 20 points deducted for each missing one: a. analysis of search results (by time/relevance/author/journal/year/discipline, etc.); b. complete download modes (email/print/save/online browsing); c. good quality of downloaded documents (clear, readable, with no omissions); d. direct export of search results into bibliographic management systems (e.g., EndNote, NoteFirst); e. linking with the library OPAC system;
B3 Retrieval interface (0.2): 100 points for providing both of the following, with 50 points deducted for each missing item: a. a friendly retrieval interface; b. a retrieval platform that integrates other resources so that cross-database retrieval can be performed on the same platform;
B4 User services (0.2): 100 points for providing both of the following, with 50 points deducted for each missing item: a. user support; b. customised service features;
C1 Access mode (0.25): 0 points for a stand-alone version; 60 points if the library must set up a mirror site or pay for international traffic; 100 points for dedicated-line access or a mirror site established in China;
C2 Access speed (0.25): scored according to the measured access speed of each database; 100 points for fast access;
C3 Access failure rate (0.25): scored according to the frequency of access failures detected by various monitoring methods; 100 points if no access failures occur;
C4 Off-campus access restrictions (0.25): 100 points for providing off-campus access or allowing the library to provide it; 0 points for not providing off-campus access or preventing the library from providing it;
D1 Usage statistics report (0.2): 100 points for submitting a quarterly report that meets requirements; 60 points for submitting a report that falls short of requirements; 0 points for failing to submit a report;
D2 Provision of management system access to the library (0.2): 100 points if provided, 0 points if not;
D3 Data updating (0.1): 100 points for timely data updates as agreed; 10 points are deducted for every 10% of lag (days of lag divided by the specified updating period);
D4 Training provision (0.3): 100 points for delivering training and related materials promptly in line with user needs and achieving the desired outcome; 60 points for delivering training that is not prompt or only of average effect; 40 points for delivering training that fails to achieve the desired outcome; 0 points for not delivering training;
D5 Handling of illegal use (0.1): 100 points for responding reasonably to users’ illegal use, 0 points for an unreasonable response;
D6 Function improvement (0.1): 100 points for promptly improving services and functions in response to user requests and reported issues; 60 points for improvements with no discernible effect; 0 points for no improvement;
E1 User feedback (0.6): 100 points if key users (three or more) or general users (five or more) consider it an essential database; 60 points if there is no feedback; 0 points if users do not consider it useful.
Owing to space constraints, the combined weights of the second-level indicators relative to their corresponding first-level indicators and to the overall objective are not enumerated here. This evaluation model is applied to the libraries of five institutions, A, B, C, D, and E, and it is assumed that 15 experts, divided into three groups, take part in the evaluation [21–23].
The three expert groups compared the five factors at the criteria level using the AHP approach, yielding the following three judgement matrices:
\[\label{eq1}\tag{1} P_1=\left[\begin{array}{ccccc} 1 & 3 & 4 & 2 & 5 \\ 1 / 3 & 1 & 2 & 1 / 2 & 3 \\ 1 / 4 & 1 / 2 & 1 & 1 / 3 & 2 \\ 1 / 2 & 2 & 3 & 1 & 4 \\ 1 / 5 & 1 / 3 & 1 / 2 & 1 / 4 & 1 \end{array}\right]\]
\[\label{eq2}\tag{2} P_2=\left[\begin{array}{ccccc} 1 & 2 & 4 & 3 & 7 \\ 1 / 2 & 1 & 2 & 1 & 3 \\ 1 / 4 & 1 / 2 & 1 & 1 & 2 \\ 1 / 3 & 1 & 1 & 1 & 2 \\ 1 / 7 & 1 / 3 & 1 / 2 & 1 / 2 & 1 \end{array}\right]\]
\[\label{eq3}\tag{3} P_3=\left[\begin{array}{ccccc} 1 & 3 & 4 & 2 & 6 \\ 1 / 3 & 1 & 3 & 1 / 2 & 2 \\ 1 / 4 & 1 / 3 & 1 & 1 / 2 & 1 / 3 \\ 1 / 2 & 2 & 2 & 1 & 1 / 3 \\ 1 / 6 & 1 / 2 & 3 & 3 & 1 \end{array}\right]\]
Using the square-root method, the normalised weight vectors of the three judgement matrices are found to be:
\[\label{eq4} \begin{aligned} & b_1=(0.4185,0.1600,0.0972,0.2625,0.0618) \\ & b_2=(0.4526,0.2046,0.1240,0.1518,0.0670) \\ & b_3=(0.4888,0.1773,0.0977,0.1773,0.0589) \end{aligned}\tag{4}\]
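As a reference implementation of the square-root (row geometric mean) method assumed here, the following sketch derives a normalised priority vector from a judgement matrix. Its output for \(P_1\) is close to \(b_1\) in Eq. (4); small discrepancies may come from rounding or implementation details in the original calculation.

```python
import numpy as np


def ahp_square_root(P: np.ndarray) -> np.ndarray:
    """Normalised priority vector of a pairwise comparison matrix via the
    square-root (row geometric mean) method."""
    geo_means = np.prod(P, axis=1) ** (1.0 / P.shape[1])
    return geo_means / geo_means.sum()


P1 = np.array([[1,   3,   4,   2,   5],
               [1/3, 1,   2,   1/2, 3],
               [1/4, 1/2, 1,   1/3, 2],
               [1/2, 2,   3,   1,   4],
               [1/5, 1/3, 1/2, 1/4, 1]])
print(np.round(ahp_square_root(P1), 4))
# compare with b1 = (0.4185, 0.1600, 0.0972, 0.2625, 0.0618) in Eq. (4)
```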
From the vectors above, the degree of consistency between each pair of expert groups is computed. On this basis, expert groups 1 and 2 are clustered into one category and expert group 3 into another. The first category contains two expert groups and the second contains one, so each group’s weight is the size of its category divided by the total: \({\varepsilon _1} = {\varepsilon _2} = 2/(2 + 2 + 1) = 0.4\) and \({\varepsilon _3} = 1/(2 + 2 + 1) = 0.2\), giving the final expert weight vector \(\varepsilon = (0.4,0.4,0.2)\) for individual decision-making.
Combining the criteria-level indicator weights obtained by the AHP approach in Eq. (4) with the individual weights of the decision-making experts and normalising yields the final weight vector of the first-level indicators with respect to the objective level:
\[\label{eq5}\tag{5} \mu = \sum_{k=1}^{3} \varepsilon_k b_k = \left( \mu_1, \mu_2, \mu_3, \mu_4, \mu_5 \right) = (0.4462, 0.1813, 0.1080, 0.2012, 0.0633).\]
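The combination in Eq. (5) can be reproduced numerically as follows; this sketch simply forms the \(\varepsilon\)-weighted sum of the vectors in Eq. (4) and normalises it, which is assumed to be the aggregation used.

```python
import numpy as np

# b1-b3 from Eq. (4) and the expert-group weights from the clustering step.
b = np.array([[0.4185, 0.1600, 0.0972, 0.2625, 0.0618],
              [0.4526, 0.2046, 0.1240, 0.1518, 0.0670],
              [0.4888, 0.1773, 0.0977, 0.1773, 0.0589]])
eps = np.array([0.4, 0.4, 0.2])

mu = eps @ b                 # expert-weighted combination of the AHP vectors
mu = mu / mu.sum()           # normalise (the combination already sums to ~1)
print(np.round(mu, 4))       # [0.4462 0.1813 0.108  0.2012 0.0633]
```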
Similarly, to determine the final weight of each secondary indicator with respect to the objective level, the attribute values describing the library informatisation level of the same five colleges and universities are normalised and presented in Table 3.
Index | Weight | A | B | C | D | E |
---|---|---|---|---|---|---|
1 | 0.042 | 0.447 | 0.621 | 0.798 | 0.925 | 0.735 |
2 | 0.035 | 0.178 | 0.425 | 0.935 | 0.916 | 0.412 |
3 | 0.045 | 0.898 | 0.258 | 0.359 | 0.915 | 0.356 |
4 | 0.051 | 0.136 | 0.205 | 0.168 | 0.603 | 0.247 |
5 | 0.035 | 0.198 | 0.621 | 0.715 | 0.446 | 0.935 |
6 | 0.026 | 0.456 | 0.416 | 0.845 | 0.525 | 0.203 |
7 | 0.023 | 0.654 | 0.835 | 0.198 | 0.684 | 0.202 |
8 | 0.021 | 0.832 | 0.503 | 0.709 | 0.426 | 0.398 |
9 | 0.046 | 0.189 | 0.139 | 0.687 | 0.306 | 0.548 |
10 | 0.056 | 0.158 | 0.368 | 0.358 | 0.861 | 0.854 |
Using Eq. (5), the weighted normalised decision matrix \(B = {\left[ {{b_{ij}}} \right]_{5 \times 26}}\) is constructed, in which the rows correspond to the five evaluation objects and the columns to the twenty-six indicator attributes; part of it is shown in Table 4, and a sketch of this computation follows the table.
Index | A | B | C | D | E |
---|---|---|---|---|---|
1 | 0.185 | 0.026 | 0.033 | 0.032 | 0.031 |
2 | 0.006 | 0.015 | 0.034 | 0.034 | 0.016 |
3 | 0.039 | 0.003 | 0.015 | 0.035 | 0.004 |
4 | 0.008 | 0.011 | 0.012 | 0.026 | 0.015 |
5 | 0.067 | 0.025 | 0.033 | 0.039 | 0.031 |
6 | 0.012 | 0.011 | 0.035 | 0.034 | 0.015 |
7 | 0.015 | 0.019 | 0.014 | 0.005 | 0.012 |
8 | 0.031 | 0.018 | 0.014 | 0.015 | 0.016 |
9 | 0.034 | 0.027 | 0.014 | 0.012 | 0.015 |
10 | 0.016 | 0.022 | 0.001 | 0.025 | 0.017 |
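One plausible construction of this matrix, sketched below, multiplies each normalised attribute value in Table 3 by the weight of its indicator (a common preprocessing step in multi-indicator evaluation). This is an assumption for illustration; the paper does not spell out its exact formula, so the products need not reproduce Table 4 exactly.

```python
import numpy as np

# Weights and normalised attribute values for the first three indicators in
# Table 3 (objects A-E); the remaining rows follow the same pattern.
weights = np.array([0.042, 0.035, 0.045])
values = np.array([[0.447, 0.621, 0.798, 0.925, 0.735],
                   [0.178, 0.425, 0.935, 0.916, 0.412],
                   [0.898, 0.258, 0.359, 0.915, 0.356]])

# Element-wise weighting of each indicator's normalised values gives the
# corresponding entries of the weighted normalised decision matrix.
B_part = values * weights[:, None]
print(np.round(B_part, 3))
```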
This report applied data analysis techniques to a decision-making evaluation study of the digital transformation of university libraries. The empirical results show that digital transformation greatly improves user satisfaction, resource utilisation, and service efficiency in university libraries. Digital transformation still faces several challenges, such as data security, privacy protection, and keeping pace with technological change. Future work will explore these topics in greater depth, investigate more efficient and scientific approaches to digital transformation, and support the sustainable development of university libraries.
This work was supported by the project “Research on the Empowerment of Scientific Research Decision-making Evaluation in University Libraries through Digital Education Transformation” (Funding No. stxh2023A02).