Effect of Algorithmic Transparency on Gig Workers’ Proactive Service Performance: A Moderated Chain Mediation Model

Abstract

Algorithmic technology is widely adopted by digital platform companies in the gig economy as a new way of managing employees. However, this approach has changed how employees and employers interact and has altered the individual psychology and work behaviors of gig workers. Studying the impact of algorithms on employees is therefore of great significance for enterprises seeking to optimize employee management, improve organizational efficiency, and refine digital human resource management practices. Based on social exchange theory, this study developed a research model, took gig workers on digital gig platforms in Jiangsu Province as the research object, and empirically analyzed 377 valid questionnaires. The results show that algorithmic transparency positively influences proactive service performance through the chain mediation of the psychological contract and organizational identification, while techno-complexity negatively moderates the effects of algorithmic transparency on relational and transactional contract fulfillment. The findings theoretically expand the current understanding of algorithmic human resource management and provide practical advice to digital platform companies on how to manage gig workers more effectively.


1. Introduction

With the advent of the data era, underlying algorithms have constructed a dynamic and complex digital world for human beings, reconfiguring the way people acquire the means of production through their capacity for memory storage, precise calculation, and high-speed output. Where physical work scenes and virtual algorithmic management intermingle, platform gig workers’ understanding of management tools goes beyond the traditional technical scope, leading gig workers and the public to debate the issue of algorithmic transparency. Under the wave of digital algorithms, however, “algorithmic discrimination” occurs frequently, “distributive justice” is undermined, and employees develop doubts about algorithms that run counter to the goal of efficient management; as a result, work efficiency decreases rather than increases. It is therefore of great significance to explain how algorithmic transparency reshapes the relationship between employees and their work in the digital era, in order to clarify its intrinsic and instrumental value.

From government governance to enterprise management, transparency mechanisms such as information transparency and digital transparency break the “black box” of digital technology to a certain extent and reveal the essence of organizational operation from the user’s perspective, providing effective tools for organizational service and management practice. Rather than requiring users to “enter” the internal operating logic of the organization, transparency extends the organization’s scope “outward” to society rather than confining it to the “inside” of the organization, connecting human and non-human technologies. Such practice visualizes the complexity and ambiguity of the content so as to present it to stakeholders in an understandable form (Ananny & Crawford, 2018). Algorithmic transparency is one of the basic characteristics of algorithmic technology (Ahonen & Erkkilä, 2020; Cram et al., 2022; Felzmann et al., 2020). It is an important technical concept that must be confronted when using automated decision-making systems (Lapostol Piderit et al., 2023). With the development of algorithmic technology, the platform economy has gradually developed into an innovative economic form with strong momentum, providing a high-quality digital environment for algorithmic growth. Digital gig platforms such as Meituan and Eleme, in particular, provide relatively idealized research contexts for algorithmic research. On such gig platforms, the machine learning mechanisms of algorithmic management are gradually breaking the limits of human programming through big data mining and processing, replacing manual management to a great extent (Young et al., 2019). The use of algorithmic technology in the workplace and the ethical issues it raises have become a hot topic in society. Despite the fact that algorithmic tools can improve organizational efficiency and avoid repetitive work, there is still a generally pessimistic attitude towards this emerging technology (Liu et al., 2023; Mahmud et al., 2022). Some scholars have argued that public criticism of and bias against algorithmic tools stem from a lack of transparency (Busuioc, 2021). The psychological need to understand algorithmic reasoning processes and logical relationships makes people more inclined to distrust algorithms (Dietvorst et al., 2015). In fact, algorithmic systems in practice tend to present a black-box nature to people’s perceptions (Burrell, 2016; Price, 2018). Algorithms are often perceived as working systems with complex and mysterious mechanisms, observed only in terms of their inputs and outputs, without it being possible to know the underlying principles by which the algorithm processes a problem and arrives at its conclusions (Pasquale, 2015). This opaque black-box nature can lead to decision deception (Sandvig et al., 2014), algorithmic discrimination (Gillespie, 2014), capital domination of public power (Kitchin, 2014), and poor government regulation. Algorithmic “black boxes” take advantage of information resources to gain a favorable position in interactions with gig workers, leaving them helpless in the face of a negative decision (Zarsky, 2016). Consequently, under these intertwined negative influences, gig workers are gradually reduced to numbers and symbols under algorithmic management tools, taking on depersonalized characteristics. More seriously, because of capital’s profit-seeking nature, improper profit-making behavior behind the algorithmic black box is likely to create a crisis for the rights and interests of gig workers and other stakeholders.

Algorithmic transparency means that users can understand what an algorithmic system is doing and why it is doing it (Shahriari & Shahriari, 2017), granting users a degree of accountability and the right to know. On one hand, algorithmic transparency empowers users to demand accountability: when algorithms run out of control or show bias, users can hold their operators responsible on the basis of the disclosed algorithms and better monitor platform decision-making processes (de Fine Licht & de Fine Licht, 2020); on the other hand, algorithmic transparency upholds workers’ right to be informed and provides them with the opportunity to challenge the fairness and rationality of the decision-making process (Mittelstadt et al., 2019). Several studies have pointed out that transparent management processes can help employees accept and adapt to algorithmic management environments (Rani & Furrer, 2021), effectively increase employees’ perceptions of procedural fairness (Fieseler et al., 2019), and reduce their propensity to leave (Conroy et al., 2022). Furthermore, algorithmic transparency has been found to positively predict algorithmic satisfaction: the more transparent people perceive an algorithm’s decision-making process to be, the more satisfied they are with the services it provides (Shah et al., 2023; Shin et al., 2020). However, some studies have argued that algorithmic transparency may bring negative effects such as information overload, given objective conditions such as technological competition and the information explosion of the digital age (Yang & Pitafi, 2023). Overly redundant and disruptive information may cause information fatigue and a sense of meaninglessness (Gal et al., 2020; Stohl et al., 2016). Specifically, algorithm aversion is induced when people realize that an algorithm runs with errors or biases (Dietvorst et al., 2015), which implies that algorithmic transparency may reduce people’s trust in algorithmic decisions (Rader et al., 2018). In addition, as people become more aware of algorithms, they are more likely to develop a sense of social isolation (Liu & Wei, 2021) and a sense of moral wrongdoing (Shah et al., 2023), which counteracts proactive employee behavior. It is evident that scholars’ findings on the value and role of algorithmic transparency diverge, and further discussion and research are still required.

This article takes “whether algorithmic transparency can influence employees’ proactive service performance” as its research theme. According to social exchange theory, algorithm disclosure is a manifestation of the interaction between subjects: social exchange represents a dynamic and balanced relationship between the algorithm and the employee, which reaches and maintains equilibrium through the exchange of information and production factors. This exchange supports the effective fulfillment of gig workers’ psychological contracts, which in turn encourages their organizational identification and proactive service performance. In addition, this study examines techno-complexity as a boundary condition of the effect of algorithmic transparency on employees’ psychological contracts. Based on this, we propose the research model shown in Figure 1.

Our study explores: 1) the definition and manifestation of algorithmic transparency in the specific context of the gig economy; 2) the effect of algorithmic transparency on the proactive service performance of gig workers, elaborating the role algorithmic transparency plays across the gig work cycle and contributing to the literature on algorithmic transparency and employee behavior; 3) future directions for the gig economy in clarifying algorithm rules and in utilizing and optimizing algorithms, thereby enhancing work ability and performance; and 4) constructive suggestions for the future regulation and management of algorithms.

2. Hypothesis and Model

2.1. Algorithmic Transparency and Transactional Contract Fulfillment

Figure 1. Conceptual model.

Algorithmic society has become a description of a technology-defined society. With the rapid updating and iteration of big data under the wave of informatization, the huge volume of data has driven a rapid rise in management costs. In this context, algorithms, as computer processing technologies, have emerged and gradually taken over the position of human managers. Efficient, simplified, and rational algorithmic thinking has gradually become the underlying operating logic of modern enterprises, creating new opportunities for the automation of work processes and organizational management (Jarrahi et al., 2021). While the design and implementation of algorithms is still performed by humans, the management of work execution and delivery has been almost entirely taken over by algorithms and numbers, with little substantive involvement of human decision-makers; management, coordination, and implementation processes are automated and organizational activities reshaped (Crowston & Bolici, 2019). While such abstract coordination tools reshape the work patterns of gig workers, the opaque nature of algorithms is reinforced by comprehension deficits caused by the opportunistic behavior of stakeholders (Lapostol Piderit et al., 2023), the “black box” of algorithmic input-output processes, and the uniqueness of algorithmic languages. Indeed, algorithmic transparency shapes the values, biases, and ideologies users construct in the process of understanding algorithms, demonstrating its importance in employees’ work (Diakopoulos & Koliska, 2017).

Gig workers in the gig economy, who hold non-standard contracts, usually occupy short-term positions in the organization (Harms & Han, 2019). Transactional contract fulfillment places more emphasis on the employee’s short-term economic exchange: gig workers are more concerned with compensation and personal benefits, the platform is more concerned with their task performance, and neither party invests much emotion (Rousseau, 1990). Platform gig workers apply their existing knowledge and skills to contribute to the organization in exchange for the pay, incentives, promotions, and other rewards it provides. From a resource-based perspective, increased algorithmic transparency liberates platform gig workers from the role of being dominated and, to some extent, enables a transfer of subjective power, empowering gig workers to interrogate and utilize the algorithm (Springer & Whittaker, 2020). Specifically, as algorithms become more transparent and their “assisting” role is further emphasized, gig workers continue to strengthen their ability to master, control, and filter algorithms, consciously moving from the unilateral role of the “managed” toward that of the “manager”. This deeper engagement with the organization’s transactional mechanisms supports the neutral and objective fulfillment of the transactional contract between individuals and the organization.

Firstly, increased algorithmic transparency enhances employees’ perceptions of the security and fairness of the transaction process. Algorithms establish relationships between visible people and invisible data; for example, the Uber platform uses algorithms to track a driver’s work history and personal information and uses this as a basis for tenure decisions (Wiener et al., 2023). Algorithmic systems link individual interests to data, yet their opaque techniques trap gig workers in a vortex of suspicion, breeding insecurity and concerns about physical and mental health. Studies have noted that increased algorithmic transparency enables employees to perceive data reliability, fairness, and safety in algorithmic guidance, assessment, and computation processes (Bujold et al., 2022; Lee et al., 2019), and they are then more motivated to engage in positive work (Cram et al., 2022). When employees clearly understand algorithmic techniques, realize that the algorithm follows a set procedure, and become more knowledgeable about how it operates, their perception of algorithmic fairness is enhanced even if the final output is unfavorable (Diamond & Zeisel, 1978). Based on the reciprocity principle of social exchange theory (Blau, 2017), when gig workers realize that the algorithmic system is publicly accessible and that platform fairness and legitimacy are guaranteed under a transparency mechanism, they will place more trust in the transactional relationship with the platform, which in turn strengthens the motivation for reciprocal behavior between the two.

Secondly, algorithms concentrate on operating, selecting, and sorting data; they run efficiently and reflect rational, procedural processes. Nevertheless, algorithmic operations ignore diverse and complex human situations, which may bias data outputs (Lee et al., 2019) and lead to dehumanizing and controlling management (Lang et al., 2023). Increased algorithmic transparency enhances the accountability of digital platforms and algorithmic systems, cedes the right to be informed to employees, and encourages the expression of human subjectivity within algorithmic management. According to the reciprocity viewpoint of social exchange theory, platform gig workers under a transparent algorithmic mechanism gain control over their labor outcomes, which helps enhance the rationality of the “give-and-take” relationship with the platform. This solidifies the transactional relationship between the two sides and promotes fulfillment of the transactional psychological contract between employees and the platform. For example, problems beyond the scope of gig workers’ responsibility, such as hot weather, malicious bad reviews, system errors, and hidden user addresses, should not be borne by gig workers; increased algorithmic transparency therefore provides a channel for platform workers to appeal and defend their rights, and reduces the degree of algorithmic “dehumanization”.

Finally, explicit and transparent algorithms provide technical support to platform gig workers, helping them learn how the algorithm actually operates and adjust and optimize their individual work behaviors accordingly (Cram et al., 2022), which improves their work autonomy and flexibility. When algorithm content is disclosed and algorithmic technology is mastered by users, gig workers can improve their work performance and obtain the salary, rewards, and promotion opportunities provided by the platform. Based on social exchange theory, this optimization of individual resources and auxiliary tools builds a higher-quality exchange relationship.

H1a: There is a positive effect of algorithmic transparency on transactional contract fulfillment.

Organizational identification is the process by which members gradually become emotionally compatible and aligned in values with the organization (Tajfel, 1978). It has been shown that psychological contract violations weaken employees’ perceptions of membership within the organization (Masterson & Stamper, 2003) and that psychological contract violations are negatively correlated with organizational identification (Stamper et al., 2009). In particular, violations of the transactional contract, which involves financial or monetary terms, have a significant impact on members’ organizational identification.

Algorithmic opacity stems not only from the opaque elements inherent in machine operation but also from a lack of transparency in platform governance (Kim & Moon, 2021). According to social exchange theory, employees strive to maintain a balance in exchange relationships as they interact with organizations (Homans, 1958). Positive fulfillment of the transactional contract between the individual and the organization realizes the principle of reciprocity between employee and organization. When employees perceive that the transactional contract with the organization is fulfilled, they believe that their work and behavior are recognized and financially rewarded by the organization. The basic transactional expectations of their work are met, and the fulfillment of this promise helps employees identify with the organization’s fair transaction process and reduces perceived uncertainty (Rodwell et al., 2015), which in turn enhances organizational identification. Some scholars have pointed out that transactional contract fulfillment positively affects gig workers’ organizational identification (Liu et al., 2020). When the channels for exchanging interests between individuals and organizations are fair, just, and open, and employees hold a certain degree of autonomy, organizational identification is more likely to emerge.

H1b: Transactional contract fulfillment mediates the relationship between algorithmic transparency and organizational identification.

2.2. Algorithmic Transparency and Relational Contract Fulfillment

Relational contract fulfillment reflects a broad, long-term, and flexible exchange relationship between an employee and an organization: in order to obtain the organization’s long-term commitment, the employee is willing to work in the organization for a long period and to accept its internal work arrangements (She et al., 2020). Although the embedding of algorithmic technology changes the traditional work context, shifting the relationship between employees and the organization from a hierarchical relationship to a partnership, gig workers still benefit from the social influence and prestige of the gig platform. For example, customers whose experience is optimized by the gig platform can develop a high degree of trust in and dependence on the platform and then transfer this positive attitude to platform employees. Gig workers who perceive high professional self-esteem are more likely to develop a sense of identification with and belonging to the platform. Therefore, a relational contract is likely to exist between gig workers and the platform.

Firstly, disclosing the platform enterprise’s algorithms can effectively enhance social goodwill and trust (Buell et al., 2017). The opaque nature of algorithmic systems often leaves the platform’s operational processes poorly understood, producing the “black box” phenomenon and breeding skepticism and insecurity about how the algorithm operates. In this context, platforms that disclose their algorithms and clarify the data sources supporting algorithmic decision-making can enhance the enterprise’s social goodwill. As the platform’s reputation improves, its social status and influence rise, and the relational contract between gig workers and the organization is more readily fulfilled.

Secondly, transparency in algorithmic management systems can mitigate the negative effects of employees’ technostress (Cram et al., 2022). Similarly, when organizations proactively disclose algorithms, employees feel valued by the organization, which generates positive emotions. These positive emotions contribute to individual psychological resilience and increase the psychological resources available to confront difficulties and challenges. Gig workers perceive algorithmic transparency as support from their superiors on the platform. According to social exchange theory, gig workers respond positively when their superiors provide beneficial resources, creating a strong willingness to reciprocate.

In addition, the social psychology of procedural justice argues that if procedures are fair, people may reasonably expect to gain rewards in the long run even if they cannot obtain their desired benefits in the short run, which further explains why a relational contract can exist between gig workers and the gig platform. In summary, the following hypothesis is formulated:

H2a: Algorithmic transparency positively affects the relational contract fulfillment of gig workers.

According to social exchange theory, algorithmic transparency, as an important individual resource, can enhance the cognitive and psychological resources of gig workers and contribute to the development of their organizational identification. Organizational identification reflects employees’ perception of their own value in the organization, demonstrating their degree of identification with, and sense of belonging to, the organization. Relational contract fulfillment is a subjective belief based on socio-emotional exchange and represents the employee’s positive view of the relationship between themselves and the organization. In the process of social exchange, fulfillment of the relational contract between individuals and organizations may produce positive organizational identification, for the following reasons. Firstly, relational contract fulfillment enhances employees’ perception of their organizational roles, producing a sense of being an “insider”, which in turn strengthens their sense of identity and belonging to the organization. Secondly, relational contract fulfillment allows employees to believe that the organization will provide them with higher rewards in the future (Conway & Coyle-Shapiro, 2012), which mobilizes positive emotions and generates positive evaluations of the organization. Thirdly, relational contract fulfillment emphasizes employees’ affective commitment, so employees perceive more emotional and intrinsic obligations. Based on the reciprocity principle of social exchange theory, gig workers will develop a sense of belonging, honor, and even dependence on the platform.

Employees under a relational contract are more inclined to adopt cooperative behaviors, pursue organizational interests, and contribute to enhancing organizational performance (Rousseau, 1990). It has been confirmed that relational contract fulfillment positively affects employees’ organizational identification (Liu et al., 2020). Combined with the reasoning behind H2a, this study proposes the following hypothesis:

H2b: Relational contract fulfillment mediates the relationship between algorithmic transparency and organizational identification.

2.3. Organizational Identification and Proactive Service Performance

Proactive service performance is defined as spontaneous, long-term oriented, and enduring service behaviors initiated by front-line service employees beyond basic service requirements, norms, and standard operating procedures (Rank et al., 2007). This study suggests that organizational identification may positively influence employees’ proactive service performance. Organizational identification is defined as employees’ incorporation of organizational membership and organizational success into their personal self-concept, linking individual achievement to organizational success or failure (Mael & Ashforth, 1992); it can also be interpreted as the extent to which employees perceive themselves as “one” with the organization (Ashforth & Mael, 1989). In an organizational context, employees’ connection to the organization helps them form perceptions of organizational traits (Albert & Whetten, 1985).

On one hand, organizational identification helps employees with role orientation. Realizing that organizational performance is closely related to their personal future development, employees tie their own development to organizational performance and take the initiative to engage in activities that benefit the organization in return. Specifically, gig workers who want to further enhance the platform’s reputation and image will demonstrate a high-quality service attitude when facing customers and satisfy their individual needs, in order to increase positive evaluations of both the individual and the platform.

On the other hand, organizational identification enhances employees’ satisfaction, trust, loyalty, and other positive emotions in the work environment. Positive emotions generated by work are a kind of psychological resource; after experiencing good work experiences and pleasant emotions, gig workers obtain sufficient individual resources. According to social exchange theory, exchange parties seek a state of equilibrium to maintain the exchange relationship. Abundant psychological resources cover the resources employees consume in the process of work, and the surplus is invested in customer service in the form of high-quality service attitudes and behaviors.

H3: Organizational identification positively influences proactive service performance.

Organizational identification is an important mediator between the psychological contract and employees’ work behaviors and outcomes. According to Restubog et al. (2008), weakened organizational identification is a key outcome of psychological contract violation. In addition, a large body of research has demonstrated the mechanisms linking psychological contract violation to employee work outcomes (Bari et al., 2022; Lo & Aryee, 2003). For example, when an employee discovers that a manager has violated the psychological contract by ignoring his or her commitments, the employee no longer feels a strong connection to the organization, which weakens organizational identification and hinders work-related outcomes (Epitropaki, 2013).

Based on the preceding argument, this study suggests that algorithmic transparency affects the proactive service performance of gig workers through two paths, one transactional and one relational. On one hand, algorithmic transparency makes gig workers aware of the fairness and openness of the economic transaction process, and the increased control over labor outcomes builds employees’ trust in the organization, which positively affects gig workers’ transactional contract fulfillment. On the other hand, algorithm disclosure builds a good image and reputation for the platform, which strengthens employees’ insider perceptions and expectations of future development and thus promotes gig workers’ relational contract fulfillment.

Gig workers who perceive a high degree of psychological contract fulfillment believe that the monetary rewards and socio-emotional support provided by the platform meet their expectations and help maintain fairness in the exchange between the individual and the organization. This facilitates the transformation of gig workers from the role of the “managed” to that of the “manager” and enhances their control over work processes and outcomes. Therefore, employees’ perceived psychological contract fulfillment can trigger trust in, loyalty to, and identification with the organization, which in turn motivates positive organizational citizenship behaviors. Fulfillment of the transactional and relational psychological contracts may be positively associated with job satisfaction, security, organizational commitment, and identification, which may motivate proactive service performance.

Based on the above discussion, this study formulates the hypothesis:

H4a: Algorithmic transparency positively influences employees’ proactive service performance through transactional contract fulfillment and organizational identification.

H4b: Algorithmic transparency positively influences employees’ proactive service performance through relational contract fulfillment and organizational identification.

2.4. The Moderating Role of Techno-Complexity

Techno-complexity refers to a situation in which the increasing complexity of ICT leads employees to feel a lack of skills and forces them to spend time and effort learning the technology (Tarafdar et al., 2007). In the algorithmic context, techno-complexity mainly covers the complexity of algorithmic mechanisms such as algorithm design, data collection, operational processes, and machine learning. Techno-complexity arises not only from the complex, iterative nature of a single algorithm’s source code but also from the interaction of multiple sets of algorithmic modules; complexity and modularization trigger unpredictable interactions between the various parts of the algorithm. With the continuous evolution of technology, the ongoing refinement of the division of algorithmic labor, and the increasing demand for algorithms in social life, a large number of algorithms are becoming more and more complex. The resulting burden of algorithmic knowledge and technical concepts ultimately overwhelms users (Tarafdar et al., 2015). This technological disconnect is particularly evident among platform gig workers, who are mainly manual laborers. When platform algorithms are disclosed to gig workers who lack technical skills, overly complex code reads like gibberish, and the disclosure of incomprehensible algorithms is meaningless. This study argues that complex algorithmic techniques may reduce platform workers’ perceptions of transactional contract fulfillment. Specifically:

Firstly, increased algorithmic complexity forces employees to continually update their skills to keep up with algorithmic developments, which may trap them in multiple and conflicting task responsibilities and take time away from completing other tasks (Tarafdar et al., 2015). In addition, employees are forced to invest more time and energy in seeking further knowledge (Nasirpouri Shadbad & Biros, 2020). This squeezes their rest time outside work and leads to fatigue, which ultimately affects their working state and pushes them toward work exhaustion (Zhang et al., 2022).

Secondly, rising algorithmic complexity impedes employees’ understanding and use of algorithms, may make it difficult for them to understand the platform’s published algorithmic policies (D’Arcy et al., 2014), and creates continuous learning pressure and work stress (Ramesh et al., 2021). Complex machine languages place high demands on employees’ algorithmic literacy, and excessive algorithmic complexity imposes a cognitive burden on employees (You et al., 2022). This reduces the value of the information algorithms provide to employees, creates information redundancy in the work process, and hinders employees from carrying out their work. Although algorithm developers and deployers attempt to support users through algorithmic transparency measures and to clarify their social responsibility, in practice the transparency regime shifts responsibility from platforms and algorithm developers to overburdened platform workers who cannot bear the load.

Thirdly, the link between transparency and autonomy is not as clear as the functional perspective assumes (Felzmann et al., 2020). If users cannot make use of the information presented by the algorithm, the algorithm remains complex and ambiguous to them, which reduces their perception of its value, and their sense of autonomy and control does not increase. The public may therefore see no need for full algorithmic transparency when algorithmic procedures are too complex to grasp in focus and implication. Gig workers are likely to conclude that the algorithmic formulas are meaningless and end up ignoring the complex information. In summary, the following hypothesis is formulated:

H5a: Techno-complexity negatively moderates the relationship between algorithmic transparency and transactional contract fulfillment.

Scholars believe that multiple factors influence how transparency generates trust in the context of algorithmic system applications, including user expectations and the mechanisms by which transparency is constructed (Rader et al., 2018), and transparency may trigger disappointment and dissatisfaction with algorithmic systems (Eslami et al., 2018). Decentralized and complex work environments in particular blur the positive effects of transparency on employee work behavior. Therefore, this study suggests that rising techno-complexity may reduce gig workers’ perceived relational contract fulfillment, for the following reasons:

First, even if the algorithmic system is technically transparent and effectively addresses the fundamental information asymmetry between the platform and gig workers, complex algorithmic procedures create barriers to understanding for gig workers, so that transparency loses its meaning in practice. For example, on takeaway food delivery platforms, although the algorithms for task allocation and performance evaluation are clearly listed, the overly complex machine language cannot be effectively understood by gig workers, which is likely to make employees question its validity and fairness and instead trigger negative attitudes.

Second, increased algorithmic complexity further reinforces employees’ perception that algorithmic management has replaced human managers. Previous research has noted that the lack of direct interaction with human superiors can reduce employees’ sense of social presence and lead them to question the effectiveness of algorithms. High algorithmic complexity may thus impair the employee work experience and weaken the individual’s connection to the organization. When an employee’s perceived relationship with the organization is thwarted by the algorithm, it may lead to a breach of the relational contract.

When resources are threatened with loss, employees devalue such resources (Hobfoll, 1989). According to social exchange theory, when the support resources available to employees in the workplace are insufficient to offset resource loss (Blau, 2017), for example, when algorithms are too complex to support employees effectively, employees will view algorithms as a nuisance, experiencing psychological resistance, fatigue, and tension. Conversely, when algorithmic outputs are clear, concise, and understandable, they promote employees’ psychological contract fulfillment and help individuals build work resources. For example, employees can develop stronger perceptions of effectiveness and fairness through transparent algorithmic decision-making mechanisms and can leverage transparent machine learning algorithms to optimize their own job performance. With this type of technical support, employees receive a high level of energy resources that meets the basic resource requirements for value co-creation behaviors.

H5b: Techno-complexity negatively moderates the relationship between algorithmic transparency and relational contract fulfillment.

Based on the above analysis, Hypothesis 6a and Hypothesis 6b of this study were formulated:

H6a: Techno-complexity moderates the relationship between algorithmic transparency and proactive service performance by moderating the chain mediation role of transactional contract fulfillment and organizational identification.

H6b: Techno-complexity moderates the relationship between algorithmic transparency and proactive service performance by moderating the chain mediating role of relational contract fulfillment and organizational identification.

3. Research Method

3.1. Research Sample

This study used an online questionnaire for data collection, and the research object was gig workers on a digital platform. The researcher first contacted the platform’s station managers and operations staff, explained the purpose of the study and the questionnaire distribution process, and asked for their consent. Next, with the support of the head of the human resources department, a numbered list of employees who agreed to participate in the survey was obtained. Finally, with the assistance of the human resources department, the questionnaires were distributed to and collected from employees according to the list numbers.

In this study, questionnaires were administered in two stages. Stage 1 mainly measured algorithmic transparency, techno-complexity, and employees’ basic information; 500 questionnaires were distributed and 463 were returned. After eliminating questionnaires with excessively short completion times or obviously patterned responses, 431 valid questionnaires were obtained, for an effective response rate of 86.2%. Stage 2 mainly measured transactional contract fulfillment, relational contract fulfillment, organizational identification, and proactive service performance. Questionnaires were distributed to the 431 valid respondents from Stage 1, and after excluding invalid questionnaires, 377 valid questionnaires were obtained, for an effective response rate of 87.47%. The demographic distribution of the respondents is presented in Table 1.

Table 1. Demographic profile of respondents.

3.2. Variable Measure

The scales selected were all well-established scales. When needed, Chinese versions of the English scales were created using a translation-back-translation procedure to avoid semantic bias. The questionnaires were measured on a 5-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree).

Algorithmic transparency: The 3-item scale developed by Höddinghaus et al. (2021) was selected for measurement, and the relevant statements were adapted to the specific research context. The representative items were, “I think I could understand the decision-making process of platform algorithms very well”, “I think I can see through platform algorithms’ decision-making process”, “I think the decision-making process of platform algorithms are clear and transparent”. The Cronbach’s alpha for this scale in this study was 0.802.

Psychological contract fulfillment: Transactional contract fulfillment and relational contract fulfillment were assessed using measures adapted from Bal et al. (2010), Turnley et al. (2003), and Wu & Chen (2015), with reference to the items modified by Liu et al. (2020). Transactional contract fulfillment consists of four items, such as “Competitive income compared with people working for other sharing economy companies” and “Income tied to the level of my performance.” Relational contract fulfillment was likewise measured with four items, such as “I am always treated fairly and impartially by the platform” and “The platform organizes training and provides us with development opportunities.” Cronbach’s alphas for the transactional and relational dimensions of psychological contract fulfillment were 0.844 and 0.852 respectively.

Organizational identification: Organizational identification was measured using three items from Mael & Ashforth’s (1992) six-item organizational identification scale. Representative items include “When someone criticizes the platform, it feels like a personal insult to me” and “When someone praises the platform, it feels like a personal compliment”. Cronbach’s alpha for organizational identification was 0.900.

Proactive service performance: We measured proactive service performance using seven items from Rank et al.’s (2007) proactive service performance scale, with representative items such as “I proactively check with customers to verify that customer expectations have been met or exceeded” and “I actively create partnerships with other service representatives to better serve customers.” Cronbach’s alpha for this scale in this study was 0.908.

Techno-complexity: The 5-item scale developed by Tarafdar et al. (2007) was used, with representative items such as “I need a long time to understand and use new technologies.” Cronbach’s alpha for techno-complexity was 0.930.

Control variables: Based on previous studies, demographic variables that may affect psychological contract fulfillment and proactive service performance, including gender, age, marital status, education level, job tenure, and algorithmic literacy, were selected as control variables.

4. Data Analysis and Results

4.1. Common Method Bias

Although this study used a two-stage approach to collect data, common method bias may still be a concern because perceived algorithmic transparency, transactional contract fulfillment, relational contract fulfillment, organizational identification, proactive service performance, and techno-complexity were all derived from employees’ self-reports. Therefore, this study used Harman’s single-factor test to examine common method bias. The results indicate that, without rotation, the first factor explains 33.216% of the variance, about 48% of the total variance explained (68.272%) and below the 50% threshold. This suggests that common method bias is not a serious problem in the data examined in this study.
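For readers who wish to replicate this check, the minimal sketch below illustrates one common way of running Harman’s single-factor test, an unrotated extraction over all scale items based on the eigenvalues of the item correlation matrix; the `items` data frame and file name are hypothetical placeholders for the pooled Likert items.

```python
import numpy as np
import pandas as pd

def harman_single_factor(items: pd.DataFrame):
    """Unrotated extraction over all items; returns (% variance of the first factor,
    cumulative % explained by factors with eigenvalue > 1)."""
    corr = items.corr().to_numpy()                   # item-level correlation matrix
    eig = np.sort(np.linalg.eigvalsh(corr))[::-1]    # eigenvalues, largest first
    first_pct = eig[0] / eig.sum() * 100             # variance explained by the first factor
    cum_pct = eig[eig > 1].sum() / eig.sum() * 100   # total explained by retained factors
    return first_pct, cum_pct

# items = pd.read_csv("items.csv")                   # hypothetical pooled item data
# first, cum = harman_single_factor(items)
# A first factor accounting for well under 50% of the explained variance is usually
# read as "common method bias is not a serious concern".
```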

4.2. Analyses of Reliability and Validity Tests

To test the internal consistency of the variables, namely algorithmic transparency, transactional contract fulfillment, relational contract fulfillment, organizational identification, proactive service performance, and techno-complexity, this paper analyzed their reliability in SPSS. As shown in Table 2, the internal consistency coefficients of the variables range from 0.800 to 0.908, all at or above 0.80, indicating good reliability.
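As a minimal sketch of how such internal consistency coefficients can be computed outside SPSS, the function below implements the standard Cronbach’s alpha formula, alpha = k/(k−1) · (1 − sum of item variances / variance of the summed score); the column names are hypothetical.

```python
import pandas as pd

def cronbach_alpha(scale_items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = scale_items.shape[1]
    item_var = scale_items.var(axis=0, ddof=1).sum()
    total_var = scale_items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical usage with three algorithmic-transparency items named AT1-AT3:
# alpha_at = cronbach_alpha(df[["AT1", "AT2", "AT3"]])
```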

To examine the discriminant validity of the key variables, algorithmic transparency, transactional contract fulfillment, relational contract fulfillment, organizational identification, proactive service performance, and techno-complexity, this study conducted confirmatory factor analyses (CFA) in AMOS 26.0 and compared six-factor, five-factor, four-factor, three-factor, two-factor, and one-factor models. The results showed that the six-factor model fitted the data best (χ2(377) = 623.663, p < 0.01; RMSEA = 0.044, CFI = 0.960, TLI = 0.955). Table 3 shows that the proposed model significantly outperformed the five alternative models, providing preliminary evidence of the discriminant validity of the main constructs.
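The CFA was run in AMOS; for readers working with open-source tools, a roughly equivalent six-factor measurement model could be specified in lavaan-style syntax with the Python package semopy, as in the hedged sketch below (all item names and the data file are hypothetical placeholders).

```python
# Hedged sketch of the six-factor measurement model in semopy; the study itself used AMOS 26.0.
import pandas as pd
import semopy

df = pd.read_csv("items.csv")  # hypothetical item-level data

cfa_desc = """
AT  =~ AT1 + AT2 + AT3
TCF =~ TCF1 + TCF2 + TCF3 + TCF4
RCF =~ RCF1 + RCF2 + RCF3 + RCF4
OI  =~ OI1 + OI2 + OI3
PSP =~ PSP1 + PSP2 + PSP3 + PSP4 + PSP5 + PSP6 + PSP7
TC  =~ TC1 + TC2 + TC3 + TC4 + TC5
"""

model = semopy.Model(cfa_desc)
model.fit(df)
print(semopy.calc_stats(model))  # fit indices such as chi-square, CFI, TLI, RMSEA
```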

4.3. Descriptive Statistics and Correlations

Descriptive statistics, correlations and reliability coefficients are reported in Table 4. From the table, it can be seen that algorithmic transparency showed a significant positive correlation with transactional contract fulfillment (r = 0.483, p < 0.01), relational contract fulfillment (r = 0.397, p < 0.01), organizational identification (r = 0.453, p < 0.01), and proactive service performance (r = 0.482, p < 0.01). Meanwhile, organizational identification showed a significant positive relationship with transactional contract fulfillment (r = 0.433, p < 0.01), relational contract fulfillment (r = 0.468, p < 0.01), and proactive service performance (r = 0.521, p < 0.01). These correlations provide preliminary support for the hypotheses of this paper.

Table 2. Test results of internal reliability and convergent validity.

Table 3. Confirmatory factor analysis results of alternative models.

Table 4. Descriptive statistics and correlations.

Notes: n = 377; **p < 0.01, *p < 0.05. AT = algorithmic transparency; TC = techno-complexity; TCF = transactional contract fulfillment; RCF = relational contract fulfillment; OI = organizational identification; PS = proactive service performance.
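Assuming composite scores have been computed as item means for each construct, statistics of the kind reported in Table 4 can be reproduced with a few lines of pandas and SciPy; the data file and variable names below are hypothetical.

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("composites.csv")  # hypothetical file of composite scores (item means)
constructs = ["AT", "TC", "TCF", "RCF", "OI", "PS"]

descriptives = df[constructs].agg(["mean", "std"]).T   # means and standard deviations
corr_matrix = df[constructs].corr()                    # Pearson correlations, as in Table 4

# Two-tailed significance for a single pair, e.g. AT with TCF:
r, p = stats.pearsonr(df["AT"], df["TCF"])
print(corr_matrix.round(3))
```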

4.4. Hypothesis Testing

This study used AMOS to construct a structural equation model for hypothesis testing; the path coefficients and significance levels are presented in Table 5. As shown in the table, algorithmic transparency positively affects transactional contract fulfillment (β = 0.617, p < 0.001), supporting H1a. Algorithmic transparency also positively affects relational contract fulfillment (β = 0.511, p < 0.001), confirming H2a. Organizational identification positively influences proactive service performance (β = 0.323, p < 0.001), so H3 is supported.

Table 5. Structural model assessment.

Note: *p < 0.05, **p < 0.01, ***p < 0.001.
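The structural paths reported in Table 5 were estimated in AMOS; as a rough open-source counterpart, the hypothesized structural part can be added to the measurement model in semopy, as in the sketch below (item names and the data file are hypothetical, and control variables are omitted for brevity).

```python
# Hedged sketch of the hypothesized structural model in semopy; AMOS was used in the study.
import pandas as pd
import semopy

df = pd.read_csv("items.csv")  # hypothetical item-level data

sem_desc = """
AT  =~ AT1 + AT2 + AT3
TCF =~ TCF1 + TCF2 + TCF3 + TCF4
RCF =~ RCF1 + RCF2 + RCF3 + RCF4
OI  =~ OI1 + OI2 + OI3
PSP =~ PSP1 + PSP2 + PSP3 + PSP4 + PSP5 + PSP6 + PSP7
TCF ~ AT
RCF ~ AT
OI  ~ TCF + RCF
PSP ~ OI
"""

model = semopy.Model(sem_desc)
model.fit(df)
print(model.inspect())  # path estimates, standard errors, and p-values
```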

In addition, to test the proposed moderating effect, all relevant variables were standardized, and the interaction term of algorithmic transparency and techno-complexity was then computed and included in the model. The results indicate that techno-complexity negatively moderates the relationship between algorithmic transparency and relational contract fulfillment (β = −0.321, p < 0.001), and the moderating effect is also significant for the relationship between algorithmic transparency and transactional contract fulfillment (β = −0.126, p < 0.05). Thus, H5a and H5b are supported (see Figure 2 and Figure 3).
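Outside the SEM environment, the same moderation logic can be sketched as an OLS regression on composite scores: standardize the predictor and moderator, form their product, and inspect the sign of the interaction coefficient. The data file, composite scores, and control-variable names below are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("composites.csv")  # hypothetical composite scores and controls

# Standardize predictor and moderator before forming the product term.
for col in ["AT", "TC"]:
    df[col + "_z"] = (df[col] - df[col].mean()) / df[col].std(ddof=1)

# "AT_z * TC_z" expands to both main effects plus their interaction;
# a negative AT_z:TC_z coefficient corresponds to the negative moderation in H5a/H5b.
m_tcf = smf.ols("TCF ~ AT_z * TC_z + gender + age + tenure", data=df).fit()
m_rcf = smf.ols("RCF ~ AT_z * TC_z + gender + age + tenure", data=df).fit()
print(m_tcf.params["AT_z:TC_z"], m_rcf.params["AT_z:TC_z"])
```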

To further test the mediating and chain mediating effects and to ensure the stability and consistency of the results, the bootstrap method was used to verify the mediating roles of transactional contract fulfillment and relational contract fulfillment between algorithmic transparency and organizational identification, as well as the chain mediating roles of transactional contract fulfillment and organizational identification, and of relational contract fulfillment and organizational identification, between algorithmic transparency and proactive service performance. The results are reported in the tables below.

The mediating effects were verified using the bias-corrected non-parametric percentile bootstrap method with 5000 resamples. The results show that the mediating effect of transactional contract fulfillment between algorithmic transparency and organizational identification is 0.097 (95% confidence interval [0.050, 0.146]); the interval does not contain 0, supporting hypothesis H1b. The mediating effect of relational contract fulfillment between algorithmic transparency and organizational identification is 0.116 (95% confidence interval [0.076, 0.159]); this confidence interval also excludes 0, confirming hypothesis H2b.
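A bootstrap of this kind resamples respondents, re-estimates the a (X→M) and b (M→Y) paths on each resample, and builds the confidence interval from the distribution of a·b. The minimal sketch below uses the plain percentile variant on composite scores for brevity (the study reports the bias-corrected version); all variable and file names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def bootstrap_indirect(df, x, m, y, controls="", n_boot=5000, seed=1):
    """Percentile bootstrap of the simple indirect effect a*b for the path x -> m -> y."""
    rng = np.random.default_rng(seed)
    n = len(df)
    effects = np.empty(n_boot)
    for i in range(n_boot):
        sample = df.iloc[rng.integers(0, n, n)]                       # resample respondents
        a = smf.ols(f"{m} ~ {x}{controls}", data=sample).fit().params[x]
        b = smf.ols(f"{y} ~ {m} + {x}{controls}", data=sample).fit().params[m]
        effects[i] = a * b
    return effects.mean(), np.percentile(effects, [2.5, 97.5])

# Hypothetical usage: indirect effect of algorithmic transparency (AT) on organizational
# identification (OI) through transactional contract fulfillment (TCF).
# df = pd.read_csv("composites.csv")
# est, ci = bootstrap_indirect(df, x="AT", m="TCF", y="OI", controls=" + gender + age")
```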

Furthermore, continuing with the bootstrap method to test the chain mediating effects, the results indicate that the effect of the path algorithmic transparency → transactional contract fulfillment → organizational identification → proactive service performance is 0.043 (95% confidence interval [0.025, 0.064]); the interval does not contain 0, indicating a significant indirect effect and supporting hypothesis H4a. Similarly, the effect of the path algorithmic transparency → relational contract fulfillment → organizational identification → proactive service performance is 0.041 (95% confidence interval [0.025, 0.061]); the interval does not include 0, demonstrating a significant indirect effect and confirming hypothesis H4b (Table 6).

Figure 2. Moderating effect of techno-complexity between algorithmic transparency and transactional contract fulfillment.

Figure 3. Moderating effect of techno-complexity between algorithmic transparency and relational contract fulfillment.

The coefficient product method was used to test the moderated chain mediation effect, and the analytical method proposed by Edwards & Lambert (2007) was further employed to verify the significance of the differences in the mediation effects at different levels of the moderating variable.

Table 6. Chain mediating effect test.

As shown in Table 7, when techno-complexity is low (one standard deviation below the mean), the chain mediation effect of algorithmic transparency on proactive service performance through transactional contract fulfillment and organizational identification is 0.065, with a 95% Bootstrap confidence interval of [0.040, 0.095], which does not include 0, indicating a significant chain mediation effect. When techno-complexity is high (one standard deviation above the mean), the corresponding chain mediation effect is 0.015, with a 95% Bootstrap confidence interval of [0.004, 0.028], which does not include 0, also indicating a significant effect. The difference between the chain mediation effects at low and high levels of techno-complexity is −0.050, with a 95% Bootstrap confidence interval of [−0.079, −0.028], which does not include 0, indicating a significant difference.

Table 7. Results of the moderated chain mediation effect: Test 1.

Note: Bootstrap sample size = 5000. High techno-complexity = mean + 1 standard deviation; low techno-complexity = mean − 1 standard deviation. LL = lower limit, CI = confidence interval, UL = upper limit.

Similarly, as shown in Table 8, when techno-complexity is low, the chain mediation effect of algorithmic transparency on proactive service performance through relational contract fulfillment and organizational identification is 0.050, with a 95% Bootstrap confidence interval of [0.031, 0.074], which does not include 0, indicating a significant chain mediation effect; when techno-complexity is high, the corresponding effect is 0.026, with a 95% Bootstrap confidence interval of [0.012, 0.044], which does not include 0, also indicating a significant effect. The difference between the chain mediation effects at low and high levels of techno-complexity is −0.024, with a 95% Bootstrap confidence interval of [−0.046, −0.007], which does not include 0, indicating that the difference is significant.

Table 8. Results of the moderated chain mediation effect: Test 2.

Note: Bootstrap sample size = 5000. High techno-complexity = mean + 1 standard deviation; low techno-complexity = mean − 1 standard deviation. LL = lower limit, CI = confidence interval, UL = upper limit.

Thus, as techno-complexity decreases, the chain mediation effect of transactional contract fulfillment and organizational identification on the relationship between algorithmic transparency and proactive service performance is significantly strengthened, and the chain mediation effect of relational contract fulfillment and organizational identification is likewise significantly enhanced.
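The conditional chain indirect effects in Table 7 and Table 8 follow the usual product-of-coefficients logic: the moderated first-stage path is evaluated at one standard deviation below and above the mean of techno-complexity and multiplied through the remaining two paths. The sketch below illustrates only that arithmetic with hypothetical placeholder coefficients; in the study itself, each bootstrap resample re-estimates the paths to produce the reported confidence intervals.

```python
# Conditional chain indirect effect for AT -> TCF -> OI -> PSP with the first stage
# moderated by techno-complexity (TC):
#   indirect(W) = (a1 + a3 * W) * d * b
# a1: AT -> TCF, a3: (AT x TC) -> TCF, d: TCF -> OI, b: OI -> PSP.
# The coefficients below are hypothetical placeholders, not the study's estimates.

def conditional_chain_indirect(a1: float, a3: float, d: float, b: float, w: float) -> float:
    return (a1 + a3 * w) * d * b

a1, a3, d, b = 0.60, -0.13, 0.40, 0.32   # placeholder path coefficients
sd_tc = 1.0                              # moderator standardized, so +/- 1 SD = +/- 1

low = conditional_chain_indirect(a1, a3, d, b, -sd_tc)    # effect at low techno-complexity
high = conditional_chain_indirect(a1, a3, d, b, +sd_tc)   # effect at high techno-complexity
index_of_moderated_mediation = a3 * d * b                 # change in the indirect effect per unit of TC
print(round(low, 3), round(high, 3), round(high - low, 3))
```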

5. Conclusion

5.1. Discussion

Taking takeaway riders on the digital platforms Meituan and Eleme as the research object, this study used a questionnaire survey to investigate how gig workers’ perceived algorithmic transparency influences their proactive service performance. The results show that gig workers’ perception of algorithmic transparency positively affects their proactive service performance; that gig workers’ psychological contracts (relational and transactional) and organizational identification play a chain mediating role between the two; and that techno-complexity negatively moderates this mediated relationship. Specifically, the higher the techno-complexity, the weaker the chain mediating effect of the psychological contracts (transactional and relational) and organizational identification and the weaker the relationship between algorithmic transparency and proactive service performance, whereas this mediating effect is stronger when techno-complexity is lower.

The theoretical contribution of this paper is that, by sorting out the concept and connotations of algorithmic transparency, building on prior research on the impact of algorithmic transparency on employees’ psychology and behavior, and adopting the perspectives of social exchange theory and psychological contract theory, it explores the mechanism through which algorithmic transparency affects the proactive service performance of gig workers, thereby extending the theoretical link between the two.

5.2. Practical Inspiration

In the “gig work” format, because algorithms perform managerial functions as organizational agents, they may communicate commitment cues, signal employer intentions, or make decisions that affect a person’s beliefs about the psychological contract and shape their relationship with the employer in a manner similar to a human contract maker (Tomprou & Lee, 2022). The information asymmetry generated by algorithmic opacity is a principal means by which gig economy platforms control digital labor, but it seriously harms workers’ rights. Conversely, the work optimization, humane transactions, organizational respect, and other benefits brought by increased algorithmic transparency often represent the fulfillment of the employer’s commitments to gig workers, which in turn enhances employees’ trust in the organization. Therefore, managers should appropriately disclose the rules by which algorithms operate, respect employees’ right to know, and attend to the virtuous relationship established and maintained between employees and the organization, in order to reduce the psychological contract breach that gig workers experience because of opacity in work assignment, scheduling, route planning, and so on.

Moreover, psychological contract breach signals to employees that they are not valued members of the organization, and they consequently tend to identify with the organization to a lesser degree (Zagenczyk et al., 2011) . Although gig work encourages workers to move quickly from task to task, to undertake tasks for multiple organizations simultaneously, and to hold shorter employment contracts, digital platform firms cannot neglect the fulfillment of gig workers’ relational and transactional contracts simply because these workers are freelancers. Explicitly managing employees’ psychological contracts by focusing on fulfilling realistic promises enables managers to improve employee outcomes and promote employee acceptance of their organizations (Rodwell et al., 2015) .

More importantly, the analytical results of this study suggest that techno-complexity moderates the relationship between algorithmic transparency and relational and transactional contract fulfillment: when gig workers perceive higher techno-complexity, both forms of contract fulfillment are weakened, which ultimately reduces their proactive service performance. Workers on digital gig platforms often complete organizational tasks remotely, and higher techno-complexity means they must understand and use complex technological information and information systems, which places strain on their psychological state and work progress. Providing technical and emotional support has been found useful for helping employees cope with uncertainty in remote work situations (Díaz-Soloaga & Díaz-Soloaga, 2022) . Similarly, in the gig work context, this study argues that informational and emotional support are effective measures for reducing gig workers’ stress. First, the platform organization should create more intra-network communication channels for workers so that knowledge sharing among colleagues is underpinned by technical support and the emotional exhaustion generated by techno-complexity is reduced. Second, platform organizations should build good communication relationships between superiors and subordinates so that the relevant departments can respond to technologically complex issues in a timely manner. At the same time, when releasing new algorithms and rolling out new technological tools, platform enterprises should provide training such as video tutorials and simulated operation to reduce the rupture of the psychological contract caused by gig workers’ perceived techno-complexity.

5.3. Research Limitations and Prospects

First, all questionnaire responses came from gig workers themselves, which may introduce a degree of subjectivity into the results. Future research could combine these self-reports with objective data collected from the terminals of big data-driven platforms to ensure the comprehensiveness and rigor of the study.

Second, the questionnaire results are based on gig workers in Jiangsu Province, which entails regional limitations: factors such as regional culture or geographic location may affect the reliability of the results. Future research could broaden the regional sources of the sample to enhance the generalizability of the findings.

Third, this study takes the psychological contract as its entry point to investigate the mechanism through which perceived algorithmic transparency affects employees’ proactive service performance and the moderating effect of techno-complexity. Given the complexity of algorithmic influences on gig work, however, scholars have also examined gig workers’ destructive deviant behavior (Zhang et al., 2023) , job dedication (Lang et al., 2023) , perceived work engagement (Wang et al., 2022) and so on. In practice, moreover, the psychological contract of gig workers may be affected by a variety of external factors beyond technology. To further clarify how algorithmic technology shapes the psychology and work behavior of gig workers, and to meet the needs of digital human resource management practice, future research could adopt different perspectives to explore in depth the relationships among these factors in the process of human-computer interaction in gig work.

Finally, although this paper argues that increasing algorithmic transparency is conducive to promoting proactive service behaviors among employees, greater transparency may also give rise to phenomena such as “algorithmic abuse” and “gaming the system” (Rani & Furrer, 2021) . Future research could integrate social and ethical values to explore the resulting dilemmas and practical countermeasures for steering technology toward the good.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

References

[1] Ahonen, P., & Erkkilä, T. (2020). Transparency in Algorithmic Decision-Making: Ideational Tensions and Conceptual Shifts in Finland. Information Polity, 25, 419-432.
https://doi.org/10.3233/IP-200259
[2] Albert, S., & Whetten, D. A. (1985). Organizational Identity. Research in Organizational Behavior, 7, 263-295.
[3] Ananny, M., & Crawford, K. (2018). Seeing Without Knowing: Limitations of the Transparency Ideal and Its Application to Algorithmic Accountability. New Media & Society, 20, 973-989.
https://doi.org/10.1177/1461444816676645
[4] Ashforth, B. E., & Mael, F. (1989). Social Identity Theory and the Organization. The Academy of Management Review, 14, 20-39.
https://doi.org/10.2307/258189
[5] Bal, P. M., Jansen, P. G. W., Van der Velde, M. E. G., De Lange, A. H., & Rousseau, D. M. (2010). The Role of Future Time Perspective in Psychological Contracts: A Study among Older Workers. Journal of Vocational Behavior, 76, 474-486.
https://doi.org/10.1016/j.jvb.2010.01.002
[6] Bari, M. W., Qurrah-tul-ain, Abrar, M., & Fanchen, M. (2022). Employees’ Responses to Psychological Contract Breach: The Mediating Role of Organizational Cynicism. Economic and Industrial Democracy, 43, 810-829.
https://doi.org/10.1177/0143831X20958478
[7] Blau, P. (2017). Exchange and Power in Social Life (2nd ed.). Routledge.
https://doi.org/10.4324/9780203792643
[8] Buell, R. W., Kim, T., & Tsay, C.-J. (2017). Creating Reciprocal Value through Operational Transparency. Management Science, 63, 1673-1695.
https://doi.org/10.1287/mnsc.2015.2411
[9] Bujold, A., Parent-Rocheleau, X., & Gaudet, M.-C. (2022). Opacity behind the Wheel: The Relationship between Transparency of Algorithmic Management, Justice Perception, and Intention to Quit among Truck Drivers. Computers in Human Behavior Reports, 8, Article 100245.
https://doi.org/10.1016/j.chbr.2022.100245
[10] Burrell, J. (2016). How the Machine ‘Thinks’: Understanding Opacity in Machine Learning Algorithms. Big Data & Society, 3.
https://doi.org/10.1177/2053951715622512
[11] Busuioc, M. (2021). Accountable Artificial Intelligence: Holding Algorithms to Account. Public Administration Review, 81, 825-836.
https://doi.org/10.1111/puar.13293
[12] Conroy, S. A., Roumpi, D., Delery, J. E., & Gupta, N. (2022). Pay Volatility and Employee Turnover in the Trucking Industry. Journal of Management, 48, 605-629.
https://doi.org/10.1177/01492063211019651
[13] Conway, N., & Coyle-Shapiro, J. A.-M. (2012). The Reciprocal Relationship between Psychological Contract Fulfilment and Employee Performance and the Moderating Role of Perceived Organizational Support and Tenure. Journal of Occupational and Organizational Psychology, 85, 277-299.
https://doi.org/10.1111/j.2044-8325.2011.02033.x
[14] Cram, W. A., Wiener, M., Tarafdar, M., & Benlian, A. (2022). Examining the Impact of Algorithmic Control on Uber Drivers’ Technostress. Journal of Management Information Systems, 39, 426-453.
https://doi.org/10.1080/07421222.2022.2063556
[15] Crowston, K., & Bolici, F. (2019). Impacts of Machine Learning on Work. In Proceedings of the 52nd Hawaii International Conference on System Sciences (pp. 5961-5970).
https://doi.org/10.24251/HICSS.2019.719
[16] D’Arcy, J., Herath, T., & Shoss, M. K. (2014). Understanding Employee Responses to Stressful Information Security Requirements: A Coping Perspective. Journal of Management Information Systems, 31, 285-318.
https://doi.org/10.2753/MIS0742-1222310210
[17] De Fine Licht, K., & De Fine Licht, J. (2020). Artificial Intelligence, Transparency, and Public Decision-Making. AI & Society, 35, 917-926.
https://doi.org/10.1007/s00146-020-00960-w
[18] Diakopoulos, N., & Koliska, M. (2017). Algorithmic Transparency in the News Media. Digital Journalism, 5, 809-828.
https://doi.org/10.1080/21670811.2016.1208053
[19] Diamond, S. S., & Zeisel, H. (1978). Review of Procedural Justice: A Psychological Analysis. Duke Law Journal, 1977, 1289-1296.
https://doi.org/10.2307/1371953
[20] Díaz-Soloaga, P., & Díaz-Soloaga, A. (2022). Forced Telecommuting during the COVID-19 Lockdown: The Impact on Corporate Culture in Spain and Kazakhstan. Corporate Communications: An International Journal, 28, 193-212.
https://doi.org/10.1108/CCIJ-02-2022-0018
[21] Dietvorst, B. J., Simmons, J. P., & Massey, C. (2015). Algorithm Aversion: People Erroneously Avoid Algorithms after Seeing Them Err. Journal of Experimental Psychology: General, 144, 114-126.
https://doi.org/10.1037/xge0000033
[22] Edwards, J. R., & Lambert, L. S. (2007). Methods for Integrating Moderation and Mediation: A General Analytical Framework Using Moderated Path Analysis. Psychological Methods, 12, 1-22.
https://doi.org/10.1037/1082-989X.12.1.1
[23] Epitropaki, O. (2013). A Multi-Level Investigation of Psychological Contract Breach and Organizational Identification through the Lens of Perceived Organizational Membership: Testing a Moderated-Mediated Model. Journal of Organizational Behavior, 34, 65-86.
https://doi.org/10.1002/job.1793
[24] Eslami, M., Krishna Kumaran, S. R., Sandvig, C., & Karahalios, K. (2018). Communicating Algorithmic Process in Online Behavioral Advertising. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (pp. 1-13). Association for Computing Machinery.
https://doi.org/10.1145/3173574.3174006
[25] Felzmann, H., Fosch-Villaronga, E., Lutz, C., & Tamò-Larrieux, A. (2020). Towards Transparency by Design for Artificial Intelligence. Science and Engineering Ethics, 26, 3333-3361.
https://doi.org/10.1007/s11948-020-00276-4
[26] Fieseler, C., Bucher, E., & Hoffmann, C. P. (2019). Unfairness by Design? The Perceived Fairness of Digital Labor on Crowdworking Platforms. Journal of Business Ethics, 156, 987-1005.
https://doi.org/10.1007/s10551-017-3607-2
[27] Gal, U., Jensen, T. B., & Stein, M.-K. (2020). Breaking the Vicious Cycle of Algorithmic Management: A Virtue Ethics Approach to People Analytics. Information and Organization, 30, Article 100301.
https://doi.org/10.1016/j.infoandorg.2020.100301
[28] Gillespie, T. (2014). The Relevance of Algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media Technologies: Essays on Communication, Materiality, and Society. The MIT Press.
https://doi.org/10.7551/mitpress/9780262525374.001.0001
[29] Harms, P. D., & Han, G. (2019). Algorithmic Leadership: The Future Is Now. Journal of Leadership Studies, 12, 74-75.
https://doi.org/10.1002/jls.21615
[30] Hobfoll, S. E. (1989). Conservation of Resources: A New Attempt at Conceptualizing Stress. American Psychologist, 44, 513-524.
https://doi.org/10.1037/0003-066X.44.3.513
[31] Höddinghaus, M., Sondern, D., & Hertel, G. (2021). The Automation of Leadership Functions: Would People Trust Decision Algorithms? Computers in Human Behavior, 116, Article 106635.
https://doi.org/10.1016/j.chb.2020.106635
[32] Homans, G. C. (1958). Social Behavior as Exchange. American Journal of Sociology, 63, 597-606.
https://doi.org/10.1086/222355
[33] Jarrahi, M. H., Newlands, G., Lee, M. K., Wolf, C. T., Kinder, E., & Sutherland, W. (2021). Algorithmic Management in a Work Context. Big Data & Society, 8.
https://doi.org/10.1177/20539517211020332
[34] Kim, K., & Moon, S.-I. (2021). When Algorithmic Transparency Failed: Controversies over Algorithm-Driven Content Curation in the South Korean Digital Environment. American Behavioral Scientist, 65, 847-862.
https://doi.org/10.1177/0002764221989783
[35] Kitchin, R. (2014). The Real-Time City? Big Data and Smart Urbanism. GeoJournal, 79, 1-14.
https://doi.org/10.1007/s10708-013-9516-8
[36] Lang, J. J., Yang, L. F., Cheng, C., Cheng, X. Y., & Chen, F. Y. (2023). Are Algorithmically Controlled Gig Workers Deeply Burned Out? An Empirical Study on Employee Work Engagement. BMC Psychology, 11, Article No. 354.
https://doi.org/10.1186/s40359-023-01402-0
[37] Lapostol Piderit, J. P., Garrido Iglesias, R., & Hermosilla Cornejo, M. P. (2023). Algorithmic Transparency from the South: Examining the State of Algorithmic Transparency in Chile’s Public Administration Algorithms. In Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency (pp. 227-235). Association for Computing Machinery.
https://doi.org/10.1145/3593013.3593991
[38] Lee, M. K., Jain, A., Cha, H. J., Ojha, S., & Kusbit, D. (2019). Procedural Justice in Algorithmic Fairness: Leveraging Transparency and Outcome Control for Fair Algorithmic Mediation. Proceedings of the ACM on Human-Computer Interaction, 3, 1-26.
https://doi.org/10.1145/3359284
[39] Liu, B., & Wei, L. (2021). Machine Gaze in Online Behavioral Targeting: The Effects of Algorithmic Human Likeness on Social Presence and Social Influence. Computers in Human Behavior, 124, Article 106926.
https://doi.org/10.1016/j.chb.2021.106926
[40] Liu, N. T. Y., Kirshner, S. N., & Lim, E. T. K. (2023). Is Algorithm Aversion WEIRD? A Cross-Country Comparison of Individual-Differences and Algorithm Aversion. Journal of Retailing and Consumer Services, 72, Article 103259.
https://doi.org/10.1016/j.jretconser.2023.103259
[41] Liu, W., He, C., Jiang, Y., Ji, R., & Zhai, X. (2020). Effect of Gig Workers’ Psychological Contract Fulfillment on Their Task Performance in a Sharing Economy—A Perspective from the Mediation of Organizational Identification and the Moderation of Length of Service. International Journal of Environmental Research and Public Health, 17, Article 2208.
https://doi.org/10.3390/ijerph17072208
[42] Lo, S., & Aryee, S. (2003). Psychological Contract Breach in a Chinese Context: An Integrative Approach. Journal of Management Studies, 40, 1005-1020.
https://doi.org/10.1111/1467-6486.00368
[43] Mael, F., & Ashforth, B. E. (1992). Alumni and Their Alma Mater: A Partial Test of the Reformulated Model of Organizational Identification. Journal of Organizational Behavior, 13, 103-123.
https://doi.org/10.1002/job.4030130202
[44] Mahmud, H., Islam, A. K. M. N., Ahmed, S. I., & Smolander, K. (2022). What Influences Algorithmic Decision-Making? A Systematic Literature Review on Algorithm Aversion. Technological Forecasting and Social Change, 175, Article 121390.
https://doi.org/10.1016/j.techfore.2021.121390
[45] Masterson, S. S., & Stamper, C. L. (2003). Perceived Organizational Membership: An Aggregate Framework Representing the Employee-Organization Relationship. Journal of Organizational Behavior, 24, 473-490.
https://doi.org/10.1002/job.203
[46] Mittelstadt, B., Russell, C., & Wachter, S. (2019). Explaining Explanations in AI. In Proceedings of the Conference on Fairness, Accountability, and Transparency (pp. 279-288). Association for Computing Machinery.
https://doi.org/10.1145/3287560.3287574
[47] Nasirpouri Shadbad, F., & Biros, D. (2020). Technostress and Its Influence on Employee Information Security Policy Compliance. Information Technology & People, 35, 119-141.
https://doi.org/10.1108/ITP-09-2020-0610
[48] Pasquale, F. (2015). The Black Box Society: The Secret Algorithms That Control Money and Information. Harvard University Press.
https://doi.org/10.4159/harvard.9780674736061
[49] Price, W. N. (2018). Big Data and Black-Box Medical Algorithms. Science Translational Medicine, 10, eaao5333.
https://doi.org/10.1126/scitranslmed.aao5333
[50] Rader, E., Cotter, K., & Cho, J. (2018). Explanations as Mechanisms for Supporting Algorithmic Transparency. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (pp. 1-13). ACM.
https://doi.org/10.1145/3173574.3173677
[51] Ramesh, R., Ananthram, S., Vijayalakshmi, V., & Sharma, P. (2021). Technostressors—A Boon or Bane? Toward an Integrative Conceptual Model. Journal of Indian Business Research, 14, 278-300.
https://doi.org/10.1108/JIBR-10-2021-0348
[52] Rani, U., & Furrer, M. (2021). Digital Labour Platforms and New Forms of Flexible Work in Developing Countries: Algorithmic Management of Work and Workers. Competition & Change, 25, 212-236.
https://doi.org/10.1177/1024529420905187
[53] Rank, J., Carsten, J. M., Unger, J. M., & Spector, P. E. (2007). Proactive Customer Service Performance: Relationships with Individual, Task, and Leadership Variables. Human Performance, 20, 363-390.
[54] Restubog, S. L. D., Hornsey, M. J., Bordia, P., & Esposo, S. R. (2008). Effects of Psychological Contract Breach on Organizational Citizenship Behaviour: Insights from the Group Value Model. Journal of Management Studies, 45, 1377-1400.
https://doi.org/10.1111/j.1467-6486.2008.00792.x
[55] Rodwell, J., Ellershaw, J., & Flower, R. (2015). Fulfill Psychological Contract Promises to Manage In-Demand Employees. Personnel Review, 44, 689-701.
https://doi.org/10.1108/PR-12-2013-0224
[56] Rousseau, D. M. (1990). New Hire Perceptions of Their Own and Their Employer’s Obligations: A Study of Psychological Contracts. Journal of Organizational Behavior, 11, 389-400.
https://doi.org/10.1002/job.4030110506
[57] Sandvig, C., Hamilton, K., Karahalios, K., & Langbort, C. (2014). Auditing Algorithms: Research Methods for Detecting Discrimination on Internet Platforms. In 64th Annual Meeting of the International Communication Association.
[58] Shah, M. U., Rehman, U., Parmar, B., & Ismail, I. (2023). Effects of Moral Violation on Algorithmic Transparency: An Empirical Investigation. Journal of Business Ethics.
https://doi.org/10.1007/s10551-023-05472-3
[59] Shahriari, K., & Shahriari, M. (2017). IEEE Standard Review—Ethically Aligned Design: A Vision for Prioritizing Human Wellbeing with Artificial Intelligence and Autonomous Systems. In 2017 IEEE Canada International Humanitarian Technology Conference (IHTC) (pp. 197-201). IEEE.
https://doi.org/10.1109/IHTC.2017.8058187
[60] She, S., Xu, H., Wu, Z., Tian, Y., & Tong, Z. (2020). Dimension, Content, and Role of Platform Psychological Contract: Based on Online Ride-Hailing Users. Frontiers in Psychology, 11, Article 2097.
https://doi.org/10.3389/fpsyg.2020.02097
[61] Shin, D., Zhong, B., & Biocca, F. A. (2020). Beyond User Experience: What Constitutes Algorithmic Experiences? International Journal of Information Management, 52, Article 102061.
https://doi.org/10.1016/j.ijinfomgt.2019.102061
[62] Springer, A., & Whittaker, S. (2020). Progressive Disclosure: When, Why, and How Do Users Want Algorithmic Transparency Information? ACM Transactions on Interactive Intelligent Systems, 10, 1-32.
https://doi.org/10.1145/3374218
[63] Stamper, C. L., Masterson, S. S., & Knapp, J. (2009). A Typology of Organizational Membership: Understanding Different Membership Relationships through the Lens of Social Exchange. Management and Organization Review, 5, 303-328.
https://doi.org/10.1111/j.1740-8784.2009.00147.x
[64] Stohl, C., Stohl, M., & Leonardi, P. M. (2016). Digital Age | Managing Opacity: Information Visibility and the Paradox of Transparency in the Digital Age. International Journal of Communication, 10, 123-137.
[65] Tajfel, H. (Ed.) (1978). Differentiation between Social Groups: Studies in the Social Psychology of Intergroup Relations. Academic Press.
[66] Tarafdar, M., Pullins, E. B., & Ragu-Nathan, T. S. (2015). Technostress: Negative Effect on Performance and Possible Mitigations. Information Systems Journal, 25, 103-132.
https://doi.org/10.1111/isj.12042
[67] Tarafdar, M., Tu, Q., Ragu-Nathan, B. S., & Ragu-Nathan, T. S. (2007). The Impact of Technostress on Role Stress and Productivity. Journal of Management Information Systems, 24, 301-328.
https://doi.org/10.2753/MIS0742-1222240109
[68] Tomprou, M., & Lee, M. K. (2022). Employment Relationships in Algorithmic Management: A Psychological Contract Perspective. Computers in Human Behavior, 126, Article 106997.
https://doi.org/10.1016/j.chb.2021.106997
[69] Turnley, W. H., Bolino, M. C., Lester, S. W., & Bloodgood, J. M. (2003). The Impact of Psychological Contract Fulfillment on the Performance of In-Role and Organizational Citizenship Behaviors. Journal of Management, 29, 187-206.
https://doi.org/10.1177/014920630302900204
[70] Wang, C., Chen, J., & Xie, P. (2022). Observation or Interaction? Impact Mechanisms of Gig Platform Monitoring on Gig Workers’ Cognitive Work Engagement. International Journal of Information Management, 67, Article 102548.
https://doi.org/10.1016/j.ijinfomgt.2022.102548
[71] Wiener, M., Cram, W. A., & Benlian, A. (2023). Algorithmic Control and Gig Workers: A Legitimacy Perspective of Uber Drivers. European Journal of Information Systems, 32, 485-507.
https://doi.org/10.1080/0960085X.2021.1977729
[72] Wu, C.-M., & Chen, T.-J. (2015). Psychological Contract Fulfillment in the Hotel Workplace: Empowering Leadership, Knowledge Exchange, and Service Performance. International Journal of Hospitality Management, 48, 27-38.
https://doi.org/10.1016/j.ijhm.2015.04.008
[73] Yang, Q., & Pitafi, A. H. (2023). A Moderated Mediation Investigation of the Influence of Enterprise Social Media Visibility on Work Stress. Acta Psychologica, 241, Article 104084.
https://doi.org/10.1016/j.actpsy.2023.104084
[74] You, S., Yang, C. L., & Li, X. (2022). Algorithmic versus Human Advice: Does Presenting Prediction Performance Matter for Algorithm Appreciation? Journal of Management Information Systems, 39, 336-365.
https://doi.org/10.1080/07421222.2022.2063553
[75] Young, M. M., Bullock, J. B., & Lecy, J. D. (2019). Artificial Discretion as a Tool of Governance: A Framework for Understanding the Impact of Artificial Intelligence on Public Administration. Perspectives on Public Management and Governance, 2, 301-313.
https://doi.org/10.1093/ppmgov/gvz014
[76] Zagenczyk, T. J., Gibney, R., Few, W. T., & Scott, K. L. (2011). Psychological Contracts and Organizational Identification: The Mediating Effect of Perceived Organizational Support. Journal of Labor Research, 32, 254-281.
https://doi.org/10.1007/s12122-011-9111-z
[77] Zarsky, T. (2016). The Trouble with Algorithmic Decisions: An Analytic Road Map to Examine Efficiency and Fairness in Automated and Opaque Decision Making. Science, Technology, & Human Values, 41, 118-132.
https://doi.org/10.1177/0162243915605575
[78] Zhang, L., Yang, J., Zhang, Y., & Xu, G. (2023). Gig Worker’s Perceived Algorithmic Management, Stress Appraisal, and Destructive Deviant Behavior. PLOS ONE, 18, e0294074.
https://doi.org/10.1371/journal.pone.0294074
[79] Zhang, Z., Ye, B., Qiu, Z., Zhang, H., & Yu, C. (2022). Does Technostress Increase R&D Employees’ Knowledge Hiding in the Digital Era? Frontiers in Psychology, 13, Article 873846.
https://doi.org/10.3389/fpsyg.2022.873846

Copyright © 2024 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.