XRDS: Crossroads, The ACM Magazine for Students

Magazine: Features
Lessons from Workers' Inquiry: A Sociotechnical Approach to Audits of Algorithmic Management Systems


By Samantha Dalal




How can we hold the black-boxed algorithmic systems that impact our housing, employment, and financial prospects accountable? Audits, structured investigations of a system with the goal of achieving accountability, offer a pathway for scrutinizing these systems. Yet despite the growing prevalence of algorithmic auditing, an "accountability gap" persists: AI audits have largely failed to precipitate meaningful change or regulation of the systems they investigate [1,2,3,4].

A sociotechnical approach to auditing AI systems that takes the social, political, and economic contexts into account is crucial to closing this accountability gap. It should go beyond the technical evaluation of model performance to consider factors such as the scope of the audit, participation of stakeholders in the audit process, methodologies utilized, and impact realized. This approach allows us to consider the role of social systems, in which AI models are embedded, in shaping the efficacy and success of audits.

Algorithmic Management Systems Create Visual Asymmetries

Ride-hail drivers, couriers, sex workers, influencers, freelancers, and journalists may seem like workers who are very different from each other. However, all of these workers increasingly find themselves working on platforms—Uber, Deliveroo, OnlyFans, Instagram—that use algorithms to make decisions about who gets work, what work they can get, and how that work gets rewarded. In other words, all these workers are subject to algorithmic management. Nor is algorithmic management unique to these workers; nearly all workplaces now feature some form of it, such as resume-screening services and workplace productivity trackers. These algorithms take in massive amounts of data about workers (keystrokes, location data, behavioral data, etc.) and use this information to shape the environment in which workers labor. Algorithmic management systems have immense visibility into workers, but how these systems function and are integrated into decision-making processes is poorly understood. In other words, algorithms at work are largely black-boxed, making it difficult to understand how they contribute to potential violations of labor standards.

To hold algorithmic systems accountable, researchers and activists need to know about people's experiences within those systems; they need visibility into algorithmic systems. But what should investigators be looking for? Who is best positioned to determine this? Who should do the monitoring?

Labor's Rich History of Monitoring from Below

The idea that management technologies in the workplace should be monitored is not new. Since the late 1800s, workers have leveraged surveys and data visualizations, conducting studies called workers' inquiries, to understand how their labor was being impacted by management technologies. These inquiries allowed workers to surface concerns about the impacts of management technologies on their working conditions and to guide subsequent union-led, large-scale investigations. For example, the United Electrical, Radio and Machine Workers of America (UE) realized the pace of work was increasing during World War II due to increased wartime demand and began calling for systematic documentation of management practices to oversee productivity rates [5]. Worker observations about the changing pace of work informed union efforts to raise awareness about and systematically investigate the management technologies of concern.

As management technologies started to change not only the pace of work but also the organization of work, unions realized they needed to engage in scientific inquiry to understand what these technologies were, how they functioned, and how they impacted the labor process. Powerful unions like UE and the International Ladies Garment Workers Union invested significant resources in internal management science departments to independently evaluate how these technologies worked. By using the same technologies that employers leveraged, unions were able to speak the same quantitative language as management; workers were able to engage in what Vera Khovanskaya calls "data rhetoric" when it came to bargaining for better working conditions [2]. Workers' inquiries enabled them to construct a counter-visuality [6] of the workplace that could be used to contest management's accounts.

Importantly, unions were able to leverage the counter-visualities they created to bargain with employers for material changes in their working conditions. Union bargaining power depended in large part not on their access to data and their ability to engage in data rhetoric, but rather on the political and economic power they held at the time. During the early-to-mid 1900s, unions in the U.S. were powerful political and economic entities. Their members comprised a highly skilled and essential manufacturing workforce that companies needed to meet increased wartime (and postwar) demand. Unions had the power to make demands of employers. Moreover, that period was relatively worker friendly: The working class was seen as a valued part of American society. This combination of political climate and union power was key to effective bargaining around management technologies.


A key element of equitable design work with community members is allowing them to be active participants in the direction of research.


Union-led counter-visuality efforts relied on two key factors: 1. the observability of the management technologies under investigation, and 2. the possession of significant economic and political power to conduct investigations and act upon their findings. However, management technologies today, like algorithmic management systems, are less easily observable. Additionally, unions have far less power and influence than they did in the 20th century. How then can we investigate and take action upon contemporary, black-boxed management technologies given the resourcing available to unions today?

What Does Counter-Visuality of an Algorithm Look Like?

A counter-visuality of an algorithm is an alternative narrative about how the algorithm functions and impacts users. In the platform economy, platforms exert a monopoly over the narrative about the impacts of their algorithmic systems because they are the only ones with access to these systems. This enclosure makes it difficult for workers and external researchers to investigate the impacts of these systems and create counter-visualities. Moreover, because many platform companies classify their workers as independent contractors, they effectively hamstring workers' ability to collectively organize and develop the capacities to strategically share information. In response to platforms' data enclosure, rideshare workers and their allies use crowdsourced audits to increase visibility by collecting data from end-users about algorithmic system outputs, such as pay stubs, task offerings, and bonuses.

These crowdsourced audits are often loosely organized, many taking place through informal information sharing on online social networks. For example, in r/uberdrivers and r/lyftdrivers, two of the most popular rideshare subreddits, drivers share screenshots of their paystubs and ride offers to lament low wages and poor working conditions. While this collective information sharing encourages the construction of shared identity and solidarity, it also prevents more systematic, structured data aggregation and analysis. It is difficult for drivers to read through hundreds of posts with screenshots from dozens of work regions and come up with cohesive ideas about how the algorithm manages their work. In addition, the pseudonymous nature of online forums can lead to non-constructive interactions between drivers trying to hypothesize about the effects of algorithmic pay systems [7]. Efforts by rideshare drivers to systematically collect screenshots and analyze their content require time, technical resources, and expertise.

Other more systematic efforts, like the Workers Info Exchange's use of subject access requests (SARs) to collect structured information about platforms' algorithmic outputs, encountered issues with platform compliance and response rate. Leveraging legal regimes like SARs is limited in efficacy because these regimes rely on platforms' compliance and lack a strong enforcement mechanism. A myriad of third-party tools, such as Drivers Seat Cooperative and WeClockIt, have sprung up to empower workers to collect data about algorithmic outputs. However, the majority of these efforts focus on providing workers with (valuable) individual-level insights about the algorithmic systems, like patterns in pay and working hours. Counter-visualities of algorithms require data in aggregate. Think of an algorithmic counter-visuality as an image. A single pixel (e.g., an individual worker's data point) would tell you very little about what the overall image looks like; thousands of pixels, of data points, are needed to make a cohesive image.
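The pixels-to-image analogy maps directly onto data aggregation: any single record reveals little, but pooling records surfaces patterns. A minimal sketch of this idea, using entirely hypothetical per-driver pay observations and field names (not the schema of any actual tool mentioned above):

```python
from collections import defaultdict
from statistics import median

# Hypothetical crowdsourced observations: (work_region, hourly_pay).
# Each tuple is one "pixel"; pooling by region sketches the larger image.
observations = [
    ("denver", 16.0), ("denver", 14.5), ("denver", 15.2),
    ("boulder", 11.0), ("boulder", 12.3), ("boulder", 10.8),
]

# Group individual data points by region.
by_region = defaultdict(list)
for region, pay in observations:
    by_region[region].append(pay)

# Aggregate statistics reveal a regional pay gap that no single
# screenshot or data point could show on its own.
for region, pays in by_region.items():
    print(f"{region}: median hourly pay ${median(pays):.2f} (n={len(pays)})")
```

The median is used rather than the mean because crowdsourced pay data tends to contain outliers (bonus weeks, partial shifts); real aggregation efforts would also need to handle deduplication and region normalization.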

Creating and Taking Action with Counter-Visualities through Sociotechnical Audits

HCI researchers are well-positioned to develop technologies that enable the creation of counter-visualities through large-scale, systematic data collection. Drawing upon HCI's rich history of supporting crowdsourced activities and structured information sharing, my collaborators and I started the Workers Algorithm Observatory (WAO) in 2023. Our goal is to create infrastructure that facilitates structured information aggregation and analysis of algorithmic management systems. While the history of workers creating counter-visualities inspired us, we soon discovered there are important differences between the labor landscape then and now that limit the possibilities of conducting workers' inquiries in the context of algorithmic management.


To hold algorithmic systems accountable, researchers and activists need to know about peoples' experience within those systems.


HCI research is about building trust as much as it is about building systems. A key difference between the historical tradition of workers' investigations that inspired the WAO and our contemporary efforts is that historically this type of research was conducted by unions and not external academics. Who does the research is an extremely important factor, especially in social justice-oriented research [3]. When unions had internal management science departments to conduct investigations into the impacts of management technologies, there was a clear alignment between the researchers' goals and the unions' goals. Unions, and their members, could trust these internal researchers because they were a part of the community they studied. External academic researchers, however, must negotiate access to the communities they work with and build trust.

Building trust requires time and investment. Early-career graduate students seeking to support social justice movements through technological interventions should first engage in fieldwork to understand the movement and its key players. I began my journey to support labor rights movements through academic research in 2021, with a year-long volunteer data analysis project for Colorado Independent Drivers United (CIDU), a rideshare driver union based in Denver. Through this volunteer work I built relationships with labor organizers, developed an understanding of what data they believed was valuable, and demonstrated my commitment to the labor rights movement. When we started the WAO in 2023, we were able to tap into the goodwill I had built with CIDU to support our research effort.

HCI researchers should have a clear answer to "Why would I participate in this study?" Given the numerous calls for increased data access in the platform economy, we initially thought it would be easy to solicit gig worker participation in the WAO project. News articles, research papers, and policy reports tied access to data about algorithmic management systems to the advancement of gig worker rights. However, it was (and still is) an arduous process to persuade gig workers to participate in data collection efforts. Throughout this research project, we fielded questions from workers about why they should participate in crowdsourced data collection efforts. Workers told us they already used numerous tools to collect data to inform their work strategies and they didn't need yet another tool that they would have to learn how to use. This led the research team to reflect on the question of "who is data about algorithmic management systems useful for?" Upon discussion with our community partners, we concluded worker organizations, rather than individual workers, had both the need for and capacity to engage with aggregated data about algorithmic management systems. Thus, we shifted our research subjects from individual workers to labor unions. This subtle shift in who we were designing for was crucial to designing more effective data collection campaigns. Using the framing that workers were participating in data collection to provide evidence for labor unions' legislative efforts to fight for better gig working conditions helped us to go from 10 study participants to more than 200.

Data is a starting point, not an ending point. Once data collection was underway with our community partners, we thought we had overcome the hardest part of our research: promoting tool adoption. After all, the goal of most academic research is to deploy a probe, gather data, analyze the resultant information, and write up a paper. We finally deployed a successful data collection campaign and gathered detailed information for more than 200,000 rideshare and delivery trips. However, the core motivation behind the WAO was to leverage crowdsourced data to support labor organizing efforts. So what could we do with the crowdsourced data we collected that would support those efforts?

When we presented initial findings to our community partners about how much data we collected and high-level statistics about the data set, they were impressed but unsure of how to move forward with this new information in their organizing efforts. We had to find ways to build connections between the data we collected and the organizing goals of our community partners. To do so, we held open-ended discussions with our community partners about their current organizing goals, what role they believed data could play in supporting those goals, and their existing data practices.

Through these conversations, we were able to create reports for each of our partners that used their organizing goals as a guiding framework for our data analysis. For example, CIDU is currently pushing a transparency bill through the Colorado State Senate that calls for rideshare companies to disclose upfront the percentage of a customer fare that the platform takes (the "take rate"). To support their organizing efforts, we produced an in-depth analysis of the take rate for CIDU drivers that provided empirical evidence about the percentage platforms take. As participatory design research guidelines point out, a key element of equitable design work with community members is allowing them to be active participants in the direction of research. One way to support participation in research direction is by building pathways for community members to shape analyses based on their goals.
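The take-rate analysis described above reduces to simple arithmetic over aggregated trip records: the take rate is the share of the rider's fare that the driver does not receive. A toy sketch, with invented numbers and field names (the WAO's actual data schema and figures are not reproduced here):

```python
# Each record pairs what the rider paid with what the driver earned.
# Values and field names are illustrative, not real WAO data.
trips = [
    {"rider_fare": 25.00, "driver_pay": 14.50},
    {"rider_fare": 18.00, "driver_pay": 12.60},
    {"rider_fare": 40.00, "driver_pay": 20.00},
]

def take_rate(trip):
    """Fraction of the rider's fare kept by the platform."""
    return 1 - trip["driver_pay"] / trip["rider_fare"]

# Aggregating per-trip rates yields the kind of empirical evidence
# a transparency campaign can cite.
rates = [take_rate(t) for t in trips]
avg_rate = sum(rates) / len(rates)
print(f"Average take rate: {avg_rate:.1%}")  # → Average take rate: 40.7%
```

In practice such an analysis would also break the take rate down by region, time of day, and trip type, since averages alone can mask the variation drivers experience.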

It is crucial to remember data collection is a starting point, not an ending point. When building technology to support justice movements, researchers must connect data to a material outcome for the movements they seek to support.

Futuring: Technically Supporting Social Needs in Labor Organizing

When we set out to revitalize the historic process of worker-led investigations into working conditions, we believed the primary issue facing workers today was a technical one: They didn't have the necessary tools to collect, aggregate, and share information about their working conditions. In hindsight, it is clear the historic process of worker-led investigations relied on much more than access to data collection tools (like surveys, union-run research studies, etc.). These historical examples were largely successful due to the social infrastructures the investigations were embedded within. Unions overcame the cold-start problem of data collection not because they had the most frictionless research probes, but because they could connect the collection of data with changes in material outcomes for participants. Data produced by workers supported unions' negotiations with management. Moreover, unions could govern the data analysis and narrative process because they were well-resourced enough to employ in-house researchers.

Contemporary efforts to support worker-led investigations must consider the social elements of research participation. How can we ensure our data collection methods are not only technically sound but also socially viable? At the WAO, we found it vital to consider how we frame the purpose of data collection, whom the data will serve, and how it will benefit those donating their data. How can we support the sensemaking processes around data after it has been collected? Through our work, we found most labor organizations today lack the resources to engage with raw data. To support our partners' engagement with data, we held open-ended discussions about how data ties into their organizing goals and reframed our analyses to align with those goals.

Implications for HCI Research

For decades, HCI researchers have struggled with a persistent socio-technical gap in our interventions; there is always a gap between what we know we want to support socially and what we can support technically. This research with the WAO highlights some key strategies for attempting to close the socio-technical gap in HCI research by identifying how to build scaffolding to support users' social needs when interacting with technical systems. Our users exist in larger social, political, and economic infrastructures. We must think through how users interact with other actors in their ecosystem. How will they use our interventions in their interactions with other actors? How can we support these interactions?

References

[1] Birhane, A. et al. AI auditing: The broken bus on the road to AI accountability. arXiv preprint arXiv:2401.14462 (2024).

[2] Khovanskaya, V., and Sengers, P. Data rhetoric and uneasy alliances: Data advocacy in US labor history. In Proceedings of the 2019 on Designing Interactive Systems Conference. ACM, 2019.

[3] Pierre, J. et al. Getting ourselves together: Data-centered participatory design research & epistemic burden. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. ACM, 2021.

[4] Raji, I. D. et al. Closing the AI accountability gap: Defining an end-to-end framework for internal algorithmic auditing. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency. ACM, 2020.

[5] United Electrical, Radio and Machine Workers of America. U. E. guide to wage payment plans, time study and job evaluation. 1943; https://catalog.hathitrust.org/Record/001431858

[6] Mirzoeff, N. The Right to Look. A Counterhistory of Visuality. Duke University Press, 2011.

[7] Watkins, E. A. "Have you learned your lesson?" Communities of practice under algorithmic competition. New Media & Society 24, 7 (2022), 1567–1590.

Author

Samantha Dalal is an information science Ph.D. student at CU Boulder. Her research identifies how digital entrepreneurs navigate, construct, and maintain socio-technical systems integral to their work routines. She specializes in community-based approaches to support workers' documentation of their existing working conditions and engage workers in the participatory design of more equitable work systems.


This work is licensed under a Creative Commons Attribution 4.0 International License.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2024 ACM, Inc.