Crossroads The ACM Magazine for Students

Association for Computing Machinery

Magazine: Profile
Lessons from Ph.D. Fieldwork: A Conversation with Dr. Azra Ismail


By Vishal Sharma

Full text also available in the ACM Digital Library as PDF | HTML | Digital Edition

Tags: Biographies, Human-centered computing


This conversation with Dr. Azra Ismail took place prior to her graduation from the Georgia Institute of Technology. It has been edited for clarity and brevity. Ismail reflects on her experiences navigating "development" contexts, artificial intelligence (AI) research, intersectional identities, and care ecologies, both personally and professionally. Detailing her seven-plus years of research in diverse contexts in India, she shares the challenges, opportunities, and insights she gained, along with her aspirations for future research. It is our hope that this interview helps researchers interested in care, development, and AI learn from her experiences.

Vishal Sharma (VS): Thanks for taking the time to chat with me, Azra. To begin, would you like to introduce yourself briefly? What research do you do? What was your dissertation focus?

Azra Ismail (AI): First, I'm very happy to be having this conversation. It's a nice opportunity to reflect on my work and the journey so far. I am a graduating Ph.D. student in the Human-Centered Computing program at Georgia Tech, where I worked with Dr. Neha Kumar. At a high level, I see myself as a researcher in the field of human-computer interaction (HCI), which examines the interactions between humans and technology. More recently, HCI has also entailed considering the implications of technologies on people's lives from social and cultural perspectives. My work takes inspiration from that space to study and engage in the design of AI, and data-driven technologies more broadly, in public health. My dissertation focused on maternal care and childcare technologies in India [1, 2], specifically in urban Delhi and Mumbai [3, 4]. I was looking at the impact of AI technologies being introduced for communities that are on the margins, as well as for frontline workers who are often asked to collect data for such AI systems or use these systems as part of their everyday work. It's been an interesting journey so far; I've learned a lot along the way.

VS: You mentioned maternal care and childcare, which I think is a central focus of your work. Can you talk more about what care means in the contexts you have worked in and explored?

AI: In public health, traditionally, much of the focus around maternal care has been on medical care and providing access to certain resources. I've worked with community health workers and nonprofit organizations like SWACH and ARMMAN in India. Their focus has not just been on things like door-to-door visits for care provision and counseling, but on sexual and reproductive well-being more broadly, including emotional support for women even before they get pregnant. My work looks at some of these affective components of caregiving. It also looks at the role of partners and family, especially community actors or community workers, as part of the whole process. What does it mean to create an environment, whether with humans or with these new technologies, that offers a care experience centering communities that may not otherwise get the support they need? I also draw a lot from feminist literature that talks about care and caregiving. That's a huge element of my work and the perspective from which I enter the public health and technology space.

I focus a lot on frontline health workers in my research because these are women who understand these challenges and that context very deeply.

VS: That was my next question. How would you position yourself, your work, and where you're coming from in your research? How does that help you to explore this topic more in-depth?

AI: My starting point was with community health workers in communities in my hometown of Delhi. I began by doing fieldwork around my home in the Jamia area of Delhi, a Muslim-majority region. That's where a lot of my family is also based, so that environment has shaped how I think. I also grew up in the Middle East before I came to the U.S. So, it's been an interesting perspective for me because, on the one hand, I understand the context where this work takes place, but I also have the privilege of being somewhat removed from some of these challenges and struggles.

I often use the framing that I'm a third-world woman in the one-third world, which is a phrasing from Chandra Mohanty, a transnational feminist scholar [5]. That's been the position I've been coming from, a position of privilege, but also wanting to understand and support the communities I strongly connected with and grew up in.

VS: Did it ever create any hurdles or problems for you, wanting to study a community that is close to you and that you know well, while at the same time trying to enter without biases or assumptions?

AI: I started with a community in Delhi that I was familiar with and since then, I've worked with other communities. Much of my dissertation work took place in Mumbai, with communities similarly marginalized but also with differences in language, elements of culture, and just the background that the communities are from. It's been interesting to compare what it's been like to do fieldwork in those two settings. The challenges have been similar in those places, especially with workers being overburdened and underpaid, with community members having similar struggles with accessing care, with women facing challenges with gendered access to technology or accessing care because of limited mobility. Those elements have been common across contexts, giving me a starting point and language to connect with people.

But there are also differences in simple things, like where I am based when doing that research. In Delhi, I could just be living in my home; it was a five-minute walk to these field sites. In Mumbai, it was very different: I was based elsewhere and came into the community using public or other transportation. Even that element of access to the communities was very different. The language is different. So those pieces came to the fore when I looked at these different contexts.

My work takes inspiration from that space to study and engage in the design of AI, and data-driven technologies more broadly, in public health.

The other thing I have noticed is that it is easier to connect with some communities than others. As a Muslim woman, I found it very easy to connect with Muslim communities in general across different settings. I've tried to reflect on why that is and have seen that the differences are visible in something as simple as sharing my name. I had this situation in the field when I was traveling with a male translator in rural Maharashtra, and he asked me my name and background, and he was like, "Oh, are you Mohammedan?" and I said, "Yes." And then there were all sorts of things he said, all kinds of biases that came up. There were also a lot of frontline health workers who I was interfacing with as part of that particular trip. One particular health worker was Muslim, and [we] connected immediately. Communities, too, are not used to seeing someone with a similar name or some sort of connection doing that kind of work. So, it's been very interesting for me and also very heartwarming. I remember in my fieldwork last summer with ARMMAN, which was in Mumbai, there was this one field worker I was interfacing with. She was Muslim and said, "Oh, it's so nice to see a Muslim woman doing a Ph.D. in the U.S." That was heartwarming, but it also made me sad and somewhat guilty that there were so few in these spaces that having someone come in was such a big deal.

VS: Those are some wonderful experiences. Thanks for sharing. If there is one insight or one takeaway from your fieldwork in India over the last, I don't know, seven or so years, what would that be?

AI: One thing I keep coming back to is just this element of voice. What does it mean, not just to be able to express your voice, but for it to be heard and taken seriously? With care work, it becomes very clear that certain people have privileged voices. Typically, it is not the ones for whom that care experience is most important. So, women's and health workers' voices often get suppressed. That has been somewhat frustrating to observe. Caregiving is so socially and culturally embedded that the role of gender shapes it, cultural practices shape it, and it can also be shaped by faith, caste, and class. You start to see how all of those things are interconnected, but ultimately, the communities experiencing multiple axes of oppression are the ones who have the least say. This has been a recurring theme and something I expect also happens in the U.S. We see this in Atlanta, for instance, where the maternal mortality rate for Black women is three or four times that of white women—and that's across class, across different ages. Again, that pattern is repeated, where women who come from a certain demographic, their voices are just missing, or when they express their voice, it is not heard.

VS: What do you think could be the reasons behind these issues persisting in different contexts? You mentioned that women in frontline health work in India face these issues, and Black women in the U.S. face similar ones. What could be the reason behind such issues across these contexts?

AI: I think it is very structural. In the U.S., you have a history of racism and discrimination, which the healthcare sector is not immune to, and it is the same in India. During Indira Gandhi's tenure as prime minister and the state of emergency she imposed in 1975, India had a massive sterilization program as part of an effort at population control. Many women were sterilized even when they did not consent to it and did not want it. We still see some of the repercussions today, with mistrust in government healthcare systems. That is also a reflection of how little women's bodies are respected. The fact that you think you can sterilize a woman and dismiss her experience and emotions is concerning and disturbing. Those histories, unfortunately, are very embedded. These are very colonial histories that go back to how these public health structures have been set up and what they value, and maternal care and childcare are where these values become very visible.

It's also something you'll see in other public health spaces and in other social services as well. The history of discrimination in India, especially around caste and religion, is very much part of the system. You can see this in how many healthcare providers speak about their patients. I've had conversations where it's immediately clear that the provider is biased against a certain community. Often, they're unaware of it, or sometimes they may be aware of that bias, which significantly impacts the people they will be working with.

VS: This all sounds very depressing. While doing all of this work, how do you take care of yourself? How do you practice self-care?

AI: The most energizing element of my work has been the people I interact with and talk to. I focus a lot on frontline health workers in my research because these are women who understand these challenges and that context very deeply. They see themselves as change-makers to some extent. To the extent possible, they use their voice and try to look out for and advocate for their communities. Of course, there are constraints—they only have a certain level of power in the system. Sometimes, they are also forced to become agents of these kinds of histories of colonialism. But there is still a lot of activism these workers are expressing.

I understand the context where this work takes place, but I also have the privilege of being somewhat removed from some of these challenges and struggles.

We saw a lot of that during the pandemic, especially because more technologies for care work were introduced, and we had workers trying to advocate against some of those because they were being surveilled [6]. There was also a whole movement of workers across countries, in the U.S. and in many parts of the Global South, engaging in protests and strikes to express their frustration over how much they were being asked to work and how little they were getting paid. That is something that I find very powerful, and I draw a lot of energy from those movements. And then, of course, there are things that I have to do in terms of self-care. Making sure that I am taking care of myself and have time to process what I'm seeing, hearing, and feeling, and even simple things like having people with whom I can talk and process these experiences.

VS: You mentioned care. We talked about the development context. We briefly talked about colonial history in such contexts. Where, or how, do you think technology fits into these different buckets? You are an HCI researcher, so how do you think technology plays a role in the contexts you explored? From what angle were you trying to explore technology?

AI: In my work, I've been looking at technologies that have already been introduced and could be introduced in these settings. I'm trying to inform the design to the extent possible. On the one hand, there's a sense that these AI systems will be implemented no matter what; in how governments are responding to AI, there is a strong focus right now on data and AI, so these efforts are going to happen. To some extent, my role is to see how they can do the least harm and where they can be beneficial. What that has meant in practice is thinking about where these technologies can play a role in this large ecosystem. For workers, what are tools that they might be using, or maybe should not be using, as part of their workflows for community members? What are tools that could support them in their care journey? From a public health and policy perspective, what tools could support organizations or government bodies to make better decisions or highlight resources that certain communities might need? For example, one of the studies I've been doing is looking at the role of WhatsApp-based conversational agents in supporting workers as part of their video practices and co-designing that with workers. So really trying to think about how workers perceive such tools. Is that something they are open to using or should even be considering? If so, where are the places where you could support them?

With AI, we often want big outcomes: we want to reduce maternal mortality and child mortality. And often, that won't happen because those are structural problems. You need to have more health workers and put more resources into communities. But there are these other places where workers are overburdened. They are doing a lot of work. Some of the work can be automated. We can try to make more time for them to engage in care work instead of all this data work and the more mundane work they must do. But that's not as exciting as saying that we're going to reduce maternal mortality, even if the outcomes and the goals are the same, even if the goal is ultimately for someone to have a better care experience overall. So that is something that folks building these systems need to be aware of.

VS: You are so close to finishing your Ph.D. What are the next steps, research questions, or ideas you want to explore with your work?

AI: I am interested in understanding the community perspective further. Much of my work has focused on workers as intermediaries and spokespersons, representing communities because they understand them. But with many of these tools being developed, the community voice is entirely missing. You're collecting data from them and trying to target them with these technologies, but there's no sense of whether these communities are even comfortable with their data being used for this purpose. Do they even know that this is going to happen? And to what extent are they deriving value from the use of that data? So that's something I want to look into: How do we build more consensual systems? How do we center community voices as we build these systems? It's also tricky because with AI-based systems, especially machine learning-based systems, what might happen is that you collect data in one location, and then you're building the system and testing it somewhere else and deploying it somewhere else. So, what does it even mean for a community to have a voice?

I am interested in the interaction element with these tools, but not just looking at the interaction between a single tool and a single person. For instance, let's say you have a chatbot for a worker or community member. What does it mean for this tool to have a place as part of a care team or as part of an ecology? Maybe in a conversation with a chatbot and a community member, you want a community health worker or a family member, or someone else who's already part of that interaction, especially given that some of these tools could be spreading misinformation and other issues could be generated as a result. I am also interested in continuing to look at workers and the impact on their work, but I think that's a natural next step, given how many tools are being proposed for decision support.

VS: Of all the work you have done in the last couple of years, if given a chance to start from the beginning or to talk to the Azra of six years ago, what would be your advice to her?

AI: One thing that significantly shaped my Ph.D. was the pandemic. I had this idea starting my Ph.D. that I would work with this one community from start to finish. That's not what ended up happening. My advice would be to embrace the shifts that the future brings instead of trying to go in with a fixed idea of what you want to do and how you want to approach that research space. Keep yourself open and excited about building relationships with people and organizations and seeing where that takes you. That's what I ended up doing, anyway. I would encourage anyone starting their Ph.D. to embrace that uncertainty and that opportunity to build relationships with different actors. That's something we don't value enough.

References

[1] Ismail, A. et al. Imagining caring futures for frontline health work. Proceedings of the ACM on Human-Computer Interaction 6, CSCW2 (2022).

[2] Thakkar, D., Ismail, A. et al. When is machine learning data good?: Valuing in public health datafication. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. ACM, New York, 2022.

[3] Ismail, A. and Kumar, N. Empowerment on the margins: The online experiences of community health workers. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. ACM, New York, 2019.

[4] Ismail, A. and Kumar, N. Engaging solidarity in data collection practices for community health. Proceedings of the ACM on Human-Computer Interaction 2, CSCW (2018).

[5] Mohanty, C. T. "Under western eyes" revisited: Feminist solidarity through anticapitalist struggles. Signs: Journal of Women in Culture and Society 28, 2 (2003), 499–535.

[6] Srivastava, S. A million female frontline COVID workers in India earn just $40 a month. Now they're planning to strike. Bloomberg. Dec. 1, 2021; https://time.com/6125204/asha-covid-india

Authors

Vishal Sharma is a human-centered computing Ph.D. student in the School of Interactive Computing at the Georgia Institute of Technology. His research focuses on investigating how digital technologies can assist in enabling a transition to environmentally sustainable and socially just futures. Sharma is a graduate fellow at the Brook Byers Institute for Sustainable Systems at Georgia Tech and a finalist of the Engineering for Change Fellowship, Verein Deutscher Ingenieure. He is a member of the SIGCHI Sustainability Committee.

Azra Ismail is an incoming assistant professor in biomedical informatics at the School of Medicine at Emory University. She is a recent Ph.D. graduate in human-centered computing from the School of Interactive Computing at the Georgia Institute of Technology. Her dissertation research examined the use of AI systems for maternal and child health in India, considering implications for marginalized workers and communities. Ismail was a recipient of the Work in the Age of Intelligent Machines Doctoral Fellowship and the GVU Foley Scholarship at Georgia Tech. She is recognized in Forbes' 30 Under 30 - Asia - Social Impact (2023) for co-founding MakerGhat.

Copyright held by author.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2023 ACM, Inc.