Beyond "Vulnerable Populations": A Unified Understanding of Vulnerability From a Socio-Ecological Perspective

HCI and CSCW research has witnessed increasing efforts to address diversity and inclusion in research and design practice, as evidenced by the growing body of research with populations deemed vulnerable, marginalized, or underserved. However, this work has been largely limited to a population-specific approach, i.e., identifying certain populations as vulnerable and gathering their individual experiences. Drawing primarily from human-centered security and privacy research, we identify three key challenges faced by this population-specific approach: (1) it is limited in addressing user diversity within the target population; (2) it may fail to capture the complex social reality of vulnerability; and (3) it runs the risk of perpetuating othering and stereotypes. To address these limitations, we propose a socio-ecological perspective on vulnerability adapted from Ecological Systems Theory (EST). We argue that a socio-ecological perspective on vulnerability can guide researchers to look beyond static and stigmatizing definitions of vulnerability and instead focus on the situations, relations, and structures that lead to vulnerability, eventually enabling transferable knowledge of vulnerability across populations. We demonstrate how the socio-ecological lens maps onto existing work and generates new insights in the case of older adults' security and privacy, as well as its potential for application to other contexts such as reproductive privacy and responsible artificial intelligence. We end by providing concrete recommendations on how HCI and CSCW research can better operationalize vulnerability in scholarship and design practice.

2025 · Xinru Tang et al. · Core Concepts in Privacy Research · CSCW
Reimagining Wearable-Based Digital Contact Tracing: Insights from Kenya and Côte d’Ivoire

While digital contact tracing has been extensively studied in Western contexts, its relevance and application in Africa remain largely unexplored. This study focuses on Kenya and Côte d’Ivoire to uncover user perceptions and inform the design of culturally resonant contact tracing technologies. Utilizing a wearable proximity sensor as a technology probe, we conducted field studies with healthcare workers and community members in rural areas through interviews (N = 19) and participatory design workshops (N = 72). Our findings identify critical barriers to adoption, including low awareness, widespread misconceptions, and social stigma. The study emphasizes the need for culturally sensitive and discreet wearables and advocates for awareness campaigns over mandates to foster adoption. Our work addresses the unique needs of Kenyan and Ivorian populations, offering vital design recommendations and insights to guide designers and policymakers in enhancing digital contact tracing adoption across Africa.

2025 · Kavous Salehzadeh Niksirat et al. · EPFL, School of Computer and Communication Sciences; University of Lausanne, Department of Information Systems · Developing Countries & HCI for Development (HCI4D) · Participatory Design · CHI
It's Trying Too Hard To Look Real: Deepfake Moderation Mistakes and Identity-Based Bias

Online platforms employ manual human moderation to distinguish human-created social media profiles from deepfake-generated ones. Biased misclassification of real profiles as artificial can harm general users as well as specific identity groups; however, no work has yet systematically investigated such mistakes and biases. We conducted a user study (n=695) that investigates how (1) the identity of the profile, (2) whether the moderator shares that identity, and (3) the components of a profile shown affect the perceived artificiality of the profile. We find statistically significant biases in people's moderation of LinkedIn profiles based on all three factors. Further, upon examining how moderators make decisions, we find they rely on mental models of AI and attackers, as well as typicality expectations (how they think the world works). The latter includes reliance on race/gender stereotypes. Based on our findings, we synthesize recommendations for the design of moderation interfaces, moderation teams, and security training.

2024 · Jaron Mink et al. · University of Illinois at Urbana-Champaign · Privacy by Design & User Control · Privacy Perception & Decision-Making · Deepfake & Synthetic Media Detection · CHI
“Oh yes! over-preparing for meetings is my jam :)”: The Gendered Experiences of System Administrators

In the system and network administration domain, gender diversity remains a distant target. The experiences and perspectives of sysadmins who belong to marginalized genders (non-cis men) are not well understood beyond the fact that sysadmin work environments are generally not equitable. We address this knowledge gap in our study by focusing on the ways in which sysadmins from marginalized genders manage their work in men-dominated sysadmin workspaces and by understanding what an inclusive workplace would look like. Using a feminist research approach, we engaged with a group of 16 sysadmins who are not cis men via six online focus groups. We found that managing the impact of gender identity in the sysadmin workplace means demonstrating excellence and going above and beyond in system administration tasks, and also requires performing additional care work not expected from cis men. Furthermore, our participants take on additional layers of work due to gender considerations and must actively work to find community in the workplace. To mitigate this additional workload, we recommend more care for care work. For future research, we recommend the use of feminist lenses when studying sysadmin work in order to provide more equitable solutions that ultimately contribute to improving system security by fostering a just workplace.

2023 · Mannat Kaur et al. · Gender and CSCW · CSCW
The Use and Non-Use of Technology During Hurricanes

Hurricanes can cause catastrophic damage; it is critical for those affected to access information about conditions, loved ones, and resources. Prior work in the HCI and CSCW communities has focused on how social media can be vital during natural disasters; non-social media technologies have been under-researched. To understand how technology other than social media can support or harm people during crises, we explore hurricane survivors' use and disuse of multiple kinds of technologies in online surveys with 138 US participants. We find substantial use of technologies other than social media to support survivors' comfort and safety. We also observe that designing technologies for high-resource environments, as with many mainstream apps, causes users to decrease their use of potentially critical technologies during utility outages, which are common during hurricanes. With themes of both (a) broad technology use and (b) conditions preventing technology use, we make recommendations for technical design, policy, and research to empower communities susceptible to hurricanes.

2023 · Lucy Simko et al. · Crisis · CSCW
Beyond the Boolean: How Programmers Ask About, Use, and Discuss Gender

Categorization via gender is omnipresent throughout society, and thus also computing; gender identity is often requested of users before they use software or web services. Despite this fact, no research has explored how software developers approach requesting gender disclosure from users. To understand how developers think about gender in software, we present an interview study with 15 software developers recruited from the freelancing platform Upwork as well as Twitter. We also collected and categorized 917 threads that contained keywords relevant to gender from programming-related sub-forums on the social media service Reddit. Sixteen posts that discussed approaches to gender disclosure were further analyzed. We found that while some developers have an understanding of inclusive gender options, programmers rarely consider when gender data is necessary or the way in which they request gender disclosure from users. Our findings have implications for programmers, software engineering educators, and the broader community concerned with inclusivity.

2023 · Elijah Robert Bouma-Sims et al. · Gender and CSCW · CSCW
A World Full of Privacy and Security (Mis)conceptions? Findings of a Representative Survey in 12 Countries

Misconceptions about digital security and privacy topics in the general public frequently lead to insecure behavior. However, little is known about the prevalence and extent of such misconceptions in a global context. In this work, we present the results of the first large-scale survey of a global population on misconceptions: we conducted an online survey with n = 12,351 participants in 12 countries on four continents. By investigating influencing factors of misconceptions around eight common security and privacy topics (including E2EE, Wi-Fi, VPN, and malware), we find the country of residence to be the strongest predictor of holding misconceptions. We also identify differences between non-Western and Western countries, demonstrating the need for region-specific research on user security knowledge, perceptions, and behavior. While we did not observe many outright misconceptions, we did identify a lack of understanding and uncertainty about several fundamental privacy and security topics.

2023 · Franziska Herbert et al. · Ruhr University Bochum · Privacy by Design & User Control · Privacy Perception & Decision-Making · Cybersecurity Training & Awareness · CHI
Users Can Deduce Sensitive Locations Protected by Privacy Zones on Fitness Tracking Apps

Fitness tracking applications allow athletes to record and share their exercises online, including GPS routes of their activities. However, sharing mobility data potentially raises real-world privacy and safety risks. One strategy to mitigate that risk is a “Privacy Zone,” which conceals portions of the exercise routes that fall within a certain radius of a user-designated sensitive location. A pressing concern is whether privacy zones are an effective deterrent against common attackers, such as a bike thief who carefully scrutinizes online exercise activities in search of their next target. Further, little is known about user perceptions of privacy zones or how they fit into the broader landscape of available privacy precautions. This work presents an online user study (N=603) that investigates the privacy concerns of fitness tracking users and evaluates the efficacy of privacy zones. Participants were first asked about their privacy behaviors with respect to fitness tracking applications. Next, participants completed an interactive task in which they attempted to deduce hidden locations protected by a privacy zone; we manipulated the number of displayed exercise activities that interacted with the privacy zone, as well as its size. Finally, participants were asked further questions about their impressions of privacy zones and use of other privacy precautions. We found that participants successfully inferred protected locations; for the most common privacy zone size, 68% of guesses fell within 50 meters of the hidden location when participants were shown just 3 activities. Further, we found that participants who viewed 3 activities were more confident about their success in the task compared to participants who viewed 1 activity. Combined, these results indicate that users’ privacy-sensitive locations are at risk even when using a privacy zone. We conclude by considering the implications of our findings for related privacy features and discuss recommendations to fitness tracking users and services to improve the privacy and safety of fitness trackers.

2022 · Jaron Mink et al. · University of Illinois at Urbana-Champaign · Sleep & Stress Monitoring · Privacy by Design & User Control · CHI