A House Divided: How U.S. Politics Could Shape Contact-Tracing Adoption in Future Pandemics
Contact tracing has been shown to be an effective tool for limiting the spread of transmissible diseases in countries where it is widely adopted. During the COVID-19 pandemic, contact tracing app adoption in the United States was low, despite the country having the highest number of recorded cases worldwide. To better understand why, we conducted a survey (N=302, matched to U.S. census demographics) and found that political orientation overwhelmingly predicted attitudes towards COVID-19 and the adoption of contact tracing apps. These attitudes also strongly shaped people's willingness to participate in contact tracing in future pandemics. Our findings suggest that the politically charged environment surrounding COVID-19 in the U.S. may have a long-term impact on Americans' willingness to utilize contact tracing in future pandemics. We conclude with recommendations for technology designers and policymakers on how to overcome the sharp divide driven by political discourse in the U.S.
2025 · Garrett Smith et al. · Brigham Young University · AI Ethics, Fairness & Accountability · Content Moderation & Platform Governance · Misinformation & Fact-Checking · CHI

Trust and Visual Focus in Automated Vehicles: A Comparative Study of Beginner and Experienced Drivers
This study investigated the relationship between trust in automation, gaze behavior, and driving performance among beginner and experienced drivers during a simulated driving session. Twenty participants completed a 17-minute drive across three conditions: manual driving, non-critical automated driving, and critical automated driving, with a non-driving-related task (NDRT) introduced between conditions to assess visual attention. Driving performance was evaluated using the standard deviation of lateral position (SDLP), and visual attention was assessed through eye-tracking data using mean gaze duration (MGD). While both groups demonstrated increased trust in the automated system post-session, beginners showed greater lateral position variability in critical conditions, suggesting over-reliance on automation. Eye-tracking analysis revealed significant changes in glance behavior across driving conditions, particularly in response to critical events. These findings highlight how driver experience shapes interactions with automated systems, emphasizing the importance of trust calibration in automated driving scenarios.
2025 · Richa Singh et al. · Tampere University · Automated Driving Interface & Takeover Design · Eye Tracking & Gaze Interaction · CHI

"I Know I'm Being Observed:" Video Interventions to Educate Users about Targeted Advertising on Facebook
Recent work explores how to educate and encourage users to protect their online privacy. We tested the efficacy of short videos for educating users about targeted advertising on Facebook. We designed a video that used an emotional appeal to explain the risks associated with targeted advertising (fear appeal) and demonstrated how to use the associated ad privacy settings (digital literacy). We also designed a version of this video that additionally showed viewers their personal Facebook ad profile, facilitating personal reflection on how they are currently being profiled (reflective learning). We conducted an experiment (n=127) in which participants watched a randomly assigned video, and we measured the impact over the following 10 weeks. We found that these videos significantly increased user engagement with Facebook advertising preferences, especially for those who viewed the reflective learning content. However, those who only watched the fear appeal content were more likely to disengage with Facebook as a whole.
2024 · Garrett Smith et al. · Brigham Young University · Privacy by Design & User Control · Privacy Perception & Decision-Making · Dark Patterns Recognition · CHI

Scaffolding Ethics-Focused Methods for Practice Resonance
Numerous methods and tools have been proposed to motivate or support ethical awareness in design practice. However, many existing resources are not easily discoverable by practitioners, and are often framed using language that is not accessible or resonant with everyday practice. In this paper, we present three complementary strands of work with the goal of increasing the ability of design and technology practitioners to locate and activate methods to support ethically-focused work practices. We first constructed a set of empirically-supported "intentions" to frame practitioners' selection of relevant ethics-focused methods based on interviews with practitioners from a range of technology and design professions. We then leveraged these intentions in the design and iterative evaluation of a website that supports practitioners in identifying opportunities for ethics-focused action. Building on these findings, we propose a set of design considerations to evaluate the practice resonance of resources in supporting ethics-focused practice, laying the groundwork for increased ecological resonance of ethics-focused methods and method selection tools.
2023 · Colin M. Gray et al. · AI Ethics, Fairness & Accountability · Participatory Design · Prototyping & User Testing · DIS

Perceiving Affordances Differently: The Unintended Consequences When Young Autistic Adults Engage with Social Media
Social media can provide numerous benefits, ranging from access to social, instrumental, financial, and other support, to professional development and civic participation. However, these benefits may not be generalizable to all users. Therefore, we conducted an ethnographic case study with eight Autistic young adults, ten staff members, and four parents to understand how Autistic users of social media engage with others, as well as any unintended consequences of use. We leveraged an affordances perspective to understand how Autistic young adults share and consume user-generated content, make connections, and engage in networked interactions with others via social media. We found that they often used a literal interpretation of digital affordances that sometimes led to negative consequences, including physical harm, financial loss, social anxiety, feelings of exclusion, and inadvertent damage to their social relationships. We make recommendations for redesigning social media affordances to be more inclusive of neurodiverse users.
2022 · Xinru Page et al. · Brigham Young University · Cognitive Impairment & Neurodiversity (Autism, ADHD, Dyslexia) · Universal & Inclusive Design · CHI

To Disclose or Not to Disclose: Examining the Privacy Decision-Making Processes of Older vs. Younger Adults
To understand the underlying process of users' information disclosure decisions, scholars often use either the privacy calculus framework or refer to heuristic shortcuts. It is unclear whether the decision process varies by age. Therefore, using these common frameworks, we conducted a web-based experiment with 94 participants, who were either younger (ages 19-22) or older (65+) adults, to understand how perceived app trust, sensitivity of the data, and benefits of disclosure influence users' disclosure decisions. Younger adults were more likely to change their perception of data sensitivity based on trust, while older adults were more likely to disclose information based on perceived benefits of disclosure. These results suggest that older adults made more rationally calculated decisions than younger adults, who made heuristic decisions based on app trust. Our findings counter the mainstream narrative that older adults are less privacy-conscious than younger adults; instead, older adults weigh the benefits and risks of information disclosure.
2021 · Reza Ghaiumy Anaraky et al. · Clemson University · Aging-Friendly Technology Design · Privacy by Design & User Control · Privacy Perception & Decision-Making · CHI

Risk vs. Restriction: The Tension between Providing a Sense of Normalcy and Keeping Foster Teens Safe Online
Foster youth are particularly vulnerable to offline risks; yet, little is known about their online risk experiences or how foster parents mediate technology use in the home. We conducted 29 interviews with foster parents of 42 teens (ages 13-17) who were part of the child welfare system. Foster parents faced significant challenges relating to technology mediation in the home. Based on parental accounts, over half of the foster teens encountered high-risk situations that involved interacting with unsafe people online, resulting in rape, sex trafficking, and/or psychological harm. Overall, foster parents were at a loss for how to balance online safety with technology access in a way that engendered positive relationships with their foster teens. Instead, parents often resorted to outright restriction. Our research highlights the importance of considering the unique needs of foster families and designing technologies to address the challenges faced by this vulnerable population of teens and parents.
2019 · Karla Badillo-Urquiola et al. · University of Central Florida · Cyberbullying & Online Harassment · Empowerment of Marginalized Groups · Technology Ethics & Critical HCI · CHI

Pragmatic Tool vs. Relational Hindrance: Exploring Why Some Social Media Users Avoid Privacy Features
Social media privacy features can act as a mechanism for regulating interpersonal relationships, but why do some people not use these features? Through an interview study of 56 social media users, we found two high-level perspectives towards social media and privacy that affected attitudes towards and usage of privacy features. Some users took a pragmatic approach to using social media and felt comfortable using various privacy features as a tool to manage their social relationships (e.g., avoiding bothersome posts, not feeling compelled to interact). However, there were also users who viewed taking such privacy actions as a relational hindrance and were concerned about how using certain features to meet their own needs would harm their relationships with others. Through a subsequent survey (N=320), we reveal how these two perspectives impact user behavior across four social media platforms (Facebook, Instagram, LinkedIn, Twitter). Users who viewed social media as a pragmatic tool indeed used privacy features more. On the other hand, users who focused on how privacy can serve as a relational hindrance avoided using these features and, instead, prioritized social engagement and took a more indirect approach to protecting their privacy. Furthermore, the results show how these perspectives vary by individual rather than by privacy feature. These findings demonstrate the need to consider different perspectives towards social media and privacy when trying to understand and design for user behavior.
2019 · Xinru Page et al. · Privacy and Security · CSCW

Moving beyond a “one-size fits all” approach: Exploring Individual Differences in Privacy
As our lives become increasingly digitized, how people maintain and manage their networked privacy has become a formidable challenge for academics, practitioners, and policy-makers. A shift toward people-centered privacy initiatives has shown promise; yet many applications still adopt a “one-size fits all” approach, which fails to consider how individual differences in concerns, preferences, and behaviors shape how different people interact with and use technology. The main goal of this workshop is to highlight individual differences (e.g., age, culture, personal preference) that influence users’ experiences and privacy-related outcomes. We will work towards best practices for research, design, and online privacy regulation policies that consider these differences.
2018 · Daricia Wilkinson et al. · Clemson University · Privacy by Design & User Control · Privacy Perception & Decision-Making · Algorithmic Fairness & Bias · CHI

Bridging a Bridge: Bringing Two HCI Communities Together
ACM SIGCHI is the largest association for professionals in HCI, bridging computer science, information science, and the social and psychological sciences. Meanwhile, a parallel HCI community was formed in 2001 within the Association for Information Systems (AIS SIGHCI). While some researchers have already bridged these two HCI sub-disciplines, the history and core values of these respective fields are quite different, offering new insights for how we can move forward together to sustain the future of HCI research. The main goal of this workshop is to begin building a bridge between these two communities to maximize the relevance, rigor, and generalizability of HCI research.
2018 · Soussan Djamasbi et al. · Worcester Polytechnic Institute · User Research Methods (Interviews, Surveys, Observation) · Computational Methods in HCI · CHI

Workshop: Privacy in Context: Critically Engaging Theories to Guide Privacy Research ...
Privacy has been a key research theme in the CSCW and HCI communities, but the term is often used in an ad hoc and fragmented way. This is likely due to the fact that privacy is a complex and multi-faceted concept. This one-day workshop will facilitate discourse around key privacy theories and frameworks that can inform privacy research, with the goal of producing guidelines for privacy researchers on how and when to incorporate which theories into various aspects of their empirical privacy research. This will lay the groundwork to move the privacy field forward. To inspire participants and spark discussion, we will have a special keynote speaker, Dr. Helen Nissenbaum, engage with the audience about her renowned Contextual Integrity framework. Dr. Nissenbaum is a Professor of Information Science at Cornell Tech, and her framework focuses on understanding privacy expectations and their implications.
2018 · Karla Badillo-Urquiola et al. · CSCW