Interdisciplinary Approaches to Cybervulnerability Impact Assessment for Energy Critical Infrastructure

As energy infrastructure becomes more interconnected, understanding cybersecurity risks to production systems requires integrating operational and computer security knowledge. We interviewed 18 experts working in the field of energy critical infrastructure to compare what information they find necessary to assess the impact of computer vulnerabilities on energy operational technology. These experts came from two groups: 1) computer security experts and 2) energy sector operations experts. We find that both groups responded similarly for general categories of information and displayed knowledge about both domains, perhaps due to their interdisciplinary work at the same organization. Yet, we found notable differences in the details of their responses and in their stated perceptions of each group's approaches to impact assessment. Their suggestions for collaboration across domains highlighted how these two groups can work together to help each other secure the energy grid. Our findings inform the development of interdisciplinary security approaches in critical-infrastructure contexts.

CHI 2024 · Andrea Gallardo et al. · Carnegie Mellon University · Topics: Cybersecurity Training & Awareness; IoT Device Privacy
Less is Not More: Improving Findability and Actionability of Privacy Controls for Online Behavioral Advertising

Tech companies that rely on ads for business argue that users have control over their data via ad privacy settings. However, these ad settings are often hidden. This work aims to inform the design of findable ad controls and study their impact on users' behavior and sentiment. We iteratively designed ad control interfaces that varied in the setting's (1) entry point (within ads, at the feed's top) and (2) level of actionability, with high actionability directly surfacing links to specific advertisement settings, and low actionability pointing to general settings pages (which is reminiscent of companies' current approach to ad controls). We built a Chrome extension that augments Facebook with our experimental ad control interfaces and conducted a between-subjects online experiment with 110 participants. Results showed that entry points within ads or at the feed's top, and high actionability interfaces, both increased the findability and discoverability of Facebook's ad settings, as well as their perceived usability. High actionability also reduced users' effort in finding ad settings. Participants perceived high and low actionability as equally usable, which shows it is possible to design more actionable ad controls without overwhelming users. We conclude by emphasizing the importance of regulation to provide specific and research-informed requirements to companies on how to design usable ad controls.

CHI 2023 · Jane Im et al. · University of Michigan · Topics: Privacy by Design & User Control; Privacy Perception & Decision-Making
Identifying User Needs for Advertising Controls on Facebook

We conducted an online survey and remote usability study to explore user needs related to advertising controls on Facebook and determine how well existing controls align with these needs. Our survey results highlight a range of user objectives related to controlling Facebook ads, including being able to select what ad topics are shown or what personal information is used in ad targeting. Some objectives are achievable with Facebook's existing controls, but participants seemed to be unaware of them, suggesting issues of discoverability. In our remote usability study, participants noted areas in which the usability of Facebook's advertising controls could be improved, including the location, layout, and explanation of controls. Additionally, we found that users could be categorized into four groups based on their privacy concerns related to Facebook's data collection practices, objectives for controlling their ad experience, and willingness to engage with advertising controls. Our findings provide a set of user requirements for advertising controls, applicable to Facebook as well as other platforms, that would better align such controls with consumers' needs and expectations.

CSCW 2022 · Hana Habib et al. · Topics: Online Platforms
Understanding Challenges for Developers to Create Accurate Privacy Nutrition Labels

Apple announced the introduction of app privacy details to their App Store in December 2020, marking the first ever real-world, large-scale deployment of the privacy nutrition label concept, which had been introduced by researchers over a decade earlier. The Apple labels are created by app developers, who self-report their app's data practices. In this paper, we present the first study examining the usability and understandability of Apple's privacy nutrition label creation process from the developer's perspective. By observing and interviewing 12 iOS app developers about how they created the privacy label for a real-world app that they developed, we identified common challenges for correctly and efficiently creating privacy labels. We discuss design implications both for improving Apple's privacy label design and for future deployment of other standardized privacy notices.

CHI 2022 · Tianshi Li et al. · Carnegie Mellon University · Topics: Privacy by Design & User Control; Privacy Perception & Decision-Making
"Okay, whatever": An Evaluation of Cookie Consent Interfaces

Many websites have added cookie consent interfaces to meet regulatory consent requirements. While prior work has demonstrated that they often use dark patterns (design techniques that lead users to less privacy-protective options), other usability aspects of these interfaces have been less explored. This study contributes a comprehensive, two-stage usability assessment of cookie consent interfaces. We first inspected 191 consent interfaces against five dark pattern heuristics and identified design choices that may impact usability. We then conducted a 1,109-participant online between-subjects experiment exploring the usability impact of seven design parameters. Participants were exposed to one of 12 consent interface variants during a shopping task on a prototype e-commerce website and answered a survey about their experience. Our findings suggest that a fully-blocking consent interface with in-line cookie options accompanied by a persistent button enabling users to later change their consent decision best meets several design objectives.

CHI 2022 · Hana Habib et al. · Carnegie Mellon University · Topics: Privacy Perception & Decision-Making; Dark Patterns Recognition
Toggles, Dollar Signs, and Triangles: How to (In)Effectively Convey Privacy Choices

Increasingly, icons are being proposed to concisely convey privacy-related information and choices to users. However, complex privacy concepts can be difficult to communicate. We investigate which icons effectively signal the presence of privacy choices. In a series of user studies, we designed and evaluated icons and accompanying textual descriptions (link texts) conveying choice, opting-out, and sale of personal information, the latter an opt-out mandated by the California Consumer Privacy Act (CCPA). We identified icon-link text pairings that conveyed the presence of privacy choices without creating misconceptions, with a blue stylized toggle icon paired with "Privacy Options" performing best. The two CCPA-mandated link texts ("Do Not Sell My Personal Information" and "Do Not Sell My Info") accurately communicated the presence of do-not-sell opt-outs with most icons. Our results provide insights for the design of privacy choice indicators and highlight the necessity of incorporating user testing into policy making.

CHI 2021 · Hana Habib et al. · Carnegie Mellon University · Topics: Privacy by Design & User Control; Privacy Perception & Decision-Making
"You Gotta Watch What You Say": Surveillance of Communication with Incarcerated People

Surveillance of communication between incarcerated and non-incarcerated people has steadily increased, enabled partly by technological advancements. Third-party vendors control communication tools for most U.S. prisons and jails and offer surveillance capabilities beyond what individual facilities could realistically implement. Frequent communication with family improves mental health and post-carceral outcomes for incarcerated people, but does discomfort about surveillance affect how their relatives communicate with them? To explore this and the understanding, attitudes, and reactions to surveillance, we conducted 16 semi-structured interviews with participants who have incarcerated relatives. Among other findings, we learn that participants communicate despite privacy concerns that they felt helpless to address. We also observe inaccuracies in participants' beliefs about surveillance practices. We discuss implications of inaccurate understandings of surveillance, misaligned incentives between end-users and vendors, how our findings enhance ongoing conversations about carceral justice, and recommendations for more privacy-sensitive communication tools.

CHI 2021 · Kentrell Owens et al. · Carnegie Mellon University, University of Washington · Topics: Online Harassment & Counter-Tools; IoT Device Privacy
"It's a scavenger hunt": Usability of Websites' Opt-Out and Data Deletion Choices

We conducted an in-lab user study with 24 participants to explore the usefulness and usability of privacy choices offered by websites. Participants were asked to find and use choices related to email marketing, targeted advertising, or data deletion on a set of nine websites that differed in terms of where and how these choices were presented. They struggled with several aspects of the interaction, such as selecting the correct page from a site's navigation menu and understanding what information to include in written opt-out requests. Participants found mechanisms located in account settings pages easier to use than options contained in privacy policies, but many still consulted help pages or sent email to request assistance. Our findings indicate that, despite their prevalence, privacy choices like those examined in this study are difficult for consumers to exercise in practice. We provide design and policy recommendations for making these website opt-out and deletion choices more useful and usable for consumers.

CHI 2020 · Hana Habib et al. · Carnegie Mellon University · Topics: Privacy by Design & User Control; Dark Patterns Recognition
Informing the Design of a Personalized Privacy Assistant for the Internet of Things

Internet of Things (IoT) devices create new ways through which personal data is collected and processed by service providers. Frequently, end users have little awareness of, and even less control over, these devices' data collection. IoT Personalized Privacy Assistants (PPAs) can help overcome this issue by helping users discover and, when available, control the data collection practices of nearby IoT resources. We use semi-structured interviews with 17 participants to explore user perceptions of three increasingly more autonomous potential implementations of PPAs, identifying benefits and issues associated with each implementation. We find that participants weigh the desire for control against the fear of cognitive overload. We recommend solutions that address users' differing automation preferences and reduce notification overload. We discuss open issues related to opting out from public data collections, automated consent, the phenomenon of user resignation, and designing PPAs with at-risk communities in mind.

CHI 2020 · Jessica Colnago et al. · Carnegie Mellon University · Topics: Privacy by Design & User Control; Privacy Perception & Decision-Making; IoT Device Privacy
Exploring How Privacy and Security Factor into IoT Device Purchase Behavior

Despite growing concerns about security and privacy of Internet of Things (IoT) devices, consumers generally do not have access to security and privacy information when purchasing these devices. We interviewed 24 participants about IoT devices they purchased. While most had not considered privacy and security prior to purchase, they reported becoming concerned later due to media reports, opinions shared by friends, or observing unexpected device behavior. Those who sought privacy and security information before purchase reported that it was difficult or impossible to find. We asked interviewees to rank factors they would consider when purchasing IoT devices; after features and price, privacy and security were ranked among the most important. Finally, we showed interviewees our prototype privacy and security label. Almost all found it to be accessible and useful, encouraging them to incorporate privacy and security in their IoT purchase decisions.

CHI 2019 · Pardis Emami-Naeini et al. · Carnegie Mellon University · Topics: Privacy by Design & User Control; Privacy Perception & Decision-Making
The Influence of Friends and Experts on Privacy Decision Making in IoT Scenarios

As increasingly many Internet-of-Things (IoT) devices collect personal data, users face more privacy decisions. Personal privacy assistants can provide social cues and help users make informed decisions by presenting information about how others have decided in similar cases. To better understand which social cues are relevant and whose recommendations users are more likely to follow, we presented 1,000 online participants with nine IoT data collection scenarios. Some participants were told the percentage of experts or friends who allowed data collection in each scenario, while other participants were provided no social cue. At the conclusion of each scenario, participants were asked whether they would allow the described data collection. Our results indicate that when friends denied data collection, our participants were more influenced than when friends allowed data collection. On the other hand, participants were more influenced by experts when they allowed data collection. In addition, we observed that influence could get stronger or wear off over a repeated sequence of scenarios. For example, when experts and friends repeatedly allowed scenarios with clear risk or denied scenarios with clear benefits, participants were less likely to be influenced by them in subsequent scenarios.

CSCW 2018 · Pardis Emami-Naeini et al. · Topics: Privacy in Homes and Groups
"It's not actually that horrible": Exploring Adoption of Two-Factor Authentication at a University

Despite the additional protection it affords, two-factor authentication (2FA) adoption reportedly remains low. To better understand 2FA adoption and its barriers, we observed the deployment of a 2FA system at Carnegie Mellon University (CMU). We explore user behaviors and opinions around adoption, surrounding a mandatory adoption deadline. Our results show that (a) 2FA adopters found it annoying, but fairly easy to use, and believed it made their accounts more secure; (b) experience with CMU Duo often led to positive perceptions, sometimes translating into 2FA adoption for other accounts; and, (c) the differences between users required to adopt 2FA and those who adopted voluntarily are smaller than expected. We also explore the relationship between different usage patterns and perceived usability, and identify user misconceptions, insecure practices, and design issues. We conclude with recommendations for large-scale 2FA deployments to maximize adoption, focusing on implementation design, use of adoption mandates, and strategic messaging.

CHI 2018 · Jessica Colnago et al. · Carnegie Mellon University · Topics: Passwords & Authentication; Privacy Perception & Decision-Making; IoT Device Privacy
SIGCHI Social Impact Award Talk – Making Privacy and Security More Usable

As an industry researcher in 1997, I dove head first into the world of privacy when I joined an international working group that was developing a web privacy standard called the Platform for Privacy Preferences Project (P3P). Released in 2002, the P3P standard allowed websites to communicate about privacy in a computer-readable format that could be consumed by web browsers and other user agents to inform users and take actions on their behalf. I worked with (and eventually led) a team of technologists, lawyers, and policy makers, and became familiar with not only technologies for protecting and invading privacy, but also international privacy laws and self-regulatory privacy programs. As the working group debated details such as the precise definitions of data sharing and identifiable information, I came to the realization that we hadn't thought much about how to make P3P tools usable. Indeed, usability issues had been largely ignored in the development of most security and privacy tools at that time. At the end of 2003, I moved on to academia and focused my research on usable privacy and security. Along with my students and colleagues, I conducted empirical studies to evaluate privacy and security tools, and recommended ways to make these tools more usable. We asked study participants to make privacy sensitive purchases (condoms and sex toys), and conducted some of the first Mechanical Turk studies related to privacy. Our research papers provided empirical evidence about how long it would take people if they actually did read privacy policies (too long!), that ad-industry-driven privacy efforts were largely ineffective, that privacy "nutrition labels" could help people compare company privacy practices, and that many people care enough about privacy to actually pay for it. We presented our research at events at the US Federal Trade Commission (FTC) and on Capitol Hill.
In 2016 I spent a year in Washington, DC as Chief Technologist at the FTC. Besides advising the chairwoman and staff, I organized an FTC workshop to discuss methods for evaluating the effectiveness of privacy policies and other disclosures. After having my mobile phone account hijacked, I investigated this form of identity theft, and ended up discussing it on a Today Show segment taped in my kitchen. I also wrote blog posts that raised awareness about privacy concerns associated with open police data, and explained why frequent password changes may not be beneficial. In this talk I will discuss my usable privacy and security research and how it has informed policy work. I will highlight empirical studies that provide insights into users' expectations about privacy and security, as well as their use of privacy and security tools. Finally, I will talk about my experiences at the FTC and ways that members of the CHI community can impact public policy.

CHI 2018 · Lorrie Cranor · Carnegie Mellon University · Topics: Privacy by Design & User Control; Privacy Perception & Decision-Making
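To make the computer-readable format described in the talk above concrete: a P3P policy was an XML document declaring, per statement, what data a site collected, for what purpose, who received it, and how long it was retained. The following is a minimal sketch using element names from the P3P 1.0 specification; the policy name, discuri, company name, and data references are purely illustrative.

```xml
<POLICIES xmlns="http://www.w3.org/2002/01/P3Pv1">
  <POLICY name="sample-policy"
          discuri="https://example.com/privacy.html">
    <!-- Who is making this privacy statement -->
    <ENTITY>
      <DATA-GROUP>
        <DATA ref="#business.name">Example Corp</DATA>
      </DATA-GROUP>
    </ENTITY>
    <!-- Users can access no identified data held about them -->
    <ACCESS><nonident/></ACCESS>
    <!-- One statement: email and clickstream data, used only to
         complete the current activity, shared with no one else,
         kept only as long as the stated purpose requires -->
    <STATEMENT>
      <PURPOSE><current/></PURPOSE>
      <RECIPIENT><ours/></RECIPIENT>
      <RETENTION><stated-purpose/></RETENTION>
      <DATA-GROUP>
        <DATA ref="#user.home-info.online.email"/>
        <DATA ref="#dynamic.clickstream"/>
      </DATA-GROUP>
    </STATEMENT>
  </POLICY>
</POLICIES>
```

A user agent such as a browser could fetch a site's policy, match statements like this one against the user's stated preferences, and warn or act on mismatches, which is exactly the tooling whose usability the talk argues had been neglected.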