Bongard, Kerstin
in 2022 7th IEEE European Symposium on Security and Privacy Workshops (EuroSPW) (in press)
App permission requests are a control mechanism meant to help users oversee and safeguard access to data and resources on their smartphones. To decide whether to accept or deny such requests and make this consent valid, users need to understand the underlying reasons and judge the relevance of disclosing data in line with their own use of an app. This study investigates people's certainty about app permission requests via an online survey with 400 representative participants of the UK population. The results demonstrate that users are uncertain about the necessity of granting app permissions for about half of the tested permission requests. This implies substantial privacy risks, which are discussed in the paper, resulting in a call for user-protecting interventions by privacy engineers.

Sergeeva, Anastasia
in CHI '23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (2023, April 19)
Persuasive tactics intend to encourage users to open advertising emails. However, these tactics can overwhelm users, leaving them frustrated and leading to lower open rates. This paper aims to understand which persuasive tactics are used and how users perceive them. We first developed a categorization of inbox-level persuasive tactics in permission-based advertising emails.
We then asked participants to interact with an email inbox prototype, combined with interviews (N=32), to investigate their opinions of advertising emails and the underlying persuasive tactics. Our qualitative findings reveal a poor user experience with advertising emails, which was related to feeling surveilled by companies, forced subscriptions, high prior knowledge of persuasive tactics, and a desire for more agency. We also found that using certain persuasive tactics at the inbox level is perceived as ethically inappropriate. Based on these insights, we provide design recommendations to improve advertising communication and make such emails more valuable to users.

Rohles, Björn
Scientific Conference (2022, May 01)
Educators increasingly agree on the importance of teaching Human-Computer Interaction (HCI) to Computer Science (CS) students, but there is debate on how best to integrate HCI into CS curricula. Unfortunately, standard course evaluations typically do not provide sufficient insights for improving HCI classes. In the present article, we used a human-centered design approach to evaluate our HCI classes, building on a qualitative study with CS students from four introductory HCI classes over two years. We report on a qualitative assessment through interviews, photo elicitation and sentence completion. Specifically, we addressed four research questions: which contents were most relevant, how students experienced the courses, how they view the role of HCI in CS, and which outcomes they perceived from the HCI courses. We gathered rich qualitative insights beyond the standard course evaluations and derived concrete enhancements for future course iterations.
We discuss implications for other HCI educators and contribute recommendations for the living HCI curriculum. Furthermore, we reflect on the usefulness of our methodological approach for collecting in-depth constructive feedback from students.

Distler, Verena
in Computers in Human Behavior Reports (2022), 4
An ongoing discussion in the field of usable privacy and security debates whether security mechanisms should be visible to end-users during interactions with technology, or hidden away. This paper addresses this question using a mixed-methods approach, focusing on encryption as a mechanism for confidentiality during data transmission on a smartphone application. In study 1, we conducted a qualitative co-creation study with security and Human-Computer Interaction (HCI) experts (N = 9) to create appropriate textual and visual representations of the security mechanism encryption in data transmission. We investigated this question in two contexts: online banking and e-voting. In study 2, we put these ideas to the test by presenting these visual and textual representations to non-expert users in an online vignette experiment (N = 2180). We found a statistically significant and positive effect of the textual representation of encryption on perceived security and understanding, but not on user experience (UX). More complex text describing encryption resulted in higher perceived security and more accurate understanding. The visual representation of encryption had no statistically significant effect on perceived security, UX or understanding. Our study contributes to the larger discussion regarding visible instances of security and their impact on user perceptions.
Distler, Verena
in ACM Transactions on Computer-Human Interaction (2021), 28(6), 50
Usable privacy and security researchers have developed a variety of approaches to represent risk to research participants. To understand how these approaches are used and when each might be most appropriate, we conducted a systematic literature review of methods used in security and privacy studies with human participants. From a sample of 633 papers published at five top conferences between 2014 and 2018 that included keywords related to both security/privacy and usability, we systematically selected and analyzed 284 full-length papers that included human subjects studies. Our analysis focused on study methods; risk representation; the use of prototypes, scenarios, and educational intervention; the use of deception to simulate risk; and types of participants. We discuss benefits and shortcomings of the methods, and identify key methodological, ethical, and research challenges when representing and assessing security and privacy risk. We also provide guidelines for the reporting of user studies in security and privacy.

Distler, Verena
Doctoral thesis (2021)
In traditional interactions that do not rely on technology, most people are able to assess risks to their privacy and security and understand how to mitigate these risks.
However, risk assessment and mitigation are more challenging when interacting with technology, and people's perceptions of security and privacy risks are not always aligned with reality. It is important for those who design technologies to understand how people perceive the security of technologies, in order to avoid having their designs contribute to erroneous perceptions. Instead, interactions with technology should be deliberately designed to ensure that people do not over- or underestimate the security provided by the system. This dissertation contributes to a better understanding of users' perceptions of security in human-computer interactions. It investigates which factors induce a perception of security and privacy risks and how user-centered design can influence these factors to deliberately design for or against perceived security. I use a mixed-methods approach to address these objectives, including a systematic literature review, empirical data collection with focus groups, expert co-creation sessions, user tests in a controlled environment and a quantitative survey experiment.

The first research objective is to analyze how security and privacy researchers induce a perception of security and privacy risks in research participants. We conducted a systematic literature review and focused our analysis on study methods; risk representation; the use of prototypes, scenarios, and educational interventions; the use of deception to simulate risk; and types of participants. We discuss benefits and shortcomings of the methods, and identify key methodological, ethical, and research challenges when representing and assessing security and privacy risk. We also provide guidelines for the reporting of user studies in security and privacy.
The second research objective is to explore the factors that contribute to the acceptance of privacy and security risks in situations where people need to weigh the potential advantages of a technology against its associated privacy or security risks. We conducted a series of focus groups and highlighted the reasons why people accept compromises to their privacy and security, finding that perceived usefulness and the fulfilment of the psychological needs for autonomy and control were important factors. Our results suggest potential links between technology acceptance models and user experience models in the context of privacy-relevant interactions.

The third research objective is to design and evaluate examples of visible representations of security mechanisms, with a focus on encryption. We studied the effects of these visual and textual representations empirically to understand their impact on user experience, perceptions of security and users' understanding of encryption. We addressed this question in a series of studies, both lab studies and online experiments. In a vignette experiment, we find that more complex descriptions of encryption can lead to a better understanding and higher perceived security when designed carefully. However, we find no effect of novel visualizations of encryption on user experience (UX), perceived security or understanding of encryption.

The fourth objective is to explore how we might make the link from subjective experience to more secure behaviors. We introduce a new framework of security-enhancing friction design. The framework suggests helping users behave more securely by designing for moments of negative UX in security-critical situations, while also ensuring that overall UX remains at an acceptable level to avoid disuse of secure technologies.
Overall, this doctoral dissertation contributes to research in the field of human-computer interaction and, more specifically, usable privacy and security. It improves our understanding of the methods that usable privacy and security researchers use to create a perception of risk, and of the factors that make people accept or reject certain privacy trade-offs. This dissertation also helps researchers and creators of technology understand how their designs influence perceptions of security, UX and the understanding of encryption. This enables them to design for or against a perception of security, depending on the actual level of security provided by the technology. Finally, we conceptualize security-enhancing friction, a framework that suggests helping users to behave more securely by designing for moments of negative UX.

Distler, Verena
in New Security Paradigms Workshop (2020, October 26)
A growing body of research in the usable privacy and security community addresses the question of how best to influence user behavior to reduce risk-taking. We propose to address this challenge by integrating the concept of user experience (UX) into empirical usable privacy and security studies that attempt to change risk-taking behavior. UX enables us to study the complex interplay between user-related, system-related and contextual factors, and provides insights into the experiential aspects underlying behavior change, including negative experiences. We first compare and contrast existing security-enhancing interventions (e.g., nudges, warnings, fear appeals) through the lens of friction.
We then build on these insights to argue that it can be desirable to design for moments of negative UX in security-critical situations. For this purpose, we introduce the novel concept of security-enhancing friction: friction that effectively reduces the occurrence of risk-taking behavior and ensures that the overall UX (after use) is not compromised. We illustrate how security-enhancing friction provides an actionable way to systematically integrate the concept of UX into empirical usable privacy and security studies, meeting both the objective of secure behavior and that of an overall acceptable experience.

Distler, Verena
in The 5th European Workshop on Usable Security (EuroUSEC 2020) (2020)
When communication about security to end users is ineffective, people frequently misinterpret the protection offered by a system. The discrepancy between the security users perceive a system to have and the actual system state can lead to potentially risky behaviors. It is thus crucial to understand how security perceptions are shaped by interface elements such as text-based descriptions of encryption. This article addresses the question of how encryption should be described to non-experts in a way that enhances perceived security. We tested the following within-subject variables in an online experiment (N=309): a) how best to word encryption, b) whether encryption should be described with a focus on the process, the outcome, or both, c) whether the objective of encryption should be mentioned, d) when mentioning the objective of encryption, how best to describe it, and e) whether a hash should be displayed to the user. We also investigated the role of context (between subjects).
The verbs "encrypt" and "secure" performed comparatively well at enhancing perceived security. Overall, participants stated that they felt more secure when not knowing the objective of encryption. When it is necessary to state the objective, positive wording of the objective of encryption worked best. We discuss the implications, and why using these results to design for a perceived lack of security might also be of interest. This leads us to discuss ethical concerns, and we give guidelines for the design of user interfaces where encryption should be communicated to end users.

Distler, Verena
in Computers in Human Behavior (2019)
Privacy is a timely topic that is increasingly scrutinized in the public eye. In spite of privacy and security breaches, people still frequently compromise their privacy in exchange for certain benefits of a technology or a service. This study builds on both technology acceptance (TA) and user experience (UX) research in order to explore and build hypotheses regarding additional dimensions that might play a role in the acceptability of privacy trade-offs and that are not currently accounted for in TA models. Using four scenarios describing situations with potential privacy trade-offs, we conducted a focus group study with 8 groups of participants (N = 32). Our results suggest that the factors influencing privacy trade-offs go beyond existing TA factors alone. A technology's perceived usefulness plays an important role, as do dimensions related to context, previous experiences, perceived autonomy and the feeling of control over the data being shared.
Zollinger, Marie-Laure
in Electronic Voting (2019, October)
This paper presents a mobile application for vote-casting and vote-verification based on the Selene e-voting protocol and explains how it was developed and implemented using the User Experience Design process. The resulting interface was tested with 38 participants, and user experience data was collected via questionnaires and semi-structured interviews on user experience and perceived security. Results concerning the impact of displaying security mechanisms on UX were presented in a complementary paper. Here we expand on this analysis by studying the mental models revealed during the interviews and comparing them with theoretical security notions. Finally, we propose a list of improvements for the design of future voting protocols.

Distler, Verena
in Proceedings of the ACM CHI Conference on Human Factors in Computing Systems (CHI 2019) (2019, April)
An unsolved debate in the field of usable security concerns whether security mechanisms should be visible, or black-boxed away from the user for the sake of usability. However, tying this question to pragmatic usability factors alone might be simplistic. This study aims at researching the impact of displaying security mechanisms on user experience (UX) in the context of e-voting. Two versions of an e-voting application were designed and tested using a between-group experimental protocol (N=38).
Version D displayed security mechanisms, while version ND did not reveal any security-related information. We collected data on UX using standardised evaluation scales and semi-structured interviews. Version D performed better overall in terms of UX and need fulfilment. Qualitative analysis of the interviews gives further insights into the factors impacting perceived security. Our study adds to existing research suggesting a conceptual shift from usability to UX, and discusses implications for designing and evaluating secure systems.

Distler, Verena
in Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (2018, April)
Autonomous vehicles have the potential to fundamentally change existing transportation systems. Beyond legal concerns, these societal evolutions will critically depend on user acceptance. As an emerging mode of public transportation [7], autonomous mobility on demand (AMoD) is of particular interest in this context. The aim of the present study is to identify the main components of acceptability (before first use) and acceptance (after first use) of AMoD, following a user experience (UX) framework. To address this goal, we conducted three workshops (N=14) involving open discussions and a ride in an experimental autonomous shuttle. Using a mixed-methods approach, we measured pre-immersion acceptability before immersing the participants in an on-demand transport scenario, and eventually measured post-immersion acceptance of AMoD. Results show that participants were reassured about safety concerns; however, they perceived the AMoD experience as ineffective. Our findings highlight key factors to be taken into account when designing AMoD experiences.
Distler, Verena
in CHI Workshop Exploring Individual Differences in Privacy (2018, April)
This position paper lays out current and future studies which we conduct on the UX aspects of security and privacy, our goal being to understand which factors influence privacy-related decision-making. We advocate using UX design methods in order to study the interindividual differences, system-related factors and contextual factors involved in privacy and security attitudes and behaviors. These results will contribute to user-tailored and personalized privacy initiatives and guide the design of future technologies.

Distler, Verena
in Proceedings of NordiCHI '18 Doctoral Consortium (2018)

Distler, Verena
Bachelor/master dissertation (2017)