Security and Human Behaviour 2023

The following are my notes taken during the SHB workshop. I have included the names of presenters, but it should be noted that these are my own general thoughts and reactions while attending these presentations.

As with my past blog posts containing conference notes, the comments and descriptions are my own interpretation of the presentations given by the researchers below and may not necessarily reflect the thoughts or opinions of those researchers. I have tried to add some relevant links to recent research by these presenters to give more context to their work.

Session 1: Privacy

  • Lorrie Cranor: Designing Visible and Useful Privacy Icons: Privacy ‘nutrition labels’ can be used to inform the public about the privacy practices they care about. 2020: app security labels were tested, with many instances of confusion from participants. Data collection used a matrix to compare different ways of combining security information. Question: How does this differentiate between the US and EU standards?
  • Laura Brandimarte: Parental Trust and Automated Detection of Cyber Predators: One issue with using tools to protect against cyber predators is algorithm aversion in the general public. People often feel they are part of a unique group that may be unjustly harmed by the application of algorithms, even though many agree that the general public should trust algorithms. This can introduce challenges in detecting online predation, as parents may not trust algorithms to handle their children’s interactions online.
  • Tesary Lin: Selection Bias in Consumer Data: Firms maximize data volume by focusing on opt-ins when collecting data from their users, which can lead to systematic biases in the data that is collected. A variety of techniques might be able to account for this; one general-purpose example is sketched after this list.
  • Geoffrey Tomaino: Intrasensitivity of Consumer Privacy: Barter vs. cash markets for the value of online users’ private data. People assign a greater value to their data when it is priced in cash. Consider a paid option to not have data collected. Question: Is the effect different from or more pronounced than previous results, or does this just show that the effect also occurs for valuations of private data?
  • Discussion: Labelling payments for personal data in cash terms doesn’t drastically change behavior. Specificity can help, while how choices impact us personally (i.e., privacy being taken from us) is hard to quantify in a dollar amount.
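
On the selection-bias point above: one standard, general-purpose correction (not necessarily the one used in the presented work) is inverse propensity weighting, where each opted-in user is re-weighted by the inverse of their estimated probability of opting in. A minimal sketch, assuming a hypothetical dataset with an `opted_in` flag and covariates observed for everyone:

```python
# Hypothetical sketch of inverse propensity weighting for opt-in selection bias.
# Column names, the propensity model, and the clipping threshold are my own
# illustration, not taken from the talk.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def ipw_mean(df: pd.DataFrame, outcome: str, covariates: list[str]) -> float:
    """Estimate the population mean of `outcome`, observed only for opt-in users,
    by weighting each opt-in by 1 / P(opt-in | covariates)."""
    # Propensity model fit on covariates that are observed for all users.
    propensity_model = LogisticRegression(max_iter=1000)
    propensity_model.fit(df[covariates], df["opted_in"])
    p = propensity_model.predict_proba(df[covariates])[:, 1]

    opted = df["opted_in"].to_numpy().astype(bool)
    weights = 1.0 / np.clip(p[opted], 1e-3, 1.0)  # clip to avoid extreme weights
    return float(np.average(df.loc[opted, outcome], weights=weights))
```

The usual caveat applies: this only corrects for selection that is explained by the observed covariates.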

Session 2: Security

  • Nicolas Christin: Comparing Twitter posts and crypto trading practices (i.e., being long/short on crypto vs. what individual people tweet about). Compared use of hashtags against users’ income from online crypto trading platforms, which is available. Shows an inconsistency in behavior: people might be incentivized to be publicly short/long but privately do the opposite. Question: Any sentiment analysis of the tweets?
  • Serge Egelman: Failures in 2FA: Many security issues, such as storing passwords or tokens in plain text in databases, plus usability and application issues. Concerns that 2FA taking too long would lead to users simply not using it.
  • Tony Vance: Cybersecurity expertise on boards of directors. The Securities and Exchange Commission is requiring disclosure of the board of directors’ knowledge of security and privacy. Question: Could this lead to problems of scapegoating, and to worse privacy and security behavior from companies, since they can scapegoat the ‘crypto’ expert on the board and fire them without making improvements?
  • Elissa Redmiles: Digitally mediated offline introductions: Comparison of people’s data privacy concerns in different contexts, like looking for a job or finding a romantic partner. Compared concerns as well as experiences and how people responded. Transition into the offline space: protections, attacks, and safety. Data privacy concerns are less relevant to people during job searches than when looking for a relationship.
  • Rainer Böhme: Anatomy of Data Breaches: This study used a dataset of leaked personal information as a method of recruiting participants for a study on how they responded and how the experience changed their concerns. Sampling from leaked data to approach participants raises concerns of legality and ethics. Results indicated the major issues were spam and some financial harm, with large differences in how people responded to their data being breached.
  • Discussion: Cybersecurity requirements on boards of directors could create issues. The Uber CSO was convicted for a breach cover-up. Crypto: pump-and-dump schemes are very common; the study didn’t have access to actual portfolios, just gains.

Session 3: Crime, Policy, and Risk

  • David Livingstone Smith: Apportionment of Human (In)Security: The intersection of race and security; anti-racism and its applicability to the differential security experiences of different races.
  • Angus Bancroft: Interaction Orders in Illicit Markets: COVID’s impact on the illicit drug trade resulted in small-scale distributors trying to keep prices down in a way that broke traditional economic norms. An emotional social order caused by the opacity of the environment.
  • Unlisted Presenter: Harassment of scientists, politically motivated based on their areas of study. The goals of harassment campaigns are to discredit researchers and find admissions of uncertainty that let the general public’s distrust of academia fester.
  • Li Jiang: Perception of Privacy-Conscious Behaviour: Concerns over, or bias about, displays of privacy consciousness in different contexts. People are perceived as higher status when they take privacy into account; they are also seen as more autonomous, and others are more willing to work with people who are more concerned about privacy.

Session 4: Behavior

  • Coty Gonzalez: Cognitive Attack Agents are More Challenging for Defenders: Instance-Based Learning models can be used to train attack agents that perform better against human defenders in the CybORG defense game task (see the sketch after this list).
  • Rick Wash: Phishing: Non-experts fail to recognize descriptions of, or identify, phishing. Expert behavior involves taking a stance (whether or not the item is phishing), which may or may not be relevant. Difficulty inversion makes false positives more common.
  • Julie Downs: Disclosure: Inference from Missing Information. A guessing game in which participants bet on what the mean of other participants’ guesses will be (a Keynesian beauty contest), run in an online context with some connections to privacy and security.
  • Bart Knijnenburg: Privacy Decisions and Disclosure. Privacy calculus: making privacy decisions logically, based on predetermined parameters for the relevance of different privacy concerns. This allows people to auto-fill their preferences for different security options on websites and have them applied automatically to different contexts based on the type of information they want to share.
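
For context on the Instance-Based Learning point above: IBL models choose actions by blending the utilities of remembered instances, weighting recent experiences more heavily. The toy agent below is my own simplified illustration of that idea; the activation rule, parameter values, and the stand-in environment are assumptions, not the speaker’s code.

```python
# Toy instance-based learning (IBL) chooser: remembers (time, utility) instances
# per option and picks the option with the highest recency-weighted blended value.
import math
import random
from collections import defaultdict

class IBLAgent:
    def __init__(self, decay: float = 0.5, noise: float = 0.25, default_utility: float = 10.0):
        self.decay = decay                      # how quickly old instances fade
        self.noise = noise                      # activation noise
        self.default_utility = default_utility  # optimistic prior drives exploration
        self.memory: dict[str, list[tuple[int, float]]] = defaultdict(list)
        self.t = 0                              # decision counter used as a timestamp

    def _blended_value(self, option: str) -> float:
        instances = self.memory[option]
        if not instances:
            return self.default_utility
        # Recency-based activation with Gaussian noise; older instances decay.
        activations = [-self.decay * math.log(self.t - ts + 1) + random.gauss(0, self.noise)
                       for ts, _ in instances]
        weights = [math.exp(a) for a in activations]
        total = sum(weights)
        # Blended value: activation-weighted average of remembered utilities.
        return sum(w / total * u for w, (_, u) in zip(weights, instances))

    def choose(self, options: list[str]) -> str:
        self.t += 1
        return max(options, key=self._blended_value)

    def observe(self, option: str, utility: float) -> None:
        self.memory[option].append((self.t, utility))

# Example loop against a made-up reward signal.
agent = IBLAgent()
for _ in range(100):
    action = agent.choose(["scan", "exploit", "wait"])
    agent.observe(action, 1.0 if action == "exploit" else 0.0)
```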

Session 5: Users

  • Norman Sadeh: Usability as Privacy’s Greatest Challenge. Since GDPR there has been a 25% increase in website privacy policies that must be agreed to before using the website or other applications. Growing privacy concern has resulted in more categorizations. A useful tool would be a privacy assistant built on question-answering NLP models that can help people with complex evaluations of their privacy concerns (a toy sketch follows this list).
  • Sophie van der Zee: Do Shareholders Care About Corporate Fraud: No change in stock prices before corporate fraud was revealed; a large change after, lasting a short-to-medium amount of time in most cases, except when the company goes out of business. The biggest factor in how results differ when corporate fraud is revealed is the company’s industry.
  • Kevin Roundy: Adoption of Privacy Practices: Password managers and unique password requirements leave people fearing they would lose all of their passwords if the managers themselves were compromised.
  • Florian Schaub: Consumer Awareness of and Reactions to Real-World Data Breaches: Most participants had experienced some breach in the past; most were unaware and were informed during the study, and reactions were mostly anger. A follow-up study compared the steps participants actually took with the steps they claimed they would take in the initial study.
  • Cristobal Cheyre: Welfare Effects of Ad-blocking: How does data being sold negatively impact user experiences? The study paid people to install/uninstall an ad blocker and evaluated their perceptions of its value after a few weeks of experience with or without it. Those few weeks with or without the blocker tended not to change pre-existing opinions about its value.
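
On the privacy-assistant idea above: a minimal sketch of what question answering over a privacy policy could look like with an off-the-shelf extractive QA model. The model name and policy excerpt are placeholders I chose for illustration, not the presenter’s actual system.

```python
# Hedged sketch: answer a user's privacy question from a policy excerpt using a
# generic extractive question-answering model from the transformers library.
from transformers import pipeline

qa = pipeline("question-answering",
              model="distilbert-base-cased-distilled-squad")  # placeholder model

policy_excerpt = (
    "We collect your location data to personalize content and may share it "
    "with advertising partners. You can opt out of sharing in the settings menu."
)

result = qa(question="Is my location shared with third parties?",
            context=policy_excerpt)
print(result["answer"], f"(score: {result['score']:.2f})")
```

A real assistant would of course need to handle whole policies, retrieval over many documents, and questions the policy does not answer.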

Session 6: Trust

  • Avi Collis: Measuring Consumer Welfare: The value of free digital goods is measured by incentivizing their use or disuse with real incentives and comparing participants’ behavior against their stated opinions on the value of these goods.
  • Ahana Datta: CSEO (cyber security executive officer) Board Member Experience: A CSEO can’t make organizational changes on their own. In medium-sized companies, the CSEO reports to board members, so they can’t make changes without getting the board on their side. In large companies there is often both a CSEO and a Business Security Executive Officer, so nothing gets done either.
  • Ananya Sen: Value of Data for Platforms: Comparison of results from human vs. AI curation of content for online platforms that primarily serve content, such as news sites. Humans did better in some instances but obviously cost much more. Things like news can be hard to train AI for, since models train on historical data and may lack a human-like sense of the relevance of new stories.
  • Yixin Zou: Older Adults’ Privacy Concerns: Self-perceptions of vulnerability in privacy contexts showed no perceived differences for older adults, apart from an increased likelihood of being the victim of a crime. Takeaway: ability-focused privacy interventions and education can help address privacy concerns.
  • Tyler Moore: Measuring Enterprise Cybersecurity: A quantitative study of the empirical effect of adopting security controls. Recommendations on the costs and benefits of adopting controls based on the features of enterprises and their cybersecurity concerns.