
SOUPS 2017 Trip Report

What conference did I go to?

I attended the USENIX Symposium on Usable Privacy and Security (SOUPS) 2017 in July, where I presented our paper on Automatic Application Software Updates on Android. Early in July, I blogged about our paper, and its results and implications. In this post, I’ll summarize my experiences at the conference, particularly highlighting papers and research that piqued my interest.

Where was the conference held?

The conference was held in Santa Clara, in sunny Northern California.

What were the three best talks I attended?

The conference featured papers tackling issues across a wide range of topics, spanning authentication, user behavior in security defense, specific sub-populations, and privacy. Although several of these talks were informative, I found the following three particularly interesting:

  1. How Effective is Anti-Phishing Training for Children?:
    1. This talk described the design of a phishing training intervention aimed at school children, and its evaluation over time. The authors found that children who received the training became better at identifying phishing emails than those who didn’t; however, the training had no effect on identifying legitimate emails. Furthermore, four weeks after the training, the trained students performed no better than the untrained ones, indicating a decay in performance.
    2. While both the methodology and results of the experiment were insightful, I found the authors’ discussion of ethics illuminating. For instance, the authors reported obtaining informed consent from the children’s parents before launching the experiment, and described how they thought through the ethics of their actions. I was pleasantly surprised to learn that they trained the control group—the group that did not receive the training—at the end of the experiment, and also debriefed the children and their families about the experiment.
  2. I feel stupid I can’t delete…: A Study of Users’ Cloud Deletion Practices and Coping Strategies.
    1. This talk described the findings from an exploratory study examining users’ motivations and mental models about deleting files from the Cloud. The authors discovered that users lack sufficient information about deletion and had incomplete and often incorrect mental models about how files are stored on the Cloud, which in turn led to sub-optimal actions.
    2. I found this talk particularly interesting because it tackles a previously unexplored problem in usable security. In the wake of numerous high-profile iCloud leaks, this problem has become all the more important, and redesigning such deletion interfaces seems like it could help users.
  3. The Importance of Visibility for Folk Theories of Sensor Data.
    1. This talk described how users make decisions about privacy in the context of wearable devices. Specifically, the authors investigated the challenges users face in making informed privacy decisions given that they don’t really ‘see’ how their data is being collected and used.

What was my favorite part of the conference?

My favorite part of attending SOUPS is being able to meet and interact with the HCI and Privacy/Security community. The SOUPS organization invests heavily in its student body (e.g. by almost always offering travel grants), and this enables students—new and old—to continue participating.

What was my least favorite part of the conference?

Not much, really, though I wish the venue had been closer to restaurants in the area.


Software Updates at SOUPS 2017

We describe a few highlights from our recent paper on mobile software updating that will be presented at the 2017 USENIX Symposium on Usable Privacy and Security (SOUPS).

What did we do?: Software updates are essential to maintaining the security of devices and software, and it’s therefore important that users install them as soon as possible. In our study, we used a survey to investigate Android users’ attitudes and preferences towards automatic application updates—updates that are installed without requiring explicit user consent.

How did we do it?: We conducted the survey on the Amazon Mechanical Turk platform. The survey contained three parts. In the first part, participants filled out several psychometric scales, which captured their risk taking propensity, consideration for future consequences, curiosity, and their security awareness. In the second part, participants self-reported their Android update settings, and their preferences towards auto-updating their applications. Finally, in the third part, participants reported past negative experiences with software updating.
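The paper itself doesn’t include code, but as a rough illustration of how psychometric scales like the ones in the first survey part are typically scored, here is a minimal sketch. The item names, the reverse-coded set, and the 5-point range are all invented for illustration; the actual scales define their own items and scoring rules.

```python
# Hypothetical sketch: averaging Likert-style scale responses, with
# reverse-coded items flipped so that higher always means "more" of
# the trait (e.g., greater risk-taking propensity).

def score_scale(responses, reverse_items=(), scale_max=5):
    """Average a participant's 1..scale_max responses, reverse-coding
    the items named in reverse_items."""
    total = 0
    for item, value in responses.items():
        if item in reverse_items:
            value = (scale_max + 1) - value  # e.g., 5 -> 1, 1 -> 5
        total += value
    return total / len(responses)

participant = {"q1": 4, "q2": 2, "q3": 5}
# q2 is reverse-coded, so this averages (4, 4, 5)
print(score_scale(participant, reverse_items={"q2"}))
```

A real analysis would also check the scale’s internal consistency before using the scores, but the averaging step above is the core idea.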

What did we find?: Our findings reveal that Android users who avoid application auto-updates are more likely to have had past negative experiences with software updating, tend to take fewer risks, and display greater proactive security awareness. Users’ perceived level of trust in mobile applications also determined how comfortable they were auto-updating those applications.

What are the implications of the work?: Based on our findings, we make four primary recommendations to improve the design of mobile application updates on Android and encourage users to auto-update. First, the Android OS should provide a more accessible mechanism to roll back application updates to a prior state, making it less risky for users to turn on auto-updates. Second, the characteristics we identified of users who avoid auto-updating, such as their risk-averse nature, can be leveraged to design nudges and messages that encourage these users to auto-update security updates. Third, the security community should study the practices of software developers, how they develop and build updates, and how these practices lead to negative experiences for end users. Finally, an improved Android update interface could be personalized by inferring users’ attitudes towards their applications and their preferences for auto-updating them, using our work as a starting point. Doing so may encourage more users to auto-update their mobile applications, ultimately improving the security of their devices.

Read the SOUPS 2017 paper for more details!


Drones on Sexton field

Students in the COS Independent Work Seminar entitled “These Aren’t The Drones You Are Looking For: Mitigating the Privacy and Security Implications of Drones” left the classroom to get out and about on Sexton field on a beautiful spring morning. The order of the day was to test out a host of Parrot AR Drones in an open flying environment.

Students also worked on testing out their semester-long individual projects involving actions such as controlling the drone with a keyboard, pinging a drone to get its identity, and tracking a drone using a lightweight Android phone mounted to the drone.

Check out pictures of the class outing here.


Drones, Privacy, and Security at CHI 2017

We describe a few highlights from our recent paper on drones that will be presented at the 2017 CHI conference.

What did we do?: To better inform policy and regulations around drones, we investigated users’ current perceptions of the privacy and security issues surrounding drones. We conducted an experimental study with 20 laypersons in Maryland, USA, none of whom had ever seen or interacted with a drone before.

How did we do it?: The experiment took place indoors at the Human Computer Interaction Laboratory at the University of Maryland, College Park. Each participant completed a pre-interview about their general perceptions and mental models of drones. Participants were then shown a real drone or a model drone and given the opportunity to watch the researchers control the drone and to fly it on their own. Following the demonstration and a series of tasks, which included showing participants the footage that drones collect and store, each participant completed an exit interview to discuss their final thoughts on drones, privacy, and security.

What did we find?: Compared to previous studies, 1) we found that participants had many more negative perceptions of drones and of how drones affect their personal privacy and security. For instance, participants were worried that drones could injure people and that they could be used for spying on others. They were particularly concerned about having multiple drones in an area and about keeping drones away from people and private spaces like schools, government institutions, or private residences. 2) We found that a drone’s design affected participants’ sense of privacy and security. For instance, the sound of a drone made it seem threatening, but at the same time the noise alerted participants to the drone’s presence, which was considered a privacy enhancer.

What are the implications of the work?: Our work suggested that 1) better regulations are needed to deal with cases where multiple drones operate in an area, and to mandate geo-fencing technologies, either built into drones themselves or implemented by those who might encounter drones, using existing infrastructure such as home wireless networks. Geo-fencing could enhance privacy by keeping drones away from private spaces and making people feel more secure about being “safe” from drones in these spaces.

2) We suggested that further exploration is needed to see whether drones could operate in designated spaces or “drone highways” that keep them at a distance from people to protect privacy, make them easily identifiable, and prevent injury. Finally, 3) we recommended that drone designers think about how to create designs that make drones appear more or less approachable, depending on whether people should interact with them, using both visual and non-visual markers such as sound.
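To make the geo-fencing idea above concrete, here is a minimal sketch of the kind of check a geo-fencing mechanism might perform: deciding whether a drone’s reported GPS position falls inside a circular no-fly zone. The coordinates and the 100-meter radius are invented for illustration; real systems involve registered zone databases, altitude, and much more.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in meters."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_no_fly_zone(drone_pos, zone_center, radius_m=100.0):
    """True if the drone's (lat, lon) is within radius_m of the zone center."""
    return haversine_m(*drone_pos, *zone_center) <= radius_m

home = (38.9897, -76.9378)  # hypothetical private residence
print(inside_no_fly_zone((38.9899, -76.9380), home))  # a few tens of meters away
```

A home-network-based mechanism, as the paper’s suggestion envisions, would run a check like this against positions broadcast by nearby drones and could then alert residents or signal the drone to leave.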

Read the CHI 2017 paper for more details and contact us for copies of supplementary materials such as interview guides.