Categories
publication

Use of Blocking Extensions at SOUPS 2018

We describe a few highlights from our recent paper studying people’s use of browser-based blocking extensions, which will be presented at the 2018 USENIX Symposium on Usable Privacy and Security (SOUPS).

What did we do?: One way people can block online tracking on the Internet is by using browser-based blocking extensions such as Ad blockers, Content blockers, and Tracker blockers. In our study, we asked why people use these extensions, what they know about online tracking, and what they do when these extensions fail to function correctly.

How did we do it?: We conducted two surveys using Amazon Mechanical Turk and measured what extensions survey-takers were using, if any. In the first survey, participants reported details about the extensions they used and how they thought online tracking worked. We then asked them why they used the extensions they reported, how they learned about them, and how long they had been using these blocking extensions. We also conducted measurements to check whether participants were using the extensions they mentioned. In the second survey, which we administered only to the subset of participants who reported using these extensions, we asked participants about their experiences when their extensions “break” websites they are trying to access.
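The post doesn’t detail how those extension measurements worked, so here is a minimal sketch, for illustration only, of one common client-side technique: “bait” detection, which infers the presence of an ad or content blocker by inserting a decoy element that popular filter lists hide and then checking whether the browser actually rendered it. The class names, timing, and function name below are assumptions, not the authors’ instrument.

```typescript
// Minimal sketch (not the authors' instrument): infer whether an ad/content
// blocker is active by inserting a "bait" element whose class names popular
// filter lists (e.g., EasyList) target with cosmetic hiding rules.
function detectBlocker(): Promise<boolean> {
  return new Promise((resolve) => {
    const bait = document.createElement("div");
    bait.className = "ad ads adsbox ad-banner textads"; // assumed bait classes
    bait.style.cssText =
      "position:absolute;left:-9999px;width:10px;height:10px;";
    document.body.appendChild(bait);
    // Give the extension's cosmetic filters a moment to apply.
    window.setTimeout(() => {
      const style = window.getComputedStyle(bait);
      const hidden =
        bait.offsetHeight === 0 ||
        style.display === "none" ||
        style.visibility === "hidden";
      bait.remove();
      resolve(hidden); // true => a blocker likely hid the bait
    }, 100);
  });
}

// Example: log the inferred result alongside a survey response.
detectBlocker().then((blocked) =>
  console.log("blocking extension detected:", blocked)
);
```

Note that a technique like this only catches extensions that do cosmetic filtering; a Tracker blocker that merely blocks network requests would require a different bait, such as attempting to fetch a resource from a known tracking domain.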

What did we find?: We have three main findings. First, our results show that blocking extension usage correlates only weakly with an advanced understanding of how online tracking works in the real world. Second, we find that each extension type has a primary reason behind adoption that is in line with expectations: users adopt Ad blockers and Content blockers primarily for user-experience gains and rarely take full advantage of the privacy benefits these blockers offer, whereas users adopt Tracker blockers primarily for privacy reasons. Finally, current users report that they rarely experience website breakage caused by their blocking extensions. However, when faced with a choice between disabling their extensions and giving up the content they are trying to reach, they base their decisions on how much they trust the website and how much they value that content.

What are the implications of the work?: Based on our findings, we make two suggestions. First, given that neither blocking extension users nor non-users fully understand the landscape of online tracking, we suggest that system designers focus their efforts on building systems that enforce tracking protection automatically, rather than requiring users to take action to protect themselves (such as by installing an extension). We argue that browser vendors can play an important role in providing this type of default privacy protection. Second, we suggest that blocking extensions can be further improved by better understanding how website developers embed third-party trackers and deliver content through their websites, so that non-use (disabling the extension) is not forced upon users.

Read the SOUPS 2018 paper for more details, and also follow related coverage on the Princeton Engineering website!

Categories
conferences trip report usable security

SOUPS 2017 Trip Report

What conference did I go to?

I attended the USENIX Symposium on Usable Privacy and Security (SOUPS) 2017 in July, where I presented our paper on Automatic Application Software Updates on Android. Early in July, I blogged about the paper, its results, and its implications. In this post, I’ll summarize my experiences at the conference, particularly highlighting papers and research that piqued my interest.

Where was the conference held?

The conference was held in Santa Clara, in sunny Northern California.

What were the three best talks I attended?

The conference featured papers on a wide range of topics, including authentication, user behavior in security defense, specific sub-populations, and privacy. Although many of the talks were informative, I found the following three particularly interesting:

  1. How Effective is Anti-Phishing Training for Children?:
    1. This talk described the design of a phishing training intervention aimed at school children, and its evaluation over time. The authors found that the children who received the training got better at identifying phishing emails than those who didn’t; however, the training had no effect on their ability to identify legitimate emails. Furthermore, four weeks after receiving the training, the trained students performed no better than those who didn’t receive it, indicating a decay in performance.
    2. While both the methodology and results of the experiment were insightful, I found the discussion from the authors on ethics illuminating. For instance, the authors reported having to obtain informed consent from the parents of the children before launching the experiment, and also thinking through the ethics of their actions. I was pleasantly surprised to learn that they trained the control group—the group that did not receive the training—at the end of the experiment, and also debriefed the children and their families about the experiment.
  2. I feel stupid I can’t delete…: A Study of Users’ Cloud Deletion Practices and Coping Strategies.
    1. This talk described the findings from an exploratory study examining users’ motivations and mental models around deleting files from the Cloud. The authors discovered that users lacked sufficient information about deletion and had incomplete, often incorrect mental models of how files are stored in the Cloud, which in turn led to sub-optimal actions.
    2. I found this talk particularly interesting because it tackles a previously unexplored problem in usable security. In the wake of numerous high-profile iCloud leaks, this problem has become all the more important, and redesigning deletion interfaces seems like it could help users.
  3. The Importance of Visibility for Folk Theories of Sensor Data.
    1. This talk described how users make decisions about privacy in the context of wearable devices. Specifically, the authors investigated the challenges users face in making informed privacy decisions given that they don’t really ‘see’ how their data is being collected and used.

What was my favorite part of the conference?

My favorite part of attending SOUPS is being able to meet and interact with the HCI and Privacy/Security community. The SOUPS organization invests heavily in its student body (e.g. by almost always offering travel grants), and this enables students—new and old—to continue participating.

What was my least favorite part of the conference?

None, really, though I wish the venue had been closer to restaurants in the area.

Categories
publication usable security

Software Updates at SOUPS 2017

We describe a few highlights from our recent paper on mobile software updating that will be presented at the 2017 USENIX Symposium on Usable Privacy and Security (SOUPS).

What did we do?: Software updates are essential to maintaining the security of devices and software, so it’s important that users install them as soon as possible. In our study, we used a survey to investigate Android users’ attitudes and preferences toward automatic application updates—updates that are installed without explicit user consent.

How did we do it?: We conducted the survey on the Amazon Mechanical Turk platform. The survey contained three parts. In the first part, participants filled out several psychometric scales, which captured their risk-taking propensity, consideration for future consequences, curiosity, and security awareness. In the second part, participants self-reported their Android update settings and their preferences toward auto-updating their applications. Finally, in the third part, participants reported past negative experiences with software updating.

What did we find?: Our findings reveal that Android users who avoid application auto-updates are more likely to have had past negative experiences with software updating, tend to take fewer risks, and display greater proactive security awareness. Users’ perceived trust in a mobile application also determined how comfortable they were with auto-updating that application.

What are the implications of the work?: Based on our findings, we make four primary recommendations to improve the design of mobile application updates on Android and encourage users to auto-update. First, the Android OS could provide users with a more accessible mechanism to roll back application updates to a prior version, making it less risky to turn on auto-updates. Second, we suggest leveraging the characteristics we identified of users who avoid auto-updating, including their risk-averse nature, to design nudges and messages that encourage them to auto-update security updates. Third, we suggest that the security community study the practices of software developers, how they develop and build updates, and how these practices lead to negative experiences for end-users. Finally, we suggest that the Android application update interface could be personalized by inferring users’ attitudes toward their applications and their preferences for auto-updating them, using our work as a starting point. Doing so may encourage more users to auto-update their mobile applications, ultimately improving the security of their devices.

Read the SOUPS 2017 paper for more details!

Categories
presentations

Princeton HCI @ Princeton High School

I recently presented an overview of our lab’s work to the students—all girls, aged 10 to 14—attending the AspireIT camp at Princeton High School. It was great fun!