Categories: publication

Use of Blocking Extensions at SOUPS 2018

We describe a few highlights from our recent paper on people's use of browser-based blocking extensions, which will be presented at the 2018 USENIX Symposium on Usable Privacy and Security (SOUPS).

What did we do?: One way people can block online tracking is by using browser-based blocking extensions such as ad blockers, content blockers, and tracker blockers. In our study, we asked why people use these extensions, what they know about online tracking, and what they do when these extensions fail to function correctly.

How did we do it?: We conducted two surveys using Amazon Mechanical Turk and measured which extensions survey takers were using, if any. In the first survey, participants reported details about the extensions they used and how they thought online tracking worked. We then asked them why they used the extensions they reported, how they learned about them, and how long they had been using them. We also ran measurements to verify that participants were actually using the extensions they mentioned. In the second survey, administered only to the subset of participants who reported using these extensions, we asked about their experiences when their extensions “break” websites they are trying to access.
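To give a flavor of how such a measurement can work, here is a minimal sketch of the bait-element technique commonly used to detect ad blockers from a web page. This is an illustration under our own assumptions, not necessarily the exact instrument used in the study. Blocking extensions typically hide page elements whose class names match filter-list rules, so a page can insert a bait element and check whether it gets hidden:

```typescript
// Minimal sketch: detect an ad blocker by inserting a "bait" element
// styled with class names that common filter lists (e.g., EasyList) hide.
function detectAdBlocker(): Promise<boolean> {
  return new Promise((resolve) => {
    const bait = document.createElement("div");
    bait.className = "ad ads adsbox ad-banner"; // bait class names
    bait.style.cssText =
      "position:absolute;top:-1000px;height:10px;width:10px;";
    document.body.appendChild(bait);
    // Give the extension a moment to apply its cosmetic filters.
    setTimeout(() => {
      const hidden =
        bait.offsetHeight === 0 ||
        window.getComputedStyle(bait).display === "none";
      bait.remove();
      resolve(hidden); // true suggests a blocking extension is active
    }, 100);
  });
}

// Usage: detectAdBlocker().then((blocked) => console.log({ blocked }));
```

Note that this checks cosmetic filtering only; a survey instrument would likely combine several such signals (for example, whether requests to known tracking domains succeed) to distinguish between extension types.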

What did we find?: We have three main findings. First, our results show that blocking extension usage correlates only weakly with a sophisticated understanding of how online tracking works. Second, we find that each extension type has a primary reason for adoption that is in line with expectations: users adopt ad blockers and content blockers primarily for user-experience gains and rarely take full advantage of these blockers' privacy benefits, whereas users adopt tracker blockers primarily for privacy reasons. Finally, our results show that current users rarely experience website breakage because of their blocking extensions. However, when users are faced with a choice to disable their extensions in order to access content, they base their decisions on how much they trust the website and how much they value that content.

What are the implications of the work?: Based on our findings, we make two suggestions. First, given that neither blocking extension users nor non-users fully understand the landscape of online tracking, we suggest that system designers focus their efforts on building systems that enforce tracking protection automatically, rather than requiring users to take action to protect themselves (such as by installing an extension). We argue that browser vendors can play an important role in facilitating this kind of default privacy protection. Second, we suggest that blocking extensions can be further improved by better understanding how website developers embed third-party trackers and deliver content through their websites, so that breakage does not force users to disable their extensions.

Read the SOUPS 2018 paper for more details, and follow related coverage on the Princeton Engineering website!

Categories: conferences, trip report

Citizen Lab Summer Institute 2017 Trip Report

Posted on behalf of Mark Martinez

What conference did I go to?

I went to the Citizen Lab Summer Institute 2017 (CLSI), a conference held by the Citizen Lab that brings together not only computer scientists but anyone working in the privacy and security field. I went to conduct interviews for a research project headed by Marshini Chetty and Philipp Winter. The research project and its description can be found here: Tor Interview Project

This intersection of political scientists, computer scientists, and political activists is what made the conference so unique. Seeing the impact that privacy technology makes drove home how important the work of ensuring anonymity in certain circumstances is. One of the first speakers at the conference described how some of her colleagues were jailed in a foreign country because of their human rights work. It hit home why it is so important to make sure that people who wish to remain anonymous actually can: it can be a literal matter of life and death.

Where was the conference held?

The conference is held yearly at the Munk School of Global Affairs at the University of Toronto, home of the Citizen Lab, which frequently publishes papers on privacy and security in both the industrial and government sectors.

What were the three best talks I attended?

My favorite talk of the conference was the first one, in which each major group at the conference stood up and talked about what they were doing and who they were collaborating with. It was here that you got to see just how diverse the group of participants was. It seemed like there were no purely technical people: everybody worked on interesting, interdisciplinary projects, ranging from defending human rights and combating censorship to deconstructing applications widely used in some foreign countries and exposing major security flaws. The first day's agenda and notes (as well as links to all talks) can be found at this link: Agenda and Notes

Another interesting talk covered how censorship affects different countries in different ways. Four speakers discussed censorship in diverse regions of the world: Pakistan, Iran, Brazil and Latin America, and parts of Africa. They described their work circumventing censorship, such as creating ways for people to reach blocked websites by redirecting traffic or even setting up satellite dishes that let people obtain blocked information. One interesting note was that in the 2017 Iranian election there was no censorship of popular media, because the entire political spectrum, not just younger, more liberal pockets of the populace, was now using platforms like WhatsApp. This talk's information can be found here: WorldWide Censorship Notes

A very different talk I attended was given jointly by Jason Li and Andrew Hilts. In this talk, Jason took technical concepts suggested by the crowd and, within ten minutes, turned them into comics. He took examples like phishing and Tor and made them into approachable, mildly humorous comics. Jason and Andrew went on to explain that one thing the Citizen Lab does is take issues that are widely relevant to the public but easily lost in jargon and turn them into comics. This talk's information can be found here: Technical Problems into Comics

What was my favorite part of the conference?

The conference was an eye-opening experience. Although much of my time was spent conducting interviews for the research study I was part of, I was still able to see how much concrete impact this community is making in the lives of people all over the world. Privacy and security are not just matters of novelty or paranoia; they are critical to the success of so many operations worldwide, from helping people understand what app user agreements actually say to protecting the lives of human rights activists under government scrutiny.

Categories: publication, usable security

Software Updates at SOUPS 2017

We describe a few highlights from our recent paper on mobile software updating that will be presented at the 2017 USENIX Symposium on Usable Privacy and Security (SOUPS).

What did we do?: Software updates are essential to maintaining the security of devices and software, so it is important that users install them as soon as possible. In our study, we used a survey to investigate Android users' attitudes and preferences toward automatic application updates, that is, updates that are installed without explicit user consent.

How did we do it?: We conducted the survey on the Amazon Mechanical Turk platform. The survey contained three parts. In the first part, participants filled out several psychometric scales, which captured their risk-taking propensity, consideration of future consequences, curiosity, and security awareness. In the second part, participants self-reported their Android update settings and their preferences toward auto-updating their applications. Finally, in the third part, participants reported past negative experiences with software updating.

What did we find?: Our findings reveal that Android users who avoid application auto-updates are more likely to have had past negative experiences with software updating, tend to take fewer risks, and display greater proactive security awareness. Users' level of trust in a mobile application also determined how comfortable they were auto-updating it.

What are the implications of the work?: Based on our findings, we make four recommendations for improving the design of mobile application updates on Android to encourage users to auto-update. First, the Android OS could provide users with a more accessible mechanism to roll back application updates to a prior state, making users more willing to take the risk of turning on auto-updates. Second, the characteristics we identified of users who avoid auto-updating, including their risk-averse nature, could be leveraged to design nudges and messages that encourage users to auto-update security updates. Third, we suggest that the security community study the practices of software developers, how they develop and build updates, and how these practices lead to negative experiences for end users. Finally, an improved Android update interface could be personalized by inferring users' attitudes toward their applications and their preferences for auto-updating them, using our work as a starting point. Doing so may encourage more users to auto-update their mobile applications, ultimately improving the security of their devices.

Read the SOUPS 2017 paper for more details!

Categories: usable security

Drones on Sexton field

Students in the COS Independent Work Seminar entitled “These Aren’t The Drones You Are Looking For: Mitigating the Privacy and Security Implications of Drones” left the classroom to get out and about on Sexton field on a beautiful spring morning. The order of the day was to test out a host of Parrot AR Drones in an open flying environment.

Students also tested their semester-long individual projects, which involved tasks such as controlling a drone with a keyboard, pinging a drone to get its identity, and tracking a drone using a lightweight Android phone mounted on it.
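For the curious, here is a minimal sketch of what keyboard control of a Parrot AR.Drone involves at the protocol level. This is an illustration based on the publicly documented AT command protocol, not the students' actual project code. The drone hosts its own Wi-Fi network and listens for plain-text AT commands over UDP; the AT*REF command triggers takeoff and landing:

```typescript
// Minimal sketch of the Parrot AR.Drone AT command protocol (Node.js).
import * as dgram from "node:dgram";

const DRONE_IP = "192.168.1.1"; // default address on the drone's own Wi-Fi
const AT_PORT = 5556;           // UDP port that accepts AT commands

// In the AT*REF argument, bits 18, 20, 22, 24, and 28 must always be set;
// bit 9 selects takeoff (1) versus landing (0).
const REF_BASE =
  (1 << 18) | (1 << 20) | (1 << 22) | (1 << 24) | (1 << 28);
const REF_TAKEOFF = REF_BASE | (1 << 9);

const socket = dgram.createSocket("udp4");
let seq = 1; // every AT command carries an increasing sequence number

function sendAT(command: string, args: string): void {
  const message = `AT*${command}=${seq++},${args}\r`;
  socket.send(Buffer.from(message), AT_PORT, DRONE_IP);
}

sendAT("REF", String(REF_TAKEOFF)); // take off
setTimeout(() => sendAT("REF", String(REF_BASE)), 5000); // land after 5 s
setTimeout(() => socket.close(), 6000);
```

A keyboard controller maps key presses onto commands like these (AT*PCMD for pitch, roll, yaw, and altitude), resending them several times per second while a key is held.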

Check out pictures of the class outing here.

Categories: usable security

Drones, Privacy, and Security at CHI 2017

We describe a few highlights from our recent paper on drones that will be presented at the 2017 CHI conference.

What did we do?: To better inform policy and regulation around drones, we investigated laypeople's current perceptions of the privacy and security issues surrounding drones. We conducted an experimental study with 20 laypersons in Maryland, USA, none of whom had ever seen or interacted with a drone before.

How did we do it?: The experiment took place indoors at the Human-Computer Interaction Lab at the University of Maryland, College Park. Each participant completed a pre-interview about their general perceptions and mental models of drones. Participants were then shown a real or model drone and given the opportunity to watch the researchers control the drone and to fly it on their own. Following the demonstration and a series of tasks, which included showing participants the footage that drones collect and store, each participant completed an exit interview to discuss their final thoughts on drones, privacy, and security.

What did we find?: 1) Compared to previous studies, we found that users had many more negative perceptions of drones and their effects on personal privacy and security. For instance, participants worried that drones could injure people or be used to spy on others. They were particularly concerned about having multiple drones in an area and about keeping drones away from people and sensitive spaces such as schools, government institutions, and private residences. 2) We found that a drone's design affected participants' sense of privacy and security. For instance, the sound of a drone made it seem threatening, but at the same time the noise alerted participants to the drone's presence, which they considered privacy-enhancing.

What are the implications of the work?: Our work suggests that 1) better regulation is needed to deal with cases where multiple drones operate in one area and to mandate geo-fencing technologies, whether built into drones themselves or implemented as separate mechanisms that people who encounter drones can deploy over existing infrastructure such as home wireless networks. Geo-fencing could enhance privacy by keeping drones away from private spaces and making people feel “safe” from drones in these spaces.

2) We suggest that further exploration is needed to see whether drones could operate in designated spaces or “drone highways” that keep them at a distance from people to protect privacy, make them easily identifiable, and prevent injury. Finally, 3) we recommend that drone designers consider how to make drones appear more or less approachable, depending on whether people should interact with them, using both visual and non-visual markers such as sound.

Read the CHI 2017 paper for more details and contact us for copies of supplementary materials such as interview guides.