Computer Science Professor Eleanor Birrell Receives NSF Grant to Study Privacy Regulations


How can sensitive data be managed in the modern world? Eleanor Birrell, assistant professor of computer science, seeks to answer this question in her research. She studies system security and data privacy, with a focus on the interactions between those areas and psychology, law and political science.

A recent $750,000 grant from the National Science Foundation (NSF) will help her explore privacy regulation that would require companies to handle personal information in the best interests of the people involved. She will carry out the three-year research project, titled "Empirically Evaluating Data Fiduciary Privacy Laws," jointly with Ada Lerner, assistant professor of computer science at Northeastern University, and Ari Waldman, professor of law at the University of California, Irvine School of Law.

Birrell also looks forward to involving Pomona College students in all aspects of her research, both during the academic year and over summer breaks.

We caught up with Birrell to ask her about the new project, which began October 1. Answers have been edited for clarity and length.

What is the background for this project?

A lot of the questions I'm interested in right now are: how do people interact with security and privacy systems? And how can patterns in those interactions be leveraged to design and implement better software tools and better technologies?

One of the problems I've been looking at over the last few years is the impact of privacy regulations on privacy technologies and user privacy. I've done several projects with students over the last three or four years looking at, for example, the impact of the California privacy regulation, which among other things gives California residents the right to opt out of sale of their personal information. How has that requirement been implemented? How usable is it? How does the presentation of choices impact whether people understand their options, whether they use them, whether they're able to get the settings to reflect their priorities? And what sort of tools can we as computer scientists build to make that better?

But one of the things that's come up repeatedly, both in my work and others' work, is the limitations of a model of privacy that assumes people need to protect themselves, that privacy is just about what you've agreed to, or what rights you've invoked or not invoked. Because that's not how most people deal with their privacy. Most people have preferences and priorities but don't necessarily have the time or patience to go through elaborate explanations and figure out exactly how to adjust all their settings to make sure those match their priorities.

How does this new project fit into your research interests?

The current project is about an alternate way of thinking about privacy and about what privacy regulations might look like. There's an idea floating around the legal community, often referred to as "information fiduciaries" or "data fiduciaries." That is, what if privacy laws imposed a fiduciary requirement on how companies handle data, similar to the professional fiduciary requirements for lawyers or doctors? If you hire a lawyer, they are obligated to act in your best interests, not their own. A doctor is bound to act in your best interests. They can't, for example, propose a surgery to make money if it's not going to be good for you. What would the world look like if privacy regulations required companies to handle personal information in the best interests of the people the data covers?

We're trying to figure out how to empirically measure the impact on computer systems of such a privacy regulation, one that encodes data fiduciary requirements. And how would those computer systems impact people's actual privacy online?

How will you go about discovering that?

That involves a few different techniques to discover how people actually interact with software. It involves interviewing people, both everyday internet users and developers and people in corporate settings. It involves designing surveys so that you can scale up those interactions. It involves measuring things online, particularly looking at before and after different regulations go into effect to see how systems respond to a new set of requirements. And it involves building new tools to try to implement these privacy goals.

What end results are you hoping for?

The hope is to empirically validate the fiduciary privacy concept and use the data to make recommendations to legislators and regulators about the potential impact of such regulation and whether it's something that should be pursued.

Also, hopefully we'll have good data on the types of tools that computer scientists can develop and the sorts of approaches computer science classes can take that would complement regulatory efforts.