Ambiguity in Privacy Requirements and Privacy Risk Perception

Dr. Jaspreet Bhatia
Carnegie Mellon University

Wednesday, Feb 6, 2019
10 AM - 11 AM
3405 A/B

Abstract:
Privacy requirements appear in privacy policies, which companies and regulators can use to ensure that stated policies are consistent with actual data practices. In addition, these requirements help users understand what a company does with their information. Ambiguity in privacy requirements leads to different stakeholder interpretations of the behavior and functionality of a system, thus undermining the utility of these requirements. In this talk, I will introduce new methods to: (a) identify, classify and measure vagueness; (b) detect incompleteness using semantic frames; and (c) measure perceived privacy risk due to ambiguity. This includes a theory of vagueness based on a taxonomy of vague terms and a Bradley-Terry model to predict the degree of vagueness in requirements. In addition, we use grounded analysis to represent requirements as semantic frames and then use deep learning (a bi-directional LSTM) to automate semantic role labeling for incompleteness detection. Finally, I will describe an empirically validated framework that combines factorial vignette surveys and multi-level modeling to predict changes in privacy risk due to changes in natural language requirements. Together, the ability to measure vagueness, incompleteness and risk will help us improve the quality of privacy requirements statements, consequently reducing misinterpretation.
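As background for the talk, the Bradley-Terry model mentioned above is typically fit from pairwise comparison counts (e.g., how often annotators judged one requirement phrase vaguer than another). The sketch below is a minimal, generic implementation using the standard minorization-maximization update; the win-count matrix and the fitting details are illustrative assumptions, not taken from the speaker's work.

```python
def bradley_terry(wins, n_iters=200):
    """Estimate Bradley-Terry strengths from a pairwise win-count matrix.

    wins[i][j] = number of times item i was preferred over item j
    (e.g., judged vaguer in a pairwise annotation task).
    Returns strengths normalized to sum to 1; the modeled probability
    that item i beats item j is p[i] / (p[i] + p[j]).
    """
    n = len(wins)
    p = [1.0] * n
    for _ in range(n_iters):
        new_p = []
        for i in range(n):
            total_wins = sum(wins[i])  # comparisons item i won
            # Standard MM update: wins divided by a sum over all
            # opponents of (comparisons played) / (combined strength).
            denom = sum((wins[i][j] + wins[j][i]) / (p[i] + p[j])
                        for j in range(n) if j != i)
            new_p.append(total_wins / denom if denom else p[i])
        s = sum(new_p)
        p = [x / s for x in new_p]  # normalize each iteration
    return p


# Hypothetical data: item 0 is judged vaguer than the others most often.
wins = [[0, 8, 9],
        [2, 0, 7],
        [1, 3, 0]]
strengths = bradley_terry(wins)
```

With these illustrative counts, the estimated strengths recover the intuitive ordering (item 0 most vague, item 2 least), and the ratios give predicted probabilities for unseen pairwise judgments.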

Biography:
Dr. Jaspreet Bhatia recently defended her PhD thesis at Carnegie Mellon University, in the Institute for Software Research, advised by Dr. Travis Breaux. Her research aims to improve the design of trustworthy systems for privacy and the trust that users and regulators have in privacy-preserving systems. This research is motivated by the need to develop scalable systems to analyze privacy requirements for defects and to measure perceived privacy risk. Dr. Bhatia employs diverse data collection, modeling and analysis methods, including user studies, statistical analysis, crowdsourcing, natural language processing and deep learning. Her research has been published in top-tier journals and conferences, including ACM Transactions on Software Engineering and Methodology, ACM Transactions on Computer-Human Interaction, and the IEEE Requirements Engineering Conference. Her research has won multiple paper awards, including a Distinguished Research Paper Award at the 26th IEEE Requirements Engineering Conference (RE) in 2018, a Best Paper Award nomination at the 24th IEEE RE in 2016, and an Honorable Mention for the Privacy Papers for Policymakers Award in 2016. Her work bridges an important gap between software engineering and law, and she has been an invited speaker at multiple public policy venues, including the FTC and NIST.

Host:
Dr. Arun Ross