My work examines the problems created by the exponential creation and sharing of information, especially personal information. As our lives become increasingly digital, we create, share, and leave information nearly everywhere, both deliberately and inadvertently. What are the long-term implications of a world where nearly every aspect of our lives is recorded or documented in some way? These questions inform my research.
I’m a Non-Resident Fellow at the Center for Internet and Society at Stanford Law School, where I occasionally post to the CIS blog.
Looking for my CV? Check out my public profile on LinkedIn or email me for a copy. [jenking a-t ischool dot berkeley dot edu]
You can also view most of my publications on my Google Scholar page: https://scholar.google.com/citations?user=O5jENBMAAAAJ&hl=en
Law Review Article
Jennifer King and Adriana Stephan. Regulating Dark Patterns in Practice – Applying the California Privacy Rights Act. Georgetown Law Technology Review (Forthcoming 2021).
King, Jennifer; Flanagan, Anne; Warren, Sheila. “Redesigning Data Privacy: Reimagining Notice & Consent for Human-Technology Interaction.” White paper report: World Economic Forum, 30 July 2020.
Mulligan, D.K., Regan, P.M., and King, J. “The Fertile Dark Matter of Privacy takes on the Dark Patterns of Surveillance.” J. Consum. Psychol., 30: 767-773 (2020).
Conference Paper (peer reviewed)
Jennifer King. “Becoming Part of Something Bigger”: Direct to Consumer Genetic Testing, Privacy, and Personal Disclosure. Proc. ACM Hum.-Comput. Interact. 3, CSCW, Article 158 (November 2019), 33 pages.
King, J. (2018). Privacy, Disclosure, and Social Exchange Theory. UC Berkeley. ProQuest ID: King_berkeley_0028E_17901. Merritt ID: ark:/13030/m5t77dzd. Retrieved from https://escholarship.org/uc/item/5hw5w5c1
Please note: I have uploaded a corrected version of this dissertation (Jan. 2019). The original filed version is available by following the external link above to the UC Berkeley e-repository, or you can contact me for a copy if you hit a paywall.
My dissertation was selected as the runner-up in the annual Information Schools (I-Schools) Organization Dissertation Award (2019).
Maintaining the privacy of one’s personal information—one’s choice of when to disclose it and to whom, how one maintains control over it, and the risks of disclosure—is one of the most important social issues of the internet era. For the past decade, privacy researchers have focused on several domains, including documenting public opinion about privacy attitudes and expectations, understanding how user interfaces affect disclosure, and examining interpersonal privacy dynamics within social media settings. All of this work shares the goal of furthering our collective understanding of how people think about information privacy in online settings, what they expect when disclosing their personal information, and why they make the disclosure choices they make. A common element missing from the extant privacy research is an accounting of social structures. More specifically, as researchers consider the various factors that affect personal disclosure, they often do not consider the relationship between the discloser and the recipient, and how aspects of that relationship may directly or indirectly affect one’s decision to disclose. A specific form of relationship I examine here is that between individuals and the companies to whom they disclose their personal information.
This dissertation explores how the structure of relationships between individuals and companies influences individuals’ decisions to disclose personal information. I accomplish this through a mixed-methods approach: first, I conducted twenty exploratory qualitative interviews with ten users of the 23andMe genetic testing service and ten women who used mobile apps to track their pregnancies. I interviewed all twenty participants about their experiences using online search engines. I then conducted three online survey experiments, using a hypothetical wearable fitness device that collects personal information as the premise of the study. The experiments tested a set of hypotheses and further explored themes that emerged from the qualitative research.
These studies examine the ways in which the relationship between individuals and the companies they disclose to, and in particular the distribution of power within the relationship, affects the individuals’ decisions to disclose. I use social exchange theory (SET) as the theoretical framework for this inquiry because the transfer of personal information in exchange for a service is an exchange between social actors. Thus, SET provides an empirically tested scaffolding for exploring key features of these relationships and their impact on the normative aspects of exchange that affect disclosure choices, specifically: individuals’ perceptions of trust, fairness, power, and privacy.
This dissertation breaks new ground in the analysis of information privacy and personal disclosure. The results of my mixed-methods studies demonstrate the utility of the relational analytic approach for identifying social structural factors that affect personal disclosure. Further, they demonstrate the influence of power on personal disclosure—the extent to which individuals can control the terms under which personal information is exchanged, the options available to them to obtain similar resources elsewhere, how fair the exchange is, and the extent to which individuals benefit from it. This approach yields a different set of insights into the dynamics of personal disclosure and information privacy: it reveals the impact of power differentials, demonstrating that imbalances in power between individuals and companies can affect individual decisions to disclose.
Jennifer King. “Understanding Privacy Decision-Making Using Social Exchange Theory.” Presented at The Future of Networked Privacy: Challenges and Opportunities workshop, CSCW, March 2015.
Jennifer King & Coye Cheshire. “Privacy, Disclosure, and Social Exchange.” Presented at the Privacy Law Scholars Conference (invitation only), June 2015, Berkeley, CA.
Jennifer King. “Taken Out of Context: An Empirical Analysis of Westin’s Privacy Scale.” Presented at the Workshop on Privacy Personas and Segmentation (PPS) at SOUPS, July 2014. Menlo Park, CA, USA.
Conference Paper (peer reviewed)
Christopher Thompson, Maritza Johnson, Serge Egelman, David Wagner, and Jennifer King. “When It’s Better to Ask Forgiveness than Get Permission: Attribution Mechanisms for Smartphone Resources.” Presented at the Symposium on Usable Privacy and Security, July 2013. Newcastle, UK.
Jennifer King. How Come I’m Allowing Strangers To Go Through My Phone? Smartphones and Privacy Expectations. Workshop on Usable Privacy and Security for Mobile Devices (U-PriSM) at SOUPS, July 2012. Washington, D.C., USA.
Law Review Article
Deirdre K. Mulligan and Jennifer King. “Bridging the Gap Between Privacy and Design.” University of Pennsylvania Journal of Constitutional Law, Vol. 14, Issue 4, 2012. Selected as a Leading Paper for Policymakers by the Future of Privacy Forum, 2012.
Conference Paper (peer reviewed)
Jennifer King and Aylin Selcugoklu. “Where’s the Beep? User Misunderstandings of RFID.” In Proceedings of 2011 IEEE International Conference on RFID.
Conference Paper (peer reviewed)
Hoofnagle, Chris; King, Jennifer; Li, Su; and Turow, Joseph. “How Different are Young Adults from Older Adults When it Comes to Information Privacy Attitudes and Policies?” Selected for the top privacy papers for policymakers by the Future of Privacy Forum, 2010.
Media reports teem with stories of young people posting salacious photos online, writing about alcohol-fueled misdeeds on social networking sites, and publicizing other ill-considered escapades that may haunt them in the future. These anecdotes are interpreted as representing a generation-wide shift in attitude toward information privacy. Many commentators therefore claim that young people “are less concerned with maintaining privacy than older people are.” Surprisingly, though, few empirical investigations have explored the privacy attitudes of young adults. This report is among the first quantitative studies evaluating young adults’ attitudes. It demonstrates that the picture is more nuanced than portrayed in the popular media.
In this telephonic (wireline and wireless) survey of internet-using Americans (N=1,000), we found that large percentages of young adults (those 18-24 years old) are in harmony with older Americans regarding concerns about online privacy, norms, and policy suggestions. In several cases, there are no statistically significant differences between young adults and older age categories on these topics. Where there were differences, over half of the young adult respondents still answered in the direction of older adults. That large numbers of young adults agree with older Americans on issues of information privacy is clearly socially significant.
A gap in privacy knowledge provides one explanation for the apparent license with which the young behave online. Forty-two percent of young Americans answered all five of our online privacy questions incorrectly, and 88 percent answered only two or fewer correctly. The problem is even more pronounced for offline privacy issues: post hoc analysis showed that young Americans were more likely than any other age group to answer no questions correctly.
We conclude, then, that young-adult Americans aspire to increased privacy even as they participate in an online reality that is optimized to increase their revelation of personal data.
Turow, Joseph; King, Jennifer; Hoofnagle, Chris; Bleakley, Amy; and Hennessey, Michael. “Americans Reject Tailored Advertising and Three Activities that Enable It.”
This nationally representative telephone (wire-line and cell phone) survey explores Americans’ opinions about behavioral targeting by marketers, a controversial issue currently before government policymakers. Behavioral targeting involves two types of activities: following users’ actions and then tailoring advertisements for the users based on those actions. While privacy advocates have lambasted behavioral targeting for tracking and labeling people in ways they do not know or understand, marketers have defended the practice by insisting it gives Americans what they want: advertisements and other forms of content that are as relevant to their lives as possible.
Contrary to what many marketers claim, most adult Americans (66%) do not want marketers to tailor advertisements to their interests. Moreover, when Americans are informed of three common ways that marketers gather data about people in order to tailor ads, even higher percentages – between 73% and 86% – say they would not want such advertising. Even among young adults, whom advertisers often portray as caring little about information privacy, more than half (55%) of 18-24-year-olds do not want tailored advertising. And contrary to consistent assertions of marketers, young adults have as strong an aversion to being followed across websites and offline (for example, in stores) as do older adults.
This survey finds that Americans want openness with marketers. If marketers want to continue using various forms of behavioral targeting in their interactions with Americans, they must work with policymakers to open up the process so that individuals can learn exactly how their information is collected and used, and can then exercise control over their data. We offer specific proposals in this direction. An overarching one is for marketers to adopt a regime of information respect toward the public rather than treating people as objects from which to extract information for optimal persuasion.
Jennifer King and Andrew McDiarmid. “Where’s The Beep? Security, Privacy, and User Misunderstandings of RFID.” In Proceedings of USENIX Usability, Security, and Psychology.
Chris Jay Hoofnagle and Jennifer King. “What Californians Understand About Privacy Online.”
Chris Jay Hoofnagle and Jennifer King. “What Californians Understand About Privacy Offline.”
Jennifer King, Deirdre Mulligan, and Steven Raphael. “CITRIS Report: An Evaluation of the Effectiveness of the City of San Francisco’s Community Safety Cameras.”
This study evaluates the effectiveness of the City of San Francisco’s Community Safety Camera (CSC) program. Chapter 1 describes the origins of the CSC program and the City of San Francisco’s primary and secondary policy objectives for it, as expressed in the statements, technical choices, policies, and practices made by the Mayor’s Office, the City’s Board of Supervisors, the Police Commission, the San Francisco Police Department, and other entities and individuals that have played key roles in shaping the program as it exists today. Chapter 2 provides an empirical analysis of the CSC program’s effectiveness in deterring crime, particularly violent crime. Chapter 3 analyzes the effectiveness of the CSC program as an investigatory and evidentiary tool, and considers the program’s effectiveness in supporting the secondary objectives of facilitating community participation, oversight and accountability, and the protection of privacy and related interests. Chapter 4 considers the managerial and technical aspects of the system that span all objectives, based on our findings. Chapter 5 provides guidance and recommendations to the City for the CSC program based on its current objectives, and offers preliminary thoughts on possible alternatives the City may consider for the program.
This report was sponsored by the Center for Information Technology Research in the Interest of Society (CITRIS) at UC Berkeley.
Jennifer King and Chris Jay Hoofnagle. “A Supermajority of Californians Support Limits on Law Enforcement Access to Cell Phone Location Information.” Presented at the 37th Research Conference on Communication, Information and Internet Policy (TPRC), September 26, 2008, George Mason University, Alexandria, VA.
Egelman, Serge, King, Jen, Miller, Robert C., Ragouzis, Nick, and Sheehan, Erika. “Security User Studies: Methodologies and Best Practices.” Extended abstracts of the ACM Conference on Human Factors in Computing Systems (CHI 2007). San Jose, CA, USA, April 28, 2007.
M. Meingast, J. King, and D. Mulligan. “Security and Privacy Risks of Embedded RFID in Everyday Things: the e-Passport and Beyond.” Journal of Communications, 2(7).
M. Meingast, J. King, and D. Mulligan. “Embedded RFID and Everyday Things: A Case Study of the Security and Privacy Risks of the U.S. e-Passport.” In Proceedings of IEEE International Conference on RFID, March 2007.
Chris Jay Hoofnagle and Jennifer King. “Consumer Information Sharing: Where The Sun Still Don’t Shine.”