A New Concept for Privacy in the Light of Emerging Sciences and Technologies



by Michael Friedewald, Fraunhofer ISI

Privacy is recognized as a fundamental human right. It underpins human dignity and other values such as freedom of association and freedom of speech, and it has become one of the most important human rights of the modern age. However, privacy is challenged in the networked society. New technologies undermine this individual right because they make it easy to collect, store, process and combine personal data, for the use not only of security agencies but also of businesses. In many cases this means that the notion of privacy is losing its value. A new concept therefore seems necessary.

Since January 2010 the European Commission has been funding the three-year project PRESCIENT (Privacy and Emerging Sciences and Technologies: Towards a Common Framework), which is coordinated by the Fraunhofer Institute for Systems and Innovation Research (ISI). The multidisciplinary team includes researchers from Trilateral Research & Consulting (UK), the Centre for Science, Society and Citizenship (Italy) and the Vrije Universiteit Brussel (Belgium). PRESCIENT aims to establish a new framework for privacy and for the ethical considerations arising from emerging technologies.

1     The ever-changing concept of privacy

Privacy is a multifaceted concept that is currently challenged by many developments in science and technologies. Some of the most prominent examples are identification technologies such as RFID (Radio Frequency Identification), social network services such as Facebook or the creation of large biobanks.

The concept of privacy has always been subject to change. People define it differently and value it differently. Moreover, privacy is often balanced against other values, such as society’s safety and security. Empirical research is needed to determine how people value privacy, however they define it, in order to understand how citizens understand the right to privacy and its value in the context of other fundamental rights.

Privacy is not only respect for confidentiality, although it implies it. Privacy is not only the right to be left alone, although it includes it. Privacy is not only the right to control one’s own life, although it entails it. Nor is privacy only data protection, although it also concerns data protection. Privacy is all these things together and more, because privacy is the word we use to describe an important aspect of one of the main, vital and constitutive polarities that shape human beings, that is, the tension between individuals and the community. How do new technologies impact on this complex and rich concept? What are the privacy issues arising from different emerging technologies? Multidisciplinary analysis is needed in order to appreciate the various philosophical, political, legal, ethical and social meanings of the word “privacy” in the contemporary technological world.

2     Privacy in technology decision-making

Privacy is also a salient topic in technology policy-making. There is a need for a new social dialogue on privacy rights that includes issues such as the new borders of the private domain, a new business ethics and a dialogue on the balance between civil and government rights. From the privacy issues raised by new technologies, a new taxonomy of privacy problems is needed to help policy-makers balance privacy against countervailing values, rights, obligations and interests.

Data protection is both broader and more specific than the right to privacy. The relationship between these concepts is certainly something that needs to be addressed for a new concept of privacy. Data protection is broader because it not only aims to concretize the protection of privacy, but also tends to protect other rights and interests such as the freedom of expression, the freedom of religion and conscience, the free flow of information and the principle of non-discrimination. It is more specific because it applies only where personal data are processed. The application of data protection rules does not require answering the question whether privacy has been violated: data protection applies whenever the legal conditions are fulfilled. Furthermore, data protection rules are not prohibitive by default; they channel and control the way personal data are processed. Such data may only be legitimately processed if certain conditions concerning the transparency of the processing and the accountability of the data controller are met.

Yet with the “technology revolution” the notion of privacy has started a new journey, beyond the mere legal sphere, which is probably leading privacy to its original roots, the relation between the citizen and the “polis”. We are facing new contexts (think, for instance, of the so-called PAN, personal area network, which describes a technology that could enable wearable computer devices to communicate with other nearby computers and exchange data) and new concepts (as, for example, the idea of genomic and proteomic information), not to mention issues raised by technologies such as biometrics, RFID, smart surveillance systems, body implants, nano devices and the like.

New technologies have specific features that make them quite different from traditional industrial technologies. Compared to the technologies that drove the industrial revolution – which were complex and based on collective action, social infrastructure and technical know-how – emerging technologies are lighter. They are decentred, dispersed and disseminated, and their control and use lie largely in the hands of individuals, citizen groups and small enterprises. They are network technologies (Castells 1996). In addition, new technologies help reduce the complexity of human (social, biological, political, etc.) interactions and allow individuals to distance themselves from what they observe. As Paul Virilio (1994) has emphasised, new technologies always bring about ever more, and ever faster, new technologies. Emerging technologies also imply a change in the relation between science and politics. In the last few decades, the representation of science has changed so much that some may say that “doing science is another way of doing politics” (Mordini 2007, p. 37). Indeed, the post-modern technological system is embedded in politics. Researchers are under increasing pressure to demonstrate the policy relevance of their findings and to deliver tangible results. In turn, policy-makers are under increasing pressure to justify their choices of technologies to be developed and socio-economic goals to be achieved. As emerging technologies often challenge basic moral assumptions, they provoke, directly or indirectly, a crisis – or at least a basic uncertainty – with regard to moral standards, whether these are sanctioned by law or remain tacit presuppositions. This amounts to a growing gap between citizens, technology and politics, notably when the individual’s private sphere conflicts with the notion of the common good.

3     The PRESCIENT project

The European Commission (EC) now recognizes the need for a new concept of privacy, for suitable methods to assess the impacts of emerging technologies, and for treating privacy as a central element in the global governance of science and technology. The PRESCIENT project will address these issues and aims to advance the state of the art in three main areas:

  1. Developing a concept for privacy: Until now, privacy has mainly been regarded as a legal issue or, increasingly, as a human rights issue. Yet very little work has been devoted to privacy as a value and to its role in the overall architecture of EU values as sketched by the Charter of Fundamental Rights of the EU. PRESCIENT intends to carry out case studies of five different emerging technologies (including identification and surveillance technologies) to determine whether there are privacy problems posed by new technologies that do not fall easily within commonly used classifications of privacy problems, such as the one suggested by Solove (2008). These five cases include 1) localisation and identification technologies, 2) smart surveillance, 3) biometrics, 4) on-the-spot DNA analysis and 5) technologies for human enhancement. The problem with framing privacy solely in individualistic terms is that privacy becomes undervalued. The interests aligned against privacy – for example, efficient consumer transactions, free speech or security – are often defined in terms of their larger social value. In this way, protecting the privacy of the individual seems extravagant when weighed against the interests of society as a whole. Ethical issues will also need to be addressed, especially as they arise in increasing numbers and often come “packaged” in complex technology. Such issues will require considerable effort to understand, as well as considerable effort to formulate and justify good ethical policies. People who both understand the technologies and are knowledgeable about ethics are in short supply just as the need for them is expanding (Moor 2005, p. 118).
  2. Privacy Impact Assessment (PIA): In Europe, policy-makers have considered the adequacy of data protection legislation, the powers accorded to national data protection authorities, and the tension between facilitating trade and transborder data flows while ensuring personal data remain protected and accessible, and are not misused, once they leave European jurisdiction. The focus has been primarily legislative. At the same time, the European Commission and others have been concerned about the advent of new technologies and how their possible privacy impacts can be addressed. The EC’s RFID consultation can, in some ways, be considered a groundbreaking initiative in the sense that the EC initiated a consultation with stakeholders on the introduction and deployment of a new technology, something that had not really happened before. It also recommended the use of privacy impact assessments in new RFID applications. Although PIAs have been discussed in a few countries for more than a decade, they have only recently been introduced (by the UK Information Commissioner’s Office) as a tool in Europe (Warren et al. 2008). Use of PIAs is likely to grow in the coming years. The PRESCIENT project will make the case for more extensive use of PIAs modified to take ethical considerations into account. PIAs used in tandem with ethical impact assessments could do much to address stakeholder apprehensions and, more specifically, the lack of public and stakeholder knowledge about new technologies and their ethical implications before the technologies are widely deployed.
  3. Privacy policies: Technology, particularly revolutionary technology, generates many ethical problems. Sometimes these problems can be treated easily under extant ethical policies, but at other times – because new technology allows us to perform activities in new ways – situations may arise in which we do not have adequate policies in place to guide us. Sometimes we can anticipate that the use of a technology will have consequences that are clearly undesirable. We need to anticipate these as best we can and establish policies that will minimise the deleterious effects of the new technology, as an element of the future governance of science and technology (Moor 2005, p. 115).

Understanding and taking into account the role of stakeholders, including the public, is important because they shape our (social) notions of privacy and how we assess the impacts of new and emerging technologies. More importantly, we need to take these views into account as a matter of social equity: new technologies and the issues they raise will impact the public, so the public must be consulted and given the opportunity to participate in policy-making. The privacy and ethical impact assessment framework, to be developed by the PRESCIENT partners, will be a way of unearthing and assessing ethical problems associated with new technology and involving stakeholders in the process. A final task of the project will be to formulate recommendations with regard to ethical approaches to the development of new technologies and to balance privacy and data protection against other values.


Castells, M., 1996: The Rise of the Network Society. Vol. 1: The Information Age: Economy, Society and Culture. Oxford

Moor, J. H., 2005: Why We Need Better Ethics for Emerging Technologies. In: Ethics and Information Technology 7/3 (2005), p. 111–120

Mordini, E., 2007: Nanotechnology, Society and Collective Imaginary: Setting the Research Agenda. In: Hodge, G. A.; Bowman, D.; Ludlow, K. (eds.): New Global Frontiers in Regulation: The Age of Nanotechnology. Cheltenham, p. 29–48

Solove, D.J., 2008: Understanding Privacy. Cambridge

Virilio, P., 1994: Die Eroberung des Körpers: Vom Übermenschen zum überreizten Menschen. München, Wien

Warren, A. et al., 2008: Privacy Impact Assessments: International experience as a basis for UK Guidance. In: Computer Law & Security Report 24/3 (2008), p. 233–242


Dr. Michael Friedewald
Fraunhofer Institute for Systems and Innovation Research (ISI)
Breslauer Straße 48, 76139 Karlsruhe
Tel.: +49 721 6809-146
E-Mail: michael.friedewald@isi.fraunhofer.de