2. Extending the Object of Responsibility Assessments in RRI

The discussion about responsibility with regard to NEST usually covers accountability for possible future consequences of scientific and technological progress and the use of its results. In the context of responsibility, it must be clarified which of the possible consequences can today be regarded as explicitly desired (the “right impacts” according to von Schomberg [VON 13]) or, in view of possibly unintended consequences, at least as responsible. In this chapter, this apparently obvious orientation of RRI is substantially expanded with regard to the object of responsibility: it is extended to include responsibility for the attributions of meaning currently being made to NEST developments and for the consequences of these attributions.

2.1. Motivation and overview

The basic idea of RRI is that research and innovation are supposed to take place responsibly. This demand occasionally leads to irritations, namely, when it creates the impression that previous research and innovation have taken place in a more or less irresponsible manner. It is quite easy to counter this irritation. The point of RRI is not to raise an accusation against previous research and innovation, but to make responsibility and accountability transparent and to open the relevant issues to society, such as in participatory approaches.1

The second and frequently posed question, namely what responsibility precisely means and how it can be distinguished from what is irresponsible or less responsible, is more difficult to answer. When the notion of responsibility is used, it is usually assumed to have a more or less clear meaning. However, this might be misleading, at least in the field of science and technology. Concerns have been expressed [BEC 92] that responsibility would be an empty phrase without reliable meaning, that it would be merely appellative and would moralize conflicts, that it would not be able to contribute to problem solving, that the uncertainty of knowledge about future consequences of today’s decisions would render any responsibility considerations ridiculous [BEC 93] and that the complex governance of modern science and technology involving many actors would lead to the effect of “thinning” responsibility. Therefore, in this chapter, I develop a heuristic for the concept of responsibility. This does not aim at philosophical depth2 but is intended to be a pragmatic tool (section 2.3).

The very idea of responsibility and responsibility ethics follows Max Weber’s distinction between the ethic of responsibility (Verantwortungsethik) and the ethic of ultimate ends (Gesinnungsethik) [WEB 46]. In this distinction, responsibility is related to a consequentialist approach. Taking over responsibility or assigning responsibility to other persons, groups or institutions indispensably requires, in this paradigm, the availability of valid and reliable knowledge, or at least of a plausible picture of the consequences and impact of decisions to be made or actions to be performed3. The familiar approach to discussing responsibilities is to consider future consequences of an action (e.g. the development and use of new technologies) and then to reflect on these consequences from an ethical point of view (e.g. with respect to the acceptability of technology-induced risk). This is also the customary understanding of the object of responsibility in the RRI debates (sections 2.2 and 2.4), where responsibility is considered in a consequentialist sense as accountability for the future consequences of technology and innovation [OWE 13b, p. 38]:

“The first and foremost task for responsible innovation is then to ask what futures do we collectively want science and innovation to bring about, and on what values are these based?” [OWE 13b, p. 37]

This is understandable and correct. But it excludes a dimension of responsibility in RRI, an omission which this chapter would like to help set right. At issue is the responsibility for the manner in which sociotechnical meanings are created and attributed to the NEST fields (section 2.5). This responsibility is also consequentialist, since these attributions have consequences (Figure 1.1), but in a different sense: at issue is not responsibility for the possible consequences of NEST in the near or distant future, but for the consequences of our current attributions of meaning (section 1.2). This extension of the object of responsibility is in line with the hermeneutic circle: searching for meaning and developing it further by moving through the circle has consequences not only in distant futures but also in present debates and present decision making.

I will begin with a few illustrations from the history of RRI (section 2.2). After that, I will introduce a pragmatic concept of responsibility as a social construct with empirical, ethical and epistemological dimensions (section 2.3). Then, I will systematize the use of the concept of responsibility in the RRI debates, making it clear that, until now, responsibility in RRI has been responsibility for the consequences of NEST that may occur in the future (section 2.4). The result of this chapter is that this approach must be supplemented by considering the responsibility for the attributions of meaning themselves, which form the basis of the RRI debates (section 2.5).

2.2. Some impressions of RRI debates so far

The ideas of “responsible research” in scientific and technological advance and of “responsible innovation” in the field of new products, services and systems have been discussed for approximately 15 years now with increasing intensity. The RRI concept has emerged mainly in connection with a large variety of new technologies subsumed under the NEST notion, such as synthetic biology, nanotechnology, new internet technologies, robotics, geoengineering, etc. However, the motivation to speak of responsible research and innovation goes back to large-scale national programs to conduct R&D on nanotechnology [GRU 14a]. The US National Nanotechnology Initiative [NNI 99] adopted a strategic goal of “responsible development”:

“Responsible development of nanotechnology can be characterized as the balancing of efforts to maximize the technology’s positive contributions and minimize its negative consequences. Thus, responsible development involves an examination both of applications and of potential implications. It implies a commitment to develop and use technology to help meet the most pressing human and societal needs, while making every reasonable effort to anticipate and mitigate adverse implications or unintended consequences” [NAT 06, p. 73].

Other actors active in research policy quickly followed. The UK Engineering and Physical Sciences Research Council published a study on responsible innovation for nanotechnology in the field of carbon capture. The Netherlands organized a national dialogue on nanotechnology, requesting that further development in nanotechnology should be “responsible” [GUS 14a]. The European Union adopted a code of conduct for nanoscience and nanotechnology (N&N) research [ECE 08] referring to research and development but also to public understanding and the importance of precaution. It also links responsibility reflection to governance [SIU 09, p. 32]: the guidelines “are meant to give guidance on how to achieve good governance”, and

“[g]ood governance of N&N research should take into account the need and desire of all stakeholders to be aware of the specific challenges and opportunities raised by N&N. A general culture of responsibility should be created in view of challenges and opportunities that may be raised in the future and that we cannot at present foresee” [ECE 08, SIU 09, p. 32].

Nanotechnology has attracted all this attention because it is an example of a technology known for its potentially high stakes, the deep uncertainties involved and the adverse effects that might occur. Thus, the field of and debate on nanotechnology might be regarded as a model for other RRI debates on NEST developments [GRU 14a]. The purpose of the RRI activities on nanotechnology was to enhance the chances that the technology will help to improve the quality of human life, that possible unintended side effects will be discovered as early as possible in order to enable society to prevent or compensate for them and that, accordingly, the benefits of these technologies and innovations can be harvested.

This rationale is well known from the field of technology assessment (TA) [GRU 09a], in particular from constructive TA [RIP 95]. The control dilemma [COL 80], however, emphasizes that shaping technology so as to harvest intended effects and avoid unintended ones is an ambitious task in danger of coming either too late or too early. Facing this dilemma, the conceptual development of major parts of TA over roughly the last 10 years may be characterized as an “upstream movement” toward the early stages of technology development [VAN 13b]. The expectation was and still is that giving shape to technology should also be possible when only little knowledge is available about the applications and usage of the technology under consideration. Various approaches were proposed to circumvent the control dilemma [LIE 10]. The fields of technology considered, such as nanotechnology, nano-biotechnology and synthetic biology, show a strong enabling character, leading to a manifold of possible applications in different areas which are extremely difficult to anticipate (see section 1.3.1). This situation makes it necessary to shape any reflective activity on responsibility as an accompanying process referring to the ethical, social, legal and economic issues at stake [SCH 06, Chapter 5] – well known from the field of technology assessment [VAN 97].

Consequently, the proposed RRI definition breathes the spirit of technology assessment in this sense [VON 07], because it basically introduces RRI as a process, enriched by ethical elements derived from the responsibility issue:

“Responsible Research and Innovation is a transparent, interactive process by which societal actors and innovators become mutually responsive to each other with a view to the (ethical) acceptability, sustainability and societal desirability of the innovation process and its marketable products (in order to allow a proper embedding of scientific and technological advances in our society)” [VON 12].

RRI adds explicit ethical reflection to the procedural upstream movement of TA and involves the ethics of engineering and technology as the second major root of RRI [GRU 11a]. RRI brings together TA, with its experience in assessment procedures, actor involvement, participation, foresight and evaluation, and engineering and technology ethics, in particular under the framework of responsibility [JON 84, DUR 87]. This integration overcomes the separation of ethics and TA which led to heated discussions in the 1990s [GRU 99].

A further integration concerns the relation between ethics and TA, on the one hand, and actor constellations and contexts of deliberation and decision making, on the other. Because RRI applies a “make” perspective, the socio-political dimension of the processes under consideration must be taken into account – and this leads to the necessity of involving the social sciences, in particular the field of STS (science, technology and society) studies. RRI unavoidably requires more intense inter- and transdisciplinary cooperation between engineering, the social sciences and applied ethics. Thus, the novelty of RRI mainly consists in this integrative approach [GRU 11a]. This interpretation of the genesis and origins of RRI makes it easy to relate RRI to the EEE concept of responsibility (see section 2.3.2), because RRI involves the empirical dimension (actor constellations and procedural aspects), the ethical dimension of the moral acceptability of consequences and impacts of NEST developments and the epistemological question of the quality of the available knowledge about consequences.

An instructive example of what RRI could mean in practice is the research program “Responsible Innovation – Ethical and Societal Exploration of Science and Technology” (MVI, following its Dutch name) of the Dutch Organization for Scientific Research (NWO). The MVI program – which is among the earliest manifestations of RRI – focuses on technological developments that are expected to have an impact on society [VAN 14a]. On the one hand, these developments concern NEST fields such as ICT, nanotechnology, biotechnology and cognitive neuroscience, and on the other hand, technological systems in transition such as energy, agriculture and healthcare. The MVI program contributes to responsible innovation by increasing the scope and depth of research into societal and ethical aspects of science and technology [NWO 16]. The projects funded under this program have to demonstrate a “make” perspective beyond mere scientific research:

“Projects for research into ethical and societal aspects of concrete technological developments must always have a ‘makeable’ perspective. In other words, they must not only lead to an analysis and an improved understanding of problems, but also result in a ‘design perspective’ – in the broadest sense, including institutional arrangements” [NWO 16].

In 2009, the MVI program started funding 15 projects in the first round [NWO 16]. One example is the project “New economic dynamics in small producers’ clusters in northern Vietnam – Institutions and responsible innovation with regard to poverty alleviation”, which focused on the analysis and enhancement of the value-added chains of local producers. It “builds further on the research outcome by exploring the potential importance of these specific technological cases for poverty reduction in developing countries, thus whether the innovations could be labeled as ‘responsible innovations’. Vietnam offers a particularly interesting research context since the innovations of poor small producers are based on private initiatives with an institutional environment in transition” and aims:

  – “to understand the concept of ‘responsible innovation’ and its valorization in small producers’ clusters in northern Vietnam;
  – to explain the multi-level institutional framework enabling and facilitating the small producers to innovate;
  – to assess how the institutional framework interacts with small producers’ economic behavior through incentives” [NWO 16].

This description clearly shows that research in the framework of RRI is not an end in itself but rather a means for analyzing and then improving the conditions of local life in the region considered. A Valorization Panel – obligatory for all of the MVI projects – ensures that the “make” perspective is observed in conducting the projects.

This example shows some of the new emphases of the responsible research and innovation approach compared to existing approaches such as TA and engineering ethics [GRU 11a, OWE 13a]:

  – “Shaping innovation” complements or even replaces the former slogan “shaping technology”, which characterized approaches informed by social constructivist ideas of technology [BIJ 94]. This shift reflects the insight that it is not technology as such which influences society and therefore should be shaped according to human needs, expectations and values, but rather the innovation processes by which technology and society interact;
  – A closer look is taken at the societal contexts of new technology and science. RRI can be regarded as a further step toward taking the demand-pull perspective and social values in shaping technology and innovation more seriously;
  – Instead of the distanced observation expected by classical paradigms of science, there is a clear commitment to intervention in the development and innovation process: RRI projects are expected to make a difference not only in terms of scientific research but also as interventions in the real world. Thus, RRI might be regarded as part of “transformative science” [SCH 13];
  – Based on earlier experiences with new technologies such as genetic engineering and with the corresponding moral and social conflicts, there is a strong incentive to “get things right from the very beginning” [ROC 02] instead of being forced to repair communicative or other damage later in the R&D or diffusion process;
  – User, stakeholder and citizen involvement in the research and innovation processes is regarded as an important approach for better integrating societal needs and perspectives, on the one hand, and technology and innovation, on the other hand [VON 12, OWE 13a].

Thus, responsible research and innovation can be regarded as a further development, even a radicalization, of the well-known post-normal science [FUN 93], being even more closely related to social practice, stakeholders and innovation policy and being prepared for intervention and for taking responsibility for this intervention and its consequences.

2.3. A pragmatic view on the notion of responsibility

Upon closer examination, the meaning of the concept of responsibility is by no means as obvious as it may appear at first glance:

“This [defining the concept of responsibility, A.G.] appears to be a simple operation […]. However, the constant increase in its usage in different sectors, in order to respond to the challenges related to innovation, has generated a proliferation of meanings and acceptions that makes such an operation not an easy matter” [GIA 16, p. 29].

For this reason, I will first provide a pragmatic explanation of the concept (section 2.3.1), followed by a suggestion for its operationalization in three dimensions (EEE approach, section 2.3.2) and by conclusions regarding responsibility assessment in the framework of RRI for NEST developments (section 2.3.3).

2.3.1. The concept of responsibility

Responsibility is a topic we talk about and discuss when there is a reason to do so. Answering questions about responsibility is usually controversial among those involved (e.g. concerning its distribution, attribution, guilt or benefits). On the one hand, this can refer retrospectively to the responsibility for previous actions and decisions, such as the clarification of guilt in legal issues. On the other hand, accountability and the distribution of responsibility for decisions that are still to be made can be addressed prospectively, and it is only with this prospective issue that this book deals. This can, for example, be the case if new and vague aspects arise for which there are not yet any rules or criteria for the attribution of responsibility or if the latter are a matter of controversy. The purpose of speaking about responsibility is to overcome this vagueness or controversy and achieve a consensus over the structure of responsibility in the affected field. Speaking about responsibility thus ultimately serves a practical purpose: the clarification of the specific responsibilities for actions and decisions. “Responsibility ascriptions are normally meant to have practical consequences” [STA 13, p. 200].

The special role that the concept of responsibility plays in discussions of the early design of scientific and technological progress and in dealing with its consequences is obvious [LEN 07]. The reasons are, first, that new opportunities for action are created by science and technology, that the responsibilities for these opportunities have not yet been clarified and that routines for handling responsibility and accountability do not yet exist. Second, doubts are raised about the existing structures of responsibility, especially as to whether they are still adequate in view of the spatial and temporal reach of technical actions and the depth of their intrusion into the life of individuals and society [JON 84]. These discussions comprise questions of the distribution of responsibility between science, politics and civil society or between different generations, the range of responsibility in space and time and the ethical foundations for assessing decisions with regard to accountability, to the object of responsibility itself and to the carriers of responsibility. All of them concern the prospective dimension of responsibility, which is also the object of RRI [OWE 13a, GRI 13].

Actors can feel responsible or be held responsible by others for decisions and actions and for their consequences, which constitute the object of responsibility:

“Responsibility can be understood as a social construct that establishes relationships between a set of different entities. Primary among them are the subject and the object. The subject is the entity that is held responsible. The object is that which the subject is responsible for” [STA 13, p. 200].

The decisive precondition is the ability to allocate the results of actions to an active party. This can be a person, a group, an institution, or, metaphorically speaking, a collective such as a generation. On this basis, it is possible to attribute responsibility, whether by oneself or by others. Responsibility is a social construct, namely, the result of social processes of attribution [STA 13, p. 200]. The attribution of responsibility is itself an act that takes place subject to objectives and relative to the rules of attribution [JON 84]. The circle of actors that are capable of being responsible has to be limited, and criteria must be given as to what conditions the individual actors have to satisfy in order for them to be held responsible (e.g. minimum age). The possibility of attributing responsibility depends on which requirements in this regard are tied to the capacity of the actors to take action [GRI 13].

Ethical issues pertaining to responsibility arise if a question is raised as to the rules and criteria according to which responsibility is supposed to be accepted, for instance, the ethical principles according to which decisions over accountability for actions are to be made or over whether accepting risks is reasonable [GRU 08a]. In addition to the ethical issues, a large role is played by the question of how credible our knowledge of an action’s consequences is. The attribution of responsibility must, therefore, also be relative to the status of our knowledge, precisely because in the framework of NEST debates this knowledge is frequently epistemologically precarious (Chapter 3). In this way, we obtain a concept of responsibility that is productive and pragmatically usable [GRU 14c, GRU 16a]. It brings together the subjects of responsibility with its objects and embeds this relation in normative as well as epistemological terms:

  – someone (an actor, e.g. a scientist or a regulator) assumes responsibility or is made responsible (responsibility is assigned to her/him) for;
  – something (the outcomes of actions or decisions including unintended side effects) relative to;
  – rules and criteria (in general, the normative framework valid in the respective situation (Chapter 3) [GRU 12b], e.g. laws, regulation and rules of responsible behavior given in a code of conduct) and relative to the;
  – knowledge available (knowledge about the impacts and consequences of the action or decision under consideration, also including meta-knowledge about the epistemological status of that knowledge and uncertainties involved).

Though the first two factors are, in a sense, trivial preconditions for making sense of the word “responsible”, they indicate the fundamental empirical dimension of assigning responsibility, which inevitably is a process among social actors. The third and fourth factors open up essential dimensions of responsibility: the dimension of rules and criteria comprises the principles, norms and values that are decisive for judging whether a specific action or decision is regarded as responsible or not – this constitutes the ethical dimension of responsibility. The knowledge available and its quality, including all the uncertainties involved, form its epistemic dimension. Together, I call this the EEE approach to responsibility [GRU 14c]. A schematic rendering of this four-place relation is sketched below.
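To make the structure of this four-place relation explicit, the following minimal sketch renders it as a simple data structure. It is purely illustrative: all names (ResponsibilityAttribution, NormativeFramework, KnowledgeBase) and the example values are hypothetical and do not belong to any existing RRI framework or toolkit; they merely mirror the four factors listed above.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class NormativeFramework:
    """Rules and criteria against which responsibility is judged (ethical dimension)."""
    rules: List[str]       # e.g. laws, regulations, codes of conduct
    criteria: List[str]    # e.g. acceptability of risk, sustainability

@dataclass
class KnowledgeBase:
    """Available knowledge about consequences, including its epistemic quality."""
    statements: List[str]  # what is claimed about impacts and consequences
    uncertainty: str       # meta-knowledge, e.g. "robust", "uncertain", "speculative"

@dataclass
class ResponsibilityAttribution:
    """Someone is responsible for something, relative to rules and to available knowledge."""
    subject: str                    # the actor held responsible (empirical dimension)
    object: str                     # the outcomes or decisions accounted for
    framework: NormativeFramework   # ethical dimension
    knowledge: KnowledgeBase        # epistemic dimension

# Hypothetical example, purely for illustration:
attribution = ResponsibilityAttribution(
    subject="research consortium",
    object="containment measures in synthetic biology laboratory work",
    framework=NormativeFramework(
        rules=["EU code of conduct for N&N research"],
        criteria=["bio-safety", "transparency"],
    ),
    knowledge=KnowledgeBase(
        statements=["escape scenarios judged unlikely under current protocols"],
        uncertainty="uncertain",
    ),
)
print(attribution.subject, "is held responsible for", attribution.object)
```

The point of the sketch is only that dropping any one of the four components leaves the notion of responsibility underdetermined, which is what the EEE approach makes explicit.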

2.3.2. The EEE approach to responsibility

Relevant questions arise in all of these three EEE dimensions in prospective RRI debates on NEST fields [GRU 14c, GRU 16a]:

  1) The empirical dimension of responsibility takes seriously that the attribution of responsibility is an act performed by specific actors that affects others. It refers to the basic social constellation of assignment processes. Attributing responsibilities must, on the one hand, take into account the possibilities of actors to influence actions and decisions in the respective field. Issues of accountability and power are part of the game. On the other hand, attributing responsibilities has an impact on the governance of that field. Shaping that governance is the ultimate goal of debating issues of assigning and distributing responsibility ex ante. Relevant questions are: How are capabilities, influence and the power to act and decide distributed in the field considered? Which social groups are affected and could or should help decide about the distribution of responsibility? Do the questions under consideration concern issues to be debated in the polis or can they be delegated to groups, societal subsystems or the marketplace? What consequences would a particular distribution of responsibility have for the governance of the respective field, and would it favor desired developments?
  2) The ethical dimension of responsibility is reached when the question is raised as to the criteria and rules for judging actions and decisions under consideration as responsible or irresponsible, or for finding out how actions and decisions could be designed to be (more) responsible. Insofar as normative uncertainties arise [GRU 12b], e.g. because of ambiguity or moral conflicts, ethical reflection on these rules and their justifiability is needed. Relevant questions are: What criteria allow us to distinguish between responsible and irresponsible actions and decisions? Is there consensus or controversy on these criteria among the relevant actors? Can the actions and decisions in question (e.g. about the scientific agenda or about containment measures to prevent bio-safety problems) be regarded as responsible with respect to the rules and criteria?
  3) The epistemic dimension asks about the knowledge of the object of responsibility and its epistemological status and quality. This is a relevant issue in particular in debates on scientific responsibility because statements about the impacts and consequences of science and new technology frequently show a high degree of uncertainty. The observation that nothing follows from “mere possibility arguments” [HAN 06] indicates that, in debates over responsibility, it is essential that the status of the available knowledge about the futures to be accounted for is determined and critically reflected from an epistemological point of view [GRU 12b, Chapter 10]. Relevant questions are: What is really known about prospective objects of responsibility? What could be known in the case of more research, and which uncertainties are pertinent? How can different uncertainties be qualified and compared to each other? And what is at stake if worse comes to worst?

This brief analysis shows that the issue of responsibility is not only one of abstract ethical judgment but also necessarily includes issues of concrete social contexts and governance factors (which have to be treated empirically) as well as the issue of the epistemological quality of the knowledge available. It seems that the familiar criticisms of responsibility reflections, namely that they are merely appellative, epistemologically blind and politically naïve, are related to a narrowing of responsibility to its ethical dimension. These criticisms can be met, and the notion of responsibility made workable, by considering the EEE dimensions of responsibility together [GRU 14c, GRU 16a].

2.3.3. Responsibility assessment

The notion of an assessment has frequently been used in the past decade to designate complex processes of gaining insight into diverse fields of interest with the objective of providing meta-information which can then be used to inform decision makers and the public. Some examples are the Intergovernmental Panel on Climate Change (IPCC) with its well-known, voluminous assessment reports [IPC 14], the Global Energy Outlook with its assessment of the current and foreseeable situation concerning energy supply worldwide and the fields of technology assessment [GRU 09a], risk assessment and sustainability assessment [SIN 09], which have become well-established fields of investigation and reasoning as well as the foundation for scientific policy advice.

There are cognitive and evaluative dimensions to assessments. Cognitively, bodies of knowledge from various fields have to be compiled and integrated from a common perspective4. This integrated knowledge must then be evaluated in relation to certain issues from practice in order to draw conclusions regarding actions and decisions (see Grunwald [GRU 09a] for the case of technology assessment). Assessments are regulated procedures in which the integration and evaluation of knowledge take place in specific steps that are as transparent as possible, frequently with the integration of participative elements.

In the field of RRI, assessment types such as risk assessment and technology assessment are used to cover specific issues [GRU 11a]. However, the notion of responsibility assessment has not been used systematically so far. In analogy to the established forms of assessment, the goal of responsibility assessment is to produce evaluations of responsibility for the respective object of debate and of an adequate distribution of responsibility to the relevant actors, corresponding to the postulates of stakeholder participation and transparency that are characteristic of RRI. Viewed in this manner, responsibility assessments are core tasks of the RRI debates.

These assessments require guidance in order to be performed in a comprehensible but also operable manner. The EEE concept offers itself as a starting point. Clarifications must be made in each of its three dimensions, possibly even including the determination of indicators of accountability5. This exceeds the scope of this book, which can only provide initial suggestions:

  1) The empirical part covers the actor constellation in the respective field: the groups, institutions and perhaps persons involved, including their relationships, the accountabilities, the power relationships and so forth – in short, a model of the governance of the field under consideration (see Grunwald [GRU 12b, Chapter 7] for what this could look like in synthetic biology). This model must be part of a responsibility assessment because responsibility, in its social dimension, results from social processes of assignment according to rules of assigning responsibility. In NEST issues, this model usually has a prospective dimension because a responsibility reflection not only considers present issues, such as the organization of laboratory research with regard to safety and security, but also applies to responsibility assignments at some point in the future. The current debate on nuclear waste disposal in Germany and many other countries is an excellent example of how arrangements of authorities, institutions, supervisory bodies, scientific advisory boards and means of participative and public debate are negotiated in order to identify appropriate responsibility and accountability structures to ensure a responsible, safe and transparent process over a long time [END 16].
  2) The ethical part comprises all the normative issues involved, such as criteria of responsibility and their ethical background, including weighing different and possibly conflicting issues against each other. This dimension also involves a prospective part because ethics does not provide an eternal body of rules and principles which only have to be applied to upcoming questions and challenges. Relating the abstract principles of ethics, e.g. the Categorical Imperative of Immanuel Kant or the utilitarian principles of John Stuart Mill and others, to the specific questions under consideration is not simply an application of these principles to cases but rather requires bridging the gap by intermediary interpretative and hermeneutic steps, which necessarily involve social and cultural issues that might change over time. Thus, despite the fact that normative ethics aims at more abstract and universal principles, any responsibility assessment must relate principles to cases, which is impossible without considering empirical attitudes, customs and perceptions. As far as far-reaching futures are involved, the responsibility assessment must not simply extend today’s situation in moral terms to the future while assuming, for example, fast technological advance. Instead, thinking in techno-moral scenarios might be helpful [WAE 14].
  3) The epistemological part is crucial because the dimension of prospective knowledge constitutes the core challenge in many future-related debates. Responsibility assessment in the consequentialist paradigm has to look at the future consequences and impacts of alternative actions and decisions to be made. The most responsible action or decision would be the one with the best perspective (relative to its ethical dimension, see above) regarding the envisaged future outcomes and consequences. The responsibility assessment, therefore, must investigate the possible future consequences in an – ideally – comprehensive way in order to allow a comparative view of all the options under consideration. It must also provide instruments to compare the different sets of consequences and to allow the integration of different aspects into a consistent picture. Multi-criteria decision analysis (MCDA) [BEL 02] is one approach aiming at a transparent process of aggregating different criteria and assessments (a minimal weighted-sum sketch follows this list). However, the frequent criticisms of this method clearly demonstrate the fundamental difficulty of obtaining an integrated picture by aggregating incommensurable criteria. This situation is even worse in the field of prospective knowledge because of the high uncertainties involved. In NEST fields, there is, according to the initial observations of this book (section 1.1), almost no chance of applying a quantitative method. Thus, the epistemological dimension seems to be the most challenging one not only in methodological terms but also in cognitive terms.
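As a purely illustrative sketch of what MCDA-style aggregation amounts to in its simplest form, the following weighted-sum example compares two options across several criteria. The options, criteria, scores and weights are invented for illustration; as argued above, such quantification is rarely attainable for NEST fields, and the result depends entirely on the chosen weights and on the commensurability of the criteria.

```python
from typing import Dict

def weighted_score(scores: Dict[str, float], weights: Dict[str, float]) -> float:
    """Aggregate criterion scores (0..1) into one value using normalized weights."""
    total = sum(weights.values())
    return sum(scores[c] * weights[c] for c in scores) / total

# Hypothetical options and criteria, for illustration only
options = {
    "option A": {"safety": 0.8, "benefit": 0.5, "reversibility": 0.9},
    "option B": {"safety": 0.6, "benefit": 0.9, "reversibility": 0.4},
}
weights = {"safety": 0.5, "benefit": 0.3, "reversibility": 0.2}

for name, scores in options.items():
    print(name, round(weighted_score(scores, weights), 2))
# option A -> 0.73, option B -> 0.65: the ranking can flip if the weights change,
# which is precisely the difficulty with incommensurable criteria noted above.
```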

Previous experiences with assessment procedures which also had to deal with issues of high uncertainty and incommensurable criteria are helpful and could provide some insights [PER 07]. However, NEST fields are special in some respects and need specific consideration, given the epistemologically precarious nature of prospective knowledge (Chapter 3).

2.4. The object of responsibility debates in RRI so far

It has long been a matter of controversy whether science and engineering have any morally relevant content at all and could therefore be subject to ethical reflection on responsibility. Until well into the 1990s, technology was frequently held to be value neutral. Numerous case studies have, however, since recognized the normative background of decisions on technology (even of those made in the laboratory [VAN 01]) and made it a subject of reflection [VAN 09]. Technology is morally relevant, particularly concerning its purposes and goals, the measures and instruments used and the evolving side effects. It is therefore subject to responsibility debates [JON 84, DUR 87] and related ethical reflection [GRU 99, VAN 09]. This is also true of science. Thus, science and technology exhibit morally relevant aspects, namely, concerning (1) the purposes they pursue, (2) the instruments they employ and (3) the consequences and side effects they produce [GRU 12b, GRU 13b]:

  1) In order to shape technology and provide orientation to science, imagination is needed regarding the desired future developments, the goals and visions for the future of society, and ideas of what science and technology should contribute to meeting future challenges. In many cases, the aims and goals of science and technology are not problematic. To develop therapies for illnesses such as Alzheimer’s disease, to provide new facilities to support handicapped people, or to protect society against natural hazards – visions of this type can be sure to gain high social acceptance and ethical support. There are – at the normative level – no rational reasons to object to them (this might be different at the level of the instruments needed or the side effects to be expected, see below). In other areas, however, there are social conflicts even at the normative level. The visions related to manned spaceflight, for example, are controversial in nature. In the field of nanotechnology, the discussions surrounding the converging technologies [ROC 02], especially those concerning the vision of improving human performance, are the most likely to become subjects of moral controversy. These questions lead to the challenge posed by knowledge policy [STE 04]: What knowledge do we want and what do we not want? Obviously, such questions will be subject to responsibility debates and deliberation.
  2) Instruments, measures and practices in research and development may lead to moral conflicts regardless of the respective goals. Examples are the moral legitimacy of experiments with animals or of practices making use of human persons, embryos or stem cells as subjects of research, but also experiments with genetically modified organisms or plants, especially their release outside of laboratories or, in earlier times, the experimental testing of nuclear weapons. Professional research and engineering ethics are the respective fields of reflection, for example, concerning codes of conduct or rules of good scientific and engineering practice. This branch of ethical issues involved in research and technology, however, is not at the heart of RRI debates: “We have also spoken of the need for reflection on the purposes of innovation […] and on the need to anticipate the impacts innovation might have” [OWE 13b, p. 37].
  3) Since the 1960s, the unintended and adverse effects of scientific and technical innovations have been considerable, and some of them have been of dramatic proportions: accidents in technical facilities (Chernobyl, Bhopal and Fukushima), threats to the natural environment (air and water pollution, ozone holes and climate change), negative health effects as in the case of asbestos, social and cultural side effects (e.g. labor market problems caused by automation) and the intentional abuse of technology (the attacks on the World Trade Center). These experiences were among the motivations to speak of a second and more reflexive modernity [BEC 92]. The increasing complexity of technical systems, their diverse interlacing and their connectivity with many areas of society increase the difficulty of predicting and considering the consequences of actions or decisions. This applies in particular to enabling technologies (section 1.3.1) such as nanotechnology and leads immediately to responsibility debates: How can a society that places its hopes and trust in innovation and progress harvest the expected benefits but simultaneously protect itself from undesirable, possibly disastrous side effects, and how can it preventively stockpile knowledge to cope with possible future adverse effects? What extent of risk or ignorance is morally acceptable? How is responsible action possible in view of the high uncertainties involved?

From the aims of RRI (see section 2.2) and the discussions about the individual fields of NEST, it is clear that the objective of reflection on responsibility is to recognize the possible future consequences of these scientific and technological developments. The objective is to design research and development today in such a manner that responsible consequences, or ideally the “right impacts” [VON 13], occur. Not surprisingly, RRI thus places itself in the tradition of the consequentialist-oriented ethics of technology [JON 84] and of technology assessment [GRU 09a].

2.5. The object of responsibility debates in RRI: an extension

There are two independent reasons why this determination of the objects of responsibility is comprehensible and necessary but nonetheless unsatisfactory:

  1) The consequences of NEST that might occur in the future and that are the objects of RRI debates are frequently epistemologically precarious or even, to a great extent, border on speculation [NOR 07a] (see section 3.2). This lack of knowledge limits the possibility of drawing valid conclusions for responsibility assignments and assessments6. The following quote, taken from a visionary paper on synthetic biology, makes the point:

“Fifty years from now, synthetic biology will be as pervasive and transformative as is electronics today. And as with that technology, the applications and impacts are impossible to predict in the field’s nascent stages. Nevertheless, the decisions we make now will have enormous impact on the shape of this future” [ILU 07, p. 2].

It expresses (1) that the authors expect synthetic biology to lead to far-reaching and revolutionary changes, (2) that our decisions today will have a high impact on future development, but (3) that we have no idea what that impact will be. In this situation, there would be no chance of assigning responsibility; even speaking about responsibility would no longer serve a valid purpose.

Many, perhaps almost all, responsibility debates on NEST issues consider narratives about possible future developments involving visions, expectations, fears, concerns and hopes that can hardly be assessed with respect to their epistemological validity (Chapter 3). It thus appears questionable whether consequentialism applied to more or less speculative consequences is at all sensible for NEST. At least for the core field of the consequences of today’s research, as it is expressed both in the search for the “right impacts” [VON 12] and in the approaches to avoiding undesired and unintended consequences [BEC 07], consequentialism threatens to fail because of the unavailability of reliable knowledge about the consequences:

“However, this way of understanding responsibility tends to assume a consequentialist perspective that cannot answer to the uncertainty that characterizes the development of innovative techniques and technologies. RRI’s crucial issue, the one for which we make use of the criterion of responsibility, is exactly to provide an answer to the uncertainties that are implied in the complex relations between individual actions, social relations, and natural events” [GIA 16, p. 36].

If, however, the assumed consequences of NEST disappear into an epistemological nirvana, any concern based on the ethics of responsibility would be pure speculation and its conclusions, accordingly, would be completely arbitrary. This would demonstrate its inability to provide reliable orientation. This doubt is not new. Bechmann [BEC 93] already pointed out that the ethics of responsibility can lose its subject area in certain constellations. The debate over speculative nanoethics is also driven by similar concerns [KEI 07, NOR 07a, NOR 09]. While this issue has been mentioned over and over again in RRI studies, its systematic consequences have hardly been drawn.

  2) The creation and attribution of sociotechnical meaning take place primarily at the beginning of or even before the RRI debates and can strongly influence them (Figure 1.1), perhaps most strongly in the familiar form of a self-fulfilling or a self-destroying prophecy [MER 48, WAT 85]. One of the arguments of Nordmann/Rip [NOR 09] against speculative nanoethics was precisely that it could motivate unnecessary and irrational debates which could damage nanotechnology and distract from other, real problems. For this reason, such acts of attribution have to be viewed from the perspective of responsibility, independently of the uncertainty of our knowledge of the consequences. This is also demonstrated by considering attributions of meaning from the viewpoint of action theory, regardless of whether the attributions take place by means of the creation and communication of technology futures or by means of approaches to definition and characterization:
  – they themselves are actions; neither technology futures nor definitions arise on their own, but are made in social processes and have authors;
  – authors pursue their own goals and purposes: something is supposed to be achieved by technology futures and definitions;
  – in the process, diagnoses are employed as to why the specific attribution of meaning is expected to achieve that objective;
  – their implementation requires means: texts, narratives, diagrams, images, works of art, films, etc.;
  – the implementation itself constitutes an intervention in the real world and has (more or less far-reaching) consequences (see section 1.3), intended or unintended.

As a result, the creation and attribution of meaning can be addressed from the perspective of responsibility, just as any action can. This remains a consequentialist consideration, since what is considered are the consequences of attributions of meaning. Yet, the object of responsibility here is not the distant future and the possible consequences of today’s NEST developments, but the processes of communication and understanding concerning the meanings of NEST that are themselves taking place in the present. At issue are the consequences that certain attributions of meaning can have and what this then could mean for the attribution of responsibility.

Attributions of meaning and the communicative acts involved in them should therefore be taken up as objects of RRI debates in addition to the possible or presumed consequences of NEST. This extension has indeed already been implied in preliminary works on vision assessment [GRU 09b] and hermeneutic orientation [GRU 14b, GRU 16b], but so far only in order to take into account the deficits in our knowledge of consequences (point 1 above). In contrast to that, the extension of the subject area of RRI debates suggested here is not viewed merely as an expedient for cases in which other means of creating orientation no longer function; it follows a separate line of argumentation.

Responsibility – and in this I follow Max Weber – is fundamentally conceived in consequentialist terms (this is also the view of Owen et al. [OWE 13b, p. 35], who refer to the prospective dimension of responsibility). To accept responsibility for something, or to have it attributed, does not make sense conceptually without the dimension of consequences and thus without prospective considerations [STA 13, p. 200] (section 2.3). Attributions of sociotechnical meanings to NEST therefore also have to be discussed from the perspective of responsibility because they can have consequences of their own accord7.

But to be able to do this – and to come full circle – we have to understand how meanings are created, communicated and attributed. We have to understand what is meant by these sociotechnical meanings and which associations they permit. In other words, we need a hermeneutic view of the production, attribution, dissemination and deliberation of sociotechnical meaning in the hermeneutic circle and, in particular, of its inception (Figure 1.1). In doing so, we return to the fifth observation at the beginning of this book (section 1.1).

The following consideration illustrates this. In the customary self-descriptions of RRI as well as of technology assessment, it is said over and over again that the opportunities and risks of NEST have to be recognized as early as possible and made the object of reflection in order to be able to use the options for shaping developments. The purpose is to make the positive expectations come true and to minimize or avoid the negative ones [OWE 13a]. Such statements assume that it is already clear what the opportunities and risks of certain NEST are. But how do such determinations and attributions, including the necessary evaluations, come into being and what do they depend on? According to the first observation in this book (section 1.1), the attribution of sociotechnical meaning is decisive in determining this. The prime objective of the approach suggested in this book is to clarify these processes in their initial stage (see Figure 1.1) because decisions are made at this stage that significantly mold the later debates and that sometimes can hardly be corrected because of the path dependencies that arise.

This extension of the question as to responsibility does not make the view of the future consequences of NEST obsolete. The imperative continues to be to gain an idea of the possible long-term effects of today’s research and innovation [JON 84] and to reflect on the results according to the standards of responsibility [LEN 07], inasmuch as this does not fall victim to the epistemological nirvana mentioned previously. This traditional mode of the RRI debates continues to be important, but I put a further mode alongside it: directing our view to the source of the social environment of NEST, which is a prerequisite for new technology becoming socially interesting in the first place. This is where the technological futures and characterizations of new NEST fields create sociotechnical meaning.

2.6. Concluding remarks

The suggested extension of the object of the debates over responsibility with regard to NEST is a further upstream movement of the reflection on technology. At issue are not the early stages of development of NEST generally but the very first steps in which new technology becomes (or is made) the object of ethical and social debates: the creation and communication of sociotechnical meaning. I dare to claim that in doing this, we have reached the source of the innovation stream, namely, the point at which scientific and technological invention is first linked with the ideas of implementation and innovation (see Figure 1.1).

Socially relevant attributions of meaning to scientific-technological advances and inventions are, viewed methodologically, starting points. It is in these beginnings that the first facts for further communication and deliberation are created. By bringing together deliberations about the future with technological research and development, the latter are placed in a social framework of meaning that unfolds a dynamic of its own. This process can reinforce itself and, for example, lead to the initiation of funding for research. In this way, it can have real and very substantial consequences for the agenda and research process of the sciences. Alternatively, it can either raise doubts about the framework of meaning that was initially chosen or turn the latter into its opposite, leading to social resistance. The history of nuclear energy in Germany is an example of the latter [RAD 13], while the history of nanotechnology at the end of the 20th century is an example of the former, which led to the establishment of the highly endowed National Nanotechnology Initiative [NNI 99].

Attributions of meaning take place in the medium of language, independent of whether they take the form of technology futures or of definitions and characterizations. Nonverbal tools such as images, diagrams, works of art and films can play a role. Their use and the reconstruction of their consequences must ultimately, however, be carried out in the medium of language. Discussing responsibility for the creation and attribution of meaning thus means speaking about responsibility in the use of the verbal as well as the nonverbal tools employed:

“If the future depends on the way it is anticipated and this anticipation is made public, every determination of the future must take into account the causal consequences of the language that is being used to describe the future and how this language is being received by the general public, how it contributes to shaping public opinion, and how it influences the decision-makers” [DUP 04, p. 11].

RRI must thus also include reflection on its own tools, such as using linguistic criticism to examine the meaning of the futures and characterizations that it employs. This challenge opens our view to the wide spectrum of sciences that employ hermeneutics, whose concepts and methods are needed here (see Chapter 9).
