Special Topic: Technology Assessment and Industry

Ethics, Technology Assessment and Industry

by Ibo van de Poel, Delft University of Technology

Technological development is not morally neutral. A number of ethical issues may emerge during the design and development of (new) technologies. The implications for the responsibility of engineers and corporations are discussed, and possibilities for Technology Assessment (TA) that may help to fulfil these responsibilities are explored. In particular, the possibilities and limitations of two activities that corporations may undertake are discussed: formulating and enforcing codes of conduct, and involving relevant stakeholders in technological development and design.

Industry plays a major role in the design and development of (new) technologies. Such technologies bring all kinds of goods and desirable effects to society, but they also bring risks and undesirable effects. New technologies like ICT and biotechnology, for example, have confronted society with a range of new ethical and social issues and questions.

In this paper, it is argued that many aspects of - and decisions in - engineering design and development are potentially ethically relevant. This means that engineers and the corporations employing them are confronted with a range of ethical and social issues in engineering design and development. This paper discusses how engineers and corporations could deal responsibly with these issues, using some ideas from the field of engineering ethics (Davis 1998, Harris et al. 1995, Martin & Schinzinger 1996, Unger 1994). The possibilities for Technology Assessment (TA) activities to contribute to dealing responsibly with moral and social issues in technology will be explored. I conceive of TA as the set of activities, studies, tools and methods that are committed to reducing "the human costs of trial and error learning in society's handling of new technologies, and to do so by anticipating potential impacts and feeding these insights back into decision making, and into actors' strategies" (Schot & Rip 1996, 251). This includes not only early warning activities and TA studies to assess the possibilities, limitations and effects of (new) technologies, but also interactive or constructive activities with the aim of broadening the design of new technologies - in terms of design criteria and actors involved - and the redesign of old ones (Smits & Leyten 1991, Schot & Rip 1996).

The paper starts with an elaboration of the non-neutrality of technology and technological development and an exploration of possible moral issues in engineering design and development. The next two sections elaborate on the responsibility of engineers and corporations, respectively, and discuss possibilities for TA. The following two sections focus on two activities that corporations may undertake to give shape to their responsibilities: formulating and enforcing codes of conduct and involving relevant stakeholder groups and "outsiders" in technological development and design. For both, possibilities and limitations are discussed. Finally, some conclusions and implications for TA in industry are drawn.

The non-neutrality of technology

The idea that technology is a means to an end is still popular. This so-called instrumental vision of technology implies that the choice of technological means is a morally neutral affair, because it is determined by the goals that have been formulated for a technology.

There are at least three reasons why the instrumental vision is not satisfactory (cf. Rapp 1981). The first is that in actual practice the formulation of the goals to be met by a technology is not completely separated from the development and choice of technological means to meet those goals. Sometimes, for example, technologies are developed without clear goals in mind or lead to the establishment of goals after they have been realised. An example of the latter is aspirin, which, according to Volti (1992), was not a response to the need to suppress fever but rather created that need.

A second argument against the instrumental vision is that given an end, the choice of means for that end is not neutral. The reason for this is that there are usually alternative ways to achieve an end or solve a technological problem. Usually these alternatives not only differ with respect to how effectively and efficiently they meet the formulated end, but also with respect to, for example, environmental and social effects. This means that the choice of a technological means to meet a given end is not morally neutral.

Third, technologies usually do more than realise their intended goal. They have all kinds of effects, desirable and undesirable, beyond the goals for which they have been intended. Sometimes, undesirable effects and their chance of occurrence are known beforehand, so that they can be formulated in terms of risks. Often, not all effects are or can be known beforehand, for example because they only occur on longer time-scales and partly depend on social developments. The social effects of contraceptives are a case in point.

These three arguments show that the design and development of (new) technologies is not a neutral affair. Elsewhere, I have argued that a further analysis of the value-loadedness of technology results in at least five moments or types of decision in engineering design processes that are potentially morally relevant (Van de Poel 2000a): 

TA activities can contribute to dealing responsibly with these five issues in at least two ways: 

Such TA activities may also be carried out by industry. Before I discuss the possibilities and limitations of such activities in more detail, I will first elaborate on what the non-neutrality of technology implies for the responsibility of engineers and companies.

The responsibility of engineers

What does the value-loadedness of technological development and engineering design imply for the responsibility of engineers? It might be argued that the fact that engineering design and development is not morally neutral does not imply that engineers themselves have to make ethically relevant choices. Engineers are only one of the actors involved in technological development. Maybe it is possible to devise a division of labour in which non-engineers, like politicians, managers or the "public", make the morally relevant choices and engineers only carry out a technological task. There indeed seems to be an important argument for such a division of labour. Many people are confronted with and undergo the consequences of technological development. It seems ethically desirable to involve these people in the relevant engineering decisions, or to ask their "informed consent" (Baum 1983, Martin & Schinzinger 1996, Zandvoort 1998). I believe that this is indeed ethically desirable and that it can also be achieved to a larger degree than at present. However, I think that it will eventually not result in a division of labour in which engineers only carry out a technical, i.e. ethically neutral, task. Such a division of labour seems to me infeasible (Van de Poel, forthcoming). The main reason for this is that ethical and technical issues cannot be completely separated. For example, assessing the trade-offs between design criteria that different design alternatives imply both requires engineering knowledge and is ethically relevant.

If engineers are involved in morally relevant decisions, what exactly are their responsibilities in this respect? There is no consensus on this issue, not even on what basis the responsibility of engineers should be determined or constructed (Brumsen & Van de Poel, forthcoming). Nevertheless, codes of ethics for engineers, as formulated by professional societies, are - especially in the USA - often seen as a useful starting point for discussing the responsibility of engineers. Therefore, I briefly mention some provisions from such codes.

Most US codes of ethics for engineers have a provision stating that "Engineers shall hold paramount the safety, health and welfare of the public in the performance of their professional duties" (a canon of the ABET Code of Ethics). The European Federation of National Engineering Associations, the FEANI - of which the Deutscher Verband Technisch-Wissenschaftlicher Vereine (DTV) is also a member - has a similar provision: "The Engineer shall be conscious of nature, environment, safety and health and work to the benefit and welfare of mankind." These provisions imply that - according to their own professional organisations - engineers have a responsibility for issues like safety, health, human welfare and the natural environment. However, what exactly this responsibility implies remains rather vague. How should, for example, trade-offs between safety and costs be made? Or are such trade-offs unacceptable anyway?

Another provision that can be found in codes of ethics for engineers is that engineers have the right or duty to inform the public on certain issues. The FEANI code, for example, states that: "The Engineer shall provide the general public with clear information, only in his field of competence, to enable a proper understanding of technical matters of public interest." The code of the IEEE (the Institute of Electrical and Electronics Engineers) is somewhat more explicit: "We, the members of the IEEE, ... agree ... to disclose promptly factors that might endanger the public or the environment." This implies a duty on the part of engineers to inform the public of risks or possible negative effects of technologies. Such a duty, however, may conflict with the loyalty of an engineer to his company, because company management may disagree with, or even forbid, disclosing sensitive or internal information. This loyalty to the company is not only stressed in some codes of ethics for engineers but is also, in many countries, buttressed by labour laws that give companies the right to forbid employees, including engineers, to disclose certain information [1]. This shows that there may be tensions between - on the one hand - the duties of engineers as formulated in professional codes of ethics and - on the other hand - their rights as employees.

What can companies do?

What does the above imply for the responsibility of companies with respect to the ethical aspects of technological design and development? According to some authors, the main responsibility of business is to make profit within the limits of the law (Friedman 1962, 133-136). This position seems to presuppose a division of labour in which it is the responsibility of the government to create laws that make the profit-seeking activities of industry more or less automatically ethically acceptable. While it is of course desirable to create such laws, I think that such a division is eventually unattainable. The reasons are comparable to the reasons why it is infeasible to devise a division of labour in which engineers only carry out a morally neutral task. First, current laws are not adequate for making the profit-seeking activities of companies automatically ethically acceptable. An indication of this is the conflict between the responsibilities and duties engineers have according to professional codes of ethics and the rights and duties they have towards their employer according to labour laws in many countries (Zandvoort 1998, Unger 1994). But even if certain inadequacies in current laws were repaired, there is the more fundamental point that technological innovation can create new options for action and new ethical problems that cannot be fully or adequately foreseen in current laws. So it seems inevitable that laws - at least in some cases - lag behind technological developments [2]. Moreover, just as technical and ethical aspects of design cannot be fully separated, the commercial aspects of technological development in companies cannot be fully separated from ethical considerations. For all these reasons, companies do have a certain social responsibility for technological development and design.

What is this social responsibility then exactly? I do not have a definitive answer to this question, but I think that on the basis of the discussion above three realms of responsibility for corporations can be distinguished: 

Below, I will discuss two concrete types of activities that can be - and are now - carried out by companies and that can be used to put into effect the points mentioned above. These activities are the formulation and enforcement of corporate codes of conduct and the involvement of stakeholders and relevant parts of the public in technological design and development.

Corporate codes of conduct

Corporate codes of conduct are voluntary commitments made by individual companies or associations of companies, setting out certain values, standards and principles for the conduct of corporations. In the last few years, an increasing number of companies have formulated a corporate code of conduct. According to a survey, 38 of the 100 largest companies in the Netherlands had a code of conduct in 1998; in six cases, a code was in the process of development (Kaptein et al. 1998).

Typical topics dealt with in corporate codes of conduct are: fair business practice, responsibility to shareholders, employee rights and duties, responsibility for safety, health and the environment, responsibility to society, consumers and stakeholders, and human rights. Of these issues, it is especially the responsibility for safety, health and the environment that is specifically related to technological development and design. According to the above-mentioned survey among the 100 largest Dutch companies, environmental issues are mentioned in 74 % of the codes, safety in 47 % and health in 32 % (Kaptein et al. 1998). A 1991 questionnaire among 200 US, Canadian and European companies with a code of conduct shows the following topics in codes that are (primarily) related to ethical issues in technological development and design: workplace safety (40-50 % of the codes), environmental responsibility (40-50 %), product safety (30-40 %) and technological innovation (20 %) (ILO, no date).

Corporate codes of conduct offer a number of opportunities to further the responsible handling of the moral aspects of engineering design and development, but they also have limitations. One important function of codes is to commit companies, and their employees, to certain standards, values and norms. This commitment is, however, usually voluntary. Codes usually do not have legal backing, and in many cases there is no external independent monitoring of the codes (ILO, no date). In some cases, even internal enforcement or monitoring of the code is missing. In such circumstances, corporate codes may quickly become a dead letter, especially because formulations in codes are often rather vague and abstract. Nevertheless, the sheer existence of codes offers stakeholders, the public and consumers the opportunity to confront company officials with their own code of conduct. When people in the company know they may be held accountable for their code, this knowledge may change their behaviour. Employees may also use codes to remind their employers or managers of certain values and standards to which the company has committed itself. However, a KPMG survey among 251 Canadian companies in 1996 showed that 78 % of the responding companies lacked a formal policy "to protect employees that report ethics violations or non-compliance with the law or with company policies" (ILO, no date). A lack of internal procedures, moreover, brings the danger that codes are applied to employees in a rather arbitrary way.

A more general problem of corporate codes is that only a limited number of people are involved in their writing (cf. ILO, no date). As a result, people who are possibly affected by the actions of corporations - and their interests and values - may be underrepresented in corporate codes of conduct. Consequently, corporate codes may reflect the company's own interests and values rather than more general ethical and social values and norms.

The above arguments do not make codes valueless. They rather indicate that a number of requirements should be met for codes to be valuable. These requirements include: 

What can corporate codes of conduct mean in concreto for dealing with ethical issues in engineering design and development? Many codes of conduct do not deal explicitly with such issues. A noteworthy exception is the "responsible care" initiative of the chemical industry. This initiative started in Canada in the 1980s but has since been adopted by the chemical industry in many countries. In 1995, the German Association of the Chemical Industry (VCI) adopted a number of guiding principles for responsible care. These principles include:

Although some of the formulations are rather vague, this code clearly states a commitment to safety, health and the environment and a commitment to openness and informing the public and the government. As we have seen, these elements are also part of engineering codes of ethics. One thing that may be added to this and other corporate codes of conduct is a commitment to professional codes for engineers and the establishment of structures and procedures that allow engineers in the company to follow these codes.

The responsible care code also refers to extending knowledge of adverse effects. For this, TA studies may be relevant. More generally, corporate codes of conduct could imply a commitment to relevant TA activities - like the ones I have mentioned before. In addition, codes might commit companies to involving stakeholders in certain issues. The possibilities for involvement of stakeholders will be discussed in the next section.

Involvement of stakeholders and outsiders

While many people and groups may undergo the (adverse) effects of (new) technologies, only a limited number of people or groups are involved in their shaping. On democratic and ethical grounds, it seems desirable to involve those people who may undergo the effects of (new) technologies in relevant decisions about technology, or to ask their informed consent (Baum 1983, Martin & Schinzinger 1996, Zandvoort 1998, Sclove 1995). A practical way for companies to start doing this is to discuss relevant issues with stakeholders. Stakeholders are here defined as people or groups who have a stake in the development of a technology, for example because they may undergo the consequences of that technology or have to work with it. Examples of stakeholders are consumer groups, citizen groups, environmental groups, representatives of Third World countries, et cetera.

Some companies have indeed begun to involve stakeholders, especially in areas like biotechnology, where there is much protest and social unrest about new technologies. A main reason for companies to do this seems to be that they believe it will facilitate the acceptance of innovations.

While involving stakeholders seems in general a desirable thing to do, there are at least two drawbacks or limitations. One concerns the actual influence of stakeholders. Companies may simply involve stakeholders as a kind of PR initiative instead of listening to them or actually involving them in the process of technological development. A second, and probably more fundamental, problem has to do with the representation of the interests, values and norms of the public at large. There is no guarantee that the stakeholders that actually get involved are representative of the population at large or of those people - including future generations - who will undergo the effects of new technologies. Some people or citizens may, for example, lack the resources to make themselves visible or powerful enough to get involved as stakeholders. There is also the danger that companies only talk to those stakeholders that suit their interests. For such reasons, one might argue that democratically elected governments and other democratic institutions are better suited than stakeholder groups for dealing adequately with a number of ethical and social issues in technological development. The question, however, is whether government and other existing democratic institutions are fit to deal adequately with all ethical and social issues in engineering. Governments can, for example, probably play an important role in keeping known risks of an existing technology within acceptable limits. It is, however, more questionable whether they are also well suited for dealing adequately with new ethical issues that arise due to new technologies or for furthering the development of technological alternatives that are more desirable from a social or ethical point of view. In such cases, involving stakeholders can have an added value to existing forms of democratic control of technological development. Moreover, it is something that companies themselves can do, in addition to obeying existing laws, in order to deal responsibly with ethical issues in engineering and technology.

Involving stakeholders may - apart from considerations of democratisation of technological development - have an important instrumental value in "improving" the process of technological development. Elsewhere (Van de Poel 2000b), I have argued that so-called "outsiders" may in certain circumstances play an important role in technological development and that their involvement may sometimes improve the process of technological development. Outsiders are people or groups that are not (yet) actually involved in technological development or decision-making about technology. Often they do not share the existing rules of technological development or the existing normative framework (Grunwald 2000) surrounding the design and development of a technology. Three groups of such outsiders that may also have an actual influence on technological development can be distinguished: societal pressure groups, engineering and scientific professional outsiders and outsider firms (Van de Poel 2000b).

There are three reasons why the involvement of initial outsiders can help to improve the process of technological development. One is that the involvement of such groups may lead to the anticipation of possible (negative) consequences of new technologies, and so have an early warning and awareness function with respect to potential ethical and social issues in the development of a technology. This is not only the case because such outsiders may articulate possible adverse - or positive - effects, but also because their activities may force companies to be more explicit about possible consequences or to initiate research on possible consequences. A second way in which the involvement of initial outsiders may improve the process of technological development is by stimulating the development of technological alternatives that are potentially (but not necessarily) more desirable from a societal or ethical point of view. In some cases, outsiders may even play an active role in the development of such alternatives (Van de Poel 2000b). Finally, the involvement of outsiders may contribute to what proponents of Constructive Technology Assessment (CTA) have called "second-order learning" (Schot & Rip 1996, 257). This is learning not only about how to achieve certain given goals or values, but also about what goals or values should be striven for. Since outsiders do not necessarily share the normative framework of those involved in technological development, their involvement may launch a debate about what the relevant goals, values and norms are in the development of a technology.

The above shows that involving initial outsiders can potentially improve technological development. This does not mean, however, that the involvement of initial outsiders will always lead to desirable results. In some cases, for example, it may lead to controversies about a (new) technology. While such controversies are sometimes productive as a form of informal TA (Rip 1987, Cambrosio & Limoges 1991), because they help to articulate possible consequences of technologies and relevant norms and values, they may also result in trench warfare between proponents and opponents of a technology. In such cases, useful discussions about the ethical and social aspects may become virtually impossible.

Involving stakeholders and outsiders is something that companies can do as a deliberate strategy only to a limited extent. Ensuring that the involvement of outsiders will actually improve the process of technological development is even more difficult to achieve deliberately. What companies - and the other actors involved in technological development - can nevertheless do is be aware of the potential role of outsiders, with its benefits and drawbacks, in "improving" technical development, and try to use the possibilities at hand as well as possible.

Conclusions and implications for TA in industry

At first sight, there may seem to be limited scope for the moral responsibility of industry in dealing with ethical and social issues in engineering design and development, and for TA activities contributing to that goal. Isn't technical development merely a morally neutral affair? And shouldn't any remaining moral and social issues be solved by governments or by the formulation of adequate laws? In this paper, I have argued that neither argument holds. It has been argued that technology is not morally neutral, and a number of moments or issues in engineering design and development that are - at least potentially - ethically relevant have been identified. While industrial firms are not the only actors in technical development and other actors have a responsibility as well, arguments have been given why firms have a social responsibility of their own. I have also elaborated on what this responsibility could, or should, imply and which TA activities can play a role in fulfilling it. More specifically, two activities have been identified that companies can undertake: formulating and enforcing corporate codes of conduct and involving stakeholders and "outsiders" in technical development. Both activities have their drawbacks and limitations. They should therefore be implemented carefully and thoughtfully and are not a simple recipe for responsible corporate behaviour. Moreover, they are not the only things companies can, or should, do. Nevertheless, they provide examples of what companies could do.

It has been shown that a number of TA activities are relevant in industry. These include early warning activities and TA studies of new technologies, but also interactive and constructive TA activities. One important implication of the arguments in this article is in fact that TA in industry should not be confined to the more formal activities and studies, but should also include more interactive and constructive activities, like Constructive Technology Assessment (CTA), and what may be called de facto (C)TA activities, i.e. activities that may not be formally or deliberately organised as (C)TA activities but nevertheless contribute to the overall goals of TA. The involvement of stakeholders and outsiders is an example of this.

Another implication for TA in industry - and probably also for the field of TA more generally - would be the inclusion of ethical, or more generally normative, aspects in TA studies and activities. Traditionally, TA practitioners have had an inclination to stress the objectivity and neutrality of TA. In interactive or constructive variants of TA, more attention has been paid to normative issues, but here, too, substantive ethical or normative issues in engineering and technological development are hardly addressed systematically. To deal adequately with a number of social and ethical issues in engineering design and technological development, an integration of TA aspects and ethical and normative aspects seems desirable.

Notes 

[1] While whistle blowing is certainly an issue in some countries and attempts have been made to protect whistle blowers legally, this still turns out to be very hard to realise effectively in practice.

[2] Stricter liability laws may, to some extent, solve the problem of laws lagging behind technological development. See Zandvoort (2000).

Acknowledgement

A small portion of this article is drawn from Van de Poel (2000a).

References

ABET / Accreditation Board for Engineering and Technology, 1977: Code of Ethics of Engineers. http://www.abet.org/ethics.html

Akrich, M., 1992: The Description of Technical Objects. In: W. Bijker and J. Law (Eds.): Shaping Technology/Building Society: Studies in Sociotechnical Change. Cambridge, USA: MIT Press, pp. 205-224.

Baum, R., 1983: The Limits of Professional Responsibility. In: J.H. Schaub, K. Pavlovic, M.D. Morris (Eds.): Engineering Professionalism and Ethics. New York etc.: John Wiley & Sons, pp. 287-294.

Brumsen, M.; Poel, I.R. van de, forthcoming: Towards a Research Programme for Ethics and Technology. Science and Engineering Ethics, Vol. 7, No. 3.

Cambrosio, A.; Limoges, C., 1991: Controversies as Governing Process in Technology Assessment. Technology Analysis & Strategic Management, Vol. 3, No. 4, pp. 377-396.

Davis, M., 1998: Thinking Like an Engineer. Studies in the Ethics of a Profession. New York and Oxford: Oxford University Press.

FEANI / European Federation of National Engineering Associations: FEANI Code of Conduct. http://www.feani.org

Friedman, M., 1962: Capitalism and Freedom. Chicago: University of Chicago Press.

Grahe, F.G., 1997: Responsible Care in German Chemical Industry. Dainippon ink and chemicals. http://www.dic.co.jp/tech/rc0302/index-e.html

Grunwald, A., 2000: Against Over-estimating the Role of Ethics in Technology Development. Science and Engineering Ethics, Vol. 6, No. 2, pp. 181-196.

Harris, C.E.; Pritchard, M.S.; Rabins, M.J., 1995: Engineering Ethics: Concepts and Cases. Belmont etc.: Wadsworth.

IEEE / Institute of Electrical and Electronics Engineers, 1990: IEEE Code of Ethics. http://www.ieee.org/about/whatis/code.html

International Labour Organization (ILO), no date: Corporate Codes of Conduct. http://www.itcilo.it/english/actrav/telearn/global/ilo/code/main.htm

Kaptein, S.P.; Klamer, H.K.; Linden, J.C.J. ter, 1998: De integere organisatie. Het nut van een bedrijfscode. Den Haag: Vereniging NCW.

Martin, M.W.; Schinzinger, R., 1996: Ethics in Engineering. New York: McGraw-Hill. Third edition.

Poel, I.R. van de, 2000a: Ethics and Engineering Design. In: University as a Bridge from Technology to Society. Proceedings of the IEEE International Symposium on Technology and Society, 6-8 September 2000, "La Sapienza" University of Rome. Rome: IEEE, pp. 187-192.

Poel, I.R. van de, 2000b: On the Role of Outsiders in Technical Development. Technology Analysis & Strategic Management, Vol. 12, No. 3, pp. 383-397.

Poel, I.R. van de, forthcoming: Investigating Ethical Issues in Engineering Design. Science and Engineering Ethics, vol. 7, no. 3.

Rapp, F., 1981: Analytical Philosophy of Technology. Dordrecht: Reidel.

Rip, A., 1987: Controversies as Informal Technology Assessment. Knowledge: Creation, Diffusion, Utilization, Vol. 8, No. 2, pp. 349-371.

Schot, J.; Rip, A., 1996: The Past and Future of Constructive Technology Assessment. Technological Forecasting and Social Change, Vol. 54, pp. 251-268.

Sclove, R.E., 1995: Democracy and Technology. New York: The Guilford Press.

Smits, R.; Leyten, J., 1991: Technology Assessment. Waakhond of Speurhond? Zeist.

Unger, S.H., 1994: Controlling Technology: Ethics and the Responsible Engineer. New York: John Wiley. Second edition.

US Council of International Business, 2000: Corporate Codes of Conduct: Overview and Summary of Initiatives. http://www.uscib.org/

Volti, R., 1992: Society and Technological Change. New York: St. Martin's Press.

Zandvoort, H., 1998: Codes of Conduct and the Law. In: P. Kampits; K. Kokai; A. Weiberg (Eds.): Applied Ethics: Papers of the 21st International Wittgenstein Symposium, Kirchberg am Wechsel, 16-8-1998. Kirchberg am Wechsel: IWS, pp. 304-309.

Zandvoort, H., 2000: Controlling Technology Through Law: The Role of Liability. In: D. Brand; J. Cernetic (Eds.): 7th IFAC Symposium on Automated Systems Based on Human Skill. Düsseldorf: VDI/VDE-Gesellschaft Mess- und Automatisierungstechnik, pp. 247-250.

Contact

Dr. ir. Ibo van de Poel
Department of Philosophy, School of Technology,
Policy and Management, Delft University of Technology
PO Box 5015, 2600 GA Delft, The Netherlands
Tel.: +31 15 2784716
E-mail: i.r.vandepoel@tbm.tudelft.nl