After Dolly

Humility is the best stance when facing the unanticipated consequences of new technologies

By Tim Healy

The successful cloning of the lamb Dolly in February set off a spate of anxious questions. Many of them concerned the ethics of cloning, but another set asked about the unanticipated consequences of this technology.
Each of these questions speaks to the uncertainty inherent in the development of new technologies. And the true answer to each is: we don't know. Unanticipated consequences are a feature of change, both technological and natural. Acknowledging our limited ability to foresee repercussions is a first step in dealing with them.

The Revenge Effect

Consider automobile power door locks, which are meant to make drivers more secure. But they have helped triple or quadruple the number of drivers locked out over the last two decades, costing $400 million a year and exposing drivers to the very criminals the locks were supposed to defeat.

Another analysis is offered by Dietrich Dörner, who identifies four features of systems that make full understanding impossible: complexity, dynamics, intransparence, and ignorance and mistaken hypotheses.

Complexity refers to the many different components in a real system and the interconnections among them. When we model a system, trying to predict what will occur, we necessarily neglect many components and, even more so, their interrelations. But it is from such interrelations that the unanticipated may arise.

Many systems exhibit dynamics; that is, they change their state spontaneously, independent of control by a central agent. One of the most fascinating examples is the Internet, an extraordinarily dynamic system with no one in charge. There is no way to model the Internet to predict its impact.

Intransparence means that some elements of the system cannot be seen but can, nevertheless, affect its operation. Looking again at the Internet, contributors to intransparence might include all of the users at a particular time, equipment failures at user sites, and local phenomena such as weather. These intransparent factors make it hard to foresee how the system will operate.

Finally, ignorance and mistaken hypotheses can keep us from predicting consequences accurately.

Reducing Uncertainty
Moral Repercussions

Although there are no simple answers to this problem, some general ethical principles can guide us as we make decisions about the use of new technologies.

We should take advantage of opportunities to reduce uncertainty. Although we are not obliged to exhaust our resources (the cost could easily outweigh the good to be gained), to the extent that uncertainty can be diminished at reasonable cost, it seems morally prudent to do so.

People should share equally in the benefits of an action or a project, and they should also share equally in the risks due to unanticipated consequences. This is, of course, an ideal, since we cannot usually ensure such an equal distribution of benefits and risks.

People who do not share in the benefits of an action should not, as a rule, be subject to its costs and risks. Justice suggests that burdens should not be borne by those who cannot benefit; but this, too, is an ideal, and it has limitations. For example, it would forbid the building of a coal-burning power plant on the grounds that emissions from the plant could affect the environment of the entire globe, including that of some individuals who could not expect to benefit from the electric power.

People who gain some benefit from an action should be able to choose their level of cost and risk. This idea follows from the fundamental ethical principle that everyone must be treated as a free and rational person capable of making his or her own decisions. Of course, joint projects do not always allow this principle to predominate. For example, a community may decide to initiate a flood-control project with consequences that interfere with the free choice of individual community members.

Projects affecting more than one individual should provide the greatest balance of benefits over harms for all involved. This utilitarian principle gives us a way to approach public projects, such as flood control or seismic retrofitting.
On this basis, we might decide to go ahead with the project, though our responsibility to reduce uncertainty would be greater because of the potential threat to individual rights.

We ought to recognize that a resource has greater value to a poor person than to a rich one. If we give $10 to a poor man, we improve his life much more than if we give the same amount to a rich man. Since the principle of justice suggests our first thoughts should be for those who have the least in our society, we must consider the disparate impacts of technological advances on rich and poor.

We ought to recognize that the consequences of an action may be long-term. Our actions and their consequences are not necessarily limited to the here and now. In fact, their effects may cover great distances, perhaps the entire earth, and may extend for years, decades, or even centuries. We are obliged to take these factors into consideration.

Decisions about technology should acknowledge the complexity of life. Implicit in this principle is the requirement to speak with humility about the consequences of our actions, to refine and improve our positions, and to act with a clear understanding that we do not "own the truth." In a brief but beautiful autobiographical essay, economist Kenneth Arrow writes that "most individuals underestimate the uncertainty of the world." As a result, we believe too easily in the clarity of our own interpretations. Arrow calls for greater humility in the face of uncertainty and finds in it a moral obligation as well.
Tim Healy, the Thomas J. Bannan Professor of Electrical Engineering at SCU, is coordinator of the Ethics and Technology Program at the Markkula Center for Applied Ethics.

Further Reading

Dörner, Dietrich. The Logic of Failure: Why Things Go Wrong and What We Can Do to Make Them Right. New York: Metropolitan Books, 1989.

Knight, Frank. Risk, Uncertainty, and Profit. Boston: Houghton Mifflin Co., 1921.

Merton, Robert. "The Unanticipated Consequences of Purposive Social Action." American Sociological Review 1 (December 1936): 894–904.

Tenner, Edward. Why Things Bite Back: Technology and the Revenge of Unintended Consequences. New York: Knopf, 1996.
Issues in Ethics - V. 8, N. 3, Summer 1997