What does it mean to be a responsible nanoscientist?

This is the pre-edited version of an article first published in Nature Nanotechnology 4, 336 (June 2009). The published version can be found here (subscription required).

What does it mean to be a responsible nanoscientist? In 2008, the European Commission recommended a code of conduct for responsible nanosciences and nanotechnologies research (PDF). This is one of a growing number of codes of conduct being proposed for nanotechnology. Unlike other codes, such as the Responsible Nanocode, which are focused more on business and commerce, the EU code is aimed squarely at the academic research enterprise. In attempting this, it raises some interesting questions about the degree to which individual scientists are answerable for the consequences of their research, even when those consequences are ones they did not, and possibly could not, foresee.

The general goals of the EU code are commendable – it aims to encourage dialogue between everybody involved in and affected by the research enterprise, from researchers in academia and industry, through policy makers, to NGOs and the general public, and it seeks to make sure that nanotechnology research leads to sustainable economic and social benefits. There’s an important question, though, about how the responsibility for achieving this desirable state of affairs is distributed between the different people and groups involved.

One can, for example, imagine many scientists who might be alarmed at the statement in the code that “researchers and research organisations should remain accountable for the social, environmental and human health impacts that their N&N research may impose on present and future generations.” Many scientists have come to subscribe to the idea of a division of moral labour – they do the basic research, which, in the absence of direct application, remains free of moral implications, and the technologists and industrialists take responsibility for the consequences of applying that science, whether those are positive or negative. One could argue that this division of labour has begun to blur, as the distinction between pure and applied science becomes harder to make. Some scientists are themselves happy to embrace this blurring – after all, they are happy to take credit for the positive impact of past scientific advances, and to cite the big impacts that might hypothetically flow from their results.

Nonetheless, it is going to be difficult to convince many that the concept of accountability is fair or meaningful when applied to the downstream implications of scientific research, when those implications are likely to be very difficult to predict at an early stage. The scientists who make an original discovery may well not have a great deal of influence on the way it is commercialised. If there are adverse environmental or health impacts of some discovery of nanoscience, the primary responsibility must surely lie with those directly responsible for creating the conditions in which people or ecosystems were exposed to the hazard, rather than with the original discoverers. Perhaps it would be more helpful to think about the responsibilities of researchers in terms of a moral obligation to be reflective about possible consequences, to consider different viewpoints, and to warn about possible concerns.

A consideration of the potential consequences of one’s research is one possible approach to proceeding in an ethical way. But the uncertainty that necessarily surrounds any predictions about the way research may end up being applied at a future date, and the lack of agency and influence over those applications that researchers often feel, can limit the usefulness of this approach. Another recently issued code – the UK government’s Universal Ethical Code for Scientists (PDF) – takes a different starting point, with one general principle – “ensure that your work is lawful and justified” – and one injunction to “minimise and justify any adverse effect your work may have on people, animals and the natural environment”.

A reference to what is lawful has the benefit of clarity, and it provides a connection, through the traditional mechanisms of democratic accountability, with some expression of the will of society at large. But the law is always likely to be slow to catch up with the new possibilities suggested by new technology, and many would strongly disagree with the principle that what is legal is necessarily ethical. As far as the test of what is “justified” is concerned, one has to ask: who is to judge this?

One controversial research area that probably would pass the test that research should be “lawful and justified” is the application of nanotechnology to defence. Developing a new nanotechnology-based weapons system would clearly contravene the EU code’s injunction that researchers “should not harm or create a biological, physical or moral threat to people”. Researchers working in a government research organisation with this aim might find reassurance for any moral qualms in the thought that it was the job of the normal processes of democratic oversight to ensure that their work did pass the tests of lawfulness and justifiability. But this won’t satisfy those people who are sceptical about the ability of institutions – whether they are in government or in the private sector – to manage the inevitably uncertain consequences of new technology.

The question we return to, then, is how responsibility is divided between the individuals who do science and the organisations, institutions and social structures in which science is done. There’s a danger that codes of ethics focus too much on the individual scientist, at a time when many scientists are often rather powerless, with research priorities increasingly being set from outside, and with the development and application of their research out of their hands. In this environment, too much emphasis on individual accountability could prove alienating, and could divert us from efforts to make the institutions in which science and technology are developed more responsible. Scientists shouldn’t underestimate their collective importance and influence, even if individually they feel rather impotent. Part of the responsibility of a scientist should be to reflect on how one would justify one’s work, and on how people with different points of view might react to it; scientists who do this will be in a good position to have a positive influence on the institutions they interact with – funding agencies, for example. But we still need to think more generally about how to build responsible institutions for developing science and technology, as well as responsible nanoscientists.

2 thoughts on “What does it mean to be a responsible nanoscientist?”

  1. Divisions of labour
    The division of moral labour described — which, I agree, many scientists seem to subscribe to — is deeply problematic. The idea that basic research is somehow morally insulated ignores the multiple ways in which basic research impacts on society, not just in its potential application to new products but also in its contributions to knowledge — to our understandings of humanity, nature, and human agency. It also implies that basic research is conducted in a moral vacuum, ignoring the fact that all research is embedded in social contexts in which moral and political values, interests and judgements influence decisions and actions in multiple ways. These are important insights from social studies of science. There is clearly an imperative for scientists and science organisations to take more responsibility for their work. But are they equipped to take this responsibility?

    Investigations of the moral and societal implications of science and technology rely on an understanding of scientific developments, but they also require social science expertise to explore current and future implications and options, and input from a wide range of people and perspectives because of the value-laden and contested nature of these implications and options. Models for such investigations exist in technology assessment, foresighting and related activities. It is clearly beyond the capacity of scientists to conduct these investigations because a) they require a range of research and communication skills that scientists are not trained for; b) they require a broad and critical perspective that sits uncomfortably with the focussed and enthusiastic perspective that scientists generally have towards their science; and c) they require time and resources which active scientists rarely have, and they potentially create conflicts of interest, for individuals and within organisations.

    Whether such activities are conducted within research organisations, funding agencies or independent bodies is a matter for debate, but establishing capacity to assess the implications and directions of research, especially new science, is critical for responsible innovation. In this sense, it is not enough for those who work within scientific enterprises, whether they be bench scientists, managers, funders or industrialists, to ‘act responsibly’; the complexity of responsible innovation requires science and technology assessment capacity and integration of this capacity into decision making about science and technology at all levels.

    It is the requirement for integration that is one of the biggest challenges for science and technology assessment: how to close the loop and feed information about implications back into decision making about science and technology. And perhaps here we come full circle in relation to responsibility. If scientists, and all those involved in bringing science and technology into society, are held responsible for their work, perhaps they will recognise the limits to their capacity to take that responsibility and take seriously the need for a parallel effort to consider implications and consequences.

  2. Wendy, I agree with everything you say so eloquently. I perhaps didn’t make it clear enough that the idea of the “moral division of labour” (a phrase, incidentally, that I suspect I’ve lifted from Arie Rip) is one that I believe many scientists hold, but which, like you, I don’t personally believe is tenable.
