Transcript, Meeting 15 Session 3


December 18, 2013


Washington, D.C.


Nikolas Rose, Ph.D.
Member, Human Brain Project Social and Ethical Division Steering Committee
Professor of Sociology
Head of Department of Social Science, Health and Medicine
King’s College London
Jonathan Montgomery, LL.M.
Chair, Nuffield Council on Bioethics
Professor of Health Care Law
University College London
Stefano Semplici, Ph.D.
Chairperson, International Bioethics Committee
United Nations Educational, Scientific, and Cultural Organization
Professor of Social Ethics
University of Rome Tor Vergata



DR. WAGNER: You know, we're waiting for Dan. But I think since, what I am essentially doing in the beginning of this is reading into the record the introductions of our speakers, Dan has seen this in the briefing material. So why don't we go ahead.

This section is one where we'll hear from three speakers about international efforts in neuroscience and ethics.

One of our speakers, Nikolas Rose, is joining remotely -- Dr. Rose, are you hearing us?

DR. ROSE: Yeah, I can hear you. Can you hear me?

DR. WAGNER: Very well, thank you. Very well. In fact, we will start with you.

He -- Dr. Rose will be offering insights from the Human Brain Project, which is the European Future and Emerging Technologies Flagship Project. It seeks to develop methods that will enable a deep understanding of how the human brain operates.

Dr. Rose is a member of the steering committee of the Ethics and Society Program of the Human Brain Project and is responsible for the Foresight Laboratory. He's also a professor of sociology and head of the department of social science, health and medicine at King's College London. Dr. Rose was a member of the Nuffield Council on Bioethics, chair of the European Neuroscience and Society Network, and he has worked in various capacities with the Academy of Medical Sciences and the Wellcome Trust and with the Royal Society, where he is currently a member of the science policy advisory group.

Dr. Rose, you and each of our other two panelists were asked to make a statement within the confines of about ten minutes. So welcome. We're pleased to have you with us.

DR. ROSE: Okay. Well, greetings from my colleagues on the social and ethical program of the Human Brain Project, in particular from our co-chairs, Jean-Pierre Changeux, who is an eminent French neuroscientist and former president of the French National Bioethics Commission, and Kathinka Evers, an ethicist and neuroethicist from Uppsala.

So as you said, Jim, the Future and Emerging Technologies program of the European Union, under their seventh framework program, funded the Human Brain Project to the tune of about one billion Euros over a ten-year period, and we kicked off in October 2013. And I'm going to use my ten minutes just to outline the approach to social and ethical issues that we are adopting within the Human Brain Project.

So what is the Human Brain Project? The simplest and maybe the cliche way of putting the Human Brain Project's objective is that it's to simulate the human brain cell by cell in a neuromorphic supercomputer. But if my colleague there would put my first slide up, you'll see there that the objective is a little bit more complicated. It's to build an ICT, an information and communication technology, infrastructure for future neuroscience, future medicine and future computing, to catalyze a global collaborative effort to understand the human brain and its diseases and ultimately to emulate its computational capacities. So it's a global collaborative effort, its aim is to understand the human brain with a focus on disease, and ultimately to emulate its computational capacities in a supercomputer and perhaps to enhance robotics.

There's one thing that I think should be emphasized about the HBP, and that is that it's not primarily a data generation project. It's not primarily funding basic research. I heard a little bit in your last session about some of that basic research. It's primarily a data integration project. That is to say, it's a project that seeks to collate, to federate and to aggregate data, both clinical data and neuroscience research data, from across the world, and especially across Europe; to extract from that certain basic principles; and to use those principles to simulate or to emulate human brain capacities in a supercomputer, in order to model the nature, the causes and hopefully the treatments of diseases.

One major difference between the HBP and what I understand about the U.S. projects is that ethical issues and social issues have been seen as central to that project really since the start of the discussion about three years ago. And from the beginning of those discussions, it was agreed that a certain proportion of the funding from the HBP, if it was successful, should be devoted to the social and ethical program. About three percent of the funding, and when the first tranche of the funding was announced a few months ago, three percent of that funding did come to the social and ethical program.

And in the proposal that went to the European Commission that was ultimately successful, one of the central themes throughout that whole proposal was the idea of responsible research and innovation.

And if my colleague there can put the next slide on -- I can't see the slides. But I hope that one is headed "Responsible Research and Innovation."

DR. WAGNER: We see it now, yes.

DR. ROSE: Right. This is a slide actually not from the HBP, this is from the website of one of the big research councils, the EPSRC, the Engineering and Physical Sciences Research Council. And it outlines I think very clearly their approach to responsible research and innovation. And unsurprisingly, I'm highlighting this because some of my colleagues working in synthetic biology were involved in these formulations.

So I know in your last session there was some sense that responsible research and innovation, or incorporating ethical issues, might be something that halted, that hampered, that stifled creativity and research. And I feel the interesting thing about this formulation here is that the whole aim of RRI, if I can call it that, is to stimulate creativity and opportunities for science, but to do so in a way that directs them toward socially desirable, public interest ends.

It recognizes quite early in the research that there are questions that innovation raises, dilemmas that innovation raises, that often pose ambiguous questions for those in society and for policy makers. Unless you begin to tease out those implications very early on, major problems can arise in terms of the social acceptability of that research and, indeed, in terms of the direction of that research.

So RRI, if I can call it that, responsible research and innovation, is an attempt to try and create spaces and processes that explore those developments at an early stage in an open and transparent way, involving the funders, involving the public, involving stakeholders and, crucially, involving researchers themselves.

And our view -- and I think it's one that you can see demonstrated in a number of areas in science, is that this doesn't actually make the science weaker, it helps to make the science and the developments more robust.

So if I can just go to the next slide, and just outline very briefly how we're making that practicable within the social and ethical program. There are five streams of the work that we're doing. The first is the Foresight Lab, and that's the area of the work that I'm responsible for overseeing.

Foresight, or anticipatory governance, is an attempt to work with stakeholders, with scientists, with researchers and with policy makers to anticipate the potential developments of a new scientific process or a new set of technological developments; to work out the scenarios of what would happen if certain technical and scientific developments did come to fruition; to think through what their implications might be; and to feed those back into the developing science itself, to help increase the awareness and understanding of the researchers themselves about the implications of their work.

The second main stream is conceptual and philosophical analysis. The Human Brain Project is a simulation and emulation project, emulating or simulating the human brain in a computer. And what does it mean to emulate or simulate a human brain in an artificial environment like a computer? Clearly questions of personhood, of self, of volitional control, of responsibility, of consciousness are all involved there. And the aim of this conceptual and philosophical analysis is to try and think through those implications.

Thirdly, as I've already emphasized, responsible research and innovation is committed to a kind of transparent and open dialogue with stakeholders. There's a view, especially in Europe, that there's public distrust of scientific research and that public distrust of scientific research can hamper the development of that research. In fact, the research evidence is a little bit more complicated than that, but everything suggests that the more open, the more transparent, the more dialogic the researchers are about what they're trying to do, about why they're trying to do it, about what they see as the benefits, about what they see as the risks, about what their hopes are, and about what their aspirations are -- the more that's an open and transparent process, the better it should be.

That leads into the fourth thing, which is researcher awareness, which is trying to build in a kind of reflexive understanding amongst the researchers themselves about what it is that they're doing. And in your last session, which I caught a little bit of on your webcast, I think you said very clearly that that's not trying to get researchers to do something that they're not already doing. The researchers that you had in front of you in the last session showed very clearly that they are committed to thinking through those implications. The same is true of all the young researchers in a massive federated project like the Human Brain Project.

And the fifth aspect is governance and regulation. Of course, a big project like this requires a very robust governance system, and that was a kind of obligation of the project as far as the funders were concerned. All the research that is being done by the researchers within the HBP will, of course, go through their local research ethics committees, but there is going to be a central research ethics committee to give guidance to those researchers. And there's also going to be an independent ethical, legal and social aspects committee, which will work with the many ethics committees across Europe.

Europe may look from the outside as if it's a rather homogeneous area, but in fact each national committee has its own priorities and works within a different kind of legal and regulatory regime and so forth.

So that's very briefly the kind of approach that we take. I just wanted to highlight in the few minutes that remain to me two or three issues that have already come to our attention as key things that we're going to have to explore.

DR. WAGNER: Yeah, that's about half a minute to -- for you to say this.

DR. ROSE: Half a minute, okay. I will list them then.

DR. WAGNER: Yes, please.

DR. ROSE: The first one, military and dual use. Here is a major difference between the HBP and the BRAIN Initiative. The HBP has an explicit commitment to civil research only. All partners have undertaken not to accept funding from, or to use data or knowledge acquired for, military applications. But nonetheless, those dual-use issues are raised because the HBP also has a commitment to open data. So all the data that's produced will be in the public domain. And of course, once in the public domain, its consequences can't be controlled.

I'll just take the last ten seconds to mention the other big issue that we're confronting at the moment. I mentioned that the HBP is collating data from across Europe and the world -- collating and federating, as we call it, clinical data. That's data from genetic sources, imaging data, medical records on brain disorders, from multiple hospital sources, clinical trials and pharmaceutical companies across Europe.

Now of course, as you'll be well aware, these all have different ethical regimes for consent. The consent that has been given by members of the public and by patients giving their data into the system has been specific, in many cases, to the particular research or the particular medical application they were involved in.

So one of the big issues that we're discussing, and I'll leave it at this, is a conflict, I think, between two senses of what the ethical obligations in research like this might be. There are, of course, the obligations of individual informed consent, individual privacy, and individual confidentiality. But there are also wider obligations to the public good: obligations of stewardship, obligations of making the best use of this data. If one really accepts, as I think we do, the tremendous challenge of dealing with the brain diseases that are going to confront us all over the next ten, fifteen, twenty or thirty years, then researchers and policy makers have obligations of stewardship for the public good as well as for those individuals who've contributed their data to medical research.

And that's the point at which I'll leave it for now. Thank you for your attention.

DR. WAGNER: Dr. Rose, thank you, wonderful in content and in example.

Next we will hear from Professor Jonathan Montgomery, Chair of the Nuffield Council on Bioethics, which released a report earlier this year, as you're aware, considering the ethical, legal and social issues that arise from the use of novel neurotechnologies. He is professor of health care law at University College London. He's Chair both of the Health Research Authority and of the Advisory Committee on Clinical Excellence Awards. Professor Montgomery sits on the Scientific Steering Committee of Brain Banks U.K. and is a member of the committee on the ethical aspects of pandemic influenza.

He previously chaired the Human Genetics Commission, chaired the U.K. Clinical Research Collaboration working group on a strategy for brain tissue banking, and sat on the organ donation task force for its work on presumed consent.

Welcome to you, sir.

DR. MONTGOMERY: Thank you very much. And thank you very much for the opportunity to present something of the Nuffield Council's report on the novel neurotechnologies, but in particular what we've been doing since the launch of the report.

DR. WAGNER: Great.

DR. MONTGOMERY: I should say, any difficult questions, I'll invite Nik Rose to answer, because he was a member of the working party that did it. And I should also say that one of the things that emerges from this, I think, has already emerged from the discussions we heard this morning: the overlap between the different pieces of work. And one of the things that the Nuffield Council has, as a non-government body, is a chance to shape its own agenda a bit. And we have some overlapping projects: the neurotechnology project built a little bit on work we published in 2012, Emerging Biotechnologies. We're doing some work on data which picks up some of the connections, and I'll say a little bit later on about some work we did on research culture, because it touches on some of the applications of the neurotechnology report, and also something we heard this morning.

So our report, I think, took a slightly shorter timeframe than some things we were talking about this morning. And it was trying to assess what we should be doing now on things that were reasonably close to actual use. So we were looking at deep brain stimulation, transcranial brain stimulation, brain-computer interfaces and a few other technologies that are there.

We identified a number of underlying tensions which you'll be familiar with, linked to what might be special about the brain, both in terms of the need to understand it better and also the anxieties that we have about how we might address it. And we picked out a number of key interests around safety -- we're doing things to brains that we need to be confident about, in particular implantation -- and we'll come back with some recommendations about that later on.

We picked up some of the ways in which it raises issues of autonomy, both in terms of implants and the self-conception produced, and around the fact that much of this research is done on people whose autonomy is in some ways problematic. We picked out issues around privacy, as we discussed earlier. We picked out some issues around equity, in terms of differential access, but also the potential for stigmatization going forward. And we picked out some issues about trust, which we'll come back to.

Our framework for looking at it was built around three virtues -- inventiveness, humility and responsibility -- which we felt would help us balance the need to move forward against the fact that we know less than we would like to know about what's going on, particularly in terms of recommendations focusing on what it means to move this forward and do it in a responsible way. So our report picks up some of the responsible research and innovation messages which Europe is quite big on at the moment, and to which Nik's already spoken.

So we identified a series of recommendations, a few of which I'll pick out. One is about regulatory structures: in the European framework, the structures regulating medical devices are very different from the structures regulating pharmaceutical interventions. And broadly, our report argues that devices are disproportionately light in terms of the regulation system, and pharmaceuticals are disproportionately heavy.

And we make various recommendations about how we might get something that is effective while being proportionate, but most importantly, in the context of the virtue of responsibility, identifying how we can do this in a context that gathers information and enables us to learn from it. Not dissimilar, I think, to the things we heard earlier about responsible science.

One of the weaknesses of the regulatory framework we have for medical devices is that it's a bit too easy to get things onto the market with very little data on effectiveness, particularly where there are existing devices that are said to do the same thing. Although I think experience over here, and also in Europe, shows that there's a bit of mission creep in claims of doing the same thing, and that raises concerns.

So we've argued that we should aim to bring at least the neurostimulation devices under a model of regulation -- not so much the pharmaceutical one, but the one that's used for neural stem cell therapies.

We've also identified a series of challenges in doing that smoothly, particularly in the U.K. system, which we were mainly addressing, with the overlap of regulators. And one of the recommendations in the report was to address that overlap. It's been partially addressed, partly by stimulation from the Nuffield Council report and partly also by the parliamentarians on our science and technology committee asking the regulators to set out a map which will help people find their way through the system.

So if I put my Health Research Authority hat on, which is one of those regulators: we worked with colleagues in order to try and declare that we don't have overlap, but that doesn't stop the system from being complex, and there's still a set of challenges which we are pursuing in terms of opening that up.

We also identified a number of areas that were problematic. Our approach to neural stem cell therapies in the U.K. had been to set up a specialist committee, the Gene Therapy Advisory Committee, which met on a different basis from what you would call IRBs, and that led to a slow approval process. That's recently been brought into line with the normal timelines and seems to at least be processing applications more quickly.

We've identified particular problems around the attempt to use new cell therapies for individuals without it being within a clear research structure, which is permitted by our legal system. But in order to contribute to responsible innovation in the area, we need to collect the data much better on what is learned from those treatments -- so that even if it isn't a trial as such, it remains something we can learn from.

And we identify particular problems around ethical guidance in relation to sham neurosurgery, and I'll come back to that very shortly.

So what we've done is we've addressed a series of challenges, and I just want to pick up three of them -- four of them -- and say a couple of minutes about each.

The hype issue, which you've picked up, I think plays out slightly differently in our media than what I've heard this morning. So the media hype that we picked up -- our examples in the report: "Stem cells can rescue memory from Alzheimer's disease, claim scientists." "Coma victim able to speak again after pioneering magnetic field therapy." "How stimulating the brain could bring out the genius in you." "Morality is modified in the lab." "Paralyzed man's mind is read."

So I think there's a slightly different set of hype issues. But what was interesting from the analysis done is that almost all those reports seem to come straight out of the press releases from the scientists. So the question around scientists getting out there plays out slightly differently.

And one of the ways we've addressed that is a project that we've set up with the various learned societies to look at research culture and how it's shaped, because we think the hyperbole comes at least as much from within our research culture as it comes from pressures outside. And we think that's largely related to funding regimes for research. So all of us are working together, coordinated by the Council, to try to address that.

We're also engaged in some of the attempts to get better, earlier thinking about ethical issues. We're just starting a bit of work with something called Fun Kids Radio, and I was just looking at one of the scripts as we were talking this morning, which includes a little bit about brain turbo tablets. So that's getting youngsters in schools talking about it. And if you look at work at what I think you would call high school level, around philosophy and religious studies, bioethical issues are bread and butter there, and we contribute learning materials that are available for that.

Other bits of specific follow-up from recommendations: we held a roundtable earlier this month to bring together researchers, patient groups and clinicians used to running clinical registries. And we understand that the British Society for Stereotactic and Functional Neurosurgery -- I hope I pronounced that right -- is setting up a deep brain stimulation registry, so that's an early step in collecting this information so that we can learn from it.

The regulation of devices is under consideration at the European level. We've had an opportunity to meet with the MHRA, which is our Medicines and Healthcare products Regulatory Agency, to feed into that the thoughts from the Nuffield report. But the mysteries of European legislation are manifold. I think today we're going to learn something about what they did with the clinical trials; it may be some time before we learn about devices.

And we've been working with the Health Research Authority -- I have to be careful here because I can't do both bits, so I'm commenting in both capacities -- around the guidance on the use of sham surgery as controls. The Health Research Authority has something called the National Research Ethics Advisors' Panel, which supports what you would call IRBs. They'd had a look at the report. At the moment, their response is that they think the existing guidance -- which is around proportionality between the public interest in conducting the trials, the need to use sham surgery as part of the methodology, and the risks to the patients -- is probably enough. At the Council, we think that doesn't really clarify what might be different about sham surgery in neurosurgery. So there are continuing discussions there.

Thank you very much.

DR. WAGNER: Jonathan, thank you.

Our final presentation before we open things up for questions is from Dr. Stefano Semplici. He is the chairperson of the International Bioethics Committee of the United Nations Educational, Scientific and Cultural Organization, UNESCO, which recently released a draft report on the principle of non-discrimination and non-stigmatization. That report includes a section devoted to the neurosciences, which we look forward to hearing a little bit about.

Dr. Semplici is also professor of social ethics at Rome Tor Vergata University, and scientific director of the University College Romero Pisati in Rome. He is a member of the steering committees of many institutions and organizations, including the Institute for Philosophical Studies in Ricos Casteli and the Center for Religious Sciences of the Bruno Kessler Foundation in Trento. Dr. Semplici is the editor of the journal Archives of Philosophy, and a fellow of the Italian Society for Moral Philosophy and the Center for Philosophical Studies at Gallarate.

Welcome to you and thank you for being here.

DR. SEMPLICI: Thank you for inviting me and giving me this wonderful, fruitful opportunity to meet you and discuss with you such a relevant topic. I will propose a starting point, the general framework within which the IBC has addressed the issue of neuroscience, and then I would like to make three points which I consider particularly relevant for our shared commitment.

The starting point is a quotation from Immanuel Kant, the well-known German philosopher: "scrutinizing our hearts." Immanuel Kant considered God as the only one who had the ability to scrutinize the hearts of human beings. We know now that God is not the only one anymore, because there is neuroscience that, to some extent, opens new possibilities exactly to look at our brain working. And as it was correctly pointed out this morning, there is something that we can now see happening in our brain -- which does not entail the conclusion that the brain is everything we need in order to understand and to think about the mind, which is something else.

Why did I decide to address this topic? The second slide. The IBC included a reflection on the principle of non-discrimination and non-stigmatization, as set forth in Article 11 of the Universal Declaration on Bioethics and Human Rights, in its 2012-2013 work program, as decided in 2011.

Six illustrative examples were pointed out. Three of them -- tropical diseases, HIV and organ transplantation -- are immediately related to the persisting challenges of poverty and social and cultural sources of inequality. Three of them -- biobanks, nanotechnology and neuroscience -- correspond to emerging scientific and technological developments that could aggravate persistent problems or even create new versions of these problems. So this is a first, very important qualification. And needless to say -- I think it was a quotation by President Obama, the one we saw this morning on the slide -- needless to say, the most important thing that neuroscience is about is to provide new tools to address and treat terrible diseases.

And now we are talking about something else -- not to reduce the importance of what I have just underlined, but just to focus on our specific topic, which is non-discrimination and non-stigmatization. I have no time to develop this argument, but I want to share this with you because it was the outcome of a very long, fruitful, lively discussion within the committee. A preliminary clarification: there is the question of economic inequalities as a possible source of discrimination. The concept of discrimination entails the idea of a different treatment based on differences that should not be relevant, either in general or to some specific issue. Therefore, it depends on what we are talking about. It is not just about the specific characteristics of human beings; it is also about context. For instance, if we have to decide to provide a protective cream against UV radiation, it is no discrimination to ensure this protection to those who are more in need of it -- in this case, white people with red hair. So the context is relevant when we address the challenge of discrimination.

And there is a quotation from our report: the lottery of social and biological life should not be grounds for disadvantages or advantages. Examples of characteristics that should not be grounds for special disqualifications or restrictions are sex, race, religion, political belief, national origin or sexual orientation.

In the case of health care, and building on the awareness that the material conditions of human life are pervasive factors in its improvement, Article 14 of the Universal Declaration of 2005 includes economic condition as a potential basis for discrimination not to be tolerated. Of course, this is discrimination in a broader sense than is usually considered, but it is something worth getting a deeper insight into.

Now, I move forward to the three points I would like to make: some dark sides of neuroscience. Evil as nature; evil or fragility as someone's fate, the stigma of the individual; enhancement as a commodity. Enhancement is something very relevant, and I think it deserves more attention because it is likely to become one of the most challenging issues for our future.

So, the first one: evil as nature. I would make the point about nature and culture. Over the last decades, we have shifted many things from nature to culture. Neuroscience is one of the domains of biomedical and scientific development which prompts a new approach, a new insight, in the sense that we want to reconsider something like ethics before ethics -- the grounds on which different ethics are developed, are being developed, depending on the challenge of cultural pluralism that was already mentioned this morning.

So nature and culture is a first frontier of commitment, and we are probably obliged to face what we were not used to facing anymore -- for instance, to use an example that was suggested by our friend and colleague Jonathan Moreno, once a member of the committee, the possible evidence that something bad is not just dependent on culture but has some deep root in our nature.

The second point: evil or fragility as someone's fate. This is the much-debated case of prediction; it is not necessary to develop the case too long, building on what is usually said with reference to the human genome. We already know that. I would just draw your attention to a very decisive point, which is criminal law. The outcome of the necessary reshuffling of old concepts such as conscience, responsibility, and so on will have a tremendous impact on our criminal law -- on both sides. On one side, neuroimaging has been used in legal cases to assist in retrospective arguments about the defects that might have helped lead an individual to engage in a violent crime. We all remember Minority Report, I think -- so there is the risk of a stigma on individuals, and maybe even of constraint before they commit the crimes they could maybe have the ability to commit. But on the other side, there is the possibility to use the outcomes of this research in the sense of mitigating circumstances for criminal law to consider. So both these sides will be very relevant in our future, and we have to anticipate what will happen.

And the third, the last one: enhancement as a commodity. Let me conclude with a quotation from John Rawls. There is of course the issue of the legitimacy of enhancement as such, which is already a very controversial issue; I will not add anything about that. But there is also the issue that we could focus on according to John Rawls's second principle of justice: social and economic inequalities are to be attached to offices and positions open to all under conditions of fair equality of opportunity. Maybe we will have enhanced individuals participating in a public competition for whatsoever position, maybe in order to enter an academic activity.

So the question is: should enhancement be something to distribute under conditions of fair equality, once we accept that such an enhancement is legitimate, something that I have not said so far?

Thank you.

DR. WAGNER: Thank you very much.

Actually that concludes our three presentations. Are there comments from this first group?

I would like to note -- you know -- the reason that we asked you to come, and by the way, you've all fulfilled it admirably, is to give us illustrations of how you have integrated ethics into your own respective international experiences and activities in neuroscience.

I was searching all three presentations, and I trust, Dr. Rose, that you're still with us?

DR. ROSE: Yeah, I am.

DR. WAGNER: Good. I was searching all three presentations for what would be sort of the common -- what's the one single -- maybe it's not one single. But what is a common baseline caution that you would give us? Or let's not call it caution -- encouragement that you would give us to pursue as we step out on advising the White House on this Brain project? If there is one thing you would be certain to inject, what would that be?

DR. MONTGOMERY: If I can have a first go while the others reflect. I mean, I think one of the things that came out of our report very strongly for me was that there's an awful lot of activity from which we're not getting the best scientific benefit because it's fragmented. So part of our recommendations around registers, and others aimed at finding out what we've learned, is: how do we try and make sure that this currently fragmented set of enterprises comes together in a way that maximizes what's being done for the common good?

DR. WAGNER: That's interesting, almost an inventory before it gets going.

DR. MONTGOMERY: We have in the U.K. quite a concentrated set of professionals who are organized in a relatively small number of professional societies, and therefore it's possible for us to get quite quickly to people and say: wouldn't it make sense if you said to all your members, if we're going to be doing this, you need to register, and we need to ask what you've learned from it. So actually, collectively, you can create a base from which hypotheses can be generated and experiments designed.

What we particularly found was the mixture of therapeutic and non-therapeutic uses, and that we had a lot of sort of last-resort-type uses of emerging technologies, and we didn't think we were learning from these as quickly as we potentially could. And it would be similar for things like the emergence of keyhole surgery and access surgery and the like where, early on, quite a lot happened in the U.K. before we collected the data. And we could accelerate the development of good science by pulling it together.

DR. WAGNER: We had a recommendation to that effect in synbio, as you'll recall.

DR. ROSE: Could I --

DR. WAGNER: Yes, please, Dr. Rose.

DR. ROSE: Because that flows directly on from Jonathan's remark. I have, over the last few years, been doing some work on the history and contemporary nature of neuroscience and was struck by the fact that there are hundreds of thousands of papers published every year in neuroscience -- I think probably around 100,000 refereed journal papers. There is a huge amount of research going on out there, but it's very, very fragmented. It's fragmented between different sub-specialisms, it's fragmented between different learned societies, it's fragmented between different institutions.

And at least one of the main objectives of the Human Brain Project is to try and find some way of integrating and coordinating all this huge amount of research that's already been done. Now the HBP is not the only project to try and do that, to try and get the researchers and the research to speak to one another. There are many other projects like this.

But I think that certainly sort of picks up on one of the themes that Jonathan raised. Surely, if we're spending all this money on neuroscience, as we are across the world, there's some kind of obligation to bring together and coordinate the results.

And the other side of that is that there is a huge amount of data in hospital records, in the records of clinical trials, in the records of individual patients with brain disorders. There's genetic data, there's brain scanning data and so on. And at the moment, a lot of that data that may be used in specific pieces of research is not brought together. And again, rather than, or before we feel that we need to make another huge research endeavor, perhaps there is a kind of ethical obligation to make the best of the data that we have. And again, that's an aim of the Human Brain Project, this coordination, integration of data to try and find out what we know already and what we know we don't know already.

The last thing I'd say -- you know, obviously it's a little awkward to be contributing at a distance like this, because eye contact is quite difficult; you're rather blurry images on my screen whilst I'm speaking.

The other thing I wanted to pick up was something that Dr. Gutmann mentioned in the previous session, and it also came up in the report that I was involved in that Jonathan mentioned on novel neurotechnologies. If we're going to work in this area seriously, we have to work in this area modestly. We have to be very, very aware of what we know we don't know and of the huge gulf, the absolutely huge gulf that there is between our understanding of brain processes at the neuromolecular level, and our understanding of mental states at the level at which we experience them in our everyday lives as human beings in social encounters in our social world.

Both the report on the Brain project and the report on the Human Brain Project are very honest and open about how much we don't know. And I think if we make rash predictions and rash promises about where we're going to be in three, five, or ten years' time, those promises won't only not be backed up by the science, but they'll lead to disillusion. They'll lead to: okay, we've invested all this money, but where are the results?

So I think humility -- I believe that's the term that we used in the Nuffield report, and maybe that Dr. Gutmann mentioned before -- humility on everybody's part about the length of the journey we have to travel is also real advice, a real issue that I think we need to be aware of.

DR. WAGNER: Thank you.

From Dr. Semplici?

DR. SEMPLICI: Well, I have three very simple and clear suggestions which build on the three points I tried to make.

The first is education. This morning, not incidentally, Professor Koch's presentation immediately triggered a strong reaction. And this is the issue we have to face. Our old traditional concepts of consciousness, responsibility, free will, and the relevance of language as a specific human ability are not completely out of date.

I'm not expressing the wish that they go out of date, or out of the market, if you prefer. But we need to rebuild a more appropriate basis for them, considering what the outcomes of scientific development are. So this is the first point I would like to make and suggest: education. It is not just about discussing these issues in faculties of medicine. It is precisely an interdisciplinary work which calls on all of us to a renewed commitment to thinking about the solid, robust grounds of our individual and social life. Education.

The second one is law. There is already a great deal of literature on that, and I think we are going to face a new wave, not only of literature, but of decisions made in the courts. And this is a challenge not only for systems of civil law; this is a challenge for everyone, in the U.K., in the United States. We have to anticipate it, so as not to simply give up the notion of responsibility. Think of the notion of mens rea. What is likely to become of our criminal code once we get rid of the idea of mens rea? If there is no mens rea, because there is just the brain working, there is no responsibility, and there is no legitimacy to charge someone with a crime, not to mention a sin.

The third one is benefit. Benefit, and the sharing of benefit. Jonathan mentioned the question of the difference between therapeutic and non-therapeutic use. It is not easy to draw the line. So I think that a Commission like this, and all Commissions like this everywhere in the world, should take on the commitment to help people, and of course policy makers, to draw the line. I could mention something like deliberative democracy, but that is not exactly the point. We need something like that, because it is very difficult to draw the line. And once you draw the line -- that is, once you are at the threshold -- then there is non-therapeutic use, there is enhancement use, and many problems arise.

So we have to draw the line and then decide how to share the benefits that result from this. It is unprecedented, scientifically.

DR. SULMASY: I was going to ask Mr. Montgomery to expand, if he would, a little bit on the reasons to single out sham neurosurgery, because it may help us in thinking about other technologies in neuroscience.

DR. MONTGOMERY: I wish you hadn't asked me that, because, I mean, I think that's the question: is there something different about the neurology context, or is it equivalent to placebo issues in other areas? And I think that's where the discussion is going on with the Health Research Authority at the moment. The Health Research Authority's National Research Ethics Advisors' Panel has reviewed the literature around these placebo controls more generally, and thinks that there isn't anything especially different about the brain context.

At the very least, at the Council, we think that if researchers are unclear about how they should design trials and whether it's acceptable to use sham surgery, it may well be that we don't get the learning that we need moving forward. We may do things without understanding. We may do the trials without the controls, and then act as if the results are secure, whereas we haven't actually been able to apply the methodologies that would have been more appropriate.

Nik's probably more involved in the discussions around the sham surgery than I am, and I can see him nodding. Do you want to add to that, Nik?

DR. ROSE: No. I --

DR. WAGNER: Perhaps everybody else knows what sham surgery is, and I'm not sure I do. Would you --

DR. MONTGOMERY: Well, I think it's essentially that if you're trying to run a blind randomized controlled trial, you need to control not only for who is getting the intervention, but the other side of it, the people who don't get --

DR. WAGNER: I was afraid that's what it was. Yes.


DR. MONTGOMERY: You certainly wouldn't want a double-blind sham neurosurgery, but that issue, the patients --

DR. WAGNER: I was afraid of that, yes.

DR. MONTGOMERY: Nik probably knows more about that.

DR. WAGNER: Dr. Rose, did you want to make a comment before I go to Christine?

DR. ROSE: No. I think you might just want to think about deep brain stimulation. Okay. Now, is it legitimate to do sham surgery in deep brain stimulation? Deep brain stimulation involves, even though it is much more sophisticated than it used to be, a rather invasive brain operation: the implantation of the electrodes and then the stimulation of those electrodes.

Now what would be the sham surgery equivalent of doing that? Would it be the operation? Would it be the insertion of the electrodes? Would it be leaving the electrodes non-functioning, and so on? One of the things that we know about certain kinds of disorders -- and this is not just brain disorders -- is how susceptible they are to the placebo effect.

But if we really want to test the efficacy of deep brain stimulation, we do have to do the kinds of controlled experiments that Jonathan mentioned. And yet an unnecessary intervention into that most delicate organ, the brain, which we really -- and I'm stressing this again -- know so little about, is ethically problematic, and probably publicly a little bit controversial, rather more than, say, a sham surgery on your knee or on your back.

DR. SULMASY: Just to follow up for a second. In a principled way -- again, this is part of the debate I guess you're having between the two councils in the U.K. -- is there a reason for the difference other than the degree of danger, the degree of invasiveness, et cetera? Something, you know, special and privileged about the neurologic side of this?

DR. ROSE: I don't know if Jonathan wants to answer that.

But just as a member of the working party -- so I'm now speaking with a slightly different hat on than the Human Brain Project hat -- as a member of that working party, we did start from the view that the brain was, in the only sense that we understand it, a very special organ, an organ about whose functioning we had less knowledge than of other organs, and therefore an organ that required special protections.

So in that light, the bar that had to be met, if you were going to do something like sham surgery in the brain, was higher than that with another organ that you understood better.

DR. WAGNER: Let's move to Christine. I actually called you Anita. Do you want to enlist Anita?

Thank you.

DR. GRADY: So actually -- I was actually going to ask the same question Dan did, but I'm going to shape it slightly differently.

Because there are some other things that you talked about that I think one can imagine need attention, like research culture, the funding structures, the lack of regulatory consistency or innovation, things like that.

In those areas, do you think there's anything specific to neuroscience, or any reason that they're more important to pay attention to with novel emerging neurotechnologies? Or are they just problematic in the way we probably know that they are?

DR. MONTGOMERY: Nik may have slightly different views on this.

I think the answer, or the nearest to one, is no, I don't think they're unique, because actually the themes that emerged from the novel neurotechnologies report were very similar to the ones that had emerged from our report on emerging biotechnologies. And they really concerned the need for a broader public debate around how we assess what is in the public good, how we make our funding decisions, and how we commit ourselves to one line of development with the opportunity costs that carries for what we could commit to others.

I think there are, at any particular point in time, reasons to use particular examples to work through the principles. And I think the answer to the sham surgery one is probably that the principles are the same principles. But given what we know and don't know about the brain, it makes a lot of sense to work out how to apply them at this point in time in a way that is specific to sham neurosurgery.

So in that sense, I can see where there's common ground between the debate on the two councils. I'm sure if there were a case, it would work its way through to some form of resolution. Similarly, I think there are reasons of public anxiety: the reasons why we have the questions we have about the brain resonate with deep anxieties around personality, identity, responsibility.

I think if you stepped back and took a historical view, though, you would say exactly those same anxieties around mens rea and responsibility emerged out of various forms of social and environmental determinism. I think it's the particular point at which those anxieties manifest themselves, as opposed to their being unique. So I think there are a lot of other resources to draw on.

DR. WAGNER: Dr. Semplici wanted to comment on that.

DR. SEMPLICI: Yes. I think there is a difference, exactly because of the perspectives of enhancement that are implied in this field of research. We know of authors like Savulescu and Persson who envision a world where moral enhancement would be quite accepted, if not considered a due practice.

So this is really something different. And in this sense, to draw the line between therapeutic and non-therapeutic use is in itself a challenge, which is not only to be considered with respect to technical difficulties or invasive techniques, because the same result is likely to be obtained with less invasive techniques -- maybe simply with a drug which can produce the effect of enhancing memory, increasing intellectual activity, eliminating unpleasant memories, and so on.

So the difference is in the possible outcome. That is why the non-therapeutic use of this knowledge entails a greater social and public responsibility.

DR. GRADY: But does that difference mean different funding mechanisms, or different regulatory structures, or a different research culture? That's what I was asking.

You know, I understand the differences that you just described, but you know, how do they affect these bigger issues of research culture?

DR. MONTGOMERY: I wonder if I could invite Nik to say something here, because he came to talk to the European national ethics committees, and he raised some really interesting points about what people perceive as decisions based on observations when they are actually based on simulations, because of the way in which the technology emerges.

And for me, I think in many ways this is similar to other debates about enhancement. But there are some big risks that we believe we know more than we do, and that we allow shaky knowledge to be treated as if it were a more secure foundation. I don't know, Nik, if you'd want to -- you might be --

DR. ROSE: If I may, yes. Let me just comment a little bit on that.

My role in the Human Brain Project is to run the Foresight Lab. And foresight is a hard thing --

(Call connection is lost.)

DR. WAGNER: Nik, we -- we'll pick you up later, Nik. We'll pick you up when -- we'll get a few more questions in the meantime.

Nita, Steve and Anita. Nita first.

DR. FARAHANY: We like to keep it confusing over here, we law folks. So I want to just make a comment, which is to encourage us to be quite careful in our claims about the ways in which neuroscience impacts different institutions -- like the legal institution in particular, from my perspective.

So I agree with Jonathan that these kinds of forms of determinism have shown up many times in the past, from Twinkie defenses to social and environmental determinism. And each and every time there's a new innovation, there's this belief that the criminal justice system is going to come crumbling to, you know, its knees, and that all of the bases for what we've claimed in the past are going to suddenly go away. When in fact, when you look carefully at our legal system, there are many normative commitments that underlie different issues, like mens rea, and those concepts, as they're understood in the legal system, have very specific and nuanced meanings that are not based just on a black box of not knowing what's happening in the brain.

At the same time, I think that neuroscience is a uniquely valuable lens by which we can examine many of our normative commitments in law and in other areas. It can help us understand what it is that we're doing and why we're doing it and, in fact, that's one of the ways in which I think neuroscience has been most useful to our criminal justice system, is to ask what do we mean by concepts, like voluntariness and mental state? Why do we punish people? Do we continue to believe that those justifications make sense as we have a better understanding of how, not why, but how some of the underlying mechanisms in the human brain take place?

And so since we've talked a lot about the importance of not overstating the science, I think it's also very important that we not overstate how it will revolutionize different areas and institutions and commitments in those areas, in particular in law.

DR. GUTMANN: Let me take it farther, underlining things that I think represent what the Bioethics Commission will want to do in its report. And I would underline that just as we want to make clear how important it is, as a pathway for neuroscience to progress, to ask practitioners not to exaggerate -- in the interest of the fundamental basis of science, which is exposing the truth carefully -- so too we want to make sure that people who practice ethics do not exaggerate, which happens time and again, and which doesn't make it good.

It's a bad thing when it happens: when some new knowledge base comes out, and all of a sudden the responsibility basis of the law is supposedly going to be undermined. Many people, long before neuroscience developed to the extent it has now, believed, as I do, that the brain underpins the mind and that malfunctioning of the brain would have effects on people's ability, you know, to abide by the law. But that doesn't -- it didn't then and it doesn't now -- undermine a lot of what the law is based on.

So just to underline that: I think we could do a public as well as an ethical and scientific service by trying to expound very clearly on that. So thank you. I mean, I think I see nods, but I think it's important.

DR. WAGNER: Last two questions and then we can come back to enhancements, also, when we have the Roundtable Session.


DR. HAUSER: So my question was for Dr. Rose, for Nik. I was quite taken by the concept of the foresight lab that you are directing, and wanted to understand in a bit more detail how this is integrated and embedded in the specific projects of the Human Brain Project. And it may well be too early, but are there any early ideas that you've gleaned about what might work well?

DR. ROSE: In many ways, it is too early. As I was going to say, foresight is quite hard, especially when it's about the future. It's kind of easier when it's about the past.

I think the one thing that I would want to stress about the foresight lab -- and it does link quite nicely to some of the points that have just been made -- is that this is an attempt to actually think through, on the basis of empirical evidence, social evidence, and one's understanding of the social and political context in which these scientific results are going to have impact, what is likely to happen. And this links to Dr. Gutmann's remarks, because a lot of what happens around new and emerging technologies, especially from the ethical point of view, is highly speculative and really overstates what the science can actually do.

Take, for instance, the argument about whether or not neuroimaging can read human intentions. Now, the difference between intention as studied in the lab under the heading of volition and any human intention that any colleague here might have in organizing their everyday life is absolutely huge.

There is absolutely no way at the moment where brain imaging can read any of those intentions in any actionable or practicable way.

Take the example of enhancements, which I know has been a subject of much debate and much moral exploration. If you actually look at the results for students, for instance, who've taken so-called enhancing drugs, you see that the consequences of those drugs have not been that different from the consequences of other drugs. They heighten certain capacities for a certain limited amount of time -- for instance, concentration and acuity -- and they limit other capacities at the very same time, and they have similar kinds of consequences afterwards of come-down and exhaustion, of depression, et cetera, et cetera.

So before we run away with the idea of these things having very revolutionary effects that are going to challenge our very understanding of who we are, our view and the foresight lab’s view is that we need to be very realistic about the science and we need to locate the emergence of those technologies and the uses of those technologies in the practices and institutions that are likely to take them up.

And the example of the law, I think, is a very, very good one where one sees legal systems across Europe and across the United States struggling with these new forms of knowledge, integrating them into their existing practices, maybe to some extent questioning their practices, but they're certainly not overturning our very ideas of criminal responsibility for reasons that many legal scholars have pointed out at great length.

You know, if we're wrong about the fact that human beings, by and large, intend the meanings of their acts and therefore can be held culpable for the meanings of the acts that they undertake, if we're wrong about that, then we're wrong about almost everything that we've ever thought about human beings and I don't think neuroscience is going to lead us to that belief.

Now, the question of how this is integrated back into the project: it is again about being quite realistic about, say, our increasing capacity to predict the consequences of genetic or neurobiological anomalies for future disorders, and then to develop strategies to intervene in them -- to try and think through what might be involved in those practices of screening and early intervention.

For instance, in the light of everything else that we already know and the controversies that we already have about screening and early intervention in mammography for asymptomatic women over 45 or the PSA test that has been such a controversial issue in the United States.

So before we suddenly think, okay, we're going to screen, we're going to identify risk factors, and we're going to intervene, we need to locate those questions in a realistic social and political context. That's what we're trying to do.

How effective we'll be in building those back into the research programs of our colleagues, that's another question. We are committed to doing that and our colleagues are committed to doing that and have been since the beginning but it's a work in progress.

DR. WAGNER: Thank you. Final question, Anita.

DR. ALLEN: Okay. I had two comments for Dr. Rose. One is about your interesting comment that there are 100,000 refereed papers in neuroscience, and I wondered why. Why has the scientific community not, with foresight, integrated the research in some important way?

Secondly, something I wanted to say, as opposed to ask, is about your comment in connection with the sham surgery; I wrote down "we do have to do this." So I'm wondering what lack of caution might be involved in the notion that we have to do the sham surgery, have to do sort of placebo surgeries, despite the risks to safety. And, as you put it, your scientific understanding of risk and safety might be very different from that of the general public.

For Dr. Montgomery, I wanted to ask a question, maybe also for Dr. Semplici, too.

Dr. Montgomery, you listed safety, autonomy, privacy, equity, and trust as the main principles governing this research in the European context.

Are there any principles at play in your part of the world that are different from principles at play in our part of the world? When we debated and discussed synthetic biology, we discovered that the American approach to synbio was a little bit different from the European ethical approach.

Are principles, like proportionality or even some other principles, at play in Europe, human rights, that are not at play in the U.S. bioethics context?

DR. WAGNER: Who do you want to go to first?

DR. ALLEN: Maybe Dr. Rose.

DR. ROSE: I'll go first and be quick. Yes, I'm terribly sorry for that normative "we must do that" about sham surgery.

I suppose the point I was trying to make -- I hope backing up the point Jonathan made and our report made -- is that it's necessary to use the best means that we have to rigorously interrogate the claims that are being made in these areas.

Many big claims are being made for something like deep brain stimulation. I really hope that a lot of them are true, but we need to find the best techniques that we can to scientifically evaluate those claims.

There are a lot of problems with evidence-based medicine, but it's the only system that we have to really evaluate claims, especially when those claims are bound up, as we know they are, with the commercial --

DR. ALLEN: But do we have a line we simply will not cross? Maybe sham surgery is one of those --

DR. GUTMANN: I think it would be interesting -- Steve, would you be willing to say something? I mean, because Steve really put emphasis on that -- no, but he's, you know, a practicing neuroscientist.

DR. HAUSER: I may be a practicing neuroscientist in need of some education.

I've argued for the importance of sham surgery in some of the early attempts to look at the effects of deep brain stimulation for movement disorders and the ethical implications of that are less or may be the least in this whole arena because lesions are made as part of the alternative therapy.

So one is also asking: does the implantation of a wire and stimulation provide more benefit, or different benefit, than a lesion that we know provides some benefit?

Instead of sham surgery, many groups have used, for DBS, you know, on versus off, which is another way to do it; the sham is when the machine is turned off. So there are different ways to approach it.

I think that to perform intracranial surgery for placebo purposes goes beyond at least my experience. When we're speaking about cell-based therapy, I guess these are issues that may be faced.

Is that helpful? And especially for things like depression or other affective or behavioral disorders, where we haven't, happily, thought in 40 years about intracranial surgical approaches for these conditions, this would, I would think, be above and beyond.

One of the very good points about some of these treatments is that sometimes observational studies that are not blinded can be adequate when one sees an effect that is very different from the natural history of the underlying disease.

DR. WAGNER: And, finally, Dr. Montgomery?

MR. MONTGOMERY: Thank you. I guess I should say I'm not Dr. Montgomery. We have a different system. It's a little harder to be a professor than a doctor but I'm not a doctor in our system.

Anyway, I think I'm very uncomfortable with the idea of firm red lines that you never cross, because whenever I come across them, we always tend to revise them a bit later on.

I want to go back to the question about whether the principles are similar or different. I think the position we would want to take is that we should ask the question whether, in order to generate the appropriate knowledge -- scientific knowledge and other knowledge that we need to test our claims and move forward -- we do or do not need to carry out sham surgery. So that raises a question of balancing what it is we're trying to find out, what other ways there are of finding it out, and the whole set of issues around consent. Our experience of other therapies is that it can often be quite easy to get consent from people when they think it's their last resort, in circumstances where it might look from the outside a rather unwise thing to do. And there's a genuine question, I think, in the particular context of neurosurgery around informed consent and how it operates.

On the one hand, you may wish to say you would only do this on the basis of informed consent, and then describe the trauma that would be created in doing it. On the other hand, the people who may be most likely to benefit from it may be the least able to give it -- we may be more uncomfortable about accepting the choices that they make because of the very need that's there.

So I think this is about -- I think the really difficult question is: is there anything that is so special about the brain that we don't already have the conceptual analysis there to work our way through?

I don't think that's what the Nuffield report suggests. I think the Nuffield report suggests that this is a new enough context that we need clarity about when it may or may not be appropriate to do it. And if we achieve clarity in the report, then that clarity is made available to the researchers, who shouldn't be expected to guess what the rest of us might think, do it, and then have us come back later on and criticize them. We need to be having that debate up front.

DR. WAGNER: The three of you please do not go far. In about 10 minutes, actually, 2:30, we will want to reconvene -- I'm sorry.

DR. SEMPLICI: Just two short observations. First, I stress once again the importance of education, just because I completely agree with what has been said. Claims are to be assessed, and claims in this field concern the big promises, wishes, and desires of human beings. So education is the first step.

And the second observation is very important: when we look at this, as well as other problems, at the global level, we have to consider that the principles are not the same. They converge, maybe, but they are not the same. Consent is an illustrative example. The practice of consent in many countries is quite different from the practice of consent we are used to in our countries.

So we have to look at that very carefully when we enter the global space of bioethics. Otherwise, conflicts are likely to arise.

DR. WAGNER: Point well made. Thanks to the three of you.


DR. WAGNER: Let's try to reconvene at 2:30 and put about an hour in, I think, on our Roundtable discussion.

DR. GUTMANN: We will have a Roundtable from 2:30 to 3:30.

This is a work of the U.S. Government and is not subject to copyright protection in the United States. Foreign copyrights may apply.