Transcript, Meeting 15 Session 4


December 18, 2013


Washington, D.C.


Roundtable Discussion


DR. GUTMANN: I'm going to get started. Welcome back, everyone.

I want to thank all of our presenters. It's a tough assignment and you did incredibly well and it's very helpful to the deliberations of the Commission. So on behalf of everybody on the Commission, we thank you, and on behalf of the President, too, thank you for helping us carry out his charge.

We're right in the middle of our deliberations on neuroscience and, as you presented and as we have been deliberating ourselves, there really are two big buckets here. One is the Ethics of Neuroscience Research and the other is the Potential Applications and Ethical Implications of the Research Findings themselves.

And I would ask -- and I'm sure you've been forewarned of my typical question; it just helps elicit something that you think is very high-priority -- that you give one piece of advice as to what you would like to see the Commission recommend in its report, and I'll just go down the row. I'll begin with Mildred and then we'll open it up for questions and discussions, and it can be either about the ethics of the research or the ethical implications of neuroscience research.

DR. SOLOMON: Well, educating for the ethical implications.

DR. GUTMANN: That's right. Those are the buckets, yes. But if I may -- because we've had a number of these, and we all think education's important -- be specific. Give something specific, if you would. We're free to accept or reject or modify it, but being specific really does help us.

DR. SOLOMON: Okay. I'd like to recommend that there be serious financing and incentives and accountability to develop ethics scholarship in neuroscience and to do it in a way that is very mindful of how to design it so that the scholarship is structured in a way that it itself is educational.

For example, under the CTSA Program, when NIH required that there be an ethics component to CTSAs in order to be funded, that created the opportunity for scholarship through pilot and innovation awards that were a key component and that created the opportunity for self-education of young people especially who wanted to be -- who would, you know, benefit from the rewards of doing that kind of research.

So thinking mindfully about the ways in which we can fund this and reward it -- and, if I may, just a few more features of that -- that would be an ELSI Program. I think that there's something to say about requirements that scientists collaborate with ethicists or philosophers or historians of medicine, perhaps through a co-investigator status. So those are requirements that could be built into this kind of an award program. And you also opened the session, Dr. Gutmann, with attention to the fact that there were broader issues here about bioethics education for scientists, not just neuroscience.

And while I'm advocating for an ELSI Program and for other kinds of initiatives that the Federal Government could do, I also think we should be mindful that we could skew bioethics and bioethics scholarship just to one area and that there might be ways that we can think at the same time that we're thinking about an ELSI Program for neuroscience that we're thinking more generally about science education.

DR. GUTMANN: Paul? Thank you.

DR. WOLPE: That was said so well that I'll talk about something else.

I think that it's very important -- I'm a sociologist. I'm not actually trained as an ethicist, whatever that really means, but over the many, many years that I've been working in that field, I've come to respect the expertise and the depth of thought and the long literature of thought about ethical issues.

We don't need to start at the beginning here. There's a lot of work that's been done, and it's been done well, and that's why it is so important that a multidisciplinary approach be taken and that, when we talk about creating programs -- whether they are programs for education, programs for public policy, or programs to try to anticipate some of the ethical issues that are emerging -- it needs to be a profoundly interdisciplinary process.

I think Nik Rose was right. We're caught between two difficult ways of thinking. One is that we tend to over-estimate, over-hype the science and what it can do. On the other hand, I'm a great believer in prophylactic ethics, the idea that you should anticipate and think about the ethical challenges that may lie just ahead, so that when they emerge you aren't caught short and have to start thinking about them then. We do have to project into the future where some of these technologies can go, and some of the areas where these technologies can go are very fraught. As we did with genetics, where we made all kinds of predictions, some of which came true and some of which didn't, but there was really good thought even behind the ones that didn't, it was fruitful to think about them. So, too, we need to do that in neuroscience.

DR. GUTMANN: Thank you. Very helpful. Jonathan?

MR. MONTGOMERY: A few quick things. I think the biggest one is the one I mentioned in the earlier question, which is that, as we're doing this work, we have to collate the information and build a knowledge base.

I think we have some experience of doing the ethics alongside the science badly in my jurisdiction where, if you just ask for a statement in all the ethics applications for something, you create an industry as opposed to contributing anything. So it has to be done well.

We have seen it's very important to try and engage with support for responsible media reporting. We have experience in the U.K. of the scientific community coming together through something called the Science Media Centre to try and make it easy for journalists to be put in touch with people who know about the science, and I think that that's been a really important part of getting a high-quality debate.

But I think the final thing would be about thinking about the medium-term as well as the longer-term. I would be anxious if all I knew was what I heard this morning in discussions -- and I know that's only a little bit of what you are doing. It's quite different from what we've found ourselves looking at, but we wanted to try and look at the reasonably-close term.

So we were seeing the importance of developments in this area around Alzheimer's, treatment for mental health, as pressing current problems and not just science fiction, and I think you need both of those things in order to get across why it's important to do this.

DR. GUTMANN: Yeah. We certainly intend to use examples of the good that the science can actually do, and certainly some of the therapeutic examples are very important.

MR. MONTGOMERY: And I think, also, we learned that there are whole areas of important research where people have soured on pharmacological solutions -- Alzheimer's, for example, where they don't seem to have got anywhere and people, venture capital particularly, are disinvesting -- and so there seems to be a need, not setting the neuroscience apart from other ways we're doing it, for seeing some form of continuity as well as the longer-term indications.

DR. GUTMANN: Yes, good. Christof?

DR. KOCH: As a working neuroscientist, I would like to urge you to do two things. One is continued emphasis on the critical nature of animal experimentation: yes, animals are different from people, although the stuff that their brains are made out of is very similar, but it's really critical, particularly if we're supposed to move much more aggressively into human research as some agencies want, that we still do translational research in non-human primates, and to emphasize to the public at large that animal experiments done in an ethically appropriate manner are still absolutely critical.

And, B, as a working scientist, there's already a very large regulatory burden on us. We know, in the case of my institution, we have several people who are full-time doing nothing but all the paperwork associated with any one experiment, and I would urge you, if you make recommendations, to give some thought to how much more this is going to increase the burden.

I mean, we're supposed to translate all of this into clinical insights, but the higher the walls, the higher the regulatory burden that's put on us, the more difficult that's going to be.

DR. GUTMANN: Yeah. I think it's really important to recognize that ethical considerations do not equate to regulatory burdens. We have a principle of regulatory parsimony that we have published in our first report on synthetic biology, which means that the more scientists can self-regulate the better off they are and society is. So important, both points important.

Mi Young?

DR. CHUN: So I wanted to echo that scientists working together should be a big emphasis -- communications amongst scientists about best practices, how to take care of animals, what conundrums they may have faced in human studies, and so on. It seems to me these are very lacking for the moment, and the Commission could actually emphasize that these are the kinds of things that don't cost money. These are things that make real common sense and yet are not being practiced.

So as I go through different institutes, they seem to have very different practices. Some are much better than others, and it would be great if there were more communication amongst scientists.

DR. GUTMANN: Stefano?

DR. SEMPLICI: Once again, education, as I said. The developments in the field of neuroscience challenge some fundamental premises of our understanding of human beings in a way that, I do perfectly agree, is maybe more related to science fiction than clinical practice, but this is something to make clear, to put on the table of an open debate involving experts, civil society, and the media, I would add, because the responsibility of the media is terrible, terrible, in what we've been discussing around this table this afternoon, and I think this is something that you could underline with great strength.

Second, in our report, we made it clear that what we are talking about, even though it can seem to have some relation to what happens in the field of genomics and the human genome, is quite different, exactly because it is maybe for the future but not for nowadays.

That said, there are some risks of stigmatization that could become relevant in the future as the outcome of applied neuroscience. I think it could be useful also to anticipate this problem, and it would be also helpful to our report because it is something we focused on in the text of the IBC.

Lastly, the concept of benefit. This is not to suggest a solution but the need to look at the problem. What is the benefit in such a complex field as neuroscience? So where to draw the line between the therapeutic and non-therapeutic use and how to address the possible challenge for the future of enhancement?

DR. GUTMANN: Yeah. So a number of you have mentioned the importance not only of accuracy about what the present findings are but something that we've also talked about as a Commission, which is anticipating what's coming down the road in a way that actually enables people -- both scientists and people who will be affected by the science -- to understand it better. I think that's another goal of our Commission's report: not only to talk about what's happening now and where the science is now but where it could prospectively go, in a way that will make it less of a jolt and a surprise as it happens. Because the greatness of the principle of intellectual freedom that we've talked about, and of scientific discovery, is that anybody who predicted the cloning of Dolly was lucky to have predicted it.

I mean, somebody will predict something that happens but it's not all -- it's hard to predict what the next big scientific leap will be and I daresay, given the interest and the knowledge and expertise going into neuroscience, there will be some big leaps forward.

We've been talking a lot about how there are claims for present research that outpace it but there will also be some big leaps that we didn't anticipate.

Christof, why don't you say something?

DR. KOCH: I want to say if you look at President Nixon's announcement of the war on cancer, it took 40 years roughly before the basic investment in cell biology turned into Gleevec and other drugs that right now begin to eliminate or ameliorate specific types of cancers.

I think it's important to warn the public not to expect a breakthrough in Alzheimer's despite all the press releases announcing such a breakthrough in the next two or three years, that maybe we have to wait for an equally long period before some of this basic research work will actually pay off into therapeutic advances.

DR. GUTMANN: Right, right. Yes, and there have been books written about the disillusionment that set in because public expectations were too high. Good point.

John, and anybody who has questions?

DR. ARRAS: So after sitting through today's conversation, I think I know now what it's like to be on the receiving end of deep brain stimulation.

So I want to thank all of you for really illuminating presentations.

I want to invite you to think a bit more about what might be a tension between intellectual freedom and social responsibility. Jonathan, I was struck by the Nuffield Report and its emphasis on responsible research and innovation, right? Your report made a point of stressing how the aims of research should be harnessed to the healing of human diseases, right? My late great colleague Benjamin Freedman talked about science in the service of healing, right?

So you made a distinction there, I think a good one, between science in the service of healing and science for the mere pursuit of novelty or innovation for its own sake. So I have a couple of related questions.

One is how does this distinction and this mandate for science in the service of healing get institutionalized, you know? Like, how is it that people are encouraged or incentivized to do socially valuable sorts of research rather than just me-too drugs of various kinds?

And secondly, how might our educational interactions with young scientists also engender that kind of social consciousness or social responsibility?

Thank you.

MR. MONTGOMERY: Small questions, then. A very sketchy sort of answer.

I mean, I think there are a number of levels at which you need to ask those questions, and I think one of the things we need to bear in mind -- a little bit like the science communication issues -- is that not everybody has to do it all in the same way; so do we want all our scientists to know how to be expert lay-communicators, or some of them?

So the Nuffield Council Report on Emerging Biotechnologies was particularly concerned about the way in which the decision-making made by investors in research got locked into particular pathways without there being very much public discourse about the values that lay behind it.

So I think part of the answer to that is in the context of who's investing both in projects and in infrastructure and I want to come back to that in a minute.

How do you get a debate that gives you confidence that that is rooted in some sense of the common good? I don't think you necessarily need every individual person who comes up with an idea to have the same debate, and you certainly wouldn't want them to be doing poor science because they designed projects they thought would be popular -- that's part of the discussion about sham surgery.

I think a big part of our thinking is around how you could make these developments in some sense a common project, hopefully in the service of healing, you know, but you accept that not all of it will turn out to do that. So that was why we were interested in the registry idea and creating a website. If you're engaged in this field, you need not only to do your own project but to contribute to the emerging knowledge and I think I heard a number of bits this morning about open data which are in a similar sort of train.

And I think that, certainly in our society, it's misleading to think that all the enthusiasm for the science sits with the scientists and all the anxiety with the lay public, because actually we have very strong promotion of science -- moving probably faster than the scientists think is scientifically well designed -- by some funding groups and some patient groups. So there's a bit of harnessing needed on that.

DR. GUTMANN: If I just may interrupt for a moment, but the pressure there tends to be from groups that are organized around diseases and therapies. So what they're pushing for, which is totally understandable and for the social good, is for more discoveries, translational discoveries, and rarely, although there are cases of this and wonderful -- in the States, at least, in my institution -- donors who get it, the basic research is often not what they're pushing for but, rather, the translation of the research.

MR. MONTGOMERY: Yeah. I think that's a yes and no in the English context. We have quite a lot of medical research charities which grow out of concerns about particular diseases but are actually looking at investing in quite basic science. And actually the final thing I was going to say under that heading: the strategy group I chaired around brain banking for the U.K. was a group involving industry, charitable, and academic groups thinking about infrastructure, and one of the things identified was the importance of common standards for brain banking, so that the brain banks could become interchangeable, so they could do more than their own projects. They could provide controls for each other. They could generate sets of data that the others would pick up.

We now have in the U.K. a lot of cohort studies and bio-banking which are designed in the same sort of way, to create the infrastructure that would enable the research, including neuroscience, to move forward. So you can have your discussions about creating infrastructure, as well as project by project.

DR. SOLOMON: I'd like to respond to John's question about the tension between intellectual freedom and social responsibility, so that's science in the interest of healing.

You asked, and you looked at me, how do we prepare scientists to understand that, and my quick and simple answer is to put it out there, I mean, to problematize these big meta questions.

I think, you know, this morning we talked a lot about the problems that scientists have in understanding why bioethics education is important for them, and Dan read off a list, but I think that the field of bioethics itself has some responsibility here. We are sometimes perceived as the moral police or as restraining science, and we have an obligation to demonstrate that we're not just about studying questions of means, like whether this study should go forward or not, but also questions of ends: what's our concept of the good?

Well, this tension between intellectual freedom on the one hand and socially responsible science on the other is fundamental to the question of what we think is the good. What do we think is the kind of human flourishing that we want -- you know, what's our notion of human flourishing?

So I would say that any kind of designing for the learning of scientists should engage them in these sorts of meta questions that are not often brought to the fore, and that bioethics itself often doesn't bring to the fore. As we think about what bioethics has to contribute here, it may not only be the principles and guidance about study design and informed consent but also the critical reasoning and the idea of a dispassionate stance -- one that offers dispassionate justifications for a way of thinking, not a position. That might get us out of the moral police frame that we're often put in and create a kind of critical consciousness among both the scientists and the bioethicists who are working together on these tensions, a kind of self-reflection about the tensions.

DR. GUTMANN: Thanks. Raju?

DR. KUCHERLAPATI: I wanted to make a comment and a question and this is related to education.

The comment is that actually I don't know about all institutions, but at least at our institution everybody gets ethics education. All of the graduate students have a compulsory course that they have to take. All of the fellows have to take a compulsory course that deals with ethics. All of the faculty, at least on certain kinds of things, such as privacy, have to take a course every year and pass it.

And this is not only true for Harvard, right, but it's true for many other places, because everybody who receives NIH training grants in biomedical research is required to have a compulsory course for all of the trainees, and therefore many schools sort of feel that if we're going to train, you know, NIH-funded trainees, then why don't we do it for everybody?

So I thought that maybe there was an impression that was left from this morning's discussion that there was -- that such training is not offered. So that's just a comment.

The question is -- I think there was some discussion this morning about this -- what is the best way of incorporating, you know, this awareness of the ethical issues in science so that all of the people will be able to practice science in the most appropriate way? And some people argued, you know, that required courses may or may not be good.

So I wanted to ask everybody, I mean, what is our current experience and what's the best way that we could get the community, you know, to do ethical science?

DR. KOCH: I think there may be a difference between medical schools and basic science schools. So I ran for many years an NIH training grant, and there's one -- so every Ph.D. student, in their entire career, has to participate in exactly one course that has 10 or 12 hours. That's it. That's the entire ethical training.

I think bioethics needs to start at the undergraduate level and needs to encompass the graduate level. Most basic scientists take the attitude that's prevalent in California. There's a technical, libertarian attitude: whatever is possible -- of course you worry about the ethics, but then you go do it -- and few people think about the larger good.

So, I mean, this is 27 years. It took an encounter this year with two singular individuals. One, I spent a week with His Holiness, the Dalai Lama, who loves science, but he emphasized -- he gently chided us -- that you scientists are not very good if you don't think at all about ameliorating suffering. You just think about your own curiosity, but you don't think about how to heal. And then people like Bill Gates and his singular foundation are totally focused on taking the science and then actually eliminating an entire class of disease. So at the level of the individual basic science researcher, we need more education and more immersion in some of these issues.

DR. GUTMANN: Yeah. This provokes me to -- again, my role is to underscore that it is better than nothing if it's done well at the graduate and professional level, but it needs to start earlier. It's as if you brought someone into professional school -- into medical school -- who hadn't done any basic science, required them to do 12 hours of basic science, and then let them become a doctor.

The ethical issues need to be introduced early to be taken seriously and to get the understanding necessary but the point I want to make is that it's heightened in neuroscience because there are ethical implications to the science itself and if you come to those late in your career, you're playing catch-up with a whole set of understandings that are analytically rigorous as well as very important to human understanding.

So this has been very -- it's very helpful to have scientists who get it and I think we could make a contribution here to enabling science to move forward, where it isn't about the next regulation but really about how we're going to educate ourselves.

So let me ask Mi Young.

DR. CHUN: I want to just extend the content of how we communicate with and educate scientists. The scientist community is perhaps somewhat different from other groups in the sense that they're a much more bottom-up group. They have to convince themselves that this is what they would like to do, such that if such education is vertical -- say, advice from some other group, or a bioethics group, or, you know, professors from law or philosophy telling the scientists how they should think, so that now all of a sudden we can be ethical --

I have to confess that that method may not be very fruitful. Somehow we have to gather more scientists, like Christof, who have been thinking about ethical issues, and they have to be the leaders, spreading the word amongst themselves that these are in certain ways very important elements to think about, because scientists would not be, I think, as receptive if there is a vertical sort of enforcement of how they should be thinking -- that they're not ethical and therefore they should be ethical. I mean, it's a very interesting boundary.

DR. GUTMANN: Well, it is, and I think what you say is ideal, but we often don't live in an ideal world, and sometimes you have to get people's attention. Having unethical science practiced in any field -- one prominent example of it -- is the single best way to derail all of the ethical science that's being done, and that's a disservice to everyone. So one of the reasons our Commission is constituted both by people whose original expertise is science and by people whose original expertise is law and philosophy and so on is to bring people together.

We don't practice any of what we do in a vacuum and scientists, and my daughter is a great young chemist and I'm really proud of the fact that she took rigorous -- at Raju's institution, at Harvard as an undergraduate -- rigorous moral philosophy courses, and it served her very well, even though that's not what she -- day in and day out, she thinks about, driving her lab science forward.

So I think our goal as a Commission is to convince the President and the Administration and the public and scientists that the best way forward is this: good science is ethical science. It's not a matter of police. Just as it's a matter of getting the data right, it's also a matter of getting right the choice of how you do it and what you do. So I appreciate what you've said, and it factors into our thinking.

Both of them, yes. Mildred and Paul?

DR. SOLOMON: Okay. I'd like to make a friendly extension in terms of the age of the audience because several times today we've been talking about the importance of advocating for undergraduate, not just graduate education but undergraduate.

I'd like to make the case that this is something that should be at the secondary school level, as well, both directly for students in high school and preparation of their teachers and not only because it's good for science but also because it's good for our society.

It's a way to develop critical reasoning -- the mode of bioethical analysis, regardless of the positions that people take -- and a strategy for trying to address the problem in our nation right now of extremes between an absolutist position on many moral questions and an extreme ethical relativism.

Bioethics can be a way. I've seen this demonstrated in the project that I mentioned this morning that we're doing with Kent Place School in Summit, New Jersey, and in the work that we did with the NIH. If it's done well, it can really animate young people to want to say, gee, we should be thinking about the implications of our actions. What is the right thing to do here? I don't know. I understand these tensions, whichever ones we want to bring to the fore. And if they're being too subjectivist -- too much on the side of ethical relativism -- or too much on the other side, of absolutism, they can develop an awareness of that through bioethics education.

So, anyway, that's just a pitch to have a younger level.

DR. GUTMANN: Yeah. Paul?

DR. WOLPE: A couple of quick things. First of all, when people ask me what ethics is -- you know, ethics on one foot -- I tell them ethics is how you express your values in the world, and when you talk to science graduate students about how they're trying to express their values in the world through their science, they have a lot to say. And it's been my experience that if you go in to talk to graduate students about ethics and you do any of the things that were just mentioned -- you tell them they're thinking about ethics wrong, you try to give them a different way to think, any of those things -- you've failed before you walked in the door. That's not what an ethics conversation should be.

It has to be evoking of those things that are most noble and right in the other person and then using pedagogical techniques trying to move that conversation to places they may not have thought of before. So there's a lot of bad ethical teaching out there.

Secondly, depending on scientists for this is as problematic as depending on doctors for it -- I taught in medical schools for most of my career -- because the truth is neither physicians nor scientists are trained in ethics, and ethics is more than just being a good person and thinking right. It has a whole literature and canon, ways of thinking, issues that have been decided and agreed upon, and ways of reasoning oneself to a conclusion, and so it takes a certain amount of experience and skill to have an ethical conversation that is more and better than, you know, a group of people sitting around with a scotch talking about ethics, and we need more and better than that when we train our students.

As to what Raju was saying, the problem with the kinds of training that most people get through the NIH-required courses is that they're primarily about regulations. It's regulatory ethics, and that's not really ethics, that's regulation. In the best institutions it goes beyond that, but in many it doesn't. And in my comments earlier, I said scientists have three levels of responsibility.

RCR courses usually only address the first, the actual nature of your hands-on science in your immediate environment. They don't tend to address the second, which is a person's responsibility for their discipline writ large, and they don't address the third, which is their responsibility for society.

And one last thing: we've been talking a lot about sending information out to the public, but science is also about information coming in to science from the public. The public funds most science and therefore has a profound right -- and, I think, responsibility -- to feed back to science the kinds of science they want to see. Their tax dollars fund it. That gives them a right to say something about it.

When you talk about sending neuroscience out into the world, it's not just up to scientists to think about what human flourishing is about; it's about scientists learning from the public what human flourishing is about. And, finally, you know, I have written about, for example, issues of brain imaging as they approach the law and questions like this: if we could actually get a piece of information directly from someone's brain through brain imaging that might be relevant to their case -- information that could have been communicated verbally, for example, but now we can apprehend it directly from the brain -- might that violate a person's Fifth Amendment right against self-incrimination? I think Anita's written about this, too.

The point I make there is not that particular issue but one of the things I'm absolutely convinced about is that if that ever happens and that case starts to go through the courts, it will be the public who basically says to the courts we will not stand for that. That will end up determining what the laws will be.

It will be the feedback of the public who is already wary of a lot of this technology that is going to be a powerful influence on where these technologies will go and this is a two-way street and we need to always keep in mind the other direction because the vast majority of what we've talked about here has been unidirectional.


DR. ALLEN: Thank you. It was mentioned earlier, I think, that last year Zeke Emanuel, who's a distinguished bioethicist at Penn, taught an undergraduate course on ethics. I was actually a co-teacher in that class, along with two other outstanding Penn professors, if I can boast, and we taught legal ethics, medical ethics, and business ethics to 100 Penn undergraduates, and in this class, you know, taught by the likes of Zeke Emanuel, we had two cheaters. I had to pull them aside and work with them on understanding why cheating was not ethical or right.

DR. GUTMANN: But there are also professors of ethics who have committed serious ethical crimes.

DR. ALLEN: This is part of my point --

DR. GUTMANN: Teaching it doesn't guarantee or learning it doesn't --

DR. ALLEN: I want to underscore that my husband teaches in a Catholic college where ethics is required. He had four cheaters in a class of 20 last year.

So we know, and the literature tells us, that undergraduates cheat and engage in other unethical conduct, even in the context of classes about ethics. So I want to get to the question of ethical failure, because we know that professional scientists, like other professionals -- legal professionals; I'm a lawyer, and I'm required to take two hours of ethics training every year because lawyers are not always ethical -- we know that there will be scientists who will be ethical failures, and I'm wondering what your thoughts are about how we can best prepare for the inevitability of ethical failure. How do we prepare for it? How do we perhaps cabin it? How do we disincentivize it? What sort of sanctions should be in place? What sort of protections for research subjects who may be the victims of unethical conduct should be in place? We can't pretend like it's not going to happen. It's going to happen. What do we do about that?

DR. WOLPE: That's too big of a question to answer, you know, comprehensively, but I do want to say this.

As I mentioned before, the National Academy of Sciences wrote a book called Responsible Science in 1992, which became kind of the bible of scientific misconduct. The 1992 version gave a little bit of aspirational talk in the first couple of paragraphs, and then the entire rest of the book was about plagiarism, falsification, fabrication, and other kinds of misconduct.

I'm on the committee to rewrite that book; we're towards the end of the rewrite. We have spent two years discussing exactly the question you just asked, though I should say this is not about human subjects research. This is entirely about non-human-subject science.

And what we thought was crucially important in rewriting this book was to address exactly the questions you raise. How do you not use the bad-apple theory of scientific misconduct -- that is, most scientists are good, this one's bad, and if we eliminate him, we've eliminated the problem -- when the incentives for misconduct are institutional and social, you know, in their local environment and in a broader environment? Some institutions incentivize bad behavior more than others. The enormous pressures on scientists themselves incentivize it.

How do you begin to recreate the scientific environment in a way that maintains the most important parts of it -- that is, scientific productivity -- but, you know, takes away those things that incentivize bad behavior?

So I'll just stop there, but I'm saying there's been a lot of thought going on about this at the National Academies, and hopefully when this book is ready sometime very soon -- we should be finished in the next few months -- it can begin that conversation in a pretty robust way.

DR. GUTMANN: That said, the best incentives in the world -- our Framers wanted to design a Constitution that would work even for devils. It doesn't work even for devils. So it is true that the first and foremost thing that you can do institutionally is, you know, provide the best incentives to do ethical science.

Anita's question still holds, though, and we'll have to grapple with it. One of the incentives has to be to disincentivize people who want to get away with not doing it. So we can't judge -- I do think you would agree, Anita -- we can't judge the success of science courses or ethics courses or whatever by the people who, despite the best efforts and the best teaching, nonetheless don't learn it or don't do the ethical thing.

And I'll speak now not in the role of chair of this commission, but as president of a university. It is very important to have punishments for students who plagiarize, and some people don't get that. If the core value of our institutions -- our academic institutions and scientific institutions -- is truthfulness and the pursuit of truth, then the violation of that, whether it's plagiarizing or cheating, is a very, very serious violation.

So we need the incentives, as Paul said, in the institutions to incentivize doing it right, but it's not just the carrot. We also need the stick when people, despite your best efforts, don't --

PARTICIPANT: For faculty as well.

DR. GUTMANN: Absolutely.

DR. WAGNER: Oh, of course.

DR. GUTMANN: And the faculty of the model, if you --

DR. WAGNER: It's easier for faculty, actually.

DR. GUTMANN: Absolutely. Thank you. I mean I think that's important.

DR. WAGNER: I've been --


DR. WAGNER: No, no, don't apologize. Important to follow up on that.

I've been troubled a little bit, Paul, by your earlier conversation and your earlier statements about the importance of listening to the public and not that that alone is problematic, but I do wonder in this area of what John called science for the sake of healing, if we don't have a more difficult problem than simply listening well. And I'm referring to the fact that while we -- and I guess -- actually, I think it was you, Christof, who was talking about the exuberance if not the hyperbole of scientists. I wonder if they're not, in fact, answering exactly what the public does want.

We have a public around issues of healing and health that is so eager to believe the fantastic. I mean that's the whole issue behind certain forms of nutritional supplements, you know, that can be sold so readily and I can make such great money on hyperbole, on promise.

I wonder if there isn't, in the area of bioethics and biomedical ethics, an even higher standard, a higher issue for us to address owing to that special bias of the public.


DR. SEMPLICI: I also wanted to make this point. This is the dark side of the bottom-up direction, I dare say, because -- well, when it is about incentives, it is not just about money, of course, but research -- maybe philosophical research not so much, but scientific research -- requires a lot of money, a huge amount of money.

So the question is, how many times does it happen that this malicious relation with public opinion is, well, implicitly caused by the necessity to get resources? This is an implicit incentive in the procedures we are used to when it is about allocation of resources in democratic countries, because you need the support of public opinion to justify the allocation of such an amount of resources -- of taxes paid, because it is about taxes paid in the end, at least in the case of public institutions -- and I think the National Institutes of Health is still the great supporter of scientific research in this field in the United States as well.

So I think this is a very difficult point, because everyone hopes that Alzheimer's will be effectively treated --


DR. SEMPLICI: -- one day, one day. Everyone -- maybe President Obama himself -- when he wants to get support for a financial effort in this or that direction, in this case in the direction of neuroscience, points to Alzheimer's disease. So the temptation is strong.

DR. GUTMANN: Yeah. Yeah.

DR. SEMPLICI: Alzheimer's disease is a big issue. In not 40 years, a lot of patients, but the result is there. Just your hand is enough to get it, but it doesn't work. It doesn't work. So --

DR. GUTMANN: Thank you. I'm going to ask Nita to ask her question, then I have a question from a member of our audience.

DR. FARAHANY: So I was struck as Paul was talking about the methodology of ethics education, and how there is a methodology, in fact, of ethics education, but that there is a disconnect between that and what the general public and even many other academic fields think about ethics.

I just spent a yearlong process going through many different committees at Duke getting a master's of bioethics and science policy approved, and what I was struck by was the nature of the questions that I got, most often from faculty colleagues who, you know, are terrific academics who simply had no idea that there was a methodology in bioethics, and whose perspective was: this is really a matter of different opinions, and how do you actually educate people about different opinions?

And so I'm wondering if there's some role to play in helping to translate that, right -- just the methodologies -- to be able to share that there are methodologies, to be able to elucidate what those are and how people can actually do reasoning, and if so, how we do so, how we get that better embedded and better understood so that it isn't viewed just as a matter of differences of opinion -- well, you know, we all just have differences of opinion, and that's what it is.

So I mean if there are any thoughts about how we might do that, I would welcome it.

DR. WOLPE: Just a quick comment on that. This has been a struggle for ethicists for a long time, and I think we've actually come quite a long way. A lot of the success of ethics has been embedding ethics in institutions where people don't recognize that it was ever different or that ethics made those changes -- take human subjects protection as an example. But you're right, it is the only field where everybody seems to think that they are qualified to have the conversation with their students. They don't think that in almost any other field, and that's because everybody believes they're ethical, and they believe they have some ethical insight, and they think that that's all that's needed.

And I think the way in which it becomes clear is when you really confront thorny ethical case studies and dilemmas. Then people often don't know how to think their way through them, and that's where a kind of methodology in ethics becomes really useful.

I don't have a lot of insight into the question of how we could convince people that that's true, but when you actually go into those venues -- I've had ethics sessions with material scientists and chemists and engineers and neural engineers and scientists across the board, and I have never not found students engaged in those conversations deeply, and surprised and somewhat in wonder when we are done at how engaged they were and how much it made them think, and then talking about it for days afterwards, often.

It's a matter of access, but I wish I had the answer to how you magically convince people that this isn't just about sitting around talking about ethics.

DR. GUTMANN: Let me read a question from Robin Parcelle. How will neuro -- Robin, do you want to -- just is Robin here?

How will neuroethics education and findings address the issue of conflict of interest -- conflicts of interest across professions in health sciences and biomedical investigation of human subjects?

DR. WOLPE: I had the great honor when I came to Emory, after being there just a few weeks of being appointed by the man who’s sitting next to you, as the chair of an ad hoc committee on conflict of interest at Emory. It was trial by fire in my new position.

Conflict of interest exists throughout the system, throughout the system and throughout all of our lives. We have conflicts of interest in almost everything we do. What makes a difference -- difference in science is the desire not to let certain kinds of conflict of interest impact the scientific product and we have tried to put in place certain ways to make that less possible.

My big problem with what we put in place is that we always seem to think that disclosure is enough. If I just say what my conflicts of interest are, then I've fulfilled my conflict of interest obligations. And I think people -- because of the number of scandals, and unfortunately, ethics is often driven by scandal -- have begun to see that that isn't enough, and I think we are gaining a more sophisticated set of criteria. I mean journals have gotten together and created new criteria, universities have, so I think we're actually going in the right direction.

DR. GUTMANN: Thank you.

I should say Robin is a graduate student in bioethics at the University of Pennsylvania. So -- Pat -- and then Paton Blough?


DR. GUTMANN: Blough, the founder of Rehinge. And here's Paton's question, and it's preceded by this.

According to the National Institute of Corrections, the United States has had a drop in psychiatric hospital beds from 700,000 forty years ago to less than 70,000 beds today. During this time, we have grown a mentally ill prison population of 1.2 million. The largest acute care facility is the LA County Jail.

As neuroscience continues to advance, it appears that it will continue to prove that medical brain issues can cause criminal behavior. In that context, what are the implications that are coming to our legal system?


MR. MONTGOMERY: Well, you'll decide that for the States. I mean we have a similar pattern in the UK. We lost a lot of beds, I think for good reasons as well as for bad reasons, in that we were institutionalizing people who didn't need to be institutionalized. What we didn't do so well was to get the infrastructure for community-based care funded properly as we did it. But I think, you know, we see a big tension in our mental health legislation between therapeutic justifications for restrictions on people's choices -- or maybe disregarding their expressed choices because you think they're impaired by disease conditions -- and public protection issues, and I would imagine you would see the same over here. You see a political oscillation between the two.

We have institutionalized improved care in custodial institutions and put it under the responsibility of our health services rather than the prison services. But in relation to the particular question on neuroscience, I mean I think it's all about predictions of dangerousness -- that is where I would expect to see the activity -- and as we discussed earlier, it's a little bit like the responsibility question.

I don't think it's a completely new question. I think every time you get a new scientific opportunity, people think through how this applies -- so I've seen presentations on MRI scans as predictions of whether people will be violent and the like, and I would expect it to be an issue there.

DR. GUTMANN: Not an easy question to answer, but something that we -- well, let me just stop there because it's 3:30 and say thank you to all of you. We have our work cut out for us. We were very enlightened by your presentations, and again, happy holidays to everybody.


(Whereupon, at 3:30 p.m., the meeting was concluded.)

This is a work of the U.S. Government and is not subject to copyright protection in the United States. Foreign copyrights may apply.