TRANSCRIPT, Meeting 8 Session 5

Date: February 2, 2012

Location: San Francisco, California

Presenters: Roundtable

Transcript

            DR. GUTMANN:  So, thank you all, first, for your presentations and second, for participating in a roundtable, and square roundtable, yes, and if we have -- do we have Dan's slide about attitudes towards privacy that we could put up, so, why don't we -- because we had asked Dan about what they found.  So, Dan, why don't you just say a few -- just explain what you found, just because I'm going to ask a question about privacy to everybody.

            DR. MASYS:  Okay.

            DR. GUTMANN:  Go ahead.

            DR. MASYS:  So, this was a survey of 5,000 patients who came through the door at Vanderbilt, and before the DNA biobank or the mechanism of acquisition of samples was ever implemented, several years before, and it was to survey attitudes towards this opt-out approach.

            And so you see in the green that about one-third of the people are passively favorable to the idea, one-quarter are actively science-supportive.  They want this to move forward.  About one out of five are affirmatively altruistic in our part of the country, that is in Tennessee, where they really want to help somebody else, even if it didn't help them, and then you see about 20 percent who are either skeptics or that five percent who are decisively opposed, believe this is the wrong way to do it, and again, that corresponds over more than a million consent events to exactly the fraction that opt out of the biobank.

            DR. GUTMANN:  Good, thanks.

            DR. WAGNER:  That 16 percent don't opt out, I'm sorry?

            DR. MASYS:  It's interesting that people had expressed concern, but apparently it doesn't seem to be enough concern for them to act.

            DR. GUTMANN:  Well, thank you. Thank you for the information.

            DR. WAGNER:  That's interesting.

            DR. GUTMANN:  Whatever you want to extrapolate from it, it does show, as many surveys show, that -- it shows the diversity of views out there.  So, we wanted to enter that into the record.  So, thanks, and you can take it down, now.

            DR. OSSORIO:  Could I ask a question about what counts as passively favorable there?  What would that mean?

            DR. MASYS:  So, there were --

            DR. GUTMANN:  But if you --

            DR. MASYS:  Okay.

            DR. GUTMANN:  Yes, use the microphone.

            DR. MASYS:  All right, so, passively favorable were questions on the survey about their general attitudes towards medical science; whether they were in favor of it as a public good or whether they were skeptical or believed that science was only done for commercial gain and those sorts of things.

            We can get you a copy of the survey, but that was the source of that chart element.

            DR. GUTMANN:  Now, we've entered that into the record.

            Now, what we would like to ask you all very quickly to do, and then we're going to open it up to questions from the Commission, this has become a tradition and it's one that has seemed to yield positive results from what the Commission and our audience can learn from asking this question, and that is to ask you what your primary, if you had to pick one, and it's going to be a brief -- express it briefly, what your primary concern related to advances in whole genome sequencing is, and it can be a concern on the upside or the downside, but I should -- I will say that the Commission is very likely to focus our report on issues of privacy, broadly construed, and we have not yet defined that breadth, but we will as we move forward.

            So, anything that is relevant to the issue of privacy will be of great interest to the Commission.

            So, just so you recognize that we're not prejudicing this in any way: the primary concern could be that privacy is taken too seriously and it's seen as something more serious than it actually is, or it could be that there needs to be more attention to privacy, or anything in between or around that.

            So, let me -- let's begin on -- what?  There you go, with John. Let's begin with John.

            MR. WILBANKS:  I was trying to move.  I'm going to go sit on the other end.

            DR. GUTMANN:  It's always dangerous to make eye contact with the Chair.  John, you did.

            (Laughter.)

            MR. WILBANKS:  I'll remember that, the next time I speak at a Presidential Commission.

            I think my biggest concern would be that we do not find a balance that maintains whatever the balance -- whatever the privacy boundaries are that we want to maintain, with the capacity of whole genome sequencing to let us actually start tying traits and gene variation.

            Because right now, we are really not that good at causally tying genetic variations to traits, and I think that is the promise of whole genome sequencing and my fear is that we don't get the balance right in a way that lets us get at that, as fast as we can.

            DR. GUTMANN:  Good, well expressed.  Jane, and please use the microphone because that is the only way it can be webcast.

            DR. KAYE:  Thank you.  I think my concerns are similar to John's, really.

            I suppose, though, I would say that I see that whole genome sequencing is just another twist on things that are happening already in science, and my concern would be that actually we do need to take very seriously privacy concerns, but at the moment, we actually just -- so, what we need to do is make that more nuanced and allow individuals to have a say about the way in which whole genomes are used or who has access to them.

            DR. GUTMANN:  Okay.  Mark?

            DR. ROTHSTEIN:  My concern is that the clinical utility at this point will not outweigh the privacy risks to individuals, both from their psychological response and also to what third parties might do, based on the information that they could obtain.

            DR. GUTMANN:  Thank you.  George?

            DR. ANNAS:  My main concern is the historical one, that this is fundamentally a reductionistic, deterministic model and the more we look at genes, the less we're going to look at the whole person and the less we're going to take that person's whole life into account, and I think that is a big problem, for privacy, but for just basic control over your life.

            DR. GUTMANN: So, more than privacy, just the whole --

            DR. ANNAS:  Well, privacy writ large, I guess.

            DR. GUTMANN:  Okay, good.  Pilar?

            DR. OSSORIO:  So, I guess my main concern is that we have used anonymity as the way to protect research participants from informational harms for the last 50 or 100 years, at least since the Common Rule went into effect, and we're not taking seriously the fact that we really are having decreasing amounts of anonymity and a decreasing ability to promise that to people, and we're not developing in creative ways alternative governance structures that would protect people and minimize harms, when we can't actually promise them anonymity.

            DR. GUTMANN:  Melissa?

            MS. MOURGES:  Well, I think knowledge is power, and I would hate to give up the knowledge that you can gain through this project, and I think that we figured out how to do it with forensic DNA databasing, and you are a room full of incredibly smart people, and I think you'll be able to figure this out, too.

            DR. GUTMANN:  Thank you.  Flattery will get you everywhere, Melissa.

            (Laughter.)

            DR. WAGNER:  Actually, that was my concern.

            DR. MASYS:  So, metaphorically, your genome has been called the book of humanity, and if it is, then it's a book of which we understand perhaps one-third of the words.  We can barely speak a single sentence, and not even so much as one full paragraph.

            And so that is where we stand, with the understanding of the incomplete genome and of biological science and health and disease.

            So, my major concern would be that privacy concerns not materially inhibit the rapidity with which we can decode that book and understand what it means.

            DR. GIBBS:  Thank you. I share John and Daniel's view on this balance between privacy and advancing science.

            I'd like to add in, though, a concern about a lack of the right infrastructure and process to advance genetic literacy and genomic literacy as ubiquitously as we possibly can.

            MS. BEERY:  And I agree with Melissa, with the flattery, and I agree, as well, with -- I think that information is a good thing and I think that it's a good thing that can be used in diagnosing a rare disease.

            I think that it's something that should be shared in a broader format, and again I leave that up to you to make all the decisions on how to go about that, but I think that it's a very good thing to share.

            DR. GUTMANN:  Thank you. So, you can see there is a range of concerns and I should say, from the outset, and I know I can speak for the Commission, that the Commission Members share the whole range of concerns, and that we have to tackle them and come up with some very clear, nuanced view about how to accommodate concerns for privacy and anonymity, at the same time as really support the progress of the kind of science that really holds out great, not just hope, but enormous potential and actually some real benefits for individuals, as well as society.

            So, that is our charge, if you will, it's not the solution, but actually the charge, and with that, taking the Chairman's prerogative of saying what the charge is, but not the solution, I'm going to open up for our Commission Members to ask questions, and since Nita had a follow up before, I'm going to begin with Nita, since we now have the time.  We have extra time, now.

            DR. FARAHANY:  Great. So, it was really helpful to hear all of your concerns.  I was hoping you could build on that just a little bit more, because what I'm really trying to understand is exactly what the fear is of information and transparency, and understand what the fear is of moving from a society where we had greater anonymity to a society where we have lesser anonymity.

            And what I am imagining is a world, not of asymmetric data, where the government has data and individuals don't, but a society of transparency, where there is actual information that is available across the board, to researchers, to individuals, to commercial entities, to government entities, both ways.

            And so, some of the things I've heard from George, for example, as a fear about reductionism or eugenics, a fear of reviving the American eugenics movement, a fear of discrimination, but I was hoping that people could articulate what is it that -- what is the fear exactly about giving up anonymity and going into a society of complete transparency, including transparency of genetic information?

            DR. GUTMANN:  Mark, do you want to take a stab at that, and, George, I would ask you, also, and, Pilar?

            DR. ROTHSTEIN:  Well, I'll take a stab as to one part of it.

            You know, when the Genome Project got off the ground, there was concern about many of these same issues, of course, and one of the common answers to those of us who had concerns, was that look, every person on Earth has five or eight or 12 deleterious mutations, and once everyone's mutations are on the table, we'll know that we're all lemons, genetically, and there won't be a problem.

            Well, that was, I think, a very naive way of looking at it.  Even though we all have genetic flaws, the consequences of those flaws are markedly different, right? So that if I am at risk of dying suddenly, my health insurance company probably doesn't really care about that, but my life insurance company would.

            If I'm at risk of a long illness that is going to debilitate me, but last 30 years and I'm going to die at 90, after running up huge medical bills, it's just going to be reversed.

            So, the social consequences of every one of these aberrations are totally different and so are the third parties who have an interest in the projected future health of people, and that would include the whole range of groups that I mentioned earlier, and it seems to me that it's a policy question that we haven't addressed yet in this country very well, as to which risks we want to socialize, which risks we want to let the market operate on, which risks are combinations of the market and socialization, and until we reach some sort of understanding of that, when you just drop a whole pile of information, some of which may be of better quality than others, I think there is a real concern about what is going to happen and unintended consequences.

            DR. GUTMANN:  George?  Use your microphone.

            DR. ANNAS:  I should have learned how to do that.

            I take it that the government's main job is to protect our freedom and to let us fulfill our lives in ways that we want to do that, and if people want to make their genetic information public, and everything else, then -- on Facebook, that is fine with me.

            But it would strike me that a regime that would say, this stuff is automatically, all your private information is automatically public and automatically known to everyone, including the government, is not the kind of country I think most people in this country want to live in.

            I know it's not the way the Tea Party wants to live, but let's face it, 70 percent of Americans say they don't want a mandate to buy health insurance, which could only help them.

            I cannot imagine, if you asked how many Americans think that it should be mandated that their genome be used in research or be known to other people, that the number would be very high; it would be under 10 percent, would be my guess, who would think that is a good idea, and I would be with them.  I think that is a bad idea.

            I think it doesn't just cover genetics, it covers their medical information, as Mark said, it should cover their banking information, it should cover a lot of their personal information, and we seem to be starting to -- I mean, it's the reason we heard about CODIS, which was a great talk on CODIS -- why aren't people upset about that?

            It's because they're very careful to use it just for identification.  If they started using it for other purposes, people would have second thoughts about that, as well.

            So, I think, you know, to try to avoid a surveillance society and to try to maximize the choices individuals have, I would like to keep the choice of whether to make private data public, an individual choice.

            DR. GUTMANN:  Pilar?

            DR. OSSORIO:  I have a few comments.  One is, I think too much transparency, and this is very general, would actually diminish human flourishing in important ways.

            So, for instance, talking particularly about privacy, rather than anonymity, having types of information and physical places where we can limit access to ourselves, helps us to form relationships, right, and helps us to understand and develop intimacies, and part of how we define our relationships is by deciding what information I give you about me, and what information we exchange, and what information I exchange with my parents is different than what information I exchange with my intimate partners.

            And I think losing so much control over one's information or one's physical space even, losing that kind of privacy would actually inhibit our ability to form intimate relationships.

            I also think that -- so, I'm with Mark on this idea that the consequences of different kinds of information disclosure, whether genomic or broader medical information disclosure, vary, vary depending on what kind of information you're talking about and also who, whose information you're talking about.

            So, if you're Jim Watson or Steven Pinker and you want to put your entire medical information and your entire genome out there in the world, well, great, because you're a very high status person.  You have enough money to pay for your own health insurance, no matter -- and your own life insurance, and everything else, your reputational capital is not going to be undermined by this kind of information.

            If you're not Steven Pinker and you're someone who is 25, and there is information in your medical record or in your genome, you know, your risks might be much greater.

            Also, if you are older, a lot of things in your actual genome, either the risks will have materialized or they won't.  So, the risks to you, in a sense, from the genome information, I think decrease as you age because some of the things that are just probabilistic in there, you know, Steven Pinker is probably not going to go bald tomorrow, even though apparently, there is something in his genome that suggests he could go bald, right?

            So, I think it's -- you know, there is nuance in there, right, about whose information it is and which information it is.  So, even though we all have genomes, and we all have risks in there, it's not that we're all equally at risk, right?

            And finally, on this whole idea that transparency is so desirable, you know, some of the studies I've seen suggest that people who are, say, 20 and younger are understanding, much more than sometimes their parents or, you know, the people who are, say, 25 and older, that transparency could actually have consequences that they don't like, right?

            And so, now, there are whole industries of people who are trying to scrub things from your web, from the web about you, and there are kids who are, you know, insisting that their friends don't take a photo of me and put it on Facebook because they've seen the consequences, and I think some of this enthusiasm for total social networking and total disclosure comes from people who either don't actually understand what they're disclosing, or who haven't experienced any consequences yet.

            So, you have probably seen just what we've seen in our law school and in our medical school, that you know, some students are shocked, simply shocked when employers come back and say, "You know, by the way, there were a couple of your students we didn't hire because we saw these debauched photos of them on Facebook and we don't want our clients seeing that," right.

            Then the students are simply shocked, and apparently, up until then, they had never experienced any consequence, right, but when people do, suddenly, they don't think that transparency is such a great thing, and there are just all kinds of consequences, so --

            DR. GUTMANN:  Jim?

            DR. WAGNER:  Pilar, your comments actually lead us into something, you mentioned human flourishing, and I know, John, you cautioned against getting into philosophy, but I wonder if we shouldn't here.

            I'm not sure if the concern, Nita, is as much about -- or I'd like you to comment, Pilar, perhaps, losing control was the language you used.

            The other dimension we haven't talked about is the risk of awareness.  I mean, part of human flourishing depends, in part, on the ability to have aspirations, even unrealistic aspirations, and if a young person wishes to aspire for some period in their lives to be a professional basketball player, in spite of the fact that the genome might predict that they're going to be 5'6", it would diminish -- the awareness of that would diminish flourishing.

            We also guard heavily, our right to put up a facade.  Very few of us -- most of us, I should put it this way, are wonderful at hiding the inner-child that we expose only to a few number -- to a few people.

            To the extent that awareness of the genome, rather than just control of it, compromises my false pleasure in having unrealistic aspiration and tears down my ability to put up a facade, are just two elements, it seems to me, that attack -- that undermine human flourishing.

            Is it just a matter of control, in which case, that is around privacy, or is it awareness, which is more around anonymity, that other people would be aware, or even I would be aware of some of these things. Is there a distinction?  Do you see the same distinction that I think I'm beginning to see?

            DR. OSSORIO:  I'm not sure I see the same distinction, but I certainly -- so, I guess there are two different things.

            One is, I have a fair amount of confidence that people would maintain the ability to have sort of dreams and potentially unrealistic expectations.  Although there is, again, a lot of data to suggest that the expectations that other people have of you, do shape your own -- not only your expectations, but your actual achievements, right.

            And so, to the extent that not just I learned about my information and had expectations about myself, but that, say, my teachers drew some conclusions, whether accurate or not, about my capacities and abilities to excel based on what they found, you know, in maybe my genome or medical information connected to other data about me, that might influence my actual performance in ways that would be not so great.

            I think that kind of thing goes on anyway, and so it's not clear whether we would just be substituting one set of pressures and social sort of expectations for another.

            MR. WILBANKS:  So, just quick, I mean, I think it's easy to try to get caught up in ontology, right, I studied semantics, so I know exactly how easy it is.

            But I'm not sure that this is as clean cut as something you can put into a taxonomy or an ontology, and when we talk about it, in our project, we talk more in the language of estuaries, which is that it's -- you know, estuaries aren't completely saline and they aren't completely fresh water, and privacy feels more like an estuary than it does something that is all salt or all fresh because there is lots of places where I might be willing to tell my friends something that Jane said, but I wouldn't be willing to tell it to a reporter.

            And so that is one of the reasons why it's hard for me to answer questions like Nita's question, is that I really don't think of it as having these clean borders because I make very different decisions with who I show my inner-self to, and there are some kinds of data that I might want to make fully free and some kinds of data that I might want to make fully private, and there are some kinds of data where it depends on who you are and what you want to do with it.

            And that is the hard part about coming up with clean divisions and definitions in this space, and that's why it's interesting and that's why it's frustrating.

            DR. GUTMANN:  Let me try to make this a little bit more specific because having been trained as a moral and political philosopher, I realize you don't -- you only get far if you actually bring things a little bit down, down to earth.

            Melissa said, in response to a question at the end of your session, that the police would have a lot more information now, if they had access to medical records.

            So here is a specific question. What is the difference, if any, between the ethics of access to medical records and the ethics of access to genomic information, to somebody's whole genome sequencing?

            Is there any difference or is -- essentially, is it -- should we be asking that -- I mean, should we be asking, look, we have an ethics now, in practice, of access and the inability to access people's medical histories, and is genetic information and the whole genome sequencing of a piece with that ethics, or is there something qualitatively or quantitatively different here?  Daniel?

            DR. MASYS:  I think this is one place --

            DR. GUTMANN:  I'm going to take the liberty of calling you all by your first names.

            DR. MASYS:  I think this is one place where actually genetics could be wrapped up with all other medically relevant clinical information, and if -- the trust is an essential instrumentality of the health service enterprise, the trust between a clinician and a patient.

            And to the extent that you overlay secondary uses, specifically for law enforcement, I think you get a severe disruption of the instrumentality of trust in healthcare, to the extent that people won't go, won't say, will not tell doctors things that are important for decisions that may affect -- they may die from a medicine they're prescribed with good intentions because of their unwillingness to release that information.

            So, I think there -- clearly, I think there is a sensibility that the use of information -- the joining of the uses of information has very profound effects on people's willingness to disclose it, even the first time, to anyone.

            DR. GUTMANN:  So, and that would be so, similar to the information about your genome, because it would be a deterrent for people to make public their genomic information in order to get it assessed and to get medical help, services, if they thought it could then also be used in all of these other ways.

            And to the extent that people don't think about that, they are like the innocent people who put their various -- the “innocent people”, I'm putting in quotes, because you can use that in different ways, who put information on Facebook, who don't really think about, or haven't been taught to think about the unintended consequences of that.  Raju?

            DR. KUCHERLAPATI:  I want to make a couple of statements and ask all of you a question.

            This is like, you know, 2012, right, and as Richard pointed out, over the last --  

            DR. GUTMANN:  Raju, you stand corrected, this is 2012.

            DR. KUCHERLAPATI:  So, in the last 10 years, the cost of sequencing, including interpretation, has gone down tremendously, very significantly, and Richard knows exactly the amount of money that he spends to be able to sequence the whole genome and interpret the genome, and I think it's probably somewhere around $10,000. All right, number one.

            Number two is that as Richard pointed out, how many sequences have been -- how many genomes have been sequenced, and the rate at which they're going to be sequenced?  This is going to -- this is happening and this is going to continue to happen.

            The third fact is that the information that we have about the genome is incomplete, but I would argue that there is no point in human history at which any knowledge, about any enterprise, including the human genome sequence or the human body, is going to be complete.

            We are going to be on this Earth for another billion years or so, God willing, and we probably will still not have all the information that we need, all right.

            So, given all of those facts, and I heard all of the concerns that all of you have, now, here are the choices for this Commission to think about, right?  One is that one could say, we should not do whole genome sequencing because of all of these concerns, and we should make a recommendation to completely prevent it in any way that we can make it happen.

            DR. GUTMANN:  Okay.

            DR. KUCHERLAPATI:  Right?

            DR. GUTMANN:  Straw man, shot down.

            DR. KUCHERLAPATI:  Straw man, shot it down.

            DR. GUTMANN:  All right.

            DR. KUCHERLAPATI:  All right, the --

            DR. GUTMANN:  Nobody would ever -- nobody -- I would assume --

            DR. KUCHERLAPATI:  No, no, no, let me --

            DR. GUTMANN:  All right.

            DR. KUCHERLAPATI:  Let me make the point.

            DR. GUTMANN:  Good.

            DR. KUCHERLAPATI:  You understand exactly where I'm going.

            DR. GUTMANN:  Good.

            DR. KUCHERLAPATI:  All right, the other aspect of it, be able to say that, you know, all of these sorts of things, nothing, no regulations, no, nothing, let it go, right, and I would imagine everybody sort of falls somewhere in between.

            So, I want to know where that in between is for each of you?  What -- because I'm still trying to get at it, and be able -- everybody says, “I'm concerned about this and I'm concerned about that”.  But if you were to make a recommendation, right, to somebody who is going to listen to you, what would that recommendation be?

            DR. GUTMANN:  Good.

            MS. BEERY:  I'll take that, and I'll look at you in the eye.

            DR. GUTMANN:  Yes.

            MS. BEERY:  So, I am here representing other patients, right?  Everyone is -- has their field of expertise, and I talk with patients, I talk with parents of patients on a weekly basis, and out of every 100 patients or parents that I talk to, there is maybe one that says, "Please don't share this information."

            So, I am definitely in that less than 10 percent that was spoken about, when Richard and Matthew Bainbridge called me about the paper that was published in June of last year. 

            They told me that we would be anonymous, and I said, "Do you know, make it -- I want it to be -- I want a person, a name to go with the story," and it wasn't because I wanted our name in the story. I wanted it to come to life.

            And so, I believe that there are a lot of patients that are looking for answers, and I know for a fact, people that I've talked to, that are either in the process of having their whole genome sequenced, or they're on a list to get their whole genome sequenced, that are sharing information with other people in different avenues and different venues, and they're doing this for the purposes of discovery, of helping other people, or of helping their own children, and there is a lady that called me in 2005, her first name is Heather and her last name escapes me, I think it's Long.

            She is working on -- with a couple of Congressmen on a registry, it's called CAL, C-A-L, and the acronym is actually taken from her son, who died at age five, undiagnosed, and she had blood drawn that she preserved, and she is trying to get that blood sequenced because she wants to use it to help other patients.

            She wants a registry that people can go into, and put in the information of the symptoms, you know, what the symptomatology is, what testing they've had done, what's come up positive, what's come up negative, a source that can help other people.

            And so, when you have a diagnosis, you can put that in the registry, and then someone that's been logging can have a sign that says, this is a diagnosis for your child, as well.

            DR. GUTMANN:  Retta, that's extremely helpful because it gives us a start, but as Raju recognized, in asking, it doesn't cover the whole territory in between because what you've told us is patients who stand to benefit from this, and consent, should be able to share their information.

            MS. BEERY:  Correct.

            DR. GUTMANN:  And that's a part of the universe that exists between shutting it down, which we're not going to do, or making it mandatory that everyone share, right?

            MS. BEERY:  Right.

            DR. GUTMANN:  So you want it to be easier for patients to share information and get the medical benefits of that.

            MS. BEERY:  Absolutely, absolutely.

            DR. GUTMANN:  John?

            MR. WILBANKS:  I want to just echo one of Mark's points, which is that there is -- there is a gulf right now between the clinical efficacy of the information and the ability to be hurt by some of the information.

            So I fall into that 10 percent; the screenshot of 23andMe, of the guy with elevated prostate cancer risk, that's me.  I've disclosed that elsewhere, so this is not a big deal, and I can get harmed by that, more than I can get helped by that today, and that's the gap that we're in.

            And so the question -- one of the questions is how do you accelerate -- or how do you sort of decrease the gap between the clinical efficacy and the capacity for harm?

            And so some of us are making a gamble that that is going to happen fast enough that a cure will come out and that is better than the harm that would come from having different insurance rates.

            But the -- I think one of the reasons why you have the gradient is that there is -- whether you sort of scientifically know it or you just feel it in your gut, there is a gap between having your genome on a disk, and being able to make good decisions about your healthcare or get therapy for it.

            I would try to balance that by asking how we accelerate the closing of that gap, because right now, it's much easier to get screwed because of your genome than it is to get cured, and that is not ideal.

            DR. GUTMANN:  Dan?

            DR. SULMASY:  A question that came up briefly a little earlier, but I don't think has been fully addressed is the -- a question of how we handle informed consent for persons who might have potential moral objections to research that would be done using this data in the future, whether that is the Havasupai Indians or people who have, for instance, pro-life objections to what might be done with it, how should we handle those sorts of questions in terms of informed consent?

            DR. KAYE:  That's a great question, and I think that to do this, we've got to have dynamic patient interfaces, because at the moment, our informed consent processes are paper-based.

            They're right at the beginning of the research process, but we're actually talking about secondary, tertiary uses of information.

            And so, if you have a patient interface that actually can provide information to individuals about how information is being used, if necessary, go back for consent and those parameters have to be carefully thought out so that you can actually ensure that research can happen, but also that individual privacy is protected.

            I think that you need patient interfaces to do that, and there are a number of them out there that are in development, and I think one of the things that we need to do is actually bring them into publicly funded research, as well, and we need to start thinking about having these mechanisms used more broadly.

            And as John said earlier, we need investment in that infrastructure to really enable it because the real danger is that if we have any breaches of privacy or things start to go wrong -- we've seen it in the UK with Alder Hey -- then things go seriously bad, and I think it's really important to be operating on a precautionary principle, really.  So, that's it.

            DR. GUTMANN:  Pilar?

            DR. OSSORIO:  I guess this is maybe kind of a response to a couple of things that have been said.

            One is that I'm not sure exactly what the task here is, but whole genome sequencing has been mentioned a lot, and I think it's a mistake just to focus on that in its current form, because, of course, we're also doing epigenomics, transcriptomics, proteomics, and I think we ought to be thinking about the implications of this kind of whole molecular profiling of people's cells and their body, because it's not just the implications of the DNA sequence, but the epigenetic markings of it, and those technologies and our advances there, maybe are behind the sequencing technologies right now, but they're on that same trajectory of, you know, kind of large-scale, high-throughput processes that are being developed, and I mean, I think that from a research perspective, integrating those things is going to be the direction of the future, in order to get the most research benefit.

            And so, we ought to be thinking now about our governance mechanisms handling all of those data in the research context, and probably in the medical context, too.

            DR. GUTMANN:  Daniel?

            DR. MASYS:  So, consent in this era of -- this early era of wide open discovery, is a very difficult thing, because what you would put on a consent form of what might be discovered in genomic associations, as we've discovered with this PheWAS scanning, you know, you anchored this on one disease, and then lo and behold, you find previously unsuspected associations with diseases that some people might consider socially stigmatizing.

            And if they had known you would discover that fact about them, then they would have not given you the consent, but how can you do that discovery research, without allowing the possibility that you will discover unfavorable things, so to speak, from a societal context?

            So, some notion of consent that allows that we may not all get happy answers from the science results from the use of this DNA, is perhaps, what is placed before us, that differs from the traditional clinical trial consent that we're quite familiar with.

            MR. WILBANKS:  And that's actually exactly what we've had to put into ours, is what was in the Personal Genome Project's and it's -- it was this laundry list of bad things that can happen to you, having your DNA sequence placed at a crime scene, right, finding out that your dad is not your dad, having diseases that are socially stigmatizing, all of those things are in the recitation of considerations that we've had to include, both in the form itself, and in the wizard that you go through, they can't be obscured in the interface.

            DR. GUTMANN:  I'm going to ask Christine to ask a question, but I do want to remind members of the audience, if you'd like to ask a question, there are just cards, please, just jot it down and a member of the staff will pass it up, and if you don't have a card and want one, if you raise your hand, a staff member will give you one.  Christine?

            DR. GRADY:  Since we've focused a lot on consent and voluntariness and people's choice about making information available, one of the things I've wondered about a lot over the course of thinking about this issue is children, how do we think about children? Either at birth, or any time before they come of age to make their own decisions?

            So, I wondered if any of you have had -- have ideas about how we should think about children, with respect to whole genome sequencing or all the other things -- I agree with you -- that we may not know as much about today, but we will tomorrow.

            DR. MASYS:  I have only a small anecdote.  We delayed the launch of the pediatric biobank at Vanderbilt because we were told, just don't go there.  They're a protected population.  Start with adults, because maybe it doesn't even work in adults.

            And so, after doing all of the same sort of attitudinal surveys and multiple cultural groups and such, they launched the biobank in pediatrics and found exactly the same opt-out rate and in terms of that pie chart of attitudes, the only thing that was somewhat different was a much stronger altruism signal of parents who say, “If you can't help my child, maybe it will help the next one.”

            So, I think our notion that there is a unique and special set of considerations, which we thought would be the case -- at least in that limited setting, it appears pretty much to inherit the same set of concerns as in adults.

            DR. GRADY:  Can I follow that, though?  That's the parents you're talking about, right?

            DR. MASYS:  The parents, because the parents were the ones who had to give consent.

            DR. GRADY:  Sure, so, I mean, one of the things I don't think we know yet, but I'm just sort of thinking ahead, so, these are enduring -- you raised it earlier, enduring samples or data that can be gone back to hundreds of times over the next decade and interpreted for different things, and some of those children will grow up, and then what?

            DR. MASYS:  Well, we do -- have actually set up that biobank, so that when someone transitions from 17 to 18, it trips an adult consent event, and if they opt out, even though their parent might have had them in the biobank, that sample is no longer available for use.

            So, it does -- that transition to adulthood grants the chance to make the decision a second time, which might be different than the one your parents made.

            DR. GUTMANN:  I have Nelson -- Lonnie, did you want to follow up on this?  Please put your microphone on.

            MS. ALI:  Yes, then what happens to the information that is already out there?  I mean, it's sort of like the damage is -- I hate to say damage, but it's out there.

            I mean, once it's like out of your mouth, it's gone.

            DR. MASYS:  That's right, and the papers are published and, in essence, we can't retract literature or scientific analyses that are already done.

            So, it's -- you can limit or deny all future uses, but there is no way to take it out once it's done.

            DR. ANNAS:  Right, so the real question is whether parents have the authority to authorize their children to be screened for at least late-onset disorders.  There is no question, they can have them screened for -- to try to discover what disease they have today or what disease they might get while they're children.

            But I don't know if that's technologically possible even, to segregate that information and not do a test for BRCA1 and BRCA2, just automatically, when you're doing everything else.

            But if you do permit that to be done, basically, you're saying you don't really believe in the right to privacy, because you're going to take it away from your kids and they'll never have it again.

            DR. ROTHSTEIN:  Let me just add that it was an article of faith in the genetic counseling community for many years, and in medical genetics, that you didn't test children for disorders that wouldn't manifest during childhood.  You would wait until they could exercise their own autonomy when they became adults.

            And those who are advocating sort of newborn whole genome sequencing, are just sort of throwing that out, in one swoop, under the theory that the benefits will outweigh all these -- I don't know that there is evidence of that yet.

            DR. GRADY:  I've also heard people say, “Well, you just won't look at some of those other things.”  They will be there, but you won't look at them, right.

            DR. GUTMANN:  So, it is -- this isn't -- it is an interesting answer to how whole genome sequencing is different than very targeted information, and that's something we have to come to grips with.

            MS. BEERY:  And I was just going to add, I know that in cases like I'm talking about in my -- in cases like my own children, where it's already out there, they're the cases that they actually have a rare disorder, they have something that we're trying to discover.

            So, I understand that that is different than parents just having their typical children sequenced.

            DR. GUTMANN:  Good.  I have Barbara next.

            DR. ATKINSON:  I'm interested because I really don't know the answer to it.  You may have some data on it.

            Are there any aspects of this relative to minority populations or disadvantaged populations that we should be thinking about, relative to the ethics of all of this, and did you do any analysis to see whether they had a different response rate than anybody else? Just to be sure that we don't miss some ethical issue relative to disadvantaged populations.

            DR. MASYS:  The suspicion of medical research and the stigma of highly publicized past wrongs against underprivileged populations and minorities were certainly evident in our surveys.

            On the other side, interestingly, in the eMERGE network, Northwestern and Vanderbilt were able, actually through the reuse of genome scans that were acquired for different primary phenotypes, to do the first ever African American Type 2 diabetes GWAS because no one had ever been able to enroll enough of those people.

            But in using this model that the scans were -- you could acquire it for one clinical condition, but there would be other coexisting diseases, and so, it looks like there is an egalitarian benefit that helps to overcome that bias against biomedical research in these acquisition models.

            It wasn't the primary design, but it is -- it seems to be a salutary effect of having done it that way.

            DR. OSSORIO:  So, there are a couple of things.  One is, there are a ton of studies out there that I'm sure most of you know about, showing higher levels of suspicion among minority populations of various sorts. Not necessarily different rates of willingness to participate, even though there are higher levels of suspicion, and some sociology studies that I've seen have shown both higher levels of suspicion, but also, surprisingly, higher levels of enthusiasm about the possible benefits of medicine.

            But I think there are some other things to take into account, besides attitudinal information, you know.  One is, who is less likely to have health insurance?

            So, who is less likely to benefit, until we are in a world where we have universal healthcare, and this kind of goes to a question you asked to Mark and George this morning.

            I would have a lot more sympathy for the kind of free-rider concern as a major overriding concern, if we were in a world where our health systems generally showed this kind of concern for the common welfare and common good, and then the idea that, well, we might at least go with an opt-out approach because, you know, there is some obligation that comes with your right to healthcare, right?

            But given that people don't have a right to healthcare, and many don't, and it's not evenly distributed in terms of who doesn't, I'm a lot less sympathetic to that set of concerns about free-riding.

            In addition, there are many ways in which people can participate in medical research.  They don't necessarily have to participate by giving their genomes over.

            DR. GUTMANN:  Again, to be specific, you would be sympathetic, according to what you said, for the concern about free-riding for anyone who does have access to health, full access to healthcare.

            DR. OSSORIO:  Full access to healthcare, but with the caveat that there are lots of ways to participate, of giving back and contributing to the advance of science.

            So, it's not clear to me that any particular kind of science you have, should be opted into because of that concern for free-riding.

            So, that is one thing, and the second thing is, even when people of color are in the medical system, they're not necessarily treated fairly or respectfully.

            So, we have the National Academy study that came out a few years ago, and so, I think there are issues there that we ought to be paying attention to.

            DR. GUTMANN:  Nelson?

            COLONEL MICHAEL:  I'm going to ask a question that's very related to what Barbara asked, which is, I assume that -- and I won't make an assumption that it just goes forward.  We're more likely to get the Rosetta Stone of how to read the human genome by the involvement of those individuals that are more likely to want to participate in research, and as a consequence of that, then individuals that are disenfranchised or stigmatized are going to be potentially less likely to be involved.

            I mean, Pilar's comments notwithstanding about the potential disconnect between participation and suspicion.

            But my guess is going to be, just like with children, that you're going to initially have a Rosetta Stone that's going to allow us to understand genomics better in majority populations in the United States and the individuals that are more enfranchised, and I think that gives me some degree of concern, because it means some elements of society are going to be excluded.

            So, how do we approach that in terms of if you -- I mean, clearly, it's going to be difficult to have a "we're just going to let science rip" environment, but I do -- I am concerned about a free-market approach that's going to, I think, naturally exclude significant portions of our society.

            DR. MASYS:  I think one of the great ironies of the notion of protected populations, as evidenced in the human subjects protection, you know, the Common Rule, 45 CFR 46, is that as you watch -- I was an IRB director for a couple of years and institutional official for human subjects protections for the better part of a decade at UC San Diego, and watching year upon year of the hundreds of applications to do research, it was clear that being a protected population meant you protect -- you were protected from the science ever being done on any of the conditions you had.

            And although it was, you know, done for purposes of respect for persons, it has this completely opposite effect that just, people wouldn't go there and it was too much trouble to try and do the science.

            DR. GUTMANN:  You all know that one of our charges and the report we will do after this report, is on the development of countermeasures for children and the development of an anthrax vaccine, for example, for children.

            In order to do that, there needs to be testing on children, and that's precisely the issue that you raise -- the horns of the dilemma you have outlined are the horns of the dilemma in developing vaccines for children.

            So, if any of you want to communicate with the Commission about your views on that, we will be happy to take those communications because we will get to that after -- we'll begin -- right, right, John looked down, he didn't want to -- right, right.

            DR. GARZA:  Thank you.  A couple of questions.

            There was a lot of talk, and I think, John, you even mentioned the clinical efficacy versus harm, and it seemed like, when we were talking about harm, it was mostly a financial harm, whether you could get insurance, either life or health insurance, and it seemed to be more of an artifact of the American health system, more so than a perceived or real, actual harm.

            So, I was wondering if there is any difference in this perceived harm in nations where there are more social programs or national health insurance, issues like that, or is that anxiety decreased somewhat, or are there other harms that are expressed?

            DR. KAYE:  So when you're talking about harms, and so, can you just explain what you mean by the --

            DR. GARZA:  Right, and so, one of the arguments that has been articulated is, if I know that I'm going to be a risk for a disease, then an insurance company will view me as too much of a risk, and then I won't be able to get health insurance, and theoretically, I will suffer economic consequences because of that.

            And I'm not saying that's the only harm that's out there.  I recognize there is more than one type of harm, but in other cultures where -- or other countries, where there is a much more robust national health program, where the need for individual health insurance is eliminated, where everybody is treated for their health problems, is that view of harm essentially removed from the table, and are there other harms that people articulate?

            DR. KAYE:  I think the harm -- the possibilities of discrimination and the fact that you might be excluded from healthcare provision is something which probably is of greater importance or of greater concern, than say, in the UK, where we do have a universal healthcare structure, but of course, our universal healthcare structure is actually slowly being dismantled by this government, so that it will no longer be provisioned by the state, but will be a combination.

            So, I think that obviously, you know, just being able to have healthcare is a fundamental human right, I think, and not to be worried that you're actually going to be discriminated against because of your genetic disposition.

            I think, though, people who have genetic tests are also very concerned about travel insurance and about mortgage protection insurance, but it's not really in the same league as fundamental healthcare.

            So, I think that is a huge difference, and it does mean that, within Europe, things are thought about quite differently.

            DR. OSSORIO:  Could I just add to that?  I think there are a couple of things.  I was trying to remember, because I've seen just one paper that asked similar questions about fears and concerns about genetic information across a couple of European countries and the U.S., and they didn't see huge differences, right?

            So, there is at least one paper out there.  That's only one study, and I cannot remember for the life of me who did it, but I know there is at least one study out there, and maybe the Commission staff could dig it up.

            There was a second thing I wanted to say.  Oh -- I know what it was.

            Also, I think there are really -- you know, this is path dependent, right, and so, for instance, in Germany where, you know, the Stasi secret files were really an important aspect of social oppression, that people still remember, and where Nazi experimentation was still an aspect of, you know, their history that people still really remember, they have very different laws about what you can do with genetics and all kinds of things, than anybody else does, right.

            South Africa, they have all these transparency laws because of -- including laws that seem to apply to health records, making them much more accessible than they are in many other countries, again because of secret trials, I guess it was.

            You know, so, I think this is very path dependent, the kinds of -- and historically and contextually situated, in terms of what kinds of concerns people may have about the implications of the information.

            DR. GUTMANN:  Nita?

            DR. FARAHANY:  So, I've been mulling over some of your answers, and trying to figure out, back to Raju's question, about where I would pin some of you, as to what your recommendation would be, as a result of the things you're saying, and I want to push back a little bit on George, but I'm opening this up to everybody.

            So, George, you said, you know, you made the analogy to mandatory health insurance, and how Americans oppose that.

            But I see the analogy as a little inapt, because I think there is a difference between mandating that people make their information free and accessible to everyone, versus recognizing it as an inevitable consequence and asking whether or not we're going to adopt measures to restrict the flow of information, the access to the information or the use of information.

            So, John said earlier, you know, information -- trying to put a kibosh on information is really quite difficult. I tend to agree with that.  I think it's very difficult to restrict the flow of information.

            So, imagining some of the things that you all have said, like for example, Mark, you point out the very useful, I think, truth, which is that socialization of risk isn't something that we have totally figured out yet.  So, how are we actually going to figure out that information?

            But a recommendation could be trying to figure out the socialization of risk, right, or if we're worried about reductionism, George, then perhaps, you know, a recommendation is, we should educate people about genetic information, and we should target information, you know, education, or the point that we have too little knowledge about the information. 

            A positive recommendation could be, we should target funding to bioinformatics and we should encourage greater development of bioinformatics and translation of the information.

            And so, I was hoping that I'd kind of push you all to say, assuming it's not about mandating information being free, would you agree, the information is likely to get out there, and if so, what are some of the positive recommendations, like education around what the information means, like government funding or funding, you know, encouraging funding of bioinformatics, what are some of the recommendations that you would positively, you know, add to that, aside from just trying to restrict the flow of information? Or you could say, really, I just want to restrict the flow of information.

            DR. GUTMANN:  And just to add to Nita's list, because it's really -- we have anti-discrimination laws, and they're easier to enforce when you know people have information and are using it, because you can look at the information they have and say, the only way you could have made this decision is on the basis of information that requires discriminating in ways that are not legal.

            So, this is about: there is access to information and then there is use of it, and I thought Melissa gave a good example of how the police -- who are suspected more than doctors are, I dare say -- are subject to certain constraints, even if we can't perfectly hold them to it.

            But Nita is trying to suggest how if the information is out there, what -- you know, what then?  Go ahead.

            DR. FARAHANY:  And just one condition of that is --

            DR. GUTMANN:  We could go back and forth, too.

            DR. FARAHANY:  Well, we've had a little bit of conversation about insurance.  We have the Genetic Information Nondiscrimination Act -- Title I specifically targeted to insurance companies, Title II targeted to employers -- which grew out of the fear of insurance companies making discriminatory decisions based on genetic information and employers making discriminatory decisions based on genetic information.

            And while that isn't a perfect piece of legislation and nobody thinks that it is, it is a responsive piece of legislation to the use and access of information. 

            And so, you know, that is one way in which we've already tried to address it.  What are the other things that you're worried about, that things like education couldn't address, or funding and bioinformatics?  Or, you know, how do we deal with the socialization of the risk and start to positively address that, rather than just trying to restrict the flow of information?

            DR. ANNAS:  I don't think I recommended restricting the flow of information, by anybody but the person who has the information.  But if I did --

            DR. GUTMANN:  Well, that is very helpful.

            DR. ANNAS:  If I did, I apologize.  I also don't recommend that we repeal the Fourth Amendment.  I don't think we're going to do that either.

            The government is not going to be able to have access to all your information, even if you want them to.  You can let the government have it.  I think that is fine.

            I'm an educator, so I'm certainly not going to be against education, but after 40 years of education, I know its limits, and I don't think, no matter what we think -- and I'll include myself here -- that we're likely to set the agenda for American education.

            I think health education is very important.  Genetic education, very important, but that's going to take generations.  Most people in this field think that the primary people you have to educate are the physicians, who know very little about genetics, and many of them are not interested in learning much about genetics.

            At this point in time in our healthcare system, I think that Bill Sage is right, when he says that it's easy to identify the most dangerous, most costly piece of equipment in medicine, which is this, the pen that the doctors use to write their orders and order tests.

            So, absolutely, I'm all in favor of education.  I'm actually all in favor of genetics.  I'm spending most of my time now, as I said, working with Dr. Elias on nothing but this subject.

            But it's very, very complicated and, as many other panelists have said, it's very nuanced.  It doesn't make sense just to talk about information in the abstract.  What information?  And who gets access?  Not just access, but who gets access and for what purpose, and who ultimately controls the dissemination, insofar as it can be controlled?  Obviously, not everything can be controlled.

            All of those things are important, but actually, I appreciate your comments.  We may disagree, but probably just at the margin.

            DR. KAYE:  Can I just come in, just to add one comment? 

            I would hate for the panel or the Commission to think that, in actual fact, we were advocating that it was just up to the individual to decide, and I think it's really important that we actually have governance structures, and that we've thought about what is appropriate in a global civil society.

            So, we're thinking about higher principles, or protecting individuals, because there are people who can't, or don't have the information to, make decisions which are appropriate, and I think Facebook is probably quite a good analogy here: the privacy settings are set so that you don't have any privacy, it's all out there, and then you have to crank it down yourself, okay.

            I don't think we want to be in a society which is like that.  I think we want to have privacy as a human right in a civil society, and then people can actually crank down their privacy protections.

            But we have to -- and I would urge the Commission to do this -- think very carefully about what the significant harms and risks are, and what we want to protect in a civil society, and then work out mechanisms to do that, so that we're protecting all people in society, not just those who can make decisions on their own behalf, exercising their full capacity.

            MR. WILBANKS:  So, can I just push this a little bit? 

            DR. ANNAS:  I just want to follow up by saying that -- forget the laws for a second -- one of the things the Commission could do is suggest best practices that could be used globally, because I think this really is going to become globalized.

            DR. GUTMANN:  John?

            MR. WILBANKS:  So, I hate the whole information wants to be free meme.  It's so beaten up, but the reality is that if it can be reduced to bytes and copied costlessly and distributed costlessly, it's fairly easy for it to leak into places that you don't want it to leak.

            And the reality of the consumer world we live in is that, you know, my wife and I had a baby here in San Francisco, and within a week, the mail we get -- the physical mail we get -- changes unsolicited from Conde Nast Traveler to Working Mom.

            DR. GUTMANN:  But you just throw out mail now, right? 

            MR. WILBANKS:  But the reality is that lots and lots of pieces of secondary information can be used to create primary information, right? And there are mathematicians that you should probably invite to testify in front of your Commission, who will show you how easy it is to do re-identification.

            So, when you want to talk about specifics -- and I think it's fair to try to pin us to specifics -- I would not want to mandate that everyone's information be available without any agency.

            But I think it would be interesting to think about this: if an organization is going to make a decision based on my data, it would be nice if they had to be transparent about how they arrived at the decision.

            Right now, that is because the decisions that are based on data that's available are how the perceived harms might come to be.

            So, requiring -- say, if you want to make these data-driven decisions to become more efficient as a corporation, all right, you're going to have to show your work.  That would be an interesting specific thing to look at.
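            To make Mr. Wilbanks's point about secondary information concrete, here is a minimal sketch of re-identification by linkage, in which a nominally de-identified record is matched against a public roster on quasi-identifiers.  All field names, records, and names below are invented for illustration; real attacks operate at much larger scale and with messier data.

```python
# Linkage re-identification sketch: join a "de-identified" dataset against a
# public roster on quasi-identifiers. All records here are invented.

deidentified_records = [
    {"zip": "94110", "birthdate": "1970-03-02", "sex": "F", "diagnosis": "BRCA1 variant"},
    {"zip": "94110", "birthdate": "1982-11-17", "sex": "M", "diagnosis": "CF carrier"},
]

public_roster = [
    {"name": "J. Doe", "zip": "94110", "birthdate": "1970-03-02", "sex": "F"},
    {"name": "R. Roe", "zip": "94117", "birthdate": "1982-11-17", "sex": "M"},
]

QUASI_IDENTIFIERS = ("zip", "birthdate", "sex")

def reidentify(records, roster):
    """Return (name, record) pairs where the quasi-identifiers match uniquely."""
    matches = []
    for record in records:
        key = tuple(record[f] for f in QUASI_IDENTIFIERS)
        candidates = [p for p in roster
                      if tuple(p[f] for f in QUASI_IDENTIFIERS) == key]
        if len(candidates) == 1:  # a unique match re-identifies the record
            matches.append((candidates[0]["name"], record))
    return matches

for name, record in reidentify(deidentified_records, public_roster):
    print(f"{name} -> {record['diagnosis']}")
```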

            DR. GUTMANN:  Steve?

            DR. KAYE:  Sorry, can I just -- the junk mail is really interesting because in the UK --

            MR. WILBANKS:  The hospital sold our information, for what it's worth.

            DR. KAYE:  Right, and so, I mean, there should be protections in place to stop that.  So, in the UK --

            MR. WILBANKS:  Well, it was all addressed as 'to occupant.'

            DR. GUTMANN:  Okay, okay.

            DR. KAYE:  I mean, it is the role of the state, to actually put mechanisms in place, and to have effective governance mechanisms and compliance mechanisms, so you go in and you fine the hospital for doing that, you know.  It's inappropriate behavior.

            DR. GUTMANN:  Steve has a question.

            DR. HAUSER:  Amy, can I switch gears to this morning's discussion?

            DR. GUTMANN:  Yes, absolutely.

            DR. HAUSER:  Of clinical application?

            DR. GUTMANN:  Absolutely.

            DR. HAUSER:  About genetics, and maybe Dan or others might want to weigh in.

            I was so taken by your concept that one great opportunity of medically available genetic information is the identification of new clinical problems that are associated with genes, as in your phenome-wide association study.

            And one potential challenge, even in an era of electronic medical records, is that we're not protocolizing and standardizing the clinical information that we are receiving, and obviously, we have many different systems.

            So, I just wondered how you're handling that at Vanderbilt.

            DR. MASYS:  So, actually, that issue is front and center for the eMERGE network and its seven participants, and the good news was, although we didn't expect we would be able to share high quality phenotype information derived from what were originally five entirely different electronic medical record systems, it turns out it can be done, and without, really, a lot of work.

            There have been those who would say, well, you know, all data entries should be coded, so we'll have exactly the right coded term, but it turns out that all of these institutions have been able to use computerized natural language processing to do concept indexing that, in a sense, applies the codes post hoc to create the high quality phenotype.

            So, it looks like our attention to making sure everybody had the same electronic medical record system is unnecessary.  We can allow a wide degree of heterogeneity in the way clinical observations are recorded and still, after the fact, be able to do this correlation science.  That was not expected, and it was a very felicitous outcome of that network.
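            As a rough illustration of the post hoc concept indexing Dr. Masys describes, the sketch below maps free-text phrases in clinical notes to codes using a small hypothetical dictionary.  Production systems use full natural language processing pipelines, negation detection, and standard vocabularies rather than simple substring matching; everything here is invented for illustration.

```python
# Toy concept indexing: map free-text clinical phrases to codes after the fact.
# The dictionary and notes are invented; real pipelines use NLP rather than
# substring matching.

CONCEPT_DICTIONARY = {
    "type 2 diabetes": "E11",
    "atrial fibrillation": "I48",
    "hypothyroidism": "E03",
}

def index_concepts(note_text):
    """Return the set of codes whose phrases appear in a clinical note."""
    text = note_text.lower()
    return {code for phrase, code in CONCEPT_DICTIONARY.items() if phrase in text}

notes = [
    "Patient with long-standing type 2 diabetes, now in atrial fibrillation.",
    "Hypothyroidism, stable on levothyroxine.",
]

for note in notes:
    print(index_concepts(note))
```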

            I do think with respect to the -- what's -- you know, in your sense, what is different about this? 

            It strikes me, the thing that is different is these unquantified, unintended consequences.  You know, I wouldn't post my 23andMe on the internet, and I'm reasonably literate in this area, so it turns out that I think my ignorance is much larger than my knowledge, and the reason I wouldn't publicly post that is the unintended consequences of the reuse of that 10 years from now or 20 years from now, because we just don't know enough.

            And so, I think if the Commission could highlight in some way that, as opposed to traditional medical information, which people are quite comfortable with, this is a book that has yet to be understood and read, and to the extent you publish it when nobody knows what the language is yet, there will be a much greater probability of unintended consequences for publicly released data.  And that's really, I think, different than using a combination of technology and policy for approved uses, and then sanctions when people violate those rules.

            MR. WILBANKS:  The one edit I would make is that sometimes unintended consequences are great, and so that is the balance we have to draw.  The question is, how many people who are willing to take arrows in the back does it take to bridge the gap?  And that is the fundamental question.

            I faced the same decision as you -- I faced the same choices you did -- but my decision was to post mine, in order to help change the balance.

            DR. GUTMANN:  So, John, here, actually, some just simple moral philosophy is helpful.

            If people understand that it's very likely there will be unintended consequences, they can decide, as you do, whether they want to bear the risk of those unintended consequences, or they can decide, as Daniel would, that they don't want to bear it.  But they are then making a decision about whether to bear unintended risks, as opposed to not understanding that there would be unintended risks and just charging blindly in -- and education certainly plays a role there.  Christine?

            DR. GRADY:  I've been thinking about both the question of what is different about this kind of information and also Alex's question about the harms, and one of the harms that has occurred to me is what I think is called over-diagnosis.  We've looked at PSA and mammograms and things like that, and, you know, the sort of harmful consequences of those, and it seems to me that when you have whole genome sequencing, as an example, you're going to have lots of things that one would tend to want to do something about, even though maybe we shouldn't.

            So, I guess I have two questions about that.  One, do you think that is a thing to worry about, number one, and number two, although it has less to do with privacy than the things we've been talking about today, do you think there is anything we could or should say about that?

            DR. GUTMANN:  I'd like to hear, actually, from Richard, who deals with this day in and day out.

            DR. GIBBS:  Well, I mean, I think it's a familiar issue, is the most comfortable thing I can say.  Certainly, the issue in imaging and whole body imaging has been well discussed, and it's real.  But I think it's a broad philosophical question, or a kind of different question than the mechanical response question.

            Do you want to know more than might be comfortable for you -- more than you can do everything about -- or do you want to restrict your knowledge?

            Perhaps I could add -- and this comes back to an earlier point -- that the technology can screen the data in a way that lets us implement all of these decisions in a facile way.

            If the decisions are made about access, I think the technology, not just to generate the data in the first place, but to distribute it and have access to it in all sorts of creative controlled ways, can be easily put in place, and we're only beginning to tap into that capability.

            So, indeed, if this Commission recommended that, “Dammit, nobody should know anything about data that couldn't be acted on in a certain way,” I think to impose that condition on most of the ways the data are being generated would actually not be that technically difficult.

            (Off the record comments.)

            DR. GIBBS:  No, no, I think it would be the worst thing you could say, just to go on the record.  Not the worst thing, but a bad thing.

            DR. ANNAS:  One quick follow up on that.  This is from my colleague Bob Green, so, I take no credit for that.

            But this is -- they obsess about this down at the -- doing whole genome screening, and you're all familiar with the therapeutic illusion: people in research think they're in treatment.

            Bob Green has proposed that what we have now is a diagnostic illusion: people are seeing things in the genome that they think mean something, and so they have to tell the person about it, and he is also deathly against this idea of actionable information.  We're supposed to write a paper on that, but I can't figure it out.

            But actionable is a very funny word.  He's absolutely right about that.  What the hell does that mean?  You know, does it mean, I got to tell you, so you won't sue me later on, if it turns out to be true?  Who knows?

            DR. GUTMANN:  Mark and then Daniel, and then I'm going to ask Jim to make some concluding wrap-up remarks.

            DR. ROTHSTEIN:  I just wanted to answer Christine's question, and that is with the concept of vulnerable child syndrome, which we've known about for a long time.  Basically, in genetics, even with children who have been screened and found to be heterozygous carriers and not affected -- for CF and so forth -- there is a long history of their parents treating them as if they were at greater risk.

            They don't let the kids climb trees.  They don't let the kids ride bikes, because they're afraid they're somehow going to like, shatter if something happens to them, and that worries me.

            If we had sort of routine newborn whole genome sequencing, and tying this back to Nita's point, without education, we can't do anything.

            For those of us who are very committed to the whole concept of informed consent, you can't have patients give informed consent if they don't know what it means.  We see that every day in the clinic.  We can't get a lot of our patients to even grasp that concept, because they don't have the health literacy, and I think this is absolutely essential, and we need to do work on this -- the tie-in between knowledge and informed consent.

            Let me just tell you briefly about an interesting study that my group did about 10 years ago.  We wanted to do research on the issue of whether the population would support and participate in research on pharmacogenomics.

            So, we did interviews with 2,000 people and, obviously, we can't use the word pharmacogenomics.  We explained what we meant, and at the beginning of the interview, we asked, would you be willing to participate under these conditions, and we got a number.

            Then we spent 20 minutes asking them lots of other questions, but in the process of asking those questions, we were educating them about what it's all about, and then at the end, we asked them the exact same question, and the numbers were startlingly higher for people who were willing to participate in research after they had just 20 minutes of indirect education.

            And we need, I think, to explore lots of different options about how we can get the public to a level where they can exercise meaningful informed consent, because the science is getting more complicated, the issues are getting more complicated, and too many of our patients are just sort of behind the eight-ball now.

            DR. GUTMANN:  Interesting.  Is that a published paper?

            DR. ROTHSTEIN:  Yes.

            DR. GUTMANN:  Great.  Daniel?

            DR. MASYS:  So, to speak actually to your question about cost, there is a scenario where costs cause the entire machine to fail.  My colleague Isaac Kohane coined the term “incidentalome”, and Russ Altman and Zach and I wrote a paper in JAMA in 2006 about how, in essence, even if you had high quality testing with 99 percent precision and specificity, the fact that all of us have all of these unsuspected genetic variants would crush the healthcare system with diagnostic costs and, essentially, render genomic medicine ineffective.

            So, I think it is the case that too much information -- under our current model of a clinician's implicit obligation to follow up on whether a finding is real and has clinical meaning -- is a serious problem, as genomes begin to arrive.
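            As a back-of-the-envelope illustration of the incidentalome arithmetic Dr. Masys describes, the sketch below shows how even a test with 99 percent specificity, applied across an assumed one million variants per genome, yields on the order of ten thousand false positive findings per person.  The variant counts are assumptions chosen only to show the scale, not figures from the JAMA paper.

```python
# Why even highly specific testing overwhelms follow-up when applied genome-wide.
# All inputs below are assumptions chosen only to illustrate the scale.

variants_tested_per_person = 1_000_000   # assumed variants screened per genome
specificity = 0.99                       # i.e., a 1 percent false positive rate per variant
true_findings_per_person = 10            # assumed genuinely significant findings

expected_false_positives = (
    (variants_tested_per_person - true_findings_per_person) * (1 - specificity)
)
print(f"Expected false positive findings per person: {expected_false_positives:,.0f}")
# Roughly 10,000 spurious findings per genome, each a candidate for costly follow-up.
```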

            DR. GUTMANN:  Very interesting.  We could go on and on, and we, as a Commission will go on at another meeting, until we get our report. 

            But before I conclude, which will be by thanking these wonderful presenters, I do want to ask Jim to make some semi-concluding remarks.

            DR. WAGNER:  Semi-concluding, and I'll do my best and I certainly am trying to draw words from your mouth, rather than put them in.

            But we are so grateful for your participation today.  It's been a marvelous day.  We've heard about the great, no longer potential, but some of the early realized potential of this technology.

            We heard a bit about the potential, also, and the challenges of large data set management, how one governs that, and what governance systems should look like, and we've had a lot of talk here toward the latter part of the day about the privacy issues.

            Perhaps a summary thesis statement on that might be to acknowledge that currently, people both enjoy the freedom of and suffer the consequences of being treated in a manner that is independent of their genome -- whether that's by an employer, by an insurance company, or by their neighbor -- and therein lies some potential.

            Whole genome sequencing and literacy -- thank you for that word today -- have the potential and, in fact, the likelihood of changing all of that, and I think it's in that that the ethical challenge exists.

            The prospect that this committee might suggest some global best practices -- I think that came from someone's mouth -- and ensure, as best we can, against negative unintended consequences, is quite a big challenge.

            But I thank you all for helping us frame up that problem.

            DR. GUTMANN:  So, I would like to invite everyone to come back here.  We will reconvene at 9:00 a.m. tomorrow morning, and I want to ask anyone, our presenters, members of the audience, who have any thoughts to share with us, to please do that.

            Our website is bioethics.gov and you can submit comments there, and finally, this was an incredibly thoughtful group of presenters, so we thank you all very, very much.

            (Whereupon, the above-entitled matter concluded at 4:46 p.m.)

 

This is a work of the U.S. Government and is not subject to copyright protection in the United States. Foreign copyrights may apply.