Transcript, Meeting 10, Sessions 5 & 6

Date: August 1, 2012

Location: Washington, DC

Presenters: Member Discussion of Recommendations

 


Transcript

SESSIONS 5 & 6: MEMBER DISCUSSION OF RECOMMENDATIONS

DR. GUTMANN:  Welcome back, everyone.  If you would please take a seat we will get started.  This is the beginning of a series of sessions in which we as members of the Commission will discuss in real potential form possible draft -- well, these are draft recommendations and we'll discuss them as a Commission. 

Happy to take any questions, again, from members of the audience.  Just write them down on a card and Lisa, Michelle, Hillary or one of the other staff members will be happy to pass them up.  Our staff members all are wearing badges so it's easy to find them.

Before we begin, I want to set the stage.  There's an overriding theme to our discussions to this point, or at least one overriding theme, and that is: how can we reconcile care for the enormous public benefit that is anticipated from whole genome sequencing with care for individuals' privacy and respect for persons?  Respect for persons has been a driving force of all research on human subjects, as we talked about in our earlier report, and of research in clinical care; both have taken as guiding principles respect for persons and, as a subset of that, respect for individual privacy.

And we've been considering in our discussions on the Commission recommendations that might facilitate continued societal benefit while minimizing risk to individuals and respecting individuals' varied concerns for privacy.

I want to just note explicitly the fundamental principle from which much of our discussion has stemmed and that is while we have focused on questions of privacy, on control of, on access to data, on use of data, these concerns are all motivated by a concern for respect for persons and that's a principle that has guided our recommendations to date and will continue to guide them. 

So, let me read a possible formulation.  It will be just our working formulation of a preliminary recommendation.  And then -- I'll read two of them but let's start with the first.

The first goes as follows.  Funders of genomic research and policymakers should maintain or establish strong policies for protection of genomic data while protecting opportunities for open models of data access and sharing by individuals who want to receive and share their own genomic data broadly with clinicians, researchers, or others.  So that suggests as we, I think we believe that you can do both.  This is not a dramatic tradeoff.

Let me read the second recommendation but then we'll go back to the first.  The second reads in draft as follows.  Funders of genomic research and policymakers should ensure the security of all genomic data.  All persons who come into contact with genomic data must be, one, guided by professional ethical standards and not intentionally, recklessly or negligently misuse whole genome sequence data, and two, held accountable to laws and regulations that require specific remedial measures in the case of lapses in data security, for example, breaches due to the loss of portable data storage devices or hacking.  So those are the two that I want to go over in this session.

So let's begin with the first one which is funders of genomic research and policymakers should maintain or establish strong policies for protection of genomic data while protecting opportunities for open models of data access and sharing by individuals who want to receive and share their own genomic data broadly with clinicians, researchers, or others.  Jim.

DR. WAGNER:  First of all, I think this is a good one.  I would like to hear some discussion or offer a modification based on what we heard today.  I would think we should reinforce wherever we can this difference between access to the data and the data itself.  And I'm wondering if in the one line there where we say protecting opportunities for open models of data access and sharing, it sort of conflates and confuses these two.  I'm trying to find where to insert these words, but I wonder if we aren't talking about -- well, I don't want to wordsmith.

DR. GUTMANN:  You don't need to say what --

DR. WAGNER:  The idea is that we want to ensure that there is openness triggered by individuals who want to either permit access to or allow their data to be shared.  I think these are two -- I'd like to make sure we keep that dichotomy.  I think it's a useful dichotomy.  So, an individual may say you can have -- I want to permit access to my data for certain kinds of studies but I don't want my data given away.  And I think those are two -- I don't want to cede possession of my data, but I'm happy to have access -- people to have access to it. 

Christine, I don't know if you're catching what I'm trying to say there.

DR. GRADY:  I wonder if, I don't know in the actual recommendation or in the discussion of it we can recognize what one of our speakers this morning said, that we can use -- that technology can be used to create ways to allow access and retain privacy.  And I liked the idea of computational access.  So I don't know if that can be built in but that sort of gives some of those distinctions without -- as examples, anyway.

DR. GUTMANN:  That certainly can be made specific in the text around the recommendation, that there are -- increasingly there are ways technologically to separate ownership and access and to keep access secure in ways that are intended.

Is this on this or something else?  Okay, Dan.

DR. SULMASY:  Just very briefly, that instead of open models again based on this morning's discussion maybe broad models would be a better word to use there.  Just because "open" I think implies the sort of sense that Jim had talked about.

DR. GUTMANN:  Yes.  Good.

DR. WAGNER:  Thank you.

DR. GUTMANN:  Nita.

DR. FARAHANY:  So, as my comments up until now have made relatively clear I'm conflicted about access versus use.  And so I'm focusing on the first sentence right now which is, "Funders of genomic research and policy makers should maintain or establish strong policies for protection of genomic data." 

And I have a couple of concerns with this part of the recommendation.  The first is I feel like it is so non-specific as to not be as helpful as we could be because we've really struggled with a lot of different conceptions of what those strong policies for protection might be.  And so I want us to give much greater and more specific guidance as to what we think that means.

And I think there are two sides of the coin in my mind right now.  On the one hand, in general I am not convinced that strong restrictions on access are feasible in the long term or satisfy the standard of regulatory parsimony that we hope for, and I very much think that the cat, and cats, are out of the bag on the flow of information.

On the other hand, there isn't adequate protection of the types of misuse that we're concerned about for genetic information.  And so a kind of stopgap measure from my perspective is to limit the flow of information and access to information rather than to try to regulate and to safeguard against the uses of it.

So the way it's stated right now, protection of genomic data could be protection of use, it could be protection of access although I read it as protection of access.  And I think what I would want to see is something along the lines of, you know, funders of genomic information and policy makers should maintain strong policies for particular uses of genomic information, or in the absence of those types of policies to have strong safeguards against access to the information. 

DR. GUTMANN:  Yes.  So what we suggested earlier, and I think we're in agreement on, is that there are certain uses that are not legitimate uses that you want to protect against.  And in order to protect, you often have to limit access.

DR. FARAHANY:  But I would want us to specify particular uses rather than leaving it quite so open.  And I'd want to specify them in part because, you know, one comment we heard earlier today is particularly with respect to sensitive health information most of this is going to be considered health information that's covered by HIPAA. 

To the extent that it isn't then we should be recommending that HIPAA should be strongly expanded to include those types of protections.  To the extent that there are other forms of discrimination beyond employment discrimination and insurance information like educational discrimination and economic discrimination that we heard earlier then you would hope that statutes like GINA or other statutes would be expanded to include those types of protections.  So I would want us to provide much more specific guidance as to what we mean by that, but I am not convinced that it is a good policy recommendation to suggest restrictions to access.

DR. GUTMANN:  So, just so I understand, do you think people should -- take the cup example which is used over and over again.  Do you think anybody should have access and be able to access and use that information?

DR. FARAHANY:  No.

DR. GUTMANN:  No, okay.

DR. FARAHANY:  So that's a use.

DR. GUTMANN:  Well, it's access and use.  If you --

DR. FARAHANY:  Well, let me clarify, right?  So, do I think that I should be able to pick up your coffee cup that you threw away?  Yes.  You threw it away and you no longer are manifesting interest in it.  I don't think you have thrown away your genetic information when you do that, certainly not intentionally which is why I don't think that the property analysis is the right way to think about it.

But, that being said, do I think that once I obtain a sample of your information I should be able to go to 23andMe or any other company and say I'd like to have this information sequenced without having to verify that it is my own.  No, I actually don't.  That's a use which I think many states have taken steps to restrict which gets at a lot of the different types of privacy concerns that we were talking about that individuals may have earlier.

DR. GUTMANN:  But as you said half of the states have and half haven't.

DR. FARAHANY:  And so I think a nice recommendation would be to say many states, and include which ones those are, have varying levels of protection which prevent the use of information that's obtained from other individuals to sequence it and to find out additional information, that kind of thing should be restricted.  So those are -- yes.

DR. GUTMANN:  No, I am in heated agreement with you and I think that is a great specific recommendation.  That is something even if you don't put it in the recommendation as broadly stated where we can refine the statement we should have a specific recommendation that says that -- because we recommend that the 25 states who haven't done it do it because that actually gets at the foundational protection of use which answers the question of why it's okay to pick up a coffee cup but it's the use of it. 

And that's just a -- for me that's a great example because it gets at this underlying care that not everybody has but a lot of people have that some stranger doesn't just have access to being able to know a whole bunch of things about me that I haven't agreed for them to know.  And we're not yet -- we may be in the world 100 years from now where everyone takes it for granted that everybody does but we're not there yet.

DR. FARAHANY:  Well, so let me just make a clarification.  So at least half the states have some form of regulation.  They are varying in what they protect.  And very few of them care, for example, about your being able to access identity information from it.  So that gets to even more specific use, right, which is that it isn't just your ability to sequence somebody else's information, it's that you cannot obtain health information about another individual.

But you could, for example, obtain some basic information in many of those states about an individual, such as, you know, their basic phenotypic information, and be able to build a phenotypic profile from them.  And so that's why, when I say it's not just about access for me, it's about particular types of uses.  I would continue to want to be as parsimonious as possible about identifying which uses we're concerned about and individuals are concerned about, rather than broad access.

DR. GRADY:  I want to actually ask Nita if I can about how that distinction then plays out with large databases like in dbGaP.  So, currently as you heard there's public access and there's controlled access.  And part of the reason is because of how people use it and they want to make it usable according to the limitations that the donors gave, basically put on it. 

So if I understood what you were just trying to say maybe there should be no distinction between public access and controlled access but there should be some restriction on use.  And if that's true how do we get there?

DR. FARAHANY:  So there's kind of two issues with those databases, right?  The one is if I've signed up to be in a database, I've entered into a contract and said that you may use this for particular purposes which may include you may not give public access to the information, right?  And I think, you know, the freedom to contract about specifics like that is certainly permissible and that that's something we should honor and that the database holder should honor the commitments that they've made to individuals.

That being said, going forward, is what I suggest, what I think the right answer is, to have contracts that say we won't share it with the public?  Well, if that's the thing people are really concerned about and that's the kind of contract they want to enter into, sure.  As a regulatory matter, do I think we should require that kind of a policy?  No.  And instead what I think those contracts are likely seeking to achieve is trust for the individual that their data won't be misused in ways that they are uncomfortable with.

And so that's what people are really concerned about from a regulatory perspective and policymaker perspective and funder's perspective.  I think there can be an opportunity to try and safeguard that end rather than trying to stop the flow of information which is very difficult, right?  It's very difficult and unlikely to continue to be successful.  If we knew at the other end even once the data was obtained that people would be safeguarded I think a lot more people would be comfortable with public sharing which has tremendous, I think, value as individuals are seeking to do considerably more research through those venues.

DR. WAGNER:  May I comment just very briefly?  It seems to me to get very practical about this, trying to come up with a list of improper uses is an impractical exercise.  I'd like to go back to the notion, I'd like to see if we couldn't frame it up so that how data are owned, what sort of access, computational access is permitted doesn't -- I'm hoping that would provide the kind of framework for the individual to exercise the kind of control that would only permit certain -- would only permit access for certain kinds of uses.

I think there may be a couple of very broad uses that we go to in recommendation 3, you know, we're worried about law enforcement uses, et cetera.  But I'm wondering if the way to get to your point isn't again to talk about how we imagine the data should be managed in order that uses could be regulated, either by the owner or by some entity, rather than trying to imagine that at this time we could list all those improper uses.

DR. FARAHANY:  So I think you're right, we can't list all of them.  I think the shorthand of access though is my concern. 

DR. WAGNER:  I think we're saying we don't like -- I don't like the word "access" without some modifier.

DR. FARAHANY:  Right.  I don't like the word "access" really at all, right?  So what I would hope instead it would say is, you know, "Funders and policymakers should maintain strong policies for protection against particular genetic uses" rather than, you know, broadly saying something like protection of data.  I think the next point, right, the next recommendation gets at databases more particularly.  So this one is the more general place I think to reflect that difference.

DR. GUTMANN:  Yes.  I'm not sure.  I think we need to think about this more, because what I think is important is that respect for persons requires consent, or at least some protection of those things; people vary in their sensitivities about giving other people access, regardless of uses.

Recommendation 1 is supposed to simply say, and it's never simple to say it, but it is -- the idea is to say that there are ways of protecting individuals' interests in not sharing their whole genome sequencing with the world and at the same time allowing individuals, indeed even encouraging individuals who want to receive and share their genomic data broadly to do so.  So it's basically saying we respect the range of concerns that people have which vary from people who really don't want their whole genome sequenced in public and don't want it just to be open to anyone to do that, and those who want to share it and have it made, you know, public and used for the public good.

DR. FARAHANY:  You just said something which I think maybe would help on this difference which is -- so one thing I noted was the difference between, for example, being able to obtain identifying information versus other types of information.  And you said for people to not have to share their whole genome, right?  Which is what the subject here is. 

And so if what we are imagining is if you get access to anything you get access to the whole rather than -- so, I think the idea being certainly individuals should have the right to not share their entire genomic sequence, but there may be particular queries on their genomic sequences which are perfectly permissible for which I wouldn't actually think policy makers and funders have strong claims against.

DR. GUTMANN:  Okay, good, good.  Other?  Yes, Raju.

DR. KUCHERLAPATI:  Amy, I have another question.  I think one of the things that is not defined here are the sources of the data.  And we just say funders, right?

DR. GUTMANN:  Right.

DR. KUCHERLAPATI:  And generally when we think about funders, you know, we're usually thinking about research funders like such as NIH or other such organizations.  But actually the genomic sequence and whole genome sequencing funders are going to be much more diverse than these organizations that we're thinking about.

For example, you wouldn't consider CMS to be a funder in the normal set of circumstances but if it is going to have a significant amount of clinical implications they are going to be the major player.  Then if this is going to have really clinical significance then all of the payers, you know, insurance companies will be funders of this.  And I don't think that the recommendation was sort of intended to include all of these different things.  That's one aspect.

The other aspect is that if you obtain information for research purposes then they go into these databases and so on and so forth that we're talking about.  But if it is clinical information that you obtain they are governed by -- and there's a different set of rules in terms of how the privacy of those things are dealt with.  And we haven't made a distinction between the two. 

And as I pointed out in the comment this morning and elsewhere, I think that our focus on research-based stuff may be appropriate today, but it is going to become completely eclipsed in a very short period of time by clinically-based sequencing.  And I think that any recommendation we make should include both of those.

DR. GUTMANN:  Yes.  I'm going to move on.  Anything else on this?  So I'm going to move on to recommendation 2.  So we need to clarify is it all funders we're referring to, and I think that was the intention, Raju.  And that we're not making a strong distinction between research and clinical practice because in fact ethically speaking there isn't a strong distinction.  So I think that's all very helpful.

So the next -- yes, I'm going to number 2.  Funders of genomic research and policy makers should ensure -- I'm just going to read it again -- should ensure the security of genomic data.  All persons who come into contact with genomic data must be, one, guided by professional ethical standards and not intentionally, recklessly or negligently misuse whole genome sequence data, and two, held accountable to laws and regulations that require specific remedial measures in the case of lapses in data security, for example, breaches due to the loss of portable data storage devices or hacking.  And Jim, I see you.

DR. WAGNER:  I was just saying, Raju, does your point also obtain in the very opening sentence of this?

DR. GUTMANN:  Oh, we switched 2 and 3.  I'm sorry.  I just switched them because logically 2 came before 3.  Okay.  So yes, this is now 2 because it made more logical sense.

DR. WAGNER:  To Raju's point, we have in this one also "funders of genomic research and policy makers should ensure security of genomic data."  Are we leaving it to policy makers, do you think, Raju, to cover CMS, or for that matter to cover private repositories of genomic data like Ancestry.com?

DR. KUCHERLAPATI:  Otherwise it would be meaningless.

DR. GUTMANN:  Yes.

DR. WAGNER:  Good.

DR. GUTMANN:  Yes, I think that's a really important point, that given the flow of data now it has to -- unless somebody suggests otherwise I can see no reason why it shouldn't -- it has to include both.  It has to include them all.  So I hope you all understand it's basically -- these days the interface between research and clinical practice with regard to whole genome sequencing is basically -- there's very little daylight between them.  And commercial.  John Arras.

DR. ARRAS:  So this might sound a little nitpicky but when we talk here about everybody who comes into contact with this genomic data must be guided by professional ethical standards I guess there's a huge range of people who might come into contact with this sort of information, right?  Not just doctors, not just geneticists but you know, biotech people, computer people.  And I'm wondering if the notion of professional ethical standards really is a wide enough umbrella to encompass all those different sorts of people.

DR. GUTMANN:  Well, I think these are all really good points and I think require us to refine this recommendation.  First of all, we're talking about whole genome sequencing which is very different than a specific point of genomic data.  And we should consistently use the notion of whole genome sequencing because that's really what we're talking about. 

We're in an era where people are sequencing whole genomes and getting vast amounts of information, and nobody right now knows how it will be used as soon as a year from now.  And that's the security we want to put in place here and we want to speak to.  So we've got to get the language of whole genome sequencing throughout, because there are rules and regulations about particular genomic data.  They may, you know, not be totally adequate, but they're much more adequate than the rules with regard to the ethics of whole genome sequencing.  And so I think it's really important.

And then coming into contact is a little bit of a euphemism, right?  We come into contact with genomic data all the time as, you know, here we go.  So we should be more -- it's not a nitpick.  I think it's an important -- it's important.

DR. ARRAS:  Okay.  There's one other --

DR. KUCHERLAPATI:  Maybe I could just make a brief comment about this.

DR. ARRAS:  Sure.

DR. KUCHERLAPATI:  You know, all of us at academic institutions have to take a test on environmental health and safety every year and pass the test.  And unless you pass the test, at least at our institution, you're not allowed to get into the building.  So your access to the building is denied if you don't pass this, right?  So the -- and it works effectively, obviously.

And so it is possible, and NIH certainly has put systems in place for many of those things, to allow people to be educated before they have access to something.  So I think it's possible to put those types of systems in place so that people who come into contact with the data would clearly understand the significance of what they're doing.

DR. GUTMANN:  People who work with the data, who work with the data on a regular basis should have to have -- should be guided and then it goes on.  And those people, whether they're -- this is a question but I'll make it as a statement and you can react, whether the people are doing clinical work, research in private or public sectors should be guided by these standards.

DR. ARRAS:  Okay, one other bit of ambiguity.

DR. GUTMANN:  Okay, and then Anita.

DR. ARRAS:  Okay.  We talk about specific remedial actions, right?  Now the one example that we give of a remedial action is to notify breaches of security to HHS, right? There's another meaning of remediation here which is to make things right with the individuals whose data has been misused.  So I'm not sure what we want to do about that.  But I just wanted to flag that there's an ambiguity here and we're only talking about one prong of the ambiguity.

DR. GUTMANN:  It's broad and we need to decide how we want to specify it.  Anita.

DR. ALLEN:  I'm wondering whether in response to your interaction with John that the phrase "professional ethical standards" may need to be altered too.  Because I think we're primarily talking about standards of confidentiality and -- are we there?  "Professional ethical standards" that's a very broad concept.  I think we're primarily talking about confidentiality-type concerns here and if so, maybe we might want to use the language of confidentiality or some other privacy-related term specifically in that context.

And then we switch from that concern about confidentiality and privacy to number 2 which is our concern about adherence to data breach standards.  That is to say we want to avoid them, try to avoid them, and if we happen to have one whether it's intentional or accidental, that we take appropriate quote unquote remedial measures.

And I think the reason why the word "remedial" is there is because it's used so much in discussions about what state laws require actors to do when data breaches have occurred. 

And I also just wanted to note that we might need to say, it might be useful to say here held accountable to not just laws, but state and federal laws.  Because we're operating here in a mostly federal context but I'm going to say that most of the laws that need to be worried about are actually going to be state laws.

DR. GUTMANN:  Good.  We will -- are there any comments or questions from audience members?  I'm about to turn it over to Jim to move on to recommendation 3.

DR. WAGNER:  Or is it 2?

DR. GUTMANN:  It's 3, it's now 3.

DR. WAGNER:  It says that funders of genomic research and policy makers, so it's that same group, should establish strict limits on access to genomic databases by those seeking to use data for purposes other than those for which the data were collected or to which the individual consented.  Entities such as law enforcement, defense and security and commercial entities should not have access to genomic samples and data collected for health-related purposes.

We may want to just spend time on that.  Well, I can go ahead and read the second one.  It says --

DR. GUTMANN:  Actually, it's better to do one at a time.

DR. WAGNER:  Yes, I'm just thinking we might keep them separate.  Yes.  Dan?

DR. SULMASY:  First, a general comment on Raju's point.  It might help to say something more at the beginning of each of these three recommendations like managers of research and clinical genomic databases and policy makers.  Something like that might be the kind of language we're looking for as the stem in each of them.

But my specific comment on 3 is in the second sentence I'm just a little concerned about its absoluteness there.  And I'm with you in being very cautious about it but wonder if there might not be, you know, room for some exceptions in cases of grave danger to the common good.

DR. WAGNER:  We had talked about the phrase "strictly limited," have we not?  With regard to that, recognizing -- subpoenaed.

DR. GUTMANN:  I think one way of doing this, and I think the intention here is should not routinely have access.  That is, there are always ways, I mean there are subpoena ways of getting access.  The idea is that they should not routinely have access.  And --

DR. SULMASY:  So some sort of qualification like that I think is valuable.

DR. WAGNER:  That's good.  In fact, I thought we had had that conversation before.

Anita, your light's on.  Is that just -- Nita.

DR. FARAHANY:  So I struggle with this one as well.  And I don't struggle with the sentiment of it, I struggle I think more with the language of it.  So, and to whom it's directed.

So I like the addition that Dan and Raju have suggested: that individuals who are actually maintaining the databases should establish clear and articulable limits on access to genomic databases by those seeking to use the data for purposes other than those.  I don't think I would recommend a regulatory approach to this.

And the reason is I think that it should be permissible for people who are collecting the data to make clear what their policy is and for people to choose to participate or choose not to participate.  And so if that were the case then I would feel quite comfortable with this recommendation.  It's really to whom it's directed that I am less comfortable with it.

The second thing is, so, on the third sentence where it says "other than those for which the data were collected or to which the individual consented."  So, it's either redundant or I'm concerned with the first part, which is "for which the data were collected."

DR. WAGNER:  That's a purposes phrase, right?

DR. FARAHANY:  Right.  So, "to use the data for purposes other than those for" -- and I think I would just go with "for which the individual consented."  And you know, here, included within this, we have to have, you know, detailed conversations about re-consent and, you know, what exactly consent means and how robust and how longstanding the consent can be, and blanket consent, things like that.  But I think I would eliminate "for which the data were collected."  So direct it to different individuals so that it's not a regulatory recommendation, but rather to individuals who are maintaining the databases, and second, eliminate that.

And then I agree with your change, Amy, which is law enforcement, defense and security.  They shouldn't have routine access but there are certainly permissible public good reasons for which they will need access.

DR. GUTMANN:  So I also think the first sentence needs clarification and I want to know whether people -- what we mean here.  Individuals and institutions who are responsible for managing the data.  So, individuals and institutions who are responsible for managing whole genome research data must establish -- is it strict limits on access or is it clear limits?

DR. FARAHANY:  I think it's clear and reasonable.

DR. GUTMANN:  Clear and, you know, clear and reasonable limits on access.  Because -- do we all believe that they have to be clear limits?

DR. ARRAS:  Clear is good.

DR. GUTMANN:  I think that's really important.  And this is saying by those seeking to use the data, period, right?  I think what Nita is saying is there have to be clear limits by anyone who's seeking to use the data.  Whether the data was used for reasons that it was initially gathered or not you don't need that extra. 

And then, if we want to make sure we mean that they not only have to establish clear limits but that the clear limits have to be consented to, we should add that there have to be clear limits and they must be consented to by all individuals who are affected.  And I think then we get rid of the other -- and I think that's clear.

DR. ARRAS:  This is just, this might be superfluous at this point but just to endorse Nita's concerns here about the purposes clause, right?  It seems to me it says the purposes other than those for which the data were collected.  I think that invites a narrow interpretation, you know.  In other words, the data were collected for this particular study, right, and that's way too narrow a fallback understanding for this paragraph.

DR. GUTMANN:  Now, in the text we should say that one of the now obvious issues with whole genome sequencing which is why it holds out the promise for public good is that day-in and day-out researchers and clinicians are discovering new uses for it that weren't initially intended and that's why it's important to have clear limits specified and have individual consent so that individuals understand what they're consenting to.

DR. WAGNER:  Yes.  I'm trying to remember the context, the larger context of this.  I'm in agreement with the purposes we're talking about, but I don't know that I would stop there.  Obviously there should be clear limits on access to genomic databases by those seeking to use the data, but I thought the purpose of this particular one was to be specific about those wishing to use the data for purposes other than those for which the individual originally consented.

DR. GUTMANN:  I think that's right but I think what we can say and what we should say based on our discussion and everything we've heard is that there is this concern and it's a legitimate one.  And it's of a piece with the concern for making sure that individuals know what the limits are on the use of the data and know -- and that the individuals and institutions who manage it assure that there's consent to those uses.

I just don't think it's -- I don't think there's an ethical bright line between intended and unintended uses.  The bright line is between making clear to people how broad the uses might be and getting their consent to them.  Because it's a moving -- the uses are a moving target.

DR. FARAHANY:  Right, but I think Jim's point is also right which is so the idea is we want to make sure that there are clearly articulated limits, right?  So that a person who is choosing to participate understands exactly what the limits are of access and what, you know, what is beyond the limits.  So what are purposes for which it can be used so that they can consent, right, in order to do so.  So I think --

DR. WAGNER:  This is to have a mechanism in there to go back to the individual essentially.

DR. GUTMANN:  And they would have to re-consent if it goes beyond those limits.

DR. FARAHANY:  Maybe, right?  I mean that's part of the conversation, right?  You could think blanket consent is permissible and as an individual you could be willing to give blanket consent.

DR. WAGNER:  And I think that's what we were trying to capture in this one as I recall.  I should go back and look at the broader text.  So I suggest we leave that little phrase in about for purposes other than those for which the individual consented, originally consented.

DR. GUTMANN:  Well, could we say for any purposes including the purposes for which?

DR. WAGNER:  The blanket got bigger.  Yes, that's fine.  I think that's fine for me.

DR. GUTMANN:  Yes.

DR. WAGNER:  Okay, good.  Other thoughts on this one?

DR. GUTMANN:  Dan?  Oh, I'm sorry.  You're on.

DR. WAGNER:  You're doing a great job.  I don't know who was first.  Christine?

DR. GRADY:  I was just going to ask if it includes people who haven't consented like children.

DR. WAGNER:  Say it again?

DR. GRADY:  This would include those who gave permission for people who can't consent for themselves?

DR. WAGNER:  That's a great question. 

DR. GUTMANN:  Well, it logically does if you -- we're not -- we haven't yet discussed what the conditions are of consenting for children, but it does logically -- if there is consent for children then this covers that.  It's no -- the difficult issue is what constitutes legitimate consent for whole genome sequencing for children.

DR. WAGNER:  Yes.  But were you asking if philosophically this covers that or were you asking do we need to have language that's inclusive of that?  I thought you were asking the language question.

DR. GARZA:  So I think the way to get around that is not using the phrase "individual consented" and maybe saying something like to which consent has been given.

DR. WAGNER:  For which consent has been obtained or something like that.  I think that's a fair amendment.  Dan?

DR. SULMASY:  I just wanted to bring us back to the stem at the beginning of the first three of these recommendations and the question again of to whom this is addressed.  Because I think we all seem to be in agreement with Raju's comments about making sure we include the clinical uses -- so managers of research and clinical genomic databases should be part of it.  But I've heard a couple of formulations that dropped off the policy makers part of it.  And I thought we ought to include that, because even if we want to have a lot of this done by contract I don't think we want all of it done by contract.

DR. GUTMANN:  Yes.  I think there's broad agreement on that.  And thanks for making sure we didn't drop it.

DR. WAGNER:  Our scribes have that?

DR. GUTMANN:  Yes.

DR. WAGNER:  Great.

DR. ALLEN:  Jim, I know you want to move on but just one thing.  I just wanted to put in a strong endorsement for your keeping in the purposes language.  And the reason for that is that a longstanding federal policy has been fair information practices.  And the way we have it phrased at the moment it's consistent with those fair information practice principles, which include that anyone who collects data not use that data for purposes other than those for which it was collected.  So we are kind of in harmony with the larger federal fair information practice standards if we keep in some such language.  Concerns about mission creep and concerns about unauthorized third party uses are basic to how our government views information practices.  So I suggest we keep that language in there.

DR. WAGNER:  Keep the purposes phrase in.

DR. ALLEN:  Yes.

DR. WAGNER:  Excellent.  John?

DR. ARRAS:  I'd be happy with that just so long as the purposes clause doesn't naturally gravitate toward a narrow interpretation of the purpose.

DR. WAGNER:  We've got it linked to the purposes for which -- what was your language?  For which consent was obtained.

DR. ARRAS:  It's just that, you know, I think most people would say well, I consented because this is for medical research or something of that sort.

DR. WAGNER:  And that's about as broad as it can get.

DR. ARRAS:  Yes, yes, rather than -- I just think we should be careful that we not give the impression that this could be limited to a single study.

DR. GUTMANN:  Now, I have a question here, and anybody should feel free to answer it.  We're talking about consent to the uses of whole genome sequencing.  Are we assuming that this is going to be de-identified, or are we assuming that -- so I see a no.  We're assuming that people -- so, now let me ask the question. 

Are we assuming that we are endorsing the possibility that the government -- let's just say the government at this point -- would be able to get broad, almost blanket consent to using identified whole genome sequencing of individuals, provided they consent to it?  It's not now possible to do that according to current regulations.  You cannot pass IRB muster if it's identified and it's blanket consent.

DR. FARAHANY:  I don't think we're endorsing it, I think we just recognize that it's, you know, we're moving into the world where it's just not possible to really imagine that de-identified data is going to remain de-identified.  So rather than endorsing that I think it's recognizing that that's the inevitable direction of this data.

DR. GUTMANN:  Well, I'm always wary of statements of inevitability.  It's not inevitable next year, or the year after -- it's not inevitable that identified data would just freely flow.  That is, even with all of the possibilities of re-identifying de-identified data, it's still a hurdle to get over to do that.  And I'm just asking, I really am asking the question: are we comfortable with endorsing the government setting up genetic databases of whole genome sequencing that are not de-identified and getting --

DR. WAGNER:  So long as permission is given.  Is that what you're…?

DR. GUTMANN:  Yes, so long as permission is given for any uses that may come in the future.  Because so far these statements don't say anything that would morally prevent that from happening.  And -- I just have to say this -- a lot of people have the view that government is a very powerful entity and that to cede to it these kinds of powers is a potentially dangerous thing.

Researchers, you know, individual researchers -- people trust their doctors more.  You know, trust is at an all-time low in our society, but there are levels of trust.  And even, you know, HIPAA requires doctors to keep things confidential.  But patients can agree to doctors not keeping it confidential.

DR. WAGNER:  Christine?

DR. GRADY:  Well, I guess I'm -- I have lots of thoughts so let me see if I can get them out right.  This recommendation as I read it doesn't commit us to anything related to identifiability.  It just says you can only use data for purposes for which it was consented to or something like that, with the idea that there's a limitation on use for things that some people find problematic.

But I think it does -- I appreciate your raising it, Amy, because I think it raises an issue that we need to talk about across the whole report's recommendations, and that is what we think about identifiability or de-identified data.  We've heard people say that de-identified data is probably a myth, but we've used it, not only in federal regulations -- we've used it in a couple of different ways historically, as I understand it.  One is to be able to use data for things that people have not given consent for.  Two is to minimize perceived risks of being identified.

DR. GUTMANN:  And three is -- let me just add a third.  Three is to minimize the things that nobody can regulate, which is somebody who's doing research in this space seeing a person's name attached to a set of information and unconsciously, subconsciously, using it in their daily life in a way that no government can regulate. 

And the de-identification of whole genome sequencing, it's not simply that it gives people some irrational trust, it also prevents everybody who's using that database in legitimate ways from carrying around with them knowledge that they don't need to know that's attached to people's names.

DR. GRADY:  But I think one question, maybe Raju can help us with this.  I mean, you know, the de-identified databases have existed but they're not whole genome sequences.  They're, you know, pieces of genetic data.  And people have doubts whether that's de-identifiable.  So I guess the question that I don't even know the answer to is can whole genome sequenced data be de-identified?  

DR. GUTMANN:  Before Raju answers -- I want him to answer it -- that question has two different parts.  One is can whole genome sequencing be permanently de-identified.  And my assumption -- and we'll see -- is the answer is no, because if someone wants to, given how unique whole genome sequencing is to a person, you can find out who the person is.  But the second is does it deter people, does it at least provide a first-cut barrier to everybody who comes in contact with that data, working on that data, from knowing who the person is without extra effort.  And I'm assuming the answer to that is yes.  You have to take some extra effort to find out who the person is if it's de-identified.  And that -- I'll just speak for myself -- to me that's a good thing. 

DR. WAGNER:  We've got to get back to this.  Go ahead, Raju, but we have to get back to this piece because I think that might actually be a separate conversation.

DR. GUTMANN:  Okay.

DR. KUCHERLAPATI:  I want to -- first, I want to address Amy's comments.  So, let me give you a couple of case studies -- sort of like case studies.  And I say that because there is no one answer to your question.

So for example, if I were to go and have my DNA sequenced and a whole genome sequence done to really find out if I have susceptibility for a particular disease or something, and that's what I would give consent for, let us say.  And then a year from now it turns out that there is definitive information about some other gene, mutations in which I need to know about.  In which case the doctor or whoever is handling it must absolutely have identified data -- this comes from Raju, with the Social Security Number and so on.  So there are some circumstances in which that is absolutely essential.

The second type of thing that we talked about is more sort of population-based data.  A thousand people for which we have -- that we need to have information and I want to understand, you know, whatever, susceptibility for heart disease or schizophrenia, whatever studies that I want to do in which case it is not necessary to be able to say that this particular DNA came from Raju, right?  It could be --

DR. WAGNER:  Aggregated data.

DR. KUCHERLAPATI:  Right.  No, no, even if you --

DR. WAGNER:  Oh, I see what you're saying.

DR. KUCHERLAPATI:  -- from A, B and C, not necessarily aggregated data.  And that's fine to be able to do that.  So I think that obviously we don't want to just put ourselves into one bucket or the other; there are circumstances where identification is important and circumstances where identification is not important.

To answer Christine's question -- Amy, you answered that partly.  We all talk about the fact that there is no such thing as de-identifiable data.  That is theoretically true, in that if you have a bunch of SNPs that are unique to you, then that defines who you are.  But you know, if I were given your DNA, having all the SNPs and being able to say "this is Christine" is not so easy.  I mean, it's theoretically possible to do it, but it's not practically so easy, especially if you have 100,000 or 200,000 or 500,000 such pieces of data.

DR. FARAHANY:  But what about building a phenotypic sketch that you then just compare to a facial recognition software database if you're the government?  Isn't that relatively easy?

DR. WAGNER:  That's what he means -- it's theoretically possible.  Actually, that segues back to where I think I can capture us again, because there was a question about the government.  And the way the question was put -- I think all these issues are important.  I'm not sure they're germane to this particular recommendation, because the original question that was posed was are we comfortable endorsing the possibility that -- I'm going to pose it in the absurd -- the possibility that people who don't want the government to have their information nevertheless will consent for the government to have their information.  I'm perfectly comfortable that that's their own idiocy, and that's what we've protected against here, so I think the recommendation works. 

What we're saying is there need to be systems in place so that people don't have data transmitted, owned, shared, or accessed beyond what they originally consented to.  If we're trying to capture in here that in addition to law enforcement the government should not sweep in and capture data for purposes for which people have not consented, we could add that. 

But I'm not concerned.  I don't know what else we could do than this to mitigate against the Big Brother phenomenon.  Because what we are saying is, Uncle Sam, you can't have the information unless it's consented to, at least if you're going to stay within this regulation.  So I think the regulation works as it is.

DR. ALLEN:  Amy's a philosopher and I think she was making a broad conceptual point that de-identification just like password protection is a way of making information more obscure, harder to get to, practically obscure and therefore it protects privacy.  And I think that we need to recognize that as a general matter.  I think it doesn't necessarily belong in any one recommendation but as a general matter it's just that the harder it is to decipher information or get access to it, the more privacy you have.  And we have a nice concept for that in the law, we call it practical obscurity.

DR. WAGNER:  Well, then maybe -- so getting our nose out of this particular recommendation, are we imagining that the report needs to have some content to that point in it?

DR. GUTMANN:  So the reason I raised it with regard to this one is that it raises an issue: since we put clear limits instead of strict limits in here, it is a clear limit to say you consent to no limits on use.  It's as clear as can be.  There are no limits, and that includes identifiable whole genome sequencing for whatever use the federal government might want. 

At present that is not possible legally.  That is, if it's identifiable there must be limits on consent.  And I suspect we should say there need to be some limits, otherwise -- whatever you think ethically about that -- I think we will practically undermine the ability to collect these data and use them.  Because if it were taken seriously that there were no limits, it would not be a fantasy scenario that there would be some use of it that would shock people, because people would not have thought that, oh, it could be used for this.  So I think we should just consider, in the text of this -- not so much in the language here, but in the text -- and here I'm agreeing with you, saying how there are some side constraints necessary to protect both the government and people from having massive distrust in the way this is being used.

DR. WAGNER:  You know, I'm with you, and I think we also heard from Dr. Knoppers this morning her recommendation that such broad consent almost shouldn't be permitted.  Remember, she said -- is she still with us? -- she made the statement that she hadn't seen such an open consent since the nineteen-seventies or something.  I think that's the other way to get at this: consent needs to have some bounds to it.

DR. GUTMANN:  Jim, when you -- you and I oversee many thousands of employees, right?  So when I wear my other hat as president of Penn I have 31,000 employees and 20,000 students and 4,000 faculty.  And there is -- there's certainly not a year that goes by when I don't learn something that people do that I totally didn't anticipate any human being ever doing.  So this is in effect my extension of this to the human condition.

(Laughter)

DR. WAGNER:  Fair enough.

DR. GARZA:  So Jim, I just wanted to make one point.  And that is you used the word "regulation" before.  And I think it's important that we understand that we're not writing regulations here, that we're writing recommendations.

DR. WAGNER:  Absolutely.

DR. GARZA:  And so there has to be a fine balance between getting too narrowly scoped like you would with a legislative aim whereas we're trying to make recommendations which by their very nature should be --

DR. WAGNER:  I apologize if I referred to --

DR. GARZA:  No, but I've heard the discussion going around about narrowly focusing and using particular words, but we have to remember we're not writing law here that people are going to interpret.  And so --

DR. WAGNER:  Good advice.

DR. GARZA:  -- so I think if we stick with guidance then we'd be much better served.

DR. WAGNER:  Yes, that was another recommendation this morning.  Let's go to our final one for the session anyway, and that's the one that reads that the Office of the National Coordinator for Health Information Technology, the Office for Human Research Protections and other relevant agencies should continue to invest in federal initiatives to ensure that third party entrustment of whole genome sequencing data complies with the Health Insurance Portability and Accountability Act and other data privacy and security requirements.

I think the one thing this does philosophically is, appropriately or inappropriately, imagine that these data should be managed as health data, and I know there are some varying opinions on that, but that's what this is saying.

DR. FARAHANY:  So I think this is true and good as a recommendation.  I think certainly if you have access to the entire genome it's going to include a lot of information that pertains to health records and so we're right to do this. 

In kind of building on what Alex just said about the difference between recommendations versus regulations, what I like about this recommendation is it's quite specific in the guidance that it's providing.  And I think we should ensure that our other recommendations are providing guidance as specific as this is as to what we're hoping to achieve.  And so I think this is helpful, and including other data privacy and security requirements is also quite important, because there may be many other types of legislation concerning data privacy and security that will continue to evolve and be required.

DR. GUTMANN:  I think this is a place where we can integrate some of what we heard this morning about the ability of technology to help keep data appropriately private.  And that makes even more specific some of the things that investment could be in.  I think there will necessarily -- and I think this is what Alex was suggesting -- be different levels of specificity in different of our recommendations, because we're providing a framework and then also making some concrete recommendations -- this is one of the more concrete ones.

And I think it follows, Raju, the point that you've made repeatedly about how we shouldn't draw this line.  There is no bright line, right, between research and clinical practice now when it comes to health.  And so this is a specific way of recognizing that.

DR. WAGNER:  Is this a place -- and should we consider, Nelson, your comment this morning, your question this morning about whether or not there is sharing around information assurance practices?  Would we want -- does the Commission want to say something here about that as well, since we're talking about the several agencies?

DR. MICHAEL:  Well, you probably inferred why I asked that question specifically because I think different organizations, especially for-profit organizations might spend or might have different reasons to put more resources on protecting their information.  And so they may in fact have learned best practices that could be more generally applied and I think that ultimately it's in everyone's best interests to share those best practices. 

And I don't think I've really got a clear answer, at least not in open session, but I do gather that there is an imperfect sharing of those best practices across those groups that now are currently collecting whole genome sequencing information.

DR. WAGNER:  I guess what I'm asking is would it be helpful to say something about that.  We're now saying that these different offices and other as yet unspecified relevant agencies should continue to invest in federal initiatives to ensure third party entrustment.  We certainly don't want what we've seen in history -- that the FDA has one policy and another agency another, over and over and over again.  Is this a place to use the sharing thought, to somehow ask our folks to weave in the sharing piece?

DR. GUTMANN:  I think it would be good to add another sentence about the sharing of best practices with regard to security -- not as a separate recommendation, but in the text.  We shouldn't minimize the importance of efficiency here, because of the amount of money that could be wasted on every agency and private and public entity inventing its own security measures -- which, by the way, will minimize trust, because people won't know which measures actually are secure -- as opposed to sharing.  So I think that's important. 

You know, this is not something in the future.  We're dealing now with the question of what the security of cloud computing is.  That's what has limited a lot of uses of cloud computing, which otherwise could be very efficient.  And so I think this is the place to have it -- it's a friendly amendment to add to this recommendation.  I wouldn't add a different one --

DR. WAGNER:  No, no, no.

DR. GUTMANN:  -- about the sharing of best practices with regard to security.

DR. WAGNER:  That's part of what it means to ensure, as we have written here.

DR. GUTMANN:  Right, right.

DR. FARAHANY:  I just had a clarification question which is given that many third party entities, whether it's someone like Ancestry.com that we heard from today or you know, databases like 23andMe that would be collecting information are going to be moving from SNP analysis to whole genome analysis, but collecting it and delivering information back to individuals that may not be health information. 

Are we through this recommendation suggesting that every entity who has whole genome sequence information even if they're not reading onto that whole genome sequence information anything that's health-related, that they're subject or should be subject to HIPAA requirements simply because they have data that could potentially be read to be relevant to health information?  So I'm --

DR. WAGNER:  That's not how it reads now but that's --

DR. FARAHANY:  Well, I mean it could be how it reads, right, which is to ensure that third party entrustment of whole genome sequencing data.  So any company that has whole genome sequencing --

DR. WAGNER:  Oh, I see.

DR. FARAHANY:  -- would, the way we've written it, need to comply with HIPAA.  And I worry about that, right, and yet you know that may be what we're saying which is because there is the potential to mine any whole genome sequencing for health data they're therefore subject to HIPAA.  But that's certainly quite different than the status quo for the ways in which we think of those companies. 

So, one way to think about it is you have whole genome sequencing, but if you are only analyzing non-health related portions of it then you would not be subject to HIPAA.  Another way to read it is if you simply have the data at all, even if you aren't interpreting any of it, then you're subject to HIPAA. 

So it's more of a question as to what is it that we're seeking to achieve by this.  Are we really saying something as broad as everybody now will be subject to HIPAA?

DR. WAGNER:  Do you think there's a different way to word it?

DR. FARAHANY:  Yes.  So, another way to word it could be -- and we'd have to think about the precise wording -- you know, third parties who have whole genome sequencing data and have interpreted anything related to health information.  Simply having the data but not reading or interpreting portions of it is different -- like raw data versus interpreted data. 

So if you actually have sequenced the entire genome but you've said, okay, I recognize these three portions are relevant to Alzheimer's then you have health information.  If you simply have the raw data that doesn't necessarily make you subject to HIPAA.

DR. GARZA:  So I was thinking of just changing the word "and" after "Accountability Act" to "or," and then adding a qualifier at the end saying "depending upon the use of the data." 

DR. GUTMANN:  So how would -- let me just ask how HIPAA would relate to -- so, what is it, Ancestry.com.  So let's pick any one of these, pick a commercial one, and they've done a whole genome sequence.  HIPAA would require individual consent, right, to making it public.  So if they made the whole genome sequence public without individual consent they would violate HIPAA.  If they --

DR. FARAHANY:  There's a lot more to HIPAA requirements.

DR. GUTMANN:  I understand, but I was just picking the first, the big one.  We would want to say they should get individual consent.  We wouldn't want to exempt them from that, right?

DR. ALLEN:  But my belief is that HIPAA only applies to what the statute defines as covered entities.  And so not everybody who might happen to have some whole genome sequencing data would be covered by HIPAA.

DR. GUTMANN:  Yes.

DR. ALLEN:  The question is do we want them to be covered by HIPAA.

DR. GUTMANN:  That's what I'm asking.

DR. ALLEN:  And do we want to recommend that Congress or the Department, you know, HHS change the regs to require anybody who has this kind of data to become a covered entity.

DR. FARAHANY: And that's what I'm asking.

DR. GUTMANN:  Yes.  And that's what -- I understand that's the question.  So I was trying to see what the implications would be compared, you know.

DR. FARAHANY:  Well, it gives a cause of action to any individual whose data is maintained by any of these databases who don't -- it's not just consent, it's, you know, how it's kept and where it's stored and what kinds of electronic records can be --

DR. GUTMANN:  That's what I was asking.  What I was asking is so, is what are the things in HIPAA that would make us feel uncomfortable extending them.  It wouldn't be the obtaining of individual consent before publicizing it, that we'd want.  It's the --

DR. FARAHANY:  There's a whole list of regulations about exactly where it can be stored.

DR. WAGNER:  Do we know enough about that?  Do you guys know enough about that to recommend it?

DR. KUCHERLAPATI:  I don't know enough about the --

DR. FARAHANY:  I know enough about it to recommend that we don't do that.

DR. GUTMANN:  So we will -- this is Commission parsimony: knowing what we don't know.  We know enough to make sure we make this specific enough that it's third party entrustment -- it's delegations from these agencies to third parties.  In other words, the attempt here is to say that research and clinical care data that is ceded by government entities and public entities should be covered.  We're not extending it further than that here.

DR. FARAHANY:  When what they are maintaining is --

DR. GUTMANN:  Exactly.

DR. FARAHANY:  -- health-related.

DR. GUTMANN:  Yes.

DR. KUCHERLAPATI:  I think our intentions are clear but the mechanisms may not be.  I don't think I know enough about HIPAA to be able to say that this is what we need to do.  But to be able to say "HIPAA or other privacy measures" -- I think that would cover it.  I think that's the way that we have phrased it.

DR. GUTMANN:  Right.  And HIPAA -- there are many criticisms of HIPAA for being overly onerous in its extension.  So let's just be clear: when we draft this, beyond the words of the recommendation, we will make clear that there are aspects of HIPAA that don't fit our regulatory parsimony principle.  And as this is extended, as it should be in principle, we would like to see HIPAA regulations be parsimonious in what they require.  Because it is important to protect privacy in all of these matters, but not to make it so burdensome as to be more costly than it needs to be.

DR. WAGNER:  I wonder if we should have a concern about just leaving it open -- saying, look, we want these several agencies to invest in federal initiatives to ensure the trustworthiness of these third parties -- and whether that runs the risk of abandoning our concern for parsimony, in that these agencies might in fact come down with a much more blunt instrument. 

Or on the other hand should we try to specify, again, aspects of HIPAA or other things.  Do you see what I'm asking here?  I don't know whether to leave it open, or to use HIPAA as an example, or to actually -- I don't know enough to say that HIPAA is specifically what we would recommend. 

DR. ALLEN:  There actually is --

DR. WAGNER:  There is?

DR. GUTMANN:  There is.

DR. WAGNER:  Then I'm not familiar with it.

DR. KUCHERLAPATI:  I think we need to include HIPAA because the clinical information clearly is covered by HIPAA.

DR. WAGNER:  Okay.

DR. KUCHERLAPATI:  We're not talking about -- so you need that and to be able to say the rest of the data, we need to have something similar. 

DR. FARAHANY:  But it could say something like with, you know, relevant regulatory schemes such as, right, HIPAA.  Right, so that we're not committing it to saying everybody has to --

DR. WAGNER:  I would favor that approach.  I think that's what Raju was talking about.

Dan, I've still got your name open.  Did you get your comment in or have you forgotten it?

DR. ARRAS:  Thank you. 

DR. WAGNER:  I apologize.  So we'll give it to John for the last question.

DR. ARRAS:  We're done with the HIPAA conversation?

DR. GUTMANN:  Yes.

(Laughter)

DR. ARRAS:  Okay.  I didn't want to preclude the final agonizing question.  So, I've been puzzling over the implications of Raju's warning that the clinical and the research spheres will soon merge, right? 

So here's the puzzle.  I think that the distinction between clinical care and research originally meant to signify that there were reasons to -- much more reason to worry about conflicts of interest in the research area than in the clinical care area, right? 

The clinical care area is governed by the Hippocratic ethic and so forth, and patients can be pretty much assured that their doctor is caring for their best interest, okay?  Whereas in the research area people are concerned about generating generalizable knowledge.  It's not clear that the research is being done for the benefit of the individual, right?  So that distinction then led to heightened protections in the area of research as opposed to clinical care. 

So if indeed it's going to emerge that the clinical sphere and the research sphere will come closer together or merge in some ways I'm wondering what the implications of that might be for the kinds of scrutiny that we apply to, say, informed consent. 

DR. KUCHERLAPATI:  Yes, I understand.  John, I don't know whether I can answer your question.  But I guess the point that I'm trying to make is that the kinds of data that you obtain from either of these methods is essentially indistinguishable.

Second is that it is not going to -- I mean, whatever the data, wherever the data comes from that doesn't necessarily mean it's going to just stay where it was collected.  It doesn't mean, like for example, in the clinical context it is not necessary that the data is going to just sit with the ordering physician but that data will essentially -- can be bundled, you know, used in a variety of different ways no different than the way that the research data would be used.  So there would be I think no distinction between them.

Laura talked about, I mean the way that currently NIH for example, these are individually initiated investigative studies from which, you know, genome-wide association studies have been done but all the data now -- because it resides at a different location, not necessarily at the place where it's collected.  So, that's the reason why I don't think that the distinctions are real anymore.

DR. ARRAS:  Yes, and I guess my question is given that those distinctions aren't going to play much of a role anymore what are the implications of that for the level of regulation or consent restrictions that we currently apply in the clinical.

DR. GUTMANN:  So the interesting feature of our recommendations and of Raju's arguments here is that in this particular case the protections on the clinical side are greater than the protections on the research side.  And what Raju is arguing is that to give the public the trust that they should have and everybody should want them to have you need to assimilate in this particular area the safeguards in clinical work onto the safeguards of research because they've become so seamlessly intertwined.  They're not identical but they are seamlessly intertwined.  Yes, and it's an interesting result.

DR. KUCHERLAPATI:  And if I could?

DR. GUTMANN:  Please.

DR. KUCHERLAPATI:  I think that one of the things that John is talking about is actually in the case of research today that you have, you know, what, a 5- or 6-page informed consent and that you would actually explain all of the things and get them to sign everything. 

Whereas when you go to the doctor just before you go to visit your doctor you sign a form which you don't read which essentially says that the doctor can order whatever tests that are necessary.  So actually in terms of collecting data there's probably less regulation in the physician's office than there are in a research context.

I wonder whether you're asking whether one should consider having greater levels of informed consent for whole genomic information at the clinical level.  I don't know if that's what you're intending.

DR. GUTMANN:  That's the flip which is -- and it's paired with there are greater protections on the other end because of HIPAA and other kinds of regulations.  We haven't gotten into the informed consent. 

I want to thank -- I have to begin by thanking my Commission members as always for a very informative, stimulating set of questions and deliberations.  Secondly, all of the members of the audience who have been so attentive and asked questions.  And thirdly, and overwhelmingly on the part of all the Commission members the people who presented, our experts who presented today.

We will reconvene at 9 a.m. tomorrow morning but I think all members of the Commission want to thank you all for being here and thanking our presenters as well.

This is a work of the U.S. Government and is not subject to copyright protection in the United States. Foreign copyrights may apply.