TRANSCRIPT: Meeting Six, Session 3

Human Subjects Protection Reform

Date: August 30, 2011

Location: Washington, DC

Presenters: Ezekiel J. Emanuel, M.D., Ph.D., Former Chief, Clinical Center Department of Bioethics, National Institutes of Health

Transcript

 

                      Our next speaker -- our first speaker

  today -- is Dr. Ezekiel Emanuel.  Zeke was one of the

  driving forces behind this effort while he was working

  at the White House last year.

            Dr. Emanuel is the former head of the

  Department of Bioethics at the Clinical Center of the

  National Institutes of Health, and he is an oncologist

  who also has a Ph.D. in a field that is dear to me as

  well, political philosophy.  Dr. Emanuel has published

  widely on the ethics of clinical research, health care

  reform, international research ethics, end of life care

  issues, euthanasia, the ethics of managed care, and the

  physician-patient relationship.

            Dr. Emanuel previously served on President

  Clinton's health care task force, the National

  Bioethics Advisory Commission, and on the bioethics

  panel of the Pan American Health Organization.  As

  of September 1st, Dr. Emanuel will be the Robert and

  Diane Levy University Professor at the University of

  Pennsylvania, and the chair of the -- of Penn's Medical

  Ethics and Health Policy department, and Vice Provost

  for Global Initiatives.

            Welcome, Zeke.

            DR. EMANUEL:  Thank you.  I am going to stand,

  if you don't mind.  My high school debate coach would

  kill me for sitting and talking.

            So, I'm here to talk about the ANPRM, which

  was developed and then published this July.  So let's

  see if I can handle all this.  Just want to go through

  a brief history to remind you of where we've been, and

  also where the regulations that currently exist came

  from.

            Many of you know the famous Beecher article

  in the New England Journal highlighting 22 scandals, if

  you will, in research at leading universities, then the

  release of the 19 -- the Tuskegee scandal in 1972,

  leading to the national commission, one of your

  predecessors, which published its final report, the

  Belmont Report, establishing an ethical framework for

  human subjects protections, leading to HHS adopting

  regulations for itself.  Those regulations, after 10

  years, being adopted by 14 agencies -- or part A of

  those regulations being adopted by 14 agencies.

            So, let me just emphasize, first of all, that

  10-year gap -- it takes a long time to get a lot of agencies

  to work together to adopt regulations.  Second, it's

  only 14 agencies, it's not all of the Federal

  Government, as many of us think it should be.

            There are -- after 30 years of seeing this in

  action, as it were, a lot of problems have been

  identified with the Common Rule and the regulations:

  inadequate IRB time devoted to review of high-risk

  studies; time-consuming reviews of -- and continuing

  reviews of low-risk or no-risk research, such as

  surveys; inconsistent IRB practices regarding research

  with biospecimens, claims data, medical records data

   -- a lot of this pre-existing data has been a problem;

  multiple reviews of multi-center trials.

            Many of us have had the experience of spending

  basically a whole year getting our protocols

  approved before we can start, because we're going

  through a number of different institutions, and there

  is no evidence that it reduces the risk or enhances the

  protections -- as a matter of fact, a lot of evidence

  that it doesn't.

            Informed consent documents that we know, over

  time, become longer and longer, and written at a very

  high grade level, where the boilerplate actually tends

  to be the worst part of it.  Lack of data.  We simply

  don't have a lot of data about the actual risks of

  research.  We see scandals, or we see something bad

  happening, and we have no context to put it into.

            And then we have increasing evidence of what I

  will call -- probably contentiously, a little

  bit -- evasion of IRB review by a lot of people trying

  to say what they are doing is not research, so that

  they can avoid what is increasingly seen as an onerous

  process.

            When we convened an interagency working group

  to examine this and see if we could actually reform the

  system, we had two goals.  One was to enhance the

  protection of research participants, and the other was

  to improve the efficiency of the review process.  We

  did not view these as contradictory.  In fact, we

  viewed these as synergistic.  If you actually improve

  the efficiency, you could spend more time on the really

  risky stuff, so that you could actually enhance

  protections.

            I am going to go through some of the specific

  reforms, to give people a sense.  This may be the most

  important, which is to get a risk-based review process.

  The Institute of Medicine had a group about a decade

  ago look at the oversight process, and recommended a

  risk-based review process, although didn't flesh it

  out.

            Here we've tried to flesh it out in the ANPRM,

  suggesting that greater-than-minimal risk research

  basically receives the current protections, which is

  full IRB review, annual review.  However, getting rid

  of the annual review when all you're doing is sort of

  standard clinical follow-up, or analyzing the data.

            Things that are less than -- are minimal risk

  or less -- that is, they're no more risky than everyday

  life, which itself is somewhat risky -- you get

  expedited review by one person with the option to send

  to full IRB review, if for some reason that person

  thinks this is -- needs more attention -- and no annual

  review, again, unless explicitly justified, because

  it's going to be risky.

            And then there is a lot of research which is

  only what we call information risk research.  That is,

  unauthorized disclosure is where the risk is.  That

  research would not have an IRB review, but would have

  to follow standardized data security measures loosely

  based upon HIPAA kind of measures is the proposal.  And

  the idea there is you're not actually exposing people

  to physical or psychological risks, and that

  you -- having standardized data security -- is the best

  way to protect people from information risk.

            So, surveys, focus groups, interviews, maybe

  economic and psychological studies with mentally

  competent adults would be excused from IRB review as a

  result, because it's really only information risk, but

  would adhere to strict data security standards.  Research,

  therefore, based on secondary use of existing

  data -- which, again, doesn't expose people

  to physical or psychological risk, would qualify for

  this excused status, which is a new status.

            Written consent would be required for all uses

  of biospecimens with identifiers or without identifiers

  because, in our view, all biospecimens in the near

  future are going to become identifiable with either

  existing technology or soon-to-be-developed technology.

  So the distinction there between identifiable and not,

  I think, really has to go away.

            And consent would be a sort of standardized,

  general, open-ended consent, not a check box,

  complicated check box, and not for each specific use,

  which would be obtained at the time of admission to a

  hospital or clinic is the proposal.

            Specific reforms would include

  also -- multi-site research would get only one IRB of

  record.  You wouldn't have to go to 80, or 100, or 120.

  And institutions would obviously have to decide whether

  they want to participate in a protocol, but that's

  different than getting an IRB review.

            One of the things we've identified is the fact

  that the minimal risk interventions that determine

  what's minimal risk and can get expedited review

  actually haven't been updated in over a decade -- in 13

  years, to be specific.  And so the proposal is to stand

  up a federal committee that would regularly update

  these minimal risk interventions, based upon data

  submitted or data in the literature, so that you

  actually have a learning process that would be constant

  and dynamic to reflect actual risk.

            Almost everyone who has commented on human

  subjects research protections has pointed out that, you

  know, the rules apply only to that stuff which is

  federally funded or going for FDA approval.  There is

  lots of research that is not covered by these

  regulations, and people bemoan that.  Actually, to cover

  that would require legislation by Congress.  You can

  tell me how likely you think that is.

            But in lieu of that, you could actually

  require institutions to have all their research,

  whether federally funded or not, going for FDA approval

  or not, fall under the Common Rule.  Now, that

  wouldn't -- still wouldn't get the entire universe, but

  it would get pretty close to the entire universe.  It's

  a regulatory way of trying to achieve the goal,

  incrementally.

            We have also proposed an electronic adverse

  event reporting system to develop a web-based reporting

  system that would permit constant input of information

  on research that is being conducted.  So we would

  actually have systematic and pretty comprehensive data

  on adverse events that would allow us to identify which

  research actually is low-risk, contrary to

  expectations, and which research may be hot spots of

  risk that we probably need to spend more time and

  attention with.

            Right now, all we have on that is your gut

  reaction and my gut reaction, which is worthless, in my

  opinion.  So we really do need to have a constant way

  of collecting this kind of data.

            There are also several suggestions related to

  improving the informed consent documents: having

  explicit delineation of information that must be in

  documents, creating more standardized templates at the

  Federal Government level, so that people can be sure of

  what would qualify, and having oral consent for adults

  who participate in surveys, focus groups, and similar

  types of research, because, after all, their answers

  suggest consent.  And if you don't get their answers,

  you don't get data.  So this seems like one of the

  areas we can both streamline and increase protections.

            Standardized protection for information risk.

  Right now IRBs decide how data security is going to be

  done.  Typically, they're not composed of experts in

  information technology and data security.  So the

  proposal is to require institutions to implement

  HIPAA-like data security standards for research posing

  only information risks, so that everyone knows what the

  rules are.  This should also facilitate the exchange of

  samples, the exchange of data, because everyone will be

  under the same regime.

            This comes to, I think, the reason I was

  invited, which is what can the Presidential Commission

  add?

            So, here is a -- I will be very frank.  Here

  is my warning to you.  I have two warnings.  The first

  is whenever there is a call for a response to a problem

  or a scandal like the Guatemala situation, there is a

  tendency to just add more regulations, another layer.

  "We're being tough."  "Here are increased requirements."

  I think this is part of the way we get burdensome,

  inefficient regulations that end up not really

  protecting people, but sort of satisfy a one-day need

  for a headline of, "We're Doing More."

            I think the correct response is to reevaluate

  the whole package of protections to see what's helpful,

  what's unnecessarily burdensome, and actually, where we

  can make good, productive changes.

            So, I think one thing that would be helpful is

  for the commission to look at the ANPRM and endorse the

  need to reevaluate existing regulations in general, and

  then to go through and maybe take some stands on some

  of the things that are in there, in particular, and add

  to -- given your experience now -- your voice to

  whatever you think is good, and maybe criticize what

  you don't think is good.

            I would say the interagency group that I had

  chaired at OMB on the -- to develop the ANPRM, we

  recognized that we couldn't address all the issues that

  needed reform, that if we wanted to get something out,

  we had to be efficient and focused.  And we explicitly

  set aside issues related to international research as

  something that we couldn't address in this time.

            One area that we had raised and set aside, but

  thought really needed attention -- which I am going to

  suggest to you -- is the area of equivalent

  protections.  That is, when you go and look overseas at

  other countries and research conducted in other

  countries, there are provisions for countries that have

  equivalent protections for us to recognize those

  protections, and not impose our own regulations.

            Here is my second warning.  If you take up

  that suggestion, don't kick the ball down the road to a

  future committee, say, "Oh, someone should look at

  equivalent protections, that's a really important area."

  You shouldn't simply identify it as an important area.

  Once, a long time ago, the IOM had another commission

  which sort of identified a number of -- was supposed to

  look at regulations or what needed change, identified a

  number, but didn't propose what the changes should be.

  We need the work.  What will qualify as equivalent

  protection?

            So, here is a suggestion, all right?  Solve

  the problem.  Delineate the principle or principles for

  determining what should qualify as equivalent

  protections to our Common Rule.

            Specify what differences in regulations are

  not ethically significant -- we shouldn't worry about

  them; whether differences in continuing review

  timelines or processes for certification are necessary or

  different.  Are they significant?  Yes, they differ

  from ours, but are they ethically significant, or do

  they still provide equivalent protections?

            Evaluate, in particular, whether certain

  current international regulations like ICH or the

  European Union's regulation offer these equivalent

  protections or not, or where slight changes would

  qualify as equivalent protections.

            I'm not actually going to talk about this.

  Okay?  So that's one suggestion, or two suggestions, I

  guess, for you.

            DR. GUTMANN:  Thank you very much.  Before we

  engage in discussion of this, let me just say to

  everybody attending that our system of taking public

  comment would be if you have a question or a comment

  relevant to this or any other session, we have cards

  available.  Please write your name and the question or

  comment down, and give it to any staff member here.

            Will people who are members of the staff

  please stand up, so -- there they are.  So there are

  staff members in every corner.  And feel free to write

  a question or comment down, and they will

  deliver them up here, so we will know, and we will

  engage with those as our time permits.

            Let me just -- let me begin, and then see what

  other members of the commission have to say.  But

  actually, before I do that, what I didn't do yesterday,

  which I meant to do, was ask the members of the

  commission to introduce themselves.  So, Anita, would

  you begin?  And we will go around the table.

            DR. ALLEN:  Thank you, Amy.  I am Anita Allen,

  professor of law at the University of Pennsylvania Law

  School.

            DR. ARRAS:  John Arras, professor of

  philosophy, University of Virginia.

            DR. ATKINSON:  Barbara Atkinson, executive

  vice chancellor and dean of medicine at the University

  of Kansas Medical Center.

            DR. MICHAEL:  Nelson Michael, director of the

  U.S. military HIV research program at the Walter Reed

  Army Institute of Research.

            DR. FARAHANY:  Nita Farahany, a professor of

  law and philosophy at Vanderbilt University.

            DR. WAGNER:  Jim Wagner, serving as president

  of Emory University.

            MS. ALI:  Hi.  Lonnie Ali.  I am a caregiver

  and an advocate for Parkinson's research.

            DR. HAUSER:  Steve Hauser, chair of neurology,

  UC San Francisco.

            DR. GRADY:  Christine Grady, the department of

  bioethics at the NIH Clinical Center.

            DR. KUCHERLAPATI:  Raju Kucherlapati,

  professor of genetics and medicine at Harvard Medical

  School.

            DR. GUTMANN:  So, Zeke, you mentioned -- and

  we are all acutely aware of -- not only that there are many

  rules and versions of them in different agencies, but

  there is a lot of clinical research concerning human

  subjects that goes on, sponsored by different

  government agencies, and also privately sponsored.

            One of your specific reforms is an electronic

  adverse event reporting system, which would be

  terrific, if, you know, we had it.  The question I have

  is, do we need, before you get an adverse event

  reporting system, an electronic system that enables us

  to know what experiments are going on with human

  subjects that are -- that's just sponsored by the U.S.

  Government?

            We are engaged in an empirical background

  study because we've been asked to assure the President

  that these studies are sound.  And there is no database

  for them.  We know clinicaltrials.gov, but that

  doesn't -- it's not at all comprehensive.

            DR. EMANUEL:  I have certainly made this

  comment for, I think, the last 15 years, that I think

  it's a scandal that neither the head of the FDA nor the

  head of the NIH can actually report how many people are

  on clinical research trials they sponsor, how many people

  have had an adverse event, and tragically, how many

  people may have died, or any other relevant piece of

  data.

            I actually -- so my view of the adverse event

  reporting system is actually that it would do both.

  You would actually know how many people are enrolled

  in -- depending on how you create that -- phase one,

  two, three clinical research trials, if you want to add

  observational studies and other things which are low

  risk.  You could.  But at least on those trials -- so

  we would both know who is on and, therefore, have a

  denominator.  If you can have an adverse event

  reporting system, it will only produce meaningful data

  if you actually have a denominator.  So you would

  actually have to have both, which is know how many

  people --

            DR. GUTMANN:  Right.

            DR. EMANUEL:  -- are enrolled.  I have

  actually also --

            DR. GUTMANN:  That's precisely why I asked

  the --

            DR. EMANUEL:  Right, right.

            DR. GUTMANN:  -- this question.  You have to

  have the denominator base.

            DR. EMANUEL:  Right, right.

            DR. GUTMANN:  Which is really, in this day and

  age, with computerized systems, should be on the easy

  end of the scale from easy to hard to implement.

            DR. EMANUEL:  Well, I do think --

            DR. GUTMANN:  Right?  It's much harder to do

  retrospectively than it is --

            DR. EMANUEL:  Right, in --

            DR. GUTMANN:  -- to do prospectively.

            DR. EMANUEL:  In fairness, because the

  enrollment is distributed in tens of thousands of

  places, in trials, it's not -- you actually have to

  oversee and get those people -- or have some

  requirement for them to actually introduce the data in

  a common format, et cetera.  But I agree with you,

  given the fact that we do have the Web, and it should

  be relatively easy.

            Fortunately, the other thing is six federal

  agencies, including the NIH, FDA, OHRP, the VA, and

  DoD, have pioneered a template called the Basel Adverse

  Event Reporting System, which is now being beta tested

  related to genetic -- gene therapy studies, which I

  think is a good platform that we can build on.

            So, for a long time I have agreed with that

  statement.

            DR. GUTMANN:  So it's doable?

            DR. EMANUEL:  I think it's doable.  I mean

  it's not going to be free, like everything.  But the

  question is, isn't that a sort of -- it's the sort of

  basic amount of data you need to really analyze the

  system and its safety.

            DR. GUTMANN:  Correct.

            DR. EMANUEL:  And then I think we'll find out.

            DR. GUTMANN:  Correct, correct.

            DR. EMANUEL:  How safe is the system, and

  also, where do we focus the resources to make it safer?

            DR. GUTMANN:  Correct.

            DR. EMANUEL:  And where can we sort of, as it

  were, not have to spend a lot of resources, because it

  already is safe, and it's not going to be a problem,

  and we can target the limited resources in a more

  effective manner for protecting people.

            DR. GUTMANN:  Thank you.  Jim?

            DR. WAGNER:  Zeke, thanks for the presentation

  and overview -- very clear -- about where we hope to go

  with these reforms.  And I appreciate also the two

  suggested charges for the commission to offer an

  opinion on.

            I was imagining that you might ask also our

  opinion, or for us to say something about the notion

  that the Common Rule or the revised Common Rule might

  be applicable to non-federally-funded research, as

  well.

            As I contemplate that challenge, I wonder what

  is being thought about.  What do others imagine

  the -- anticipate that the mechanisms of accountability

  would be for the Federal Government to try to impose

  the common rule on research that they, themselves, are

  not supporting?

            DR. EMANUEL:  Well, I mean there is a -- I

  would presume you could create a situation where there

  are a whole series of penalties that aren't just, you

  know, turning off the federal money spigot, which is

  the sort of common penalty at the moment, suspending

  your ability to --

            DR. WAGNER:  So essentially, a criminal

  mechanism?

            DR. EMANUEL:  Well, it could be, I presume,

  civil fines, as well as criminal penalties.  I'm not an

  expert, I'm not a lawyer in how you might -- you know,

  the administrative law of this.  But we have lots of

  other, you know, either financial or other penalties

  that people -- could be imposed upon institutions,

  so --

            DR. WAGNER:  It was a financial category that

  I was hoping you had some creative thoughts about,

  because obviously that's the mechanism that we have for

  those that are federally funded.  And even for

  institutions like universities who may be performing

  work that's not directly federally funded, but owing to

  the fact that we are under -- you know, that we have

  large amounts of federal funding, I see good ways to

  put teeth into the universities.

            But into private institutions --

            DR. EMANUEL:  Well, look.  I think fines are

  possible.  So a large category that doesn't receive

  federal funds is sort of IVF clinics, which, because

  they're involved in reproduction, tend to fall outside

  of almost all our regulations, because we can't agree

  on what should happen there.

            DR. WAGNER:  Good point.

            DR. EMANUEL:  So that would be a case, it

  seems to me, of where you might have sort of civil

  monetary penalties imposed.

            DR. WAGNER:  So, in your view, we shouldn't

  hesitate to make some recommendation about broader

  applicability of the revised Common Rule, simply

  because we anticipate it would be difficult to enforce?

  You feel there are mechanisms -- your opinion --

            DR. EMANUEL:  Yeah, I guess if I were hesitant

  on that, it would mostly be because it would involve

  legislation, and I just want to be practical.  Let's do

  what we can do, and let's not sort of --

            DR. WAGNER:  So --

            DR. EMANUEL:  -- windmills, which just aren't

  likely to happen.  I mean that call has been out there

  for decades and, you know, just not going anywhere.  We

  should be more practical, and let's get important work

  done that we -- that is within our purview.

            DR. WAGNER:  Thanks.

            DR. GUTMANN:  Raju?

            DR. KUCHERLAPATI:  Thank you very much.  You

  know, when the commission was thinking about, you know,

  the kinds of topics that it wanted to examine, and I

  think when we talked with colleagues, this was a very

  important issue to try to reexamine the Common Rule, so I

  greatly appreciate the ongoing efforts that you have described.

            The question that I have is that to -- many

  people argue that to improve human health, that you

  need to use humans more as experimental organisms.  And

  if that is, indeed, the case -- I don't mean it in a

  bad way, but in a good way -- that if we were to do

  that, that means that, you know, we're going to have a

  lot more humans participating in these types of

  studies.

  Do you think that we have an adequate amount of

  infrastructure to be able to handle that, or that these

  new proposed rules would be able to deal with that

  increased number of individuals who might wish to

  participate in these types of studies?

            DR. EMANUEL:  Well, that's probably well

  beyond my expertise, but let me say it never stopped me

  from making comments before.

            So let me just say a lot of it depends upon

  the kind of research you have in mind.  Some of it is

  much more intensive in the laboratory.  Some of it is

  observational, some of it is more epidemiological, and

  sort of scaling it up and scaling it down

  is -- requires less bricks and mortar, as it were.  And

  the infrastructure, I think, is available.

            So, I am less concerned about the -- you know,

  do we have the capacity, it seems to me, than I am

  about can we make the oversight both better and more

  efficient.

            I mean one of the things that I think strikes

  many people is if you have a relatively large

  multi-center trial, getting it from sort of

  protocol-written to running is probably a two-year

  process.  That seems crazy, from all sorts of

  standpoints.  It's a waste of money, it's a waste of

  science, since, by the time you get it up and running

  you may be behind the curve.  And it seems to me that

  it doesn't -- that two years is probably not adding to

  protections.  And that is, I think, what -- in my view,

  that is the biggest lesion we have.

            DR. GUTMANN:  Yeah.  Nita?

            DR. FARAHANY:  Thank you for the presentation

  on this.  I am grateful for all of the work that you've

  done on streamlining, particularly the Common Rule and

  the recommendations for doing so.  A lot of what we've

  heard is that many of the regulations, as they exist,

  are quite cumbersome and difficult to comply with, and

  I think bringing it in line with the rationale for

  protection, particularly in areas like creating

  exceptions for surveys makes tremendous sense.

            One of the things that we have heard a lot and

  been struggling with ourselves is thinking about how do

  you make the ethics requirements actual -- not just,

  you know, check-boxes that people sign off on, but how

  people actually understand that they are designed to

  protect human subjects, such that researchers are

  actually seeking to do that, rather than just check off

  boxes.  It seems like some of the revisions that you're

  suggesting help to do that by getting rid of

  regulations where it doesn't make sense.

            But how do you ensure that those regulations

  that do exist, particularly when you extend it to new

  institutions, become more than just a check-box, and

  instead, really a consideration by researchers about

  how to ensure the protection of human research

  subjects?

            DR. EMANUEL:  So you're getting into human

  psychology and institutional design, another area I

  have no expertise in, but it will not prevent me from

  making more comments.

            So, the first thing is I do think we have

  entered what, in my view, is a dangerous place, which

  is increasingly I do think researchers view this -- two

  things -- first, as an onerous hurdle to get over, and

  therefore the check-box mentality comes more into play,

  especially if they can't see the rationale between what

  they're doing the research on and the regulations.

            And so, I do think actually, ironically,

  slimming down the -- or not slimming down, but focusing

  the full IRB review on those things which are truly

  greater than minimal risk is actually going to help

  with compliance.  You will also, therefore, I think,

  see less of the attempts to contort things and say,

  "Well, it's not really research, this is other kinds of

  work, quality improvement," or whatever, so I can evade

  the rules.  And so, I do think that, in and of itself,

  is going to be helpful.

            The other thing which I find, from both

  teaching researchers a lot in this area and just

  talking to them, is the rationale for why they're

  supposed to do this is disconnected from what they're

  supposed to do.  And so, if you don't have a good

  justification -- I mean these are really relatively

  intelligent people.  This isn't their everyday world,

  but they do understand justifications, and they do

  understand, yes, if I did this it would be better.

            The problem is, we have a situation where, if

  I did this, it's either not going to be better, or

  going to be worse, and it's going to make my life hell.

  So, I think that is a problem, and leads to a certain

  kind of disrespect for the rule.

            So, I think if we actually connect the

  justification with the protections, that will help,

  itself, in establishing -- is that a problem?

            DR. WAGNER:  No, I just wanted -- real quickly

  to that point, we were having this conversation about

  ensuring -- trying to reconcile the divorce between

  regulation and rationale.  And it seems that -- I hope

  what I hear you saying is that both parties need to be

  modified to mend this marriage.

            In other words, simply to reinsert the

  rationale around each of the tic marks just makes for

  more reading, and no less onerous responsibility.

            DR. EMANUEL:  Right, right.

            DR. WAGNER:  But rather, that both sides need

  to be revised, the number of tic marks and what they

  really amount to, in terms of a pledge that says, "I am

  satisfying a particular rationale," as opposed to many,

  many, many tic marks that ensure that you don't have to

  think.

            DR. EMANUEL:  Right, right.

            DR. WAGNER:  Or are intended to ensure that

  you don't have to think about the rationale, because

  someone else has imagined that if you check all those

  properly, you are in compliance.

            DR. EMANUEL:  Well, I agree with you.  In

  general, I don't think most clinical researchers are

  sort of -- malicious people who just try to get rid

  of -- I mean if they are evading rules, you have to

  think there must be some reason that normally otherwise

  pretty good people who do the right thing most of the

  time are really trying to get around this.  And that

  has to be that this makes no sense to them, and is

  really looking like it's just trying to impede them for

  no good reason.  That is a bad place to be, I think,

  institutionally.  Right, right.  You have to -- right,

  you have to change the rule.

            And so, I do think that, to those people who

  say, "Well, we can do all this under the current regs,"

  I actually think that's -- first of all, I think it's

  wrong; you cannot change the defaults.

            Part of what the ANPRM is trying to do is to

  change the defaults.  The default of a survey is you

  will get oral consent by people actually answering your

  question, and you don't need an IRB, you can adhere to

  the data safety monitoring -- the data security rules.

  Or, if you're doing minimal risk, you get expedited

  review, and you fill out a shorter form.  That change,

  I think, will help substantially, because people will

  understand how -- what they're doing is linked to how

  they're being regulated.

            DR. GUTMANN:  So, this is very helpful,

  because --

            DR. EMANUEL:  Sorry.

            DR. GUTMANN:  -- this is -- no, this is

  something that is a very important broad theme that we

  have heard over and over again.  And we really, I

  think, along with the reforms you are proposing, need

  to recommend something quite broad with some specifics

  attached to it on how to make progress here.

            Because we've been moving as, you know,

  science and medicine -- if you take what Raju said, we

  need more and more of this kind of research, and yet

  the spirit behind it of doing good and doing good in an

  ethical way is not being promulgated through the way

  our rules are.

            Let me take -- we have three questions and

  comments from those in attendance here.  And let me

  read them -- let me take them one at a time and see

  if -- most of them are directed at you, some of them

  are directed at the commission, but let me begin and

  you can reply.

            This one is from Ruth Macklin, who is a

  professor, we all know, at Albert Einstein College of

  Medicine, a professor of bioethics.  Ruth, would you

  stand up?  There is Ruth Macklin.  I will just

  summarize it, so -- and Zeke can answer.

            Some social science research has risks that

  fall between physical risks typical of drug studies,

  and mere informational risks.  Examples include

  domestic violence, adolescents engaged in high-risk

  behavior, research involving people engaged in illegal

  activities -- for example, drug use, sex work.  How do

  you -- and now I'm just -- how do you deal with that,

  given that you want to cordon off certain clinical

  trials for the more extensive -- this is a question

  that is -- you had to have thought about, because

  whenever you draw lines --

            DR. EMANUEL:  You see all this gray hair?

            DR. GUTMANN:  Yeah.

            DR. EMANUEL:  So, first of all, the ANPRM

  specifically asks this question, which is, "In the case

  of surveys and other psychological" -- where the main

  risks are psychological -- "how do you identify what

  would be greater than minimal risk in that sphere?"  We

  don't think that there is a great example.  That is the

  first thing.

            The second thing is notice that I think I said

  on the surveys focus group that you would not have to

  go through an IRB, or be excused from the IRB system

  under the new proposals.  I think I restrict it to

  competent adults, and the ANPRM does restrict it to

  competent adults, and that is for a reason.  You might

  want to have additional protections for children.

            But in some cases we know that, for kids who

  are engaged in high-risk behaviors and other things, we

  do want to actually survey them.  And the question is

  whether it's a higher risk not to have the information

  from them, or --

            DR. GUTMANN:  Yeah.

            DR. EMANUEL:  -- at better -- or that they're

  pretty savvy and will be able to differentiate whether

  they want to participate or not.

            So I think, you know, one way -- place to

  start is let's say, for some -- maybe domestic

  violence, sex abuse, illegal behaviors, initially

  you're going to put it into the minimal risk but not

  greater than minimal risk category.  It gets reviewed

  by a person, that person can decide whether it really

  needs full IRB review or not.

            And then, by God, we'll be able to study this,

  and we'll have an adverse event reporting system, and

  we will understand whether we're getting a lot of

  adverse events or, in fact, whether this is actually,

  you know, really not that risky and people are

  participating.

            DR. GUTMANN:  When you say "a person," I

  assume you're going to ask for somebody who is

  independent of the equivalent of a, you know, of an

  IRB.  In other words, the person is not going to be a

  person who is connected to the research itself.

            DR. EMANUEL:  So -- right.  The idea is --

            DR. GUTMANN:  Because the whole --

            DR. EMANUEL:  Right.

            DR. GUTMANN:  I think the thrust behind this

  question is are you going to leave it just to the

  researchers themselves to determine this, and

  this -- the fact that you're asking for an expedited --

            DR. EMANUEL:  Can we go back to the slides for

  a sec?

            So, one thing I probably didn't emphasize

  enough is this slide.  So, in the greater-than-minimal

  risk, you go to a full IRB review, which is the current

  system, basically, and you -- all that's being altered

  here is the annual review process.  So when you're not

  actually exposing people to additional risks, you don't

  have to have a -- the minimal risk, you get reviewed by

  one trained person.  Typically, that person is either

  an IRB member or someone in the protocol office,

  unrelated.

            Now, in the information risk, the -- ANPRM

  suggests as a possibility for people to comment on, you

  would have a form that you would have to register with

  the IRB.  You can imagine two possibilities:  you

  register with the IRB and you can either start

  immediately; or, you give the IRB a week to sort of

  look at the form and say, "Hey, you know, we think that

  this should go into the -- either a full IRB review, or

  one trained person."

            DR. GUTMANN:  Good.  Okay.  For Zeke, this is

  from Roger Glass, director, NIH Fogarty Center.  Many

  companies are taking their clinical trials overseas to

  avoid regulation, scrutiny, and ethical issues, such as

  paying high fees for doctors to recruit patients.  Much

  of this data remains unpublished, especially if

  it's -- now my -- I don't know whether to blame the

  handwriting or my ability to read it, or -- oh, sure.

  Especially if its results are negative or if adverse

  events occur.  How does a Common Rule reform affect

  this form of research?

            DR. EMANUEL:  Well, if in fact --

            DR. GUTMANN:  Roger, would you please stand

  up, so people -- there is Roger.

            DR. EMANUEL:  If, in fact, the companies are

  seeking FDA approval, typically they're going to have

  to register with the FDA.  But again -- well, not

  again.  This -- ANPRM may not solve all problems in the

  entire universe of problems here, so it's -- I think we

  said we didn't tackle everything, that's why we didn't

  do equivalent protections and other things.

            So, I think there could be problems, and

  especially, you know, if people are intent on evading

  reporting and releasing data, there is going to

  be -- it's going to be really hard to enforce.  You're

  not going to develop rules for that problem.  So it's a

  problem which revision of the regulations is not going

  to solve, it seems to me.

            On the other hand, if they are planning to go

  to the FDA, they are going to have to register and use

  the --

            DR. GUTMANN:  Yeah.

            DR. EMANUEL:  -- God willing, the adverse

  event reporting system.

            DR. GUTMANN:  Yeah.  The third question I'm

  going to say is from Joseph Millum of the NIH, and it's

  about equivalent protections.  And I'm going to save it

  for the next session.  Joseph, just stand up so I can

  see where you are.  Joseph, we will get back to this

  when Christine and Nelson speak, because it's actually

  directed to the commission.  So we will hold off.

            I'm going to take two quick comments or

  questions, and we're going to eat into our later break.

  Anita and John?

            DR. ALLEN:  I have a question for you, Zeke,

  about your recommendations regarding lightening the IRB

  requirements for multi-site research.  I understand the

  problem.  It seems absurd, in a way, to have 25

  different IRBs looking at the same research protocol.

            But I'm a little bit concerned about what we

  might call IRB shopping.  Not all IRBs are created

  equally.  There is tremendous variation in quality and

  composition.  Isn't there a risk if we limit the need

  for IRB oversight to, say, one IRB for a multi-site

  project, that we will have some pernicious IRB

  shopping, and the researchers will look for the easiest

  IRB to get through, the quickest ones, and not

  necessarily the best ones?

            DR. EMANUEL:  I would have thought that was

  the problem today, when you want to get through -- you

  have to get through 100, so you're looking for the

  easiest ones.  But the ANPRM does ask about how to

  limit IRB shopping, and actually has several proposals

  or suggestions for how you might identify which IRB you

  would have to go through.

            So, we could consider the following.  You have

  to go through a national IRB, like the NCI's national

  IRB for cancer, multi-center cancer trials.  Another

  possibility is you have to go to the IRB for the PI,

  the principal investigator, so that, you know, you

  would be choosing your principal

  investigator -- presumably that's on scientific

  expertise, and not on who their IRB is.  But maybe

  there is another good suggestion you have that could be

  added there.  I think this is not a trivial problem,

  but I do also think that there are some reasonable

  solutions.

            Let me also say I think the other problem is

  that, as much as this is commented on, let me just say

  industry has a very big reason not to sort of stint on

  this.  Because if something goes bad, and they have

  been seen to do a malicious thing like trying to skirt

  the system, they're going to lose hundreds of millions

  of dollars.  So they're not into that game.  And I

  think most clinical investigators, it's the IRB of

  their home institution that they're going to.  And

  under this system, you know, presumably they'll have

  more time to focus on the really high-risk research.

            And so -- and since you're only going through

  one, you're not going to be wasting so much time in

  others.  So I think the incentive to sort of evade is

  going to actually go down, rather than up.

            DR. GUTMANN:  John?

            DR. ARRAS:  Yeah.  Zeke, thanks for a really

  helpful presentation.  I really do appreciate the drive

  toward simplification, but I am wondering if it might

  extend a bit far in one area that I can think of, which

  is the area of kind of blanket generic consent for

  research on tissue samples, right?

            So, you know, we have some case studies out

  there where people gave consent to study their samples.

  The Havasupai Tribe comes to mind as an interesting

  case study, where they gave consent for, I believe it

  was, diabetes research.  And then researchers turned

  their attention to the linkages between this tribe's

  DNA and psychiatric conditions.

            How would your approach deal with a case like

  that?

            DR. EMANUEL:  What -- I mean I don't

  understand what the dilemma is here.  The researchers

  collected under one rubric, and then they bait and

  switched to do a different kind of research.  That's

  not permitted.

            Now, if you --

            DR. ARRAS:  Okay --

            DR. EMANUEL:  So, here is what the

  tribe -- they don't want their stuff done for general

  research.  Guess what?  They don't sign that piece of

  paper, and they don't give their consent.  It's just

  the old, normal way.

            It seems to me, in that case -- I don't

  actually understand all the hoopla about that case, to

  be honest, in the following sense.  They didn't

  actually consent for the research that was done, as I

  understand it.  I'm not an expert in the case.  They

  consented to diabetes research.  This wasn't diabetes

  research.  Guess what?  Didn't qualify.

            DR. ARRAS:  Okay.  So then I guess --

            DR. EMANUEL:  In the new system there will be

  a form.  You don't want your stuff used for other

  research, you don't have to sign the form.

            DR. ARRAS:  Oh, okay.  No -- so when you said

  there would be a kind of generic consent form --

            DR. EMANUEL:  It's not required consent.

            DR. ARRAS:  -- I just sort of assumed that

  would mean that you gave permission for any and all

  kind of research in the future.

            DR. EMANUEL:  Yeah, you would.  And if you

  didn't want any and all research done, you wouldn't

  sign that form.  Right?  It's amazing how that works.

  When you don't give your consent, you're not supposed

  to do it.

            DR. ARRAS:  Okay.

            DR. EMANUEL:  So, I mean, it seems to me that

  the Native American tribes that are worried about their

  samples being used for something that they didn't

  authorize, you're right, it ought not to be used for

  something they didn't authorize.  And this is giving

  them an opportunity.

            What we know from the data that's been -- you

  know, now there are thousands and thousands, tens of

  thousands of people, who have actually answered

  questions on this.  In general -- where "in general"

  means 80 to 90 percent of Americans -- they want their

  samples to be used for research.  What they want to be asked is,

  "Will you use it for research or not?"  They don't want

  to be asked, "Which lab is going to do it, which

  disease is going to do it," and all that other stuff

  that had been suggested for the -- lo, these 15 years.

            That's what this ANPRM -- it's actually

  listening to what people say in their mind is important

  consent, and trying to put it into practice.  If you

  don't want your sample used for determining who your

  ancestors were, there is a real simple thing.  Just

  don't agree to it.

            DR. ARRAS:  Okay.

            DR. EMANUEL:  That's what the word "consent"

  is.  It's not trying to take away consent, it's trying

  to say, "This is what the consent is going to cover."

            DR. GUTMANN:  Zeke, thank you very much.

            DR. EMANUEL:  No problem.

            DR. GUTMANN:  It was very helpful, thank you.

            (Applause.)

 

This is a work of the U.S. Government and is not subject to copyright protection in the United States. Foreign copyrights may apply.