TRANSCRIPT: Meeting 3, Sessions 1 & 2

Commission Deliberations and Public Comment

Date

November 16, 2010

Location

Atlanta, Ga.

Transcript

 

Amy Gutmann:
We’re going to begin with the first principle of the five. These are not in any priority order. These are five principles that we think together can be recognized and are largely compatible. The idea of Public Beneficence is to act so as to maximize public benefit and minimize public harm. The principle encompasses the duty of society and of governments to promote intellectual activities and institutional practices, including scientific and biomedical research, that have great potential to improve the public’s wellbeing. The Belmont Report, which brought this principle to the fore as far as presidential commission reports go, was directed to research involving human subjects. It defines beneficence in essence to require that, and I quote, “persons are treated in an ethical manner, not only by respecting their decisions and protecting them from harm, but also by making efforts to secure their wellbeing.” There are two general rules that stem from this principle that we have taken very seriously. The first rule is simply stated: do no harm. The second rule is to maximize potential public benefits and minimize potential public harms.
 
For synthetic biology and other emerging technologies, we need to apply the principle of beneficence beyond the individual level, which was the primary emphasis of the Belmont Report because it treated conduct regarding human subjects research. We have focused largely, though not exclusively, on the institutional, community, and public levels. This is an important feature of all of these principles — that they apply not only to individual actions, but also to institutional and communal responsibilities.
 
So, to put it briefly, policy makers should adopt a societal perspective when deciding whether to pursue particular benefits of synthetic biology research in the face of risks and uncertainty. When deciding whether to restrict these pursuits, similar consideration of community interests and the potential positive and negative impacts is essential. We’re going to begin with our recommendations under Public Beneficence, and Lonnie Ali will start. We have three provisional recommendations under Public Beneficence, and I will ask three of the Commission members to present those, and then we’ll open it up for discussion and comments.
 
Lonnie Ali:
The Commission during deliberations has found that, as Amy stated, syn-bio has great potential to improve the public’s wellbeing. The impact on the public can be seen in the areas of medicine and healthcare, food, transportation, and energy, along with other means to improve people’s lives. However, there is a diversity of interests and practitioners: public, private, and what we call the do-it-yourselfers. The intermingling of academic and commercial research, both basic and applied, provides fertile ground for innovation. The collaboration of private and public funds in the development of synthetic artemisinin, an anti-malarial drug, demonstrated how academic, public, and nonprofit interests have come together to promote good and global wellbeing. However, most potential products of syn-bio are in very early stages of development. Basic research is critical to further expansion of this science and the effective translation into useful products. At the same time, this area of exploration is in competition for scarce resources with other areas of science and other societal needs. In the future, decisions will be required regarding which research directions deserve funding over others. These decisions should be driven in part by which strategies offer the most promise, based on scientific, technical, and social considerations.
 
Our first recommendation states: through a central body, such as the Executive Office of the President — the EOP — the federal government should undertake a coordinated evaluation of current public funding for syn-bio activities, including risk assessment and the consideration of ethical and social issues, to ensure effective use of public funds, increase transparency, and avoid redundancy.
 
As an emerging technology, the field of synthetic biology continues to advance, resulting in novel products and applications. The ability to maintain oversight and accountability, provide risk assessment, and evaluate potential benefits that can contribute to the public good becomes increasingly difficult as more and more syn-bio research is funded through private dollars. This imbalance impedes the ability of any oversight agency to provide good stewardship over research efforts in the private sector that may not be visible or accessible for review.
 
In addition, privately funded research projects may not always be interested in benefitting as many people and communities as possible. Identifying potential markets here and in wealthy countries abroad may affect where private research dollars are directed. The benefits of public funding bring a measurable level of oversight and accountability to any emerging technology while maximizing the public good. Government-funded research provides for substantial peer review prior to funding, along with basic disclosure requirements for results, which help ensure that public dollars are being directed towards research that has the greatest impact in promoting the general welfare.
 
In addition, public funding of syn-bio ensures a level of transparency and accountability, which is mandatory for responsible stewardship while gaining public trust, whereas private funds do not always hold research to risk assessment standards that will maximize the public benefit while minimizing the risks associated with a new and emerging technology. Centralizing the coordination of current public funding for syn-bio activity also creates efficiencies and avoids redundancy, directing scarce public funds to where public benefit is maximized and potential risks are minimized. Public funding can enhance the advancement of research that addresses the world’s most pressing needs in health, energy, food production, and environmental applications. Therefore, this Commission found that our recommendation was essential to the idea of Public Beneficence.
 
Stephen Hauser:
The public funding of research is extremely important. It engages all of us. It brings accountability, oversight, and focus to the field. There are some other inherent advantages of publicly funded research. There are uniform standards for protection of subjects. Often there are disclosure requirements for results and substantial peer review that determines funding, and in fact, imbalances between levels of public and private funding make it more difficult for us as a society to be good public stewards of research. One other area where public research may be extremely important is in what’s been called the “high risk/high reward” arena. These are areas of immense importance nationally or internationally, but where market forces alone may not be adequate to encourage innovation because of high risk. Two areas where this is particularly relevant are the development of antibiotics, where there are not financial incentives currently in place, and the very high-risk development of therapies for neurodegenerative diseases, such as Alzheimer’s disease and Parkinson’s disease, which are increasingly prevalent globally as we age. Public support is needed to ensure that we take full advantage of opportunities in synthetic biology.
 
In addition, even basic research is an important avenue for public engagement and support. Basic science is work that focuses on clarifying our understanding of fundamental principles of science in the natural world. Although direct commercial applications of basic science may not always be immediately evident, this work in many ways can be extremely valuable to society and in fact is essential if we are to be successful. The understanding of basic principles is also extremely valuable in preparing us for the emergence of unanticipated risks and problems that require rapid identification of culprits and creative responses.
 
So the second recommendation of our committee states as follows: Advancing the public good should be the primary determinant of relative public investment in synthetic biology versus other scientific activities. The NIH, Department of Energy, and other federal agencies should continue to evaluate research proposals through the peer-review mechanism and other deliberative processes created to ensure that the most promising scientific research is conducted on behalf of the public.
 
Amy Gutmann:
Thank you, Steve. The third recommendation of the three under Public Beneficence will be presented by Anita.
 
Anita Allen:
Thank you. The principle of Public Beneficence would appear to require that researchers, inventors, patent holders, and others work together to develop and create strategies to maximize opportunities for innovation. We can all agree that innovation, and in particular biomedical innovation, has often benefitted the public. Innovation can save lives and it can prolong lives. It can improve the quality of life. A new therapeutic medicine or a vaccine, for example: these kinds of things are clearly public goods. As a practical matter, the innovation effort requires a mix of incentives and resources, which efficiently and productively enable good work. The free market goes a long way to promote innovation, but it’s not enough. Innovation often benefits from a mix of secrecy and selective disclosure and sharing. Patents, trademarks, trade secrecy, and copyright comprise a legal framework that seeks to facilitate and encourage innovation.
 
Some thought needs to be given to the ways in which academic traditions, business and industry practices, information costs and transaction costs, and the preexisting regimes of intellectual property bear today on the availability of basic research results. There’s likely to be a role for government here. That could be education. It could be facilitation. It could be regulation. The Commission has not prejudged exactly what mix of functions is called for. Surely appropriate agencies with the right expertise and authorization should have on their agendas an examination of whether licensing, registries, and other sharing policies and practices are in place to ensure the availability of the knowledge and the tools needed for advances in the field of synthetic biology. We heard testimony from experts who suggested to us that licensing alternatives could include methods of compulsory or bundled licensing, patent pooling, and broad non-exclusive licensing for foundational technology. Because synthetic biology is, in large part, based on the application of engineering principles that involve the use of standardized modular parts, we are advised that the licensing of these standard components might be a good idea for us to pursue — that it might be critical. Standard-setting organizations could be formed by government, or the private sector, to set standards and negotiate with patent holders to license, and commit to licensing, their technology on reasonable and non-discriminatory bases.
 
With these ideas in mind, the Commission has come up with a recommendation that we believe will further the goal of public beneficence. That third recommendation is that, through NIH and other funders, such as the Departments of Defense and Energy, and in collaboration with the US Patent and Trademark Office, the government should examine whether current research licensing and sharing practices are sufficient to ensure that basic research results involving synthetic biology are available to promote innovation, and if not, whether additional policies or best practices are needed. Again, the idea here is to make sure that sufficient sharing practices are in place to encourage what’s going to be a foundation for innovation in the future. Thank you.
 
Amy Gutmann:
Raju, would you like to say anything on any of these three principles?
 
Raju Kucherlapati:
No, they’re good.
 
Amy Gutmann:
I’m going to open it up to comments from Commission members and then move to the audience as well. Who would like to begin?
 
Nita Farahany:
Thank you for presenting those so well and capturing our shared views on this. I have just a few comments about this section. First, I think it’s right that we need to be concerned about public beneficence. As we think about the different mechanisms to do so, I think that we have it largely right. The language of the recommendations is something that I am quite comfortable with; some of the discussion that we’re having is something that I would like to focus on a bit.
 
First, in general, the concern about public versus private funding: I think we recognize that there is a great need, particularly in areas that are of public concern, to have public funding. I’m not sure that I agree with the idea that an imbalance, per se, is a bad thing. In part, I think, much of the innovation in this area is being driven by private funding. I think that it is a terrific thing that there are so many commercial applications and opportunities available. Given the limitations of public funding, I also think that it’s quite fortunate that this area has such tremendous private investment.
 
So, I recognize that there are some limitations and concerns with things like transparency and oversight that come with private funding driving a lot of the research. I want us to think carefully about ways in which we can increase the transparency of private funding mechanisms. I’m not sure that I think the right answer is simply to try to find parity between public and private funding; rather, I think it is to look at different oversight mechanisms. In particular, looking at the third recommendation, where we say that through NIH and other funding sources, such as the Department of Defense, and in collaboration with the US Patent and Trademark Office, the government should examine whether current research licensing and sharing practices are sufficient. I think the first part of this — through NIH and other funding sources — may not be the sole answer. I’m not even sure if that’s the venue by which I would suggest we want to ensure that current research licensing and sharing practices are as open as possible, because I think funding is not the primary venue by which we can get there.
 
Then the latter half of this — we say we want to make sure that there’s sufficient research to promote innovation. I think that’s right; we absolutely want to promote innovation, but also to take into account the risks that are attendant to it. Rather than just ensuring that our basic research results are available to promote innovation, I want to ensure that they are sufficient to ensure oversight of any unique risks and unique concerns that may arise from synthetic biology.
 
Amy Gutmann:
Good, thanks, that’s very helpful.
 
Lonnie Ali:
I just wanted to say, I agree with Nita on that, but just know that the imbalance occurs because most privately funded research in syn-bio is where there are global markets that will be available for novel products, and potential profitability is a significant motivator in that. Therefore, the public funding comes through when there are high risk/high reward opportunities to increase the public good — not only here, but also abroad. That really is why the imbalance occurs; it’s the public beneficence end of it. But I do agree with Nita, there needs to be more oversight of the private side as well, and it may not be through public funding that we ensure that.
 
Amy Gutmann:
One thing we should recognize is that there is, in this country, a very close and dynamic interaction between public and private funding. So, take one of the advances in synthetic biology that’s furthest along in the possible — really breakthrough — treatment of malaria, artemisinin. That is now licensed to a private firm. However, a large part of the research that went into that was publicly financed through the single biggest research engine in this country, which is university-sponsored science, a large part of which is government funded.
 
What this recommendation (with Lonnie and Nita’s friendly amendments) suggests is that we make sure — that is, that the government makes sure — that the practices and the licensing and the regulations ensure that the basic research results involving synthetic biology are available both to promote innovation and to avoid unnecessary and inappropriate risks. What the combination of public and private funding would be is, in a sense, parasitic on those concerns. We don’t come to the table with any prior view of what proportion of the funding should be public or private, but we do think that the public entities are responsible for ensuring that this research go forward in a way that meets the standard of public beneficence, hence these recommendations.
 
Daniel Sulmasy:
Lonnie, you slipped so seamlessly into reading the exact wording of the first one, I don’t know if you actually changed it, but I can certainly recommend some language later. I think it needs to be clearer than it was in the initial draft that we had of these that we’re looking for funding to actually study mechanisms for risk assessment and funding for evaluating the ethical and social issues that are raised by synthetic biology. But I think the wording of what we got initially doesn’t really make that as clear as it needs to be, and, quite frankly, industry often doesn’t have the greatest incentives to do that kind of research, so there may be, in that sense, bigger opportunities for government funding in those areas in particular.
 
Amy Gutmann:
One thing we know from experience in this — and it’s just a case of the historical record — is that if there are public goods that it is in nobody’s economic interest to pursue, such as pursuing what is safe, then if government doesn’t fund that, nobody will, or if government doesn’t create incentives such that it be funded, nobody will. It doesn’t always have to be direct funding, but there need to be incentives to pursue the public good, even for those research designs in synthetic biology, for example, that will overwhelmingly be good. But if they bring with them unnecessary risks, and there’s no incentive put in place by the public to avoid those risks, then we’re not fully, as a society, meeting the standards of public beneficence.
 
John Arras:
I just want to follow up on Dan’s comment. We do need to emphasize the importance of the public’s funding of ethics and social policy oversight. We might want to note in this connection that some private groups, including the Venter Institute, have themselves set up ethics oversight committees, and I think they are to be commended for doing that. I think that this is really a very significant development in this country — where you have lots of groups developing a concern for ethics, as well as a fascination with developing the technology. But having said that, I think that, as Amy says, we’re talking about public goods here. We don’t want to leave the issue of ethical and social oversight to private committees, no matter how well intended or how beneficial.
 
James Wagner:
The role of private pursuit of the technologies — I hit the microphone before she could go to somebody else — the role of private avenues to advance the technologies seems to me to have another implication. Let me get around to it. It was interesting to hear these three put together and to listen and pick up a message that I’d like to make sure I am hearing and we all agree with.
 
A meta-message of these three is that we believe public beneficence is best served through the pursuit of research, not the limitation of research — that we imagine that not only do we take best advantage of the opportunities, but we will also be best prepared to defend ourselves against the risks, by pursuing research. The thought that came to my mind as I was hearing this this morning was the history of nuclear science in this country, where in the period following World War II and right up through the 60s and early 70s, in my own discipline, there were many departments of nuclear engineering. Owing to our fears about what the downsides could be for the spread of nuclear knowledge, most of those departments today are defunct. It’s perhaps the case that our country continued to invest in nuclear chemistry, in certain ways, but certainly not in nuclear technology. One wonders, now that we’re ready to re-embrace nuclear technology, where we could have been had we not lost thirty years of investment in it. The reason, though, in part, that we could shut it down, so to speak, at least within our borders, was because there was not an opportunity for private and DIY investments in the same way. The technology was mega-expensive to pursue. It’s my long way of saying, I think that’s all the more reason that I hope I am hearing from the Commission that it is our recommendation not to arrest research, but to invest in it as a means to address all aspects of public beneficence. Am I hearing that?
 
Amy Gutmann:
The accompaniment to that (you’re hearing from the Commission that it is our view that investment in the research is the best way of protecting against the risks) is that there also needs to be investment in research about risks. Another part of the nuclear energy story was that people didn’t anticipate the all-too-human risk factor in building nuclear plants, and therefore they were taken by enormous surprise when it turned out that there were some mega meltdowns. What we’re recommending here is very proactive research in identifying risks and having an adequate oversight mechanism (we’ll get to that; it’s embedded in recommendation three) that takes advantage of that research into the risks.
 
Anita Allen:
I think an important emphasis of recommendation three is that we are also saying that we believe that, in this context, there’s something special to be gained from collaboration and sharing. We don’t think that holding on stingily to our property rights and our secrecy and so forth is the best way to further our goals in research. So, I think we should just underscore that we definitely believe that sharing and collaboration are going to be important to move this science forward.
 
Amy Gutmann:
Good emphasis.
 
Nelson Michael:
So in that sense, there’s an inherent incentive for privately funded ventures to do all of this, because ultimately they want to try to mitigate the business risks that are associated with technology that fails — and in some cases we’ve seen recently, fails spectacularly. I think it isn’t simply a matter of companies protecting their corporate interest by minding the short-term bottom line; there’s a long-term bottom line that can go significantly southward if there’s a major problem.
 
Amy Gutmann:
Good. Open it up to any comments from the audience participants. Yes. Paul Wolpe from Emory University.
 
Paul Wolpe:
Thank you, I’m from Emory University. I commented before on the fact that my former president and my current president are the chair and vice chair. The connecting link there is obviously …
 
The analogy with nuclear energy, one of the areas where this is fundamentally different, and a point about this that I think might be lost — this is true of both the funding and the oversight issues — is that unlike nuclear energy, where, if you were going to get into nuclear energy and research nuclear energy, you needed a big enterprise, the whole goal of synthetic biology is to create bio-bricks and other components that you could use individually in a college lab. In fact, in biotechnology itself, high school students are now doing genetics that the best scientists couldn’t do 15 years ago. So, the creation of very small pockets of research in a field where there is some potential for danger creates a kind of oversight challenge and, perhaps, a funding challenge, that is different than the challenge for something like nuclear energy, and I think there is a place where we have to figure out a mechanism to be careful of what is happening in the very small-scale research, as well as in a large business enterprise.
 
Amy Gutmann:
Yeah, and you will see that some of our recommendations come to terms with that, although, as you’ll see in our recommendations, there’s an important balance here between the freedom and innovative aspects of a very decentralized research enterprise, which is more typical of research enterprises, and ensuring safety, being proactive, and ensuring a degree of transparency as well. You’re absolutely right: the very nature of the developments here is very different from the way nuclear science was brought from knowledge into practice in the public sphere.
 
The example that’s closer, more analogous, is the recombinant-DNA community, and they took it upon themselves to be self-regulating in a way that’s very admirable and is a model. Synthetic biology is even more decentralized because of the possibility now of just ordering the bio-bricks and doing garage science. With that said, that isn’t the part of the science that dominates the field, and it’s still impossible — it may not continue to be impossible — but, right now, it’s impossible to do the sophisticated forms of synthetic biology in your garage.
 
Barbara Atkinson:
I was just going to say, as a preview: under Intellectual Freedom and Responsibility, I’m going to present a recommendation that speaks specifically to this.
 
Amy Gutmann:
Anything else on Public Beneficence? Otherwise, I’m going to move to Responsible Stewardship.
 
Responsible Stewardship is really a unique principle — as far as we know, it’s unique to the human species to be able to engage in responsible stewardship, in the sense that instincts alone, survival instincts that enable you to maximize benefits and minimize risks, are not enough. Responsible Stewardship really requires a frontal-lobe, forward-looking, long-term assessment to be responsible stewards of nature, the earth’s bounty, and the world’s safety.
 
We believe that human society and governments have a duty to proceed prudently in promoting science and technology, and much of science and technology has a dual potential to improve human welfare and to harm the environment and create security risks. We’ve heard a lot about this both from those in the synthetic biology community who were admirably attuned to those risks, and from those outside the synthetic biology community whose concerns ranged from moderate to extreme.
 
We, as a Commission, think in no uncertain terms that Responsible Stewardship is a very central principle for synthetic biology moving forward because of the intrinsic nature of some synthetic biology — by that, I mean that synthetic biology has not yet created new and modified species and organisms, but it promises to be able to at some point, and that has an influence on biodiversity and the environment. It also has the ability to make us rethink, as much of science has to date, the relationship of human beings to the natural world in its evolutionary state.
 
When you can manufacture organisms and potentially inject them into the environment — something, I should say, that has not yet been done by synthetic biology, but synthetic biology is looking forward to the day it can do that for some very good reasons — to be able to treat malaria more effectively, to be able to create vaccines more efficiently, to be able to create environmentally friendly biofuel — when that day comes, and if that day comes, which science is looking forward to, Responsible Stewardship says we need to be prepared: both to make sure those uses do good in the environment and do good to people who don’t have the power right now to represent themselves, in part because some of those people don’t exist yet, and also to minimize the risk to the environment and what now would be unforeseen risks, and to consider what kinds of mechanisms would be most consistent with responsible stewardship. We will start with the fourth recommendation overall, but the first one under Responsible Stewardship.
 
Alexander Garza:
Thank you Madam Chairperson, I apologize in advance for my voice, it seems like I’ve been afflicted with a biological issue as well.
 
I’m going to speak specifically about oversight. In the United States, oversight frameworks already exist for a great deal of the activities that we do, including research on humans, animals, microorganisms and toxins, and recombinant DNA. Oversight also occurs with regard to laboratory workers’ safety, the funding of research, and the transport and containment of dangerous agents.
 
In the course of our deliberations, we learned of the numerous oversight bodies and mechanisms currently in place to review and approve biological research and technology products, including synthetic biology. Many of these approaches rely on long-standing regulatory systems, such as those for foods, drugs, and chemicals, and others have been developed specifically to address genetic engineering and biotechnology. These oversight policies tend to be predicated on a risk-benefit assessment that is scaled according to the identified risk and that evolves through an ongoing process, including open public dialogue.
 
The evolving federal oversight framework for synthetic biology has been built upon these existing oversight responsibilities exercised by various federal agencies, which include the EPA (the Environmental Protection Agency) for chemical safety; the FDA (the Food and Drug Administration) for food, drugs, and medical devices; the USDA (the Department of Agriculture) for crops and food substances; as well as the Department of Health and Human Services and the Department of Homeland Security. This fits in well, I believe, with the discussions that we had with our international partners, where communities have come together to focus on the ethical, legal, and policy issues internationally, including a European Commission-supported syn-bio safety effort that is a collaborative project among public and private parties.
 
Because of these established frameworks, at this time we do not see a need to create additional agencies or oversight bodies to focus specifically on synthetic biology. Rather, we urge that appropriate entities within the Executive Office of the President, in consultation with relevant federal agencies, develop a clear, defined, and coordinated approach to synthetic biology research and development. A mechanism for review should be identified to address four different issues.
 
One, leverage existing resources by providing ongoing and coordinated review of developments in synthetic biology across federal agencies. Two, ensure that regulatory requirements are consistent and non-contradictory. Three, stay abreast of international developments and coordinate issues across governments. Four, periodically, and on a timely basis, inform the public of its findings. These activities might be carried out under the auspices of the OSTP, the Office of Science and Technology Policy, or another coordinating committee.
 
Amy Gutmann:
That was the fourth recommendation, the first under Responsible Stewardship. The second under Responsible Stewardship, and the fifth overall, if you’re counting, John Arras will present.
 
John Arras:
Thank you, Amy. This particular item relates to our attitude toward risk and how risk analyses will be undertaken. Amy Gutmann mentioned earlier the sense in which our Commission is going beyond the Belmont framework here. Here’s a salient example of that, where the Belmont Report argued that the benefits, both to patients and to society, should be taken into account by IRBs — Institutional Review Boards that are scrutinizing the research in every center that is publicly funded. And also that the burdens to patients, or the burdens to participants in the research also be carefully scrutinized. But, Belmont didn’t really talk about social risks — in fact, Belmont precludes the discussion of social risks in the IRB process. So, I think one thing that this principle does is bring social risks into the ambit of the ethical principles that we’re looking at.
 
So, here’s the problem: synthetic biology is a new and very energetic field. We’re witnessing very rapid development. It’s very difficult to predict ahead of time exactly what sorts of benefits will eventually be derived, and what sorts of risks will eventually materialize here. We’re speculating largely in the dark. So, we’re facing a great amount of uncertainty. This is a big problem in social policy. What is the proper attitude toward risk when the future is shrouded in uncertainty? That’s the big problem, especially when you’re dealing with events that are of low probability, but very high impact.
 
So, we can predict for example that organisms might be let loose into the larger environment wreaking havoc out there; again, these synthetic organisms will be living, so they will be able to reproduce and colonize, perhaps, in various areas of the environment. This is a very serious risk. Of course, it’s hard to predict exactly what the likelihood of that is. The same goes for risks that can be lumped under the heading of bioterrorism — bad actors doing bad things. We know that they will probably be out there, but it’s hard to get a grip on exactly what the quantity of the risk is.
 
So, there are a couple of ways of handling risks that we gloss in our report: the first might be described as a proactive approach to risk — full speed ahead and worry about the risks later. There are values behind this particular attitude toward risk. There’s the value of scientific freedom and a tremendous focus on the benefits that can be derived from it. We would argue, however, that the proactive approach really under-appreciates the risk. Standing on its own, it doesn’t appear to be a very acceptable principle.
 
On the other extreme, we have what we call the precautionary principle, which we’ve seen deployed in Europe, for example, over the debate involving genetically modified foods. According to this principle, all the risks involved with a particular technology must be identified and mitigated, or there must be a promise of mitigation, before one goes forward with the research or deployment of a technology. We would argue that this approach under-appreciates the benefits that might be derived from syn-bio.
 
This is a very difficult problem, because we don’t seem to have a theory of risk in hand that tells us exactly what the appropriate risk-benefit ratio should be, that tells us exactly how great the risks need to be or how likely the risks need to be before we intervene and stop the research.
 
Different people, animated by different projects and different values, can come down on different sides of this particular question. We’ve seen how this has happened over the issue of genetically modified foods in Europe and the United States. The Europeans are very suspicious of this technology, and we are much less so. It’s not written anywhere that there is an ideal risk-benefit assessment here, and so I think one way of framing what we’ve done here is to say that we opt for a procedural solution to this problem. A procedural approach is really a good example of trying to find a good Aristotelian mean between two extremes. We call it Prudent Vigilance. This includes ongoing assessments of risk as the technology develops. So, because there’s so much uncertainty, we argue that the technology should go forward, but with lots of safeguards and with an iterative process regarding the discovery and assessment of, and response to, risk.
 
Second, as Alex just mentioned, we’re also advocating that whatever coordinating body within the government is responsible for syn-bio should convene an interagency process to make sure that all the different agencies within the federal government are playing according to the same playbook and that their approaches to risk management are harmonized in some sense.
 
That gets us to the actual statement of the principle. Here is principle number five: Because of the difficulty of risk analysis in the face of uncertainty, particularly for low probability/high impact events, ongoing assessments will be needed as the field progresses. Regulatory systems should be updated as necessary to ensure that regulators have adequate information. The coordinating body recommended above should convene an interagency process to discuss risk assessment activities, including reasons for differences and strategies for greater harmonization if appropriate.
 
Amy Gutmann:
Thank you. The next recommendation, Nelson will present.
 
Nelson Michael:
To further the concept of safeguards through Responsible Stewardship, I’ll give you a personal anecdote about how it’s important for us to harness the benefits of technology, but also to mitigate the risks. I have had the opportunity in the past few months to take all too many long airplane trips all around the world. Some of those airlines allow you to listen in to air traffic control, and it’s always amazing to me how several dozen aircraft trying to land all at once at Frankfurt International Airport can be kept away from each other and actually land, in many cases with multiple aircraft all landing on different runways at the same time. This is not something that they developed overnight. Perhaps, security screening aside, we can agree that airline travel is a useful thing that we’ve developed as a society. But the risks are significant, in terms of that technology being used for nefarious purposes, as well as that technology having spectacular failures, and the airline industry and its various regulators make a tremendous effort to ensure that this is one of the safest ways of traveling.
 
In the same sense, I think we all on the Commission have struggled with the significant benefits that synthetic biology, as an extension of its predecessors in more conventional genetic engineering and molecular biology, offers us. However, it’s important for us to understand, and communicate to the public, how the practitioners of this new technology need to, much like the airline industry does, continuously ensure that risk is mitigated. How can that be done?
 
So, the intended consequences of research designs need to be put into a framework where organisms that are developed in the laboratory can, number one, be tracked while they’re in the laboratory and during either an intentional and regulated release or an unintended release. We actually need to know where the organisms are, in the same way that, for the airliner I landed in at Frankfurt, the air traffic controller knew where it was from the time it pushed back from the gate until the time it pulled into a gate in Germany. Two, in the possibility that these organisms would be released into the natural environment — again, either deliberately with controls, or by accident, or by other nefarious mechanisms — technological safeguards would be in place to contain those organisms. There are many ways to do that; one flavor of that containment is to make the organism such a wimp environmentally that it really just couldn’t compete against its brothers and sisters out in the field. This is something that molecular biology did in the 70s, to make the E. coli organisms that we commonly engineer so biologically uncompetitive that they tend to get overwhelmed when they encounter natural forms of the species.
 
Lastly, think of the way an errant missile can go awry — and I can tell you that in the department I work in, these safeguards are in place for weapon systems. There are usually ways that those weapons can be disabled. In the same fashion, from a molecular standpoint, one can build in mechanisms that would allow kill switches, or suicide genes, if you will, to be activated, and those agents could in fact be contained at the most fundamental level, which is to be disabled.
 
What’s the way for us to implement these? We discussed public funding, which can clearly mandate that developers of this technology build in these kinds of safeguards; and in the same way, the incentives that we’ve discussed in the first three recommendations for why companies would want to invest in these kinds of safeguards are, I think, pretty clear.
 
Public funding is important to provide incentives for industry to partner, at the same time, with civil society and governments to put these kinds of safeguards in place. It makes sense in the same way that United Airlines is happy to comply with FAA regulations, which in the long run make sure its planes stay in the air and land safely, not abruptly.
 
Lastly, there is the need for standardization of these safeguards, so that as we, as a scientific community and a civil society and government, watch how this technology unfolds, we can build on each other’s experiences with making safeguards, so that an identification transponder on an aircraft landing in Germany will work on the same aircraft landing in Tokyo.
 
So, let me get to the recommendation itself. I’m reading provisional recommendation number six, for which the language will probably change a bit: New organisms should be marked or branded in some manner to help the government monitor developments in synthetic biology. At least until further information is gathered about the ability of a synthetic microbial organism to multiply in the natural environment, so-called “suicide genes” or other types of self-destruction triggers should be engineered into organisms in order to place a limit on their life spans. Alternatively, engineered organisms could be made to depend on nutritional components absent outside the laboratory, such as novel amino acids, and thereby be controlled in the event of release.
 
Amy Gutmann:
Thank you. The next recommendation under Responsible Stewardship, Steve Hauser is going to present.
 
Stephen Hauser:
Thank you. An additional issue relates to how we time the deliberate release of synthesized organisms into the environment and how we analyze risks prior to release. Everyone would agree that we must proceed carefully, especially when there is a chance or a probability that the risks would be considerable, and also recognizing that the properties of released organisms may not be predictable from the laboratory once they are released into the environment. Generally, the methods for risk assessment throughout the scientific community and oversight agencies are to evaluate any new organism in terms of its known relatives, and to set containment rules and environmental risk mitigation strategies based on the applicable rules for the known relative.
 
This approach appears to have worked quite effectively and has been able to evolve as methods and science have evolved. In terms of synthetic biology, we need to be certain that the assessment is based on potential function as well as the known structure or sequence of organisms that are released. This need is created because new attributes of synthetic organisms may not have the same or similar structure to those of their naturally occurring forebears.
 
Prudent vigilance is required to assure that the strategy of how we compare new to old organisms, when the old organisms exist, remains effective as the field matures. Our next recommendation is as follows: The Executive Office of the President, in consultation with other federal agencies, should ensure that risk assessment is carried out under the National Environmental Policy Act or other applicable law prior to field release of any product of synthetic biology technologies. This review should analyze any gaps and evaluate the appropriateness and feasibility of requiring a reasonable environmental impact assessment prior to the release of biosynthetic organisms from a contained laboratory setting, including, as appropriate, plans for staging the introduction or release. The results of all reviews should be made available to the public. Exceptions may be made in cases of substantial equivalency or emergencies.
 
Amy Gutmann:
Excellent, thank you. I’m going to open it up for Commission members’ comments and any comments or questions from the public.
 
Nita Farahany:
First, I should say that one of the great pleasures of serving on this Commission is hearing different perspectives and I particularly enjoyed Nelson’s different descriptions of the types of risks that we were talking about and thinking about it from a different perspective and a military perspective — that was great and quite helpful.
 
I just want to emphasize, on that particular recommendation, as we discussed, I think it’s important that we make it as broad as possible. Because, as we know, this is a field that is rapidly advancing. And so, while we’ve heard a great deal — well, not today, but over the past couple of meetings — of testimony about the novel, new, different kill mechanisms or nutritional depletion mechanisms to contain different organisms from being released, the science may evolve rapidly and quickly. My concern is with putting something quite specific about the containment measures into our recommendations; I would caution against any sort of policy that did so, to ensure that the science is always able to have the best possible containment measures. I think our recommendation is really one to ensure that containment occurs rather than advocating any specific type of containment.
 
A second comment is on recommendation number seven, and I’m in agreement with the issues that we’ve raised, which is identifying risk assessment prior to release. I’m concerned, though, about unintended release and how our recommendations are really addressing unintended release, and whether we’re getting at that. So, forecasting a bit ahead, as we look at biosafety and biosecurity concerns, I’m not sure that we fully do address that. And so I’m wondering if there’s a possibility of thinking about risk assessment at the point at which any novel organism is created, let alone released. Whether it’s private companies or public companies that have created novel organisms that obviously have the potential to be released through inadvertent or intentional mechanisms, are we addressing that, and could we address it such that there needs to be a risk assessment provided by any company that does develop a novel organism?
 
Amy Gutmann:
I’m wondering if anybody on the Commission who does lab science can address that, because, like many of our concerns, this one is not only relevant to syn-bio; it’s really about the importance, within the lab, of taking the kinds of precautions that would prevent unintended release.
 
Nita Farahany:
Or even risk assessment of the creation of novel organisms.
 
James Wagner:
Because we do it backward now, right? In other words, when we are working with high risk organisms in biological containment laboratories of different levels, we do that because we know a priori they’re high risk. What do we do in this case? How does one do that risk assessment in order to determine that level of containment and would it be crippling just to force everything into that highest level of containment?
 
Amy Gutmann:
Let me just frame this so we understand. I know the Commission members understand this, but at present synthetic biology is really struggling with the opposite problem, which is creating a novel organism that is robust enough to survive even in the lab. So, I think it’s important for all of us to communicate to ourselves and the public that right now there is no clear and immediate danger of this — quite the contrary — and the scientists — not only the scientists working in synthetic bio, but all the scientists — have told us that the bigger challenge right now is to create organisms that are anywhere close to as robust as even very simple organisms in the natural world. Nonetheless, as we see this field going forward, this gives us a great opportunity to ensure that these kinds of risks are taken into account and protected against, to the extent possible and prudent. I just underline that prudent vigilance is what we’re looking for here.
 
Nelson Michael:
At the University of Pennsylvania, our last speaker, Sydney Brenner, was one of my personal heroes, since I was trained as a microbiologist as well as an internal medicine physician. At the end of his conversation — of course, he lived through the Asilomar Conference and the self-governance that was mentioned previously with the predecessor of synthetic biology, which is molecular biology — he basically said that he thinks there isn’t any imminent threat as well, but we just need to continue to watch. I am a lab scientist, and we have very significant controls in place, but I don’t think we can be complacent. We need to constantly reassess whether or not those controls are adequate. Maybe the organisms being made today are wimps, but maybe that will change sometime in the future, so I think we need to continue to take his advice and to watch and to monitor, and to have this process be as transparent as possible, as Nita and others have mentioned, which I think is critical, and to make sure there are firewalls. I think we can address the safety concerns in ways that at the same time will protect trade secrets for commercial concerns. All of us have an investment to do this right.
 
Stephen Hauser:
You’ve obviously raised two large points here, and the first is a very important one that I don’t think we have covered. It relates to whether the rules that are currently in place (the notification rules and the rules for mitigation of problems following unintended release, or access to information by unauthorized users in the laboratory) are sufficient for synthetic biology. The second issue is the registry issue, which is a harder subject.
 
Nita Farahany:
To that point, on the first one, and also to Jim’s point: it seems possible that, given that we at least know particular things, like select agents, or regions or oligonucleotides, that are used and more likely to have high impact, it may not be the case that we can do an adequate risk assessment for every novel organism that’s created, but for particular categories you may, in fact, be able to do a de facto one, right? If you’re using things that, at least the way we currently understand how they’re assembled, are unlikely to cause any great risk, that’s one thing; if you’re using any of the select agents or sequences that we think might potentially, in combination with other sequences, pose a greater risk, then there’s a greater concern. I don’t think we’re getting at that “what happens in the laboratory” side of things. We’re only looking at it from a release perspective. It seems possible that we could do some sort of risk assessment for the creation of novel organisms, at least within categories: if you are using select products, if you are using particularly high-risk sequences, then it may be that, whether it’s registration or whether it’s tracking, somehow we need to have greater oversight of those things.
 
Amy Gutmann:
We need to, in one of these recommendations, and there are several that would accommodate it, make sure we talk about what can be done for oversight of the unintended release while it’s still part of laboratory science. I think that’s a very good — a friendly amendment to one of these recommendations and in our report we have to address that fully.
 
Alexander Garza:
The risk of release is not unique to syn-bio, so we do have experience in that, albeit with BSL labs. Risk and uncertainty assessments are notoriously difficult to do. The Department of Homeland Security does risk assessments for biological, chemical, and nuclear threats, and I can tell you that it is a very onerous and controversial process that it goes through. That’s because, by their very definition, these are low probability events, and so there’s no history to base your risk on. We, as physicians, base assessment of risk for disease on prior disease, so I can tell you that cigarette smoking increases your risk for heart attack by so many times, but we can’t do that with syn-bio. I think the objective of prudent vigilance is almost by necessity what we have to do in order for there to be an ongoing assessment of risk.
 
Amy Gutmann:
We strongly believe that the responsible way of proceeding here is through prudent vigilance, and it really is a difference between a completely proactive approach, which says, “just let science rip,” and a precautionary principle that says, “stop until you know that you have mitigated the risks.” The analogy that works here, and it’s only an analogy, is with medical care. We don’t say to people who have a disease or some ailment, just get treated regardless of the safety of the treatment. Nor do we say, don’t get treated until you know that there is no risk from the treatments. The reason it’s a good analogy is that there are good benefits to be had — for example, in having an anti-malaria drug that’s efficiently made for populations, because people will die and they are dying. Seven hundred thousand to a million people are dying every year of malaria. And so, if you enact a precautionary principle, there are going to be people who die who might be saved. At the same time, if you just let science rip, you can have all kinds of unknown risks that you haven’t taken any due precautions against. So that’s where we come down. It won’t satisfy either extreme, but I think it is the responsible way of proceeding and advising both private and public parties who are engaged in this field.
 
I want to go out — is there anybody … ? If there is anyone who wants to make a comment or question, just stand up to the mic, and I will recognize you.
 
Christine Grady:
I just want to emphasize something. It’s embedded in one of these provisional recommendations, but maybe we need to emphasize it more, and that is how we do the coordination, but also how we do this risk assessment, internationally. Recognizing that the recommendation that Alex read earlier said, stay abreast of international developments and coordinate issues across governments. And we know that historically, different governments have taken different stands on the precautionary principle versus “let science rip,” as you say. This notion of coordinating these kinds of monitoring and risk assessment and prudent vigilance in an international environment is a challenge that we need to emphasize in our report.
 
Amy Gutmann:
Right. We don’t approach this — correct me if I’m wrong — assuming that every international body and country will agree on the same principles. That certainly isn’t the case with genetically modified organisms. But we do approach it with the sense that, to the extent that we can get agreement, there will be a better international system of oversight and of investment in science, which has become a global enterprise. Everything that we’re recommending nationally is going to have international repercussions. The same is true for the analogous bodies internationally.
 
John Arras:
To follow up, Chris, I think that because we lack a canonical, theoretical picture of what the ideal risk-benefit analysis would look like in this kind of a case, we do stand to learn from other countries in the way they conceive of these problems. It is a worldwide problem, which requires coordination, but it’s also an opportunity for us to learn.
 
Amy Gutmann:
Are there other Commission or audience questions or comments? If not, we’re right on time. We’re going to take a lunch break. We will reconvene at 2:00, when we will continue with our recommendations, and we will also have a presentation at 3:30 by Tom Murray, who is the President of the Hastings Center, which is one of the premier centers for bioethics in the country. Thank you all.
 
[LUNCH BREAK]
 
James Wagner:
Ladies and gentlemen, welcome back, and welcome to those of you who are joining electronically. When we left off just prior to lunch, we were in the section considering those recommendations pertaining to Responsible Stewardship. We were on recommendations four, five, six, and seven; in other words, the first four on Responsible Stewardship, and we have a couple more to consider there. I think the next person in order to speak with us is Nelson Michael.
 
Nelson Michael:
Thank you, sir. Our discussions at the first meeting were supported by a number of individuals who came to testify before this Commission about their experience with similar topics and challenges related to synthetic biology, some involving antecedent technologies, like nanotechnologies, and some from countries other than the United States. I think one clear conclusion we heard as commissioners from those groups was a reminder that we are embedded in a global community, and our deliberations should be taken within that context. If any of you have a teenager at home, you know they will log on to something like a Playstation 3 and within a few minutes they’re engaged in pursuits that my wife considers not to be wholesome, though my fifteen-year-old would disagree. He’s engaged in these pursuits with people from all around the world. Truly, our community is global, and the issues we have been debating for the past several months transcend borders. The question for us to struggle with is, as an American deliberative body that is advising our government, how do we put that into context?
 
I think that it’s easy to simply pay lip-service to the fact that, yes, we’re in the global community and we should pay attention to those stakeholders that are engaged in these pursuits and move on from there. But I think that we need to really give a lot of thought to this concept, because it really makes little sense for us as a deliberative body to simply debate and advise our own government in a vacuum.
 
I’ll give you another personal anecdote. About two weeks ago, I was in the southern African country of Mozambique. This is an area of the world that has an underappreciated HIV epidemic. My group is largely engaged in making vaccines for HIV infection, with some recent success last year in Thailand. The next steps for my field are to look at populations that are most at risk for HIV infection, and that would include women, especially women in sub-Saharan Africa, as well as men who have sex with men in southeast Asia and eventually in Europe and the United States. It was part of a preliminary trip to the Maputo area, and it was the first time that I had been to the country of Mozambique.
 
I was immediately struck by the fact that their Ministry of Health had an intramural research group that was superb. The oldest person in that group is thirty-seven years old. The country has just recently emerged from a very long civil war, so most of their effective scientists were relatively young. They were young; they were hungry; they were aggressive; they had built a wonderful research platform; and they were looking for support.
 
We had partnered as the United States Government with the Mozambican Government to pursue the Emergency Plan for AIDS Relief. There was a very, very large US Government presence, largely led by the Centers for Disease Control, the DHHS and USAID, obviously, all under the aegis of our Ambassador.
 
After a week of very good meetings, I traveled with my colleagues from the University of California, San Diego, who were interested in starting AIDS research activities yesterday. USAID had funded Family Health International to do some HIV incidence work in a town called Chokway, but to our disappointment the study was really underfunded; FHI knew it, but it just didn’t have the funds.
 
The government of Mozambique really wanted us to engage by increasing the size of the study and its power, which is something that we aim to do. However, we spent a large part of the week that we were there, and multiple e-mails since then, discussing what the impact of our starting this new activity would be on other activities that the Ministry of Health of Mozambique is doing, and what impact it would have for us to essentially engage some of the best and brightest researchers in that country — researchers who are also engaged in the execution and delivery of care and treatment through PEPFAR, which is saving Mozambican lives every day. So, the CDC and AID representatives sat with me and the ambassador, and we debated these concepts. I brought them back to the government of Mozambique about three or four hours later.
 
The point is, I think we probably will move forward there, and it makes good sense, but there needs to be a very deliberative process of open debate. What one group thinks makes a lot of sense may actually interdict the good works of others. In the end, much dialogue and mutual respect have to be in place; any endeavor in synthetic biology, its promises, the estimates of its risks, and how we deal with those challenges, needs to be put in that context.
 
So, let me then move on to recommendation number eight from the committee. Recognizing that biosafety and biosecurity are best assured through international coordination, the Department of State, in collaboration with other relevant agencies such as the Department of Health and Human Services, should engage appropriate parties in the international setting to foster ongoing dialogue in this field.
 
I read this recommendation this morning and I must say, I don’t think I remembered it when I was in Mozambique, but it really struck me how relevant it was to the real-life experiences I had just had. I think this will probably resonate for us as we go forward to ensure that we look at a proper global framework for synthetic biology.
 
James Wagner:
For the people in electronic land, Christine Grady will present our next recommendation.
 
Christine Grady:
Thank you. Our next discussion under the topic of Responsible Stewardship has to do with creating a culture of responsibility. We all know that the responsible conduct of science, including but not limited to synthetic biology, rests heavily on the behavior of scientists, and that a culture of responsibility and accountability must be fostered at the local level. Scientists need to be aware of ethical standards, ethical practices, safety standards, biosecurity standards, et cetera. And scientists are usually in a position to be the first to notice and respond to suspicious behaviors of various types.
 
So, the recommendation that we’ve been working on addresses the need to create and foster a culture of responsibility. Interestingly, and not surprisingly to anybody, it’s very hard to legislate or regulate a culture of responsibility. This recommendation focuses on education, and it takes the position that currently, federally funded researchers are required to have certain kinds of education and training associated with their research: training in the responsible conduct of research, the protection of human subjects, and the use and protection of laboratory animals, as well as biosafety and biosecurity when working with select agents, and other related trainings.
 
Researchers working within institutions with recombinant DNA, for example, are also very aware that they have to undergo review by institutional review and biosafety committees before they can begin their research. Researchers who do research with human subjects are aware that they have to have review by institutional review boards before they begin. And they learn this through the courses that they take and the mentoring that they have in their respective institutions.
 
So, in synthetic biology, there are some interesting challenges in this regard. The first is that there are many new actors within institutions — people outside of the biological sciences, such as engineers, chemists, material scientists, computer modelers, and others — for whom federally funded work has not required any kind of ethics training or education.
 
In addition, there are scientists outside of the conventional biological research settings, including, but again not limited to, the do-it-yourselfers who are out there doing research in their communities. There are also scientists in the private sector who do not have to comply with federal requirements for research ethics training and biosafety training. Then, as we have heard, and Nelson had a wonderful anecdote which I can’t match, we have international collaboration and an international network of scientists who work together in research.
 
What we hope to recommend here is that there be some required education for all of the researchers in the field of synthetic biology. I’m going to read the provisional recommendation and then make one or two more comments.
 
The recommendation says: required ethics education, similar to the training required today in the medical and clinical research communities, should be developed for all researchers, including student researchers, outside the medical setting, such as in engineering, material science, and information technology. The Executive Office of the President coordinating body, referenced earlier in our discussion, in consultation with the National Academy of Sciences, should convene a panel to consider appropriate training requirements and models.
 
I think that the recommendation to have a panel look at appropriate training requirements and models is extremely important, because although there are education and training requirements for biological scientists, there is still quite a bit of variability in what those programs look like. There are no agreed-upon standards for what the content should be, and the content of the programs that do exist is probably not going to be exactly right for the wider range of scientists involved in synthetic biology.
 
I think, secondly, it would be helpful for us to think about not only having this panel consider what the appropriate training requirements and models are, but also thinking about how to make them widely and publicly available, so that people who are do-it-yourselfers have access to these kinds of training programs without a lot of effort.
 
James Wagner:
Christine, thank you. This is actually the place for a break in our presentation, so we can have some conversation. These are the last two, you will recall, of the total of at least six recommendations made in that area of Responsible Stewardship. Are there any comments on — actually, we can fold these into the others we’ve spoken about — but any comments first specifically on these recommendations for international coordination and also for education? Dan?
 
Daniel Sulmasy:
I was glad to hear that Nelson had actually forgotten about number eight, because I, as you know from this morning’s conversation, had also forgotten it was there.
 
It strikes me that one of the reasons that’s the case is that when you read it, it seems a little tepid and tentative. While I don’t want to make a substantial change in it, I wonder if we should somehow strengthen this. For instance, by saying what the goal of the dialogue might be. Where are we heading? I’m also looking to see whether there might be others involved besides the State Department and DHHS in engaging in an international dialogue. I think it is really critical. We heard that from many people, and I would just like to see it highlighted a little more than it is in the current form.
 
James Wagner:
On focus of dialogue, what would you suggest?
 
Daniel Sulmasy:
Well, I think we’re looking probably toward having, where we can have agreement, actual standards for some of these things. Obviously, there will be differences internationally, but to the extent that we can have uniform expectations throughout the world, given the way that science is globalized, both on the safety and on the biosecurity efforts, I think it would be helpful to say that that’s where we are really moving.
 
Amy Gutmann:
This is in some sense an unprecedented opportunity for the President’s Commission to proactively get maximum agreement on the international level, because in most cases Presidential Commissions have taken up issues of what standards should be after a science is very far along and developed. In this case, synthetic biology is in its adolescence, if not infancy. And I think that presents, in our recommendation about international collaboration, a realistic opportunity of getting quite a bit of agreement. Now, we don’t know until we try, but I think the idea is that we, at minimum, should engage the World Health Organization, which is the major organization on the international scene with regard to health, and our counterparts, the commissions in —
 
James Wagner:
Which we’ll hear tomorrow.
 
Amy Gutmann:
We’ll hear from them tomorrow, and we’ve already reached out and had meetings with them, most recently at this past meeting. We also engaged them this summer at the international organization of bioethics commissions, where we were represented. I think, Dan, you’re right; we should specifically mention the WHO and make sure we engage other organizations on the international level.
 
James Wagner:
Like Dan, I want to ask: so what? Is there a way this kind of thing — this particular recommendation for international consideration — can be woven more broadly into the other recommendations that we have, meaning that when we are asking for policies or processes, we suggest that they be in harmony with, or at least aware of, what’s happening internationally? As opposed to pulling it out as separate: “Do all these things,” and then, “Oh, by the way, we ought to be talking across the oceans … .”
 
I think Nita was up and then Dan.
 
Nita Farahany:
I think I agree with you, Jim, on maybe folding it into some of the other ones. Looking ahead, for example, to number thirteen, we have something quite specific about export controls, so I’m just sort of previewing it and thinking about dialoguing with international communities about that, as well as with scientists. It seems like this is just one of several biosecurity concerns, so I’m not sure that it works as a stand-alone recommendation unless we strengthen it to be something unique, like trying to come up with international standards of practice or international standards of biosafety.
 
James Wagner:
Dan.
 
Daniel Sulmasy:
That’s my suggestion, that it be more “both” than “either/or.” We do also in number four, for instance, say keep abreast of international developments, so there are places where we are mentioning the international, but I think to strengthen what we’re saying, and to say specifically we have an aim of trying to get maximal agreement internationally, would be worth a stand-alone recommendation.
 
Amy Gutmann:
Whether we have it as a stand-alone recommendation or not, I’m agnostic on. But there is a difference between what’s at stake with biosecurity and what’s at stake, for example, in realizing the benefits, or the just distribution of benefits. With biosecurity, any lack of precaution against high-risk, low-probability events by any actor globally affects everybody. In some of the other places where we call for international collaboration, it would be great if we got it, but if we don’t, you don’t lose everything. You lose an awful lot where biosafety, and particularly biosecurity, is concerned if we don’t have international standards. I think the US is likely, given past experience, to be a leader here, and I suspect we will get a lot of agreement on this internationally. I just wanted to underline that this particular recommendation under Responsible Stewardship, which deals with biosecurity, is of a higher order of importance than some of the other parts of international agreement.
 
James Wagner:
I like the way you helped us separate the thoughts between biosecurity and biosafety. I see us converging more readily on the biosecurity issues than I do on some of the biosafety issues, given the precautionary stance of Europe, for example, relative to what we’re advocating. We might want to just fess up on that and say that we’re especially interested in and especially recommend these sort of joint activities, international activities, around biosecurity.
 
Nita, and then John.
 
Nita Farahany:
Jim, I agree with you. Listening to what Amy was saying, I didn’t understand that to be the point of this particular recommendation. If that is the point, if we’re looking at biosafety from a biosecurity perspective and trying to get buy-in from the international community, I think we could re-word this and focus it to say that what we’re really interested in is not just the general “how safe is your lab for your workers and your particular laboratory environment.” For this recommendation’s purposes, we’re talking about international standards when we’re talking about security.
 
James Wagner:
Security, especially. Yes.
 
Amy Gutmann:
Just as far as what to anticipate, we will in all likelihood, we don’t know this for sure, but in all likelihood, be more libertarian with regard to some issues of what science should be able to do, given our previous principle of prudent vigilance and our saying that we’re not fully on the precautionary principle side. Whereas, with biosecurity, I think we’ll all be in harmony.
 
James Wagner:
I agree.
 
Amy Gutmann:
I think we all believe that that’s the most important thing, to be in harmony.
 
Nita Farahany:
I agree.
 
James Wagner:
John?
 
John Arras:
Just one small thing here, this is just a bit of wordsmithing. Instead of saying “best assured,” we might want to say something a bit stronger, like international cooperation is “essential” or “indispensable” for biosecurity.
 
I want to ask Christine a question about the education item here. Chris, what is your impression of the quality of most ethics education regarding clinical and research issues out there? I know that at the NIH, you folks do an amazingly good job equipped, as you are, with a textbook you and I wrote.
 
[AUDIENCE LAUGHTER]
 
Will there be signings this afternoon? A bargain at any price. Available on Amazon.com.
 
James Wagner:
There is a question here, right?
 
[AUDIENCE LAUGHTER]
 
John Arras:
I don’t have a lot to do now with hands-on medical education, but I hear a lot of grumbling about these sorts of requirements and how they’re not really all that substantive and how the educational programs are not really all that demanding or helpful. So I’m wondering about the wisdom here about hitching our recommendation to the current level of training that we see out there in clinical and research ethics.
 
Christine Grady:
Of course, you’re right, John. There’s huge variability in terms of the quality of the courses that are out there, which is why I made friendly amendments to this recommendation as I read it — you may not have noticed. For example, I took out the “comparable”; I don’t think that makes any sense. And I added at the end not only thinking about the models that might work, but thinking about the content, because I do think there is responsible conduct of research. There are modules that people use across various types of labs, and maybe those are good models to look at, but there’s huge variability beyond that. And so the question is: what specifically do scientists who are involved in synthetic biology really need to think about and know about in order to be responsible for their research? I think the very important part of this, though — and, hopefully I stressed it — is that the people reached by the courses that are currently available, whether they grumble about them or not, are the people required to take them because they’re federally funded. In this particular arena, we have to be really aware of the number of actors in different settings, different disciplines, and different countries who don’t have access to the information that we think might be important for people to think about when they’re trying to be responsible scientists. That’s why we need to think carefully not only about the venue and the content, but also about making it as publicly available as possible.
 
James Wagner:
Barbara had a quick comment, and then we will go out to the audience.
 
Barbara Atkinson:
It was to Chris, and it was just what you ended with. This made me realize that we’ve talked a lot about public education, but we really don’t have a recommendation about it. Maybe we need modules of ethics education aimed at the general public, not just at scientists. Really, some more public education maybe needs to be seen throughout our recommendations, but in this one in particular.
 
Amy Gutmann:
I think that we do have more on public education coming up, but I think here we should just say that we have to take advantage of online education modules, because there are audiences that won’t be reached otherwise. That’s why this is under a culture of responsibility. We can’t ensure a hundred percent coverage, but we sure can make it possible by having online resources. I wouldn’t over-emphasize the ideal education, which we may not reach, but I don’t think we can —
 
James Wagner:
In spite of having ideal textbooks.
 
Amy Gutmann:
Right, in spite of having ideal textbooks.
 
[AUDIENCE LAUGHTER]
 
I think some of the basic ethics education here is nonexistent. We heard from somebody at MIT that the engineering students working in this field don’t have to do anything in ethics education. That’s just unacceptable from the perspective of any kind of responsible stewardship.
 
James Wagner:
I think it’s a solid recommendation. Are there comments or questions out of the audience — the public? Seeing none, let’s move on.
 
The next framing ethical area for us is Intellectual Freedom and Responsibility, and you heard Dr. Gutmann, President Gutmann, review this earlier: the need to ensure that we have an environment with sufficient but not onerous regulation, in order that we foster rather than squelch the creative spirit of scientists — well, scientists and technology developers.
 
We have several recommendations in this category, and for the first one, Nelson, I think you’re back on. Oh, no, you’re not; Anita is on.
 
Anita Allen:
Thank you. With number ten we are officially halfway through. This recommendation actually has two parts to it. The first part recommends that the government support a continued culture of responsibility, which I am going to interpret as individual and corporate responsibility and self-regulation, and that we do this with the scientific community, in collaboration with the NIH and the NIH guidelines, as well as with other mechanisms for monitoring and enhancing watchfulness.
 
The second part, beyond underscoring the importance of a culture of responsibility, is oversight. And here the recommendation is that, acting through a central coordinating office, such as the White House Office of Science and Technology Policy, the government should evaluate current oversight mechanisms to assess their effectiveness and determine what, if any, additional steps it can take to foster accountability at the institutional level, without limiting intellectual freedom.
 
Now, as an academic, I am extremely self-conscious about the importance of academic freedom, but mindful as an ethicist of the importance of ethical responsibility, which sometimes requires both oversight and accountability.
 
This morning Doctor — President Wagner stressed the Commission’s underlying assumption that we do want to promote research. Research can lead to innovative processes and products, and, as I said before in connection with my presentation of the Commission’s third recommendation, innovation, and in particular biomedical innovation, often benefits the public. Well, innovation doesn’t always benefit the public, even when it overtakes whole ways of life. We’ve seen the downsides of certain innovations like, dare I say, television and cell phones and pesticides. Then there have been some landmark innovations, like atomic energy, which have had more historic ethical and political debates attached to them.
 
But I think our hope is that with synthetic biology, we will get some very useful products — medicine and fuels, for example — with very, very high benefits and very, very low risks. That’s the hope. In the meantime, the public is going to be well-served, I believe, by a harmonious blend of private initiative and investment, restrained by self-imposed professional standards and ethics, along with, secondly, public policies that nurture and reward the business of science, as well as grassroots curiosity and achievement. So recommendation ten calls for oversight and accountability.
 
Technologies deriving from the new science will necessarily impact society, and as such, citizens and their leaders should have a voice in deciding the conditions and directions of research efforts. Scientists have a responsibility to use public monies, and not just public monies but the fruits of public investments, which they might not directly receive but are benefitting from, in a way that is consistent with the public trust. Those who work in the field of synthetic biology, from the hobbyist, to the university researcher, to the global multinational corporation, have to be mindful of the risks that their work poses to human health, safety, and security.
 
So, everybody should be prepared to work with a degree of transparency and foresight and be able to respond appropriately in the event of accidents or mistakes. Like atomic scientists, those working in synthetic biology must recognize the uncertain nature of the risks associated with their efforts, and they must be prepared to act cautiously and with the utmost attention to the public’s interest.
 
Again, recommendation ten suggests that the Executive Office of the President, working with federal agencies, engage in oversight and in fostering accountability.
 
James Wagner:
Alex Garza will present the next recommendation.
 
Alexander Garza:
Thank you, Mr. Vice Chairman.
 
I think this number eleven is a natural extension of the conversation that’s been going on here. We’re all very familiar with the culture of responsibility. It exists in academic institutions and in private companies, because most of their people come from academia, as well as in the federal government. They all understand the world that they work in; they all understand the ethical implications of the work that they do. The thing that makes syn-bio a little unique is that it’s been embraced by those outside of academia and outside of these private companies, the so-called DIY communities. I think we should foster that sort of involvement and curiosity, but in the same frame of reference also try to ensure that these communities understand the ethics and the responsibility that go along with what they want to do.
 
When you move the principle of intellectual freedom and responsibility away from the institution and onto the individual, I think it becomes a lot more incumbent upon the government to make sure things are being done in the correct way, that we’re performing enough vigilance and making sure that people are proceeding ethically and without causing any concern to the public, while not trying to limit their interests or their research efforts. So, to that end, there are various tracks that you can take. Barbara mentioned public education. I think this fits well into the President’s strategy for countering biological threats, in which he embraced developing a culture of responsibility and ethics, both here at home and internationally. It has some parallels with computer programming, where there are a lot of people who can do very effective computer programming, but you want to make sure that they’re doing it in a very ethical environment as well.
 
That brings us up to number eleven, which states that the Executive Office of the President, through the Office of Science and Technology Policy or a similar office, informed by the current work of the FBI and other departments, should undertake a gap analysis to assess the specific risks of synthetic biology research activities in non-institutional settings, such as the DIY community. And the FBI has done tremendous work in this arena. We heard from the Special Agent in Charge, and they’ve worked in a very collaborative environment. They haven’t come down with a heavy hand; they’ve embraced these communities and approached this as a partnership. If possible, the analysis should identify efforts to bring these communities into the ongoing culture of responsibility and local accountability that currently exists in institutional settings. Legislation or other tools may be pursued if needed to guide and foster accountability within these communities.
 
Thank you.
 
James Wagner:
Thank you, Alex. Barbara, I think you’re on next, number twelve.
 
Barbara Atkinson:
Number twelve is actually a follow-up to both of the last two, and it speaks to a little more than private responsibility; it really looks to the oversight of the whole area. In academia, I think we’re used to a lot of oversight, as we have already talked about. We have institutional committees, we have peer review, and we have NIH guidelines that have to be followed. But when you come to the syn-bio people, they may not even understand that there are guidelines, much less know what they are. That’s really the piece in the last recommendation. If we talk about companies, we come to a little bit different issue than not understanding. Many of the private companies and corporations currently have a lot of scrutiny: they are scrutinized by the USDA for animals and animal activities; by the FDA, since they have to meet all the same standards for FDA approval that anybody in academics would have to meet; and they have OSHA looking at their safety and their employees and so on. So, there are a lot of things that are like academics, but what they don’t have is the regulation of the NIH guidelines and the things that the NIH specifically requires of anybody who gets federal funding. The issue is whether this whole area is becoming important enough and big enough that we should really be expecting everybody to follow the NIH guidelines as well as all the rest of those kinds of requirements.
 
Recommendation twelve says that if the community or the capacity of independent and private researchers, companies, and corporations continues to grow, the government should consider making compliance with certain oversight measures, like the NIH guidelines for recombinant DNA or comparable safety standards, mandatory for all researchers regardless of funding. Such a step may require new legislation or revisions to existing statutes and regulations. This issue should be considered by the Executive Office of the President, or another designated office, as part of the gap analysis described in recommendation eleven.
 
James Wagner:
Barbara, thank you. Our fourth and final recommendation in the Intellectual Freedom and Responsibility, Nita is going to lead us.
 
Nita Farahany:
One of the concerns that we heard most frequently in our discussions was the concern about the dual-use applications of emerging synthetic biology. While there are tremendous benefits, and that’s where we expect the direction of much of this research to go, there is also the possibility that such research could be put to malevolent or nefarious use. This includes concerns such as creating novel pathogens, or reviving existing pathogens, and administering them to populations, or introducing things into water systems or ecosystems that might be quite destructive. Understanding that research happens on an international level, and that restrictions we place here may simply lead to researchers or research moving abroad, we’re mindful that there needs to be a balance between whatever restrictions we might wish to impose upon the export or import of different parts that would be used for synthetic biology and the need for research to progress. To the extent that we are a leader in synthetic biology, we don’t wish to stymie that type of progress.
 
We had a panel in our second meeting that was dedicated to biosecurity concerns. I think it gave us all great comfort to know the level of coordination that’s happening at an international level, particularly with respect to things like sequences of select agents that are being regulated, where, rather than individual researchers being able to simply purchase such sequences, they are required to go through a checklist and look at things like whether they’re on a list of those permitted to buy such things. Recognizing that there is significant oversight already happening, that there is significant coordination happening at an international level, and also recognizing our principle of prudent vigilance, which is that we don’t wish to stymie research or intellectual freedom, we come to recommendation number thirteen.
 
Recommendation number thirteen says: to promote intellectual freedom while protecting national security, the government, acting through the Department of Commerce in coordination with relevant science and regulatory agencies, such as the NIH and EPA, should continue to engage the scientific community in discussions and policy making to ensure that export controls for security purposes do not restrain the free exchange of information and materials among the international scientific community.
 
I would just propose that we consider slightly broadening this. In particular, rather than focusing exclusively on the export controls, I would encourage us also to think about import controls that likewise might be applicable.
 
Then I wonder whether the Department of Commerce is the right agency for us to direct our recommendation to. Maybe Alex can speak to whether or not, given the national security concern, we should be directing this more towards something like the Department of Homeland Security or DOD, or at least include those agencies more broadly as well.
 
James Wagner:
Alex, you want to take a crack at that.
 
Alexander Garza:
Sure. The Department of Commerce, of course, is the regulatory body for import and export controls, so that’s why, I think, they were rightly named in this. I do think, though, that the Department of Homeland Security should partner with the Department of Commerce on the national security issues that revolve around import and export controls. However we can fashion that into the statement, I think, would be appropriate.
 
Amy Gutmann:
I think we should probably put that in the text. The Department of Commerce is the right body — it will be the body that determines this. I and other leaders in higher education have had experience with the issue of export controls, and when there is a report issued, it’s definitely the Department of Commerce that makes the final determination. We can certainly put in the text what other bodies need to be consulted on that.
 
Nita Farahany:
Where we say, “in coordination with relevant science and regulatory agencies,” that’s where, rather than just listing things like the NIH and EPA, I would suggest that we broaden this, given that we’re making it a national security balance, to include those agencies as well.
 
Amy Gutmann:
Right.
 
James Wagner:
How do you feel, again, about the one you presented, Nita, where we talked about not wanting to unduly restrain the free exchange of information and materials? Are we explicit enough? Not for my comfort; maybe just for everyone else’s comfort. Are we explicit enough about the notion that so much of what is valuable is data … much of the innovation that may come will be around gene sequences, which will be stored … nothing more than data. Is it redundant or ill-advised to talk about information, data, and materials? I realize data is a form of information, but in this case data is a form of material. It really is a material of the science.
 
Nita Farahany:
I know this has been a concern of mine, and of many of ours, the fact that so much information, for example on pathogens that are no longer in circulation, is now simply posted on the internet and widely available. I think we also heard that it’s really difficult to restrain that type of information, and that for us to try to restrain it would be rather fruitless. Looking at something like Wikileaks, I would imagine that —
 
James Wagner:
This section is talking about ensuring that we don’t restrain it, that’s why I was calling it out.
 
Nita Farahany:
Right, so I’m saying, given that context, I think it’s appropriate for us not to try to restrain it, given everything that we heard and the understanding that it’s so difficult to do. Instead, if we’re thinking about things like export controls, and I would suggest also potentially import controls, at least for identified select agents and identified sequences, that may be something that we can more plausibly track. But to actually restrict it seems relatively —
 
James Wagner:
The reason I want to call out not restricting it is, in part, because for many of our university workers in laboratories, just the phrase “export controls” sends chills up their spine because of the — I was trying to think of a non-pejorative descriptive word … events of not too many years ago, when even students in our labs were deemed as — what they called deemed exports, right. They were deemed exports because they carried information in their heads. It’s the truth.
 
Nita Farahany:
Are you suggesting that we make this more specific as to specifically what we mean —
 
James Wagner:
Well, no, I’m just trying to cover it. The last phrase says that we do not want to see restrictions, and I’m trying to suggest that data should be in the not-restricted area along with the general categories of information.
 
Nita Farahany:
I’m reading this a little bit differently, perhaps. I’m not reading this to say we do not restrain it at all. I’m reading it to say that we’re balancing to make sure that export controls, for security purposes, do not unduly restrain the free exchange of information and materials among international scientific community. I’m reading that to say that we think we don’t wish to overly restrain such free exchange and yet recognize that some oversight is appropriate.
 
Amy Gutmann:
I think the concern, if I could zero in on Jim’s concern, may be lumping together information and materials in one phrase. On the free exchange of materials: even if you are agnostic about how desirable it is to restrain it, and I’m not, I think it’s undesirable to restrain it, but even if you think it might be desirable to restrain it, you cannot in this day and age.
 
There will be free exchange of materials in this domain. That’s different from high-security information, which originates in a security realm. We’re talking about information here that comes out of scientists’ labs. I think we need to make sure that we distinguish in some way between information, pure information, and materials. It’s going to be impossible to restrain, and it would be undesirable as well. I think there we can be a little bit clearer that we’re really focusing largely on the products of science for export controls, rather than information. There will be gray areas, for sure, but as I like to tell my students who say, “there are these hard cases, how can you have this distinction?”: well, there’s dusk and dawn, but that doesn’t prevent us from knowing that there’s day and night. And there is some information in the science realm that this principle of Intellectual Freedom and Responsibility is really meant to address: we’ve got to enable it to be freely exchanged, even though there are downside risks there.
 
Nita Farahany:
I agree that we do not want to restrain information, particularly the way in which we’re characterizing it here. Nevertheless, we’re going to need to be much more specific about what we mean by materials, given that in this area, materials can be something as simple as oligonucleotides or the directions on how to synthesize something.
 
Amy Gutmann:
But remember, we’re not saying that materials should be restrained. We’re just saying that they —
 
Nita Farahany:
Well, we understand that they are restrained, so there are some restraints, at least for select agents, not down to the reagent level or the oligonucleotide level, but at least for certain —
 
James Wagner:
Let me go a little further. How comfortable are we with saying — even cautioning about the restraint of knowledge in these areas? Again, I’m cautious about … that era, and some of you were doing lab work at that time, when you couldn’t invite a Pakistani student into your laboratory if your particular activity was considered to be one related to national security. Is knowledge too broad a word to caution against restricting?
 
Nita Farahany:
Perhaps. I hear the concern and I agree with it. I’m wondering if we’re trying to make this particular recommendation do too much work then.
 
James Wagner:
Could be.
 
Nita Farahany:
I’m reading export controls and intellectual freedom, the balance here, as pertaining particularly to the discussion we have in earlier chapters about export controls for select agents or products, products again, though this is difficult language, such as synthesized sequences that we don’t wish exported. So, if that’s what this is about, we find that to be permissible: we don’t want to disturb it, and yet we recognize that we don’t want to unduly restrict academic freedom. A second, perhaps stronger, statement could reflect this concern.
 
Barbara Atkinson:
Maybe then we need to define what we want to restrict and define what the rest is, because I do agree, there are things that we need to restrict. But I really think the whole export control effort was ill thought out at the time and has not done us a favor.
 
Amy Gutmann:
It was resolved in a good way, and I think in a way that this recommendation builds on. This recommendation should focus, and if we haven’t worded it exactly right, we should reword it, but I think the way Nita read it, it’s correctly worded, it’s not incorrect, on the fact that this works best when the scientific community is consulted, and this recommendation says that —
 
James Wagner:
Continue to engage in scientific —
 
Amy Gutmann:
That the regulatory bodies, the ones who in fact write the regs, should continue to consult with the scientific community in all instances. That’s the way the real controversy over deemed export controls was resolved: it was determined that since there weren’t actual cases of threats at issue, but only potential cases, there ought to be a scientific panel or group that would be consulted in any case that came up, to resolve the issue. And that was a very good resolution at the time. I know many of the people who were involved in that, and I thought it came out well. The process leading to that conclusion, I think, we could make much more effective and have less angst about, because we’ve learned from the mistakes that were made along the way. But I think, as you worded this recommendation, it says the right thing.
 
James Wagner:
Comments on some of the other recommendations in this section?
 
James Wagner:
Nelson.
 
Nelson Michael:
More as a practical point — and this is something that came up in our discussions this morning, if I recall, I think it was your comment — we can put in place mechanisms so people who aren’t traditionally involved in biomedical research, like do-it-yourselfers, can get access to guidelines so that they’re aware, and, if regulation were put into place, could become compliant. It’s kind of a bewildering system, if you’re not inculcated and trained in bioethics, to know where your entry point is. So we had the discussion about some sort of user-friendly portal, so that somebody working in their garage who now has to comply with these regulations can at least do their due diligence fairly simply. Because I think you rightfully said, if I remember your comments, that in the absence of that, even people with good intentions are likely to give up because it’s too hard.
 
James Wagner:
Would we want to reflect something like that in here? I hear Barbara saying yes.
 
Nelson Michael:
Yeah, I think so.
 
James Wagner:
Yeah. If we’re trying to do as this one says, develop further an ongoing culture of responsibility, to provide (and I like your word, portal) a place to do that. So maybe we ought to consider crafting something of a recommendation along those lines.
 
Amy Gutmann:
Put it in this —
 
James Wagner:
Yeah. Nita.
 
Nita Farahany:
We discussed this a little bit earlier, but the way that we’ve worded ten and eleven right now suggests that we need to do a gap analysis of the specific risks of synthetic biology in non-institutional settings. I think we should likewise word number ten to make clear that, when the government evaluates current oversight mechanisms to assess their effectiveness and determine what steps should be taken, we’re suggesting an overall gap analysis, not just with respect to the do-it-yourselfers. We’re not singling out the do-it-yourselfers with respect to gap analysis.
 
James Wagner:
That’s right.
 
Amy Gutmann:
Does everybody here agree with that, because I think that’s an important revision that we should integrate?
 
James Wagner:
I agree. Any other thoughts or comments on this set? Remember, we’ve been talking in the context of Intellectual Freedom. I imagine that if somebody attended only this session and none of the others, they might imagine that we were somehow less concerned than we should be about responsible stewardship, for example. But I think in the context of this particular principle, all of the comments in our conversation are well received. Thoughts from the group? Yes, please, come to the microphone and let us know who you are.
 
Stephanie Ramage:
Hi, I’m Stephanie Ramage from The Sunday Paper here in Atlanta. And I confess I missed some of the stuff this morning, so you could have given this already, but I notice there’s a lot of concern about the DIY practitioners. There’s been so much concern about it that I’m afraid that I might misconstrue or perhaps mis-communicate the … well, I guess the heft that they have in this community. How many people are we talking about exactly? How many DIY practitioners might there be? Has anyone looked into that?
 
James Wagner:
We had a very good presentation on that, and I’m trying to recall it. It’s surprisingly large. In the comments that you might have missed this morning, we were talking about the importance of recognizing that you can’t arrest research; we even talked about the perils of arresting research. We also talked about the nuclear industry as an example, and then a very good point was made that, unlike some of that kind of science, the science associated with synthetic biology can happen in people’s garages. Dan.
 
Daniel Sulmasy:
We do have a cited estimate in our report of two thousand people belonging to this informal network called DIY Bio.
 
Amy Gutmann:
I think it’s important to say where the DIY community sits within the larger context, as you asked. It is one of the distinguishing features of synthetic biology; it’s not the only field in science, but it’s one of the fields where you can do work in your garage, much the way there were chemistry kits that I had as a kid that I could use in my home, and they weren’t without some risk. They probably posed greater risk than the do-it-yourselfers pose right now with regard to synthetic biology, as far as biosafety goes.
 
In the larger context of what brings us to study synthetic biology, all of the scientists we spoke to agreed that what you can do in your garage now with regard to synthetic biology is very elementary, even compared to what synthetic biology has accomplished in its infancy or adolescence as a relatively new field. So there’s nothing complex, not even close to the synthesized genome, which is a very simple organism that the Venter Institute created, or to what Jay Keasling is doing with regard to malaria drugs.
 
What is wonderful about the do-it-yourself community is that it has spurred interest in the burgeoning field of synthetic biology. What we’re doing as a Commission is looking proactively at ways of getting that community educated in the same way that other practitioners of synthetic biology are. But if there are people who are panicked about what could happen in a garage today with regard to synthetic biology, I would say that what we’ve learned is that there’s no cause for panic today. What we really want to do is look forward to make sure that the two thousand plus members of the do-it-yourself community become what Drew Endy said he would like to see: more DIT, a do-it-together community, in the sense that, at least online, there is some loose organization of the people who are seriously involved in synthetic biology in their garages.
 
James Wagner:
Barbara, did you have a comment on that? Oh, that was the point you were going to make.
 
Anita Allen:
If I could just summarize, Amy, what you just said very nicely, for the purposes of our questioner. There are three unique challenges the do-it-yourselfers pose: one is a challenge to traditional government regulation and models of government regulation; the second is a challenge to traditional mainstream bioethics, because it’s a question of access to education; and the third is a challenge to traditional intellectual property conceptions, because we see a more freewheeling and less reverent group who are not necessarily bound by traditional notions of copyright, trademark, and patent. Those three challenges, I think, have made this a very interesting and important topic for us to deliberate about.
 
James Wagner:
Chris.
 
Christine Grady:
I just wanted to add to that. What struck me as not only one of the riches of synthetic biology but also one of its challenges is that, in addition to the do-it-yourself community, synthetic biology brings together disciplines that have traditionally not worked together, and that sometimes, maybe often, have different cultures in terms of how they do their work. That’s an interesting challenge in terms of the big picture of synthetic biology, and it takes in much more than just the do-it-yourselfers.
 
James Wagner:
I think as far as research — I’m sorry, John, go ahead.
 
John Arras:
Are we done with this question? I want to take us back to number twelve, where we called for considering the extension of oversight measures to all researchers regardless of their funding source. We hear this sort of call from time to time with regard to standard research, clinical research. The objection is put that there’s no reason why the federal regulations shouldn’t be extended to everybody doing research on human subjects in this country. I think that’s a powerful observation, but that hasn’t happened, and I’m just wondering what the impediments to that extension are with regard to research in general, and whether we could expect the same sorts of impediments with regard to this call in synthetic biology being applied regardless of funding source.
 
James Wagner:
Barbara.
 
Barbara Atkinson:
I’m not sure I have the complete answer, but I would just say that in human subjects research, there’s at least enough control from the FDA that things have to be done along the path. Even though it’s not the same as the NIH guidelines, there are a lot of human subjects guidelines. Synthetic biology has this risk out there that we really don’t know how bad it is, and that’s really what makes this, to my mind, a little different. It’s really an issue of managing a risk that might or might not be an important risk, by having at least some oversight while it’s at this early stage.
 
John Arras:
I think we might want to add some language to that effect here, to distinguish this from the standard case of clinical research, showing why there might be more need for expanded oversight in this case.
 
James Wagner:
Nita.
 
Nita Farahany:
Building on John’s point, I discussed before some thoughts about transparency and how we could increase transparency in this particular field, and this area raises a unique concern with respect to registering novel organisms or things like that. It seems like maybe this might be a place where we could think about that. We’re saying that we should consider making compliance with certain oversight measures, like the NIH guidelines or comparable safety standards, mandatory. It may not be just oversight measures, but regulatory measures, including registration or disclosure requirements. And so, given that we are already recognizing that such a step may require new legislation or revisions to existing statutes or regulations, one such revision we could consider is whether new mechanisms for disclosure to the government, disclosure in a limited capacity, of novel organisms could be something that we recommend looking into.
 
James Wagner:
How do people feel about that? It would appear to me that disclosure or registration, unless it can be kept confidential and proprietary, has a risk of driving work underground. If it can be kept proprietary, as we do around nuclear materials, for example, perhaps … How does the Commission feel about that?
 
Nita Farahany:
That is the way in which I am recommending it: that it be something that allows people to preserve trade secrets with strict confidentiality, allowing it simply for oversight, for tracking or surveillance purposes, as we already do with things like select agents or certain nuclear materials.
 
James Wagner:
Nelson.
 
Nelson Michael:
Yeah, I would say that if you could preserve the protections surrounding trade secrets, then I think it’s certainly laudable, and I understand that there are extant exemplars for how it has worked in the past.
 
James Wagner:
Interesting to hear from somebody on that. Barbara.
 
Barbara Atkinson:
I’m not sure, but I sort of take the other side; I’m not sure it’s needed. It may be that we’re talking about it because we don’t know the risk, and that’s why it might be needed. On the other hand, this is a field that’s growing so fast, there’s going to be so much registration and so much material that I cannot imagine who’s going to want to keep track of it after a while. So, I guess I’m just not sure it really would do what we want it to do. And what would somebody do if it’s registered and they don’t like it? It’s not an easy issue to just think about registering something and keeping it confidential.
 
James Wagner:
Nelson.
 
Nelson Michael:
Well, maybe you can take the approach that, since we don’t know and we want to gain knowledge, there could be a pilot project that could ask whether or not this would be feasible.
 
Anita Allen:
One of the things that I think we should not forget is how routine and ordinary it is for the federal government to collect data and then keep it secret for business reasons. Companies have to give up vast quantities of sensitive data for compliance with the tax laws, securities laws, OSHA, and employment laws, and the government is actually quite decent at keeping that material secret to respect business practices, trade secrets, and so forth. So it wouldn’t really be extraordinary if, in this realm, the government required some kind of registration or something else to protect human safety and wellbeing, while also having the expectation that it will keep sensitive, proprietary information secret. It’s just not an extraordinary thing at all to ask that of government or the private sector, in my opinion.
 
Barbara Atkinson:
That’s a good point, but I’m still not sure who would do something with it. It’s really a question of what the practicality of it is.
 
James Wagner:
We have foundries that are already doing some prescreening.
 
Barbara Atkinson:
That’s true. And that’s very deliberate, right at the time of use; it’s not at just a general level.
 
Amy Gutmann:
There are three communities here that are the primary targets for a recommendation of this sort. One is already covered: the publicly funded community. The second is private organizations that are doing major research in this area; their desire to get patents is going to lead them into registration, a confidential registration, but late in the game, so the question is whether we want registration earlier. The third community is the do-it-yourselfers, and they’re not at present working on anything that would require registration at this point. Now, maybe sometime in the future they would be, but even if we recommend a registration, that’s just not going to happen. So it’s really the second group that —
 
James Wagner:
The second group may be broader than we think, because it may include not just those who will eventually disclose through a patent, but those who never had any intention of disclosing through a patent because they have a financial incentive to keep it a trade secret, says somebody living in Coca-Cola town.
 
[AUDIENCE LAUGHTER]
 
Amy Gutmann:
That’s the real target here for a registry. Other than that, it doesn’t have a target.
 
Nita Farahany:
To that point: we recognize that this area of science is proceeding at great speed with private research funds, and it seems, more so than other research fields, to have a tremendous amount of private funding. And given a risk that is small in probability but high in magnitude, a minimal measure like registration makes sense. You could also limit what you're talking about; you don't have to register everything. You could register anything that is a novel —
 
Amy Gutmann:
I'm agreeing, but saying that you have to make it very narrow, because otherwise what this captures is the whole do-it-yourself community, which will be up in arms, and rightly so, that we're driving them underground, because they would now have to register everything they think they're doing. So we should really make clear what the target area is that presents enough potential risk that it's worthwhile considering registration. I think this recommendation is well worded, because it says we're recommending that it be considered for … you know … and if we add the registration …
 
James Wagner:
… to be considered. I don't think we've had enough expert testimony in this area, and maybe not enough expertise sitting here (that is certainly true speaking for myself) to really come to a conclusion today on this. But I think you make a good point. We wouldn't have to, if we recommend here that it be considered as a tool.
 
Amy Gutmann:
As we've done elsewhere, we should be specific about where our concern lies, and it's in this particular area.
 
James Wagner:
You could see a registry system having broader — if foundries would agree to accept work only from registered individuals, etcetera —
 
Raju Kucherlapati:
Can I make a comment?
 
Amy Gutmann:
Oh, Raju. Raju, come on in, we welcome your comments.
 
Raju Kucherlapati:
Thank you. I think we should be very careful about a registry, because, as we talked about in the earlier part of the report, it's very clear that this is a continuum. There are a lot of research efforts that might involve the synthesis of small oligonucleotides that are important for synthetic biology but are not truly synthetic biology, and if we get into this requirement, it may turn out that essentially all of molecular biology would be considered synthetic biology. So we have to be careful about how we define it and what it is that we are trying to prevent.
 
Amy Gutmann:
Raju, while we have you speaking, could you say a little bit about Asilomar and how you see it as a model, and why it is important here?
 
Raju Kucherlapati:
Asilomar was a meeting of scientists, and toward the end of the regular scientific meeting everybody wanted to talk about recombinant DNA technology and what its implications were. At that time, very little information was available about the benefits and the risks associated with recombinant DNA technology. But what it did was really stimulate the Recombinant DNA Advisory Committee at the NIH, and they were able to put together a series of guidelines that were adopted by the scientific community. The Recombinant DNA Advisory Committee still exists, and it periodically reviews all of the rules and regulations. In some cases they have eased the regulations in certain respects, and in other cases they have tightened them. They have been able to continuously change them as the science has progressed.
 
James Wagner:
There's a living element to that. Let's decide how we're going to deal with what's on the table, and then we have to go to a break. Is number twelve, as listed, sufficient? Do we want to add a suggestion where we say "oversight measures like NIH guidelines, comparable safety standards"? Do we want to include a phrase there on a potential registry, as something to be considered? As I said earlier, I don't believe that, as a body, we're able to consider it thoroughly.
 
Anita Allen:
I would hate to see that specific instrument included in the report. I want to emphasize again that some such government involvement, consistent with intellectual property protections, is quite possible and doable, but I don't want to commit to a registry as such.
 
James Wagner:
Do we want to put something in the text around this one that might do it, and not change the recommendation?
 
Nita Farahany:
I guess there's a different way we could go. Right now we say "certain oversight measures like NIH guidelines or comparable safety standards." A registry isn't really a safety standard per se, so we could change what we say these oversight measures are; that is something that belongs in the text, right? Maybe a registration requirement or something like that. But perhaps simply eliminate our particular recommendations here and say "compliance with certain oversight measures," period. And then in the text —
 
James Wagner:
Mandatory for all researchers, regardless.
 
Nita Farahany:
Right. And then in the text, include some of these possibilities: NIH guidelines, comparable safety guidelines, and maybe registration as a potential additional measure, because I agree, I don't have any particular knowledge of —
 
James Wagner:
The text rationale may be the better place to address this.
 
Nita Farahany:
Right.
 
James Wagner:
How do people feel about this?
 
Anita Allen:
Yeah.
 
James Wagner:
Let’s go with that.
 
Let’s take a break. We will return at 3:30.
 
Amy Gutmann:
And at 3:30 we’re going to have Tom Murray present to us. Don’t be late.

