NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

Committee on India-United States Cooperation on Challenges of Emerging Infections and Global Health Safety; Policy and Global Affairs; National Academy of Sciences; Indian National Science Academy. Indo-U.S. Workshop on Challenges of Emerging Infections and Global Health Safety: Summary of a Workshop. Washington (DC): National Academies Press (US); 2016 May 6.

4 The Biotechnology Revolution: Exploring New Territory Together

Raghavendra Gadagkar introduced the fourth workshop session by referring to opportunities and challenges presented by the emerging capabilities of the biotechnology revolution, and stating that India and the United States can explore this new territory in the life sciences and related technology together.

EMERGING CAPABILITIES IN THE LIFE SCIENCES AND CHALLENGES FOR GLOBAL HEALTH

David Relman began his presentation with the point that one of the keys to understanding why and how infectious disease events occur is recognizing the roles of ecology and evolution. These events occur in an ecosystem under stress; an ecosystem in which infectious agents (microbes), hosts, and the environment are all evolving. All three—microbes, hosts, and the environment—concern us.

In addition to the loss of human life and suffering as a result of the Ebola outbreak, events like these also have important effects on public health, social, economic, and political infrastructures. Sometimes these events have as much of a detrimental effect on the overall ecosystem as they do on individuals. These events reveal a number of important needs. We need to understand how and why these events occur, both from the point of view of the agent and from the perspective of the larger system. To overcome existing challenges, we need interventions such as diagnostics, drugs, vaccines, and the means to deliver them in a rapid and flexible manner. This is the context in which we can discuss science and technology and what is occurring today.

Relman then turned to history because the revolution in the life sciences did not begin yesterday. It began more than 50 years ago with the first description of the transmissible inheritable material, DNA. Over the subsequent 20 or so years following the discovery of DNA, the tools for manipulating DNA and what is now referred to as recombinant engineering were developed. Some have described the mid-1970s as the beginning of the age of biotechnology. Thereafter it became possible to sequence and later synthesize DNA, and hence to understand and manipulate it for positive purposes. Genetically modified plants first became available in the 1980s, and the ability to sequence an entire genome was realized in the 1990s. The commercialization of all of this technology then became possible. Since the turn of the 21st century, these trends have escalated enormously, such that today's trends show exponential growth in technology and a simultaneous decrease in costs. Relman cautioned that there is a difference between information, which is what is gained in these advances, and insight, which is not always measured in such dramatic terms, but one hopes that insight follows information.

About 8 years ago, Relman had the opportunity to help organize a study at the U.S. National Academy of Sciences (NAS) that examined these trends and asked what their impact might be on our future.1 As part of that study, experts were trying to grapple with the many diverse types of capabilities that might fall under the umbrella of biotechnology. First, they considered ways of trying to classify all of these technologies in simple terms. Relman admitted that it is not simple, but the idea was that there is a group of technologies whose purpose is to generate a great deal of diversity that would not necessarily occur normally in nature. This is an immensely important capability and it is exemplified by DNA synthesis, DNA shuffling, and other types of technologies.

Second, there is a related but distinct type of technology that is motivated by the goal of deliberately designing a particular life form. One may start from the code for an existing agent, or have in mind a code for something that does not exist; either way, the means now exist to make that agent in the laboratory.

Third, there are technologies that assist in understanding how complex systems operate and in identifying the small critical vulnerabilities or opportunities for intervention, an approach called “systems biology.” Many other technologies fall into this category as well.

Finally, there are biotechnologies involved in producing, packaging, and delivering all kinds of products including DNA itself. All of this has led to an immense number of benefits and one could spend a great deal of time counting the types of important results that have come from these technologies.

A particular development illustrates what biotechnology might offer. Scientists have created in the laboratory a yeast strain that contains all of the genes and pathways for making artemisinin, a very complicated molecule that is now one of the most important anti-malarial drugs. It was previously only available from its natural source, the sweet wormwood plant (Artemisia annua) in China. But those supplies have been dwindling and the costs have been rising. Now, through bioengineering, one can make this drug much more cheaply in the laboratory.

This is just one of many examples. There are numerous other examples, and Relman selected a few to discuss, although he noted that all of them should cause us to think about not only the benefits that these technologies provide, but also some of the other ways in which they might be used or perhaps inadvertently misused. One example is the means to remake living things. In other words, one can remake most viruses from just the sequence. This is certainly true of RNA viruses and most DNA viruses. This technology is actually quite old. In 1994, a German report documented the recovery of rabies virus from cloned cDNA. This was one of the first examples of how one might take cloned cDNA for an RNA virus, cause it to be expressed in the laboratory with the necessary proteins present at the same time, and allow the virus to form in tissue culture. This type of work has continued for 20 years, and this technology has provided a large, important capability for understanding how viruses operate, understanding pathogenicity, and manipulating viruses for many very useful purposes.

Relman then described his collaboration with Craig Venter and a recent example from Novartis Vaccines, which used synthetic capabilities to make the seed stocks for some new influenza vaccines: starting with sequencing, then ordering the DNA or cDNA that would encode the necessary antigens, and producing it in synthetic form. This saves a fair bit of time in a process that is very sensitive to timeliness. This same capability to remake viruses has also allowed people to study viruses that were not previously known to exist. Relman referred to Ralph Baric and his group, who have been conducting very interesting work examining coronaviruses, such as SARS, and trying to determine where they originate. If one creates an evolutionary tree of all of the related viruses known today, one can extrapolate back in time and infer that there ought to have been a matching ancestral virus. From that hypothetical sequence, Baric and his colleagues made the virus in the laboratory and showed that it had the properties predicted from the evolutionary studies of viruses existing today. This same technical capability to remake viruses, shuffle their genomes, and ask whether other properties can be created is now very powerful and widely used, especially in commercial sectors. It can also lead to the ability to take a virus that normally has liver tropism and give it a different organ tropism, or to take a virus that had low yield in the laboratory and give it high yield, and so forth. There are many examples of this kind of capability.

What Relman thinks is even more interesting today, he said, is the idea that one could engineer entire communities of organisms, not single organisms, but organisms that normally interact together so that they feed each other synthetic, newly-engineered compounds and help each other make resulting products that no organism alone might have produced. With the help of advanced technology, kits are available for experiments that might have taken months to years in earlier times. Today this work can easily be conducted in weeks. More and more of these kits are becoming available.

These capabilities have led to work such as the creation of reengineered mosquitoes that have lost their fertility, with the aim of deliberately driving to extinction the species that carry malaria. However, we must think carefully about the ripple effects of such experiments in the ecosystem.

The modification of genes is now being proposed as a possible cure for inherited genetic diseases. Currently, work is being conducted in mice, but the same idea is also being considered in humans. One can begin to ask very interesting questions: What are the implications of the availability and capabilities of these types of technologies? What if a user does not have entirely beneficial purposes in mind? Relman raised these questions because when he was a student, his teacher, Stanley Falkow, one of the fathers of pathogenesis in bacteria, frequently said that nature is the ultimate creator of all that might be. Over many years of evolution, there has been so much natural experimentation that what we see today is something that humans could not hope to outdo. How could we possibly create something that would have greater adaptability than what has been found in nature? The new capabilities available today have caused Relman to rethink this question.

He then discussed another experiment conducted by Lee Riley of the University of California, Berkeley. Riley studies tuberculosis, and he had a strong suspicion that the Mycobacterium tuberculosis mce1 operon was necessary for the virulence of tuberculosis. He set out to knock out the operon, predicting that the tuberculosis would be attenuated. Instead, he obtained the opposite result. It turned out that these genes control the replication of the bacterium within host cells in a way that suggests that nature may have deliberately caused tuberculosis bacteria to replicate slowly as a means of long-term survival. There is no point in killing the host very quickly, the way this mutant did, if the organism is to survive for a long period of time. On the other hand, if this organism were set loose today, it would take quite some time to eventually become readapted to humans. In the meantime, over the successive years, humans, as hosts, would suffer the consequences of a poorly adapted Mycobacterium tuberculosis. In other words, this discovery led to the realization that many pathogens are actually naturally attenuated, and when we identify these genes, we are able to unattenuate them, to intentionally make them more virulent, if we chose to do so.

This story is true for many pathogens, which led Relman to discuss highly pathogenic avian influenza. Observations in nature have raised some compelling questions, and he was the first to admit that these questions are exceedingly interesting and potentially very important. With regard to H5N1, one of those questions is: Why has the virus not become more easily transmitted between mammals? Some have argued that since this has not yet occurred, there is something about this virus that does not allow for enhanced mammalian transmissibility while maintaining its proper hosts and other properties. Others have argued that transmissibility will happen with time. It was this latter argument that led to the work behind two widely-cited gain-of-function publications.2

These experiments were a deliberate effort in the laboratory to see if H5N1 viruses could acquire enhanced transmissibility between mammals, ferrets in this case. The first study, by Ron Fouchier, began with a highly pathogenic Indonesian influenza isolate; through deliberate redesign of the genome as well as passaging, he found that he could isolate a virus that had the property of enhanced transmissibility and, to our knowledge, no great reduction in virulence. A subsequent follow-on study has provided everyone with the five mutations that together allow this virus to have these properties. These results indicate the immense power of this technology, Relman noted, and the means of achieving ends which nature certainly has not yet achieved, and might not achieve in the future.

These experiments and their results also raise important questions about whether this type of experiment is necessary. When Fouchier's paper was published, the world obtained the sequence, the five mutations, needed to remake the virus because influenza, like other RNA viruses, can be remade in the laboratory by anyone who has the appropriate technology. Some have reacted very dramatically to that particular finding. It certainly caused people to stop and ask questions.

When examining the whole world of microbes, we can see that nature, through many trials and errors, has made successful pathogens relatively rare. Pathogens are exceedingly rare as a fraction of all microbes on this planet. Pathogenicity is a trait that would probably be very difficult to achieve, even with insights gained over time, so it would be hard to recreate nature's balance. However, Relman stated that nature certainly has not tried all of the sequence possibilities that scientists could try in the laboratory. Success in nature is different from success in the laboratory.

Further, even when people are absolutely well-intentioned, accidents will occur. They happen everywhere, even in the best of laboratories. Therefore, if we have a virus or an infectious agent with new enhanced properties, what is the likelihood that it will escape despite all best efforts?

Relman concluded with several challenges and questions. The challenges in confronting emerging infectious diseases include: (1) immense diversity, even in nature; (2) the importance of maintaining essential science and technology; and (3) the challenge, but also the potential danger, of relying upon certain kinds of oversight approaches that may provide a false sense of security. Relman is concerned that, at least in the United States, people have become very dependent upon lists of specific organisms that we think we can define and identify as dangerous. These lists, however, are incomplete. Our ability to identify an organism, even Bacillus anthracis, is problematic. There are some Bacillus cereus strains that behave just like B. anthracis, but they are not on the list. When we have lists, we may stop thinking thoroughly. Relman suggested that perhaps there might be better approaches to this problem.

The empowerment of individuals by new technology, even for very good and noble purposes, also raises potential risks. In the United States, there has been some consideration of these issues, and experts in India could help those in the United States by challenging some of these definitions and considering alternative approaches to overseeing and thinking through what some of these experiments might mean. The concern is that scientific knowledge and innovation should not fall into the wrong hands, where they could pose a significant threat to all living beings and the environment. One must rethink how to mitigate these risks.

Each of these issues deserves serious discussion. For example, the issue of possibly regulating access to organisms, technologies and/or knowledge is worth discussing. Relman is dubious that this approach would work well, but under certain circumstances it may be an option. He supports the idea of promoting awareness and sensitizing all of the relevant communities, especially the science communities, to the potential benefits and risks of research. He noted, however, that scientists must accept the fact that despite all efforts, there could continue to be outbreaks, both natural and potentially man-made, and the public health infrastructure and other countermeasures used as defenses must be strengthened in order to mitigate the negative effects.

Just several weeks prior to the workshop, the United States announced a pause in conducting certain types of experiments due to concerns regarding the risks of these experiments.3 What should be gained from a pause in research, and when should research resume? How are we going to assess risk and measure it against benefit? These issues are hard to quantify, if quantifiable at all, but there must be some kind of effort to weigh the two against each other. To what degree can risks be anticipated? Many of the often-cited scientific discoveries were unexpected. How will we know in advance that there might be a potentially risky outcome? Who decides whether there might be experiments that ought not to be undertaken? What are our collective responsibilities to society? For example, what would those people who did not attend the workshop, or who are unaware of this research, think about it? They may not know the details of the science, but they certainly care about whether scientists undertake experiments that put them at risk. Scientists must consider this, how to reduce risks, and the most effective approaches to pursuing science and technology safely, for the betterment of society.

Discussion

The discussion following Relman's presentation focused primarily on the importance and influence of ecology in the study of viruses.

Referring to mutations in viruses, a participant asked Relman about the example of HIV. From the day a person is infected, many changes occur continuously in the host. Despite there being 1,000 variants, the virus that survives has some advantages. Relman agreed with this point, which has to do with the importance of understanding the selective forces in nature, in situ, that drive evolution. We often mistakenly think that the evolution that we engineer in the laboratory is necessarily the same evolution that occurs in nature. Some of the end results may look similar, but the paths that the virus took—HIV in particular—are different. David Baltimore has shown very clearly that the route to drug resistance in nature is often quite different from the route to the same mutation in the laboratory.4 Therefore, were Relman in West Africa, he would conduct sequencing and accompanying clinical studies of what is happening to the Ebola virus and what are the phenotypes associated with these new variants arising in nature. He added that he does not think scientists need to create these viruses in the laboratory. Nature is creating them for us. Consequently, we need to spend much more time studying these developments in nature.

The participant added that the ecology in the lab, in the body, or in the environment could be equally important. Where could these ecologies be incorporated into the models? Is it possible to have a common model, or would one have completely independent models? Also, so many of these pathogenic infections have arisen in central Africa. Are there environments in which some of the surviving mutations can become pathogenic? And are non-pathogenic infections equally important, even though we are not often concerned with them because of their limited human health effects? It may be that the non-pathogenic viruses are keeping the balance.

Relman replied that the answer to the first question is that the ecology is immensely important. The only reason it does not appear in the models is because it is so complex that humans try to reduce the complexity and study the pieces, the components, the individual viral pathogen, the individual host. However, the question is correct; the interactions are most important and one can begin to conduct more complex investigations in nature and that deserves to be done. There are many interesting ways of doing such investigations.

Relman also reiterated the need to move away from lists of agents. A participant asked what other solutions there might be for governments or policy-makers besides these lists. Relman replied that this discussion was raised when he served on the National Science Advisory Board for Biosecurity (NSABB). The NSABB recommended that NAS conduct a more deliberate study of alternatives to a nomenclature-based list.5 NSABB considered whether the properties of concern could be described rather than relying on names of organisms. The study was undertaken and the report came out several years ago. The answer was very difficult. This, however, does not stop Relman and others from continuing to think hard about whether we can describe these phenotypic properties (the behavior of the organism) and begin to predict them from the genotype or sequence. That is the critical issue. Can one take a sequence and predict how the organism will behave?

Continuing his response, Relman acknowledged that the point about ecology is also important because how virus A behaves in one individual is very different from how it will behave in another individual, depending upon the indigenous microbiota, diet, and where the individual lives, and other contacts made with the virus and with the individual.

A participant noted that not only is it important to understand emerging diseases, but it is also important to understand when diseases fade out. This is generally not taken into consideration. Relman agreed that this is also potentially immensely important. The reasons that a disease fades out are probably quite diverse, but they most likely include continued evolution or selection against the properties that make the disease so obvious and dramatic (i.e., attenuation), adaptation of the host, or some kind of accommodation. Those would be excellent aspects to understand when or if they happen. Sometimes it is just stochastic. Early outbreaks of Ebola ended because insufficient numbers of people were close together to propel the outbreak forward, and the proper medical and infection control responses were effective because of logistics, population density, and so forth.

FROM GENOMICS TO PUBLIC HEALTH

G. Balakrish Nair began by stating that his presentation was partially inspired by the cholera outbreak in Haiti after the earthquake in 2010, and by the number of people who were killed in a part of the country where there had previously been no cholera.

Nair described hospital-based surveillance, culture-dependent and independent methodologies, and the relationship between pathogens associated with co-infections such as community diarrheas, fecal microbiota of healthy children, and the gut microbiome of Indian children with varying nutritional status.

Nair described a simple hospital-based study that he and his colleagues conducted on the etiology of diarrhea in Kolkata a few years prior to the workshop.6 Patients with diarrhea were admitted to the infectious diseases hospital, and every fifth patient admitted with diarrhea on two randomly selected days each week was enrolled in the study. The diarrheal samples were taken to the bacteriology, virology, and parasitology labs, and the pathogens were detected using a variety of techniques. The pathogen diagnostic data and the antimicrobial resistance data were tracked for data management. The study traced 26 different pathogens across the spectrum of bacteria, viruses, and parasites. This was perhaps the first time this was done in this setting.
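The enrollment rule described above (two random days per week, every fifth diarrhea admission on those days) can be sketched in a few lines. This is a hypothetical illustration, not the study's actual code; the function name and data layout are assumptions.

```python
import random

def weekly_enrollment(admissions_by_day):
    """Sketch of the surveillance sampling rule: pick two days of
    the week at random, then enroll every fifth diarrhea patient
    admitted on those days."""
    sampled_days = random.sample(range(7), 2)  # two distinct random days
    enrolled = []
    for day in sampled_days:
        # every fifth patient: the 5th, 10th, 15th, ... admitted that day
        enrolled.extend(admissions_by_day[day][4::5])
    return enrolled

# e.g., a week with ten diarrhea admissions on each day
week = {day: [f"patient-{day}-{i}" for i in range(1, 11)] for day in range(7)}
sample = weekly_enrollment(week)  # 2 patients from each of the 2 sampled days
```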

Next, Nair's group conducted a routine etiologic analysis of the data. They found that Vibrio cholerae was the most common pathogen across all age groups, and when the data were sorted by age, rotavirus was the primary cause of illness. Among the parasites, Giardia lamblia was the most common found in this study.

Nair and his group then decided to examine the data differently. From November 2007 to 2009, 45,004 patients were admitted to the hospital, and samples were taken from 2,519 of them. Their analysis of 26 pathogens indicated that 42.9 percent of samples contained a sole pathogen, 29.2 percent had a mixture of pathogens, and in approximately 27 or 28 percent no pathogens were detected. In the mixed-pathogen group, the number of different pathogens varied from two to six or more in the same sample. This intrigued Nair's group. Normally they discarded the mixed-pathogen samples because of the added complexity that could confuse the data. Instead, in this study, they analyzed them further to understand what the other pathogens were doing and what the polymicrobial infections were in these settings. qPCR, a culture-independent method, was used to identify the microbes present within a sample, and the gene targets used for the bacterial pathogens are listed below.7

Vibrio cholerae - 16S rDNA; wbe (O1) and wbf (O139) genes
Vibrio parahaemolyticus - 16S rDNA
Campylobacter spp. - 16S rDNA
Shigella spp. - ipaH invasion-related gene
Diarrheagenic Escherichia coli:
 ETEC - heat-labile (lt) and heat-stable (st) toxin genes
 EAEC - aggR adherence factor gene
 EPEC - eae pathogenicity-related gene

A subset of stool samples was examined by culture techniques; 59 samples contained sole pathogens, 9 samples had mixed infections, and 54 samples had no detectable pathogens. When they followed the culture-independent method, they found that of the 59 samples which were thought to have sole pathogens, only 25 samples actually contained sole pathogens and 34 samples contained mixed pathogens. This was surprising. The key message after this study was that more than two-thirds of the hospitalized diarrhea cases had DNA of more than one enteric pathogen in their fecal samples.

Nair's group continued to examine these samples and tried to understand the relationship between pathogens associated with co-infections. This was done in collaboration with Colin Stine at the University of Maryland, Baltimore. Two of the main pathogens in the hospital study were rotavirus and Vibrio cholerae. There were 493 cases in which rotavirus was detected in the study series, of which 42.2 percent contained rotavirus alone, and the majority (57.8 percent) of samples had rotavirus mixed with other pathogens: that is, rotavirus with Vibrio cholerae, with Shigella, with parasites, with other viruses, with other bacteria, and with E. coli.

FIGURE 4-1. Rotavirus with Mixed Isolation, November 2007-July 2009. SOURCE: Gopinath Balakrish Nair, presentation at the workshop.

They then started testing the possible associations. They used Fisher's exact test to compare the observed frequencies of pathogen pairs (one, both, or neither detected) against independent assortment based on the overall frequency with which each pathogen was detected. To establish criteria for statistical significance, they calculated p values, odds ratios, and 95 percent confidence intervals.
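The pairwise test described here can be sketched with standard-library Python. The counts below are hypothetical, not the study's data, and the function name is an assumption; in practice scipy.stats.fisher_exact provides the same test off the shelf.

```python
import math

def cooccurrence_stats(a, b, c, d):
    """2x2 table for a pair of pathogens: a = both detected,
    b = only the first, c = only the second, d = neither.
    Returns the odds ratio, a 95% CI (Woolf's log method), and a
    two-sided Fisher's exact p value."""
    oddsratio = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    ci = (math.exp(math.log(oddsratio) - 1.96 * se),
          math.exp(math.log(oddsratio) + 1.96 * se))
    # Fisher's exact test: with margins fixed, sum the hypergeometric
    # probabilities of all tables no more probable than the observed one.
    n, r1, c1 = a + b + c + d, a + b, a + c
    def p_table(x):  # probability of x in the top-left cell
        return (math.comb(r1, x) * math.comb(n - r1, c1 - x)
                / math.comb(n, c1))
    p_obs = p_table(a)
    lo_x, hi_x = max(0, c1 - (n - r1)), min(r1, c1)
    p = sum(p_table(x) for x in range(lo_x, hi_x + 1)
            if p_table(x) <= p_obs * (1 + 1e-9))
    return oddsratio, ci, p

# Hypothetical counts for rotavirus x adenovirus co-detection
odds, ci, p = cooccurrence_stats(20, 80, 10, 290)  # odds ratio 7.25
```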

Figure 4-2 shows the odds ratios of rotavirus co-occurring with various other pathogens. For example, Shigella did not seem to have any association, but three pathogens seemed to have a positive association with the presence of rotavirus: enteroaggregative E. coli, Cryptosporidium, and adenovirus, for which the odds ratios were as high as six. Because there seemed to be some association, they searched the literature and found that rotavirus, which is an RNA virus, and adenovirus, which is a DNA virus, possibly have different sites of pathogenesis, and therefore the presence of both together could have made the infection much more severe. There were a couple of reasons why they thought these associations may exist, but they are still examining this question. Likewise, when odds ratios were computed for Vibrio cholerae with various pathogens, a positive association appeared only with Giardia lamblia.

FIGURE 4-2. Odds ratios showing odds of rotavirus co-occurring with various other pathogens. SOURCE: Gopinath Balakrish Nair, presentation at the workshop.

Several questions emerged from the culture-independent study: What are the potential implications of polymicrobial infections? Do cases of diarrhea caused by Vibrio cholerae or rotavirus and a second pathogen differ from those caused by Vibrio cholerae or rotavirus alone? Does one pathogen lead the way for another to successfully infect a person? Do the pathogens behave synergistically to escape immunologic detection, or does the age or season affect polymicrobial infections? What is the temporal sequence of pathogen infection?

Another key message from this study is that polymicrobial infections associated with Vibrio cholerae and rotavirus in this series were nonrandom associations. The group is continuing to investigate these interesting data from the hospital-based study.

Another study, the Global Enterics Multi-Center Study (GEMS),8 was a case-control study of community diarrheas, which differ in that patients were not required to be hospitalized to be included. This study was conducted in Kolkata and at two other sites in Asia and four sites in Africa using the same criteria for selection of cases and controls. There were 141 collection sites in Kolkata, as shown in Figure 4-3.

FIGURE 4-3. Map of Kolkata collection sites. SOURCE: Gopinath Balakrish Nair, presentation at the workshop.

They calculated the excess rate of infection for certain pathogens. The excess rate of infection is the rate of isolation attributable to diarrhea: the difference in isolation rates between cases and controls. Pathogens like Giardia lamblia, enteroaggregative E. coli, typical enteropathogenic E. coli (EPEC), and Salmonella were detected in apparently healthy, non-diarrheal cases. Rotavirus was the single most common pathogen, followed by Cryptosporidium, and similar results were found in samples from other sites in the GEMS study. Thus, the third key message is that apparently healthy children living in poor sanitary conditions ingest a high concentration of fecal bacteria that colonize the small intestine.
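As defined above, the excess rate is simply the isolation rate among cases minus the rate among controls; a minimal sketch with hypothetical counts (the function name and numbers are illustrative, not the study's):

```python
def excess_rate(case_pos, case_total, ctrl_pos, ctrl_total):
    """Excess isolation rate attributable to diarrhea:
    isolation rate among cases minus the rate among controls."""
    return case_pos / case_total - ctrl_pos / ctrl_total

# Hypothetical: a pathogen found in 120 of 600 cases but also
# in 30 of 600 apparently healthy controls
rate = excess_rate(120, 600, 30, 600)  # roughly 0.15
```

A pathogen detected equally often in cases and controls thus contributes an excess rate near zero, which is exactly why some organisms found in healthy children drop out of the attributable burden.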

Nair then discussed a study of the fecal microbiota of apparently healthy children who participated in a community-based trial of a probiotic in Kolkata. In other words, they first conducted a hospital-based diarrhea study, then a community-based study, and then they studied healthy children who participated in a probiotic trial.

This was one of the largest community-based probiotic studies in children. The probiotic was given for 12 weeks, and the children were followed for another 12 weeks. For this study, they conducted fecal microbiota analysis, in collaboration with researchers at the University of Osaka, Japan, using a sensitive, culture-independent, RNA-targeted reverse transcription quantitative PCR. Stool samples from the study group and the control group were collected and analyzed at five points during the study: at the start, and then 6, 12, 18, and 24 weeks after the beginning of the study. At every collection period, healthy children were found to be excreting Vibrio cholerae, V. parahaemolyticus, Campylobacter jejuni, Salmonella typhi and Salmonella typhimurium, and rotavirus. The enterobacteria as a group were much less represented.

FIGURE 4-4 Photograph of Kolkata probiotic study site.

SOURCE: Gopinath Balakrish Nair, presentation at the workshop.

What surprised Nair and his group was that these were healthy children excreting toxigenic Vibrio cholerae. The collated data showed that 52.6 percent of the 133 healthy children examined had detectable Vibrio cholerae at different frequencies and at different bacterial counts. The bacterial counts were low, so if they had relied on culture results alone, they probably would not have identified these pathogens. In 31.6 percent of the children, Vibrio cholerae was detected at only one collection time. In 21.1 percent, it was detected twice or more during the study. In 12.8 percent, it could be detected at two consecutive sampling timepoints, and only one child had detectable Vibrio at all 6 sample collection timepoints.9 Again, there seems to be a transient colonization by what one would think are fully pathogenic Vibrio cholerae. How do the children manage? They also examined the detection of other pathogens in the feces collected from the 70 carriers of Vibrio cholerae and found that many of them shed Campylobacter jejuni and E. coli, while far fewer shed ETEC or other pathogenic E. coli. The fourth key message was that, in disease-endemic settings, the intestines of apparently healthy children carry enteric pathogens at low levels for extended periods of time, reflecting the effects of constant exposure to fecal bacteria and enteric pathogens.
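The carrier categories above (detected once, twice or more, at consecutive timepoints, at all timepoints) amount to a simple classification over each child's six collections. A minimal sketch, using hypothetical detection vectors rather than the study's data:

```python
# Classify a child's Vibrio cholerae detection pattern across the six
# stool collections, mirroring the carrier frequencies Nair described.
# The detection vectors are hypothetical examples.

def detection_summary(detections):
    """detections: list of 6 booleans, one per collection timepoint."""
    positives = sum(detections)
    consecutive = any(detections[i] and detections[i + 1]
                      for i in range(len(detections) - 1))
    return {
        "ever_detected": positives > 0,
        "detected_once": positives == 1,
        "detected_twice_or_more": positives >= 2,
        "consecutive_detections": consecutive,
        "all_timepoints": positives == len(detections),
    }

child = [False, True, True, False, False, False]
summary = detection_summary(child)
print(summary["detected_twice_or_more"], summary["consecutive_detections"])
# True True
```

Applied across a cohort, tallies of these flags reproduce the kind of percentage breakdown reported in the study.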

The next study in this sequence examined the gut microbiome of Indian children of varying nutritional status. It was conducted at Birbhum, in West Bengal, and drew on a larger effort known as the Birbhum Population Project, a health and demographic surveillance system. The numbers were not large; approximately 20 children were selected, using the exclusion criteria normally applied in such studies. To assess the children's health, Nair used the three z-scores recommended by WHO for assessing child growth. From these they made a cumulative z-score serving as a cumulative nutrition index, by which the 20 gut metagenomes were divided into three groups: apparently healthy, borderline malnourished, and severely malnourished.
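A hedged sketch of such a cumulative index: the three WHO child-growth z-scores (weight-for-age, height-for-age, weight-for-height) are summed and the total is binned into the three groups. The cutoff values here are illustrative assumptions, not the thresholds used in the Birbhum study.

```python
# Toy cumulative nutrition index: sum the three WHO z-scores and bin
# children into three groups. The -2 and -5 cutoffs on the summed score
# are illustrative assumptions only.

def nutrition_group(waz, haz, whz, borderline=-2.0, severe=-5.0):
    """waz/haz/whz: weight-for-age, height-for-age, weight-for-height z-scores."""
    cumulative = waz + haz + whz
    if cumulative <= severe:
        return "severely malnourished"
    if cumulative <= borderline:
        return "borderline malnourished"
    return "apparently healthy"

print(nutrition_group(-0.2, -0.5, -0.1))   # apparently healthy
print(nutrition_group(-2.1, -2.4, -1.8))   # severely malnourished
```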

The microbial membership of these 20 samples comprised the main phyla generally found in such studies and was not unusual. The dominant genus was Prevotella, which is not unusual for this region; it is the kind of genus found in people consuming dietary fibers, peptidoglycans, and other polysaccharides.

Using the 20 samples, Nair's group examined how microbial groups varied with nutritional status. A consortium of pathogens—Escherichia, Shigella, and others—was found, and the abundance of these pathogens increased as nutritional status decreased; in other words, as nutritional status declined, these genera became dominant. Conversely, Nair believes that common, beneficial bacteria showed a direct relationship to nutritional status, decreasing as nutritional status decreased. These undernourished children thus enter the whole infection cycle. Beneficial bacteria were found more frequently in the apparently healthy group, which correlates with the positive nutritional status of the child in these settings.

For another part of the study, they analyzed the genus co-occurrence networks obtained for the gut microbiomes of the apparently healthy, borderline malnourished, and severely malnourished groups. Interestingly, some genera with contrasting trends in abundance nevertheless showed strong positive associations with one another. As nutrition level declines, the pathogens form a network that becomes more tightly bound in the severely malnourished state, meaning that there was a consortium of pathogens in the malnourished or undernourished child. They also found positively and negatively correlated clusters of orthologous genes (COGs): the positively correlated COGs tended to reflect functions such as digestion, whereas the negatively correlated COGs related to the infection process or virulence.
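A genus co-occurrence network of this kind can be built by correlating abundance profiles across samples and keeping strongly positive pairs as edges. A minimal sketch, with made-up abundances and an arbitrary 0.8 threshold:

```python
# Build co-occurrence edges by Pearson-correlating genus abundance
# profiles across samples. Abundances and the threshold are illustrative.
from math import sqrt
from itertools import combinations

def pearson(x, y):
    """Pearson correlation of two equal-length abundance profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-sample relative abundances for four genera.
abundances = {
    "Escherichia": [0.10, 0.30, 0.50, 0.70],
    "Shigella":    [0.12, 0.28, 0.55, 0.65],
    "Prevotella":  [0.60, 0.40, 0.20, 0.05],
    "Bifidobacterium": [0.55, 0.45, 0.15, 0.10],
}

# Keep pairs whose correlation exceeds the (arbitrary) 0.8 threshold.
edges = [(a, b) for a, b in combinations(abundances, 2)
         if pearson(abundances[a], abundances[b]) > 0.8]
print(edges)
```

In this toy data the two "pathogen" profiles rise together and the two "commensal" profiles fall together, so the network splits into two positively correlated clusters, echoing the consortium behavior described above.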

In summary, pathogenic microbial groups seem to abound when the nutritional status declines or when there is an impaired nutritional status. There is also a depletion of several commensal genera. There is a higher number of virulence genes in children with a lower nutritional index.

Nair noted some of the research questions that have arisen from the studies that they conducted. What are the pathogens doing in an apparently healthy child's gut as shown in the case-control study? When is the balance between pathogen and commensal intestinal microbiota disrupted? How does the host deal with the presence of pathogens? Immunologically, how are multiple pathogens perceived in polymicrobial infections? What is the nature of the immune response to pathogens in the healthy child?

In addition, Nair noted, the environment did contribute to the presence of pathogens, and a lack of nutrition contributed to their proliferation; outcomes also depend on the individual's immunologic response. The extent to which this relates to epidemiology, transmission, and a whole set of other variables is an interesting facet of research that they are increasingly trying to address.

Discussion

The discussion following Nair's presentation focused on sanitation issues and the potential protective nature of pathogens at low levels.

A participant asked if the cases with multiple infections were using shared toilets or perhaps open toilets. Is it possible to provide them with private toilets? Nair replied that this is difficult in a setting where there are many people in a very small space. The transmission of fecal pathogens is intense. Among healthy children, anything that goes into their mouths probably carries pathogens, and yet most of them do not seem to suffer from infection. Nair then suggested that the presence of these pathogens in low numbers may be protective in an endemic setting, although this was not part of the research conducted.

A participant asked if it is possible that Nair and his colleagues were just measuring pass-through rather than something that has actually colonized the gut. Nair replied that they conducted frequency studies and that some children excreted steadily for a couple of weeks. The questions they keep asking are: How do these organisms colonize? Why do they persist there? For how long? Is their presence protective? Nair acknowledged that he and his colleagues do not have answers for all of these questions, but he wanted to share these results with the experts at the workshop.

Another workshop participant had reread Robert Koch's original writings; the second of Koch's three postulates for causation is that the pathogen should not occur in hosts who do not have the disease. The participant noted that this criterion is very difficult to satisfy, having seen the telltale bacteria in the stool of humans who do not have diarrhea even though those bacteria cause cholera. Even Koch saw that there was asymptomatic carriage of the cholera organism. It is an interesting and very difficult, almost teleological, problem: What constitutes true colonization, and what are transients?

INNOVATION IN MOLECULAR DIAGNOSTICS FOR RESOURCE POOR SITUATIONS

S. R. Rao introduced his presentation by stating that containing or mitigating problems related to infectious diseases requires early diagnosis, and he provided examples of collateral damage that occurred due to delayed diagnoses, and how it can be prevented. He also provided examples of innovative steps that are being taken in resource poor situations to improve diagnostics.

For Rao's first example, he described fungal keratitis, a very common disease that causes blindness. In India, the disease is especially correlated with agricultural activity, particularly the harvesting of crops. If plant matter touches a person's cornea and leaves some fungal deposit, it leads to fungal keratitis. The eye heals in almost 40 percent of cases, but the cornea is scarred and vision is compromised. The fungus sits on the cornea and produces enzymes for its own nutrition, which, in turn, degrade the cornea.

An antifungal drug eliminates the infection when treated, but in the meantime the fungus may have consumed the cornea, leaving a scar and compromised vision. In approximately 60 percent of cases, antifungal medications did not work, leaving keratoplasty to remove the cornea as the only treatment option and possibly leaving the patient blind. Thus, there is a critical need to develop novel therapeutic approaches for treating fungal keratitis.

Rao and his colleagues examined the fungi that cause keratitis as organisms in culture under several conditions and found them to be very adaptable. If they are grown on casein, they produce casein-degrading enzymes; if they are grown on collagen, they produce collagen-degrading enzymes. In other words, the organism produces different enzymes to adapt to different substrates.

As part of the study presented at the workshop, Rao's group examined the types of enzymes produced when the organisms are grown in culture on different tissue substrates; the appearance of normal and infected corneas; and the different enzymes produced by the fungus. Their study identified the fungal and host responses associated with corneal damage. They then developed a rabbit model and a combination of particular enzyme inhibitors and antifungals that completely cured the disease: a combination of proteolytic inhibitors that prevents the damage and also removes the fungi. The problem is that, as with any other eye drug, when a person blinks, the drug is washed away, requiring continual reapplication, which is difficult.

They then considered a nanotechnology-based approach to this problem. The important parameter is increasing the time the antifungal remains on the eye, because the fungi have mucoadhesive properties and stick to the cornea; to counter them, the residence time of the drug is increased. The drug contains alternative substrates for host and fungal enzymes so that the cornea is protected from degradation by fungal enzymes. Inflammation also needs to be controlled, so this component is added to the drug particle. With this concept they designed something like a smart nanoparticle. The nanoparticle is a polymer that is biocompatible and biodegradable: when the fungi produce enzymes, they can consume the nanoparticle instead of the cornea. The particle is decorated with peptides for this purpose; these peptides have corneal-penetrating capabilities, cornea-binding properties, and anti-inflammatory properties.

In addition, Rao's researchers included integrin-binding peptides, because once damage to the corneal cells occurs, mucin is no longer available and integrin becomes exposed. If application occurs later in the infection, the integrin-binding peptides, anti-inflammatory peptides, and antimycotic material are still available. The particle can remain active on the eye for more than 24 hours, which means one drop or one dose per day should be sufficient.

That is the type of drug that Rao's group developed. At this point in the development process, having produced and characterized the nanoparticles, they prepared the peptides and attached them to the nanoparticles. The drug has been tested in vitro, and the corneal binding and anti-inflammatory effects of the nanoparticle have been tested on human lenses.

Rao then turned to molecular diagnostics. His group started a project several years ago on a novel molecular diagnostic for eye diseases. The objective of the project is to develop a rapid, simple, and inexpensive diagnostic method to detect mutations associated with eye diseases and signature sequences of pathogenic organisms.

They asked eye hospitals in India to identify the types of organisms found in their patients. Once Rao's group received the list, they looked for signature sequences for all of these organisms and built a multiplexed PCR-based system with unique probes and targets: 54 primers and 27 probes in a single assay. When they began, they did not realize how difficult it is to multiplex 54 primers; it is very complicated, but they started anyway. They attempted to address all of the possible issues, although the final commercial product was divided into several elements. Probe selection was the key step in these molecular technologies.
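Probe selection of this kind involves screens like the following sketch: a rough melting-temperature estimate (the Wallace rule, 2(A+T) + 4(G+C), a crude approximation valid only for short oligos) plus a check that a candidate probe matches its target signature sequence and no non-target sequence. All sequences and the Tm window here are illustrative assumptions, not Rao's actual design criteria.

```python
# Toy probe screen: Wallace-rule melting temperature plus an exact-match
# check against target and non-target signature sequences. All sequences
# and the Tm window are made up for illustration.

def wallace_tm(oligo):
    """Wallace-rule Tm estimate: 2*(A+T) + 4*(G+C), for short oligos."""
    at = sum(oligo.count(b) for b in "AT")
    gc = sum(oligo.count(b) for b in "GC")
    return 2 * at + 4 * gc

def probe_ok(probe, target, non_targets, tm_range=(45, 65)):
    """Accept a probe that sits in the Tm window, hits the target,
    and does not match any non-target signature."""
    tm = wallace_tm(probe)
    in_target = probe in target
    cross_reacts = any(probe in seq for seq in non_targets)
    return tm_range[0] <= tm <= tm_range[1] and in_target and not cross_reacts

target = "ATGCGTACGTTAGCATGCCGATCGTTACG"
probe = "GTACGTTAGCATGCC"
print(wallace_tm(probe), probe_ok(probe, target, ["TTTTAAAACCCCGGGG"]))  # 46 True
```

A real multiplex design must also screen for primer-dimer and cross-hybridization interactions among all 54 primers, which is what makes a 54-primer multiplex so complicated.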

Based on the success of this platform, Rao's group is developing a chip to detect septicemia, a chip to detect acute keratitis, and a chip to determine antibiotic resistance. Rao said that these were at the clinical trial stage and should be available in a year or two. They are now addressing whether they can use microfluidics or paper microfluidics, which would make the tests affordable. This is where novel diagnostic approaches can be developed.

S.R. Rao's goal is to develop and evaluate easy-to-use, low-cost, point-of-care diagnostics for infectious diseases. Many of the workshop participants, Rao noted, are familiar with the criteria for such diagnostics: affordable, sensitive, specific, user-friendly, rapid and robust, equipment-free, and deliverable to end users, the so-called ASSURED criteria. These are the parameters one would like to have in a diagnostic kit.

Paper microfluidic devices are ideal for this purpose because they are very cheap, and they could reduce the cost of medical diagnosis significantly for the developing world. Disposal is also easy; they can be thrown away or burned. During the diagnostic process, biological fluids spread through the paper device, through the fibers and micropores in the paper, without the need for electricity, an external pump, or any other equipment. However, because the fluids spread, a way was needed to place the reagents at different locations so that they diffuse only in specified directions, not everywhere. Rao's group accomplished this by using computer algorithms to draw the desired design in wax: the wax is dissolved in particular solvents to make a solution that can be loaded into a pen, and the channel structures are then drawn onto the paper diagnostic chip with a plotter.

CHALLENGES OF SYNTHETIC BIOLOGY

Pawan Dhar spoke about synthetic biology in a broad sense, providing workshop participants a foundation on current issues before moving to more practical aspects. Biological systems have traditionally been studied by reducing complex systems to individual components and by down-regulating the expression of genes; in other words, we have learned biology by disabling what was a gene or throwing the gene out entirely (gene knockout). From that, scientists ended up with many genetic parts, and the question they asked was whether anything could be done with these parts, because individually they are very hard to understand. Can they be tied together? Can a computational model be created? The answers are yes; however, modeling has its own limitations because it involves capturing the essential features while subtracting some information, which is lost. Another group of scientists then asked, can a genetic system be created from scratch, given the raw materials? The process of linking genetic materials together is known as molecular biology, systems biology is the process of creating computational models, and synthetic biology is the process of creating genetic systems from scratch. Dhar was at MIT in June 2004 when the field of synthetic biology was launched, and he recalled this last definition because so many variations of the concept have evolved since its inception that the original message is sometimes lost.

Engineering organisms is a new area of synthetic biology in which scientists attempt to build organisms from scratch. Rules of composition and standards are needed to create well-behaved systems. However, the kind of IEEE standards that engineers enjoy do not exist in biology, simply because the problems investigated are human problems: organisms evade standards because each is unique. From an engineering point of view, though, one needs to impose some restrictions, and that is where standards enter. The science of synthetic biology is the development of rational design and controlled construction, because control is very important. Many times there is little control over what one creates, and only once the creation is under way is there raw data to analyze.

In comparing engineering and biology, Dhar noted that they are quite similar in terms of their robustness, multitasking, and so forth. There are many dissimilarities between engineering and biology as well. For example, with an electronic circuit design, one is dealing with digits, defined laws, and known forces among structures that are under the designer's control. Biology, on the other hand, is predominantly analog and the only laws that are known are the laws of inheritance called Mendel's laws of genetics, although some debate even these laws. They are not useful, unfortunately, because these laws were not designed for construction purposes and they zoom in and out from phenotype to genotype. They are not designed to explain what happens to the information just below the phenotype. Therefore, new ways of examining and implementing new approaches are needed.

An engineering approach makes sense, Dhar continued, because engineers are successful in creating systems. Unfortunately, absolute engineering solutions do not exist for biology because of the many different sources of contextual data. The solution is either a top-down approach or a ground-up approach, in which the system is built one part at a time, in the hope that the solution will be found somewhere in the middle. The key message, however, is that this type of construction must be controlled.

Dhar then turned to the extent to which biology has been converted into an engineering discipline. Publications addressing a variety of biology topics often include truth tables and data sheets. A truth table in a biological setting is essentially a matrix indicating, for a given input concentration of a promoter or repressor, what the output concentration would be in terms of protein. Across a range of such concentrations, a continuous state exists based on the series of input variations to the system. Latency and other terms normally used by engineers are now being used in the biological community. There are also many publications that use biological equivalents of switches, logic gates, oscillators, and so forth. The lac operon is one such example: when the repressor is on, the product is off, and when the repressor is off, the product is on. This is a typical example of the NOT gate that engineers use. When an enzyme and a substrate come together and make a product, this behaves like an AND gate. Likewise, there are other examples that demonstrate similarities between engineering and biology, and recently a special community of biologists has been working on developing standard compositions.
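The lac-operon-style NOT gate described above is often modeled with a Hill repression function, in which high repressor input yields low protein output and vice versa. A toy sketch, with purely illustrative parameters:

```python
# Toy genetic NOT gate: steady-state output of a repressible promoter
# modeled with a Hill repression function. Parameters are illustrative.

def not_gate(repressor, k=1.0, n=2, max_output=100.0):
    """High repressor concentration -> low output; low repressor -> high output.
    k is the half-repression constant, n the Hill coefficient."""
    return max_output / (1.0 + (repressor / k) ** n)

# Truth-table-like behavior at the input extremes:
low_in, high_in = 0.01, 100.0
print(round(not_gate(low_in)), round(not_gate(high_in)))  # 100 0
```

Composing such transfer functions, output of one gate feeding the repressor input of the next, is the basic move behind the biological switches and oscillators mentioned above.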

There is a bit of concern, however, about whether the recombinant DNA technology is going to be obsolete in the future because if one can create a recombinant vector on the computer, email the sequence to the DNA synthesis company, and receive the entire vector, does one really need to copy and paste small sequences here and there?

In Dhar's lab, they asked the question, Why did nature place these start and stop signals in a particular location? Did nature try all of the possible combinations? Are there experiments still to be conducted? Traditionally, we know that there is a protein-coding region and an RNA-coding region that comprise the bulk of the genome, and there is a small portion that does not do anything. Dhar's group developed a technique by which they can make the protein-coding genes and the functional proteins from the non-coding area (or as some like to call it, ‘the dark matter of the genome'), and they are examining the resulting combinations and applications. They have found many examples and this is just one of them. When Nobel Prize winner Martin Chalfie visited Dhar's lab, he asked if they had considered reversing the protein-coding sequence to determine if they could create a new protein. Dhar's group did this in E. coli and when the coding sequence was reversed, an enzyme was created.

Dhar mentioned an Organisation for the Prohibition of Chemical Weapons meeting he attended, where some people asked him if his group can make a brand new genome, that is, make a brand new microbe by converting junk into a gene. His response was that they had never thought of that. Ever since, they have been trying to match their potential genes against those in the existing genome database, just to be safe.

Several good applications have emerged recently, in addition to automation, and some companies are now using a synthetic biology approach. For example, engineered E. coli makes isoprene, which is used in making rubber tires. Likewise, OPX Biotechnologies makes BioAcrylic from organisms, which is used in making paints, and Metabolix converts sugar into a biodegradable plastic. There are many such companies, and some people speak of making high-value chemicals from microbes.

However, Dhar pointed out that there are certain aspects about which scientists need to be careful now that the biological community is assuming the role of a creator. Do we really understand what we have created? A synthetic organism is different from a traditional recombinant DNA biology experiment where the entire genome is cloned or copied. Likewise, there are worries that if an organism is created that is not completely controllable, a minority organism could divide and overtake the majority, leading to a loss of control. There are many issues that are still unsolved, and this is the right time to address them.

In 2014, a synthetic yeast chromosome was designed at Johns Hopkins; more than 5,000 edits were made to the genome, a 300-kilobase sequence, and it took the group 5 years to chemically synthesize the brand new DNA. Even more fascinating, a group in the United States has created a six-letter DNA. We have heard of the bases A, T, G, and C, and now we have X and Y in DNA as well. Scientists at Scripps created chemical molecules that are part of the DNA, and the most interesting part is that it is not just a structural composition; it is a DNA device.

Where are we going to stop? Dhar asked. Some people argue that if a microbe is created entirely through synthetic chemistry, nature has no way to support its existence; even if such a microbe escapes the lab, they say, it will die in nature. However, we do not know this with certainty.

The outcome of the first Delphi study in India was that public perception of synthetic biology is almost nonexistent. The scientific community does not use a common definition of synthetic biology; everyone seems to mean something different when discussing it. The trouble is that without a common definition and clear-cut rules, there is little or no guidance for scientists. The biosafety regulatory processes are good, but much more needs to be done. Synthetic biology is currently self-regulated, and scientists do not want to do anything wrong. However, the situation may become more precarious: many publications are being released, so with intent and sufficient funding, nefarious actions could follow. Future engagements might include differentiating between anxiety and risk, and between what is real and what is speculative. A great deal of what is discussed in the synthetic biology community from biosafety and biosecurity perspectives still reflects anxiety, because robust safety and security measures do not yet exist. It is also very important to model misuse scenarios and to devise policy that is predictive, not just reactionary. This is especially true, said Dhar, because some people speak of reviving extinct organisms using synthetic biology. Releasing new organisms into the wild in the name of biodiversity may also be risky because there are insufficient safeguards.

Sensing the alarming situation that might arise in the future, the largest gene synthesis companies have come together and formed a consortium, which represents 80 percent of global commercial synthesis capacity. Regulation remains difficult, however, because these companies are now investing in desktop DNA synthesis printers. If DNA printers proliferate widely, including in individual labs, it will be nearly impossible to control this spread of synthesis capability. This situation provokes important questions: How can the distribution of desktop DNA synthesizers be tracked? Is it time to attempt to predict the safety level of emerging synthetic microbes? Would it be helpful to design and distribute unique synthetic DNA barcodes so that we can know where a design originated?
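The DNA-barcode idea can be illustrated with a toy encoding: a design ID mapped to bases at two bits per base, plus a one-base checksum so corrupted reads are detectable. This scheme is entirely hypothetical, not an existing standard or proposal.

```python
# Toy provenance barcode: encode an integer design ID into a DNA sequence
# at two bits per base, with a one-base checksum. Purely hypothetical.

BASES = "ACGT"

def encode_barcode(design_id, length=16):
    """Map an integer design ID to a fixed-length DNA barcode plus checksum."""
    bases = []
    value = design_id
    for _ in range(length):
        bases.append(BASES[value % 4])  # least-significant base first
        value //= 4
    checksum = BASES[sum(BASES.index(b) for b in bases) % 4]
    return "".join(bases) + checksum

def verify_barcode(barcode):
    """Recompute the checksum over the body and compare to the last base."""
    body, check = barcode[:-1], barcode[-1]
    return BASES[sum(BASES.index(b) for b in body) % 4] == check

tag = encode_barcode(2016)
print(tag, verify_barcode(tag))
```

In practice such a tag would be inserted into a neutral region of a synthetic construct so that sequencing any escaped or misused design would reveal its registered origin.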

Dhar also noted that there is a need to develop standard assays to measure predictability, reliability, robustness, and evolvability, because the designs currently being created may evolve. A cell is an evolving system and it may be necessary at some point to halt this evolution, which is very difficult. It would be helpful to add safety data to the design parts, devices, and circuits. Currently there are no safety data for the devices, circuits, and parts, because no one knows how to acquire that safety data. If synthetic biology is moving in the direction of making brand new organisms, then it would be helpful to design less competitive organisms, and to design organisms that could be under external control so that they could be switched off at will if something goes wrong.

The questions Dhar posed are being debated in almost every meeting on synthetic biology, and no one has a clear answer. There are useful aspects of synthetic biology, and we need to be careful not to stop the good science at the cost of perceived risks.

Discussion

A participant briefly commented that the World Health Organization, through the Global Influenza Surveillance and Response System (GISRS) network, has developed a Pandemic Influenza Preparedness Framework that allows scientists to share viruses more freely among members of the GISRS network and the research community.10 The question of placing the Pandemic Influenza Preparedness Framework into the context of sequence data has arisen because of concern that once a sequence is known, the virus is known. This is worth considering, the participant said.

Footnotes

1

National Research Council. 2006. Globalization, Biosecurity, and the Future of the Life Sciences. Washington, D.C.: The National Academies Press. Available at: http://www.nap.edu/catalog/11567/globalization-biosecurity-and-the-future-of-the-life-sciences; accessed April 10, 2016.

2

For a brief overview of this research, see: http://www.nature.com/news/the-risks-and-benefits-of-publishing-mutant-flu-studies-1.10138; accessed April 10, 2016.

3

Julie Steenhuysen. “White House Issues Report on Improving Biosafety at Federal Labs,” Reuters. October 29, 2015. See: http://www.reuters.com/article/usa-whitehouse-biosafety-idUSL1N12T4EV20151029; accessed April 10, 2016.

4

For more information on David Baltimore's research, see: https://www.bbe.caltech.edu/content/david-baltimore; accessed April 10, 2016.

5

National Research Council. 2010. Sequence-Based Classification of Select Agents: A Brighter Line. Washington, D.C.: The National Academies Press.

6

Nair thanked those who contributed to this research: Dr. Ramamurthy and his colleagues at the National Institute of Cholera and Enteric Diseases; colleagues at the Yakult Probiotic Research Centre; Dr. Sharmila Mande of Tata Consultancy Services in Pune, who conducted the computational analysis; Dr. Mike Levine, Dr. Sur, and the Global Enteric Multicenter Study; and Professor Yoshifumi Takeda, Director of the Okayama-NICED Research Program in Kolkata.

7

For more information, see: Gopinath Balakrish Nair, et al. “Emerging trends in the etiology of enteric pathogens as evidenced from an active surveillance of hospitalized diarrhoeal patients in Kolkata, India,” Gut Pathogens (2010) 2:4. Available at: http://gutpathogens.biomedcentral.com/articles/10.1186/1757-4749-2-4; accessed April 10, 2016.

8

This study was funded by the Gates Foundation, and the principal investigator was Myron Levine from the School of Medicine at the University of Maryland.

9

Gopinath Balakrish Nair, et al. “Vibrio cholerae/mimicus in fecal microbiota of healthy children in a cholera endemic urban slum setting in Kolkata, India,” Microbiology and Immunology, Vol. 56, Issue 11, 789–791, November 2012. Available at: http://onlinelibrary.wiley.com/doi/10.1111/j.1348-0421.2012.00497.x/abstract; accessed April 10, 2016.

10

For more information on WHO's GISRS, see: http://www.who.int/influenza/gisrs_laboratory/en/; accessed April 10, 2016.

Copyright 2016 by the National Academy of Sciences. All rights reserved.
Bookshelf ID: NBK367786
